Enlightenment Now: The Case for Reason, Science, Humanism, and Progress

Steven Pinker
Psychology
Penguin
2018
556 pages

Feeling like everything has gone to $%#!? Worried about the future? This is excellent medicine. Pinker takes an analytical approach using data to show that quality of life, wealth, safety, peace, knowledge, and happiness are on the up across the globe!

I would read Thinking, Fast and Slow first. It will help with understanding various biases.

There are a LOT of notes. I might try to trim them down by removing some of the stuff that only I would note. There are lots of history and economic history bits.

“Why should I live?”

In the very act of asking that question, you are seeking reasons for your convictions, and so you are committed to reason as the means to discover and justify what is important to you. And there are so many reasons to live! As a sentient being, you have the potential to flourish. You can refine your faculty of reason itself by learning and debating. You can seek explanations of the natural world through science, and insight into the human condition through the arts and humanities. You can make the most of your capacity for pleasure and satisfaction, which allowed your ancestors to thrive and thereby allowed you to exist. You can appreciate the beauty and richness of the natural and cultural world. As the heir to billions of years of life perpetuating itself, you can perpetuate life in turn. You have been endowed with a sense of sympathy—the ability to like, love, respect, help, and show kindness—and you can enjoy the gift of mutual benevolence with friends, family, and colleagues. And because reason tells you that none of this is particular to you, you have the responsibility to provide to others what you expect for yourself. You can foster the welfare of other sentient beings by enhancing life, health, knowledge, freedom, abundance, safety, beauty, and peace. History shows that when we sympathize with others and apply our ingenuity to improving the human condition, we can make progress in doing so, and you can help to continue that progress.

The Enlightenment principle that we can apply reason and sympathy to enhance human flourishing may seem obvious, trite, old-fashioned. I wrote this book because I have come to realize that it is not. More than ever, the ideals of reason, science, humanism, and progress need a wholehearted defense.

We ignore the achievements of the Enlightenment at our peril.

The ideals of the Enlightenment are products of human reason, but they always struggle with other strands of human nature: loyalty to tribe, deference to authority, magical thinking, the blaming of misfortune on evildoers.

Harder to find is a positive vision that sees the world’s problems against a background of progress that it seeks to build upon by solving those problems in their turn.

“The West is shy of its values—it doesn’t speak up for classical liberalism.”

The Islamic State, which “knows exactly what it stands for.”

Friedrich Hayek observed, “If old truths are to retain their hold on men’s minds, they must be restated in the language and concepts of successive generations”

What is enlightenment? In a 1784 essay with that question as its title, Immanuel Kant answered that it consists of “humankind’s emergence from its self-incurred immaturity,” its “lazy and cowardly” submission to the “dogmas and formulas” of religious or political authority.1 Enlightenment’s motto, he proclaimed, is “Dare to understand!”

David Deutsch’s defense of enlightenment, The Beginning of Infinity.

All failures—all evils—are due to insufficient knowledge.

It is a mistake to confuse hard problems with problems unlikely to be solved.

The thinkers of the Enlightenment sought a new understanding of the human condition. The era was a cornucopia of ideas, some of them contradictory, but four themes tie them together: reason, science, humanism, and progress.

If there’s anything the Enlightenment thinkers had in common, it was an insistence that we energetically apply the standard of reason to understanding our world, and not fall back on generators of delusion like faith, dogma, revelation, authority, charisma, mysticism, divination, visions, gut feelings, or the hermeneutic parsing of sacred texts.

Others were pantheists, who used “God” as a synonym for the laws of nature.

They insisted that it was only by calling out the common sources of folly that we could hope to overcome them. The deliberate application of reason was necessary precisely because our common habits of thought are not particularly reasonable.

That leads to the second ideal, science, the refining of reason to understand the world.

To the Enlightenment thinkers the escape from ignorance and superstition showed how mistaken our conventional wisdom could be, and how the methods of science—skepticism, fallibilism, open debate, and empirical testing—are a paradigm of how to achieve reliable knowledge.

The need for a “science of man” was a theme that tied together Enlightenment thinkers who disagreed about much else, including Montesquieu, Hume, Smith, Kant, Nicolas de Condorcet, Denis Diderot, Jean-Baptiste d’Alembert, Jean-Jacques Rousseau, and Giambattista Vico.

They were cognitive neuroscientists, who tried to explain thought, emotion, and psychopathology in terms of physical mechanisms of the brain. They were evolutionary psychologists, who sought to characterize life in a state of nature and to identify the animal instincts that are “infused into our bosoms.” They were social psychologists, who wrote of the moral sentiments that draw us together, the selfish passions that divide us, and the foibles of shortsightedness that confound our best-laid plans. And they were cultural anthropologists, who mined the accounts of travelers and explorers for data both on human universals and on the diversity of customs and mores across the world’s cultures.

The idea of a universal human nature brings us to a third theme, humanism. The thinkers of the Age of Reason and the Enlightenment saw an urgent need for a secular foundation for morality, because they were haunted by a historical memory of centuries of religious carnage: the Crusades, the Inquisition, witch hunts, the European wars of religion. They laid that foundation in what we now call humanism, which privileges the well-being of individual men, women, and children over the glory of the tribe, race, nation, or religion.

We are endowed with the sentiment of sympathy, which they also called benevolence, pity, and commiseration. Given that we are equipped with the capacity to sympathize with others, nothing can prevent the circle of sympathy from expanding from the family and tribe to embrace all of humankind, particularly as reason goads us into realizing that there can be nothing uniquely deserving about ourselves or any of the groups to which we belong.

A humanistic sensibility impelled the Enlightenment thinkers to condemn not just religious violence but also the secular cruelties of their age, including slavery.

The Enlightenment is sometimes called the Humanitarian Revolution, because it led to the abolition of barbaric practices that had been commonplace across civilizations for millennia.

With our understanding of the world advanced by science and our circle of sympathy expanded through reason and cosmopolitanism, humanity could make intellectual and moral progress.

Government is not a divine fiat to reign, a synonym for “society,” or an avatar of the national, religious, or racial soul. It is a human invention, tacitly agreed to in a social contract, designed to enhance the welfare of citizens by coordinating their behavior and discouraging selfish acts that may be tempting to every individual but leave everyone worse off. As the most famous product of the Enlightenment, the Declaration of Independence, put it, in order to secure the right to life, liberty, and the pursuit of happiness, governments are instituted among people, deriving their just powers from the consent of the governed.

The Enlightenment also saw the first rational analysis of prosperity.

Specialization works only in a market that allows the specialists to exchange their goods and services, and Smith explained that economic activity was a form of mutually beneficial cooperation (a positive-sum game, in today’s lingo): each gets back something that is more valuable to him than what he gives up. Through voluntary exchange, people benefit others by benefiting themselves.

He only said that in a market, whatever tendency people have to care for their families and themselves can work to the good of all.

“If the tailor goes to war against the baker, he must henceforth bake his own bread.”

doux commerce, gentle commerce.

Another Enlightenment ideal, peace.

Together with international commerce, he recommended representative republics (what we would call democracies), mutual transparency, norms against conquest and internal interference, freedom of travel and immigration, and a federation of states that would adjudicate disputes between them.

The first keystone in understanding the human condition is the concept of entropy or disorder, which emerged from 19th-century physics and was defined in its current form by the physicist Ludwig Boltzmann.1 The Second Law of Thermodynamics states that in an isolated system (one that is not interacting with its environment), entropy never decreases.

It follows that any perturbation of the system, whether it is a random jiggling of its parts or a whack from the outside, will, by the laws of probability, nudge the system toward disorder or uselessness—not because nature strives for disorder, but because there are so many more ways of being disorderly than of being orderly.
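
The combinatorics behind “more ways to be disorderly” can be made concrete with a toy calculation. A minimal sketch (my own illustration, not from the book), using coin flips as stand-ins for the parts of a system:

```python
# An illustrative sketch (mine, not the book's): count the arrangements of
# 100 coins to see why disorder wins. "Orderly" here means all heads;
# "disordered" means an even split, the most common kind of arrangement.
from math import comb

N = 100
orderly = comb(N, N)          # exactly one way to be all heads
disordered = comb(N, N // 2)  # ways to show 50 heads and 50 tails

print(f"all heads:   {orderly}")
print(f"50/50 split: {disordered:.2e}")   # ~1.01e29
# A random jiggle of the coins almost certainly lands in a disordered
# arrangement, simply because there are astronomically more of them.
```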

Law of Entropy.

Life and happiness depend on an infinitesimal sliver of orderly arrangements of matter amid the astronomical number of possibilities.

The Law of Entropy is widely acknowledged in everyday life in sayings such as “Things fall apart,” “Rust never sleeps,” “Shit happens,” and “Whatever can go wrong will go wrong.”

“The Second Law of Thermodynamics Is the First Law of Psychology.”4 Why the awe for the Second Law? From an Olympian vantage point, it defines the fate of the universe and the ultimate purpose of life, mind, and human striving: to deploy energy and knowledge to fight back the tide of entropy and carve out refuges of beneficial order.

Before 1859, it was reasonable to think living things were the handiwork of a divine designer—one of the reasons, I suspect, that so many Enlightenment thinkers were deists rather than outright atheists. Darwin and Wallace made the designer unnecessary. Once self-organizing processes of physics and chemistry gave rise to a configuration of matter that could replicate itself, the copies would make copies, which would make copies of the copies, and so on, in an exponential explosion.

Organisms are open systems: they capture energy from the sun, food, or ocean vents to carve out temporary pockets of order in their bodies and nests while they dump heat and waste into the environment, increasing disorder in the world as a whole.

Nature is a war, and much of what captures our attention in the natural world is an arms race.

The third keystone is information.8 Information may be thought of as a reduction in entropy—as the ingredient that distinguishes an orderly, structured system from the vast set of random, useless ones.
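
As a toy illustration of information as reduced entropy (my own sketch, not the book’s), Shannon’s formula assigns maximum entropy to a uniform, structureless source and less to a structured one:

```python
# Shannon entropy H = -sum(p * log2(p)). A uniform source is maximally
# disordered; a biased (structured) source has lower entropy, and the
# difference reflects the structure the pattern carries.
from math import log2

def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0 bits: no structure
print(entropy([0.9, 0.05, 0.03, 0.02]))    # ~0.62 bits: highly structured
```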

The information contained in a pattern depends on how coarsely or finely grained our view of the world is.

Information is what gets accumulated in a genome in the course of evolution. The sequence of bases in a DNA molecule correlates with the sequence of amino acids in the proteins that make up the organism’s body, and they got that sequence by structuring the organism’s ancestors—reducing their entropy—into the improbable configurations that allowed them to capture energy and grow and reproduce.

Energy channeled by knowledge is the elixir with which we stave off entropy, and advances in energy capture are advances in human destiny. The invention of farming around ten thousand years ago multiplied the availability of calories from cultivated plants and domesticated animals, freed a portion of the population from the demands of hunting and gathering, and eventually gave them the luxury of writing, thinking, and accumulating their ideas. Around 500 BCE, in what the philosopher Karl Jaspers called the Axial Age, several widely separated cultures pivoted from systems of ritual and sacrifice that merely warded off misfortune to systems of philosophical and religious belief that promoted selflessness and promised spiritual transcendence.

(Confucius, Buddha, Pythagoras, Aeschylus, and the last of the Hebrew prophets walked the earth at the same time.)

The Axial Age was when agricultural and economic advances provided a burst of energy: upwards of 20,000 calories per person per day in food, fodder, fuel, and raw materials. This surge allowed the civilizations to afford larger cities, a scholarly and priestly class, and a reorientation of their priorities from short-term survival to long-term harmony. As Bertolt Brecht put it millennia later: Grub first, then ethics.19

And the next leap in human welfare—the end of extreme poverty and spread of abundance, with all its moral benefits—will depend on technological advances that provide energy at an acceptable economic and environmental cost to the entire world.

The first piece of wisdom they offer is that misfortune may be no one’s fault. A major breakthrough of the Scientific Revolution—perhaps its biggest breakthrough—was to refute the intuition that the universe is saturated with purpose.

Galileo, Newton, and Laplace replaced this cosmic morality play with a clockwork universe in which events are caused by conditions in the present, not goals for the future.

Not only does the universe not care about our desires, but in the natural course of events it will appear to thwart them, because there are so many more ways for things to go wrong than for them to go right.

Awareness of the indifference of the universe was deepened still further by an understanding of evolution.

As Adam Smith pointed out, what needs to be explained is wealth. Yet even today, when few people believe that accidents or diseases have perpetrators, discussions of poverty consist mostly of arguments about whom to blame.

Another implication of the Law of Entropy is that a complex system like an organism can easily be disabled, because its functioning depends on so many improbable conditions being satisfied at once.

So for all the flaws in human nature, it contains the seeds of its own improvement, as long as it comes up with norms and institutions that channel parochial interests into universal benefits. Among those norms are free speech, nonviolence, cooperation, cosmopolitanism, human rights, and an acknowledgment of human fallibility, and among the institutions are science, education, media, democratic government, international organizations, and markets. Not coincidentally, these were the major brainchildren of the Enlightenment.

And the second decade of the 21st century saw the rise of populist movements that blatantly repudiate the ideals of the Enlightenment.1 They are tribalist rather than cosmopolitan, authoritarian rather than democratic, contemptuous of experts rather than respectful of knowledge, and nostalgic for an idyllic past rather than hopeful for a better future.

The disdain for reason, science, humanism, and progress has a long pedigree in elite intellectual and artistic culture.

The Enlightenment was swiftly followed by a counter-Enlightenment, and the West has been divided ever since.

The Romantic movement pushed back particularly hard against Enlightenment ideals. Rousseau, Johann Herder, Friedrich Schelling, and others denied that reason could be separated from emotion, that individuals could be considered apart from their culture, that people should provide reasons for their acts, that values applied across times and places, and that peace and prosperity were desirable ends. A human is a part of an organic whole—a culture, race, nation, religion, spirit, or historical force—and people should creatively channel the transcendent unity of which they are a part. Heroic struggle, not the solving of problems, is the greatest good, and violence is inherent to nature and cannot be stifled without draining life of its vitality. “There are but three groups worthy of respect,” wrote Charles Baudelaire, “the priest, the warrior, and the poet. To know, to kill, and to create.”

The most obvious is religious faith.

Religions also commonly clash with humanism whenever they elevate some moral good above the well-being of humans, such as accepting a divine savior, ratifying a sacred narrative, enforcing rituals and taboos, proselytizing other people to do the same, and punishing or demonizing those who don’t.

A second counter-Enlightenment idea is that people are the expendable cells of a superorganism—a clan, tribe, ethnic group, religion, race, class, or nation—and that the supreme good is the glory of this collectivity rather than the well-being of the people who make it up. An obvious example is nationalism, in which the superorganism is the nation-state, namely an ethnic group with a government.

Nationalism should not be confused with civic values, public spirit, social responsibility, or cultural pride.

It’s quite another thing when a person is forced to make the supreme sacrifice for the benefit of a charismatic leader, a square of cloth, or colors on a map.

Religion and nationalism are signature causes of political conservatism, and continue to affect the fate of billions of people in the countries under their influence.

Left-wing and right-wing political ideologies have themselves become secular religions, providing people with a community of like-minded brethren, a catechism of sacred beliefs, a well-populated demonology, and a beatific confidence in the righteousness of their cause.

Political ideology undermines reason and science.7 It scrambles people’s judgment, inflames a primitive tribal mindset, and distracts them from a sounder understanding of how to improve the world. Our greatest enemies are ultimately not our political adversaries but entropy, evolution (in the form of pestilence and the flaws in human nature), and most of all ignorance—a shortfall of knowledge of how best to solve our problems.

For almost two centuries, a diverse array of writers has proclaimed that modern civilization, far from enjoying progress, is in steady decline and on the verge of collapse.

Declinism bemoans our Promethean dabbling with technology.9 By wresting fire from the gods, we have only given our species the means to end its own existence, if not by poisoning our environment then by loosing nuclear weapons, nanotechnology, cyberterror, bioterror, artificial intelligence, and other existential threats upon the world.

Another variety of declinism agonizes about the opposite problem—not that modernity has made life too harsh and dangerous, but that it has made it too pleasant and safe. According to these critics, health, peace, and prosperity are bourgeois diversions from what truly matters in life.

In the twilight of a decadent, degenerate civilization, true liberation is to be found not in sterile rationality or effete humanism but in an authentic, heroic, holistic, organic, sacred, vital being-in-itself and will to power.

Friedrich Nietzsche, who coined the term will to power, recommends the aristocratic violence of the “blond Teuton beasts” and the samurai, Vikings, and Homeric heroes: “hard, cold, terrible, without feelings and without conscience, crushing everything, and bespattering everything with blood.”

The historical pessimists dread the downfall but lament that we are powerless to stop it. The cultural pessimists welcome it with a “ghoulish schadenfreude.” Modernity is so bankrupt, they say, that it cannot be improved, only transcended.

A final alternative to Enlightenment humanism condemns its embrace of science. Following C. P. Snow, we can call it the Second Culture.

The Second Culture persists today. Many intellectuals and critics express a disdain for science as anything but a fix for mundane problems. They write as if the consumption of elite art is the ultimate moral good.

Intellectual magazines regularly denounce “scientism,” the intrusion of science into the territory of the humanities such as politics and the arts.

Science is commonly blamed for racism, imperialism, world wars, and the Holocaust.

Intellectuals hate progress.

It’s the idea of progress that rankles the chattering class—the Enlightenment belief that by understanding the world we can improve the human condition.

A modern optimist believes that the world can be much, much better than it is today. Voltaire was satirizing not the Enlightenment hope for progress but its opposite, the religious rationalization for suffering called theodicy, according to which God had no choice but to allow epidemics and massacres because a world without them is metaphysically impossible.

In The Idea of Decline in Western History, Arthur Herman shows that prophets of doom are the all-stars of the liberal arts curriculum, including Nietzsche, Arthur Schopenhauer, Martin Heidegger, Theodor Adorno, Walter Benjamin, Herbert Marcuse, Jean-Paul Sartre, Frantz Fanon, Michel Foucault, Edward Said, Cornel West, and a chorus of eco-pessimists.

Psychologists have long known that people tend to see their own lives through rose-colored glasses: they think they’re less likely than the average person to become the victim of a divorce, layoff, accident, illness, or crime. But change the question from their own lives to their society, and they transform from Pollyanna to Eeyore.

Public opinion researchers call it the Optimism Gap.

The news, far from being a “first draft of history,” is closer to play-by-play sports commentary.

The nature of news is likely to distort people’s view of the world because of a mental bug that the psychologists Amos Tversky and Daniel Kahneman called the Availability heuristic: people estimate the probability of an event or the frequency of a kind of thing by the ease with which instances come to mind.

Availability errors are a common source of folly in human reasoning.

Vacationers stay out of the water after they have read about a shark attack or if they have just seen Jaws.12 Plane crashes always make the news, but car crashes, which kill far more people, almost never do.

How can we soundly appraise the state of the world?

The answer is to count. How many people are victims of violence as a proportion of the number of people alive? How many are sick, how many starving, how many poor, how many oppressed, how many illiterate, how many unhappy? And are those numbers going up or down? A quantitative mindset, despite its nerdy aura, is in fact the morally enlightened one, because it treats every human life as having equal value rather than privileging the people who are closest to us or most photogenic.

Resistance to the idea of progress runs deeper than statistical fallacies.

Many people lack the conceptual tools to ascertain whether progress has taken place or not; the very idea that things can get better just doesn’t compute.

A decline is not the same thing as a disappearance. (The statement “x > y” is different from the statement “y = 0.”) Something can decrease a lot without vanishing altogether. That means that the level of violence today is completely irrelevant to the question of whether violence has declined over the course of history.

The only way to answer that question is to compare the level of violence now with the level of violence in the past. And whenever you look at the level of violence in the past, you find a lot of it, even if it isn’t as fresh in memory as the morning’s headlines.

No, the psychological roots of progressophobia run deeper. The deepest is a bias that has been summarized in the slogan “Bad is stronger than good.”21 The idea can be captured in a set of thought experiments suggested by Tversky.

The psychological literature confirms that people dread losses more than they look forward to gains, that they dwell on setbacks more than they savor good fortune, and that they are more stung by criticism than they are heartened by praise. (As a psycholinguist I am compelled to add that the English language has far more words for negative emotions than for positive ones.)

One exception to the Negativity bias is found in autobiographical memory. Though we tend to remember bad events as well as we remember good ones, the negative coloring of the misfortunes fades with time, particularly the ones that happened to us.24 We are wired for nostalgia: in human memory, time heals most wounds.

The cure for the Availability bias is quantitative thinking.

Trump was the beneficiary of a belief—near universal in American journalism—that “serious news” can essentially be defined as “what’s going wrong.” . . . For decades, journalism’s steady focus on problems and seemingly incurable pathologies was preparing the soil that allowed Trump’s seeds of discontent and despair to take root. . . . One consequence is that many Americans today have difficulty imagining, valuing or even believing in the promise of incremental system change, which leads to a greater appetite for revolutionary, smash-the-machine change.

The shift during the Vietnam and Watergate eras from glorifying leaders to checking their power—with an overshoot toward indiscriminate cynicism, in which everything about America’s civic actors invites an aggressive takedown.

Sentiment mining assesses the emotional tone of a text by tallying the number and contexts of words with positive and negative connotations, like good, nice, terrible, and horrific.
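
A minimal sketch of that kind of word-tally scoring (my own toy version; real sentiment-mining studies use large curated lexicons and handle context such as negation):

```python
# Score a text by tallying words with positive vs. negative connotations.
# The two word lists here are tiny stand-ins for a real sentiment lexicon.
import re

POSITIVE = {"good", "nice", "hope", "progress", "peace"}
NEGATIVE = {"terrible", "horrific", "crisis", "war", "decline"}

def sentiment_score(text: str) -> float:
    """(positive hits - negative hits) per 1,000 words; >0 means upbeat."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return 1000 * (pos - neg) / len(words)

print(sentiment_score("There is hope for progress and peace."))   # positive
print(sentiment_score("A decade of war, crisis, and decline."))   # negative
```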

Putting aside the wiggles and waves that reflect the crises of the day, we see that the impression that the news has become more negative over time is real. The New York Times got steadily more morose from the early 1960s to the early 1970s, lightened up a bit (but just a bit) in the 1980s and 1990s, and then sank into a progressively worse mood in the first decade of the new century.

And here is a shocker: The world has made spectacular progress in every single measure of human well-being. Here is a second shocker: Almost no one knows about it.

In the mid-18th century, life expectancy in Europe and the Americas was around 35, where it had been parked for the 225 previous years for which we have data.3 Life expectancy for the world as a whole was 29.

The life expectancy of hunter-gatherers is around 32.5, and it probably decreased among the peoples who first took up farming because of their starchy diet and the diseases they caught from their livestock and each other.

It returned to the low 30s by the Bronze Age, where it stayed put for thousands of years, with small fluctuations across centuries and regions.4 This period in human history may be called the Malthusian Era, when any advance in agriculture or health was quickly canceled by the resulting bulge in population, though “era” is an odd term for 99.9 percent of our species’ existence.

Progress is an outcome not of magic but of problem-solving.

Problems are inevitable, and at times particular sectors of humanity have suffered terrible setbacks.

Average life spans are stretched the most by decreases in infant and child mortality, both because children are fragile and because the death of a child brings down the average more than the death of a 60-year-old.

Are we really living longer, or are we just surviving infancy in greater numbers?

So do those of us who survive the ordeals of childbirth and childhood today live any longer than the survivors of earlier eras? Yes, much longer.

No matter how old you are, you have more years ahead of you than people of your age did in earlier decades and centuries.

The economist Steven Radelet has pointed out that “the improvements in health among the global poor in the last few decades are so large and widespread that they rank among the greatest achievements in human history. Rarely has the basic well-being of so many people around the world improved so substantially, so quickly. Yet few people are even aware that it is happening.”13

In his 2005 bestseller The Singularity Is Near, the inventor Ray Kurzweil forecasts that those of us who make it to 2045 will live forever, thanks to advances in genetics, nanotechnology (such as nanobots that will course through our bloodstream and repair our bodies from the inside), and artificial intelligence, which will not just figure out how to do all this but recursively improve its own intelligence without limit.

Lacking the gift of prophecy, no one can say whether scientists will ever find a cure for mortality. But evolution and entropy make it unlikely. Senescence is baked into our genome at every level of organization, because natural selection favors genes that make us vigorous when we are young over those that make us live as long as possible.

Peter Hoffman points out, “Life pits biology against physics in mortal combat.”

“Income—although important both in and of itself and as a component of wellbeing . . .—is not the ultimate cause of wellbeing.”16 The fruits of science are not just high-tech pharmaceuticals such as vaccines, antibiotics, antiretrovirals, and deworming pills. They also comprise ideas—ideas that may be cheap to implement and obvious in retrospect, but which save millions of lives. Examples include boiling, filtering, or adding bleach to water, and washing hands.

The historian Fernand Braudel has documented that premodern Europe suffered from famines every few decades.

Many of those who were not starving were too weak to work, which locked them into poverty.

As the comedian Chris Rock observed, “This is the first society in history where the poor people are fat.”

The pattern: hardship everywhere before the 19th century, rapid improvement in Europe and the United States over the next two centuries, and, in recent decades, the developing world catching up.

Fortunately, the numbers reflect an increase in the availability of calories throughout the range, including the bottom.

Figure 7-2 shows the proportion of children who are stunted in a representative sample of countries which have data for the longest spans of time.

We see that in just two decades the rate of stunting has been cut in half.

Not only has chronic undernourishment been in decline, but so have catastrophic famines—the crises that kill people in large numbers and cause widespread wasting (the condition of being two standard deviations below one’s expected weight).

Figure 7-4 shows the number of deaths in major famines in each decade for the past 150 years, scaled by world population at the time.

The link from crop failure to famine has been broken. Most recent drought- or flood-triggered food crises have been adequately met by a combination of local and international humanitarian response.

In 1798 Thomas Malthus explained that the frequent famines of his era were unavoidable and would only get worse, because “population, when unchecked, increases in a geometrical ratio. Subsistence increases only in an arithmetic ratio. A slight acquaintance with numbers will show the immensity of the first power in comparison with the second.” The implication was that efforts to feed the hungry would only lead to more misery, because they would breed more children who were doomed to hunger in their turn.
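
Malthus’s two ratios can be played out in a few lines (a sketch of his argument with made-up starting units and rates, not his actual figures):

```python
# Geometric population growth vs. arithmetic growth in subsistence, per
# Malthus's assumptions. The units and the doubling-per-generation rate
# are illustrative choices, not historical data.
population, food = 1.0, 1.0
for generation in range(1, 9):
    population *= 2   # geometric: fixed ratio each generation
    food += 1         # arithmetic: fixed increment each generation
    print(f"gen {generation}: population {population:6.0f}   food {food:.0f}")
# After 8 generations, population has grown 256-fold against a 9-fold
# food supply -- "the immensity of the first power in comparison with
# the second." The passage below explains where the assumptions fail.
```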

Where did Malthus’s math go wrong? Looking at the first of his curves, we already saw that population growth needn’t increase in a geometric ratio indefinitely, because when people get richer and more of their babies survive, they have fewer babies (see also figure 10-1). Conversely, famines don’t reduce population growth for long. They disproportionately kill children and the elderly, and when conditions improve, the survivors quickly replenish the population.13 As Hans Rosling put it, “You can’t stop population growth by letting poor children die.”14

Looking at the second curve, we discover that the food supply can grow geometrically when knowledge is applied to increase the amount of food that can be coaxed out of a patch of land. Since the birth of agriculture ten thousand years ago, humans have been genetically engineering plants and animals by selectively breeding the ones that had the most calories and fewest toxins and that were the easiest to plant and harvest.

Clever farmers also tinkered with irrigation, plows, and organic fertilizers, but Malthus always had the last word.

The moral imperative was explained to Gulliver by the King of Brobdingnag: “Whoever makes two ears of corn, or two blades of grass to grow where only one grew before, deserves better of humanity, and does more essential service to his country than the whole race of politicians put together.”

The British Agricultural Revolution.16 Crop rotation and improvements to plows and seed drills were followed by mechanization, with fossil fuels replacing human and animal muscle.

But the truly gargantuan boost would come from chemistry. The N in SPONCH, the acronym taught to schoolchildren for the chemical elements that make up the bulk of our bodies, stands for nitrogen, a major ingredient of protein, DNA, chlorophyll, and the energy carrier ATP. Nitrogen atoms are plentiful in the air but bound in pairs (hence the chemical formula N2), which are hard to split apart so that plants can use them.

Fertilizer on an industrial scale.

Over the past century, grain yields per hectare have swooped upward while real prices have plunged.

In the United States in 1901, an hour’s wages could buy around three quarts of milk; a century later, the same wages would buy sixteen quarts. The amount of every other foodstuff that can be bought with an hour of labor has multiplied as well: from a pound of butter to five pounds, a dozen eggs to twelve dozen, two pounds of pork chops to five pounds, and nine pounds of flour to forty-nine pounds.

In addition to beating back hunger, the ability to grow more food from less land has been, on the whole, good for the planet. Despite their bucolic charm, farms are biological deserts which sprawl over the landscape at the expense of forests and grasslands. Now that farms have receded in some parts of the world, temperate forests have been bouncing back.

High-tech agriculture, the critics said, consumes fossil fuels and groundwater, uses herbicides and pesticides, disrupts traditional subsistence agriculture, is biologically unnatural, and generates profits for corporations. Given that it saved a billion lives and helped consign major famines to the dustbin of history, this seems to me like a reasonable price to pay. More important, the price need not be with us forever. The beauty of scientific progress is that it never locks us into a technology but can develop new ones with fewer problems than the old ones (a dynamic we will return to here).

(There is no such thing as a genetically unmodified crop.) Yet traditional environmentalist groups, with what the ecology writer Stewart Brand has called their “customary indifference to starvation,” have prosecuted a fanatical crusade to keep transgenic crops from people—not just from whole-food gourmets in rich countries but from poor farmers in developing ones.

“Poverty has no causes,” wrote the economist Peter Bauer.

History is written not so much by the victors as by the affluent, the sliver of humanity with the leisure and education to write about it.

Norberg, drawing on Braudel, offers vignettes of this era of misery, when the definition of poverty was simple: “if you could afford to buy bread to survive another day, you were not poor.”

Economists speak of a “lump fallacy” or “physical fallacy” in which a finite amount of wealth has existed since the beginning of time, like a lode of gold, and people have been fighting over how to divide it up ever since.4 Among the brainchildren of the Enlightenment is the realization that wealth is created.5 It is created primarily by knowledge and cooperation: networks of people arrange matter into improbable but useful configurations and combine the fruits of their ingenuity and labor. The corollary, just as radical, is that we can figure out how to make more of it.

The endurance of poverty and the transition to modern affluence can be shown in a simple but stunning graph. It plots, for the past two thousand years, a standard measure of wealth creation, the Gross World Product, measured in 2011 international dollars.

The story of the growth of prosperity in human history depicted in figure 8-1 is close to: nothing . . . nothing . . . nothing . . . (repeat for a few thousand years) . . . boom! A millennium after the year 1 CE, the world was barely richer than it was at the time of Jesus.

Starting in the 19th century, the increments turned into leaps and bounds. Between 1820 and 1900, the world’s income tripled. It tripled again in a bit more than fifty years. It took only twenty-five years for it to triple again, and another thirty-three years to triple yet another time. The Gross World Product today has grown almost a hundredfold since the Industrial Revolution was in place in 1820, and almost two hundredfold from the start of the Enlightenment in the 18th century.
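
As a sanity check on that arithmetic (my own calculation; the period end-years are approximations back-solved from the durations in the passage), four successive triplings compound to 3^4 = 81, which squares with “almost a hundredfold”:

```python
# Four successive triplings of world income and the annual growth rate each
# implies. End-years are approximate, inferred from the durations given.
periods = [(1820, 1900), (1900, 1955), (1955, 1980), (1980, 2013)]
cumulative = 1
for start, end in periods:
    cumulative *= 3
    annual = 3 ** (1 / (end - start)) - 1   # rate implied by one tripling
    print(f"{start}-{end}: x3 (~{annual:.1%}/yr), cumulative x{cumulative}")
# cumulative ends at 81 -- consistent with "almost a hundredfold" since 1820.
```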

Indeed, the Gross World Product is a gross underestimate of the expansion of prosperity.

Adam Smith called it the paradox of value: when an important good becomes plentiful, it costs far less than what people are willing to pay for it. The difference is called consumer surplus, and the explosion of this surplus over time is impossible to tabulate.

The economic historian Joel Mokyr calls this “the enlightened economy.”8 The machines and factories of the Industrial Revolution, the productive farms of the Agricultural Revolution, and the water pipes of the Public Health Revolution could deliver more clothes, tools, vehicles, books, furniture, calories, clean water, and other things that people want than the craftsmen and farmers of a century before.

“After 1750 the epistemic base of technology slowly began to expand. Not only did new products and techniques emerge; it became better understood why and how the old ones worked, and thus they could be refined, debugged, improved, combined with others in novel ways and adapted to new uses.”

One was the development of institutions that lubricated the exchange of goods, services, and ideas—the dynamic singled out by Adam Smith as the generator of wealth. The economists Douglass North, John Wallis, and Barry Weingast argue that the most natural way for states to function, both in history and in many parts of the world today, is for elites to agree not to plunder and kill each other, in exchange for which they are awarded a fief, franchise, charter, monopoly, turf, or patronage network that allows them to control some sector of the economy and live off the rents (in the economist’s sense of income extracted from exclusive access to a resource).

The third innovation, after science and institutions, was a change in values: an endorsement of what the economic historian Deirdre McCloskey calls bourgeois virtue.12 Aristocratic, religious, and martial cultures have always looked down on commerce as tawdry and venal. But in 18th-century England and the Netherlands, commerce came to be seen as moral and uplifting. Voltaire and other Enlightenment philosophes valorized the spirit of commerce for its ability to dissolve sectarian hatreds.

“The Enlightenment thus translated the ultimate question ‘How can I be saved?’ into the pragmatic ‘How can I be happy?’—thereby heralding a new praxis of personal and social adjustment.”

In 1905 the sociologist Max Weber proposed that capitalism depended on a “Protestant ethic” (a hypothesis with the intriguing prediction that Jews should fare poorly in capitalist societies, particularly in business and finance). In any case the Catholic countries of Europe soon zoomed out of poverty too, and a succession of other escapes shown in figure 8-2 have put the lie to various theories explaining why Buddhism, Confucianism, Hinduism, or generic “Asian” or “Latin” values were incompatible with dynamic market economies.

Starting in the late 20th century, poor countries have been escaping from poverty in their turn. The Great Escape is becoming the Great Convergence.

Extreme poverty is being eradicated, and the world is becoming middle class.

In 1800, at the dawn of the Industrial Revolution, most people everywhere were poor. The average income was equivalent to that in the poorest countries in Africa today (about $500 a year in international dollars), and almost 95 percent of the world lived in what counts today as “extreme poverty” (less than $1.90 a day). By 1975, Europe and its offshoots had completed the Great Escape, leaving the rest of the world behind, with one-tenth their income, in the lower hump of a camel-shaped curve.20 In the 21st century the camel has become a dromedary, with a single hump shifted to the right and a much lower tail on the left: the world has become richer and more equal.

In two hundred years the rate of extreme poverty in the world has tanked from 90 percent to 10, with almost half that decline occurring in the last thirty-five years.

Also, an increase in the number of people who can withstand the grind of entropy and the struggle of evolution is a testimonial to the sheer magnitude of the benevolent powers of science, markets, good government, and other modern institutions.

“In 1976,” Radelet writes, “Mao single-handedly and dramatically changed the direction of global poverty with one simple act: he died.”

The death of Mao Zedong is emblematic of three of the major causes of the Great Convergence.

The first is the decline of communism (together with intrusive socialism). For reasons we have seen, market economies can generate wealth prodigiously while totalitarian planned economies impose scarcity, stagnation, and often famine.

A shift from collectivization, centralized control, government monopolies, and suffocating permit bureaucracies (what in India was called “the license raj”) to open economies took place on a number of fronts beginning in the 1980s. They included Deng Xiaoping’s embrace of capitalism in China, the collapse of the Soviet Union and its domination of Eastern Europe, and the liberalization of the economies of India, Brazil, Vietnam, and other countries.

It’s important to add that the market economies which blossomed in the more fortunate parts of the developing world were not the laissez-faire anarchies of right-wing fantasies and left-wing nightmares. To varying degrees, their governments invested in education, public health, infrastructure, and agricultural and job training, together with social insurance and poverty-reduction programs.35

Radelet’s second explanation of the Great Convergence is leadership.

During the decades of stagnation from the 1970s to the early 1990s, many other developing countries were commandeered by psychopathic strongmen with ideological, religious, tribal, paranoid, or self-aggrandizing agendas rather than a mandate to enhance the well-being of their citizens.

The 1990s and 2000s saw a spread of democracy (chapter 14) and the rise of levelheaded, humanistic leaders—not just national statesmen like Nelson Mandela, Corazon Aquino, and Ellen Johnson Sirleaf but local religious and civil-society leaders acting to improve the lives of their compatriots.38

A third cause was the end of the Cold War. It not only pulled the rug out from under a number of tinpot dictators but snuffed out many of the civil wars that had racked developing countries since they attained independence in the 1960s.

A fourth cause is globalization, in particular the explosion in trade made possible by container ships and jet airplanes and by the liberalization of tariffs and other barriers to investment and trade. Classical economics and common sense agree that a larger trading network should make everyone, on average, better off.

Radelet observes that “while working on the factory floor is often referred to as sweatshop labor, it is often better than the granddaddy of all sweatshops: working in the fields as an agricultural day laborer.”

Over the course of a generation, slums, barrios, and favelas can morph into suburbs, and the working class can become middle class.47

Progress consists of unbundling the features of a social process as much as we can to maximize the human benefits while minimizing the harms.

The last, and in many analyses the most important, contributor to the Great Convergence is science and technology.49 Life is getting cheaper, in a good way. Thanks to advances in know-how, an hour of labor can buy more food, health, education, clothing, building materials, and small necessities and luxuries than it used to. Not only can people eat cheaper food and take cheaper medicines, but children can wear cheap plastic sandals instead of going barefoot, and adults can hang out together getting their hair done or watching a soccer game using cheap solar panels and appliances.

Today about half the adults in the world own a smartphone, and there are as many subscriptions as people. In parts of the world without roads, landlines, postal service, newspapers, or banks, mobile phones are more than a way to share gossip and cat photos; they are a major generator of wealth. They allow people to transfer money, order supplies, track the weather and markets, find day labor, get advice on health and farming practices, even obtain a primary education.

Quality of life.

Health, longevity, and education are so much more affordable than they used to be.

Everyone is living longer regardless of income.55 In the richest country two centuries ago (the Netherlands), life expectancy was just forty, and in no country was it above forty-five.

Today, life expectancy in the poorest country in the world (the Central African Republic) is fifty-four, and in no country is it below forty-five.56

GDP per capita correlates with longevity, health, and nutrition.57 Less obviously, it correlates with higher ethical values like peace, freedom, human rights, and tolerance.

Between 2009 and 2016, the proportion of articles in the New York Times containing the word inequality soared tenfold, reaching 1 in 73.1

The Great Recession began in 2007.

In the United States, the share of income going to the richest one percent grew from 8 percent in 1980 to 18 percent in 2015, while the share going to the richest tenth of one percent grew from 2 percent to 8 percent.4

I need a chapter on the topic because so many people have been swept up in the dystopian rhetoric and see inequality as a sign that modernity has failed to improve the human condition. As we will see, this is wrong, and for many reasons.

Income inequality is not a fundamental component of well-being.

The point is made with greater nuance by the philosopher Harry Frankfurt in his 2015 book On Inequality.5 Frankfurt argues that inequality itself is not morally objectionable; what is objectionable is poverty. If a person lives a long, healthy, pleasurable, and stimulating life, then how much money the Joneses earn, how big their house is, and how many cars they drive are morally irrelevant. Frankfurt writes, “From the point of view of morality, it is not important everyone should have the same. What is morally important is that each should have enough.”

Lump fallacy—the mindset in which wealth is a finite resource.

Since the Industrial Revolution, it has expanded exponentially. That means that when the rich get richer, the poor can get richer, too.

“The poorer half of the population are as poor today as they were in the past, with barely 5 percent of total wealth in 2010, just as in 1910.”8 But total wealth today is vastly greater than it was in 1910, so if the poorer half own the same proportion, they are far richer, not “as poor.”

Among the world’s billionaires is J. K. Rowling, author of the Harry Potter novels, which have sold more than 400 million copies and have been adapted into a series of films seen by a similar number of people.10 Suppose that a billion people have handed over $10 each for the pleasure of a Harry Potter paperback or movie ticket, with a tenth of the proceeds going to Rowling. She has become a billionaire, increasing inequality, but she has made people better off, not worse off (which is not to say that every rich person has made people better off).

Her wealth arose as a by-product of the voluntary decisions of billions of book buyers and moviegoers.

When the rich get too rich, everyone else feels poor, so inequality lowers well-being even if everyone gets richer. This is an old idea in social psychology, variously called the theory of social comparison, reference groups, status anxiety, or relative deprivation.

We will see in chapter 18 that richer people and people in richer countries are (on average) happier than poorer people and people in poorer countries.

In their well-known book The Spirit Level, the epidemiologists Richard Wilkinson and Kate Pickett claim that countries with greater income inequality also have higher rates of homicide, imprisonment, teen pregnancy, infant mortality, physical and mental illness, social distrust, obesity, and substance abuse.14

The Spirit Level theory has been called “the left’s new theory of everything,” and it is as problematic as any other theory that leaps from a tangle of correlations to a single-cause explanation.

Wilkinson and Pickett’s sample was restricted to developed countries, but even within that sample the correlations are evanescent, coming and going with choices about which countries to include.

Kelley and Evans held constant the major factors that are known to affect happiness, including GDP per capita, age, sex, education, marital status, and religious attendance, and found that the theory that inequality causes unhappiness “comes to shipwreck on the rock of the facts.”

The authors suggest that whatever envy, status anxiety, or relative deprivation people may feel in poor, unequal countries is swamped by hope. Inequality is seen as a harbinger of opportunity, a sign that education and other routes to upward mobility might pay off for them and their children.

People are content with economic inequality as long as they feel that the country is meritocratic, and they get angry when they feel it isn’t. Narratives about the causes of inequality loom larger in people’s minds than the existence of inequality. That creates an opening for politicians to rouse the rabble by singling out cheaters who take more than their fair share: welfare queens, immigrants, foreign countries, bankers, or the rich, sometimes identified with ethnic minorities.18

Investment in research and infrastructure to escape economic stagnation, regulation of the finance sector to reduce instability, broader access to education and job training to facilitate economic mobility, electoral transparency and finance reform to eliminate illicit influence, and so on.

Economic inequality, then, is not itself a dimension of human well-being, and it should not be confused with unfairness or with poverty. Let’s now turn from the moral significance of inequality to the question of why it has changed over time.

The simplest narrative of the history of inequality is that it comes with modernity.

Inequality, in this story, started at zero, and as wealth increased over time, inequality grew with it. But the story is not quite right.

The image of forager egalitarianism is misleading. For one thing, the hunter-gatherer bands that are still around for us to study are not representative of an ancestral way of life, because they have been pushed into marginal lands and lead nomadic lives that make the accumulation of wealth impossible, if for no other reason than that it would be a nuisance to carry around. But sedentary hunter-gatherers, such as the natives of the Pacific Northwest, which is flush with salmon, berries, and fur-bearing animals, were florid inegalitarians, and developed a hereditary nobility who kept slaves, hoarded luxuries, and flaunted their wealth in gaudy potlatches.

They are less likely to share plant foods, since gathering is a matter of effort, and indiscriminate sharing would allow free-riding.

What happens when a society starts to generate substantial wealth? An increase in absolute inequality (the difference between the richest and poorest) is almost a mathematical necessity.

Some people are bound to take greater advantage of the new opportunities than others, whether by luck, skill, or effort, and they will reap disproportionate rewards.

As the Industrial Revolution gathered steam, European countries made a Great Escape from universal poverty, leaving the other countries behind.

What’s significant about the decline in inequality is that it’s a decline in poverty.

But then, starting around 1980, inequality bounced into a decidedly un-Kuznetsian rise.

The rise and fall in inequality in the 19th century reflects Kuznets’s expanding economy, which gradually pulls more people into urban, skilled, and thus higher-paying occupations. But the 20th-century plunge—which has been called the Great Leveling or the Great Compression—had more sudden causes. The plunge overlaps the two world wars, and that is no coincidence: major wars often level the income distribution.

The historian Walter Scheidel identifies “Four Horsemen of Leveling”: mass-mobilization warfare, transformative revolution, state collapse, and lethal pandemics.

The four horsemen reduce inequality by killing large numbers of workers, driving up the wages of those who survive.

But modernity has brought a more benign way to reduce inequality. As we have seen, a market economy is the best poverty-reduction program we know of for an entire country.

(Another way of putting it is that a market economy maximizes the average, but we also care about the variance and the range.) As the circle of sympathy in a country expands to encompass the poor (and as people want to insure themselves should they ever become poor), they increasingly allocate a portion of their pooled resources—that is, government funds—to alleviating that poverty.

The net result is “redistribution,” but that is something of a misnomer, because the goal is to raise the bottom, not lower the top, even if in practice the top is lowered.

Figure 9-4 shows that social spending took off in the middle decades of the 20th century (in the United States, with the New Deal in the 1930s; in other developed countries, with the rise of the welfare state after World War II). Social spending now takes up a median of 22 percent of GDP across developed countries.31

The explosion in social spending has redefined the mission of government: from warring and policing to also nurturing.32 Governments underwent this transformation for several reasons. Social spending inoculates citizens against the appeal of communism and fascism. Some of the benefits, like universal education and public health, are public goods that accrue to everyone, not just the direct beneficiaries.

Social spending is designed to help people who have less money, with the bill footed by people who have more money. This is the principle known as redistribution, the welfare state, social democracy, or socialism (misleadingly, because free-market capitalism is compatible with any amount of social spending).

The United States is famously resistant to anything smacking of redistribution. Yet it allocates 19 percent of its GDP to social services, and despite the best efforts of conservatives and libertarians the spending has continued to grow. The most recent expansions are a prescription drug benefit introduced by George W. Bush and the eponymous health insurance plan known as Obamacare introduced by his successor.

Many Americans are forced to pay for health, retirement, and disability benefits through their employers rather than the government. When this privately administered social spending is added to the public portion, the United States vaults from twenty-fourth into second place among the thirty-five OECD countries, just behind France.34

Social spending, like everything, has downsides. As with all insurance, it can create a “moral hazard” in which the insured slack off or take foolish risks, counting on the insurer to bail them out if they fail.

The rise of inequality in wealthy nations that began around 1980 is the development that inspired the claim that life has gotten worse for everyone but the richest.

A “second industrial revolution” driven by electronic technologies replayed the Kuznets rise by creating a demand for highly skilled professionals, who pulled away from the less educated at the same time that the jobs requiring less education were eliminated by automation. Globalization allowed workers in China, India, and elsewhere to underbid their American competitors in a worldwide labor market, and the domestic companies that failed to take advantage of these offshoring opportunities were outcompeted on price.

Declining inequality worldwide and increasing inequality within rich countries combine into a single graph which pleasingly takes the shape of an elephant (figure 9-5).

The cliché about globalization is that it creates winners and losers, and the elephant curve displays them as peaks and valleys. It reveals that the winners include most of humanity. The elephant’s bulk (its body and head), which includes about seven-tenths of the world’s population, consists of the “emerging global middle class,” mainly in Asia. Over this period they saw cumulative gains of 40 to 60 percent in their real incomes. The nostrils at the tip of the trunk consist of the world’s richest one percent, who also saw their incomes soar.

Globalization’s “losers”: the lower middle classes of the rich world, who gained less than 10 percent. These are the focus of the new concern about inequality: the “hollowed-out middle class,” the Trump supporters, the people globalization left behind.

The rich certainly have prospered more than anyone else, perhaps more than they should have, but the claim about everyone else is not accurate, for a number of reasons.

Most obviously, it’s false for the world as a whole: the majority of the human race has become much better off. The two-humped camel has become a one-humped dromedary.

Extreme poverty has plummeted and may disappear; and both international and global inequality coefficients are in decline. Now, it’s true that the world’s poor have gotten richer in part at the expense of the American lower middle class, and if I were an American politician I would not publicly say that the tradeoff was worth it. But as citizens of the world considering humanity as a whole, we have to say that the tradeoff is worth it.

Today’s discussions of inequality often compare the present era unfavorably with a golden age of well-paying, dignified, blue-collar jobs that have been made obsolete by automation and globalization.

What’s relevant to well-being is how much people earn, not how high they rank.

Stephen Rose divided the American population into classes using fixed milestones rather than quantiles. “Poor” was defined as an income of $0–$30,000 (in 2014 dollars) for a family of three, “lower middle class” as $30,000–$50,000, and so on.46 The study found that in absolute terms, Americans have been moving on up. Between 1979 and 2014, the percentage of poor Americans dropped from 24 to 20, while a growing share climbed into the upper middle class ($100,000–$350,000).

The middle class is being hollowed out in part because so many Americans are becoming affluent. Inequality undoubtedly increased—the rich got richer faster than the poor and middle class got richer—but everyone (on average) got richer.
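Pinker’s point about fixed milestones versus quantiles is easy to gloss over, so here is a quick sketch of my own (the $30,000 and $50,000 cutoffs and the $100,000–$350,000 band are from the study as quoted; the band in between is my inference from the “and so on”):

```python
# Classifying households by fixed income milestones (2014 dollars) rather
# than quantiles: membership depends only on absolute income, so every
# class can shrink or swell over time -- unlike quantiles, where the
# bottom fifth is the bottom fifth by definition.
def income_class(income: float) -> str:
    if income < 30_000:
        return "poor"
    if income < 50_000:
        return "lower middle class"
    if income < 100_000:
        return "middle class"        # inferred band, for illustration
    if income < 350_000:
        return "upper middle class"
    return "rich"

print(income_class(45_000))    # lower middle class
print(income_class(120_000))   # upper middle class
```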

A third reason that rising inequality has not made the lower classes worse off is that low incomes have been mitigated by social transfers. For all its individualist ideology, the United States has a lot of redistribution. The income tax is still graduated, and low incomes are buffered by a “hidden welfare state” that includes unemployment insurance, Social Security, Medicare, Medicaid, Temporary Assistance for Needy Families, food stamps, and the Earned Income Tax Credit, a kind of negative income tax in which the government boosts the income of low earners. Put them together and America becomes far less unequal.

The United States has not gone as far as countries like Germany and Finland.

Some kind of welfare state may be found in all developed countries, and it reduces inequality even when it is hidden.50

The sociologist Christopher Jencks has calculated that when the benefits from the hidden welfare state are added up, and the cost of living is estimated in a way that takes into account the improving quality and falling price of consumer goods, the poverty rate has fallen in the past fifty years by more than three-quarters, and in 2013 stood at 4.8 percent.

The progress stagnated around the time of the Great Recession, but it picked up in 2015 and 2016 (not shown in the graph), when middle-class income reached a record high and the poverty rate showed its largest drop since 1999.54

The unsheltered homeless fell in number between 2007 and 2015 by almost a third, despite the Great Recession.55

Income is just a means to an end: a way of paying for things that people need, want, and like, or as economists gracelessly call it, consumption. When poverty is defined in terms of what people consume rather than what they earn, we find that the American poverty rate has declined by 90 percent since 1960, from 30 percent of the population to just 3 percent. The two forces that have famously increased inequality in income have at the same time decreased inequality in what matters.

Together, technology and globalization have transformed what it means to be a poor person, at least in developed countries.

The poor used to be called the have-nots. In 2011, more than 95 percent of American households below the poverty line had electricity, running water, flush toilets, a refrigerator, a stove, and a color TV.58 (A century and a half before, the Rothschilds, Astors, and Vanderbilts had none of these things.)

The rich have gotten richer, but their lives haven’t gotten that much better. Warren Buffett may have more air conditioners than most people, or better ones, but by historical standards the fact that a majority of poor Americans even have an air conditioner is astonishing.

Though disposable income has increased, the pace of the increase is slow, and the resulting lack of consumer demand may be dragging down the economy as a whole.62 The hardships faced by one sector of the population—middle-aged, less-educated, non-urban white Americans—are real and tragic, manifested in higher rates of drug overdose (chapter 12) and suicide.

Truck drivers, for example, make up the most common occupation in a majority of states, and self-driving vehicles may send them the way of scriveners, wheelwrights, and switchboard operators. Education, a major driver of economic mobility, is not keeping up with the demands of modern economies: tertiary education has soared in cost (defying the inexpensification of almost every other good), and in poor American neighborhoods, primary and secondary education are unconscionably substandard. Many parts of the American tax system are regressive, and money buys too much political influence.

Rather than tilting at inequality per se it may be more constructive to target the specific problems lumped with it.65 An obvious priority is to boost the rate of economic growth, since it would increase everyone’s slice of the pie and provide more pie to redistribute.

The next step in the historic trend toward greater social spending may be a universal basic income (or its close relative, a negative income tax).

Despite its socialist aroma, the idea has been championed by economists (such as Milton Friedman), politicians (such as Richard Nixon), and states (such as Alaska) that are associated with the political right, and today analysts across the political spectrum are toying with it.

It could rationalize the kludgy patchwork of the hidden welfare state, and it could turn the slow-motion disaster of robots replacing workers into a horn of plenty. Many of the jobs that robots will take over are jobs that people don’t particularly enjoy, and the dividend in productivity, safety, and leisure could be a boon to humanity as long as it is widely shared.

Inequality is not the same as poverty, and it is not a fundamental dimension of human flourishing. In comparisons of well-being across countries, it pales in importance next to overall wealth. An increase in inequality is not necessarily bad: as societies escape from universal poverty, they are bound to become more unequal, and the uneven surge may be repeated when a society discovers new sources of wealth.

THE ENVIRONMENT

The key idea is that environmental problems, like other problems, are solvable, given the right knowledge.

Beginning in the 1960s, the environmental movement grew out of scientific knowledge (from ecology, public health, and earth and atmospheric sciences) and a Romantic reverence for nature.

In this chapter I will present a newer conception of environmentalism which shares the goal of protecting the air and water, species, and ecosystems but is grounded in Enlightenment optimism rather than Romantic declinism.

This newer conception goes by several names: Ecomodernism, Ecopragmatism, Earth Optimism, Enlightenment Environmentalism, or Humanistic Environmentalism.3

Ecomodernism begins with the realization that some degree of pollution is an inescapable consequence of the Second Law of Thermodynamics. When people use energy to create a zone of structure in their bodies and homes, they must increase entropy elsewhere in the environment in the form of waste, pollution, and other forms of disorder.

When native peoples first set foot in an ecosystem, they typically hunted large animals to extinction, and often burned and cleared vast swaths of forest.

When humans took up farming, they became more disruptive still.

A second realization of the ecomodernist movement is that industrialization has been good for humanity.8 It has fed billions, doubled life spans, slashed extreme poverty, and, by replacing muscle with machinery, made it easier to end slavery, emancipate women, and educate children (chapters 7, 15, and 17). It has allowed people to read at night, live where they want, stay warm in winter, see the world, and multiply human contact. Any costs in pollution and habitat loss have to be weighed against these gifts.

The third premise is that the tradeoff that pits human well-being against environmental damage can be renegotiated by technology. How to enjoy more calories, lumens, BTUs, bits, and miles with less pollution and land is itself a technological problem, and one that the world is increasingly solving.

Figure 10-1 shows that the world population growth rate peaked at 2.1 percent a year in 1962, fell to 1.2 percent by 2010, and will probably fall to less than 0.5 percent by 2050 and be close to zero around 2070, when the population is projected to level off and then decline.
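To see what those growth rates mean, here is the standard doubling-time arithmetic (my own rule-of-thumb calculation, not a figure from the book):

```python
# Doubling time T = ln(2) / r for a population growing at constant rate r.
import math

for rate in (0.021, 0.012, 0.005):
    years = math.log(2) / rate
    print(f"{rate:.1%} per year -> doubles in ~{years:.0f} years")
# 2.1% -> ~33 years; 1.2% -> ~58 years; 0.5% -> ~139 years
```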

The other scare from the 1960s was that the world would run out of resources. But resources just refuse to run out. The 1980s came and went without the famines that were supposed to starve tens of millions of Americans and billions of people worldwide. Then the year 1992 passed and, contrary to projections from the 1972 bestseller The Limits to Growth and similar philippics, the world did not exhaust its aluminum, copper, chromium, gold, nickel, tin, tungsten, or zinc.

From the 1970s to the early 2000s newsmagazines periodically illustrated cover stories on the world’s oil supply with a gas gauge pointing to Empty. In 2013 The Atlantic ran a cover story about the fracking revolution entitled “We Will Never Run Out of Oil.”

And the Rare Earths War? In reality, when China squeezed its exports in 2010 (not because of shortages but as a geopolitical and mercantilist weapon), other countries started extracting rare earths from their own mines, recycling them from industrial waste, and re-engineering products so they no longer needed them.15

Instead, as the most easily extracted supply of a resource becomes scarcer, its price rises, encouraging people to conserve it, get at the less accessible deposits, or find cheaper and more plentiful substitutes.

In reality, societies have always abandoned a resource for a better one long before the old one was exhausted.

In The Big Ratchet: How Humanity Thrives in the Face of Natural Crisis, the geographer Ruth DeFries describes the sequence as “ratchet-hatchet-pivot.” People discover a way of growing more food, and the population ratchets upward. The method fails to keep up with the demand or develops unpleasant side effects, and the hatchet falls. People then pivot to a new method.

Figure 10-3 shows that since 1970, when the Environmental Protection Agency was established, the United States has slashed its emissions of five air pollutants by almost two-thirds. Over the same period, the population grew by more than 40 percent, and those people drove twice as many miles and became two and a half times richer. Energy use has leveled off, and even carbon dioxide emissions have turned a corner, a point to which we will return.
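A back-of-envelope calculation of my own shows the decoupling implied by those figures (reading “almost two-thirds” as emissions falling to roughly a third of the 1970 level):

```python
# Emission intensity = pollution per unit of total income, relative to 1970.
emissions = 1 / 3        # pollutant emissions, as a fraction of 1970 levels
activity = 1.4 * 2.5     # population growth x per-capita income growth
print(f"emission intensity vs. 1970: {emissions / activity:.2f}")
# ~0.10 -- pollution per unit of total income fell by roughly 90 percent
```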

These reductions mainly reflect gains in efficiency and emission control.

Though tropical forests are still, alarmingly, being cut down, between the middle of the 20th century and the turn of the 21st the rate fell by two-thirds (figure 10-4).24 Deforestation of the world’s largest tropical forest, the Amazon, peaked in 1995, and from 2004 to 2013 the rate fell by four-fifths.25

Thanks to habitat protection and targeted conservation efforts, many beloved species have been pulled from the brink of extinction, including albatrosses, condors, manatees, oryxes, pandas, rhinoceroses, Tasmanian devils, and tigers; according to the ecologist Stuart Pimm, the rate of bird extinctions has been reduced by 75 percent.31 Though many species remain in precarious straits, a number of ecologists and paleontologists believe that the claim that humans are causing a mass extinction like the Permian and Cretaceous is hyperbolic.

One key is to decouple productivity from resources: to get more human benefit from less matter and energy. This puts a premium on density.36 As agriculture becomes more intensive by growing crops that are bred or engineered to produce more protein, calories, and fiber with less land, water, and fertilizer, farmland is spared, and it can morph back to natural habitats. (Ecomodernists point out that organic farming, which needs far more land to produce a kilogram of food, is neither green nor sustainable.)

All these processes are helped along by another friend of the Earth, dematerialization. Progress in technology allows us to do more with less.

Digital technology is also dematerializing the world by enabling the sharing economy, so that cars, tools, and bedrooms needn’t be made in huge numbers that sit around unused most of the time.

Hipsterization leads young city dwellers to distinguish themselves by their tastes in beer, coffee, and music.

Just as we must not accept the narrative that humanity inexorably despoils every part of the environment, we must not accept the narrative that every part of the environment will rebound under our current practices.

If the emission of greenhouse gases continues, the Earth’s average temperature will rise to at least 1.5°C (2.7°F) above the preindustrial level by the end of the 21st century, and perhaps to 4°C (7.2°F) above that level or more. That will cause more frequent and more severe heat waves, more floods in wet regions, more droughts in dry regions, heavier storms, more severe hurricanes, lower crop yields in warm regions, the extinction of more species, the loss of coral reefs (because the oceans will be both warmer and more acidic), and an average rise in sea level of between 0.7 and 1.2 meters (2 and 4 feet) from both the melting of land ice and the expansion of seawater. (Sea level has already risen almost eight inches since 1870, and the rate of the rise appears to be accelerating.) Low-lying areas would be flooded, island nations would disappear beneath the waves, large stretches of farmland would no longer be arable, and millions of people would be displaced. The effects could get still worse in the 22nd century and beyond, and in theory could trigger upheavals such as a diversion of the Gulf Stream (which would turn Europe into Siberia) or a collapse of Antarctic ice sheets.

A recent survey found that exactly four out of 69,406 authors of peer-reviewed articles in the scientific literature rejected the hypothesis of anthropogenic global warming, and that “the peer-reviewed literature contains no convincing evidence against [the hypothesis].”

Nonetheless, a movement within the American political right, heavily underwritten by fossil fuel interests, has prosecuted a fanatical and mendacious campaign to deny that greenhouse gases are warming the planet.47

The problem is that carbon emissions are a classic public goods game, also known as a Tragedy of the Commons. People benefit from everyone else’s sacrifices and suffer from their own, so everyone has an incentive to be a free rider and let everyone else make the sacrifice, and everyone suffers. A standard remedy for public goods dilemmas is a coercive authority that can punish free riders. But any government with the totalitarian power to abolish artistic pottery is unlikely to restrict that power to maximizing the common good. One can, alternatively, daydream

Most important, the sacrifice needed to bring carbon emissions down by half and then to zero is far greater than forgoing jewelry: it would require forgoing electricity, heating, cement, steel, paper, travel, and affordable food and clothing.
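The free-rider logic is worth making concrete. In this toy payoff model (my numbers, purely illustrative), not sacrificing dominates no matter what others do, yet universal sacrifice beats universal free-riding:

```python
# Toy public-goods game: each of N players chooses whether to cut
# emissions at private cost COST; every cut pays BENEFIT to every player.
N = 10
COST = 4.0      # private cost of cutting one's own emissions
BENEFIT = 2.0   # benefit to each player per player who cuts

def payoff(i_cut: bool, others_cutting: int) -> float:
    cutters = others_cutting + (1 if i_cut else 0)
    return BENEFIT * cutters - (COST if i_cut else 0.0)

# Whatever the others do, free-riding pays more for the individual...
assert all(payoff(False, k) > payoff(True, k) for k in range(N))
# ...yet everyone cutting beats everyone free-riding:
print(payoff(True, N - 1), ">", payoff(False, 0))  # 16.0 > 0.0
```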

Escaping from poverty requires abundant energy.

Economic progress is an imperative in rich and poor countries alike precisely because it will be needed to adapt to the climate change that does occur. Thanks in good part to prosperity, humanity has been getting healthier (chapters 5 and 6), better fed (chapter 7), more peaceful (chapter 11), and better protected from natural hazards and disasters (chapter 12). These advances have made humanity more resilient to natural and human-made threats: disease outbreaks don’t become pandemics, crop failures in one region are alleviated by surpluses in another, local skirmishes are defused before they erupt into war, populations are better protected against storms, floods, and droughts.

The enlightened response to climate change is to figure out how to get the most energy with the least emission of greenhouse gases. There is, to be sure, a tragic view

Ausubel notes that the modern world has been progressively decarbonizing.

Annual CO2 emissions may have leveled off for the time being at around 36 billion tons, but that’s still a lot of CO2 added to the atmosphere every year, and there is no sign of the precipitous plunge we would need to stave off the harmful outcomes. Instead, decarbonization needs to be helped along with pushes from policy and technology, an idea called deep decarbonization.73

A second key to deep decarbonization brings up an inconvenient truth for the traditional Green movement: nuclear power is the world’s most abundant and scalable carbon-free energy source.

Nuclear energy, in contrast, represents the ultimate in density.

It’s often said that with climate change, those who know the most are the most frightened, but with nuclear power, those who know the most are the least frightened.

“The French have two kinds of reactors and hundreds of kinds of cheese, whereas in the United States the figures are reversed.”89

The benefits of advanced nuclear energy are incalculable.

An energy source that is cheaper, denser, and cleaner than fossil fuels would sell itself, requiring no herculean political will or international cooperation.92 It would not just mitigate climate change but furnish manifold other gifts. People in the developing world could skip the middle rungs in the energy ladder, bringing their standard of living up to that of the West without choking on coal smoke. Affordable desalination of seawater, an energy-ravenous process, could irrigate farms, supply drinking water, and, by reducing the need for both surface water and hydro power, allow dams to be dismantled, restoring the flow of rivers to lakes and seas and revivifying entire ecosystems.

The last of these (carbon capture) is critical for a simple reason. Even if greenhouse gas emissions are halved by 2050 and zeroed by 2075, the world would still be on course for risky warming, because the CO2 already emitted will remain in the atmosphere for a very long time. It’s not enough to stop thickening the greenhouse; at some point we have to dismantle it.

The obvious way to remove CO2 from the air, then, is to recruit as many carbon-hungry plants as we can to help us. We can do this by encouraging the transition from deforestation to reforestation and afforestation (planting new forests), by reversing tillage and wetland destruction, and by restoring coastal and marine habitats.

Will any of this happen? The obstacles are unnerving; they include the world’s growing thirst for energy, the convenience of fossil fuels with their vast infrastructure, the denial of the problem by energy corporations and the political right, the hostility to technological solutions from traditional Greens and the climate justice left, and the tragedy of the carbon commons.

Despite a half-century of panic, humanity is not on an irrevocable path to ecological suicide.

PEACE

In The Better Angels of Our Nature I showed that, as of the first decade of the 21st century, every objective measure of violence had been in decline.

For most of human history, war was the natural pastime of governments, peace a mere respite between wars.2

(Great powers are the handful of states and empires that can project force beyond their borders, that treat each other as peers, and that collectively control a majority of the world’s military resources.)

It’s not just the great powers that have stopped fighting each other. War in the classic sense of an armed conflict between the uniformed armies of two nation-states appears to be obsolescent.

The world’s wars are now concentrated almost exclusively in a zone stretching from Nigeria to Pakistan, an area containing less than a sixth of the world’s population. Those wars are civil wars, which the Uppsala Conflict Data Program (UCDP) defines as an armed conflict between a government and an organized force which verifiably kills at least a thousand soldiers and civilians a year.

This flip (a recent uptick in the number of wars) is driven mainly by conflicts that have a radical Islamist group on one side (eight of the eleven in 2015, ten of the twelve in 2016); without them, there would have been no increase in the number of wars at all. Perhaps not coincidentally, two of the wars in 2014 and 2015 were fueled by another counter-Enlightenment ideology, Russian nationalism, which drove separatist forces, backed by Vladimir Putin, to battle the government of Ukraine in two of its provinces.

The worst of the ongoing wars is in Syria.

“Wars begin in the minds of men.” And indeed we find that the turn away from war consists in more than just a reduction in wars and war deaths; it also may be seen in nations’ preparations for war. The prevalence of conscription, the size of armed forces, and the level of global military spending as a percentage of GDP have all decreased in recent decades.

Kant’s famous essay “Perpetual Peace.”19

As we saw in chapter 1, many Enlightenment thinkers advanced the theory of gentle commerce, according to which international trade should make war less appealing. Sure enough, trade as a proportion of GDP shot up in the postwar era, and quantitative analyses have confirmed that trading countries are less likely to go to war, holding all else constant.21

Another brainchild of the Enlightenment is the theory that democratic government serves as a brake on glory-drunk leaders who would drag their countries into pointless wars. Starting in the 1970s, and accelerating

Democratic Peace theory, in which pairs of countries that are more democratic are less likely to confront each other in militarized disputes.22

Yet the biggest single change in the international order is an idea we seldom appreciate today: war is illegal.

That cannot happen today: the world’s nations have committed themselves to not waging war except in self-defense or with the approval of the United Nations Security Council. States are immortal, borders are grandfathered in, and any country that indulges in a war of conquest can expect opprobrium, not acquiescence, from the rest.

War “enlarges the mind of a people and raises their character,” wrote Alexis de Tocqueville. It is “life itself,” said Émile Zola; “the foundation of all the arts . . . [and] the high virtues and faculties of man,” wrote John Ruskin.

Romantic militarism sometimes merged with romantic nationalism, which exalted the language, culture, homeland, and racial makeup of an ethnic group—the ethos of blood and soil—and held that a nation could fulfill its destiny only as an ethnically cleansed sovereign state.

But perhaps the biggest impetus to romantic militarism was declinism, the revulsion among intellectuals at the thought that ordinary people seemed to be enjoying their lives in peace and prosperity.34 Cultural pessimism became particularly entrenched in Germany through the influence of Schopenhauer, Nietzsche, Jacob Burckhardt, Georg Simmel, and Oswald Spengler, author in 1918–23 of The Decline of the West. (We will return to these ideas in chapter 23.) To this day, historians of World War I puzzle over why England and Germany, countries with a lot in common—Western, Christian, industrialized, affluent—would choose to hold a pointless bloodbath. The reasons are many and tangled, but insofar as they involve ideology, Germans before World War I “saw themselves as outside European or Western civilization,” as Arthur Herman points out.35 In particular, they thought they were bravely resisting the creep of a liberal, democratic, commercial culture that had been sapping the vitality of the West since the Enlightenment, with the complicity of Britain and the United States. Only from the ashes of a redemptive cataclysm, many thought, could a new heroic order arise.

Worldwide, injuries account for about a tenth of all deaths, outnumbering the victims of AIDS, malaria, and tuberculosis combined, and are responsible for 11 percent of the years lost to death and disability.

Though lethal injuries are a major scourge of human life, bringing the numbers down is not a sexy cause. The inventor of the highway guard rail did not get a Nobel Prize, nor are humanitarian awards given to designers of clearer prescription drug labels.

More people are killed in homicides than wars.

But in a sweeping historical development that the German sociologist Norbert Elias called the Civilizing Process, Western Europeans, starting in the 14th century, began to resolve their disputes in less violent ways.6 Elias credited the change to the emergence of centralized kingdoms out of the medieval patchwork of baronies and duchies, so that the endemic feuding, brigandage, and warlording were tamed by a “king’s peace.” Then, in the 19th century, criminal justice systems were further professionalized by municipal police forces and a more deliberative court system.

People became enmeshed in networks of commercial and occupational obligations laid out in legal and bureaucratic rules. Their norms for everyday conduct shifted from a macho culture of honor, in which affronts had to be answered with violence, to a gentlemanly culture of dignity, in which status was won by displays of propriety and self-control.

(Homicide rates are the most reliable indicator of violent crime across different times and places because a corpse is always hard to overlook, and rates of homicide correlate with rates of other violent crimes like robbery, assault, and rape.)

Violent crime is a solvable problem.

Half of the world’s homicides are committed in just twenty-three countries containing about a tenth of humanity, and a quarter are committed in just four: Brazil (25.2 homicides per 100,000 people), Colombia (25.9), Mexico (12.9), and Venezuela. (The world’s two murder zones—northern Latin America and southern sub-Saharan Africa—are distinct from its war zones, which stretch from Nigeria through the Middle East into Pakistan.) The lopsidedness continues down the fractal scale. Within a country, most of the homicides cluster in a few cities, such as Caracas (120 per 100,000) and San Pedro Sula (in Honduras, 187). Within cities, the homicides cluster in a few neighborhoods; within neighborhoods, they cluster in a few blocks; and within blocks, many are carried out by a few individuals.17 In my hometown of Boston, 70 percent of the shootings take place in 5 percent of the city, and half the shootings were perpetrated by one percent of the youths.18

High rates of homicide can be brought down quickly.

Combine the cockeyed distribution of violent crime with the proven possibility that high rates of violent crime can be brought down quickly, and the math is straightforward: a 50 percent reduction in thirty years is not just practicable but almost conservative.
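The “straightforward math” can be spelled out with a simple compounding calculation (mine, not the book’s):

```python
# Halving a homicide rate in 30 years requires an average annual decline
# of only about 2.3 percent -- modest next to the rapid city-level drops
# described above.
annual_decline = 1 - 0.5 ** (1 / 30)
print(f"required average decline: {annual_decline:.1%} per year")  # ~2.3%
```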

This “Hobbesian trap,” as it is sometimes called, can easily set off cycles of feuding and vendetta: you have to be at least as violent as your adversaries lest you become their doormat. The largest category of homicide, and the one that varies the most across times and places, consists of confrontations between loosely acquainted young men over turf, reputation, or revenge. A disinterested third party with a monopoly on the legitimate use of force—that is, a state with a police force and judiciary—can nip this cycle in the bud. Not only does it disincentivize aggressors by the threat of punishment, but it reassures everyone else that the aggressors are disincentivized and thereby relieves them of the need for belligerent self-defense.

Here is Eisner’s one-sentence summary of how to halve the homicide rate within three decades: “An effective rule of law, based on legitimate law enforcement, victim protection, swift and fair adjudication, moderate punishment, and humane prisons is critical to sustainable reductions in lethal violence.”32 The adjectives effective, legitimate, swift, fair, moderate, and humane differentiate his advice from the get-tough-on-crime rhetoric favored by right-wing politicians.

Together with the presence of law enforcement, the legitimacy of the regime appears to matter, because people not only respect legitimate authority themselves but factor in the degree to which they expect their potential adversaries to respect it.

Thomas Abt and Christopher Winship

They concluded that the single most effective tactic for reducing violent crime is focused deterrence. A “laser-like focus” must first be directed on the neighborhoods where crime is rampant or even just starting to creep up, with the “hot spots” identified by data gathered in real time. It must be further beamed at the individuals and gangs who are picking on victims or roaring for a fight. And it must deliver a simple and concrete message about the behavior that is expected of them, like “Stop shooting and we will help you, keep shooting and we will put you in prison.” Getting the message through, and then enforcing it, depends on the cooperation of other members of the community—the store owners, preachers, coaches, probation officers, and relatives.

Also provably effective is cognitive behavioral therapy.

It is a set of protocols designed to override the habits of thought and behavior that lead to criminal acts.

These therapies teach strategies of self-control. Troublemakers also have narcissistic and sociopathic thought patterns, such as that they are always in the right, that they are entitled to universal deference, that disagreements are personal insults, and that other people have no feelings or interests.

Together with anarchy, impulsiveness, and opportunity, a major trigger of criminal violence is contraband.

Violent crime exploded in the United States when alcohol was prohibited in the 1920s and when crack cocaine became popular in the late 1980s, and it is rampant in Latin American and Caribbean countries in which cocaine, heroin, and marijuana are trafficked today. Drug-fueled violence remains an unsolved international problem.

“Aggressive drug enforcement yields little anti-drug benefits and generally increases violence,” while “drug courts and treatment have a long history of effectiveness.”

Neither right-to-carry laws favored by the right, nor bans and restrictions favored by the left, have been shown to make much difference—though there is much we don’t know, and there are political and practical impediments to finding out more.39

In 1965 a young lawyer named Ralph Nader published Unsafe at Any Speed, a j’accuse of the industry for neglecting safety in automotive design. Soon after, the National Highway Traffic Safety Administration was established and legislation was passed requiring new cars to be equipped with a number of safety features. Yet the graph shows that steeper reductions came before the activism and the legislation, and the auto industry was sometimes ahead of its customers and regulators.

In 1980 Mothers Against Drunk Driving was formed, and they lobbied for higher drinking ages, lowered legal blood alcohol levels, and the stigmatization of drunk driving, which popular culture had treated as a source of comedy (such as in the movies North by Northwest and Arthur).

The Brooklyn Dodgers, before they moved to Los Angeles, had been named after the city’s pedestrians, famous for their skill at darting out of the way of hurtling streetcars.

When robotic cars are ubiquitous, they could save more than a million lives a year, becoming one of the greatest gifts to human life since the invention of antibiotics.

After car crashes, the likeliest cause of accidental death consists of falls, followed by drownings and fires, followed by poisonings.

Figure 12-6 shows an apparent exception to the conquest of accidents: the category called “Poison (solid or liquid).” The steep rise starting in the 1990s is anomalous in a society that is increasingly latched,

Then I realized that the category of accidental poisonings includes drug overdoses.

In 2013, 98 percent of the “Poison” deaths were from drugs (92 percent) or alcohol (6 percent), and almost all the others were from gases and vapors (mostly carbon monoxide). Household and occupational hazards like solvents, detergents, insecticides, and lighter fluid were responsible for less than half of one percent of the poisoning deaths, and would scrape the bottom of figure 12-6.

The curve begins to rise in the psychedelic 1960s, jerks up again during the crack cocaine epidemic of the 1980s, and blasts off during the far graver epidemic of opioid addiction in the 21st century. Starting in the 1990s, doctors overprescribed synthetic opioid painkillers like oxycodone, hydrocodone, and fentanyl, which are not just addictive but gateway drugs to heroin.

A sign that the measures might be effective is that the number of overdoses of prescription opioids (though not of illicit heroin and fentanyl) peaked in 2010 and may be starting to come down.56

The peak age of poisoning deaths in 2011 was around fifty, up from the low forties in 2003, the late thirties in 1993, the early thirties in 1983, and the early twenties in 1973.57 Do the subtractions and you find that in every decade it’s the members of the generation born between 1953 and 1963 who are drugging themselves to death. Despite perennial panic about teenagers, today’s kids are, relatively speaking, all right, or at least better. According to a major longitudinal study of teenagers called Monitoring the Future, high schoolers’ use of alcohol, cigarettes, and drugs (other than marijuana and vaping) has dropped to the lowest levels since the survey began in 1976.58
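The subtraction can be done explicitly (the peak ages are my reading of the ranges cited, e.g. “early twenties” as 21):

```python
# Survey year minus peak age of poisoning deaths = birth year of the
# hardest-hit cohort. The results cluster in a single generation.
peak_age = {1973: 21, 1983: 31, 1993: 38, 2003: 41, 2011: 50}
for year, age in peak_age.items():
    print(year, "-> born around", year - age)
# 1952, 1952, 1955, 1962, 1961 -- consistent with the 1953-1963 cohort
```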

Humanity’s conquest of everyday danger is a peculiarly unappreciated form of progress.

Just as people tend not to see accidents as atrocities (at least when they are not the victims), they don’t see gains in safety as moral triumphs, if they are aware of them at all. Yet the sparing of millions of lives, and the reduction of infirmity, disfigurement, and suffering on a massive scale, deserve our gratitude and demand an explanation. That is true even of murder, the most moralized of acts, whose rate has plummeted for reasons that defy standard narratives.

TERRORISM

It’s because terrorism, as it is now defined, is largely a phenomenon of war, and wars no longer take place in the United States or Western Europe.

A majority of the world’s terrorist deaths take place in zones of civil war (including 8,831 in Iraq, 6,208 in Afghanistan, 5,288 in Nigeria, 3,916 in Syria, 1,606 in Pakistan, and 689 in Libya), and many of these are double-counted as war deaths, because “terrorism” during a civil war is simply a war crime—a deliberate attack on civilians—committed by a group other than the government.

About twice as many Americans have been killed since 1990 by right-wing extremists as by Islamist terror groups.

Modern terrorism is a by-product of the vast reach of the media.

The terrorist’s tactic is to kill innocent people, especially in circumstances in which readers of the news can imagine themselves. News media gobble the bait and give the atrocities saturation coverage. The Availability heuristic kicks in and people become stricken with a fear that is unrelated to the level of danger.

The legal scholar Adam Lankford has analyzed the motives of the overlapping categories of suicide terrorists, rampage shooters, and hate crime killers, including both the self-radicalized lone wolves and the bomb fodder recruited by terrorist masterminds.14 The killers tend to be loners and losers, many with untreated mental illness, who are consumed with resentment and fantasize about revenge and recognition. Some fused their bitterness with Islamist ideology, others with a nebulous cause such as “starting a race war” or “a revolution against the federal government, taxes, and anti-gun laws.” Killing a lot of people offered them the chance to be a somebody, even if only in the anticipation, and going out in a blaze of glory meant that they didn’t have to deal with the irksome aftermath of being a mass murderer.

The historian Yuval Harari notes that terrorism is the opposite of military action, which tries to damage the enemy’s ability to retaliate and prevail.16

From their position of weakness, Harari notes, what terrorists seek to accomplish is not damage but theater.

Harari points out that in the Middle Ages, every sector of society retained a private militia—aristocrats, guilds, towns, even churches and monasteries—and they secured their interests by force: “If in 1150 a few Muslim extremists had murdered a handful of civilians in Jerusalem, demanding that the Crusaders leave the Holy Land, the reaction would have been ridicule rather than terror. If you wanted to be taken seriously, you should have at least gained control of a fortified castle or two.”

The sociologist Eric Madfis has recommended a policy for rampage shootings of “Don’t Name Them, Don’t Show Them, but Report Everything Else,” based on a policy for juvenile shooters already in effect in Canada and on other strategies of calculated media self-restraint.

DEMOCRACY

Humanity has tried to steer a course between the violence of anarchy and the violence of tyranny.

Early governments pacified the people they ruled, reducing internecine violence, but imposed a reign of terror that included slavery, harems, human sacrifice, summary executions, and the torture and mutilation of dissidents and deviants.

Chaos is deadlier than tyranny. More of these multicides result from the breakdown of authority than from the exercise of authority.

One can think of democracy as a form of government that threads the needle, exerting just enough force to prevent people from preying on each other without preying on the people itself.

Democracy is a major contributor to human flourishing. But that’s not the only reason to want it: democracies also have higher rates of economic growth, fewer wars and genocides, healthier and better-educated citizens, and virtually no famines.4 If the world has become more democratic over time, that is progress.

The political scientist Samuel Huntington organized the history of democratization into three waves.5 The first swelled in the 19th century, when that great Enlightenment experiment, American constitutional democracy with its checks on government power, seemed to be working.

With the defeat of fascism in World War II, a second wave gathered force as colonies gained independence from their European overlords, pushing the number of recognized democracies up to thirty-six by 1962.

The West German chancellor Willy Brandt lamented that “Western Europe has only 20 or 30 more years of democracy left in it; after that it will slide, engineless and rudderless, under the surrounding sea of dictatorship.”

Military and fascist governments fell in southern Europe (Greece and Portugal in 1974, Spain in 1975), Latin America (including Argentina in 1983, Brazil in 1985, and Chile in 1990), and Asia (including Taiwan and the Philippines around 1986, South Korea around 1987, and Indonesia in 1998). The Berlin Wall was torn down in 1989,

In 1989 the political scientist Francis Fukuyama published a famous essay in which he proposed that liberal democracy represented “the end of history,” not because nothing would ever happen again but because the world was coming to a consensus over the humanly best form of governance and no longer had to fight over it.8

The rise of alternatives to democracy such as theocracy in the Muslim world and authoritarian capitalism in China. Democracies themselves appeared to be backsliding into authoritarianism with populist victories in Poland and Hungary and power grabs by Recep Erdogan in Turkey and Vladimir Putin in Russia (the return of the sultan and the czar).

After swelling in the 1990s, this third wave spilled into the 21st century in a rainbow of “color revolutions” including Croatia (2000), Serbia (2000), Georgia (2003), Ukraine (2004), and Kyrgyzstan (2005), bringing the total at the start of the Obama presidency in 2009 to 87.14

As of 2015, the most recent year in the dataset, the total stood at 103.

It is true that stable, top-shelf democracy is likelier to be found in countries that are richer and more highly educated.17 But governments that are more democratic than not are a motley collection: they are entrenched in most of Latin America, in floridly multiethnic India, in Muslim Malaysia, Indonesia, Niger, and Kosovo, in fourteen countries in sub-Saharan Africa (including Namibia, Senegal, and Benin), and in poor countries elsewhere such as Nepal, Timor-Leste, and most of the Caribbean.18

Political scientists are repeatedly astonished by the shallowness and incoherence of people’s political beliefs, and by the tenuous connection of their preferences to their votes and to the behavior of their representatives.21 Most voters are ignorant not just of current policy options but of basic facts, such as what the major branches of government are, who the United States fought in World War II, and which countries have used nuclear weapons. Their opinions flip depending on how a question is worded: they say that the government spends too much on “welfare” but too little on “assistance to the poor,” and that it should “use military force” but not “go to war.” When they do formulate a preference, they commonly vote for a candidate with the opposite one. But it hardly matters, because once in office politicians vote the positions of their party regardless of the opinions of their constituents.

Many political scientists have concluded that most people correctly recognize that their votes are astronomically unlikely to affect the outcome of an election, and so they prioritize work, family, and leisure over educating themselves about politics and calibrating their votes. They use the franchise as a form of self-expression: they vote for candidates who they think are like them and stand for their kind of people.

Also, autocrats can learn to use elections to their advantage. The latest fashion in dictatorship has been called the competitive, electoral, kleptocratic, statist, or patronal authoritarian regime.22 (Putin’s Russia is the prototype.) The incumbents use the formidable resources of the state to harass the opposition, set up fake opposition parties, use state-controlled media to spread congenial narratives, manipulate electoral rules, tilt voter registration, and jigger the elections themselves. (Patronal authoritarians, for all that, are not invulnerable—the color revolutions sent several of them packing.)

In his 1945 book The Open Society and Its Enemies, the philosopher Karl Popper argued that democracy should be understood not as the answer to the question “Who should rule?” (namely, “The People”), but as a solution to the problem of how to dismiss bad leadership without bloodshed.

Steven Levitsky and Lucan Way point out, “State failure brings violence and instability; it almost never brings democratization.”27

The freedom to complain rests on an assurance that the government won’t punish or silence the complainer. The front line in democratization, then, is constraining the government from abusing its monopoly on force to brutalize its uppity citizens.

Has the rise in democracy brought a rise in human rights, or are dictators just using elections and other democratic trappings to cover their abuses with a smiley-face?

The abolition of capital punishment has gone global (figure 14-3), and today the death penalty is on death row.

We are seeing a moral principle—Life is sacred, so killing is onerous—become distributed across a wide range of actors and institutions that have to cooperate to make the death penalty possible. As these actors and institutions implement the principle more consistently and thoroughly, they inexorably push the country away from the impulse to avenge a life with a life.

EQUAL RIGHTS

First Lady Michelle Obama in a speech at the Democratic National Convention in 2016: “I wake up every morning in a house that was built by slaves, and I watch my daughters, two beautiful, intelligent black young women, playing with their dogs on the White House lawn.”

A string of highly publicized killings by American police officers of unarmed African American suspects, some of them caught on smartphone videos, has led to a sense that the country is suffering an epidemic of racist attacks by police on black men. Media coverage of athletes who have assaulted their wives or girlfriends, and of episodes of rape on college campuses, has suggested to many that we are undergoing a surge of violence against women.

The data suggest that the number of police shootings has decreased, not increased, in recent decades (even as the ones that do occur are captured on video), and three independent analyses have found that a black suspect is no more likely than a white suspect to be killed by the police.6 (American police shoot too many people, but it’s not primarily a racial issue.)

The Pew Research Center has probed Americans’ opinions on race, gender, and sexual orientation over the past quarter century, and has reported that these attitudes have undergone a “fundamental shift” toward tolerance and respect of rights, with formerly widespread prejudices sinking into oblivion.

Other surveys show the same shifts.8 Not only has the American population become more liberal, but each generational cohort is more liberal than the one born before it.

Millennials (those born after 1980), who are even less prejudiced than the national average, tell us which way the country is going.10

Is this a decline in prejudice or simply a decline in the social acceptability of prejudice, with fewer people willing to confess their disreputable attitudes to a pollster?

And contrary to the fear that the rise of Trump reflects (or emboldens) prejudice, the curves continue their decline through his period of notoriety in 2015–2016 and inauguration in early 2017.

Stephens-Davidowitz has pointed out to me that these curves probably underestimate the decline in prejudice because of a shift in who’s Googling.

Stephens-Davidowitz confirmed that bigoted searches tended to come from regions with older and less-educated populations. Compared with the country as a whole, retirement communities are seven times as likely to search for “nigger jokes” and thirty times as likely to search for “fag jokes.”

These threads confirmed that racists may be a dwindling breed: someone who searches for “nigger” is likely to search for other topics that appeal to senior citizens, such as “social security” and “Frank Sinatra.”

Private prejudice is declining with time and declining with youth, which means that we can expect it to decline still further as aging bigots cede the stage to less prejudiced cohorts.

Until they do, these older and less-educated people (mainly white men) may not respect the benign taboos on racism, sexism, and homophobia that have become second nature to the mainstream, and may even dismiss them as “political correctness.”

Trump’s success, like that of right-wing populists in other Western countries, is better understood as the mobilization of an aggrieved and shrinking demographic in a polarized political landscape than as the sudden reversal of a century-long movement toward equal rights.

Hate crimes against Asian, Jewish, and white targets have declined as well. And despite claims that Islamophobia has become rampant in America, hate crimes targeting Muslims have shown little change other than a one-time rise following 9/11 and upticks following other Islamist terror attacks, such as the ones in Paris and San Bernardino in 2015.20

Women’s status, too, is ascendant.

Violence against women is best measured by victimization surveys, because they circumvent the problem of underreporting to the police; these instruments show that rates of rape and violence against wives and girlfriends have been sinking for decades and are now at a quarter or less of their peaks in the past.

No form of progress is inevitable, but the historical erosion of racism, sexism, and homophobia is more than a change in fashion.

Also, as people are forced to justify the way they treat other people, rather than dominating them out of instinctive, religious, or historical inertia, any justification for prejudicial treatment will crumble under scrutiny.

In his book Freedom Rising, the political scientist Christian Welzel (building on a collaboration with Ron Inglehart, Pippa Norris, and others) has proposed that the process of modernization has stimulated the rise of “emancipative values.”36 As societies shift from agrarian to industrial to informational, their citizens become less anxious about fending off enemies and other existential threats and more eager to express their ideals and to pursue opportunities in life. This shifts their values toward greater freedom for themselves and others. The transition is consistent with the psychologist Abraham Maslow’s theory of a hierarchy of needs from survival and safety to belonging, esteem, and self-actualization (and with Brecht’s “Grub first, then ethics”). People begin to prioritize freedom over security, diversity over uniformity, autonomy over authority, creativity over discipline, and individuality over conformity. Emancipative values may also be called liberal values, in the classical sense related to “liberty” and “liberation” (rather than the sense of political leftism).

The graph displays a historical trend that is seldom appreciated in the hurly-burly of political debate: for all the talk about right-wing backlashes and angry white men, the values of Western countries have been getting steadily more liberal (which, as we will see, is one of the reasons those men are so angry).

A critical discovery displayed in the graph is that the liberalization does not reflect a growing bulge of liberal young people who will backslide into conservatism as they get older.

The liberalization trends shown in figure 15-6 come from the Prius-driving, chai-sipping, kale-eating populations of post-industrial Western countries.

What is surprising, though, is that in every part of the world, people have become more liberal. A lot more liberal.

We’ve already seen that children the world over have become better off: they are less likely to enter the world motherless, die before their fifth birthday, or grow up stunted for lack of food.

Starting with influential treatises by John Locke in 1693 and Jean-Jacques Rousseau in 1762, childhood was reconceptualized.50 A carefree youth was now considered a human birthright. Play was an essential form of learning, and the early years of life shaped the adult and determined the future of society.

KNOWLEDGE

Homo sapiens, “knowing man,” is the species that uses information to resist the rot of entropy and the burdens of evolution.

In social science, correlation is not causation. Do better-educated countries get richer, or can richer countries afford more education? One way to cut the knot is to take advantage of the fact that a cause must precede its effect.

Better education today makes a country more democratic and peaceful tomorrow.

Better-educated girls grow up to have fewer babies, and so are less likely to beget youth bulges with their surfeit of troublemaking young men.9 And better-educated countries are richer, and as we saw in chapters 11 and 14, richer countries tend to be more peaceful and democratic.

So much changes when you get an education! You unlearn dangerous superstitions, such as that leaders rule by divine right, or that people who don’t look like you are less than human.

Studies of the effects of education confirm that educated people really are more enlightened. They are less racist, sexist, xenophobic, homophobic, and authoritarian.10 They place a higher value on imagination, independence, and free speech.11 They are more likely to vote, volunteer, express political views, and belong to civic associations such as unions, political parties, and religious and community organizations.12 They are also likelier to trust their fellow citizens—a prime ingredient of the precious elixir called social capital which gives people the confidence to contract, invest, and obey the law without fearing that they are chumps who will be shafted by everyone else.13

Intelligence Quotient (IQ) scores have been rising for more than a century, in every part of the world, at a rate of about three IQ points (a fifth of a standard deviation) per decade.

Also, it beggars belief to think that an average person of 1910, if he or she had entered a time machine and materialized today, would be borderline retarded by our standards, while if Joe and Jane Average made the reverse journey, they would outsmart 98 percent of the befrocked and bewhiskered Edwardians who greeted them as they emerged.
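That “98 percent” is just normal-distribution arithmetic (my sketch, assuming IQ is scaled to mean 100 and standard deviation 15):

```python
# A century of the Flynn effect at ~3 points per decade is ~30 points,
# i.e. two standard deviations; a score two SDs above the mean beats
# about 98 percent of the reference population.
from statistics import NormalDist

gain_sds = (3 * 10) / 15                 # ten decades of gains, SD = 15
percentile = NormalDist().cdf(gain_sds)  # standard normal CDF
print(gain_sds, f"{percentile:.1%}")     # 2.0 97.7% -> "98 percent"
```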

It’s no paradox that a heritable trait can be boosted by changes in the environment. That’s what happened with height, a trait that also is highly heritable and has increased over the decades, and for some of the same reasons: better nutrition and less disease.

Does the Flynn effect matter in the real world? Almost certainly. A high IQ is not just a number that you can brag about in a bar or that gets you into Mensa; it is a tailwind in life.38 People with high scores on intelligence tests get better jobs, perform better in their jobs, enjoy better health and longer lives, are less likely to get into trouble with the law, and have a greater number of noteworthy accomplishments like starting companies, earning patents, and creating respected works of art—all holding socioeconomic status constant.

Still, there have been some signs of a smarter populace, such as the fact that the world’s top-ranked chess and bridge players have been getting younger.

QUALITY OF LIFE

There is the worry that all that extra healthy life span and income may not have increased human flourishing after all, if they just consign people to a rat race of frenzied careerism, hollow consumption, mindless entertainment, and soul-deadening anomie.

Cultural criticism can be a thinly disguised snobbery that shades into misanthropy.

In practice, “consumerism” often means “consumption by the other guy,” since the elites who condemn it tend themselves to be conspicuous consumers of exorbitant luxuries like hardcover books, good food and wine, live artistic performances, overseas travel, and Ivy-class education for their children.

In Development as Freedom, Amartya Sen sidesteps this trap by proposing that the ultimate goal of development is to enable people to make choices: strawberries and cream for those who want them. The philosopher Martha Nussbaum has taken the idea a step further and laid out a set of “fundamental capabilities” that all people should be given the opportunity to exercise.3 One can think of them as the justifiable sources of satisfaction and fulfillment that human nature makes available to us. Her list begins with capabilities that, as we have seen, the modern world increasingly allows people to realize: longevity, health, safety, literacy, knowledge, free expression, and political participation. It goes on to include aesthetic experience, recreation and play, enjoyment of nature, emotional attachments, social affiliations, and opportunities to reflect on and engage in one’s own conception of the good life.

That life is getting better even beyond the standard economists’ metrics like longevity and wealth.

As Morgan Housel notes, “We constantly worry about the looming ‘retirement funding crisis’ in America without realizing that the entire concept of retirement is unique to the last five decades.

Think of it this way: The average American now retires at age 62. One hundred years ago, the average American died at age 51.”

Today an average American worker with five years on the job receives 22 days of paid time off a year (compared with 16 days in 1970), and that is miserly by the standards of Western Europe.

In 1919, an average American wage earner had to work 1,800 hours to pay for a refrigerator; in 2014, he or she had to work fewer than 24 hours (and the new fridge was frost-free and came with an icemaker).

As Hans Rosling suggests, the washing machine deserves to be called the greatest invention of the Industrial Revolution.

Time is not the only life-enriching resource granted to us by technology. Another is light. Light is so empowering that it serves as the metaphor of choice for a superior intellectual and spiritual state: enlightenment.

The economist William Nordhaus has cited the plunging price (and hence the soaring availability) of this universally treasured resource as an emblem of progress.

As Adam Smith pointed out, “The real price of every thing . . . is the toil and trouble of acquiring it.”

The technology expert Kevin Kelly has proposed that “over time, if a technology persists long enough, its costs begin to approach (but never reach) zero.”
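Smith’s “toil and trouble” suggests a simple metric, sometimes called a time price, that makes the refrigerator comparison above concrete. A minimal sketch: the function and the modern dollar figures are my own illustration; only the 1,800-hour and 24-hour figures come from the note.

```python
# Time price: hours of labor needed to afford a good at a given wage,
# i.e. Adam Smith's "toil and trouble" made quantitative.
def time_price(money_price: float, hourly_wage: float) -> float:
    return money_price / hourly_wage

# Hypothetical modern numbers, purely for illustration:
print(time_price(money_price=480.0, hourly_wage=20.0))  # 24.0 hours

# The note's own figures imply the refrigerator got ~75x cheaper
# when measured in hours of work:
print(1800 / 24)  # 75.0
```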

What are people doing with that extra time and money?

With the rise of two-career couples, overscheduled kids, and digital devices, there is a widespread belief (and recurring media panic) that families are caught in a time crunch that’s killing the family dinner.

But the new tugs and distractions have to be weighed against the 24 extra hours that modernity has granted to breadwinners every week and the 42 extra hours it has granted to homemakers.

In 2015, men reported 42 hours of leisure per week, around 10 more than their counterparts did fifty years earlier, and women reported 36 hours, more than 6 hours more than women reported fifty years earlier.

And at the end of the day, the family dinner is alive and well. Several studies and polls agree that the number of dinners families have together changed little from 1960 through 2014, despite the iPhones, PlayStations, and Facebook accounts.

Indeed, over the course of the 20th century, typical American parents spent more time, not less, with their children.

Today, almost half of the world’s population has Internet access, and three-quarters have access to a mobile phone.

The late 19th-century American diet consisted mainly of pork and starch.29 Before refrigeration and motorized transport, most fruits and vegetables would have spoiled before they reached a consumer, so farmers grew nonperishables like turnips, beans, and potatoes.

There can be no question of which was the greatest era for culture; the answer has to be today, until it is superseded by tomorrow.

HAPPINESS

According to the theory of the hedonic treadmill, people adapt to changes in their fortunes, like eyes adapting to light or darkness, and quickly return to a genetically determined baseline.4 According to the theory of social comparison (or reference groups, status anxiety, or relative deprivation, which we examined in chapter 9), people’s happiness is determined by how well they think they are doing relative to their compatriots, so as the country as a whole gets richer, no one feels happier—indeed, if their country becomes more unequal, then even if they get richer they may feel worse.

Some intellectuals are incredulous, even offended, that happiness has become a subject for economists rather than just poets, essayists, and philosophers. But the approaches are not opposed. Social scientists often begin their studies of happiness with ideas that were first conceived by artists and philosophers, and they can pose questions about historical and global patterns that cannot be answered by solitary reflection, no matter how insightful.

Freedom or autonomy: the availability of options to lead a good life (positive freedom) and the absence of coercion that prevents a person from choosing among them (negative freedom).

Happiness has two sides, an experiential or emotional side, and an evaluative or cognitive side.13 The experiential component consists of a balance between positive emotions like elation, joy, pride, and delight, and negative emotions like worry, anger, and sadness.

The ultimate measure of happiness would consist of a lifetime integral or weighted sum of how happy people are feeling and how long they feel that way.
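Written out as a formula (my notation, not the book’s): if h(t) is a person’s momentary experienced happiness and T is the length of their life, that measure would be

```latex
% h(t): momentary experienced happiness; T: length of life;
% w(t): optional weights on different periods of life.
W = \int_{0}^{T} h(t)\,dt
\qquad\text{or}\qquad
W = \int_{0}^{T} w(t)\,h(t)\,dt
```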

People’s evaluations of how they are living their lives. People can be asked to reflect on how satisfied they feel “these days” or “as a whole” or “taking all things together,” or to render the almost philosophical judgment of where they stand on a ten-rung ladder ranging from “the worst possible life for you” to “the best possible life for you.”

Social scientists have become resigned to the fact that happiness, satisfaction, and best-versus-worst-possible life are blurred in people’s minds and that it’s often easiest just to average them together.14

And this brings us to the final dimension of a good life, meaning and purpose. This is the quality that, together with happiness, goes into Aristotle’s ideal of eudaemonia or “good spirit.”16

Roy Baumeister and his colleagues probed for what makes people feel their lives are meaningful. The respondents separately rated how happy and how meaningful their lives were, and they answered a long list of questions about their thoughts, activities, and circumstances. The results suggest that many of the things that make people happy also make their lives meaningful, such as being connected to others, feeling productive, and not being alone or bored.

People who lead meaningful lives may enjoy none of these boons. Happy people live in the present; those with meaningful lives have a narrative about their past and a plan for the future. Those with happy but meaningless lives are takers and beneficiaries; those with meaningful but unhappy lives are givers and benefactors.

Meaning is about expressing rather than satisfying the self: it is enhanced by activities that define the person and build a reputation.

The most immediate finding is the absence of a cross-national Easterlin paradox: the cloud of arrows is stretched along a diagonal, which indicates that the richer the country, the happier its people.

Most strikingly, the slopes of the arrows are similar to each other, and identical to the slope for the swarm of arrows as a whole (the dashed gray line lurking behind the swarm). That means that a raise for an individual relative to that person’s compatriots adds as much to his or her happiness as the same increase for their country across the board.
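Algebraically (my gloss on the figure’s description, not the book’s equation): within each country and across countries, life satisfaction S fits roughly the same log-linear relation in income, so a given proportional raise buys about the same increment of satisfaction whether it lifts one person relative to compatriots or lifts the whole country.

```latex
% S: life satisfaction; the same slope b holds within and across countries.
S \approx a + b\,\ln(\text{income}),
\qquad b_{\text{within a country}} \approx b_{\text{across countries}}
```

The logarithm also means equal percentage gains, not equal dollar gains, produce equal satisfaction increments.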

Happiness, of course, depends on much more than income.

Bowling Alone.

Though people have reallocated their time because families are smaller, more people are single, and more women work, Americans today spend as much time with relatives, have the same median number of friends and see them about as often, report as much emotional support, and remain as satisfied with the number and quality of their friendships as their counterparts in the decade of Gerald Ford and Happy Days. Users of the Internet and social media have more contact with friends (though a bit less face-to-face contact), and they feel that the electronic ties have enriched their relationships.

Social media users care too much, not too little, about other people, and they empathize with them over their troubles rather than envying them their successes.

Standard formula for sowing panic: Here’s an anecdote, therefore it’s a trend, therefore it’s a crisis.

But just because social life looks different today from the way it looked in the 1950s, it does not mean that humans, that quintessentially social species, have become any less social.

One of psychology’s best-kept secrets is that cognitive behavior therapy is demonstrably effective (often more effective than drugs) in treating many forms of distress, including depression, anxiety, panic attacks, PTSD, insomnia, and the symptoms of schizophrenia.

Everything is amazing. Are we really so unhappy? Mostly we are not. Developed countries are actually pretty happy, a majority of all countries have gotten happier, and as long as countries get richer they should get happier still. The dire warnings about plagues of loneliness, suicide, depression, and anxiety don’t survive fact-checking.

A modicum of anxiety may be the price we pay for the uncertainty of freedom. It is another word for the vigilance, deliberation, and heart-searching that freedom demands. It’s not entirely surprising that as women gained in autonomy relative to men they also slipped in happiness. In earlier times, women’s list of responsibilities rarely extended beyond the domestic sphere. Today young women increasingly say that their life goals include career, family, marriage, money, recreation, friendship, experience, correcting social inequities, being a leader in their community, and making a contribution to society.83 That’s a lot of things to worry about, and a lot of ways to be frustrated: Woman plans and God laughs.

As people become better educated and increasingly skeptical of received authority, they may become unsatisfied with traditional religious verities and feel unmoored in a morally indifferent cosmos.

EXISTENTIAL THREATS

In The Progress Paradox, the journalist Gregg Easterbrook suggests that a major reason that Americans are not happier, despite their rising objective fortunes, is “collapse anxiety”: the fear that civilization may implode and there’s nothing anyone can do about it.

Remember the Y2K bug?12 In the 1990s, as the turn of the millennium drew near, computer scientists began to warn the world of an impending catastrophe.

When 12:00 A.M. on January 1, 2000, arrived and the digits rolled over, a program would think it was 1900 and would crash or go haywire (presumably because it would divide some number by the difference between what it thought was the current year and the year 1900, namely zero, though why a program would do this was never made clear).
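The feared failure mode is easy to sketch. Here is a purely illustrative toy, not any real program: years are stored as two digits and padded with “19,” so at the rollover a date-arithmetic routine yields zero or negative spans and a downstream division crashes.

```python
# Toy Y2K bug: two-digit years assumed to belong to the 1900s.
def years_since(event_yy: int, now_yy: int) -> int:
    return (1900 + now_yy) - (1900 + event_yy)  # the fateful assumption

def average_per_year(total: float, start_yy: int, now_yy: int) -> float:
    return total / years_since(start_yy, now_yy)

print(average_per_year(9900.0, start_yy=0, now_yy=99))  # 1999: fine, 100.0
print(years_since(95, 0))   # after rollover "00" means 1900: -95 years
print(average_per_year(9900.0, start_yy=0, now_yy=0))   # ZeroDivisionError
```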

A hundred billion dollars was spent worldwide on reprogramming software for Y2K readiness, a challenge that was likened to replacing every bolt in every bridge in the world.

A typical mammalian species lasts around a million years, and it’s hard to insist that Homo sapiens will be an exception.

Even if we did invent superhumanly intelligent robots, why would they want to enslave their masters or take over the world?

The second fallacy is to think of intelligence as a boundless continuum of potency, a miraculous elixir with the power to solve any problem, attain any goal.

Knowledge is acquired by formulating explanations and testing them against reality, not by running an algorithm faster and faster.

The real world gets in the way of many digital apocalypses. When HAL gets uppity, Dave disables it with a screwdriver, leaving it pathetically singing “A Bicycle Built for Two” to itself.

If we gave an AI the goal of maintaining the water level behind a dam, it might flood a town, not caring about the people who drowned. If we gave it the goal of making paper clips, it might turn all the matter in the reachable universe into paper clips, including our possessions and bodies.

Artificial intelligence is like any other technology. It is developed incrementally, designed to satisfy multiple conditions, tested before it is implemented, and constantly tweaked for efficacy and safety (chapter 12). As the AI expert Stuart Russell puts it, “No one in civil engineering talks about ‘building bridges that don’t fall down.’ They just call it ‘building bridges.’”

In 2002 Martin Rees publicly offered the bet that “by 2020, bioterror or bioerror will lead to one million casualties in a single event.”35

The question I’ll consider is whether the grim facts should lead any reasonable person to conclude that humanity is screwed.

The key is not to fall for the Availability bias and assume that if we can imagine something terrible, it is bound to happen. The real danger depends on the numbers: the proportion of people who want to cause mayhem or mass murder, the proportion of that genocidal sliver with the competence to concoct an effective cyber or biological weapon, the sliver of that sliver whose schemes will actually succeed, and the sliver of the sliver of the sliver that accomplishes a civilization-ending cataclysm rather than a nuisance, a blow, or even a disaster, after which life goes on.
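The “sliver of a sliver of a sliver” logic is just a product of conditional probabilities. A toy calculation, with every number invented purely for illustration (the book supplies none):

```python
# Every figure below is a made-up placeholder, not data from the book.
population = 8_000_000_000

p_wants_mayhem = 1e-6   # wants to cause mass murder
p_competent = 1e-2      # ...and can build an effective weapon
p_succeeds = 1e-1       # ...and whose scheme actually works
p_cataclysm = 1e-3      # ...and ends civilization, not just lives

expected = population * p_wants_mayhem * p_competent * p_succeeds * p_cataclysm
print(expected)  # 0.008 -> well under one such person, on these guesses
```

The point is structural, not numerical: each conditional sliver multiplies the previous one down, so the civilization-ending tail is vastly rarer than the vivid imagined scenario.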

Such attacks could take place in every city in the world many times a day, but in fact take place somewhere or other every few years (leading the security expert Bruce Schneier to ask, “Where are all the terrorist attacks?”).

Far from being criminal masterminds, most terrorists are bumbling schlemiels.

Serious threats to the integrity of a country’s infrastructure are likely to require the resources of a state.50 Software hacking is not enough; the hacker needs detailed knowledge about the physical construction of the systems he hopes to sabotage.

State-based cyber-sabotage escalates the malevolence from terrorism to a kind of warfare, where the constraints of international relations, such as norms, treaties, sanctions, retaliation, and military deterrence, inhibit aggressive attacks, as they do in conventional “kinetic” warfare.

But disaster sociology (yes, there is such a field) has shown that people are highly resilient in the face of catastrophe.53 Far from looting, panicking, or sinking into paralysis, they spontaneously cooperate to restore order and improvise networks for distributing goods and services.

It may be more than just luck that the world so far has seen just one successful bioterror attack (the 1984 tainting of salad with salmonella in an Oregon town by the Rajneeshee religious cult, which killed no one) and one spree killing (the 2001 anthrax mailings, which killed five).60

CRISPR-Cas9.

Prognosticators are biased toward scaring people.

As early as 1945, the theologian Reinhold Niebuhr observed, “Ultimate perils, however great, have a less lively influence upon the human imagination than immediate resentments and frictions, however small by comparison.”

As we saw with climate change, people may be likelier to acknowledge a problem when they have reason to think it is solvable than when they are terrified into numbness and helplessness.

The most obvious is to whittle down the size of the arsenal. The process is well under way. Few people are aware of how dramatically the world has been dismantling nuclear weapons. Figure 19-1 shows that the United States has reduced its inventory by 85 percent from its 1967 peak, and now has fewer nuclear warheads than at any time since 1956.113 Russia, for its part, has reduced its arsenal by 89 percent from its Soviet-era peak. (Probably even fewer people realize that about 10 percent of electricity in the United States comes from dismantled nuclear warheads, mostly Soviet.)114 In 2010 both countries signed the New START treaty.

THE FUTURE OF PROGRESS

The poor may not always be with us. The world is about a hundred times wealthier today than it was two centuries ago, and the prosperity is becoming more evenly distributed across the world’s countries and people. The proportion of humanity living in extreme poverty has fallen from almost 90 percent to less than 10 percent, and within the lifetimes of most of the readers of this book it could approach zero.

The world is giving peace a chance.

The proportion of people killed annually in wars is less than a quarter of what it was in the 1980s, a seventh of what it was in the early 1970s, an eighteenth of what it was in the early 1950s, and half a percent of what it was during World War II.

People are getting not just healthier, richer, and safer but freer. Two centuries ago a handful of countries, embracing one percent of the world’s people, were democratic; today, two-thirds of the world’s countries, embracing two-thirds of its people, are.

As people are getting healthier, richer, safer, and freer, they are also becoming more literate, knowledgeable, and smarter. Early in the 19th century, 12 percent of the world could read and write; today 83 percent can.

As societies have become healthier, wealthier, freer, happier, and better educated, they have set their sights on the most pressing global challenges. They have emitted fewer pollutants, cleared fewer forests, spilled less oil, set aside more preserves, extinguished fewer species, saved the ozone layer, and peaked in their consumption of oil, farmland, timber, paper, cars, coal, and perhaps even carbon.

For all their differences, the world’s nations came to a historic agreement on climate change, as they did in previous years on nuclear testing, proliferation, security, and disarmament. Nuclear weapons, since the extraordinary circumstances of the closing days of World War II, have not been used in the seventy-two years they have existed. Nuclear terrorism, in defiance of forty years of expert predictions, has never happened. The world’s nuclear stockpiles have been reduced by 85 percent, with more reductions to come, and testing has ceased (except by the tiny rogue regime in Pyongyang) and proliferation has frozen. The world’s two most pressing problems, then, though not yet solved, are solvable: practicable long-term agendas have been laid out for eliminating nuclear weapons and for mitigating climate change.

For all the bleeding headlines, for all the crises, collapses, scandals, plagues, epidemics, and existential threats, these are accomplishments to savor. The Enlightenment is working: for two and a half centuries, people have used knowledge to enhance human flourishing. Scientists have exposed the workings of matter, life, and mind. Inventors have harnessed the laws of nature to defy entropy, and entrepreneurs have made their innovations affordable. Lawmakers have made people better off by discouraging acts that are individually beneficial but collectively harmful. Diplomats have done the same with nations. Scholars have perpetuated the treasury of knowledge and augmented the power of reason. Artists have expanded the circle of sympathy. Activists have pressured the powerful to overturn repressive measures, and their fellow citizens to change repressive norms. All these efforts have been channeled into institutions that have allowed us to circumvent the flaws of human nature and empower our better angels.

At the same time . . .

Seven hundred million people in the world today live in extreme poverty. In the regions where they are concentrated, life expectancy is less than 60, and almost a quarter of the people are undernourished. Almost a million children die of pneumonia every year, half a million from diarrhea or malaria, and hundreds of thousands from measles and AIDS. A dozen wars are raging in the world, including one in which more than 250,000 people have died, and in 2015 at least ten thousand people were slaughtered in genocides. More than two billion people, almost a third of humanity, are oppressed in autocratic states. Almost a fifth of the world’s people lack a basic education; almost a sixth are illiterate.

Progress is not utopia, and there is room—indeed, an imperative—for us to strive to continue that progress.

How reasonable is the hope for continuing progress?

The Scientific Revolution and the Enlightenment set in motion the process of using knowledge to improve the human condition.

Solutions create new problems, which take time to solve in their turn. But when we stand back from these blips and setbacks, we see that the indicators of human progress are cumulative: none is cyclical, with gains reliably canceled by losses.3

The technological advances that have propelled this progress should only gather speed. Stein’s Law continues to obey Davies’s Corollary (Things that can’t go on forever can go on much longer than you think), and genomics, synthetic biology, neuroscience, artificial intelligence, materials science, data science, and evidence-based policy analysis are flourishing.

So too with moral progress. History tells us that barbaric customs can not only be reduced but essentially abolished, lingering at most in a few benighted backwaters.

If economies stop growing, things could get ugly.

As the entrepreneur Peter Thiel lamented, “We wanted flying cars; instead we got 140 characters.”

Whatever its causes, economic stagnation is at the root of many other problems and poses a significant challenge for 21st-century policymakers.

The second decade of the 21st century has seen the rise of a counter-Enlightenment movement called populism, or more accurately, authoritarian populism.24 Populism calls for the direct sovereignty of a country’s “people” (usually an ethnic group, sometimes a class), embodied in a strong leader who directly channels their authentic virtue and experience.

By focusing on the tribe rather than the individual, it has no place for the protection of minority rights or the promotion of human welfare worldwide.

Populism comes in left-wing and right-wing varieties, which share a folk theory of economics as zero-sum competition: between economic classes in the case of the left, between nations or ethnic groups in the case of the right.

Populism looks backward to an age in which the nation was ethnically homogeneous, orthodox cultural and religious values prevailed, and economies were powered by farming and manufacturing, which produced tangible goods for local consumption and for export.

Nothing captures the tribalistic and backward-looking spirit of populism better than Trump’s campaign slogan: Make America Great Again.

Trump’s authoritarian instincts are subjecting the institutions of American democracy to a stress test, but so far those institutions have pushed back on a number of fronts. Cabinet secretaries have publicly repudiated various quips, tweets, and stink bombs; courts have struck down unconstitutional measures; senators and congressmen have defected from his party to vote down destructive legislation; Justice Department and Congressional committees are investigating the administration’s ties to Russia; an FBI chief has publicly called out Trump’s attempt to intimidate him (raising talk about impeachment for obstruction of justice); and his own staff, appalled at what they see, regularly leak compromising facts to the press—all in the first six months of the administration.

Globalization in particular is a tide that is impossible for any ruler to order back.

In France, the new president, Emmanuel Macron, proclaimed that Europe was “waiting for us to defend the spirit of the Enlightenment, threatened in so many places.”

In the American election, voters in the two lowest income brackets voted for Clinton 52–42, as did those who identified “the economy” as the most important issue. A majority of voters in the four highest income brackets voted for Trump, and Trump voters singled out “immigration” and “terrorism,” not “the economy,” as the most important issues.34

As one headline put it, “Education, Not Income, Predicted Who Would Vote for Trump.”35 Why should education have mattered so much? Two uninteresting explanations are that the highly educated happen to affiliate with a liberal political tribe, and that education may be a better long-term predictor of economic security than current income. A more interesting explanation is that education exposes people in young adulthood to other races and cultures in a way that makes it harder to demonize them. Most interesting of all is the likelihood that education, when it does what it is supposed to do, instills a respect for vetted fact and reasoned argument, and so inoculates people against conspiracy theories, reasoning by anecdote, and emotional demagoguery.

Silver found that the regional map of Trump support did not overlap particularly well with the maps of unemployment, religion, gun ownership, or the proportion of immigrants. But it did align with the map of Google searches for the word nigger, which Seth Stephens-Davidowitz has shown is a reliable indicator of racism (chapter 15).36 This doesn’t mean that most Trump supporters are racists. But overt racism shades into resentment and distrust, and the overlap suggests that the regions of the country that gave Trump his Electoral College victory are those with the most resistance to the decades-long process of integration and the promotion of minority interests (particularly racial preferences, which they see as reverse discrimination against them).

Populist voters are older, more religious, more rural, less educated, and more likely to be male and members of the ethnic majority. They embrace authoritarian values, place themselves on the right of the political spectrum, and dislike immigration and global and national governance.39 Brexit voters, too, were older, more rural, and less educated than those who voted to remain: 66 percent of high school graduates voted to leave, but only 29 percent of degree holders did.40

Populism is an old man’s movement.

This raises the possibility that as the Silent Generation and older Baby Boomers shuffle off this mortal coil, they will take authoritarian populism with them.

Since populist movements have achieved an influence beyond their numbers, fixing electoral irregularities such as gerrymandering and forms of disproportionate representation which overweight rural areas (such as the US Electoral College) would help. So would journalistic coverage that tied candidates’ reputations to their record of accuracy and coherence rather than to trivial gaffes and scandals.

I believe that the media and intelligentsia were complicit in populists’ depiction of modern Western nations as so unjust and dysfunctional that nothing short of a radical lurch could improve them.

“I’d rather see the empire burn to the ground under Trump, opening up at least the possibility of radical change, than cruise on autopilot under Clinton,” flamed a left-wing advocate of “the politics of arson.”50

People have a tremendous amount to lose when charismatic authoritarians responding to a “crisis” trample over democratic norms and institutions and command their countries by the force of their personalities.

Such is the nature of progress. Pulling us forward are ingenuity, sympathy, and benign institutions. Pushing us back are the darker sides of human nature and the Second Law of Thermodynamics. Kevin Kelly explains how this dialectic can nonetheless result in forward motion: Ever since the Enlightenment and the invention of science, we’ve managed to create a tiny bit more than we’ve destroyed each year. But that few percent positive difference is compounded over decades into what we might call civilization. . . . [Progress] is a self-cloaking action seen only in retrospect. Which is why I tell people that my great optimism of the future is rooted in history.53

Kelly offers the term “protopia,” with the pro- from progress and process. Others have suggested “pessimistic hopefulness,” “opti-realism,” and “radical incrementalism.”54 My favorite comes from Hans Rosling, who, when asked whether he was an optimist, replied, “I am not an optimist. I’m a very serious possibilist.”55

“The ruling ideas of each age have ever been the ideas of its ruling class.” Karl Marx

REASON

“One can’t criticize something with nothing”:

To begin with, no Enlightenment thinker ever claimed that humans were consistently rational.

What they argued was that we ought to be rational, by learning to repress the fallacies and dogmas that so readily seduce us, and that we can be rational, collectively if not individually, by implementing institutions and adhering to norms that constrain our faculties, including free speech, logical analysis, and empirical testing. And if you disagree, then why should we accept your claim that humans are incapable of rationality?

But real evolutionary psychology treats humans differently: not as two-legged antelopes but as the species that outsmarts antelopes. We are a cognitive species that depends on explanations of the world. Since the world is the way it is regardless of what people believe about it, there is a strong selection pressure for an ability to develop explanations that are true.7

The standard explanation of the madness of crowds is ignorance: a mediocre education system has left the populace scientifically illiterate, at the mercy of their cognitive biases, and thus defenseless against airhead celebrities, cable-news gladiators, and other corruptions from popular culture. The standard solution is better schooling and more outreach to the public by scientists on television, social media, and popular Web sites. As an outreaching scientist I’ve always found this theory appealing, but I’ve come to realize it’s wrong, or at best a small part of the problem.

Kahan concludes that we are all actors in a Tragedy of the Belief Commons: what’s rational for every individual to believe (based on esteem) can be irrational for the society as a whole to act upon (based on reality).17

What’s going on is that these people are sharing blue lies. A white lie is told for the benefit of the hearer; a blue lie is told for the benefit of an in-group (originally, fellow police officers).19 While some of the conspiracy theorists may be genuinely misinformed, most express these beliefs for the purpose of performance rather than truth: they are trying to antagonize liberals and display solidarity with their blood brothers.

Another paradox of rationality is that expertise, brainpower, and conscious reasoning do not, by themselves, guarantee that thinkers will approach the truth. On the contrary, they can be weapons for ever-more-ingenious rationalization. As Benjamin Franklin observed, “So convenient a thing is it to be a rational creature, since it enables us to find or make a reason for everything one has a mind to do.”

Engagement with politics is like sports fandom in another way: people seek and consume news to enhance the fan experience, not to make their opinions more accurate.25 That explains another of Kahan’s findings: the better informed a person is about climate change, the more polarized his or her opinion.

So we can’t blame human irrationality on our lizard brains: it was the sophisticated respondents who were most blinded by their politics. As two other magazines summarized the results: “Science Confirms: Politics Wrecks Your Ability to Do Math” and “How Politics Makes Us Stupid.”29

Of the two forms of politicization that are subverting reason today, the political is far more dangerous than the academic, for an obvious reason.

In 21st-century America, the control of Congress by a Republican Party that became synonymous with the extreme right has been pernicious, because it is so convinced of the righteousness of its cause and the evil of its rivals that it has undermined the institutions of democracy to get what it wants. The corruptions include gerrymandering, imposing voting restrictions designed to disenfranchise Democratic voters, encouraging unregulated donations from moneyed interests, blocking Supreme Court nominations until their party controls the presidency, shutting down the government when their maximal demands aren’t met, and unconditionally supporting Donald Trump over their own objections to his flagrantly antidemocratic impulses.71 Whatever differences in policy or philosophy divide the parties, the mechanisms of democratic deliberation should be sacrosanct. Their erosion, disproportionately by the right, has led many people, including a growing share of young Americans, to see democratic government as inherently dysfunctional and to become cynical about democracy itself.72

What can be done to improve standards of reasoning? Persuasion by facts and logic, the most direct strategy, is not always futile.

When people are first confronted with information that contradicts a staked-out position, they become even more committed to it, as we’d expect from the theories of identity-protective cognition, motivated reasoning, and cognitive dissonance reduction. Feeling their identity threatened, belief holders double down and muster more ammunition to fend off the challenge. But since another part of the human mind keeps a person in touch with reality, as the counterevidence piles up the dissonance can mount until it becomes too much to bear and the opinion topples over, a phenomenon called the affective tipping point.80 The tipping point depends on the balance between how badly the opinion holder’s reputation would be damaged by relinquishing the opinion and whether the counterevidence is so blatant and public as to be common knowledge: a naked emperor, an elephant in the room.81

The reasons are familiar to education researchers.84 Any curriculum will be pedagogically ineffective if it consists of a lecturer yammering in front of a blackboard, or a textbook that students highlight with a yellow marker. People understand concepts only when they are forced to think them through, to discuss them with others, and to use them to solve problems.

(My suggestion that all students should learn about cognitive biases fell deadborn from my lips.)

Effective training in critical thinking and cognitive debiasing may not be enough to cure identity-protective cognition, in which people cling to whatever opinion enhances the glory of their tribe and their status within it.

Experiments have shown that the right rules can avert the Tragedy of the Belief Commons and force people to dissociate their reasoning from their identities.88 One technique was discovered long ago by rabbis: they forced yeshiva students to switch sides in a Talmudic debate and argue the opposite position. Another is to have people try to reach a consensus in a small discussion group; this forces them to defend their opinions to their groupmates, and the truth usually wins.

Most of us are deluded about our degree of understanding of the world, a bias called the Illusion of Explanatory Depth.

Perhaps most important, people are less biased when they have skin in the game and have to live with the consequences of their opinions.

Experiments have shown that when people hear about a new policy, such as welfare reform, they will like it if it is proposed by their own party and hate it if it is proposed by the other—all the while convinced that they are reacting to it on its objective merits.

However long it takes, we must not let the existence of cognitive and emotional biases or the spasms of irrationality in the political arena discourage us from the Enlightenment ideal of relentlessly pursuing reason and truth. If we can identify ways in which humans are irrational, we must know what rationality is. Since there’s nothing special about us, our fellows must have at least some capacity for rationality as well. And it’s in the very nature of rationality that reasoners can always step back, consider their own shortcomings, and reason out ways to work around them.

SCIENCE

That gravity is the curvature of space-time, and that life depends on a molecule that carries information, directs metabolism, and replicates itself.

But the scorn for scientific consensus has widened into a broadband know-nothingness.

Positivism depends on the reductionist belief that the entire universe, including all human conduct, can be explained with reference to precisely measurable, deterministic physical processes. . . . Positivist assumptions provided the epistemological foundations for Social Darwinism and pop-evolutionary notions of progress, as well as for scientific racism and imperialism. These tendencies coalesced in eugenics, the doctrine that human well-being could be improved and eventually perfected through the selective breeding of the “fit” and the sterilization or elimination of the “unfit.”

An endorsement of scientific thinking must first of all be distinguished from any belief that members of the occupational guild called “science” are particularly wise or noble. The culture of science is based on the opposite belief. Its signature practices, including open debate, peer review, and double-blind methods, are designed to circumvent the sins to which scientists, being human, are vulnerable. As Richard Feynman put it, the first principle of science is “that you must not fool yourself—and you are the easiest person to fool.”

The lifeblood of science is the cycle of conjecture and refutation: proposing a hypothesis and then seeing whether it survives attempts to falsify it.

The fallacy (putting aside the apocryphal history) is a failure to recognize that what science allows is an increasing confidence in a hypothesis as the evidence accumulates, not a claim to infallibility on the first try.

As Wieseltier puts it, “It is not for science to say whether science belongs in morality and politics and art. Those are philosophical matters, and science is not philosophy.”

Today most philosophers (at least in the analytic or Anglo-American tradition) subscribe to naturalism, the position that “reality is exhausted by nature, containing nothing ‘supernatural,’ and that the scientific method should be used to investigate all areas of reality, including the ‘human spirit.’”17 Science, in the modern conception, is of a piece with philosophy and with reason itself.

The world is intelligible.

In making sense of our world, there should be few occasions on which we are forced to concede, “It just is” or “It’s magic” or “Because I said so.”

Many people are willing to credit science with giving us handy drugs and gadgets and even with explaining how physical stuff works. But they draw the line at what truly matters to us as human beings: the deep questions about who we are, where we came from, and how we define the meaning and purpose of our lives. That is the traditional territory of religion, and its defenders tend to be the most excitable critics of scientism. They are apt to endorse the partition plan proposed by the paleontologist and science writer Stephen Jay Gould in his book Rocks of Ages, according to which the proper concerns of science and religion belong to “non-overlapping magisteria.” Science gets the empirical universe; religion gets the questions of morality, meaning, and value.

The moral worldview of any scientifically literate person—one who is not blinkered by fundamentalism—requires a clean break from religious conceptions of meaning and value.

To begin with, the findings of science imply that the belief systems of all the world’s traditional religions and cultures—their theories of the genesis of the world, life, humans, and societies—are factually mistaken. We know, but our ancestors did not, that humans belong to a single species of African primate that developed agriculture, government, and writing late in its history. We know that our species is a tiny twig of a genealogical tree that embraces all living things and that emerged from prebiotic chemicals almost four billion years ago. We know that we live on a planet that revolves around one of a hundred billion stars in our galaxy, which is one of a hundred billion galaxies in a 13.8-billion-year-old universe, possibly one of a vast number of universes. We know that our intuitions about space, time, matter, and causation are incommensurable with the nature of reality on scales that are very large and very small. We know that the laws governing the physical world (including accidents, disease, and other misfortunes) have no goals that pertain to human well-being. There is no such thing as fate, providence, karma, spells, curses, augury, divine retribution, or answered prayers—though the discrepancy between the laws of probability and the workings of cognition may explain why people believe there are. And we know that we did not always know these things, that the beloved convictions of every time and culture may be decisively falsified, doubtless including many we hold today.

What happens to those who are taught that science is just another narrative like religion and myth, that it lurches from revolution to revolution without making progress, and that it is a rationalization of racism, sexism, and genocide?

Ultimately the greatest payoff of instilling an appreciation of science is for everyone to think more scientifically.

Three-quarters of the nonviolent resistance movements succeeded, compared with only a third of the violent ones.50 Gandhi and King were right, but without data, you would never know it.

HUMANISM

The goal of maximizing human flourishing—life, health, happiness, freedom, knowledge, love, richness of experience—may be called humanism.

It is humanism that identifies what we should try to achieve with our knowledge. It provides the ought that supplements the is. It distinguishes true progress from mere mastery.

Some Eastern religions, including Confucianism and varieties of Buddhism, always grounded their ethics in human welfare rather than divine dictates.

First, any Moral Philosophy student who stayed awake through week 2 of the syllabus can also rattle off the problems with deontological ethics. If lying is intrinsically wrong, must we answer truthfully when the Gestapo demand to know the whereabouts of Anne Frank?

If a terrorist has hidden a ticking nuclear bomb that would annihilate millions, is it immoral to waterboard him into revealing its location? And given the absence of a thundering voice from the heavens, who gets to pull principles out of the air and pronounce that certain acts are inherently immoral even if they hurt no one?

A viable moral philosophy for a cosmopolitan world cannot be constructed from layers of intricate argumentation or rest on deep metaphysical or religious convictions. It must draw on simple, transparent principles that everyone can understand and agree upon. The ideal of human flourishing—that it’s good for people to lead long, healthy, happy, rich, and stimulating lives—is just such a principle, since it is based on nothing more (and nothing less) than our common humanity.

Our universe can be specified by a few numbers, including the strengths of the forces of nature (gravity, electromagnetism, and the nuclear forces), the number of macroscopic dimensions of space-time (four), and the density of dark energy (the source of the acceleration of the expansion of the universe). In Just Six Numbers, Martin Rees enumerates them on one hand and a finger; the exact tally depends on which version of physical theory one invokes and on whether one counts the constants themselves or ratios between them. If any of these constants were off by a minuscule iota, then matter would fly apart or collapse upon itself, and stars, galaxies, and planets, to say nothing of terrestrial life and Homo sapiens, could never have formed.

If the factual tenets of religion can no longer be taken seriously, and its ethical tenets depend entirely on whether they can be justified by secular morality, what about its claims to wisdom on the great questions of existence? A favorite talking point of faitheists is that only religion can speak to the deepest yearnings of the human heart. Science will never be adequate to address the great existential questions of life, death, love, loneliness, loss, honor, cosmic justice, and metaphysical hope.

To begin with, the alternative to “religion” as a source of meaning is not “science.” No one ever suggested that we look to ichthyology or nephrology for enlightenment on how to live, but rather to the entire fabric of human knowledge, reason, and humanistic values, of which science is a part. It’s true that the fabric contains important strands that originated in religion, such as the language and allegories of the Bible and the writings of sages, scholars, and rabbis. But today it is dominated by secular content, including debates on ethics originating in Greek and Enlightenment philosophy, and renderings of love, loss, and loneliness in the works of Shakespeare, the Romantic poets, the 19th-century novelists, and other great artists and essayists. Judged by universal standards, many of the religious contributions to life’s great questions turn out to be not deep and timeless but shallow and archaic, such as a conception of “justice” that includes punishing blasphemers, or a conception of “love” that adjures a woman to obey her husband.

A “spirituality” that sees cosmic meaning in the whims of fortune is not wise but foolish. The first step toward wisdom is the realization that the laws of the universe don’t care about you. The next is the realization that this does not imply that life is meaningless, because people care about you, and vice versa. You care about yourself, and you have a responsibility to respect the laws of the universe that keep you alive, so you don’t squander your existence. Your loved ones care about you, and you have a responsibility not to orphan your children, widow your spouse, and shatter your parents. And anyone with a humanistic sensibility cares about you, not in the sense of feeling your pain—human empathy is too feeble to spread itself across billions of strangers—but in the sense of realizing that your existence is cosmically no less important than theirs, and that we all have a responsibility to use the laws of the universe to enhance the conditions in which we all can flourish.

It would not be fanciful to say that over the course of the 20th century the global rate of atheism increased by a factor of 500, and that it has doubled again so far in the 21st. An additional 23 percent of the world’s population identify themselves as “not a religious person,” leaving 59 percent of the world as “religious,” down from close to 100 percent a century before.

According to the Secularization Thesis, irreligion is a natural consequence of affluence and education.66 Recent studies confirm that wealthier and better-educated countries tend to be less religious.

Why is the world losing its religion? There are several reasons.80 The Communist governments of the 20th century outlawed or discouraged religion, and when they liberalized, their citizenries were slow to reacquire the taste. Some of the alienation is part of a decline in trust in all institutions from its high-water mark in the 1960s.81 Some of it is carried by the global current toward emancipative values (chapter 15) such as women’s rights, reproductive freedom, and tolerance of homosexuality.82 Also, as people’s lives become more secure thanks to affluence, medical care, and social insurance, they no longer pray to God to save them from ruin: countries with stronger safety nets are less religious, holding other factors constant.83 But the most obvious reason may be reason itself: when people become more intellectually curious and scientifically literate, they stop believing in miracles.

No discussion of global progress can ignore the Islamic world, which by a number of objective measures appears to be sitting out the progress enjoyed by the rest. Muslim-majority countries score poorly on measures of health, education, freedom, happiness, and democracy, holding wealth constant.90 All of the wars raging in 2016 took place in Muslim-majority countries or involved Islamist groups, and those groups were responsible for the vast majority of terrorist attacks.

Still others were exacerbated by clumsy Western interventions in the Middle East, including the dismemberment of the Ottoman Empire, support of the anti-Soviet mujahedin in Afghanistan, and the invasion of Iraq.

But part of the resistance to the tide of progress can be attributed to religious belief. The problem begins with the fact that many of the precepts of Islamic doctrine, taken literally, are floridly antihumanistic. The Quran contains scores of passages that express hatred of infidels, the reality of martyrdom, and the sacredness of armed jihad.

Of course many of the passages in the Bible are floridly antihumanistic too. One needn’t debate which is worse; what matters is how literally the adherents take them.

“Self-identifying as a Muslim, regardless of the particular branch of Islam, seems to be almost synonymous with being strongly religious.”94

Between 50 and 93 percent believe that the Quran “should be read literally, word by word,” and “overwhelming percentages of Muslims in many countries want Islamic law (sharia) to be the official law of the land.”

All these troubling patterns were once true of Christendom, but starting with the Enlightenment, the West initiated a process (still ongoing) of separating the church from the state, carving out a space for secular civil society, and grounding its institutions in a universal humanistic ethics. In most Muslim-majority countries, that process is barely under way.

Making things worse is a reactionary ideology that became influential through the writings of the Egyptian author Sayyid Qutb (1906–1966), a member of the Muslim Brotherhood and the inspiration for Al Qaeda and other Islamist movements.100 The ideology looks back to the glory days of the Prophet, the first caliphs, and classical Arab civilization, and laments subsequent centuries of humiliation at the hands of Crusaders, horse tribes, European colonizers, and, most recently, insidious secular modernizers.

On this fatalistic view, the West might enjoy the peace, prosperity, education, and happiness of post-Enlightenment societies, but Muslims will never accept this shallow hedonism, and it’s only understandable that they should cling to a system of medieval beliefs and customs forever.

Tunisia, Bangladesh, Malaysia, and Indonesia have made long strides toward liberal democracy (chapter 14). In many Islamic countries, attitudes toward women and minorities are improving (chapter 15)—slowly, but more detectably among women, the young, and the educated.

Let me turn to the second enemy of humanism, the ideology behind resurgent authoritarianism, nationalism, populism, reactionary thinking, even fascism. As with theistic morality, the ideology claims intellectual merit, affinity with human nature, and historical inevitability. All three claims, we shall see, are mistaken.

For a thinker who represented the opposite of humanism (indeed, of pretty much every argument in this book), one couldn’t do better than the German philologist Friedrich Nietzsche (1844–1900).109 Earlier in the chapter I fretted about how humanistic morality could deal with a callous, egoistic, megalomaniacal sociopath. Nietzsche argued that it’s good to be a callous, egoistic, megalomaniacal sociopath. Not good for everyone, of course, but that doesn’t matter: the lives of the mass of humanity (the “botched and the bungled,” the “chattering dwarves,” the “flea-beetles”) count for nothing. What is worthy in life is for a superman (Übermensch, literally “overman”) to transcend good and evil, exert a will to power, and achieve heroic glory. Only through such heroism can the potential of the species be realized and humankind lifted to a higher plane of being.

Western civilization has gone steadily downhill since the heyday of Homeric Greeks, Aryan warriors, helmeted Vikings, and other manly men. It has been especially corrupted by the “slave morality” of Christianity, the worship of reason by the Enlightenment, and the liberal movements of the 19th century that sought social reform and shared prosperity. Such effete sentimentality led only to decadence and degeneration.

Man shall be trained for war and woman for the recreation of the warrior. All else is folly. . . . Thou goest to woman? Do not forget thy whip.

A declaration of war on the masses by higher men is needed. . . . A doctrine is needed powerful enough to work as a breeding agent: strengthening the strong, paralyzing and destructive for the world-weary. The annihilation of the humbug called “morality.” . . . The annihilation of the decaying races. . . . Dominion over the earth as a means of producing a higher type.

Most obviously, Nietzsche helped inspire the romantic militarism that led to the First World War and the fascism that led to the Second. Though Nietzsche himself was neither a German nationalist nor an anti-Semite, it’s no coincidence that these quotations leap off the page as quintessential Nazism: Nietzsche posthumously became the Nazis’ court philosopher. (In his first year as chancellor, Hitler made a pilgrimage to the Nietzsche Archive, presided over by Elisabeth Förster-Nietzsche, the philosopher’s sister and literary executor, who tirelessly encouraged the connection.) The link to Italian Fascism is even more direct: Benito Mussolini wrote in 1921 that “the moment relativism linked up with Nietzsche, and with his Will to Power, was when Italian Fascism became, as it still is, the most magnificent creation of an individual and a national Will to Power.”

The connections between Nietzsche’s ideas and the megadeath movements of the 20th century are obvious enough: a glorification of violence and power, an eagerness to raze the institutions of liberal democracy, a contempt for most of humanity, and a stone-hearted indifference to human life.

As Bertrand Russell pointed out in A History of Western Philosophy, they “might be stated more simply and honestly in the one sentence: ‘I wish I had lived in the Athens of Pericles or the Florence of the Medici.’” The ideas fail the first test of moral . . .

Though she later tried to conceal it, Ayn Rand’s celebration of selfishness, her deification of the heroic capitalist, and her disdain for the general welfare had Nietzsche written all over them.113

Disdaining the commitment to truth-seeking among scientists and Enlightenment thinkers, Nietzsche asserted that “there are no facts, only interpretations,” and that “truth is a kind of error without which a certain species of life could not live.”

Nietzsche was a godfather to all the intellectual movements of the 20th century that were hostile to science and objectivity, including Existentialism, Critical Theory, Poststructuralism, Deconstructionism, and Postmodernism.

A surprising number of 20th-century intellectuals and artists have gushed over totalitarian dictators, a syndrome that the intellectual historian Mark Lilla calls tyrannophilia.115 Some tyrannophiles were Marxists, working on the time-honored principle “He may be an SOB, but he’s our SOB.”

Professional narcissism. Intellectuals and artists may feel unappreciated in liberal democracies, which allow their citizens to tend to their own needs in markets and civic organizations. Dictators implement theories from the top down, assigning a role to intellectuals that they feel is commensurate with their worth. But tyrannophilia is also fed by a Nietzschean disdain for the common man, who annoyingly prefers schlock to fine art and culture, and by an admiration of the superman who transcends the messy compromises of democracy and heroically implements a vision of the good society.

And Trump has been closely advised by two men, Stephen Bannon and Michael Anton, who are reputed to be widely read and who consider themselves serious intellectuals. Anyone who wants to go beyond personality in understanding authoritarian populism must appreciate the two ideologies behind them, both of them militantly opposed to Enlightenment humanism and each influenced, in different ways, by Nietzsche. One is fascist, the other reactionary—not in the common left-wing sense of “anyone who is more conservative than me,” but in their original, technical senses.118

The early fascist intellectuals, including Julius Evola (1898–1974) and Charles Maurras (1868–1952), have been rediscovered by neo-Nazi parties in Europe and by Bannon and the alt-right movement in the United States, all of whom acknowledge the influence of Nietzsche.

A multicultural, multiethnic society can never work, because its people will feel rootless and alienated and its culture will be flattened to the lowest common denominator. For a nation to subordinate its interests to international agreements is to forfeit its birthright to greatness and become a chump in the global competition of all against all. And since a nation is an organic whole, its greatness can be embodied in the greatness of its leader, who voices the soul of the people directly, unencumbered by the millstone of an administrative state.

The first theocons were 1960s radicals who redirected their revolutionary fervor from the hard left to the hard right. They advocate nothing less than a rethinking of the Enlightenment roots of the American political order. The recognition of a right to life, liberty, and the pursuit of happiness, and the mandate of government to secure these rights, are, they believe, too tepid for a morally viable society. That impoverished vision has only led to anomie, hedonism, and rampant immorality, including illegitimacy, pornography, failing schools, welfare dependency, and abortion. Society should aim higher than this stunted individualism, and promote conformity to more rigorous moral standards from an authority larger than ourselves. The obvious source of these standards is traditional Christianity.

Theocons hold that the erosion of the church’s authority during the Enlightenment left Western civilization without a solid moral foundation, and a further undermining during the 1960s left it teetering on the brink.

Lilla points out an irony in theoconservatism. While it has been inflamed by radical Islamism (which the theocons think will soon start World War III), the movements are similar in their reactionary mindset, with its horror of modernity and progress.124 Both believe that at some time in the past there was a happy, well-ordered state where a virtuous people knew their place. Then alien secular forces subverted this harmony and brought on decadence and degeneration. Only a heroic vanguard with memories of the old ways can restore the society to its golden age.

First, the claim that humans have an innate imperative to identify with a nation-state (with the implication that cosmopolitanism goes against human nature) is bad evolutionary psychology. Like the supposed innate imperative to belong to a religion, it confuses a vulnerability with a need. People undoubtedly feel solidarity with their tribe, but whatever intuition of “tribe” we are born with cannot be a nation-state, which is a historical artifact of the 1648 Treaties of Westphalia. (Nor could it be a race, since our evolutionary ancestors seldom met a person of another race.) In reality, the cognitive category of a tribe, in-group, or coalition is abstract and multidimensional.126 People see themselves as belonging to many overlapping tribes: their clan, hometown, native country, adopted country, religion, ethnic group, alma mater, fraternity or sorority, political party, employer, service organization, sports team, even brand of camera equipment. (If you want to see tribalism at its fiercest, check out a “Nikon vs. Canon” Internet discussion group.)

It’s true that political salesmen can market a mythology and iconography that entice people into privileging a religion, ethnicity, or nation as their fundamental identity. With the right package of indoctrination and coercion, they can even turn them into cannon fodder.127 That does not mean that nationalism is a human drive. Nothing in human nature prevents a person from being a proud Frenchman, European, and citizen of the world, all at the same time.128

Vibrant cultures sit in vast catchment areas in which people and innovations flow from far and wide. This explains why Eurasia, rather than Australia, Africa, or the Americas, was the first continent to give birth to expansive civilizations (as documented by Thomas Sowell in his Culture trilogy and Jared Diamond in Guns, Germs, and Steel).129 It explains why the fountains of culture have always been trading cities on major crossroads and waterways.

Between 1803 and 1945, the world tried an international order based on nation-states heroically struggling for greatness. It didn’t turn out so well.

After 1945 the world’s leaders said, “Well, let’s not do that again,” and began to downplay nationalism in favor of universal human rights, international laws, and transnational organizations. The result, as we saw in chapter 11, has been seventy years of peace and prosperity in Europe and, increasingly, the rest of the world.

The European elections and self-destructive flailing of the Trump administration in 2017 suggest that the world may have reached Peak Populism, and as we saw in chapter 20, the movement is on a demographic road to nowhere.

Still, the appeal of regressive ideas is perennial, and the case for reason, science, humanism, and progress always has to be made.

Remember your math: an anecdote is not a trend. Remember your history: the fact that something is bad today doesn’t mean it was better in the past. Remember your philosophy: one cannot reason that there’s no such thing as reason, or that something is true or good because God said it is. And remember your psychology: much of what we know isn’t so, especially when our comrades know it too.

Keep some perspective. Not every problem is a Crisis, Plague, Epidemic, or Existential Threat, and not every change is the End of This, the Death of That, or the Dawn of a Post-Something Era. Don’t confuse pessimism with profundity: problems are inevitable, but problems are solvable, and diagnosing every setback as a symptom of a sick society is a cheap grab for gravitas. Finally, drop the Nietzsche.