CHARLES MURRAY: DOES AMERICA STILL HAVE WHAT IT TAKES?

Why the American spirit of innovation is in trouble, and what culture has to do with it.

By Charles Murray

Some years ago, I conducted an ambitious research project to document and explain patterns of human accomplishment across time and cultures. My research took me from 800 BCE, when Homo sapiens’ first great surviving works of thought appeared, to 1950, my cut-off date for assessing lasting influence. I assembled world-wide inventories of achievements in physics, biology, chemistry, geology, astronomy, mathematics, medicine, and technology, plus separate inventories of Western, Chinese, and Indian philosophy; Western, Chinese, and Japanese art; Western, Arabic, Chinese, Indian, and Japanese literature; and Western music. These inventories were analyzed using quantitative techniques alongside standard qualitative historical analysis. The result was Human Accomplishment: The Pursuit of Excellence in the Arts and Sciences (2003).

 

My study confirmed important patterns. Foremost among them is that human achievement has clustered at particular times and places, including Periclean Athens, Renaissance Florence, Sung China, and Western Europe of the Enlightenment and the Industrial Revolution. But why? What was special about those times and places? In the book’s final chapters, I laid out my best understanding of the environment within which great accomplishment occurs.

 

In what follows, I want to conduct an inquiry into the ways in which the environment of achievement in early 21st-century America corresponds or fails to correspond to the patterns of the past. As against pivotal moments in the story of human accomplishment, does today’s America, for instance, look more like Britain blooming at the end of the 18th century or like France fading at the end of the 19th century? If the latter, are there idiosyncratic features of the American situation that can override what seem to be longer-run tendencies?

 

To guide the discussion, I’ll provide a running synopsis, in language drawn from Human Accomplishment, of the core conditions that prevailed during the glorious periods of past achievement. I’ll focus in particular on science and technology, since these are the fields that preoccupy our contemporary debates over the present course and future prospects of American innovation.

 

 

 

 

1. Wealth, Cities, Politics

 

I begin with enabling conditions. They don’t explain how the fires of innovative periods are ignited—we’ll come to that later—but they help explain how those fires are sustained.

 

  • Accomplishment in the sciences and technology is facilitated by growing national wealth, both through the additional resources that can support those endeavors and through the indirect, spillover effect of economic vitality on cultural vitality.

 

What is the relation between innovation and economic growth? The standard account assumes that the former is a cause and the latter is an effect. To judge from past accomplishment in fields other than technology, however, the causal arrow points in the other direction as well. Growing wealth encouraged a competitive art market in Renaissance Florence, providing incentives for the young and talented to enter the field. Growing wealth in 18th-century Europe enabled patrons to support the work of the great Baroque and classical composers. Similarly with technological innovation: growing wealth is not only caused by it but helps to finance the pure and applied research that leads to it.

 

Growing national wealth also appears to have a more diffuse but important effect: encouraging the cultural optimism and vibrancy that accompany significant achievement. With only one conspicuous exception—Athens in the fourth century BCE, which endured a variety of catastrophes as it produced great philosophy and literature—accomplishment of all sorts flourishes in a context of prosperity.

 

In assessing contemporary America’s situation from this angle, the big unanswered question is whether the upward growth curve that has characterized the nation’s history will continue or whether our present low-growth mode is a sign of creeping economic senescence. It is too soon to say, but if the latter proves to be the case, innovation can be expected to diminish. No society has ever been economically sluggish and remained at the forefront of technological innovation.

 

  • Streams of accomplishment become self-reinforcing as new scientists and innovators build on the models before them.

 

Statistically, one of the strongest predictors of creativity in a given generation is the number of important creative figures in the two preceding generations. By itself, the correlation tells us only that periods of creativity tend to last longer than two generations. The reasons are unknown, but one specific causal factor has been noted by writers going all the way back to the Roman historian Velleius Paterculus in the first century CE. Explaining the improbable concentration of great accomplishment in Periclean Athens, Paterculus observed that “genius is fostered by emulation, and it is now envy, now admiration, which enkindles imitation.” In the modern era, the psychologist Dean Simonton has documented the reality underlying Paterculus’s assertion: a Titian is more likely to appear in the 1520s if Michelangelo and Leonardo were being lionized in the 1500s; a James Clerk Maxwell is more likely to turn his mathematical abilities to physics in the 1850s if Michael Faraday was a national hero in the 1840s.

 

By this standard, American culture would seem to be going downhill. It’s likely that individuals within most technological industries still have heroes, unknown to the public at large, who serve as models. People within the microchip industry know about Jack Kilby, Robert Noyce, and Gordon Moore; people within the energy-development industry know about George Mitchell. But such local fame is not what inspires members of one generation to emulate members of the preceding generation or generations.

 

In part, the declining visibility of outsized individuals reflects the increasingly corporate nature of technological innovation itself. Insiders may be aware of the steps that led to the creation of the modern microchip or the development of slickwater fracturing, but those steps have no counterpart to the moments when Samuel Morse telegraphed “What hath God wrought” and Alexander Graham Bell said “Mr. Watson, come here,” or to the day when Thomas Edison watched an incandescent bulb with a carbon filament burn for 13.5 hours after hundreds of other filaments had failed. Even Steve Jobs and Bill Gates, the most famous people involved in the development of the personal computer, didn’t actually invent anything themselves.

 

In part, too, the decline I’m tracing here reflects a larger cultural shift. In America, inventors once loomed large in the popular imagination. In the classroom, schoolchildren throughout the 19th and early 20th centuries grew up on the stories of Bell and Morse and Edison, of Eli Whitney, Robert Fulton, the Wright brothers, Henry Ford, and more—as well as on stories of awe-inspiring technological achievements like the building of the transcontinental railway and the Panama Canal. Popular fiction celebrated inventors and scientists—Sinclair Lewis’s Arrowsmith provoked a surge of interest among young people in becoming medical researchers—and Hollywood made movies about them. There are still occasional exceptions (the movies Apollo 13 and The Social Network come to mind), but they are rare. The genre is out of fashion, as is the ethos that supported it.

 

  • Streams of accomplishment are fostered by the existence of cities that serve as centers of human capital and that supply audiences and patrons for the arts and sciences.

 

Here is one enabling condition for accomplishment that today’s America obviously meets, and has met since the latter half of the 19th century. In Silicon Valley, we see a new kind of critical mass of human capital, centered not in a city but in a geographical area and producing spectacular innovation. The Internet and all its ancillary effects have created a new way to achieve such critical masses: today, the world’s dozen top researchers in one or another arcane topic are likely to be in continual contact, trading drafts of papers, exchanging results, arguing and inspiring each other as effectively as if they were working in the same laboratory.

 

When I formulated this enabling condition for my 2003 book, I was still working from traditional models of centers of human capital. Today the formulation can usefully be rephrased to embrace the great leap forward fueled by the revolution in information technology (IT).

 

  • Streams of accomplishment are fostered by political regimes that give de-facto freedom of action to their potential scientists and inventors.

 

All great scientific and technological accomplishments prior to the 19th century occurred in societies that we would consider unfree by today’s standards. Most took place in autocracies. What this suggests is that the crucial factor is not freedom in the political and legal sense but de-facto freedom of action. The latter was provided by most nations of Western Europe from the 15th century onward.

 

By this measure, too, America has been sliding downward. Technically, its innovators of tomorrow still live in a free society. But the environment within which they operate is increasingly subject to two large and growing constraints.

 

The first comprises the regulatory regimes at the federal, state, and local levels that make it impossible or financially unfeasible to implement innovation in many sectors of the economy. The second is the American system of tort law, with its rules governing class-action suits and punitive damages. These render many kinds of innovation vulnerable to ruinous liability if attorneys can convince a jury that a product is to blame for anything that goes wrong.

 

The combination of health, safety, and environmental regulation with tort liability bears down heavily on the risks presented by new products and procedures—even when those risks are measurably fewer and smaller than those presented by the products and procedures they replace. The result is to instill in potential innovators a version of the precautionary principle attributed to Thomas Schelling: “Never do anything for the first time.”

 

It is impossible to say with certainty where such fields as medical technology, pharmaceuticals, or cheap energy would be today in the absence of these constraints, but the phenomenal innovation we have seen in IT may afford a contrasting lesson. No one in the federal regulatory jungle has been able to devise an excuse for regulating the number of transistors that a microchip may contain; nor have class-action lawyers been able to make a case that Google’s zillions of linked servers cause cancer. The nature of information technology has left it uniquely free among growing American industries of the last few decades—and the results speak for themselves.

 

Constraints on de-facto freedom of action may well constitute the single most decisive factor in impeding American technological innovation over the last half-century. It is a sad commentary, but true: for people who want to shake up the world by building a better mousetrap, when it comes to most kinds of mousetraps, America is no longer all that free.

 

 

 

 

2. Raw Materials   

 

  • The magnitude and the content of a stream of accomplishment in a given domain vary according to the richness and age of the organizing structure.

 

Part of what determines the rate of innovation in science and technology is the abundance, or lack of it, of raw material. A good way to think about this is through the idea of organizing structures.

 

Imagine you are a young painter at the beginning of the 15th century, practicing your art as painters have been doing since time out of mind. Then, in 1413, in the piazza in front of Florence’s Baptistery, Filippo Brunelleschi unveils the techniques of linear perspective. Now you are able to portray a three-dimensional world on a two-dimensional canvas with unprecedented fidelity. Your amount of raw material has suddenly and hugely expanded.

 

Linear perspective is an example of an organizing structure, one whose result in history was an outpouring of great art. Similar organizing structures include polyphony and tonal harmony in music, which opened up vast new raw material for composers, and the novel, which did the same for writers. In today’s world, the graphical user interface with windows and mice that we now take for granted, developed at Xerox PARC in the 1970s, constitutes another such organizing structure, one that afforded huge new possibilities to programmers.

 

The degree of creativity triggered by an organizing structure can be measured in two dimensions. One is the structure’s inherent richness. Both checkers and chess enjoy organizing structures, but chess’s is much the richer, making the potential for accomplishment in that game commensurately greater. Something similar may be said of the sonnet versus the novel: many beautiful sonnets have been written, but the organizing structure of that form is much more restrictive than the novel’s.

 

The second dimension is the structure’s age. However rich they may be, organizing structures do grow old. In the arts, talented creators in each generation want to do new things; although the form of the classical symphony may well have room for more great works to appear, young composers want to try something else. In science, the aging process works differently. The discovery of E=mc² can happen only once. Sooner or later, each scientific discipline not only ages, it “fills up.”

 

Some disciplines—human anatomy and the geography of the earth, for example—are for practical purposes completely filled up. Others are in an advanced stage, like a jigsaw puzzle of a landscape missing only the sky: more remains to be done, but doing it is not going to change materially the state of knowledge. For example, it is believed that many varieties of insects have yet to be discovered, but it is unlikely that their discovery will change entomologists’ understanding of insects.

 

Something analogous has happened to innovation in the tasks of everyday life. The technology for the easing of everyday life is not an organizing structure, strictly defined. Think of it rather as a large bin. At the beginning of the 18th century, the bin was almost empty. Cooking food was laborious; so was cleaning clothes; so was keeping the house warm. Would people have preferred not having to carry all their water to the house from a well or to chop wood for the fireplace? If asked, they would have said yes, of course. They might not have been able to envision how these chores could be eliminated, but they knew they were defects in daily life. There were many such defects, on which inventive minds might exercise their inventiveness.

 

The 18th century saw the rapid spread of the Franklin stove (invented in 1741) for heating rooms, an early version of the kitchen stove (1780), and the flush toilet with ball-valve and siphon (1778). The next hundred years saw both the stove and the toilet spread throughout middle-class society, along with gas lighting, running water, and sewage systems in cities; then came electric lighting, central heating, hot water on demand in bathroom and kitchen, the carpet sweeper, sewing machine, canned foods, and iceboxes using commercially manufactured ice. The first half of the 20th century witnessed the advent of the washing machine and dryer, the refrigerator, dishwasher, home freezer, garbage disposal, vacuum cleaner, and frozen foods. The first commercial microwave oven went on sale in 1947.

 

By 1950, revolutionary innovation in the technology of daily life was over. Since then, we have seen refinements. Microwave ovens didn’t become affordable and widespread until the 1970s. The range and quality of precooked foods has expanded by orders of magnitude. We now have Cuisinarts, bread makers, electric pasta machines, and a dozen other kitchen gadgets that didn’t exist in 1950. We have sybaritic options for our bathtubs and showers that were invented after 1950. But the effects of these improvements on daily life, compared with, say, the effect of not having and then having indoor plumbing, are trivial. The bin has largely been filled up.

 

A similar story could be told of transportation. As of 1700, we could move on land no more rapidly than a horse could gallop. At sea, a transatlantic voyage took several weeks. By 1960, more than a half-century ago, trains, cars, ships, and airplanes had reached their present speeds and levels of comfort, with only minor changes—the bullet train, still not available in the U.S., and the short-lived supersonic Concorde plane—since then. Supersonic international travel will return at some point in the future, and not many years from now cars will drive themselves; but these, too, are attractive enhancements of a bin that has been largely filled up—at least until the time Star Trek’s transporter becomes a reality.

 

But there is an important exception: both in scientific knowledge and in technological innovation, some of the remaining gaps are huge. In pure science, dark matter is still a mystery; unlocking that mystery is sure to be a landmark event in the history of human knowledge. The reality of “quantum entanglement” is now accepted, but it involves some sort of unexplained instantaneous effect: not just faster than the speed of light but instantaneous at unlimited distances. Who knows what our eventual understanding of that phenomenon will lead to?

 

In technology, the obvious example of a bin that is not yet close to being filled up is the one produced by the IT revolution. Those who compare the effects of that revolution with the effects of the industrial revolution are not being hyperbolic. Thanks to the rich organizing structures of the microchip, the graphical user interface, and the Internet, we are probably nearing an apogee of innovation in this field that has few parallels in human history. The potential effects are so open-ended that a whole movement—the “singularity” movement, associated most prominently with the names of Ray Kurzweil and Vernor Vinge—has come into being to predict and, some fear, control how humanity itself will be transformed. Technical advances in genomics and genetics offer the prospect of other sweeping revolutions with no less ambiguous effects.

 

In sum, we are living at a time when, because of inherent constraints, scientific accomplishment and technological innovation are declining in some fields. Simultaneously, an explosion of innovation is taking place in others, where the state of knowledge still exhibits important gaps and the potential for advance remains rich. We should not be surprised to see uneven rates of innovation across fields. Some of the unevenness may be attributable to features of American culture or politics, the subject of the next section; some may simply reflect the workings of organizing structures.

 

 

 

 

3. The Need for Purpose and Autonomy

 

Enabling conditions help explain how periods of innovation continue. Organizing structures help explain the magnitude and the content of the innovations themselves. But neither category explains how the fires of innovation are ignited, or why they die out. In Human Accomplishment, I proposed two places to look for an answer: first, the sources of personal energy that impel potential innovators to realize their potential; second, the characteristics of the milieu in which they grow up.

 

I’ll begin with the sources of personal energy, which also come in a pair: purpose and autonomy. The two are closely intertwined.

 

  • A major stream of human accomplishment is fostered by a culture in which the most talented people believe that life has a purpose and that the function of life is to fulfill that purpose.
  • A major stream of human accomplishment is fostered by a culture that encourages the belief that individuals can act efficaciously as individuals, and encourages them to do so.

 

In science and technology, people with a strong sense of “This is what I have been put on earth to do”—people who have a sense of vocation—are more likely to try to accomplish great things than are equally talented people who don’t have that sense. The reason is self-evident. People who choose these fields because they see them as the way to fulfill their destiny also tend to set their sights on ambitious, even grandiose goals. People who go into science or technology for the paycheck are less likely to do that.

 

People with a sense of vocation are also more likely to succeed in achieving the great goals they set for themselves. Suppose a talented scientist without a sense of vocation is assigned to an intellectually arduous task. He is less likely than a person with a sense of vocation to come up with the breakthrough, because the overriding reality about great accomplishment is that it almost always requires incredibly hard work. No other finding emerges so consistently from studies of the lives of the great figures in the sciences, arts, business, and academia. Fame can occur overnight, and is not necessarily connected either with merit or with hard work. Prodigious raw talent occasionally produces an isolated gem. But the highest forms of achievement virtually always require a long apprenticeship, persistence in the face of setbacks, single-mindedness (often obsessive), and brutally long hours. As an anonymous Greek poet put it: “Before the gates of excellence the high gods have placed sweat.”

 

In addition to believing that life has a purpose, people also need to believe that they have the power and even the responsibility to fulfill that purpose through their own independent acts: to believe that they are autonomous, efficacious individuals. To see the role that autonomy plays, consider the case of classical China.

 

China has always enjoyed an intellectually talented, industrious population, and historically China’s stock of human capital led to a sophisticated civilization that in many ways was more advanced than the West’s until halfway through the second millennium CE. But in China, one’s role in life was defined in terms of one’s obligations to family, especially to parents. The eighteen-year-old in classical China who set out to follow his star in defiance of his parents’ wishes was not merely headstrong or willful, as he might have been seen in the West; he was behaving so very, very badly that no Chinese son of good character would consider such a course.

 

Classical Chinese culture also disapproved of open, vociferous intellectual argument. Correspondingly, it did not foster the kind of “I’m right, you’re wrong, and I’ll prove it” frame of mind that has been central to the West’s scientific and technological progress.

 

These aspects of China’s culture did not prevent many Chinese from achieving great innovation—paper and gunpowder are just the most famous of dozens of important innovations that occurred first in China. But the absence of a tradition of individualism lowered the creative energy that the human capital of China was capable of generating.

 

Through the Middle Ages, the West also lacked such a tradition. Not even the golden age of Hellenic philosophy espoused individualism as we think of it today. The polis took precedence. Nor did the advent of Christianity bring individualism immediately to the West. In some ways, Christian theology was individualistic from its inception—teaching that all persons are created in the image of God, are equal in the sight of God, and are invited into a personal relationship with God. But Christianity as it was practiced for its first 1,200 years did not attach much importance to individual accomplishment of great things in this life. To the contrary: many of the most talented young people were drawn into a monastic life of prayer and contemplation in preparation for the life to come.

 

Then in the 13th century came Thomas Aquinas, teaching that humans are morally autonomous beings who best serve God by using all of their capacities of intellect and will, whether to unravel the mysteries of the universe or to create works of beauty. The humanism that Aquinas grafted onto Christianity’s central promises of eternal salvation and a personal relationship with God created a potent force.

 

It is increasingly accepted by historians who have explored the question that the single most powerful cultural force fostering Western individualism has been post-Aquinian Christianity, augmented later by the Reformation and its contribution to what Max Weber would call the Protestant Ethic. Historians still argue about the specific role of the Reformation in this process, but in the terms I am using here, post-Aquinian Christianity fostered both purpose and autonomy.

 

 

 

And America? Throughout most of its history, American culture has run with the concept of the autonomous individual as no other culture has ever done. One of the signal features of American exceptionalism is the fierce belief that, if they are willing to work hard enough, people can achieve whatever they set their minds to.

 

But that sense of autonomy has been deteriorating for at least a half-century.

 

One of the most important psychological measures for predicting success in life (apart from IQ score) is one’s place on the “locus of control” scale. This positions people on a spectrum from “highly internal”—i.e., believing that one’s fate is within one’s own control—to “highly external”—i.e., believing that one’s fate is determined by outside forces. In other words, the locus-of-control scale is a direct measure of the sense of autonomy. According to a meta-analysis of 97 studies with results running from 1960 to 2002, locus of control among college students fell steadily over the course of that four-decade-plus span, with the average student of 2002 displaying a lower (less “internal”) sense of autonomy than did 80 percent of college students in 1960.

 

Apart from the social-science data, indicators in everyday life reveal how much the traditional American veneration of individuals triumphing by dint of perseverance and hard work has faded. A few decades ago, it would have been unthinkable for a president of the United States to make President Obama’s “You didn’t build that” speech, celebrating the supremacy of the collective and denigrating the contribution of the individual. It would have been political suicide. No longer.

 

The data on sense of purpose tell a similar story of decline. We know from questions asked by the General Social Survey that people attach more and more importance to job security and short working hours and less and less importance to work that “gives a feeling of accomplishment.” The government’s Current Population Survey tells us that the percentage of employed males who work fewer than 40 hours a week has been rising even in the healthiest economies, and so has the percentage of males who aren’t in the labor force even when they’re in their prime working years and even in periods when the economy has had jobs for anyone who has wanted to work for any number of hours per week.

 

Those trends began among men with lower levels of education. In the last decade, they have been increasing among the college-educated as well. Among the latter, the percentage of men who work more than 48 hours a week has been decreasing since the turn of the century. There do remain niches in the economy where people routinely work long hours—new hires at prestigious hedge funds and investment banks, associates at top law firms seeking to make partner, and in much of the IT industry. There also remain many people in other fields who love their vocations and work long hours all their lives. But for American culture as a whole, the drive to find meaning in work and to do whatever it takes to be the best appears to have been diminishing.

 

 

 

What accounts for the declines in purpose and autonomy? I have already hinted at the answer, which resides in the second place to look for the sources that ignite or suppress the fires of innovation: namely, the cultural milieu in which potential innovators grow up.

 

One plausible part of the answer is secularization. If you have been put on earth for a purpose, the universe must have a purpose, which in turn necessitates some form of God. Since 1972, the proportion of Americans aged thirty to forty-nine who are explicitly nonbelievers has quintupled, reaching 20 percent in 2010. Another 30 percent in the same year said they had a religion but attended worship services no more than once a year. Both of these trends have accelerated in the last two decades.

 

Secularization is particularly evident among intellectuals. In the population at large, explicit atheists may be at only 20 percent, but among members of the National Academy of Sciences, 65 percent in one poll said they did not believe in God. To put it another way, the people who are best positioned to be great innovators in science and technology are precisely the people who are now least likely to have a sense of vocation coming from God. Most of the people who make it into the National Academy of Sciences have had a strong sense of vocation without religion, so religion is not a necessary condition for a sense of vocation. But it helps. The question to be asked is: how much has secularization contributed to vocational ennui among those with the intellectual potential to become great scientists or innovators?

 

Growing wealth and security are also implicated. For the first time in human history, a high proportion of the most talented people in advanced societies get advanced educations, find good careers, and take prosperity and security for granted. Furthermore, affluence and technology have proliferated attractive leisure alternatives: second homes, trips abroad, and a multitude of time-intensive avocations that were unknown a half-century ago and that compete with spending 60 hours a week in the laboratory.

 

Combine all this with a worldview that says there is no God, no destiny, no ultimate good, and it is natural that people develop what I have elsewhere called the “Europe Syndrome,” after the Western European countries where it is most visible: a way of life based on the belief that humans are collections of chemicals activated by conception and deactivated by death, and that the purpose of life is to while away the intervening years as pleasantly as possible, with as little trouble as possible.

 

Along with secularization, the Europe Syndrome is spreading in contemporary America. I cannot put coefficients to the size of the effect, but the existence of the trend is not open to argument. Fewer young Americans than in earlier stages of our history come to adulthood assuming that they have a purpose in life, that they are impelled to fulfill that purpose, and that they can expect to do so through their own efforts. It follows that, to some extent, the net amount of National Creative Energy brought to scientific and technological innovation must suffer as well.

 

 

 

4. Transcendental Goods

 

  • A major stream of accomplishment in any domain requires a well-articulated vision of, and use of, the transcendental goods relevant to that domain.

 

Closely associated with the roles of purpose and autonomy in stimulating great achievement is the final condition that I identified in Human Accomplishment: the concept of transcendental goods.

 

“Transcendental” refers to perfect qualities that lie beyond direct, complete experience. In the classical Western tradition, the worth of something that exists in our world can be characterized by the three dimensions known as the true, the beautiful, and the good. Those three are what I mean by transcendental goods. Since the good is not a term in common use these days, I should specify that I am using it in Aristotle’s sense in the opening statement of the Nicomachean Ethics:

 

Every art and every inquiry, and similarly every action and pursuit, is thought to aim at some good; and for this reason the good has rightly been declared to be that at which all things aim.

 

To put this in operational terms: if a culture possesses a coherent, well-articulated sense of what constitutes excellence in being human, it has a conception of the good as I am using it.

 

The discussion of transcendental goods in Human Accomplishment focused on the arts, where the concepts of beauty and the good are crucially important. In contrast, the transcendental good that has mattered most in science and technology has been truth. Many scientists have also seen beauty in the laws of mathematics and physics, but the beauty was incidental. Above all else, science has been a search for the truth, and technology has consisted in the application of those truths. During the last century, the abiding devotion to truth has helped to keep science vital.

 

But what happens when a society that still believes in pursuing truth in the sciences becomes heedless of the good? That society brings a lack of moral seriousness to moral problems. This is our condition, and it has come upon us at a peculiarly dangerous juncture in human history.

 

Until now, science hasn’t possessed the power to tinker with the nature of the human animal. Within a few decades at most, progress in genetic engineering will give science that power. Perhaps it will occur in the form of designer babies, perhaps in ways of linking human consciousness with computers, perhaps in ways that are still unforeseeable. But in one form or another, it will be within the power of human beings to alter the nature of our very humanity. We will arrive at this crossroads just as philosophical and religious discourse about the good—the nature of human flourishing—is nearly inaudible.

 

We have brilliant people thinking about such issues. Leon Kass’s writings on the intersection of bioethics and the nature of what it means to be human are profound. The “singularity” movement, as I mentioned earlier, has prompted some well-known people to argue for the beneficent implications of a transformation in human intelligence; others, like Kass, point to the potential menace. But the latter do their work in a cultural milieu that disdains them, and the uses to which the new technologies will be put will likely be determined by that cultural milieu.

 

Making the case that ours has become a cultural milieu indifferent to the good is the subject for a book. Here I will leave it as an assertion: if genetic engineering develops a capability of the kind I have been describing, that capability will be realized somewhere in the world with little resistance by the general public, under governments that cannot be trusted to act wisely.

 

Imagine, for example, that it becomes possible to engineer male fetuses so that they are no longer any more aggressive than female fetuses. Should the government permit individual parents to make such choices? What should be the government’s role in forbidding, encouraging, or even requiring such choices? That is just one of dozens of issues that will need to be assessed not just on a case-by-case basis but in the context of a rich, rigorous discourse on what it means to be human. The prospect of trying to address such questions in a culture that increasingly rejects the belief that human life has a transcendental dimension is profoundly troubling.

 

 

 

 

5. How America Matches Up

 

How, then, does the ideal culture of innovation match up with today’s America? The evidence points toward two quite different conclusions.

 

On the positive side, the beginning of the 21st century has seen the opening of new and rich organizing structures in information technology and genetics that permit prodigious innovation. So far, America is still at the forefront of innovation in both of those domains.

 

On the negative side:

 

  • As America increasingly resembles Europe economically, it must be expected increasingly to resemble Europe in terms of its innovative energy.
  • Celebration of innovation and innovators is out of fashion.
  • The restrictions on de-facto freedom of action are already extreme in many industries, and can be presumed eventually to encroach on information technology. In the realm of genetic innovation involving humans, where some regulation is desirable, bad regulation will needlessly inhibit the desirable forms of genetic innovation as well; the history of government regulation is hardly encouraging on this point.
  • As reflected in American attitudes and behaviors, the fostering of a sense of purpose and autonomy has declined in our culture.

 

These negatives are not etched in stone. Changes in tort law and regulation could have major effects on the climate of innovation. American cultural norms have been known to change quickly and dramatically in the past, and could do so again. When some of the large gaps in our scientific knowledge are filled, we might see huge effects on the amount and quality of human capital turned toward innovation.

 

Finally, in assessing contemporary America as it looks from the template drawn in Human Accomplishment, I have not taken into account the specific dynamics that might make America an exceptional case. In light of that template, though, it is clear that if we are to override historical tendencies and avoid deep trouble, we had better have at our disposal some of those exceptional dynamics. For when a government is increasingly hostile to innovation, as America’s is, and a society is decreasingly industrious, as America’s is, and a culture stops lionizing innovators, as America’s has, and elites increasingly deny that life has transcendent purpose, as America’s do, innovation must be expected to diminish markedly.

 

To return to the contrast I suggested at the outset: today we bear little resemblance to Britain blooming at the end of the 18th century, and look a lot like France fading at the end of the 19th.

 


Read online at http://mosaicmagazine.com/essay/2014/04/does-america-still-have-what-it-takes/ | © Copyright 2014 Mosaic Magazine.
