Thursday, January 06, 2011

Ten Most Significant Cultural Trends of the Last Decade

by Andy Crouch

Originally printed in Qideas

Ten years is a very short time. As I reflect on the world in 2011 compared to the world in 2001, I’m less struck by how much has changed than by how much is the same. Terror, war, new technology, economic boom and bust, surprising political triumphs followed by sudden changes of fortune—yup, sounds like the 1990s, 1980s, 1970s, and 1960s to me. It’s almost axiomatic that any change big enough to shape an entire nation or society happens in long waves spanning generations, not a mere ten years.

Indeed, when I reflect on the most significant developments of the never-adequately-named 2000s (the aughts? the aughties? the naughties?), it seems that almost all of them were well under way in 1999, or even 1989. At the same time, in the last ten years some long-wave trends accelerated in notable ways. Acceleration matters. In one sense, walking, riding a horse, driving a car, and traveling by plane are simply variations on the millennia-old human theme of mobility, tracing back literally to the earliest signs of our restless race. But the difference between five miles an hour and 500 miles an hour is not just a quantitative matter of speed, but a qualitative change in the horizons of possibility.

Here are ten significant trends in North American culture that accelerated dramatically in the 2000s—almost always for better and for worse at the same time.

One | Connection

By far the most significant acceleration was in our technologies of connection. In June 2000, there were 97 million mobile phone subscribers in the United States; by June 2010, that number had risen to 293 million. Urban and suburban Americans swim in a sea of WiFi (sitting in my living room on a quiet side street I can see 8 wireless networks)—and in the middle of Nebraska, you can get online at McDonald’s.

What did not take off in the 2000s was “virtual reality”—a world constructed entirely of disembodied bits, populated by avatars and existing only in the realm of the ideal. As the 2000s ended, the virtual-reality world Second Life was on virtual life support.

Instead, we used technology to reinforce our embodied relationships. Facebook was the most trafficked website in 2010 (US subscribers in 2000: zero; in 2010: 116 million). Look at your Facebook friends—unless you are a celebrity, the vast majority of them are people you have met in the flesh. Same with the recent calls on your cell phone. Rather than replacing embodied connection, our devices supplemented and extended it, an electromagnetic nervous system to match the physical infrastructure of transport built in the twentieth century.

Two | Place

Therefore, oddly enough after a decade of wild growth in invisible telecommunications, place mattered more in 2010 than it did in 2000. Travel and transport remained basically flat throughout the decade. Total vehicle miles driven, while an impressive 3 trillion miles in 2010, were only up from 2.7 trillion miles in 2000, a period during which the population increased from 288 to 318 million—meaning the average American drove essentially no more in 2010 than in 2000. At 9:45 tomorrow morning there will be roughly 4,500 commercial flights in the air, just as there were at 9:45 on the morning of 11 September 2001—no change despite a decade of economic and population growth. And mobility, the hallmark of twentieth-century United States culture, declined throughout the decade and reached a post-war low in 2010, with fewer than 10% of American households changing their address.

At the Q gathering in 2010, urbanologist Richard Florida observed that young adults meeting one another no longer ask, “What do you do?” They ask, “Where do you live?” More people now change careers in order to stay in a place—connected to family, friends, and local culture—than change places in order to stay in a career. The 20th-century American dream was to move out and move up; the 21st-century dream seems to be to put down deeper roots. This quest for local, embodied, physical presence may well be driven by the omnipresence of the virtual and a dawning awareness of the thinness of disembodied life.

Three | Cities

Cities, the places where both connection and local presence can thrive simultaneously, had an extraordinary renaissance in the 2000s. The revival of American cities was underway already in 2000, but it reached its full flowering by 2010. Of course not every single American city flourished in the last decade, but those of us old enough to remember New York, Chicago, Atlanta, or Houston circa 1990—not to mention Portland, Columbus, or Phoenix—can only be astonished at the way economically fading and often crime-ridden city centers revived as centers of commerce and creativity.

The challenges often associated with urban life, meanwhile, began migrating to the suburbs—a movement that may well accelerate in the 2010s. The frontiers of justice, mercy, compassion, and reconciliation are now in the suburbs—places where connections are harder to sustain and local culture is thinner and less appealing than in the cities. Some suburban environments will reinvent themselves, but multi-generational poverty, crime, and gangs that provide a substitute social network where others have failed are already as common in Westchester County as in the Bronx, in the San Fernando Valley as in Compton. The really radical and difficult place to raise a family by 2020 will be . . . the suburbs.

[See Tim Keller's Q talk on "Grace and the City" and Joel Kotkin's on "The Future of the Suburbs."]

Four | The End of the Majority

Everywhere in the 2000s, cultural majorities collapsed. Predominantly black neighborhoods became half Hispanic. White rural communities saw dramatic immigration from Asia and Latin America. City centers became internationalized. Mercados and Asian food markets sprang up in suburbia and in exurbia (drive down a thoroughfare well beyond the 285 beltway in Atlanta, and you will see shop signs in a dozen different languages). White Americans were still a bare majority of the population by the end of the decade, but in delivery rooms they were already only a plurality (the largest of many minorities).

We are all minorities now. Evangelical Christians are a minority, as are liberal Protestants, Catholics, Jews, Muslims, Buddhists, agnostics, and atheists. The religious establishment described in Will Herberg’s 1955 book Protestant—Catholic—Jew is now a minority. Barack Obama is a minority, but so is Sarah Palin. Republicans are a minority—so are Democrats, and so are independents.

There may never have been a society in history that was as culturally, religiously, and politically diverse as the United States is today—except perhaps the Roman Empire. There are few models for how such a diverse community can sustain itself, and plenty of models for failure. Perhaps the most hopeful model is a community that arose at the edges of that Empire and eventually spread to its heart, among whom there was neither Jew nor Gentile, slave nor free, male nor female.

Five | Polarity

We used the technologies of connection and the commitment to place to sort ourselves into more and more tightly homogeneous subcultures, refuges both virtual and real from the heterogeneity of our society. Republicans became more Republican; Democrats became more Democratic. Salon lost ground to the Huffington Post—CNN lost ground to Fox News. A president elected on the premise of unity presided over two years of ever-sharper rhetoric of division and seemed unable to change the game. Hipsters got more extremely hip. The Reformed became truly Reformed.

It was not at all clear, as polarization accelerated, that anyone could convince any large number of Americans that they had anything crucial in common.

Six | The Self Shot

When movie directors in the 2030s are trying to convey in a single glance that their scene is set in the 2000s, they will use the self shot—the self-portrait shot from a digital camera or cell phone held by one hand extended away from the subject. We look out at our own hand, perhaps squeezing another friend into the frame, composing our face in a smile or a laugh. We are shooting ourselves.
The visual presentation of the self accelerated in the 2000s. Previous generations saw themselves most often in mirrors. But mirrors do not show us what others see—they show us a mirror image with right and left reversed. The difference is subtle but real, and symbolic of a deeper reality. Now most 20-year-olds have seen thousands of images of themselves as others see them. In the 2000s we learned to shape and groom our image for public consumption. Body modification—augmentation, reduction, smoothing, straightening, whitening, tanning, not to mention tattooing—became normative. The closing years of the decade gave us the word “manscaping.” Enough said.

Seven | Pornography

Underneath it all was porn. Pornography is as old as visual art, but in the 2000s it was more ubiquitous than it had been since the ancient Greeks erected herms at every crossroads. Superimposed on every image of our own bodies, and the bodies of our friends and lovers, were the idealized bodies of pornography and its close cousins, advertising and popular culture, which differ from porn only in not consummating the voyeuristic impulses they arouse.

And yet as omnipresent as porn was, it remained underground—a subject of shame even among the most secular and urbane. Our culture seemed to draw back from the brink at the same time as it plunged into the abyss. The bestselling memoir was titled Eat, Pray, Love, not, Eat, Pray, F@#k. No one really wanted the culture of porn to become a runaway train. But neither was anyone sure how to stop it.

Eight | Informality

Men untucked their shirts. Billionaires wore jeans. The most powerful CEO in America was universally known as “Steve.” Indeed, informality was now a sign of privilege—only low-status workers wore uniforms. And the ubiquity of the camera meant that everyone—including celebrities, politicians, business leaders, people who in past decades would have been insulated by privilege—was caught off guard, meaning that status now accrued to those who could be most artfully informal, rather than those who could protect themselves from view.

Most institutions, with layers of tradition and deference accumulated over years, struggled to stay relevant to an informal culture. Tie-wearing network news anchors were eclipsed by cable-channel comedians with open collars. Journalistic codes of integrity and objectivity looked simply foolish next to the raw data of The Smoking Gun and WikiLeaks. Marriage, with its vows and formal attire, became for many young people a distant aspiration far on the horizon, while cohabitation became the accepted gateway to adult relationships. A crippling blow was dealt to the cultural legitimacy of the oldest institution of all, the Roman Catholic Church, not by sexual abuse per se (almost all the cases reported had happened at least a decade earlier) but by the realization of how its hierarchy had covered up the scandal. The most informal and anti-institutional demographic cohort in a century, Generation X, moved uneasily and unsteadily into adulthood—symbolized neatly by its most celebrated religious movement, the emerging church, refusing to institutionalize at all and naming the leader of its most prominent organization a “coordinator.”

Nine | Liquidity

Wealth was ever more disconnected from real assets. Countries that pumped one particular liquid from the ground acquired vast resources of sovereign wealth that went looking for high returns. The most storied and prominent financial firm, Goldman Sachs, ended its century-long system of limited partnership and became a publicly traded company. Hedge funds made billions by trading not shares, but shares of bets on the future price of shares (and derivatives far more exotic). Your mortgage, once the most boring and staid of financial instruments, was sliced and diced into tranches of risk.

Money sloshed around the globe like quicksilver (the title of Neal Stephenson’s epic 2003 novel about the earliest moments of modernity). It sloshed beyond the borders of nations, of national regulators and politicians, quickly breaching the levees of international financial standards like Basel I (replaced by Basel II, soon to be replaced by the soon-to-be-swamped Basel III). Anyone unwilling to swim in the sea of liquidity drowned (or, as one Wall Street executive said, as long as the music was playing you had to keep dancing). As money sloshed, prices of oil, food, housing, and labor spiked, then collapsed, then threatened to spike again. Those who could trade on volatility often made untold fortunes; those who actually needed to buy and sell real goods often suffered.

Ten | Complexity

There was a bull market in oversimplification, and no shortage of attempts to find someone to blame or, more hopefully, some way to make a difference. At the close of the decade some Christians were especially excited about the potential for cultural elites to change the world—just at the moment when elites everywhere were waking up to how little they could do to change anything at all. If there ever had been reliable levers of power—the Federal Funds Rate, Fashion Week, the New York Times bestseller list, the Nobel Peace Prize—they no longer carried much leverage in a world of countless connections, devolved into countless particular locations and conurbations, filled with fractious and fissiparous minorities, and ceaseless self-preoccupied informality. It was not a good time, to say the least, to be a central planner.

Yet all this complexity also contained the seed of certain kinds of promise. The human brain, after all, is also complex, interconnected, embodied, improvisational, constantly being rewired—simply put, the most complex system known in our universe. The culture of North America in the 2000s took several not inconsiderable steps toward having those same qualities. Not without risks, not without loss, and with every expectation of grave difficulty ahead. And yet in the most surprising places what was emerging could be called intelligence. Of course, intelligence needs to be married to wisdom—and in surveying the history of that most elusive of all cultural goods, we can only conclude that the 2000s left us neither worse nor better off than human beings have ever been.

-----
In your opinion, did Andy miss something? What would be on your list?
