Would a Work-Free World Be So Bad?
Fears of civilization-wide idleness are based too much on the downsides of being unemployed in a society premised on the concept of employment.
A 1567 painting by Pieter Bruegel the Elder depicts a mythical land of plenty, where people grow idle in the absence of work. (Wikimedia)
By Ilana E. Strauss, June 28, 2016
People have speculated for centuries about a future without work, and today is no different, with academics, writers, and activists once again warning that technology is replacing human workers. Some imagine that the coming work-free world will be defined by inequality: A few wealthy people will own all the capital, and the masses will struggle in an impoverished wasteland.
A different, less paranoid, and not mutually exclusive prediction holds that the future will be a wasteland of a different sort, one characterized by purposelessness: Without jobs to give their lives meaning, people will simply become lazy and depressed. Indeed, today’s unemployed don’t seem to be having a great time. One Gallup poll found that 20 percent of Americans who have been unemployed for at least a year report having depression, double the rate for working Americans. Also, some research suggests that the explanation for rising rates of mortality, mental-health problems, and addiction among poorly educated, middle-aged people is a shortage of well-paid jobs. Another study shows that people are often happier at work than in their free time. Perhaps this is why many worry about the agonizing dullness of a jobless future.
But it doesn’t necessarily follow from findings like these that a world without work would be filled with malaise. Such visions are based on the downsides of being unemployed in a society built on the concept of employment. In the absence of work, a society designed with other ends in mind could yield strikingly different circumstances for the future of labor and leisure. Today, the virtue of work may be a bit overblown. “Many jobs are boring, degrading, unhealthy, and a squandering of human potential,” says John Danaher, a lecturer at the National University of Ireland in Galway who has written about a world without work. “Global surveys find that the vast majority of people are unhappy at work.”
These days, because leisure time is relatively scarce for most workers, people use their free time to counterbalance the intellectual and emotional demands of their jobs. “When I come home from a hard day’s work, I often feel tired,” Danaher says, adding, “In a world in which I don’t have to work, I might feel rather different”—perhaps different enough to throw himself into a hobby or a passion project with the intensity usually reserved for professional matters.
Having a job can provide a measure of financial stability, but in addition to stressing over how to cover life’s necessities, today’s jobless are frequently made to feel like social outcasts. “People who avoid work are viewed as parasites and leeches,” Danaher says. Perhaps as a result of this cultural attitude, for most people, self-esteem and identity are tied up intricately with their job, or lack of job.
Plus, in many modern-day societies, unemployment can also be downright boring. American towns and cities aren’t really built for lots of free time: Public spaces tend to be small islands in seas of private property, and there aren’t many places without entry fees where adults can meet new people or come up with ways to entertain one another.
The roots of this boredom may run even deeper. Peter Gray, a professor of psychology at Boston College who studies the concept of play, thinks that if work
disappeared tomorrow, people might be at a loss for things to do, growing bored and depressed because they have forgotten how to play. “We teach children a distinction between play and work,” Gray explains. “Work is something that you don’t want to do but you have to do.” He says this training, which starts in school, eventually “drills the play” out of many children, who grow up to be adults who are aimless when presented with free time.
“Sometimes people retire from their work, and they don’t know what to do,” Gray says. “They’ve lost the ability to create their own activities.” It’s a problem that never seems to plague young children. “There are no three-year-olds that are going to be lazy and depressed because they don’t have a structured activity,” he says.
But need it be this way? Work-free societies are more than just a thought experiment—they’ve existed throughout human history. Consider hunter-gatherers, who have no bosses, paychecks, or eight-hour workdays. Ten thousand years ago, all humans were hunter-gatherers, and some still are. Daniel Everett, an anthropologist at Bentley University, in Massachusetts, studied a group of hunter-gatherers in the Amazon called the Pirahã for years. According to Everett, while some might consider hunting and gathering work, hunter-gatherers don’t. “They think of it as fun,” he says. “They don’t have a concept of work the way we do.”
“It’s a pretty laid-back life most of the time,” Everett says. He described a typical day for the Pirahã: A man might get up, spend a few hours canoeing and fishing, have a barbecue, go for a swim, bring fish back to his family, and play until the evening. Such subsistence living is surely not without its own set of worries, but the anthropologist Marshall Sahlins argued in a 1968 essay that hunter-gatherers belonged to “the original affluent society,” seeing as they only “worked” a few hours a day; Everett estimates that Pirahã adults on average work about 20 hours a week (not to mention without bosses peering over their shoulders). Meanwhile, according to the Bureau of Labor Statistics, the average employed American with children works about nine hours a day.
Does this leisurely life lead to the depression and purposelessness seen among so many of today’s unemployed? “I’ve never seen anything remotely like depression there, except people who are physically ill,” Everett says. “They have a blast. They play all the time.” While many may consider work a staple of human life, work as it exists today is a relatively new invention in the course of thousands of years of human culture. “We think it’s bad to just sit around with nothing to do,” says Everett. “For the Pirahã, it’s quite a desirable state.”
Gray likens these aspects of the hunter-gatherer lifestyle to the carefree adventures of many children in developed countries, who at some point in life are expected to put away childish things. But that hasn’t always been the case. According to Gary Cross’s 1990 book A Social History of Leisure Since 1600, free time in the U.S. looked quite different before the 18th and 19th centuries. Farmers—a fair description of a huge number of Americans at that time—mixed work and play in their daily lives. There were no managers or overseers, so they would switch fluidly between working, taking breaks, joining in neighborhood games, playing pranks, and spending time with family and friends. Not to mention festivals and other gatherings: France, for instance, had 84 holidays a year in 1700, and weather kept farmers from working another 80 or so days a year.
This all changed, writes Cross, during the Industrial Revolution, which replaced farms with factories and farmers with employees. Factory owners created a more rigidly scheduled environment that clearly divided work from play. Meanwhile, clocks—which were becoming widespread at that time—began to give life a quicker pace, and religious leaders, who traditionally endorsed most festivities, started associating leisure with sin and tried to replace rowdy festivals with sermons.
As workers started moving into cities, families no longer spent their days together on the farm. Instead, men worked in factories, women stayed home or worked in factories, and children went to school, stayed home, or worked in factories too. During the workday, families became physically separated, which affected the way people entertained themselves: Adults stopped playing “childish” games and sports, and the streets were mostly wiped clean of fun, as middle- and upper-class families found working-class activities like cockfighting and dice games distasteful. Many such diversions were soon outlawed.
With workers’ old outlets for play having disappeared in a haze of factory smoke, many of them turned to new, more urban ones. Bars became a refuge where tired workers drank and watched live shows with singing and dancing. If free time means beer and TV to a lot of Americans, this might be why.
At times, developed societies have, for a privileged few, produced lifestyles that were nearly as play-filled as hunter-gatherers’. Throughout history, aristocrats who earned their income simply by owning land spent only a tiny portion of their time minding financial exigencies. According to Randolph Trumbach, a professor of history at Baruch College, 18th-century English aristocrats spent their days visiting friends, eating elaborate meals, hosting salons, hunting, writing letters, fishing, and going to church. They also spent a good deal of time participating in politics, without pay. Their children would learn to dance, play instruments, speak foreign languages, and read Latin. Russian nobles frequently became intellectuals, writers, and artists. “As a 17th-century aristocrat said, ‘We sit down to eat and rise up to play, for what is a gentleman but his pleasure?’” Trumbach says.
It’s unlikely that a world without work would be abundant enough to provide everyone with such lavish lifestyles. But Gray insists that injecting any amount of additional play into people’s lives would be a good thing, because, contrary to that 17th-century aristocrat, play is about more than pleasure. Through play, Gray says, children (as well as adults) learn how to strategize, create new mental connections,
express their creativity, cooperate, overcome narcissism, and get along with other people. “Male mammals typically have difficulty living in close proximity to each other,” he says, and play’s harmony-promoting properties may explain why it came to be so central to hunter-gatherer societies. While most of today’s adults may have forgotten how to play, Gray doesn’t believe it’s an unrecoverable skill: It’s not uncommon, he says, for grandparents to re-learn the concept of play after spending time with their young grandchildren.
When people ponder the nature of a world without work, they often transpose present-day assumptions about labor and leisure onto a future where they might no longer apply; if automation does end up rendering a good portion of human labor unnecessary, such a society might exist on completely different terms than societies do today.
So what might a work-free U.S. look like? Gray has some ideas. School, for one thing, would be very different. “I think our system of schooling would completely fall by the wayside,” says Gray. “The primary purpose of the educational system is to teach people to work. I don’t think anybody would want to put our kids through what we put our kids through now.” Instead, Gray suggests that teachers could build lessons around what students are most curious about. Or, perhaps, formal schooling would disappear altogether.
Trumbach, meanwhile, wonders if schooling would become more about teaching children to be leaders, rather than workers, through subjects like philosophy and rhetoric. He also thinks that people might participate in political and public life more, like aristocrats of yore. “If greater numbers of people were using their leisure to run the country, that would give people a sense of purpose,” says Trumbach.
Social life might look a lot different too. Since the Industrial Revolution, mothers, fathers, and children have spent most of their waking hours apart. In a work-free world, people of different ages might come together again. “We would become much less isolated from each other,” Gray imagines, perhaps a little optimistically. “When a mom is having a baby, everybody in the neighborhood would want to help that mom.” Researchers have found that having close relationships is the number-one predictor of happiness, and the social connections that a work-free world might enable could well displace the aimlessness that so many futurists predict.
In general, without work, Gray thinks people would be more likely to pursue their passions, get involved in the arts, and visit friends. Perhaps leisure would cease to be about unwinding after a period of hard work, and would instead become a more colorful, varied thing. “We wouldn’t have to be as self-oriented as we think we have to be now,” he says. “I believe we would become more human.”
The surprising truth about American manufacturing
The decline in American manufacturing is a common refrain, particularly from Donald Trump. “We don’t make anything anymore,” he told Fox News last October, while defending his own made-in-Mexico clothing line.
On Tuesday, in Rust Belt Pennsylvania, he doubled down, saying that he had “visited cities and towns across this country where a third or even half of manufacturing jobs have been wiped out in the last 20 years.” The Pacific trade deal, he added, “would be the death blow for American manufacturing.”
Without question, manufacturing has taken a significant hit during recent decades, and further trade deals raise questions about whether new shocks could hit manufacturing.
But there is also a different way to look at the data.
In reality, United States manufacturing output is at an all-time high, worth $2.2 trillion in 2015, up from $1.7 trillion in 2009. And while total employment has fallen by nearly a third since 1970, the jobs that remain are increasingly skilled.
Across the country, factory owners are now grappling with a new challenge: Instead of having too many workers, as they did during the Great Recession, they may end up with too few. Despite trade competition and outsourcing, American manufacturing still needs to replace tens of thousands of retiring boomers every year. Millennials may not be that interested in taking their place. Other industries are recruiting them with similar or better pay. And those industries don’t have the stigma of 40 years of recurring layoffs and downsizing.
“We’ve never had so much attention from manufacturers. They’re calling and saying: ‘Can we meet your students?’ They’re asking, ‘Why aren’t they looking at my job postings?’” says Julie Parks, executive director of workforce training at Grand Rapids Community College in western Michigan.
The region is a microcosm of the national challenge. Unemployment here is low (around 3 percent, compared with a statewide average of 5 percent). There aren’t many extra workers waiting for a job. And the need is high: 1 in 5 people work in manufacturing, churning out auto parts, machinery, plastics, office furniture, and medical devices. Other industries, including agribusiness and life sciences, are vying for the same workers.
For factory owners, it all adds up to stiff competition for workers – and upward pressure on wages. “They’re harder to find and they have job offers,” says Jay Dunwell, president of Wolverine Coil Spring, a family-owned firm. “They may be coming [into the workforce], but they’ve been plucked by other industries that are also doing as well as manufacturing.”
Mr. Dunwell has begun bringing high school juniors to the factory so they can get exposed to its culture. He is also part of a public-private initiative to promote manufacturing to students that includes job fairs and sending a mobile demonstration vehicle to rural schools. One of their messages is that factories are no longer dark, dirty, and dangerous; computer-run systems are the norm and recruits can receive apprenticeships that include paid-for college classes.
At RoMan Manufacturing, a maker of electrical transformers and welding equipment that his father cofounded in 1980, Robert Roth keeps a close eye on the age of his nearly 200 workers. Five are retiring this year. Mr. Roth has three community-college students enrolled in a work-placement program, with a starting wage of $13 an hour that rises to $17 after two years.
At a worktable inside the transformer plant, young Jason Stenquist looks flustered by the copper coils he’s trying to assemble and the arrival of two visitors. It’s his first week on the job; this is his first encounter with Roth, his boss. Asked about his choice of career, he says that in high school he considered medical school before switching to electrical engineering.
“I love working with tools. I love creating,” he says.
But to win over these young workers, manufacturers have to clear another major hurdle: parents, who lived through the worst US economic downturn since the Great Depression, telling them to avoid the factory. Millennials “remember their father and mother both were laid off. They blame it on the manufacturing recession,” says Birgit Klohs, chief executive of The Right Place, a business development agency for western Michigan.
These concerns aren’t misplaced: Employment in manufacturing has fallen from 17 million in 1970 to 12 million in 2015. The steepest declines came after 2001, when China gained entry to the World Trade Organization and ramped up exports of consumer goods to the US and other rich countries. In areas exposed to foreign trade, every additional $1,000 of imports per worker meant a $550 annual drop in household income per working-age adult, according to a 2013 study in the American Economic Review. And unemployment, Social Security, and other government benefits went up $60 per person.
The 2008-09 recession was another blow. And advances in computing and robotics offer new ways for factory owners to increase productivity using fewer workers.
When the recovery began, worker shortages first appeared in the high-skilled trades. Electricians, plumbers, and pipefitters are in short supply across Michigan and elsewhere; vocational schools and union-run apprenticeships aren’t keeping pace with demand, and older tradespeople are leaving the workforce. Now shortages are appearing at the mid-skill levels.
“The gap is between the jobs that take no skills and those that require a lot of skill,” says Rob Spohr, a business professor at Montcalm Community College, an hour from Grand Rapids. “There’s enough people to fill the jobs at McDonald’s and other places where you don’t need to have much skill. It’s that gap in between, and that’s where the problem is.”
Ms. Parks of Grand Rapids Community College points to another key to luring Millennials into manufacturing: a work/life balance. While their parents were content to work long hours, young people value flexibility. “Overtime is not attractive to this generation. They really want to live their lives,” she says.
Roth says he gets this distinction. At RoMan, workers can set their own hours on their shift, choosing to start earlier or end later, provided they get the job done. That the factory floor isn’t a standard assembly line – everything is custom-built for industrial clients – makes it easier to drop the punch-clocks.
“People have lives outside,” Roth says. “It’s not always easy to schedule doctors’ appointments around a ‘punch-in at 7 and leave at 3:30’ schedule.”
While factory owners like Roth like to stress the flexibility of manufacturing careers, one aspect is nonnegotiable: location. Millennials looking for a job that allows them to work from home are not likely to get a callback. “I’m not putting a machine tool in your garage,” says Roth.
A shift in thought
Wildfire season has become longer and more intense lately. But beyond addressing climate change, some researchers call for a paradigm shift to address the various human factors relating to prevention and safety.
By Jessica Mendoza, Staff writer, July 1, 2016
Azusa, Calif. — On a chain-link fence along Route 39 hangs a homemade poster, peppered with hearts, thanking firefighters and police.
The sign, one of a handful scattered across town, salutes efforts to battle the San Gabriel Complex fire, twin blazes that had erupted on June 20 in the mountains of Angeles National Forest just to the north of the city. Within a day of igniting, the fire had burned through nearly 5,000 acres and forced hundreds to evacuate. Nearly a week passed before the US Forest Service and local and state authorities managed to contain even half of the inferno.
“Three days in, you could still see the flames,” says Jasmine Perez, a teacher’s assistant and resident of Azusa, which sits northeast of Los Angeles. And because of the smoke, she adds, “In the mornings, it kind of looked like nighttime still.”
The San Gabriel Complex was one of 12 large fires that about 4,000 firefighters were battling across California as of Thursday. Such numbers so early in the fire season are a testament to the growing frequency and intensity of wildfires in the western US, fire officials say – a shift that many experts say is likely intertwined with climate change and its associated consequences, such as drought.
But climate, however critical, is only part of the problem, scientists say. A growing body of evidence suggests that other human activity and policy have at least as much impact on wildfires as climate change. To effectively address a longer and more intense wildfire season – and ensure the safety of residents in fire-prone areas – both environmental and human factors have to be taken into account in more holistic ways, they say.
That means more than just sweeping dry brush off the front porch. Though such steps are an important part of the process, officials and researchers alike are calling for a comprehensive approach to wildfires: one that incorporates fire safety and behavior in key policy decisions and legislation. Such an effort would also recognize that fire can be helpful as well as harmful and embrace fire’s place in human society.
“We need not just a policy shift but also a cultural shift in the dialogue around fires in our landscape and how to manage them,” says Jennifer Balch, director of Earth Lab and a professor of geography at the University of Colorado in Boulder. “Fire is not something we can remove. A large majority of the country is living in fire-prone areas. How do we live with wildfire? How do we manage?”
“More and more researchers are arguing that anthropogenic influences are really important [to understanding wildfires],” adds Max Moritz, a specialist in fire ecology and management and a professor at the College of Natural Resources at the University
of California, Berkeley. “By leaving them out we’re missing a critical piece of the solution.”
Changing attitudes on fire
Though often viewed as a problem for western states, the growing frequency of wildfires is a national concern because of its impact on federal tax dollars, Professor Moritz and others say. In 2015, the US Forest Service for the first time spent more than half of its $5.5 billion annual budget fighting fires – nearly double the percentage it spent on such efforts 20 years ago. In effect, fewer federal funds today are going toward the agency’s other work – such as forest conservation, watershed and cultural resources management, and infrastructure upkeep – that affects the lives of all Americans.
Another nationwide concern is whether public funds from other agencies, such as the Department of Housing and Urban Development, are going into construction in fire-prone districts. As Moritz puts it, how often are federal dollars building homes that are likely to be lost to a wildfire?
“It’s already a huge problem from a public expenditure perspective for the whole country,” he says. “We need to take a magnifying glass to that. Like, ‘Wait a minute, is this OK?’ Do we want instead to redirect those funds to concentrate on lower-hazard parts of the landscape?”
Such a pivot would require a corresponding shift in the way US society today views fire, researchers say.
For one thing, conversations about wildfires need to be more inclusive. Over the past decade, the focus has been on climate change – how the warming of the Earth from greenhouse gases (including human carbon emissions) is leading to conditions that exacerbate fires.
While climate is a key element, Moritz says, it shouldn’t come at the expense of the rest of the equation.
“The human systems and the landscapes we live on are linked, and the interactions go both ways,” he says. Failing to recognize that, he notes, leads to “an overly simplified view of what the solutions might be. Our perception of the problem and perception of what the solution is [becomes] very limited.”
At the same time, people continue to treat fire as an event that needs to be wholly controlled and unleashed only out of necessity, says Professor Balch at the University of Colorado. But acknowledging fire’s inevitable presence in human life is an attitude crucial to developing the laws, policies, and practices that make it as safe as possible, she says.
“We’ve disconnected ourselves from living with fire,” Balch says. “It is really important to understand and try and tease out what is the human connection [with fire] today.”
Role for citizens ... and for policy
After nearly 30 years in the state fire service, Janet Upton understands the value of that connection.
During her early days with the California Department of Forestry and Fire Protection (Cal Fire), veterans would tell war stories of huge fires that happened once in a career, she recalls.
“But in my generation, those of us who’ve come up through the '80s, '90s, 2000s … we feel like we don’t have the license to use the word ‘unprecedented’ any more. We’ve seen it all in the last few years,” she says. “I’ve probably had 15 once-in-a-career fires.”
And people caused most of them, Ms. Upton says. About 90 percent of all fires in California can be traced to human activity, whether it’s a stove left on or a campfire left burning. That is why public education has been Upton’s main goal since 2008, when then-Gov. Arnold Schwarzenegger appointed her Cal Fire’s deputy communications director. The department has since made strides, playing a major role in launching state and national campaigns that underscore the public’s role in fire safety. But people’s tendency to put danger out of their minds until it’s too late continues to pose serious challenges, Upton says.
“This is going to sound cold. But if someone chooses to live in a rural area and continues to not be responsive to [fire-safety] education, sadly, the worst punishment they’re going to get is they’re going to lose their home in a fire,” she says.
A paradigm shift, some researchers hope, can address that gap between education and action. Environmental policy specialist Ray Rasker, for instance, envisions whole communities designed around the concept of fire safety, and a slate of fire-prevention policies at the local, state, and national level.
“What we’re telling the public now is, ‘Reduce the risk of fires – if you so choose.’ Imagine if we tried driving our cars like that,” says Dr. Rasker, who is also executive director of Headwaters Economics, a nonprofit research firm based in Bozeman, Mont. “Why not use regulations, building codes, and subdivision design standards, development codes and ordinances that say, ‘Look, if you’re going to build there, there are certain conditions you have to meet first’?”
Some places are already taking steps. San Diego’s municipal code, for instance, requires property owners to maintain landscape and vegetation standards – or face a penalty equivalent to the cost of hiring a private contractor to do so. Austin, Texas, has set aside close to 30 percent of city land as conservation areas, curbing the number of new structures that can be built within the fire-prone “wildland-urban interface” (WUI) – the space between unoccupied natural land and human developments. Flagstaff, Ariz., Boulder, Colo., and Santa Fe, N.M., have all enacted similar policies.
But the need for action continues to grow. As bad as wildfires have been in recent years, research shows they’re likely to get worse as the US population increases and people build more homes in the WUI, more than 80 percent of which remain undeveloped.
“We keep building more and more homes in harm’s way,” Rasker notes. “Unless we get a handle on development, we’re really not addressing the problem.”
Mind-set matters, too – for everyone, says Upton at Cal Fire.
“It’s a mitigation issue. You can take the lens we’re looking at [in California] and take it to Tornado Alley or the Eastern Seaboard,” she says. In the end, “it’s about informing yourself as a member of the public or a policymaker. How can you do something comprehensive?”