A Nation of Wimps
By Hara Estroff Marano, published on November 01, 2004
Maybe it's the cyclist in the park, trim under his sleek metallic blue helmet, cruising along the dirt path... at three miles an hour. On his tricycle.
Or perhaps it's today's playground, all-rubber-cushioned surface where kids used to skin their knees. And... wait a minute... those aren't little kids playing. Their mommies—and especially their daddies—are in there with them, coplaying or play-by-play coaching. Few take it half-easy on the perimeter benches, as parents used to do, letting the kids figure things out for themselves.
Then there are the sanitizing gels, with which over a third of parents now send their kids to school, according to a recent survey. Presumably, parents now worry that school bathrooms are not good enough for their children.
Consider the teacher new to an upscale suburban town. Shuffling through the sheaf of reports certifying the educational "accommodations" he was required to make for many of his history students, he was struck by the exhaustive, well-written—and obviously costly—one on behalf of a girl who was already proving among the most competent of his ninth-graders. "She's somewhat neurotic," he confides, "but she is bright, organized and conscientious—the type who'd get to school to turn in a paper on time, even if she were dying of stomach flu." He finally found the disability he was to make allowances for: difficulty with Gestalt thinking. The 13-year-old "couldn't see the big picture." That cleverly devised defect (what 13-year-old can construct the big picture?) would allow her to take all her tests untimed, especially the big one at the end of the rainbow, the college-worthy SAT.
Behold the wholly sanitized childhood, without skinned knees or the occasional C in history. "Kids need to feel badly sometimes," says child psychologist David Elkind, professor at Tufts University. "We learn through experience and we learn through bad experiences. Through failure we learn how to cope."
Messing up, however, even in the playground, is wildly out of style. Although error and experimentation are the true mothers of success, parents are taking pains to remove failure from the equation.
"Life is planned out for us," says Elise Kramer, a Cornell University junior. "But we don't know what to want." As Elkind puts it, "Parents and schools are no longer geared toward child development, they're geared to academic achievement."
No one doubts that there are significant economic forces pushing parents to invest so heavily in their children's outcome from an early age. But taking all the discomfort, disappointment and even the play out of development, especially while increasing pressure for success, turns out to be misguided by just about 180 degrees. With few challenges all their own, kids are unable to forge their creative adaptations to the normal vicissitudes of life. That not only makes them risk-averse, it makes them psychologically fragile, riddled with anxiety. In the process they're robbed of identity, meaning and a sense of accomplishment, to say nothing of a shot at real happiness. Forget, too, about perseverance, not simply a moral virtue but a necessary life skill. These turn out to be the spreading psychic fault lines of 21st-century youth. Whether we want to or not, we're on our way to creating a nation of wimps.
The Fragility Factor
College, it seems, is where the fragility factor is now making its greatest mark. It's where intellectual and developmental tracks converge as the emotional training wheels come off. By all accounts, psychological distress is rampant on college campuses. It takes a variety of forms, including anxiety and depression—which are increasingly regarded as two faces of the same coin—binge drinking and substance abuse, self-mutilation and other forms of disconnection. The mental state of students is now so precarious for so many that, says Steven Hyman, provost of Harvard University and former director of the National Institute of Mental Health, "it is interfering with the core mission of the university."
The severity of student mental health problems has been rising since 1988, according to an annual survey of counseling center directors. Through 1996, the most common problems raised by students were relationship issues. That is developmentally appropriate, reports Sherry Benton, assistant director of counseling at Kansas State University. But in 1996, anxiety overtook relationship concerns and has remained the major problem. The University of Michigan Depression Center, the nation's first, estimates that 15 percent of college students nationwide are suffering from that disorder alone.
Relationship problems haven't gone away; their nature has dramatically shifted and the severity escalated. Colleges report ever more cases of obsessive pursuit, otherwise known as stalking, leading to violence, even death. Anorexia or bulimia in florid or subclinical form now afflicts 40 percent of women at some time in their college career. Eleven weeks into a semester, reports psychologist Russ Federman, head of counseling at the University of Virginia, "all appointment slots are filled. But the students don't stop coming."
Drinking, too, has changed. Once a means of social lubrication, it has acquired a darker, more desperate nature. Campuses nationwide are reporting record increases in binge drinking over the past decade, with students often stuporous in class, if they get there at all. Psychologist Paul E. Joffe, chair of the suicide prevention team at the University of Illinois at Urbana-Champaign, contends that at bottom binge-drinking is a quest for authenticity and intensity of experience. It gives young people something all their own to talk about, and sharing stories about the path to passing out is a primary purpose. It's an inverted world in which drinking to oblivion is the way to feel connected and alive.
"There is a ritual every university administrator has come to fear," reports John Portmann, professor of religious studies at the University of Virginia. "Every fall, parents drop off their well-groomed freshmen and within two or three days many have consumed a dangerous amount of alcohol and placed themselves in harm's way. These kids have been controlled for so long, they just go crazy."
Heavy drinking has also become the quickest and easiest way to gain acceptance, says psychologist Bernardo J. Carducci, professor at Indiana University Southeast and founder of its Shyness Research Institute. "Much of collegiate social activity is centered on alcohol consumption because it's an anxiety reducer and demands no social skills," he says. "Plus it provides an instant identity; it lets people know that you are willing to belong."
Welcome to the Hothouse
Talk to a college president or administrator and you're bound to hear tales of the parents who call at 2 a.m. to protest Branden's C in economics because it's going to damage his shot at grad school.
Shortly after psychologist Robert Epstein announced to his university students that he expected them to work hard and would hold them to high standards, he heard from a parent—on official judicial stationery—asking how he could dare mistreat the young. Epstein, former editor-in-chief of Psychology Today, eventually filed a complaint with the California commission on judicial misconduct, and the judge was censured for abusing his office—but not before he created havoc in the psychology department at the University of California, San Diego.
Enter: grade inflation. When he took over as president of Harvard in July 2001, Lawrence Summers publicly ridiculed the value of honors after discovering that 94 percent of the college's seniors were graduating with them. Safer to lower the bar than raise the discomfort level. Grade inflation is the institutional response to parental anxiety about school demands on children, contends social historian Peter Stearns of George Mason University. As such, it is a pure index of emotional overinvestment in a child's success. And it rests on a notion of juvenile frailty—"the assumption that children are easily bruised and need explicit uplift," Stearns argues in his book, Anxious Parenting: A History of Modern Childrearing in America.
Parental protectionism may reach its most comic excesses in college, but it doesn't begin there. Primary schools and high schools are arguably just as guilty of grade inflation. But if you're searching for someone to blame, consider Dr. Seuss. "Parents have told their kids from day one that there's no end to what they are capable of doing," says Virginia's Portmann. "They read them the Dr. Seuss book Oh, the Places You'll Go! and create bumper stickers telling the world their child is an honor student. American parents today expect their children to be perfect—the smartest, fastest, most charming people in the universe. And if they can't get the children to prove it on their own, they'll turn to doctors to make their kids into the people that parents want to believe their kids are."
What they're really doing, he stresses, is "showing kids how to work the system for their own benefit."
And subjecting them to intense scrutiny. "I wish my parents had some hobby other than me," one young patient told David Anderegg, a child psychologist in Lenox, Massachusetts, and professor of psychology at Bennington College. Anderegg finds that anxious parents are hyperattentive to their kids, reactive to every blip of their child's day, eager to solve every problem for their child—and believe that's good parenting. "If you have an infant and the baby has gas, burping the baby is being a good parent. But when you have a 10-year-old who has metaphoric gas, you don't have to burp him. You have to let him sit with it, try to figure out what to do about it. He then learns to tolerate moderate amounts of difficulty, and it's not the end of the world."
In the hothouse that child raising has become, play is all but dead. Over 40,000 U.S. schools no longer have recess. And what play there is has been corrupted. The organized sports many kids participate in are managed by adults; difficulties that arise are not worked out by kids but adjudicated by adult referees.
"So many toys now are designed by and for adults," says Tufts' Elkind. When kids do engage in their own kind of play, parents become alarmed. Anderegg points to kids exercising time-honored curiosity by playing doctor. "It's normal for children to have curiosity about other children's genitals," he says. "But when they do, most parents I know are totally freaked out. They wonder what's wrong."
Kids are having a hard time even playing neighborhood pick-up games because they've never done it, observes Barbara Carlson, president and cofounder of Putting Families First. "They've been told by their coaches where on the field to stand, told by their parents what color socks to wear, told by the referees who's won and what's fair. Kids are losing leadership skills."
A lot has been written about the commercialization of children's play, but not the side effects, says Elkind. "Children aren't getting any benefits out of play as they once did." From the beginning play helps children learn how to control themselves, how to interact with others. Contrary to the widely held belief that only intellectual activities build a sharp brain, it's in play that cognitive agility really develops. Studies of children and adults around the world demonstrate that social engagement actually improves intellectual skills. It fosters decision-making, memory and thinking, speed of mental processing. This shouldn't come as a surprise. After all, the human mind is believed to have evolved to deal with social problems.
The Eternal Umbilicus
It's bad enough that today's children are raised in a psychological hothouse where they are overmonitored and oversheltered. But that hothouse no longer has geographical or temporal boundaries. For that you can thank the cell phone. Even in college—or perhaps especially at college—students are typically in contact with their parents several times a day, reporting every flicker of experience. One long-distance call overheard on a recent cross-campus walk: "Hi, Mom. I just got an ice-cream cone; can you believe they put sprinkles on the bottom as well as on top?"
"Kids are constantly talking to parents," laments Cornell student Kramer, which makes them perpetually homesick. Of course, they're not telling the folks everything, notes Portmann. "They're not calling their parents to say, 'I really went wild last Friday at the frat house and now I might have chlamydia. Should I go to the student health center?'"
The perpetual access to parents infantilizes the young, keeping them in a permanent state of dependency. Whenever the slightest difficulty arises, "they're constantly referring to their parents for guidance," reports Kramer. They're not learning how to manage for themselves.
Think of the cell phone as the eternal umbilicus. One of the ways we grow up is by internalizing an image of Mom and Dad and the values and advice they imparted over the early years. Then, whenever we find ourselves faced with uncertainty or difficulty, we call on that internalized image. We become, in a way, all the wise adults we've had the privilege to know. "But cell phones keep kids from figuring out what to do," says Anderegg. "They've never internalized any images; all they've internalized is 'call Mom or Dad.'"
Some psychologists think we have yet to recognize the full impact of the cell phone on child development, because its use is so new. Although there are far too many variables to establish clear causes and effects, Indiana's Carducci believes that reliance on cell phones undermines the young by destroying the ability to plan ahead. "The first thing students do when they walk out the door of my classroom is flip open the cell phone. Ninety-five percent of the conversations go like this: 'I just got out of class; I'll see you in the library in five minutes.' Absent the phone, you'd have to make arrangements ahead of time; you'd have to think ahead."
Herein lies another possible pathway to depression. The ability to plan resides in the prefrontal cortex (PFC), the executive branch of the brain. The PFC is a critical part of the self-regulation system, and it's deeply implicated in depression, a disorder increasingly seen as caused or maintained by unregulated thought patterns—lack of intellectual rigor, if you will. Cognitive therapy owes its very effectiveness to the systematic application of critical thinking to emotional reactions. Further, it's in the setting of goals and progress in working toward them, however mundane they are, that positive feelings are generated. From such everyday activity, resistance to depression is born.
What's more, cell phones—along with the instant availability of cash and almost any consumer good your heart desires—promote fragility by weakening self-regulation. "You get used to things happening right away," says Carducci. You not only want the pizza now, you generalize that expectation to other domains, like friendship and intimate relationships. You become frustrated and impatient easily. You become unwilling to work out problems. And so relationships fail—perhaps the single most powerful experience leading to depression.
From Scrutiny to Anxiety... and Beyond
The 1990s witnessed a landmark reversal in the traditional patterns of psychopathology. While rates of depression rise with advancing age among people over 40, they're now increasing fastest among children, striking more children at younger and younger ages.
In his now-famous studies of how children's temperaments play out, Harvard psychologist Jerome Kagan has shown unequivocally that what creates anxious children is parents hovering and protecting them from stressful experiences. About 20 percent of babies are born with a high-strung temperament. They can be spotted even in the womb; they have fast heartbeats. Their nervous systems are innately programmed to be overexcitable in response to stimulation, constantly sending out false alarms about what is dangerous.
As infants and children this group experiences stress in situations most kids find unthreatening, and they may go through childhood and even adulthood fearful of unfamiliar people and events, withdrawn and shy. At school age they become cautious, quiet and introverted. Left to their own devices they grow up shrinking from social encounters. They lack confidence around others. They're easily influenced by others. They are sitting ducks for bullies. And they are on the path to depression.
While their innate reactivity seemed to destine all these children for later anxiety disorders, things didn't turn out that way. Between a touchy temperament in infancy and persistence of anxiety stand two highly significant things: parents. Kagan found to his surprise that the development of anxiety was scarcely inevitable despite apparent genetic programming. At age 2, none of the overexcitable infants wound up fearful if their parents backed off from hovering and allowed the children to find some comfortable level of accommodation to the world on their own. Those parents who overprotected their children—directly observed by conducting interviews in the home—brought out the worst in them.
A small percentage of children seem almost invulnerable to anxiety from the start. But the overwhelming majority of kids are somewhere in between. For them, overparenting can program the nervous system to create lifelong vulnerability to anxiety and depression.
There is in these studies a lesson for all parents. Those who allow their kids to find a way to deal with life's day-to-day stresses by themselves are helping them develop resilience and coping strategies. "Children need to be gently encouraged to take risks and learn that nothing terrible happens," says Michael Liebowitz, clinical professor of psychiatry at Columbia University and head of the Anxiety Disorders Clinic at New York State Psychiatric Institute. "They need gradual exposure to find that the world is not dangerous. Having overprotective parents is a risk factor for anxiety disorders because children do not have opportunities to master their innate shyness and become more comfortable in the world." They never learn to dampen the pathways from perception to alarm reaction.
Hothouse parenting undermines children in other ways, too, says Anderegg. Being examined all the time makes children extremely self-conscious. As a result they get less communicative; scrutiny teaches them to bury their real feelings deeply. And most of all, self-consciousness removes the safety to be experimental and playful. "If every drawing is going to end up on your parents' refrigerator, you're not free to fool around, to goof up or make mistakes," says Anderegg.
Parental hovering is why so many teenagers are so ironic, he notes. It's a kind of detachment, "a way of hiding in plain sight. They just don't want to be exposed to any more scrutiny."
Parents are always so concerned about children having high self-esteem, he adds. "But when you cheat on their behalf to get them ahead of other children"—by pursuing accommodations and recommendations—you just completely corrode their sense of self. They feel 'I couldn't do this on my own.' It robs them of their own sense of efficacy." A child comes to think, "if I need every advantage I can get, then perhaps there is really something wrong with me." A slam-dunk for depression.
Virginia's Portmann feels the effects are even more pernicious; they weaken the whole fabric of society. He sees young people becoming weaker right before his eyes, more responsive to the herd, too eager to fit in—less assertive in the classroom, unwilling to disagree with their peers, afraid to question authority, more willing to conform to the expectations of those on the next rung of power above them.
The end result of cheating childhood is to extend it forever. Despite all the parental pressure, and probably because of it, kids are pushing back—in their own way. They're taking longer to grow up.
Adulthood no longer begins when adolescence ends, according to a recent report by University of Pennsylvania sociologist Frank F. Furstenberg and colleagues. There is, instead, a growing no-man's-land of postadolescence from 20 to 30, which they dub "early adulthood." Those in it look like adults but "haven't become fully adult yet—traditionally defined as finishing school, landing a job with benefits, marrying and parenting—because they are not ready or perhaps not permitted to do so."
Using the classic benchmarks of adulthood, 65 percent of males had reached adulthood by the age of 30 in 1960. By contrast, in 2000, only 31 percent had. Among women, 77 percent met the benchmarks of adulthood by age 30 in 1960. By 2000, the number had fallen to 46 percent.
Boom Boom Boomerang
Take away play from the front end of development and it finds a way onto the back end. A steady march of success through regimented childhood arranged and monitored by parents creates young adults who need time to explore themselves. "They often need a period in college or afterward to legitimately experiment—to be children," says historian Stearns. "There's decent historical evidence to suggest that societies that allow kids a few years of latitude and even moderate [rebellion] end up with healthier kids than societies that pretend such impulses don't exist."
Marriage is one benchmark of adulthood, but its antecedents extend well into childhood. "The precursor to marriage is dating, and the precursor to dating is playing," says Carducci. The less time children spend in free play, the less socially competent they'll be as adults. It's in play that we learn give and take, the fundamental rhythm of all relationships. We learn how to read the feelings of others and how to negotiate conflicts. Taking the play out of childhood, he says, is bound to create a developmental lag, and he sees it clearly in the social patterns of today's adolescents and young adults, who hang around in groups that are more typical of childhood. Not to be forgotten: The backdrop of continued high levels of divorce confuses kids already too fragile to take the huge risk of commitment.
Just Whose Shark Tank Is It Anyway?
The stressful world of cutthroat competition that parents see their kids facing may not even exist. Or it exists, but more in their mind than in reality—not quite a fiction, more like a distorting mirror. "Parents perceive the world as a terribly competitive place," observes Anderegg. "And many of them project that onto their children when they're the ones who live or work in a competitive environment. They then imagine that their children must be swimming in a big shark tank, too."
"It's hard to know what the world is going to look like 10 years from now," says Elkind. "How best do you prepare kids for that? Parents think that earlier is better. That's a natural intuition, but it happens to be wrong."
What if parents have micromanaged their kids' lives because they've hitched their measurement of success to a single event whose value to life and paycheck they have frantically overestimated? No one denies the Ivy League offers excellent learning experiences, but most educators know that some of the best programs exist at schools that don't top the U.S. News and World Report list, and that with the right attitude—a willingness to be engaged by new ideas—it's possible to get a meaningful education almost anywhere. Further, argues historian Stearns, there are ample openings for students at an array of colleges. "We have a competitive frenzy that frankly involves parents more than it involves kids themselves," he observes, both as a father of eight and teacher of many. "Kids are more ambivalent about the college race than are parents."
Yet the very process of application to select colleges undermines both the goal of education and the inherent strengths of young people. "It makes kids sneaky," says Anderegg. Bending rules and calling in favors to give one's kid a competitive edge is morally corrosive.
Like Stearns, he is alarmed that parents, pursuing disability diagnoses so that children can take untimed SATs, actually encourage kids to think of themselves as sickly and fragile. Colleges no longer know when SATs are untimed—but the kids know. "The kids know when you're cheating on their behalf," says Anderegg, "and it makes them feel terribly guilty. Sometimes they arrange to fail to right the scales. And when you cheat on their behalf, you completely undermine their sense of self-esteem. They feel they didn't earn it on their own."
In buying their children accommodations to assuage their own anxiety, parents are actually locking their kids into fragility. Says the suburban teacher: "Exams are a fact of life. They are anxiety-producing. The kids never learn how to cope with anxiety."
Putting Worry in Its Place
Children, however, are not the only ones who are harmed by hyperconcern. Vigilance is enormously taxing—and it's taken all the fun out of parenting. "Parenting has in some measurable ways become less enjoyable than it used to be," says Stearns. "I find parents less willing to indulge their children's sense of time. So they either force-feed them or do things for them."
Parents need to abandon the idea of perfection and give up some of the invasive control they've maintained over their children. The goal of parenting, Portmann reminds, is to raise an independent human being. Sooner or later, he says, most kids will be forced to confront their own mediocrity. Parents may find it easier to give up some control if they recognize they have exaggerated many of the dangers of childhood—although they have steadfastly ignored others, namely the removal of recess from schools and the ubiquity of video games that encourage aggression.
The childhood we've introduced to our children is very different from that in past eras, Epstein stresses. Children no longer work at young ages. They stay in school for longer periods of time and spend more time exclusively in the company of peers. Children are far less integrated into adult society than they used to be at every step of the way. We've introduced laws that give children many rights and protections—although we have allowed media and marketers to have free access.
In changing the nature of childhood, Stearns argues, we've introduced a tendency to assume that children can't handle difficult situations. "Middle-class parents especially assume that if kids start getting into difficulty they need to rush in and do it for them, rather than let them flounder a bit and learn from it. I don't mean we should abandon them," he says, "but give them more credit for figuring things out." And recognize that parents themselves have created many of the stresses and anxieties children are suffering from, without giving them tools to manage them.
While the adults are at it, they need to remember that one of the goals of higher education is to help young people develop the capacity to think for themselves.
Although we're well on our way to making kids more fragile, no one thinks that kids and young adults are fundamentally more flawed than in previous generations. Maybe many will "recover" from diagnoses too liberally slapped onto them. In his own studies of 14 skills he has identified as essential for adulthood in American culture, from love to leadership, Epstein has found that "although teens don't necessarily behave in a competent way, they have the potential to be every bit as competent and as incompetent as adults."
Parental anxiety has its place. But the way things now stand, it's not being applied wisely. We're paying too much attention to too few kids—and in the end, the wrong kids. As with the girl whose parents bought her the Gestalt-defect diagnosis, resources are being expended for kids who don't need them.
There are kids who are worth worrying about—kids in poverty, stresses Anderegg. "We focus so much on our own children," says Elkind. "It's time to begin caring about all children."
Wear Sunscreen
By Mary Schmich of the Chicago Tribune
Ladies and gentlemen of the class of '98: Wear sunscreen. If I could offer you only one tip for the future, sunscreen would be it. The long-term benefits of sunscreen have been proved by scientists whereas the rest of my advice has no basis more reliable than my own meandering experience.
I will dispense this advice now.
Enjoy the power and beauty of your youth. Oh, never mind. You will not understand the power and beauty of your youth until they've faded. But trust me, in 20 years, you'll look back at photos of yourself and recall in a way you can't grasp now how much possibility lay before you and how fabulous you really looked. You are not as fat as you imagine.
Don't worry about the future. Or worry, but know that worrying is as effective as trying to solve an algebra equation by chewing bubble gum. The real troubles in your life are apt to be things that never crossed your worried mind, the kind that blindside you at 4 PM on some idle Tuesday.
Do one thing every day that scares you.
Don't be reckless with other people's hearts. Don't put up with people who are reckless with yours.
Don't waste your time on jealousy. Sometimes you're ahead, sometimes you're behind. The race is long and, in the end, it's only with yourself.
Remember compliments you receive. Forget the insults. If you succeed in doing this, tell me how.
Keep your old love letters. Throw away your old bank statements.
Don't feel guilty if you don't know what you want to do with your life. The most interesting people I know didn't know at 22 what they wanted to do with their lives. Some of the most interesting 40-year-olds I know still don't.
Get plenty of calcium.
Be kind to your knees. You'll miss them when they're gone.
Maybe you'll marry, maybe you won't. Maybe you'll have children, maybe you won't. Maybe you'll divorce at 40, maybe you'll dance the funky chicken on your 75th wedding anniversary. Whatever you do, don't congratulate yourself too much, or berate yourself either. Your choices are half chance. So are everybody else's.
Enjoy your body. Use it every way you can. Don't be afraid of it or of what other people think of it. It's the greatest instrument you'll ever own.
Dance, even if you have nowhere to do it but your living room.
Read the directions, even if you don't follow them.
Do not read beauty magazines. They will only make you feel ugly.
Get to know your parents. You never know when they'll be gone for good. Be nice to your siblings. They're your best link to your past and the people most likely to stick with you in the future. Understand that friends come and go, but with a precious few you should hold on. Work hard to bridge the gaps in geography and lifestyle, because the older you get, the more you need the people who knew you when you were young.
Live in New York City once, but leave before it makes you hard. Live in Northern California once, but leave before it makes you soft.
Accept certain inalienable truths: Prices will rise. Politicians will philander. You, too, will get old. And when you do, you'll fantasize that when you were young, prices were reasonable, politicians were noble, and children respected their elders.
Respect your elders. Don't expect anyone else to support you. Maybe you have a trust fund. Maybe you'll have a wealthy spouse. But you never know when either one might run out.
Don't mess too much with your hair or by the time you're 40 it will look 85.
Be careful whose advice you buy, but be patient with those who supply it. Advice is a form of nostalgia. Dispensing it is a way of fishing the past from the disposal, wiping it off, painting over the ugly parts and recycling it for more than it's worth.
But trust me on the sunscreen.
Why the Campaign to Stop America's Obesity Crisis Keeps Failing
Most of my favorite factoids about obesity are historical ones, and they don’t make it into the new, four-part HBO documentary on the subject, The Weight of the Nation. Absent, for instance, is the fact that the very first childhood-obesity clinic in the United States was founded in the late 1930s at Columbia University by a young German physician, Hilde Bruch. As Bruch later told it, her inspiration was simple: she arrived in New York in 1934 and was “startled” by the number of fat kids she saw—“really fat ones, not only in clinics, but on the streets and subways, and in schools.”
What makes Bruch’s story relevant to the obesity problem today is that this was New York in the worst year of the Great Depression, an era of bread lines and soup kitchens, when 6 in 10 Americans were living in poverty. The conventional wisdom these days—promoted by government, obesity researchers, physicians, and probably your personal trainer as well—is that we get fat because we have too much to eat and not enough reasons to be physically active. But then why were the PC- and Big Mac–deprived Depression-era kids fat? How can we blame the obesity epidemic on gluttony and sloth if we easily find epidemics of obesity throughout the past century in populations that barely had food to survive and had to work hard to earn it?
These seem like obvious questions to ask, but you won’t get the answers from the anti-obesity establishment, which this month has come together to mount a major anti-fat effort, including The Weight of the Nation, which begins airing May 14, and “a nationwide community-based outreach campaign.” The project was created by a coalition of HBO and three key public-health institutions: the nonprofit Institute of Medicine, and two federal agencies, the Centers for Disease Control and Prevention and the National Institutes of Health. Indeed, it is unprecedented to have the IOM, CDC, and NIH all supporting a single television documentary, says producer John Hoffmann. The idea is to “sound the alarm” and motivate the nation to act.
At its heart is a simple “energy balance” idea: we get fat because we consume too many calories and expend too few. If we could just control our impulses—or at least control our environment, thereby removing temptation—and push ourselves to exercise, we’d be fine. This logic is everywhere you look in the official guidelines, commentary, and advice. “The same amount of energy IN and energy OUT over time = weight stays the same,” the NIH website counsels Americans, while the CDC site tells us, “Overweight and obesity result from an energy imbalance.”
The problem is, the solutions this multi-level campaign promotes are the same ones that have been used to fight obesity for a century—and they just haven’t worked. “We are struggling to figure this out,” NIH Director Francis Collins conceded to Newsweek last week. When I interviewed CDC obesity expert William Dietz back in 2001, he told me that his primary accomplishment had been getting childhood obesity “on the map.” “It’s now widely recognized as a major health problem in the United States,” he said then—and that was 10 years and a few million obese children ago.
There is an alternative theory, one that has also been around for decades but that the establishment has largely ignored. This theory implicates specific foods—refined sugars and grains—because of their effect on the hormone insulin, which regulates fat accumulation. If this hormonal-defect hypothesis is true, not all calories are created equal, as the conventional wisdom holds. And if it is true, the problem is not only controlling our impulses, but also changing the entire American food economy and rewriting our beliefs about what constitutes a healthy diet.
Oddly, this nutrient-hormone-fat interaction is not particularly controversial. You can find it in medical textbooks as the explanation for why our fat cells get fat. But the anti-obesity establishment doesn’t take the next step: accepting that fat fat cells lead to fat humans. In their eyes, yes, insulin regulates how much fat gets trapped in our fat cells, and the kinds of carbohydrates we eat today pretty much drive up our insulin levels. But, they conclude, while individual cells get fat that way, the reason an entire human gets fat has nothing to do with it. We’re just eating too much.
I’ve been arguing otherwise. And one reason I like this hormonal hypothesis of obesity is that it explains the fat kids in Depression-era New York. As the extreme situation of exceedingly poor populations shows, the problem could not have been that they ate too much, because they didn’t have enough food available. The problem then—as now, across America—was the prevalence of sugars, refined flour, and starches in their diets. These are the cheapest calories, and they can be plenty tasty without a lot of preparation and preservation. And the biology suggests that they are literally fattening—they make us fat, while other foods (fats, proteins, and green leafy vegetables) don’t.
If this hypothesis is right, then the reason the anti-obesity efforts championed by the IOM, the CDC, and the NIH haven’t worked and won’t work is not that we’re not listening, and not that we just can’t say no, but that these efforts are not addressing the fundamental cause of the problem. It’s like trying to prevent lung cancer by getting smokers to eat less and run more: the intervention fails because it targets the wrong cause.
The authority figures in obesity and nutrition are so fixed on the simplistic calorie-balance idea that they’re willing to ignore virtually any science to hold on to it.
The first and most obvious mistake they make is embracing the notion that the only way foods can influence how fat we get is through the amount of energy—calories—they contain. The iconic example here is sugar, or rather sugars, since we’re talking about both sucrose (the white, granulated stuff we sprinkle on cereal) and high-fructose corn syrup. “What’s the single best thing I can do for me and my family?” asks one obese mother in The Weight of the Nation. The answer she’s given is “stop drinking sugar-sweetened beverages.” But the official wisdom—that all we need know is that a calorie is a calorie is a calorie—doesn’t explain why that might be so.
Left unsaid is the fact that sucrose and high-fructose corn syrup have a unique chemical composition, a near 50-50 combination of two different carbohydrates: glucose and fructose. And while glucose is metabolized by virtually every cell in the body, the fructose (also found in fruit, but in much lower concentrations) is metabolized mostly by liver cells. From there, the chain of metabolic events has been worked out by biochemists over 50 years: some of the fructose is converted into fat, the fat accumulates in the liver cells, which become resistant to the action of insulin, and so more insulin is secreted to compensate. The end results are elevated levels of insulin, which is the hallmark of type 2 diabetes, and the steady accumulation of fat in our fat tissue—a few tens of calories’ worth per day, leading to pounds per year, and obesity over the course of a few decades.
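The "tens of calories a day, pounds per year" arithmetic is easy to check with a back-of-envelope sketch. The figures below are assumptions, not numbers from the article: the common rule of thumb that a pound of body fat stores roughly 3,500 kcal, and an illustrative surplus of 20 kcal a day.

```python
# Back-of-envelope check of the "tens of calories a day" claim.
# Assumes ~3,500 kcal stored per pound of body fat (a common rule of
# thumb, not a figure from the article).
KCAL_PER_POUND_FAT = 3500

def pounds_gained(daily_surplus_kcal, years):
    """Fat gained if a fixed daily calorie surplus is all stored as fat."""
    total_kcal = daily_surplus_kcal * 365 * years
    return total_kcal / KCAL_PER_POUND_FAT

# A surplus of just 20 kcal/day -- a bite or two of food -- compounds
# into roughly 2 pounds a year, or about 40 pounds over two decades.
print(round(pounds_gained(20, 1), 1))   # after one year
print(round(pounds_gained(20, 20), 1))  # after twenty years
```

Even at this toy level of precision, the point stands: the imbalance needed to become obese over decades is far too small to be consciously managed meal by meal.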
Last fall, researchers at the University of California, Davis, published three studies—two of humans, one of rhesus monkeys—confirming the deleterious effect of these sugars on metabolism and insulin levels. The message of all three studies was that sugars are unhealthy—not because people or monkeys consumed too much of them, but because, well, they do things to our bodies that the other nutrients we eat simply don’t do.
The second fallacy is the belief that physical activity plays a meaningful role in keeping off the pounds—an idea that the authorities just can’t seem to let go of, despite all evidence to the contrary. “We don’t walk, we don’t bike,” says University of North Carolina economist Barry Popkin in The Weight of the Nation. If we do exercise regularly, the logic goes, then we’ll at least maintain a healthy weight (along with other health benefits), which is why the official government recommendations from the USDA are that we should all do 150 minutes each week of “moderate intensity” aerobic exercise. And if that’s not enough to maintain a healthy weight or lose the excess, then, well, we should do more.
So why is the world full of obese individuals who do exercise regularly? Arkansas construction workers in The Weight of the Nation, for instance, do jobs that require constant lifting and running up ladders with “about 50 to 60 pounds of tools”—and an equal amount of excess fat. They’re on camera making the point that the combination is exhausting. “By the time the day’s over,” one tells us, “your feet are killing you; your legs are cramping. You can’t last as long as you used to.” If physical activity helps us lose weight or even just maintain it, how did these hardworking men get so fat?
There are two obvious reasons why this idea that working out makes you skinny or keeps you skinny is likely to be just wrong. One is that it takes a significant amount of exercise to burn even a modest number of calories. Run three miles, says Cornell University researcher Brian Wansink in the documentary, and you’ll burn roughly the number of calories in a single candy bar. And this brings up the second reason: you’re likely to be hungrier after strenuous exercise than before, and so you’re more likely to eat that candy bar’s worth of calories after than before. (When the American Heart Association and the American College of Sports Medicine jointly published physical-activity guidelines back in 2007, they described the evidence that exercise can even prevent us from growing fatter as “not particularly compelling,” which was a kind way to put it.)
Finally, the anti-obesity establishment embraces the idea that what our diets really lack is fresh fruits and vegetables—that these are the sine qua non of a healthy diet—and that meat, red meat in particular, is a likely cause of obesity. Since the mid-1970s, health agencies have waged a campaign to reduce our meat consumption, on the grounds that it causes colon cancer and heart disease (because of the saturated fat) and now that it supposedly makes us fat as well. The lowly cheeseburger is consistently targeted as a contributor to both obesity and diabetes.
But when David Wallinga of the Institute for Agriculture and Trade Policy tells us in The Weight of the Nation that the USDA has established the cause of the obesity epidemic and it’s “an increase in our calorie consumption over the last 30, 35 years,” he also tells us where those calories come from: a quarter come from added sugars, a quarter from added fats (“most of which are from soy”), and “almost half is from refined grains, mainly corn starches, wheat, and the like.” What Wallinga doesn’t say is that the same USDA data clearly shows that red-meat consumption peaked in this country in the mid-1970s, before the obesity epidemic started. It’s been dropping ever since, consistent with a nation that has been doing exactly what health authorities have been telling it to do.
At the moment, the government efforts to curb obesity and diabetes avoid the all-too-apparent fact, as Hilde Bruch pointed out more than half a century ago, that exhorting obese people to eat less and exercise more doesn’t work, and that this shouldn’t be an indictment of their character but of the value of the advice. By institutionalizing this advice as public-health policy, we waste enormous amounts of money and effort on programs that might make communities nicer places to live—building parks and making green markets available—but that we have little reason to believe will make anyone thinner. When I asked CDC Director Thomas Frieden about this, he pointed to two recent reports, from Massachusetts and New York, documenting small but real decreases in childhood-obesity levels. He then admitted that they had no idea why this had happened. “I’m doing everything I can do,” he said, “to assure that we rigorously monitor the efforts underway so we can try to understand what works and what doesn’t.”
If the latest research is any indication, sugar may have been the primary problem all along. Back in the 1980s, the FDA gave sugar a free pass based on the idea that the evidence wasn’t conclusive. While the government spent hundreds of millions trying to prove that salt and saturated fat are bad for our health, it spent virtually nothing on sugar. Had it targeted sugar then, instead of waiting for an obesity and diabetes epidemic for motivation, our entire food culture and the options that go with it might have changed as they did with low-fat and low-salt foods.
So what should we eat? The latest clinical trials suggest that all of us would benefit from fewer (if any) sugars and fewer refined grains (bread, pasta) and starchy vegetables (potatoes). This was the conventional wisdom through the mid-1960s, and then we turned the grains and starches into heart-healthy diet foods and the USDA enshrined them in the base of its famous Food Guide Pyramid as the staples of our diet. That this shift coincides with the obesity epidemic is probably not a coincidence. As for those of us who are overweight, experimental trials, the gold standard of medical evidence, suggest that diets that are severely restricted in fattening carbohydrates and rich in animal products—meat, eggs, cheese—and green leafy vegetables are arguably the best approach, if not the healthiest diet to eat. Not only does weight go down when people eat like this, but heart disease and diabetes risk factors are reduced. Ethical arguments against meat-eating are always valid; health arguments against it can no longer be defended.
If The Weight of the Nation accomplishes anything, it’s communicating the desperation of obese Americans trying to understand their condition and, even more, of lean (or relatively lean) parents trying to cope with the obesity of their offspring. Lack of will isn’t their problem. It’s the absence of advice that might actually work. If our authorities on this subject could accept that maybe their fundamental understanding of the problem needs to be rethought, we and they might begin to make progress. Clearly the conventional wisdom has failed so far. We can hold onto it only so long.
Gary Taubes is the author of Why We Get Fat and What to Do About It (Knopf, 2010) and Good Calories, Bad Calories: Challenging the Conventional Wisdom on Diet, Weight Control and Disease (Knopf, 2007). He’s a contributing correspondent for the journal Science and a Robert Wood Johnson Foundation Independent Investigator in Health Policy Research at the University of California, Berkeley School of Public Health. Taubes has won numerous awards for his journalism, including the International Health Reporting Award from the Pan American Health Organization and the National Association of Science Writers Science in Society Journalism Award three times, the only print journalist to do so.