Saturday, June 2, 2007

Food for Thought

Unhappy Meals
January 28, 2007
by Michael Pollan

Eat food. Not too much. Mostly plants.

That, more or less, is the short answer to the supposedly incredibly complicated and confusing question of what we humans should eat in order to be maximally healthy. I hate to give away the game right here at the beginning of a long essay, and I confess that I'm tempted to complicate matters in the interest of keeping things going for a few thousand more words. I'll try to resist but will go ahead and add a couple more details to flesh out the advice. Like: A little meat won't kill you, though it's better approached as a side dish than as a main. And you're much better off eating whole fresh foods than processed food products. That's what I mean by the recommendation to eat ''food.'' Once, food was all you could eat, but today there are lots of other edible foodlike substances in the supermarket. These novel products of food science often come in packages festooned with health claims, which brings me to a related rule of thumb: if you're concerned about your health, you should probably avoid food products that make health claims. Why? Because a health claim on a food product is a good indication that it's not really food, and food is what you want to eat.

Uh-oh. Things are suddenly sounding a little more complicated, aren't they? Sorry. But that's how it goes as soon as you try to get to the bottom of the whole vexing question of food and health. Before long, a dense cloud bank of confusion moves in. Sooner or later, everything solid you thought you knew about the links between diet and health gets blown away in the gust of the latest study.

Last winter came the news that a low-fat diet, long believed to protect against breast cancer, may do no such thing -- this from the monumental, federally financed Women's Health Initiative, which has also found no link between a low-fat diet and rates of coronary disease. The year before we learned that dietary fiber might not, as we had been confidently told, help prevent colon cancer.
Just last fall two prestigious studies on omega-3 fats published at the same time presented us with strikingly different conclusions. While the Institute of Medicine stated that ''it is uncertain how much these omega-3s contribute to improving health'' (and they might do the opposite if you get them from mercury-contaminated fish), a Harvard study declared that simply by eating a couple of servings of fish each week (or by downing enough fish oil), you could cut your risk of dying from a heart attack by more than a third -- a stunningly hopeful piece of news. It's no wonder that omega-3 fatty acids are poised to become the oat bran of 2007, as food scientists micro-encapsulate fish oil and algae oil and blast them into such formerly all-terrestrial foods as bread and tortillas, milk and yogurt and cheese, all of which will soon, you can be sure, sprout fishy new health claims. (Remember the rule?)

By now you're probably registering the cognitive dissonance of the supermarket shopper or science-section reader, as well as some nostalgia for the simplicity and solidity of the first few sentences of this essay. Which I'm still prepared to defend against the shifting winds of nutritional science and food-industry marketing. But before I do that, it might be useful to figure out how we arrived at our present state of nutritional confusion and anxiety.

The story of how the most basic questions about what to eat ever got so complicated reveals a great deal about the institutional imperatives of the food industry, nutritional science and -- ahem -- journalism, three parties that stand to gain much from widespread confusion surrounding what is, after all, the most elemental question an omnivore confronts. Humans deciding what to eat without expert help -- something they have been doing with notable success since coming down out of the trees -- is seriously unprofitable if you're a food company, distinctly risky if you're a nutritionist and just plain boring if you're a newspaper editor or journalist.
(Or, for that matter, an eater. Who wants to hear, yet again, ''Eat more fruits and vegetables''?) And so, like a large gray fog, a great Conspiracy of Confusion has gathered around the simplest questions of nutrition -- much to the advantage of everybody involved. Except perhaps the ostensible beneficiary of all this nutritional expertise and advice: us, and our health and happiness as eaters.

FROM FOODS TO NUTRIENTS

It was in the 1980s that food began disappearing from the American supermarket, gradually to be replaced by ''nutrients,'' which are not the same thing. Where once the familiar names of recognizable comestibles -- things like eggs or breakfast cereal or cookies -- claimed pride of place on the brightly colored packages crowding the aisles, now new terms like ''fiber'' and ''cholesterol'' and ''saturated fat'' rose to large-type prominence. More important than mere foods, the presence or absence of these invisible substances was now generally believed to confer health benefits on their eaters. Foods by comparison were coarse, old-fashioned and decidedly unscientific things -- who could say what was in them, really? But nutrients -- those chemical compounds and minerals in foods that nutritionists have deemed important to health -- gleamed with the promise of scientific certainty; eat more of the right ones, fewer of the wrong, and you would live longer and avoid chronic diseases.

Nutrients themselves had been around, as a concept, since the early 19th century, when the English doctor and chemist William Prout identified what came to be called the ''macronutrients'': protein, fat and carbohydrates. It was thought that that was pretty much all there was going on in food, until doctors noticed that an adequate supply of the big three did not necessarily keep people nourished. At the end of the 19th century, British doctors were puzzled by the fact that Chinese laborers in the Malay states were dying of a disease called beriberi, which didn't seem to afflict Tamils or native Malays.
The mystery was solved when someone pointed out that the Chinese ate ''polished,'' or white, rice, while the others ate rice that hadn't been mechanically milled. A few years later, Casimir Funk, a Polish chemist, discovered the ''essential nutrient'' in rice husks that protected against beriberi and called it a ''vitamine,'' the first micronutrient. Vitamins brought a kind of glamour to the science of nutrition, and though certain sectors of the population began to eat by its expert lights, it really wasn't until late in the 20th century that nutrients managed to push food aside in the popular imagination of what it means to eat.

No single event marked the shift from eating food to eating nutrients, though in retrospect a little-noticed political dust-up in Washington in 1977 seems to have helped propel American food culture down this dimly lighted path. Responding to an alarming increase in chronic diseases linked to diet -- including heart disease, cancer and diabetes -- a Senate Select Committee on Nutrition, headed by George McGovern, held hearings on the problem and prepared what by all rights should have been an uncontroversial document called ''Dietary Goals for the United States.'' The committee learned that while rates of coronary heart disease had soared in America since World War II, other cultures that consumed traditional diets based largely on plants had strikingly low rates of chronic disease. Epidemiologists also had observed that in America during the war years, when meat and dairy products were strictly rationed, the rate of heart disease temporarily plummeted.

Naively putting two and two together, the committee drafted a straightforward set of dietary guidelines calling on Americans to cut down on red meat and dairy products. Within weeks a firestorm, emanating from the red-meat and dairy industries, engulfed the committee, and Senator McGovern (who had a great many cattle ranchers among his South Dakota constituents) was forced to beat a retreat.
The committee's recommendations were hastily rewritten. Plain talk about food -- the committee had advised Americans to actually ''reduce consumption of meat'' -- was replaced by artful compromise: ''Choose meats, poultry and fish that will reduce saturated-fat intake.''

A subtle change in emphasis, you might say, but a world of difference just the same. First, the stark message to ''eat less'' of a particular food has been deep-sixed; don't look for it ever again in any official U.S. dietary pronouncement. Second, notice how distinctions between entities as different as fish and beef and chicken have collapsed; those three venerable foods, each representing an entirely different taxonomic class, are now lumped together as delivery systems for a single nutrient. Notice too how the new language exonerates the foods themselves; now the culprit is an obscure, invisible, tasteless -- and politically unconnected -- substance that may or may not lurk in them called ''saturated fat.''

The linguistic capitulation did nothing to rescue McGovern from his blunder; the very next election, in 1980, the beef lobby helped rusticate the three-term senator, sending an unmistakable warning to anyone who would challenge the American diet, and in particular the big chunk of animal protein sitting in the middle of its plate. Henceforth, government dietary guidelines would shun plain talk about whole foods, each of which has its trade association on Capitol Hill, and would instead arrive clothed in scientific euphemism and speaking of nutrients, entities that few Americans really understood but that lack powerful lobbies in Washington. This was precisely the tack taken by the National Academy of Sciences when it issued its landmark report on diet and cancer in 1982. Organized nutrient by nutrient in a way guaranteed to offend no food group, it codified the official new dietary language.
Industry and media followed suit, and terms like polyunsaturated, cholesterol, monounsaturated, carbohydrate, fiber, polyphenols, amino acids and carotenes soon colonized much of the cultural space previously occupied by the tangible substance formerly known as food. The Age of Nutritionism had arrived.

THE RISE OF NUTRITIONISM

The first thing to understand about nutritionism -- I first encountered the term in the work of an Australian sociologist of science named Gyorgy Scrinis -- is that it is not quite the same as nutrition. As the ''ism'' suggests, it is not a scientific subject but an ideology. Ideologies are ways of organizing large swaths of life and experience under a set of shared but unexamined assumptions. This quality makes an ideology particularly hard to see, at least while it's exerting its hold on your culture. A reigning ideology is a little like the weather, all pervasive and virtually inescapable. Still, we can try.

In the case of nutritionism, the widely shared but unexamined assumption is that the key to understanding food is indeed the nutrient. From this basic premise flow several others. Since nutrients, as compared with foods, are invisible and therefore slightly mysterious, it falls to the scientists (and to the journalists through whom the scientists speak) to explain the hidden reality of foods to us. To enter a world in which you dine on unseen nutrients, you need lots of expert help.

But expert help to do what, exactly? This brings us to another unexamined assumption: that the whole point of eating is to maintain and promote bodily health. Hippocrates's famous injunction to ''let food be thy medicine'' is ritually invoked to support this notion.
I'll leave the premise alone for now, except to point out that it is not shared by all cultures and that the experience of these other cultures suggests that, paradoxically, viewing food as being about things other than bodily health -- like pleasure, say, or socializing -- makes people no less healthy; indeed, there's some reason to believe that it may make them more healthy. This is what we usually have in mind when we speak of the ''French paradox'' -- the fact that a population that eats all sorts of unhealthful nutrients is in many ways healthier than we Americans are. So there is at least a question as to whether nutritionism is actually any good for you.

Another potentially serious weakness of nutritionist ideology is that it has trouble discerning qualitative distinctions between foods. So fish, beef and chicken through the nutritionists' lens become mere delivery systems for varying quantities of fats and proteins and whatever other nutrients are on their scope. Similarly, any qualitative distinctions between processed foods and whole foods disappear when your focus is on quantifying the nutrients they contain (or, more precisely, the known nutrients).

This is a great boon for manufacturers of processed food, and it helps explain why they have been so happy to get with the nutritionism program. In the years following McGovern's capitulation and the 1982 National Academy report, the food industry set about re-engineering thousands of popular food products to contain more of the nutrients that science and government had deemed the good ones and less of the bad, and by the late '80s a golden era of food science was upon us. The Year of Eating Oat Bran -- also known as 1988 -- served as a kind of coming-out party for the food scientists, who succeeded in getting the material into nearly every processed food sold in America.
Oat bran's moment on the dietary stage didn't last long, but the pattern had been established, and every few years since then a new oat bran has taken its turn under the marketing lights. (Here comes omega-3!)

By comparison, the typical real food has more trouble competing under the rules of nutritionism, if only because something like a banana or an avocado can't easily change its nutritional stripes (though rest assured the genetic engineers are hard at work on the problem). So far, at least, you can't put oat bran in a banana. So depending on the reigning nutritional orthodoxy, the avocado might be either a high-fat food to be avoided (Old Think) or a food high in monounsaturated fat to be embraced (New Think). The fate of each whole food rises and falls with every change in the nutritional weather, while the processed foods are simply reformulated. That's why when the Atkins mania hit the food industry, bread and pasta were given a quick redesign (dialing back the carbs; boosting the protein), while the poor unreconstructed potatoes and carrots were left out in the cold.

Of course it's also a lot easier to slap a health claim on a box of sugary cereal than on a potato or carrot, with the perverse result that the most healthful foods in the supermarket sit there quietly in the produce section, silent as stroke victims, while a few aisles over, the Cocoa Puffs and Lucky Charms are screaming about their newfound whole-grain goodness.

EAT RIGHT, GET FATTER

So nutritionism is good for business. But is it good for us? You might think that a national fixation on nutrients would lead to measurable improvements in the public health. But for that to happen, the underlying nutritional science, as well as the policy recommendations (and the journalism) based on that science, would have to be sound. This has seldom been the case. Consider what happened immediately after the 1977 ''Dietary Goals'' -- McGovern's masterpiece of politico-nutritionist compromise.
In the wake of the panel's recommendation that we cut down on saturated fat, a recommendation seconded by the 1982 National Academy report on cancer, Americans did indeed change their diets, endeavoring for a quarter-century to do what they had been told. Well, kind of. The industrial food supply was promptly reformulated to reflect the official advice, giving us low-fat pork, low-fat Snackwell's and all the low-fat pasta and high-fructose (yet low-fat!) corn syrup we could consume. Which turned out to be quite a lot. Oddly, America got really fat on its new low-fat diet -- indeed, many date the current obesity and diabetes epidemic to the late 1970s, when Americans began binging on carbohydrates, ostensibly as a way to avoid the evils of fat.

This story has been told before, notably in these pages (''What if It's All Been a Big Fat Lie?'' by Gary Taubes, July 7, 2002), but it's a little more complicated than the official version suggests. In that version, which inspired the most recent Atkins craze, we were told that America got fat when, responding to bad scientific advice, it shifted its diet from fats to carbs, suggesting that a re-evaluation of the two nutrients is in order: fat doesn't make you fat; carbs do. (Why this should have come as news is a mystery: as long as people have been raising animals for food, they have fattened them on carbs.)

But there are a couple of problems with this revisionist picture. First, while it is true that Americans post-1977 did begin binging on carbs, and that fat as a percentage of total calories in the American diet declined, we never did in fact cut down on our consumption of fat. Meat consumption actually climbed. We just heaped a bunch more carbs onto our plates, obscuring perhaps, but not replacing, the expanding chunk of animal protein squatting in the center.

How did that happen? I would submit that the ideology of nutritionism deserves as much of the blame as the carbohydrates themselves do -- that and human nature.
By framing dietary advice in terms of good and bad nutrients, and by burying the recommendation that we should eat less of any particular food, it was easy for the take-home message of the 1977 and 1982 dietary guidelines to be simplified as follows: Eat more low-fat foods. And that is what we did. We're always happy to receive a dispensation to eat more of something (with the possible exception of oat bran), and one of the things nutritionism reliably gives us is some such dispensation: low-fat cookies then, low-carb beer now. It's hard to imagine the low-fat craze taking off as it did if McGovern's original food-based recommendations had stood: eat fewer meat and dairy products. For how do you get from that stark counsel to the idea that another case of Snackwell's is just what the doctor ordered?

BAD SCIENCE

But if nutritionism leads to a kind of false consciousness in the mind of the eater, the ideology can just as easily mislead the scientist. Most nutritional science involves studying one nutrient at a time, an approach that even nutritionists who do it will tell you is deeply flawed. ''The problem with nutrient-by-nutrient nutrition science,'' points out Marion Nestle, the New York University nutritionist, ''is that it takes the nutrient out of the context of food, the food out of the context of diet and the diet out of the context of lifestyle.''

If nutritional scientists know this, why do they do it anyway? Because a nutrient bias is built into the way science is done: scientists need individual variables they can isolate. Yet even the simplest food is a hopelessly complex thing to study, a virtual wilderness of chemical compounds, many of which exist in complex and dynamic relation to one another, and all of which together are in the process of changing from one state to another.
So if you're a nutritional scientist, you do the only thing you can do, given the tools at your disposal: break the thing down into its component parts and study those one by one, even if that means ignoring complex interactions and contexts, as well as the fact that the whole may be more than, or just different from, the sum of its parts. This is what we mean by reductionist science.

Scientific reductionism is an undeniably powerful tool, but it can mislead us too, especially when applied to something as complex as, on the one side, a food, and on the other, a human eater. It encourages us to take a mechanistic view of that transaction: put in this nutrient; get out that physiological result. Yet people differ in important ways. Some populations can metabolize sugars better than others; depending on your evolutionary heritage, you may or may not be able to digest the lactose in milk. The specific ecology of your intestines helps determine how efficiently you digest what you eat, so that the same input of 100 calories may yield more or less energy depending on the proportion of Firmicutes and Bacteroidetes living in your gut. There is nothing very machinelike about the human eater, and so to think of food as simply fuel is wrong.

Also, people don't eat nutrients, they eat foods, and foods can behave very differently than the nutrients they contain. Researchers have long believed, based on epidemiological comparisons of different populations, that a diet high in fruits and vegetables confers some protection against cancer. So naturally they ask, What nutrients in those plant foods are responsible for that effect? One hypothesis is that the antioxidants in fresh produce -- compounds like beta carotene, lycopene, vitamin E, etc. -- are the X factor. It makes good sense: these molecules (which plants produce to protect themselves from the highly reactive oxygen atoms produced in photosynthesis) vanquish the free radicals in our bodies, which can damage DNA and initiate cancers.
At least that's how it seems to work in the test tube. Yet as soon as you remove these useful molecules from the context of the whole foods they're found in, as we've done in creating antioxidant supplements, they don't work at all. Indeed, in the case of beta carotene ingested as a supplement, scientists have discovered that it actually increases the risk of certain cancers. Big oops.

What's going on here? We don't know. It could be the vagaries of human digestion. Maybe the fiber (or some other component) in a carrot protects the antioxidant molecules from destruction by stomach acids early in the digestive process. Or it could be that we isolated the wrong antioxidant. Beta is just one of a whole slew of carotenes found in common vegetables; maybe we focused on the wrong one. Or maybe beta carotene works as an antioxidant only in concert with some other plant chemical or process; under other circumstances, it may behave as a pro-oxidant.

Indeed, to look at the chemical composition of any common food plant is to realize just how much complexity lurks within it. Here's a list of just the antioxidants that have been identified in garden-variety thyme:

4-Terpineol, alanine, anethole, apigenin, ascorbic acid, beta carotene, caffeic acid, camphene, carvacrol, chlorogenic acid, chrysoeriol, eriodictyol, eugenol, ferulic acid, gallic acid, gamma-terpinene, isochlorogenic acid, isoeugenol, isothymonin, kaempferol, labiatic acid, lauric acid, linalyl acetate, luteolin, methionine, myrcene, myristic acid, naringenin, oleanolic acid, p-coumoric acid, p-hydroxy-benzoic acid, palmitic acid, rosmarinic acid, selenium, tannin, thymol, tryptophan, ursolic acid, vanillic acid.

This is what you're ingesting when you eat food flavored with thyme. Some of these chemicals are broken down by your digestion, but others are going on to do undetermined things to your body: turning some gene's expression on or off, perhaps, or heading off a free radical before it disturbs a strand of DNA deep in some cell.
It would be great to know how this all works, but in the meantime we can enjoy thyme in the knowledge that it probably doesn't do any harm (since people have been eating it forever) and that it may actually do some good (since people have been eating it forever) and that even if it does nothing, we like the way it tastes.

It's also important to remind ourselves that what reductive science can manage to perceive well enough to isolate and study is subject to change, and that we have a tendency to assume that what we can see is all there is to see. When William Prout isolated the big three macronutrients, scientists figured they now understood food and what the body needs from it; when the vitamins were isolated a few decades later, scientists thought, O.K., now we really understand food and what the body needs to be healthy; today it's the polyphenols and carotenoids that seem all-important. But who knows what the hell else is going on deep in the soul of a carrot?

The good news is that, to the carrot eater, it doesn't matter. That's the great thing about eating food as compared with nutrients: you don't need to fathom a carrot's complexity to reap its benefits.

The case of the antioxidants points up the dangers in taking a nutrient out of the context of food; as Nestle suggests, scientists make a second, related error when they study the food out of the context of the diet. We don't eat just one thing, and when we are eating any one thing, we're not eating another. We also eat foods in combinations and in orders that can affect how they're absorbed. Drink coffee with your steak, and your body won't be able to fully absorb the iron in the meat. The trace of limestone in the corn tortilla unlocks essential amino acids in the corn that would otherwise remain unavailable. Some of those compounds in that sprig of thyme may well affect my digestion of the dish I add it to, helping to break down one compound or possibly stimulate production of an enzyme to detoxify another.
We have barely begun to understand the relationships among foods in a cuisine.

But we do understand some of the simplest relationships, like the zero-sum relationship: that if you eat a lot of meat you're probably not eating a lot of vegetables. This simple fact may explain why populations that eat diets high in meat have higher rates of coronary heart disease and cancer than those that don't. Yet nutritionism encourages us to look elsewhere for the explanation: deep within the meat itself, to the culpable nutrient, which scientists have long assumed to be the saturated fat. So they are baffled when large-population studies, like the Women's Health Initiative, fail to find that reducing fat intake significantly reduces the incidence of heart disease or cancer.

Of course thanks to the low-fat fad (inspired by the very same reductionist fat hypothesis), it is entirely possible to reduce your intake of saturated fat without significantly reducing your consumption of animal protein: just drink the low-fat milk and order the skinless chicken breast or the turkey bacon. So maybe the culprit nutrient in meat and dairy is the animal protein itself, as some researchers now hypothesize. (The Cornell nutritionist T. Colin Campbell argues as much in his recent book, ''The China Study.'') Or, as the Harvard epidemiologist Walter C. Willett suggests, it could be the steroid hormones typically present in the milk and meat; these hormones (which occur naturally in meat and milk but are often augmented in industrial production) are known to promote certain cancers.

But people worried about their health needn't wait for scientists to settle this question before deciding that it might be wise to eat more plants and less meat. This is of course precisely what the McGovern committee was trying to tell us.

Nestle also cautions against taking the diet out of the context of the lifestyle.
The Mediterranean diet is widely believed to be one of the most healthful ways to eat, yet much of what we know about it is based on studies of people living on the island of Crete in the 1950s, who in many respects lived lives very different from our own. Yes, they ate lots of olive oil and little meat. But they also did more physical labor. They fasted regularly. They ate a lot of wild greens -- weeds. And, perhaps most important, they consumed far fewer total calories than we do. Similarly, much of what we know about the health benefits of a vegetarian diet is based on studies of Seventh Day Adventists, who muddy the nutritional picture by drinking absolutely no alcohol and never smoking. These extraneous but unavoidable factors are called, aptly, ''confounders.'' One last example: People who take supplements are healthier than the population at large, but their health probably has nothing whatsoever to do with the supplements they take -- which recent studies have suggested are worthless. Supplement-takers are better-educated, more-affluent people who, almost by definition, take a greater-than-normal interest in personal health -- confounding factors that probably account for their superior health.

But if confounding factors of lifestyle bedevil comparative studies of different populations, the supposedly more rigorous ''prospective'' studies of large American populations suffer from their own arguably even more disabling flaws. In these studies -- of which the Women's Health Initiative is the best known -- a large population is divided into two groups. The intervention group changes its diet in some prescribed manner, while the control group does not. The two groups are then tracked over many years to learn whether the intervention affects relative rates of chronic disease. When it comes to studying nutrition, this sort of extensive, long-term clinical trial is supposed to be the gold standard. It certainly sounds sound.
In the case of the Women's Health Initiative, sponsored by the National Institutes of Health, the eating habits and health outcomes of nearly 49,000 women (ages 50 to 79 at the beginning of the study) were tracked for eight years. One group of the women was told to reduce its consumption of fat to 20 percent of total calories. The results were announced early last year, producing front-page headlines of which the one in this newspaper was typical: ''Low-Fat Diet Does Not Cut Health Risks, Study Finds.'' And the cloud of nutritional confusion over the country darkened.

But even a cursory analysis of the study's methods makes you wonder why anyone would take such a finding seriously, let alone order a Quarter Pounder With Cheese to celebrate it, as many newspaper readers no doubt promptly went out and did. Even the beginner student of nutritionism will immediately spot several flaws: the focus was on ''fat,'' rather than on any particular food, like meat or dairy. So women could comply simply by switching to lower-fat animal products. Also, no distinctions were made between types of fat: women getting their allowable portion of fat from olive oil or fish were lumped together with women getting their fat from low-fat cheese or chicken breasts or margarine. Why? Because when the study was designed 16 years ago, the whole notion of ''good fats'' was not yet on the scientific scope. Scientists study what scientists can see.

But perhaps the biggest flaw in this study, and other studies like it, is that we have no idea what these women were really eating because, like most people when asked about their diet, they lied about it. How do we know this? Deduction. Consider: When the study began, the average participant weighed in at 170 pounds and claimed to be eating 1,800 calories a day. It would take an unusual metabolism to maintain that weight on so little food.
And it would take an even freakier metabolism to drop only one or two pounds after getting down to a diet of 1,400 to 1,500 calories a day -- as the women on the ''low-fat'' regimen claimed to have done. Sorry, ladies, but I just don't buy it.

In fact, nobody buys it. Even the scientists who conduct this sort of research conduct it in the knowledge that people lie about their food intake all the time. They even have scientific figures for the magnitude of the lie. Dietary trials like the Women's Health Initiative rely on ''food-frequency questionnaires,'' and studies suggest that people on average eat between a fifth and a third more than they claim to on the questionnaires. How do the researchers know that? By comparing what people report on questionnaires with interviews about their dietary intake over the previous 24 hours, thought to be somewhat more reliable. In fact, the magnitude of the lie could be much greater, judging by the huge disparity between the total number of food calories produced every day for each American (3,900 calories) and the average number of those calories Americans own up to chomping: 2,000. (Waste accounts for some of the disparity, but nowhere near all of it.) All we really know about how much people actually eat is that the real number lies somewhere between those two figures.

To try to fill out the food-frequency questionnaire used by the Women's Health Initiative, as I recently did, is to realize just how shaky the data on which such trials rely really are.
The survey, which took about 45 minutes to complete, started off with some relatively easy questions: ''Did you eat chicken or turkey during the last three months?'' Having answered yes, I was then asked, ''When you ate chicken or turkey, how often did you eat the skin?'' But the survey soon became harder, as when it asked me to think back over the past three months to recall whether when I ate okra, squash or yams, they were fried, and if so, were they fried in stick margarine, tub margarine, butter, ''shortening'' (in which category they inexplicably lump together hydrogenated vegetable oil and lard), olive or canola oil or nonstick spray? I honestly didn't remember, and in the case of any okra eaten in a restaurant, even a hypnotist could not get out of me what sort of fat it was fried in. In the meat section, the portion sizes specified haven't been seen in America since the Hoover administration. If a four-ounce portion of steak is considered ''medium,'' was I really going to admit that the steak I enjoyed on an unrecallable number of occasions during the past three months was probably the equivalent of two or three (or, in the case of a steakhouse steak, no less than four) of these portions? I think not. In fact, most of the ''medium serving sizes'' to which I was asked to compare my own consumption made me feel piggish enough to want to shave a few ounces here, a few there. (I mean, I wasn't under oath or anything, was I?)

This is the sort of data on which the largest questions of diet and health are being decided in America today.

THE ELEPHANT IN THE ROOM

In the end, the biggest, most ambitious and widely reported studies of diet and health leave more or less undisturbed the main features of the Western diet: lots of meat and processed foods, lots of added fat and sugar, lots of everything -- except fruits, vegetables and whole grains.
In keeping with the nutritionism paradigm and the limits of reductionist science, the researchers fiddle with single nutrients as best they can, but the populations they recruit and study are typical American eaters doing what typical American eaters do: trying to eat a little less of this nutrient, a little more of that, depending on the latest thinking. (One problem with the control groups in these studies is that they too are exposed to nutritional fads in the culture, so over time their eating habits come to more closely resemble the habits of the intervention group.) It should not surprise us that the findings of such research would be so equivocal and confusing.

But what about the elephant in the room -- the Western diet? It might be useful, in the midst of our deepening confusion about nutrition, to review what we do know about diet and health. What we know is that people who eat the way we do in America today suffer much higher rates of cancer, heart disease, diabetes and obesity than people eating more traditional diets. (Four of the 10 leading killers in America are linked to diet.) Further, we know that simply by moving to America, people from nations with low rates of these ''diseases of affluence'' will quickly acquire them. Nutritionism by and large takes the Western diet as a given, seeking to moderate its most deleterious effects by isolating the bad nutrients in it -- things like fat, sugar, salt -- and encouraging the public and the food industry to limit them. But after several decades of nutrient-based health advice, rates of cancer and heart disease in the U.S. have declined only slightly (mortality from heart disease is down since the '50s, but this is mainly because of improved treatment), and rates of obesity and diabetes have soared.

No one likes to admit that his or her best efforts at understanding and solving a problem have actually made the problem worse, but that's exactly what has happened in the case of nutritionism.
Scientists operating with the best of intentions, using the best tools at their disposal, have taught us to look at food in a way that has diminished our pleasure in eating it while doing little or nothing to improve our health. Perhaps what we need now is a broader, less reductive view of what food is, one that is at once more ecological and cultural. What would happen, for example, if we were to start thinking about food as less of a thing and more of a relationship?

In nature, that is of course precisely what eating has always been: relationships among species in what we call food chains, or webs, that reach all the way down to the soil. Species co-evolve with the other species they eat, and very often a relationship of interdependence develops: I'll feed you if you spread around my genes. A gradual process of mutual adaptation transforms something like an apple or a squash into a nutritious and tasty food for a hungry animal. Over time and through trial and error, the plant becomes tastier (and often more conspicuous) in order to gratify the animal's needs and desires, while the animal gradually acquires whatever digestive tools (enzymes, etc.) are needed to make optimal use of the plant. Similarly, cow's milk did not start out as a nutritious food for humans; in fact, it made them sick until humans who lived around cows evolved the ability to digest lactose as adults. This development proved much to the advantage of both the milk drinkers and the cows.

''Health'' is, among other things, the byproduct of being involved in these sorts of relationships in a food chain -- involved in a great many of them, in the case of an omnivorous creature like us. Further, when the health of one link of the food chain is disturbed, it can affect all the creatures in it. When the soil is sick or in some way deficient, so will be the grasses that grow in that soil and the cattle that eat the grasses and the people who drink the milk.
Or, as the English agronomist Sir Albert Howard put it in 1945 in ''The Soil and Health'' (a founding text of organic agriculture), we would do well to regard ''the whole problem of health in soil, plant, animal and man as one great subject.'' Our personal health is inextricably bound up with the health of the entire food web.

In many cases, long familiarity between foods and their eaters leads to elaborate systems of communications up and down the food chain, so that a creature's senses come to recognize foods as suitable by taste and smell and color, and our bodies learn what to do with these foods after they pass the test of the senses, producing in anticipation the chemicals necessary to break them down. Health depends on knowing how to read these biological signals: this smells spoiled; this looks ripe; that's one good-looking cow. This is easier to do when a creature has long experience of a food, and much harder when a food has been designed expressly to deceive its senses -- with artificial flavors, say, or synthetic sweeteners.

Note that these ecological relationships are between eaters and whole foods, not nutrients. Even though the foods in question eventually get broken down in our bodies into simple nutrients, as corn is reduced to simple sugars, the qualities of the whole food are not unimportant -- they govern such things as the speed at which the sugars will be released and absorbed, which we're coming to see as critical to insulin metabolism. Put another way, our bodies have a longstanding and sustainable relationship to corn that we do not have to high-fructose corn syrup. Such a relationship with corn syrup might develop someday (as people evolve superhuman insulin systems to cope with regular floods of fructose and glucose), but for now the relationship leads to ill health because our bodies don't know how to handle these biological novelties.
In much the same way, human bodies that can cope with chewing coca leaves -- a longstanding relationship between native people and the coca plant in South America -- cannot cope with cocaine or crack, even though the same ''active ingredients'' are present in all three. Reductionism as a way of understanding food or drugs may be harmless, even necessary, but reductionism in practice can lead to problems.

Looking at eating through this ecological lens opens a whole new perspective on exactly what the Western diet is: a radical and rapid change not just in our foodstuffs over the course of the 20th century but also in our food relationships, all the way from the soil to the meal. The ideology of nutritionism is itself part of that change. To get a firmer grip on the nature of those changes is to begin to know how we might make our relationships to food healthier. These changes have been numerous and far-reaching, but consider as a start these four large-scale ones:

From Whole Foods to Refined. The case of corn points up one of the key features of the modern diet: a shift toward increasingly refined foods, especially carbohydrates. Call it applied reductionism. Humans have been refining grains since at least the Industrial Revolution, favoring white flour (and white rice) even at the price of lost nutrients. Refining grains extends their shelf life (precisely because it renders them less nutritious to pests) and makes them easier to digest, by removing the fiber that ordinarily slows the release of their sugars. Much industrial food production involves an extension and intensification of this practice, as food processors find ways to deliver glucose -- the brain's preferred fuel -- ever more swiftly and efficiently.
Sometimes this is precisely the point, as when corn is refined into corn syrup; other times it is an unfortunate byproduct of food processing, as when freezing food destroys the fiber that would slow sugar absorption.

So fast food is fast in this other sense too: it is to a considerable extent predigested, in effect, and therefore more readily absorbed by the body. But while the widespread acceleration of the Western diet offers us the instant gratification of sugar, in many people (and especially those newly exposed to it) the ''speediness'' of this food overwhelms the insulin response and leads to Type II diabetes. As one nutrition expert put it to me, we're in the middle of ''a national experiment in mainlining glucose.'' To encounter such a diet for the first time, as when people accustomed to a more traditional diet come to America, or when fast food comes to their countries, delivers a shock to the system. Public-health experts call it ''the nutrition transition,'' and it can be deadly.

From Complexity to Simplicity. If there is one word that covers nearly all the changes industrialization has made to the food chain, it would be simplification. Chemical fertilizers simplify the chemistry of the soil, which in turn appears to simplify the chemistry of the food grown in that soil. Since the widespread adoption of synthetic nitrogen fertilizers in the 1950s, the nutritional quality of produce in America has, according to U.S.D.A. figures, declined significantly. Some researchers blame the quality of the soil for the decline; others cite the tendency of modern plant breeding to select for industrial qualities like yield rather than nutritional quality. Whichever it is, the trend toward simplification of our food continues on up the chain. Processing foods depletes them of many nutrients, a few of which are then added back in through ''fortification'': folic acid in refined flour, vitamins and minerals in breakfast cereal.
But food scientists can add back only the nutrients food scientists recognize as important. What are they overlooking?

Simplification has occurred at the level of species diversity, too. The astounding variety of foods on offer in the modern supermarket obscures the fact that the actual number of species in the modern diet is shrinking. For reasons of economics, the food industry prefers to tease its myriad processed offerings from a tiny group of plant species, corn and soybeans chief among them. Today, a mere four crops account for two-thirds of the calories humans eat. When you consider that humankind has historically consumed some 80,000 edible species, and that 3,000 of these have been in widespread use, this represents a radical simplification of the food web. Why should this matter? Because humans are omnivores, requiring somewhere between 50 and 100 different chemical compounds and elements to be healthy. It's hard to believe that we can get everything we need from a diet consisting largely of processed corn, soybeans, wheat and rice.

From Leaves to Seeds. It's no coincidence that most of the plants we have come to rely on are grains; these crops are exceptionally efficient at transforming sunlight into macronutrients -- carbs, fats and proteins. These macronutrients in turn can be profitably transformed into animal protein (by feeding them to animals) and processed foods of every description. Also, the fact that grains are durable seeds that can be stored for long periods means they can function as commodities as well as food, making these plants particularly well suited to the needs of industrial capitalism.

The needs of the human eater are another matter. An oversupply of macronutrients, as we now have, itself represents a serious threat to our health, as evidenced by soaring rates of obesity and diabetes. But the undersupply of micronutrients may constitute a threat just as serious.
Put in the simplest terms, we're eating a lot more seeds and a lot fewer leaves, a tectonic dietary shift the full implications of which we are just beginning to glimpse. If I may borrow the nutritionist's reductionist vocabulary for a moment, there are a host of critical micronutrients that are harder to get from a diet of refined seeds than from a diet of leaves. There are the antioxidants and all the other newly discovered phytochemicals (remember that sprig of thyme?); there is the fiber, and then there are the healthy omega-3 fats found in leafy green plants, which may turn out to be the most important benefit of all.

Most people associate omega-3 fatty acids with fish, but fish get them from green plants (specifically algae), which is where they all originate. Plant leaves produce these essential fatty acids (''essential'' because our bodies can't produce them on their own) as part of photosynthesis. Seeds contain more of another essential fatty acid: omega-6. Without delving too deeply into the biochemistry, the two fats perform very different functions, in the plant as well as the plant eater. Omega-3s appear to play an important role in neurological development and processing, the permeability of cell walls, the metabolism of glucose and the calming of inflammation. Omega-6s are involved in fat storage (which is what they do for the plant), the rigidity of cell walls, clotting and the inflammation response. (Think of omega-3s as fleet and flexible, omega-6s as sturdy and slow.) Since the two lipids compete with each other for the attention of important enzymes, the ratio between omega-3s and omega-6s may matter more than the absolute quantity of either fat. Thus too much omega-6 may be just as much a problem as too little omega-3.

And that might well be a problem for people eating a Western diet. As we've shifted from leaves to seeds, the ratio of omega-6s to omega-3s in our bodies has shifted, too.
At the same time, modern food-production practices have further diminished the omega-3s in our diet. Omega-3s, being less stable than omega-6s, spoil more readily, so we have selected for plants that produce fewer of them; further, when we partly hydrogenate oils to render them more stable, omega-3s are eliminated. Industrial meat, raised on seeds rather than leaves, has fewer omega-3s and more omega-6s than preindustrial meat used to have. And official dietary advice since the 1970s has promoted the consumption of polyunsaturated vegetable oils, most of which are high in omega-6s (corn and soy, especially). Thus, without realizing what we were doing, we significantly altered the ratio of these two essential fats in our diets and bodies, with the result that the ratio of omega-6 to omega-3 in the typical American today stands at more than 10 to 1; before the widespread introduction of seed oils at the turn of the last century, it was closer to 1 to 1.

The role of these lipids is not completely understood, but many researchers say that these historically low levels of omega-3 (or, conversely, high levels of omega-6) bear responsibility for many of the chronic diseases associated with the Western diet, especially heart disease and diabetes. (Some researchers implicate omega-3 deficiency in rising rates of depression and learning disabilities as well.) To remedy this deficiency, nutritionism classically argues for taking omega-3 supplements or fortifying food products, but because of the complex, competitive relationship between omega-3 and omega-6, adding more omega-3s to the diet may not do much good unless you also reduce your intake of omega-6.

From Food Culture to Food Science. The last important change wrought by the Western diet is not, strictly speaking, ecological. But the industrialization of our food that we call the Western diet is systematically destroying traditional food cultures.
Before the modern food era -- and before nutritionism -- people relied for guidance about what to eat on their national or ethnic or regional cultures. We think of culture as a set of beliefs and practices to help mediate our relationship to other people, but of course culture (at least before the rise of science) has also played a critical role in helping mediate people's relationship to nature. Eating being a big part of that relationship, cultures have had a great deal to say about what and how and why and when and how much we should eat. Of course when it comes to food, culture is really just a fancy word for Mom, the figure who typically passes on the food ways of the group -- food ways that, although they were never ''designed'' to optimize health (we have many reasons to eat the way we do), would not have endured if they did not keep eaters alive and well.

The sheer novelty and glamour of the Western diet, with its 17,000 new food products introduced every year, and the marketing muscle used to sell these products, has overwhelmed the force of tradition and left us where we now find ourselves: relying on science and journalism and marketing to help us decide questions about what to eat. Nutritionism, which arose to help us better deal with the problems of the Western diet, has largely been co-opted by it, used by the industry to sell more food and to undermine the authority of traditional ways of eating. You would not have read this far into this article if your food culture were intact and healthy; you would simply eat the way your parents and grandparents and great-grandparents taught you to eat. The question is, Are we better off with these new authorities than we were with the traditional authorities they supplanted? The answer by now should be clear.

It might be argued that, at this point in history, we should simply accept that fast food is our food culture. Over time, people will get used to eating this way and our health will improve.
But for natural selection to help populations adapt to the Western diet, we'd have to be prepared to let those whom it sickens die. That's not what we're doing. Rather, we're turning to the health-care industry to help us ''adapt.'' Medicine is learning how to keep alive the people whom the Western diet is making sick. It's gotten good at extending the lives of people with heart disease, and now it's working on obesity and diabetes. Capitalism is itself marvelously adaptive, able to turn the problems it creates into lucrative business opportunities: diet pills, heart-bypass operations, insulin pumps, bariatric surgery. But while fast food may be good business for the health-care industry, surely the cost to society -- estimated at more than $200 billion a year in diet-related health-care costs -- is unsustainable.

BEYOND NUTRITIONISM

To medicalize the diet problem is of course perfectly consistent with nutritionism. So what might a more ecological or cultural approach to the problem recommend? How might we plot our escape from nutritionism and, in turn, from the deleterious effects of the modern diet? In theory nothing could be simpler -- stop thinking and eating that way -- but this is somewhat harder to do in practice, given the food environment we now inhabit and the loss of sharp cultural tools to guide us through it. Still, I do think escape is possible, to which end I can now revisit -- and elaborate on, but just a little -- the simple principles of healthy eating I proposed at the beginning of this essay, several thousand words ago. So try these few (flagrantly unscientific) rules of thumb, collected in the course of my nutritional odyssey, and see if they don't at least point us in the right direction.

1. Eat food. Though in our current state of confusion, this is much easier said than done. So try this: Don't eat anything your great-great-grandmother wouldn't recognize as food.
(Sorry, but at this point Moms are as confused as the rest of us, which is why we have to go back a couple of generations, to a time before the advent of modern food products.) There are a great many foodlike items in the supermarket your ancestors wouldn't recognize as food (Go-Gurt? Breakfast-cereal bars? Nondairy creamer?); stay away from these.

2. Avoid even those food products that come bearing health claims. They're apt to be heavily processed, and the claims are often dubious at best. Don't forget that margarine, one of the first industrial foods to claim that it was more healthful than the traditional food it replaced, turned out to give people heart attacks. When Kellogg's can boast about its Healthy Heart Strawberry Vanilla cereal bars, health claims have become hopelessly compromised. (The American Heart Association charges food makers for their endorsement.) Don't take the silence of the yams as a sign that they have nothing valuable to say about health.

3. Especially avoid food products containing ingredients that are a) unfamiliar, b) unpronounceable, c) more than five in number -- or that contain high-fructose corn syrup. None of these characteristics is necessarily harmful in and of itself, but all of them are reliable markers for foods that have been highly processed.

4. Get out of the supermarket whenever possible. You won't find any high-fructose corn syrup at the farmer's market; you also won't find food harvested long ago and far away. What you will find are fresh whole foods picked at the peak of nutritional quality. Precisely the kind of food your great-great-grandmother would have recognized as food.

5. Pay more, eat less. The American food system has for a century devoted its energies and policies to increasing quantity and reducing price, not to improving quality. There's no escaping the fact that better food -- measured by taste or nutritional quality (which often correspond) -- costs more, because it has been grown or raised less intensively and with more care.
Not everyone can afford to eat well in America, which is shameful, but most of us can: Americans spend, on average, less than 10 percent of their income on food, down from 24 percent in 1947, and less than the citizens of any other nation. And those of us who can afford to eat well should. Paying more for food well grown in good soils -- whether certified organic or not -- will contribute not only to your health (by reducing exposure to pesticides) but also to the health of others who might not themselves be able to afford that sort of food: the people who grow it and the people who live downstream, and downwind, of the farms where it is grown.

''Eat less'' is the most unwelcome advice of all, but in fact the scientific case for eating a lot less than we currently do is compelling. ''Calorie restriction'' has repeatedly been shown to slow aging in animals, and many researchers (including Walter Willett, the Harvard epidemiologist) believe it offers the single strongest link between diet and cancer prevention. Food abundance is a problem, but culture has helped here, too, by promoting the idea of moderation. Once one of the longest-lived people on earth, the Okinawans practiced a principle they called ''Hara Hachi Bu'': eat until you are 80 percent full. To make the ''eat less'' message a bit more palatable, consider that quality may have a bearing on quantity: I don't know about you, but the better the quality of the food I eat, the less of it I need to feel satisfied. All tomatoes are not created equal.

6. Eat mostly plants, especially leaves. Scientists may disagree on what's so good about plants -- the antioxidants? Fiber? Omega-3s? -- but they do agree that they're probably really good for you and certainly can't hurt. Also, by eating a plant-based diet, you'll be consuming far fewer calories, since plant foods (except seeds) are typically less ''energy dense'' than the other things you might eat.
Vegetarians are healthier than carnivores, but near vegetarians (''flexitarians'') are as healthy as vegetarians. Thomas Jefferson was on to something when he advised treating meat more as a flavoring than a food.

7. Eat more like the French. Or the Japanese. Or the Italians. Or the Greeks. Confounding factors aside, people who eat according to the rules of a traditional food culture are generally healthier than we are. Any traditional diet will do: if it weren't a healthy diet, the people who follow it wouldn't still be around. True, food cultures are embedded in societies and economies and ecologies, and some of them travel better than others: Inuit not so well as Italian. In borrowing from a food culture, pay attention to how a culture eats, as well as to what it eats. In the case of the French paradox, it may not be the dietary nutrients that keep the French healthy (lots of saturated fat and alcohol?!) so much as the dietary habits: small portions, no seconds or snacking, communal meals -- and the serious pleasure taken in eating. (Worrying about diet can't possibly be good for you.) Let culture be your guide, not science.

8. Cook. And if you can, plant a garden. To take part in the intricate and endlessly interesting processes of providing for our sustenance is the surest way to escape the culture of fast food and the values implicit in it: that food should be cheap and easy; that food is fuel and not communion. The culture of the kitchen, as embodied in those enduring traditions we call cuisines, contains more wisdom about diet and health than you are apt to find in any nutrition journal or journalism. Plus, the food you grow yourself contributes to your health long before you sit down to eat it. So you might want to think about putting down this article now and picking up a spatula or hoe.

9. Eat like an omnivore. Try to add new species, not just new foods, to your diet. The greater the diversity of species you eat, the more likely you are to cover all your nutritional bases.
That of course is an argument from nutritionism, but there is a better one, one that takes a broader view of ''health.'' Biodiversity in the diet means less monoculture in the fields. What does that have to do with your health? Everything. The vast monocultures that now feed us require tremendous amounts of chemical fertilizers and pesticides to keep from collapsing. Diversifying those fields will mean fewer chemicals, healthier soils, healthier plants and animals and, in turn, healthier people. It's all connected, which is another way of saying that your health isn't bordered by your body and that what's good for the soil is probably good for you, too.

Copyright (c) Michael Pollan