First Bite: How We Learn to Eat


  Because our tastes are such an intimate part of our selves, it is easy to make the leap to thinking that they must be mostly genetic: something you just have to accept as your lot in life. Parents often tell children that their particular passions place them on this or that side of the family—you got your fussiness from your grandfather!—as if you were destined from birth to eat a certain way. Sometimes it is uncanny how a suspicion of celery or a deep hunger for blackberries replicates from parent to child. When we notice these familial patterns, it confirms us in our view that food preferences must be inherited through our genes.

  When I’ve described the argument of this book to people I meet, sometimes they get a little angry. “I disagree that we learn how to eat,” they say. “You’d never get me to like sultanas/squid/salami (delete as appropriate).” Anyway, they say, “What about genes?”

  It’s fine by me if you don’t like sultanas. And I’m certainly not denying that there is a genetic component in our relationship with food. We are not born as blank slates. Some people have a heightened genetic sensitivity to certain flavors (notably bitterness), while others are blind to them. There are also genetic variations in individual appetite, the speed at which we eat, and the extent to which people actually enjoy eating. We vary in how we chew, how we swallow, and how we digest. Some people are born with conditions that make it much harder to eat, such as a delay to the oral-motor system. I had no idea how fraught the basic matter of getting food from plate to mouth could be until my third child was born with a cleft palate; he and I both struggled at mealtimes. He is now five, and new dishes occasionally still provoke tears (usually his). Our relationship with food and weight is additionally affected by epigenetics: our experience in the womb. The “thrifty phenotype” hypothesis of biochemist C. Nicholas Hales and epidemiologist David J.P. Barker suggests that being undernourished in utero leaves people with a lifelong propensity for weight gain, an unfair fate to be handed so early.

  The question remains to what extent we are capable of overriding this genetic and epigenetic inheritance and learning new tastes. This riddle can seem impossible to unravel, given that children do not learn to eat under laboratory conditions. As we take our first bites, our parents are supplying us simultaneously with both nature (genes) and nurture (environment conceived in its broadest sense, including everything from cuisine to family dynamics to religion to cutlery and table manners to the ethics of meat to views on whether it’s okay to eat food off the floor if it was only there for five seconds). The two are so intertwined, it’s hard to tell where one starts and the other stops.

  In one remarkable experiment, however, a group of children did learn to eat under lab conditions. In the 1920s and 1930s, Dr. Clara Davis, a pediatrician from Chicago, spent six years trying to study what children’s appetites would look like if allowed to blossom in total freedom without any preconceived ideas of what tasted good. Davis’s results have often been taken as a clear indication that likes and dislikes are fundamentally inbuilt and natural, though, as we’ll see, Davis herself drew a rather different conclusion.

  In 1926, at Mt. Sinai Hospital in Cleveland, Dr. Clara Marie Davis started the most influential experiment ever conducted on the question of food likes and dislikes. As a doctor, Davis saw many children with eating problems—mostly refusal to eat. Their appetites did not match their nutritional needs. She wondered what children’s appetites would look like freed from the usual pressures of parents and doctors pushing them to eat nutritious foods such as hot cereals and milk regardless of whether the children liked them. Conventional medical wisdom at that time was that children’s particular likes should not be indulged, lest they become “faddy.” Davis was not so sure that eating what you liked was automatically a bad thing.

  She borrowed a number of infants—some of them orphans from institutions and some the children of teenage mothers or widows—and placed them on a special “self-selection diet” under her medical care. The children—aged six to eleven months, who had never yet tasted solid food—were offered a selection of whole, natural foods and given free rein, day after day, to eat only what they wished. The full list of foods was:

  1. Water

  2. Sweet milk

  3. Sour (lactic) milk

  4. Sea salt

  5. Apples

  6. Bananas

  7. Orange juice

  8. Fresh pineapple

  9. Peaches

  10. Tomatoes

  11. Beets

  12. Carrots

  13. Peas

  14. Turnips

  15. Cauliflower

  16. Cabbage

  17. Spinach

  18. Potatoes

  19. Lettuce

  20. Oatmeal

  21. Wheat

  22. Cornmeal

  23. Barley

  24. Ry-Krisp

  25. Beef

  26. Lamb

  27. Bone marrow

  28. Bone jelly

  29. Chicken

  30. Sweetbreads

  31. Brains

  32. Liver

  33. Kidneys

  34. Fish (haddock)

  At each meal, the infants were offered a selection of around ten foods from this list, all of them mashed, ground up, or finely minced. Some, such as bone marrow, beef, peas, and carrots, were offered both in cooked and raw form. The selection was laid out in bowls, while nurses sat by, waiting to see what the children would choose. As Davis described it,

  the nurse’s orders were to sit quietly by, spoon in hand, and make no motion. When, and only when, the infant reached for or pointed to a dish might she take up a spoonful and, if he opened his mouth for it, put it in. She might not comment on what he took or did not take, point to or in any way attract his attention to any food, or refuse him any for which he reached. He might eat with his fingers or in any way he could without comment or correction of his manners.

  Davis continued this experiment over a period of six years, starting with three babies and building up to fifteen. The results, which have been hotly discussed by doctors ever since, were dramatic. Without any preconceived notions about what foods were suitable for them, the babies showed enthusiasm for everything from bone marrow to turnips. They didn’t realize they weren’t supposed to like beets or organ meats. All of them tried all of the thirty-four foods, except for two who never attempted lettuce and one who shunned spinach.

  Within a few days, Davis noticed, “they began to reach eagerly for some and to neglect others, so that definite tastes grew under our eyes.” It soon became obvious to her that for the fifteen children, there were “fifteen different patterns of taste.” The children made some very odd selections that looked like a “dietician’s nightmare,” said Dr. Davis. They went on curious “food jags.” One day, they might gorge on liver, or eat a meal of nothing but bananas, eggs, and milk. A boy called Donald showed a rare passion for oranges, cramming in nearly two pounds of them one day. In the process of trial and error in finding out what tasted nice, some of the children “chewed hopefully” on plates and spoons, while others grabbed handfuls of pure salt. When they tried something new, Davis observed, their faces showed at first surprise, then indifference, pleasure, or dislike.

  However bizarre and unbalanced the children’s likes and dislikes look to our eyes, they served them well. In a 1928 article writing up her findings, Davis included a “before” and “after” photo of one of the children, Abraham G. At eight months, on arriving in her care, he looks a little pale. At twenty months, after a year on the diet, he is cherubic and plump.

  When they arrived at the hospital, the infants were generally in poor health. Four were seriously underweight; five had rickets. Yet within a few months, all of the children were pink-cheeked and optimally nourished. One of the rickets sufferers was offered cod liver oil, which he took the occasional glug of; but the other four managed to get enough vitamin D and calcium to cure their rickets through diet alone. When they suffered colds, they appeared to self-medicate, eating vast amounts of carrots, beets, and raw beef. Even though they were given no guidance on what their bodies needed, their ratio of calories averaged at protein 17 percent, fat 35 percent, and carbohydrate 48 percent, very much in line with contemporary nutritional science.

  Davis created an unprecedented body of information on childish appetites (though it was never fully analyzed; and, after her death in 1959, all the boxes of raw data were discarded). When she took up a new job, the original setup in Cleveland was moved to Chicago, where she established what amounted to “an eating-experiment orphanage.” In all, she logged around 36,000 meals as well as recording changes in height and weight, blood and urine, bowel movements and bone density. It is unlikely that any other scientist will ever get such detailed data again, given the dubious ethics of keeping children locked up in an experimental nursery for so long. The babies stayed on the diet for a minimum of six months and a maximum of four and a half years, during which time they were always at the hospital.

  No friends visited, and those who were not orphans had little or no contact with their parents. While in the hospital nursery, their lives were subordinated to the needs of the experiment. Such an arrangement would never be allowed now, though Davis evidently cared for the children very much, in her way. She adopted two of them, as a single mother: Abraham G (the plump cherub) and Donald, the passionate orange eater. Many years later, after Donald’s death, his widow recalled that he and Abraham had always been “easy to cook for” and “happy to try all kinds of foods”—they remained omnivores all their lives.

  It was such an extraordinary, audacious, borderline-crazy project that Davis attempted: to get to the heart of where children’s food passions come from. It’s just a shame that her experiment proved so easy to misread. Time and again, Davis’s orphanage has been held up as evidence that appetite is mostly genetic, and, as a consequence, that the foods children like or dislike are a sure guide to what their bodies need. Davis’s food orphanage has been taken as proof that in their natural state, likes and dislikes are genetic and highly individual, like fingerprints: our tastes are a matter of nature, not nurture. What this interpretation fails to take into account is that the biggest thing Davis did was to radically restructure the food environment of the children.

  There was a “trick” to the way the experiment was set up, as Clara Davis was the first to point out. The real secret was in her choice of the thirty-four items on her list, which were all unprocessed whole foods. With such foods preselected for them, it didn’t matter which ones the children were drawn to on any given day, because, assuming they took food from several of the bowls at each meal, they could not help but eat a diet of an excellent standard of nutrition. Davis said that her choice of foods was designed to mimic the conditions of “primitive peoples,” though the heaping bowlfuls were surely more plentiful than any hunter-gatherer regime. The experiment proved that when your only food choices are good ones, preferences become unimportant. The “fifteen patterns of taste” resulted in a single healthy whole-food diet, because of the setup. Not one of the children was totally omnivorous, but nor were their likes and dislikes a problem, as they so often are in normal family life. There was no option to like unhealthy food and dislike healthy food.

  Davis herself concluded that her experiment showed that the selection of food for young children should be left “in the hands of their elders where everyone has always known it belongs.” Instead of the “wisdom of the body,” Davis spoke of the “glaring fallibility of appetite.” It was obvious to her that there was no “instinct” pointing blindly to the “good” and the “bad” in food. The two most popular foods overall in her study were also the sweetest: milk and fruit. Had she offered the children a free choice of “sugar and white flour,” those staples of a 1930s diet, it is unlikely they would have ended up in such fine fettle. Self-selection, she concluded, would have little or no value if the children were selecting from “inferior foods.”

  The real test, Davis recognized, would be to offer newly weaned infants a choice between natural and processed foods. This was to have been her next experiment, but the Depression dashed this prospect, as her funding ran out at the crucial moment. Davis never got the chance to test the effects on appetite of the “pastries, preserves, gravies, white bread, sugar and canned food” that had in her lifetime become so popular. Her experiment left a powerful legacy that took no account of the trick at the heart of it. Doctors, particularly in America, interpreted her experiment to mean that children’s appetites are inbuilt and benign, without paying attention to the way in which Davis had changed the food environment in which the babies ate. Her work was seized on as proof that our individual appetites are messages encoded with exactly the nutrients that our particular body needs. If we need protein, we will crave chicken. If we have rickets, we will naturally gorge on vitamin D until we are cured. All we have to do to eat well is listen to our cravings. Mother Nature knows best. Davis herself gave license to such a view, commenting that the children’s successful “juggling and balancing” of more than thirty essential nutrients suggested “the existence of some innate, automatic mechanism . . . of which appetite is a part.”

  Influenced by Davis, the dominant view on appetite among pediatricians became “the wisdom of the body,” which went along with the vogue for “child-centered” learning. In 2005, Benjamin Scheindlin, MD, a pediatrician, noted that Davis’s work had contributed to a widespread change in attitudes in pediatric medicine from the 1930s onward. Where a previous generation lamented the pickiness of children’s changeable tastes, now doctors positively welcomed childish vagaries of appetite. Dr. Benjamin Spock, author of the best-selling Baby and Child Care, first published in 1946, devoted ten pages to the Davis experiment. A mother, in Spock’s opinion, “can trust an unspoiled child’s appetite to choose a wholesome diet if she serves him a reasonable variety and balance.” It didn’t matter if a child developed a temporary dislike of a vegetable, because his or her cravings would naturally provide everything the child needed in the way of nutrition.

  Many experts in child-rearing still think like this, operating on the assumption that children are born with special appetites for exactly the nutrients they most need and that it will all balance out, if only they are given free rein to eat what they like. A book on solving children’s eating problems that went through several reprints in the 1980s and 1990s argued that Davis’s work showed that children should be given total control over food selection: let them eat cornflakes! As recently as 2007, a popular website about feeding children discussed Davis and concluded that there was “a strong biological plausibility . . . that children will instinctively choose a balanced diet.”

  The “wisdom of the body” is an alluring thought (like maternal instinct and other biological myths). Eating would be such a simple business, if only we had little memos inside our bodies telling us what we needed to eat at each precise moment (your vitamin C levels are dropping—quick, eat a kiwifruit!). If only we liked just the stuff that was good for us and disliked anything superfluous or bad. We can certainly learn to get better at reading the body’s cues for food, but this tends to come with age and experience, as you notice little things like how pasta for lunch makes you sleepy, or that a handful of nuts and a cup of Greek yogurt keep you full longer than white toast and jam. But children’s omnivorous bodies—after the milk stage, when breastfed infants do self-regulate—are not so wise.

  Many children habitually seek out precisely the foods that are least suitable for them. They crave sugar and shun green vegetables. They neglect to drink enough water. Nutritious meals are rejected, while junk is revered. Can we really believe that a preschooler demanding a packet of the latest sugary kids’ breakfast cereal, having seen it on TV, is responding to the body’s need for certain vitamins and carbohydrate?

  The scientific evidence—both from humans and rats—shows that the theory of the “wisdom of the body” is flawed at best. For the theory to be true, omnivores would need to have specific appetites for the essential nutrients the body needed at any given time. This is a very unlikely proposition, given that the list of nutrients needed by omnivores comes in so many guises, depending on the environment in which we happen to live. An innate appetite for the vitamin C in black currants would be of no use if you lived somewhere that black currants did not grow.

  In lab conditions, rats—our fellow omnivores—have shown a very erratic ability to self-select the diet that would do them the most nutritional good. In one study, rats were given a choice between a bad-tasting but protein-rich diet and a good-tasting but low-protein diet. Over the course of a week, fourteen out of eighteen rats failed to develop a preference for the food that would have done them the most good, and they lost weight. Other trials have attempted to find out whether rats could “self-select” to correct certain vitamin deficiencies, and concluded that many of them could not. With thiamine-deprived rats, the process of learning to like a thiamine-rich diet took a week or more, and the rats that did not adapt quickly enough to the correct food died. As for human subjects, there is, notes one specialist in the field, no data to suggest innate appetites for specific foods. It does seem possible for humans to learn over time specific appetites that will correct certain imbalances—particularly a craving for salt when lacking in sodium—but that is a different matter.