Libby's pumpkin pie is the iconic recipe that graces many American tables for Thanksgiving each year. Although pumpkin pie goes way back in American history (see my take on Lydia Maria Child's 1832 recipe), canned pumpkin does not. Libby's is perhaps most famous these days for their canned pumpkin, but they started out making canned corned beef in the 1870s (under the name Libby, McNeill & Libby), using the spare cuts of Chicago's meatpacking district and a distinctive trapezoidal can. They quickly expanded into over a hundred different varieties of canned goods, including, in 1899, canned plum pudding. Although it's not clear exactly when they started canning pumpkin (a 1915 reference lists canned squash as part of their lineup), in 1929 they purchased the Dickinson family canning company, including their famous Dickinson squash, which Libby's still uses exclusively today. In the 1950s, Libby's started printing their famous pumpkin pie recipe on the label of their canned pumpkin. Although it is the recipe that Americans have come to know and love, it's not, in fact, the original recipe. Nor is a 1929 recipe the original. The original Libby's pumpkin pie recipe was much, much earlier. In fact, it may even have predated Libby's canned pumpkin.

In 1912, in time for the holiday season, Libby's began publishing full-page ads featuring their pumpkin pie recipe in several national magazines, including Cosmopolitan, The Century Illustrated, and Sunset. But the key Libby's ingredient wasn't pumpkin at all - it was evaporated milk. Sweetened condensed milk had been invented in the 1850s by Gail Borden in Texas, but unsweetened evaporated milk was invented in the 1880s by John B. Meyenberg and Louis Latzer in Highland, Illinois. Wartime had made both products incredibly popular - the Civil War popularized condensed milk, and the Spanish-American War popularized evaporated milk. Libby's got into both the condensed and evaporated milk markets in 1906. Perhaps competition from other brands like Borden's Eagle, Carnation, and PET prompted Libby's to make the pitch for pumpkin pie.

Libby's Original 1912 Pumpkin Pie Recipe

The ad features a smiling trio of White people, clearly upper-middle class, or even upper class, seated around a table. An older gentleman and a smiling young boy dig into slices of pumpkin pie, cut at the table by a not-quite-matronly woman. A maid in uniform brings what appears to be tea service in the background. The advertisement reads:

"How did you make this pie so delicious?"

"Why it was easy enough. I tried the new way I found in my Libby's recipe booklet. Here it is -

"Pumpkin Pie: 1 ½ cups cooked and strained pumpkin, 2 eggs, ¾ cup sugar, ¼ cup molasses, ½ tablespoonful cinnamon, ½ tablespoonful ginger, 1/8 teaspoonful salt, 1 cup (1/2 can) Libby's Evaporated Milk, with 1 cupful water. Mix pumpkin, molasses, sugar and spices together. Add the mixed milk and water, then add the eggs thoroughly beaten. Mix well and put into deep pie tins lined with pastry. Bake 45 minutes in a moderate oven.

"Libby's Evaporated Milk

"For all pies and baking, for soups, coffee, tea or cocoa Libby's milk gives an added richness and a delicious flavor. Libby's milk is evaporated in clean, sanitary condenseries, located in the heart of the greatest dairy regions in the world. It is always pure and when opened will keep sweet longer than raw milk.

"Buy Libby's milk for convenience and satisfaction. It's the brand you can trust.

"Send for a copy of Libby's Milk Recipe Booklet.
Libby, McNeill & Libby, Chicago." My research has not been exhaustive, but as far as I can tell, Libby's was the first to develop a recipe for pumpkin pie using evaporated milk. Sadly I have been unable to track down a copy of the 1912 edition of their Milk Recipes Booklet, but if anyone has one, please send a scan of the page featuring the pumpkin pie recipe! Curiously enough, the original 1912 recipe treats the evaporated milk like regular fluid milk, which was a common pumpkin pie ingredient at the time. Instead of just using the evaporated milk as-is, it calls for diluting it with water! The recipe also calls for molasses, and less cinnamon than the 1950s recipe, which also features cloves, which are missing from the 1912 version. Both versions, of course, call for using your own prepared pie crust. Nowadays Libby's recipe calls for using Nestle's Carnation brand evaporated milk - both companies are subsidiaries of ConAgra - and Libby's own canned pumpkin replaced the home-cooked pumpkin after it purchased the Dickinson canning company in 1929. Interestingly, this 1912 version (which presumably is also in Libby's milk recipe booklet) does not show up again in Libby's advertisements. Indeed, pumpkin pie is rarely mentioned at all again in Libby's ads until the 1930s - after it acquires Dickinson. And by the 1950s, the recipe wasn't even making the advertisements - Libby's wanted you to buy their canned pumpkin in order to access it - via the label. The 1950s recipe on the can persisted for decades. But in 2019, Libby's updated their pumpkin pie recipe again. This time, the evaporated milk and sugar have been switched out for sweetened condensed milk and less sugar. As many bakers know, the older recipe was very liquid, which made bringing it to the oven an exercise in balance and steady hands (although really the trick is to pull out the oven rack and then gently move the whole thing back into the oven). This newer recipe makes a thicker filling that is less prone to spillage. Still - many folks prefer the older recipe, especially at Thanksgiving, which is all about nostalgia for so many Americans. I'll admit the original Libby's is hard to beat if you're using canned pumpkin, but Lydia Maria Child's recipe also turned out lovely - just a little more labor intensive. I've even made pumpkin custard without a crust. Are you a fan of pumpkin pie? Do you have a favorite pumpkin pie recipe? Share in the comments, and Happy Thanksgiving! The Food Historian blog is supported by patrons on Patreon! Patrons help keep blog posts like this one free and available to the public. Join us for awesome members-only content like free digitized cookbooks from my personal collection, e-newsletter, and even snail mail from time to time! Don't like Patreon? Leave a tip!
In throwing an Autumnal Tea Party (see yesterday's post!), I wanted a simple but impactful dessert. Apples are plentiful in New York in September, but plain apple crisp, while delicious, didn't feel quite special enough for a tea party. The British have a long tradition of gleaning from hedgerows in the fall, when they offer apples, sloes, blackcurrants, and blackberries. Sloes and blackcurrants are hard to find here in the US, but blackberries seemed like the perfect accent to the American classic. This recipe is endlessly adaptable, as the crumble topping is great with any kind of fruit. You do need quite a lot of fruit for a crumble, which is nice in that it feels a little lighter on the stomach than cake or pie. These sorts of desserts were common in areas where fruit was plentiful and sugar and butter weren't.

Apple Blackberry Crumble Recipe

I never sweeten the fruit for a crumble (similar to a crisp, but without rolled oats) unless it is a very sour fruit like rhubarb or fresh cranberries. This topping has quite a lot of sugar, which is what helps make it so crunchy and delicious, but you definitely do not need additional sugar in the fruit, especially when pairing with ice cream. You can substitute whole grain flour for part or all of this to good effect as well. If you prefer a crisp, use 1/2 cup of flour and 1 heaping cup of rolled oats.

8+ small apples (I used a mix of Ginger Gold and Gala)
1 pint (2 small packages) fresh blackberries
1 1/2 cups all-purpose flour (plus more for the fruit)
1 scant cup white sugar
1/2 cup coldish butter
1/2 teaspoon salt
1/2 teaspoon pumpkin spice

Preheat the oven to 400 degrees F. Peel the apples, cut into quarters, cut out the core, and slice. Wash the blackberries and drain. Toss the apples and blackberries with flour to coat (this will thicken the juices). Add to the baking dish. Then make the crumble. Mix the flour, sugar, salt, and pumpkin spice. Then cut the butter into small cubes, toss in the flour mix, and using your hands squeeze and rub it into the flour mix until it holds together when squeezed. Crumble gently over the fruit in an even layer, then bake for 40-50 minutes, or until the fruit is bubbly and thick and the crumble is golden brown. Serve warm with vanilla ice cream.

There's nothing like a warm crisp with cold vanilla ice cream, and I think this is my new favorite kind. Blackberries and apples seem like a match made in heaven. What's your favorite autumnal dessert?

The Food Historian blog is supported by patrons on Patreon! Patrons help keep blog posts like this one free and available to the public. Join us for awesome members-only content like free digitized cookbooks from my personal collection, e-newsletter, and even snail mail from time to time! Don't like Patreon? Leave a tip!

The weather has finally turned, dear readers, and so I felt it was time for another tea party! I've had a long couple of weeks, and I wasn't really looking forward to spending one of my days off cleaning the house and cooking, but it was very much worth the effort and I'm glad we did it. Tea parties can be incredibly complicated, or very simple. My process is to think about the theme and the flavors, come up with way too many ideas, and then pare them down to what's possible. I wanted to honor the flavors of early fall, with something pumpkin or squash, apples, blackberries, and a savory bread. My original list also had gingerbread and shortbread cookies with jam, and scotch eggs, but that was too much!
I wanted to keep the menu fairly simple, because I was quite sleep deprived after a big event over the weekend at work. So I maximized flavor and minimized effort, to great acclaim! The party (just three of us) ended up delicious, on a chilly, drizzly day with beautiful overcast light on our front porch. Ironically, we ended up having mulled cider instead of tea, but I'm enjoying a cup of tea as I write this a few hours later, so I suppose it still counts!

Autumnal Tea Party Decor

It can be tempting to go out and buy a lot of supplies for parties. I'm definitely as susceptible to that impulse as the next person! But I find what makes parties special is not how much everything matches, but the quality of your decor. I decorated my mantel with some of my favorite fall decorations - a coppery leaf garland, my favorite vintage china pheasant, a pretty vase with some fake flowers, a little green ceramic pumpkin. But when it comes to decking the table, nothing is better than a nice tablecloth and real dishes. I grew up shopping thrift stores and garage sales and flea markets with my mom, so I've amassed quite a collection of vintage dishes and tablecloths over the years. Because I actually use my collection, I don't spend a lot of money on it. It pains me enough when a vintage piece gets chipped or broken. My frugal soul would be even more deeply wounded if it was a piece I had spent a lot of money on.

This ended up being a very grandmother-focused display. The Metlox California Ivy plates and a single surviving teacup came from my grandmother Eunice, along with the green glass bowl I used for butter and the green glass saucers. My grandma Ruby found me the beautiful etched water glasses. The glass teacups embossed with leaves I picked up at a garage sale for a dollar for the pair. The milk glass is from my thrifted collection, and the beautiful tablecloth is a vintage one whose origin I've forgotten, but it's probably one of my absolute favorites. I did not intend for the food to match the tablecloth, but that's kind of how it happened!

When it comes to collecting, it's important to buy things you love, instead of focusing on what things are worth. Who cares how expensive it was if you think it's ugly? It's also important to choose things that are relatively easy to care for. I do not recommend putting vintage dishes in the dishwasher, but a lot of vintage tablecloths are meant to be washed. I find vintage textiles with a stain or two are often much less expensive than the pristine stuff, and then if you get a stain on them you don't feel quite so bad!

Autumnal Tea Party Menu

Butternut Squash Soup with buttered pecans
Sage Cream Biscuit Sandwiches with pickled apple, pickled onions, and sharp cheddar
Dilly Beans
Mulled Cider
Blackberry Raspberry Hibiscus Water
Apple Blackberry Crumble with vanilla ice cream

Although I love to cook from scratch, the butternut squash soup was store-bought from one of my favorite soup brands: Pacific Foods Butternut Squash Soup, and I got the low-sodium version (affiliate link). I don't usually like butternut squash soup, but I know lots of people love it, so I thought I would give it a go. This one was so delicious, I was surprised how much I enjoyed it. I felt it needed a little something extra, so I toasted some chopped pecans in a little butter and salt, and the butter got a little browned. It was the perfect garnish.
The blackberry raspberry hibiscus water was also store-bought, a simple cold water infusion from Bigelow tea which I found at the store the other day (affiliate link). It turned out lovely - not as strong as tea, just a hint of flavor to cold water. Very refreshing. Sadly, the color, which was a beautiful purple as it steeped, got diluted to a kind of washed-out purple-gray, which was less beautiful. But still delicious!

Sage Cream Biscuits

I had thought about making scones for this tea party, but I don't have a reliable savory scone recipe, and since I was doing sandwiches, I thought biscuits would be better. This is an adaptation of my tried-and-true Dorie Greenspan cream biscuit recipe. It's almost fool-proof. This one is doubled.

4 cups all-purpose flour
2 tablespoons baking powder
2 teaspoons sugar
1 1/2 teaspoons salt
1 heaping teaspoon dried sage (not ground)
2 1/2 cups heavy cream

Preheat the oven to 425 F. Whisk all the dry ingredients together, and then add the heavy cream, tossing with a fork until most of the flour is absorbed. Knead gently with your hands (don't overwork!), then turn out onto a clean, floured work surface and knead, folding often, until it comes together. Pat into a large rectangle and cut into squares. Place on a parchment-lined baking sheet and bake 15-20 minutes or until golden brown. Serve warm. To make the sandwiches, split the biscuits, butter them, and add sliced sharp cheddar cheese, a slice of pickled apple, and a few strands of pickled red onion. Top with more cheddar and the other half of the biscuit and devour. Serve with butternut squash soup and a side of dilly beans and mulled cider.

I'll be following up with the recipe for the apple blackberry crumble tomorrow, and the recipes for refrigerator dilly beans, pickled apples, and pickled onions will be available to patrons on my Patreon tomorrow as well. Do you like to have tea parties? What's your favorite autumnal food?

The Food Historian blog is supported by patrons on Patreon! Patrons help keep blog posts like this one free and available to the public. Join us for awesome members-only content like free digitized cookbooks from my personal collection, e-newsletter, and even snail mail from time to time! Don't like Patreon? Leave a tip!

Last World War Wednesday, we looked at the use of ice cream in the U.S. Navy during the First World War, especially aboard hospital ships. Now it's time for a reprise! By the Second World War, ice cream was firmly entrenched aboard naval vessels. So much so that battleships and aircraft carriers were actually outfitted with ice cream machinery, and by the end of the war the Navy was training sailors in its use through special classes. The above propaganda poster, courtesy of the National Archives, outlines all of the requirements to build a battleship. "Your Battleship and Her Requirements" may have been targeted toward factory workers, but I think it is more likely this poster was designed to impress upon ordinary Americans the extraordinary amount of materials and supplies needed to keep a battleship in fighting trim. What I found particularly interesting was that among the supplies listed, alongside fruits and vegetables and meat and even candy, was 60,000 quarts of ice cream! Smaller vessels, such as destroyer escorts and submarines, did not have the space for their own ice cream making machinery, although they did have freezers.
In fact, it became common for destroyer escorts and PT boats to rescue downed pilots (the aircraft carriers were too big for the job) and "ransom" them for ice cream.

Last week a brand new food podcast debuted for American Public Television called "If This Food Could Talk," and I'm so pleased to say I was featured in the first episode, "Frozen in Time: Ice Cream and America's Past." I had a blast doing the research for that episode's interview, which has inspired these two recent World War Wednesday posts. Have a listen if you want to learn more about ice cream in American history, and especially the story of ice cream in the Navy. But while I was doing the research, I kept running across references to ice cream as a health food!

The National Dairy Council really leaned into the association of ice cream with the armed forces. This advertisement reads, "There's a reason why the U.S. Navy serves Ice Cream. America's favorite dairy food - Ice Cream - is an important source of vitamins, proteins and minerals." The ad goes on:

"Navy menus don't just happen! Every food included in the diet of Navy personnel, ashore or afloat, is there for a purpose. It is there because it has been okayed by the staff of experts at the Subsistence Research Laboratory of the U.S. Navy in Chicago for making an important contribution to the health, strength, and morale fighters must have to win!

"These highly skilled and trained technicians at the laboratory know every condition under which the men live - know their requirements - and make sure exactly what each food will do for those men before it is approved.

"That is why it is significant that ice cream ranks so high on Navy menus. It is not only a favorite food, but it also supplies valuable vitamins, proteins, and minerals. For that reason, wherever practical, the Navy gets ice cream!

"Throughout the world - over the seven seas - the talents of the Subsistence Research Laboratory of the U.S. Navy are directed to keeping our Navy a strong, healthy, hard-hitting force; making sure it gets the foods the men like - the foods they need for victory!

"Ice Cream Is a Fighting Food

"Ice cream is a favorite with all branches of our armed forces - and it is important that they get this valuable food. So if you aren't always able to get all the ice cream you want - remember, you're 'sharing' this nutritious food with our fighters."

The National Dairy Council might be just a SMIDGE biased in this regard, but certainly the federal government ranked milk, and by extension milk products, very highly in terms of nutrition during the Second World War, notably as part of the Basic 7 nutrition recommendations. This was almost certainly a holdover from the Progressive Era's take on milk as the "perfect food" - combining proteins, carbohydrates, and fats all in one. We see this in another advertisement, this time by the National Dairy Products Corporation. The National Dairy Council is an industry-funded research and marketing organization. But the National Dairy Products Corporation would later become Kraft Foods.

"Here's what one leatherneck dreams about!

"One Marine's dream of the post-war world is a mountain of strawberry ice cream. He wrote his girl from Guadalcanal that he wants it three times a day, every day for five years. In standard servings, that's over 900 quarts!

"Strawberry ice cream was a symbol, of course, to a hot, tired fighting man in a fox-hole - a symbol of his home town and the corner drug store - a symbol of America.
It must have appealed to lots of folks, for many newspapers carried the story.

"There are good reasons why ice cream is on Army menus regularly - good reason why busy war workers eat so much of it. It is more than a delicious dessert - it's a valuable food - rich in vitamins and calcium.

"Right now, of course, ice cream must come from the same milk supply that furnishes milk, cream, butter and cheese to soldiers, civilians and allies alike. That means less ice cream for your family's use. If you'll be content with your fair share - if you'll accept part of your order in fruit ices - you can continue to enjoy ice cream.

"And we'll continue to improve ice cream processing and packaging - controlling its quality - keeping it pure and good.

"We'll continue our intensive laboratory research... developing important new products from milk... bringing to America's fighters, workers and friendly allies the full benefits of nature's most nearly perfect food."

Here you can see the "perfect food" rhetoric in action! And interestingly, this one touts the role of ice cream in the Army as well. In my opinion ice cream, for all the rhetoric about nutrition, had far more to do with morale than anything else. But there is some truth to the idea that as a dessert it was superior to cake or pie. For one thing, ice cream does have some protein, in addition to a decent amount of fat. Full-fat dairy is generally more filling and satisfying, and protein and fat slow the absorption of sugar into the bloodstream, making the "energy-giving" properties of carbohydrates longer-lasting and less likely to make you crash (unlike cake and cookies). That being said, viewing ice cream as a health food is questionable today. But in the period, the discoveries of vitamins and minerals like calcium were cutting-edge, and any food containing those essential nutrients was considered good for you.

Ice cream also fit neatly into ideas (unconscious or otherwise) of White supremacy and American (i.e. Anglo-Saxon) culture. As the National Dairy Products Corporation marketing team wrote, ice cream was "a symbol of America." When combined with soda fountains (the wholesome, if sugary, alternative to saloons and beer halls), ice cream seemed to represent the best of America - slim, good-looking, young, White America, that is.

Today, ice cream's accessibility has given us ice cream alternatives aplenty, especially for folks who can't consume dairy. Ice cream's ubiquity has also meant some of its luster has faded. But at a time of extreme stress - the violence of the theater of war, the privations of home front rationing, the push to mobilize for total war, the fear of invasion - ice cream provided a moment of bliss in the midst of uncertainty. Ice cream is still an essential tradition aboard naval vessels today. When you're miles from home for months at a time, anything that seems like a treat gives morale a boost. It's still a treasured treat in our household, whether homemade or store-bought (if you find yourself in upstate New York - do yourself a favor and seek out Stewart's Shops. They have the best commercial ice cream around). What does ice cream mean to you?

The Food Historian blog is supported by patrons on Patreon! Patrons help keep blog posts like this one free and available to the public. Join us for awesome members-only content like free digitized cookbooks from my personal collection, e-newsletter, and even snail mail from time to time!
A special patrons-only post is coming tomorrow with more on ice cream in World War II - this time featuring Elsie the Cow! Join now for as little as $1/month.
Don't like Patreon? Leave a tip!

Dear Reader, I finally did it, and not in a good way. A few weeks ago I hosted a beautiful (albeit hot and humid) French Garden Party, a belated celebration of Bastille Day, for approximately 30 people. The decorations were gorgeous and the food was fabulous and I did not take a single. solitary. photograph. My consternation was extreme. My beautiful screen porch was set with tables dressed in blue and white striped linens. It was BYOB - bring your own baguette - and folks brought fancy cheeses to go with the goat cheese and paper-thin ham I provided. I made homemade mushroom walnut pâté and TWO compound butters - fresh herb and garlic, and lemon caper. I made beautiful French salads: potato and green bean vinaigrette, lentils vinaigrette with shallot and parsley and a hint of fresh rosemary, cucumber with tarragon and sour cream, celery with black olives and anchovies, peach basil. We had honeydew melon and both sweet dark AND Queen Anne cherries. We had wine and spritzers a-plenty. A friend brought chocolate cream puffs. I made lemon pots de crème and earl grey madeleines. But the absolute star of the show was this chocolate mousse, which I flavored with rose water. And since I had one little glass pot left over from the party, I snapped a few photographs a few days later to give you the incredibly easy recipe so that you, too, may feature this glorious star, and have your guests talking about it for days afterwards (no really - they did). But of course, I wouldn't be a food historian if I didn't give you a little context, and I was curious about the history of chocolate mousse, so here you go:

A Brief History of Chocolate Mousse

Goodness, there is a lot of nonsense out on the internet about chocolate mousse! Way, WAY, too many sources say it was invented by Toulouse-Lautrec, and that it was called "mayonnaise de chocolat." People. Chocolate mousse dates back to at least the 18th century, if not earlier, so it was around long before Monsieur Lautrec. I did, to my surprise, find a couple of recipe references to "mayonnaise au chocolat." It sounds so ridiculous as to be fake, but this was apparently a real recipe, albeit a name I can only date to the 20th century. One recipe is from a 1909 French cookbook, which calls for melting chocolate with egg yolks and adding beaten egg whites, and offers a clue to the name: it says at the end to mix the egg whites and the chocolate mixture "like ordinary mayonnaise." A few references appear in the 1920s and '30s, and then comes what probably popularized it in America - a 1940 issue of Gourmet magazine (not readable online, alas - if anyone tracks down a hard copy of the original, let me know!). Another recipe is from a 1951 French cookbook, with not very detailed directions. According to my translation, "mayonnaise au chocolat" mixes melted chocolate with egg yolks and a few tablespoons of cream, which is then cooked and then mixed with egg whites (unclear whether they are beaten stiff, but likely yes) and chilled. Another is from the 1961 edition of Mastering the Art of French Cooking by Julia Child and Simone Beck, which has a recipe for "Mousseline au Chocolat, Mayonnaise au Chocolat, Fondant au Chocolat" (yes, three names for the same recipe!), the subheading of which reads "Chocolate Mousse - a cold dessert."
Mousseline is actually a sauce mixed with whipped cream (for instance, you can turn hollandaise sauce into a mousseline by adding whipped cream), whereas mousse is a thickened chilled dessert made with whipped cream. The Julia Child recipe conflates the two, and her recipe calls for a cooked egg yolk custard mixed with melted chocolate, with whipped egg whites folded in. It is then served with crème anglaise or whipped cream. So, none of these recipes are truly chocolate mousse, because the mixtures contain no whipped cream whatsoever. But Toulouse-Lautrec and a crazy-to-Americans name like "chocolate mayonnaise" make for so much more drama than doing actual historical research and looking at primary sources. SIGH. People. We can do better.

The earliest references to "chocolate mousse" I could find date to 1687 and refer to the habit of Indigenous peoples in Central America of frothing their chocolate beverages either with a molinillo or by pouring them between cups - a habit which Europeans apparently adopted. A 1701 French dictionary continues the reference to frothy chocolate beverages in its definition of "mousser," or "to foam." The earliest reference I could find to the dessert mousse we know and love today comes from the 1768 French cookbook "L'art de bien faire les glaces d'office ou les vrais principes pour congeler tous les rafraichissemens," or The Art of Making Ice Cream Well, or the True Principles for Freezing All Refreshments. And lest you think it is just about ice cream, the extremely long title adds "Avec un traité sur les mousses," or "With a Treatise on Mousses." Chocolate mousse (as pictured above) is simply one of dozens of mousse recipes listed, but the early versions are quite similar to the modern: grate the chocolate and melt it in a saucepan over low heat, then add cream, little by little, to thin it down. Pass it through a sieve, sweeten it, and then let cool and whisk to a foam.

By the 19th century, we're adding egg yolks to make a smoother, more custard-y base, as you can see from this pair of recipes by early French restaurateur Antoine Beauvilliers, whose 1814 cookbook The Art of Cooking became foundational to generations of French chefs and home cooks. English cooks, however, had access before that, judging by this 1812 recipe, which also called for egg yolks. We're still adding large amounts of whipped cream, though, keeping in classic mousse style. It took a bit longer for Americans to adapt to chocolate mousse, although they were prodigious chocolate drinkers, and certainly by the mid-19th century were consuming chocolate custards and ice creams. It wasn't really until (as far as I and the Food Timeline can tell) celebrity cookbook author and cooking school teacher Maria Parloa intervened that it got popular. The Food Timeline cites an 1892 article with Miss Maria Parloa lecturing on chocolate mousse, among other things, but I found a reference dating back to 1885 where she's lecturing on chocolate mousse in Buffalo, NY. However, it doesn't seem to be QUITE the same as we consider chocolate mousse today. Her 1887 recipe for it calls for freezing it like ice cream, albeit without stirring. Regardless of whether the recipe calls for egg yolks or not, or whether it's frozen or not, chocolate mousse in the modern style is easier to make than you'd think.

Chocolate Rose Mousse Recipe

A French Garden Party called for something easy to prepare and delicious for dessert.
Because it was a garden party, I decided on chocolate pots de crème or mousse flavored with rose fairly early on. I used to dislike floral flavors, but after making an Egyptian rose water dessert last year, and trying Harney & Sons' seriously divine Valentine's Day tea, which was chocolate black tea with rosebuds (affiliate link), I was smitten. After realizing how many eggs I'd go through making a triple batch of both lemon AND chocolate pots de crème, I decided to do just the lemon (15 pots) and do the rest as chocolate mousse (15 pots). I found these adorable little glass pots with covers on Amazon, should you care to purchase them yourself from this affiliate link. I adapted several recipes online, and since I didn't want to mess with steeping the cream with rosebuds or rose petals or any other options, I decided to go the less expensive and way easier rose water direction. Rose water is used frequently in Middle Eastern cooking, and is often less expensive in the "ethnic" section than in the spice aisle. In preparing for the party, in which I was attempting a brand new recipe, I didn't want to try to mess with egg yolks any more than I already had to with the lemon pots de crème (which turned out only okay). So the simple mixture of heavy cream, dark chocolate, and a little icing sugar seemed best. I then added rose water to taste, which turned out about perfect. Here's the reasonable-serving recipe, which I tripled to get 15 generous servings for the party. This makes more like 6 servings, 4 if you're being greedy.

1 1/2 cups heavy cream (I used a local dairy brand, which is richer than national brands)
1 cup high-quality dark chocolate chips (like Guittard)
1/4 cup powdered sugar
1 teaspoon rose water

Heat a saucepan of water over medium heat, bringing it to a simmer. Place a heat-proof bowl (glass or metal) over the simmering water and add 3/4 cup heavy cream and the chocolate chips. Stir gently until the chocolate chips are completely melted. Remove from heat. In a large bowl, beat the remaining 3/4 cup heavy cream until you get soft peaks, then add the 1/4 cup powdered sugar and beat until you get stiff peaks. Add the rose water and beat to combine. 1 teaspoon should be enough to stand up to the chocolate, but taste the whipped cream. The rose water should be present, but not overpowering. If faint, add another 1/2 teaspoon and beat again. Then, using a ladle, add approximately 1/3 cup of the cooled chocolate-cream mixture to your whipped cream, one ladle at a time, and gently fold to combine using a rubber spatula. To fold, use the spatula to cut down the center of the whipped cream mixture and scrape up from the bottom. Rotate the bowl slightly, and repeat the action, until the chocolate is incorporated. Add another ladle of chocolate and continue until you have folded in all of the chocolate without totally deflating the whipped cream. Spoon into small glass jars or custard cups and refrigerate until ready to serve.

It was so gorgeous. Light but rich, with a subtle hint of rose water, which added fascinating depths to the chocolate. Folks gobbled it all up, and it was by far the best dessert of the party. The one lonely little pot left allowed me to take the above photo the following day, so you're welcome! I didn't serve it with whipped cream during the party, and it's admittedly a little overkill, but it does look pretty. So now you have the recipe AND some history, plus some food history mythbusting!
So do yourself a favor and go splurge on some high-quality ingredients and treat yourself and your loved ones to this easy and stunningly delicious dessert. To quote Julia, quoting the French, Bon Appetit! The Food Historian blog is supported by patrons on Patreon! Patrons help keep blog posts like this one free and available to the public. Join us for awesome members-only content like free digitized cookbooks from my personal collection, e-newsletter, and even snail mail from time to time! Don't like Patreon? Leave a tip!
I've been getting a lot of calls for information about ice cream lately, and that has sent me down a rabbit hole. I did a whole talk on the history of ice cream last year (you can watch the filmed version here), but while I knew ice cream was a big tradition in naval history, I didn't know the connection to the First World War. I don't usually cover the history of military consumption of food during the World Wars, but this topic was just too much fun to resist.

Ice cream wasn't always the Navy's treat of choice. For over a hundred years rum was the ration preferred by many sailors. But in the late 19th century the Temperance movement began to have increasing power over society. By 1919 we had a Constitutional Amendment (the 18th - often known just as "Prohibition"). But the armed forces went dry much earlier. In particular, on July 1, 1914, the U.S. Navy went alcohol-free. At the same time, naval vessels were being stocked with ice cream. In the May, 1913 issue of The Ice Cream Trade Journal, an article entitled "Sailors Like Ice Cream" explained that the Navy had recently ordered 350,000 pounds of evaporated milk - ostensibly for all sorts of cooking and baking, but ice cream was high on the list.

You may wonder why hospital ships in the First World War were manufacturing ice cream on board. Well, it involved multiple factors. First, ice cream was a product of milk, and during the Progressive Era, milk was considered the "perfect food" as it contained fats, proteins, and carbohydrates all in one (supposedly) easily digestible package. Although many people are lactose intolerant, the White Anglo-Saxon dominance of American culture at the time prized milk. Ice cream rode into nutritional value on the coattails of milk. During this time period, dairy-based products like puddings, custards, milk and cream on cooked cereals or with toast, and ice cream were all considered nutritious foods for people who had been injured or ill. Along with foods like beef tea, eggs, and stewed fruits, these made up the bulk of recommended hospital foods from the late 19th century to World War I.

Ice cream shows up quite frequently in early reports of the Surgeon General of the U.S. Navy. In his 1918 report to the Secretary of the Navy, ice cream appears to have caused more problems than it solved. Notably, the use of commercially produced ice cream resulted in several instances of crew sickness, including simple illnesses like strep throat, alongside more serious ones like a diphtheria outbreak in Newport in 1917, which was traced to ice cream produced off-station. Fears of the spread of typhoid from places like restaurants, soda fountains, and ice cream shops led to "antityphoid inoculations" at naval shipyards. In Chicago, "All soft-drink and ice-cream stands have promised to give sailors individual service in the form of paper cups and dishes. To make this more effective, it is believed that an order should be issued prohibiting men from accepting any other kind of service." The Surgeon General also recommended inspection of offsite dairies and bottling works for milk, ice cream, and soft drinks to ensure proper sterilization of equipment and pasteurization of dairy products, as well as inoculation of employees against typhoid and smallpox. But ice cream was also noted as essential not only on the existing hospital ship USS Solace, but also on two new hospital ships fitted out since the declaration of war in 1917 - the USS Mercy and the USS Comfort.
These ships included a cold storage plant and a refrigerating machine that could "produce, under favorable circumstances, a ton of ice or more a day." The ships also had distilling plants able to convert seawater to fresh water, up to 20,000 gallons per day. In addition to describing the medical wards, crew facilities, laundry, and kitchen, the report noted:

The most valuable adjunct in the treatment and feeding of the sick is the milk emulsifier, popularly known as the "mechanical cow." The milk produced from this machine is made from a combination of unsalted butter and skimmed milk powder and can be made with any proportion of butter fat and proteins desired. This machine will produce 15 gallons of cold, pasteurized milk in 45 minutes. The electric ice cream machine, controlled by one man, makes 10 gallons at a time and is supplemented by small freezers for preparing individual diets for the sick.

According to the October, 1918 issue of The Milk Dealer, the "mechanical cow" had been displayed as part of the exhibits at the National Dairy Show in Columbus, Ohio in the fall of 1917. They noted:

The "Mechanical Cow" Becoming Famous. Few people who saw the combination exhibit of Merrell-Soule Co. and the DeLaval Separator Co., last October, in Columbus, would have believed that within a year from that beginning the use of the Emulsor in combining Skimmed Milk Powder, unsalted butter and water would be taken up by Army, Navy, City Administrations, etc. throughout the United States. Such is the fact, however. The Mechanical Cow is now producing milk and cream on the U.S.S. Comfort and U.S.S. Mercy, the two splendid hospital ships of the Navy. An installation on board the U.S.S. South Carolina is kept working continuously to supply the demands of her crew. Mechanical Cows are filling the needs of milk at the base hospitals and several camps and one large machine is being operated by one of the city departments of New York. Health officers, physicians, milk experts and authorities on infant feeding all unanimously agree that milk and cream made by means of the "Mechanical Cow" is superior in every way to the average milk supply.

This advertisement for "The Chilly King," a cooling machine that was part of the "Mechanical Cow" system on ships like the U.S.S. Mercy (a photograph of the machinery on board featured in the ad), also names a number of naval ships and military camps which used it.
By enabling ships and camps to use shelf-stable skimmed milk powder and unsalted butter, which keep a very long time in cold storage, "mechanical cows" allowed for an ample supply of milk made in sanitary conditions. For naval ships, this was especially important when crews were away from shore for long periods of time. Ice cream helped patients recover from illness (or so medical professionals at the time believed), but it also helped a great deal with morale.

The professionalization of ship operations via the installation of state-of-the-art equipment was a hallmark of the First World War, but the U.S.'s late involvement in the war hamstrung most shipbuilding operations. Indeed, the construction of a new hospital ship in 1916 was actually shelved in favor of retrofitting existing ships like those transformed into the U.S.S. Comfort and U.S.S. Mercy, which had initially served as the sister passenger steamships S.S. Havana and S.S. Saratoga, respectively. Part of the Ward Line, these very fast steamships ran the New York City to Havana, Cuba route, but were requisitioned in 1917, first as troop transports and later as hospital ships. The U.S.S. Mercy spent time as a home for the homeless during the Great Depression before she was scrapped in the 1930s, and the U.S.S. Comfort went back to civilian passenger transport for the Ward Line under her old name, the S.S. Havana, before being pressed into service in World War II, once again as a troop transport. The names USS Comfort and USS Mercy would be revived in World War II, and a third pair of hospital ships bearing those names is still in operation today.

Although ice cream is no longer considered central to the recuperation of the sick and wounded, it is still served on American naval vessels around the world. Ice cream would play an even more important role in the Navy during the Second World War. But that's a tale for another World War Wednesday!

The Food Historian blog is supported by patrons on Patreon! Patrons help keep blog posts like this one free and available to the public. Join us for awesome members-only content like free digitized cookbooks from my personal collection, e-newsletter, and even snail mail from time to time! Don't like Patreon? Leave a tip!
This article contains Amazon.com and Bookshop.org affiliate links. If you purchase anything from these links, The Food Historian will receive a small commission.

No Useless Mouth: Waging War and Fighting Hunger in the American Revolution, Rachel B. Herrmann. Cornell University Press, 2019. 308 pp., $27.95, paperback, ISBN 978-1501716119.

This may be the longest I have ever taken to write a book review. I first received this book and the invitation to review it for the Hudson Valley Review in the fall of 2021. As many of you know, 2022 was a rough year for me, for many reasons, but I finally turned in the review in February of 2023. A few days ago, I received my copy of the Review, and now that my book review is in print, I feel I can share it here! This edition of the Review is great, with several excellent articles and other book reviews, so if you manage to find a copy, please check it out! Back issues are often posted digitally. Without further ado, the review:

In the historiography of the American Revolution, one can be forgiven for thinking every possible topic has been covered. But Rachel B. Herrmann's new book No Useless Mouth: Waging War and Fighting Hunger in the American Revolution brings new nuance to the period. In it, Herrmann argues that food played a decisive role in the shifting power dynamics between White Europeans, Indigenous Americans, and enslaved and free Africans and people of African descent. She looks at the American Revolution through an international lens, covering the period from 1775 in the various colonies through the dawn of the 19th century in Sierra Leone. The book is divided into eight chapters and three parts. Part I, "Power Rising," introduces us to the ideas of "food diplomacy," "victual warfare," and "victual imperialism" within the context of the American Revolution. Contrasting the roles of the Iroquois Confederacy in the North and the Creeks and Cherokees in the South, Herrmann brings additional support to the idea that U.S. treaties with Indigenous groups should join the pantheon of diplomatic history, while centering food and food diplomacy within the context of those treaties. She also addresses how Indian Affairs agents communicated with various Indigenous groups - with varying success. Part II, "Power in Flux," addresses the roles of people of African descent in the American Revolution, focusing primarily on Black Loyalists as they gained freedom through Dunmore's Proclamation and the Philipsburg Proclamation. Black Loyalists fought on behalf of the British as soldiers, spies, and foraging groups, and evacuated post-war to Nova Scotia with White Loyalists. Part III, "Power Waning," summarizes what happened to Indigenous and Black groups post-war, focusing on the nascent U.S. imperialism of Indian policy and assimilation and the role food and agriculture played in attempts to control Native populations. It also argues that Black Loyalists adopted the imperialism of their British compatriots in attempts to control food in Sierra Leone, ultimately losing their power to White colonists. Part III also includes Herrmann's conclusion chapter.

No Useless Mouth is most useful to scholars of the American Revolution, providing good references to food diplomacy while also highlighting under-studied groups like Native Americans and Black Loyalists. However, lay readers may find the text difficult to process. Herrmann often makes references to groups and events with little to no context, assuming her readers are as knowledgeable as she.
In addition, the author appears to conflate Indigenous groups with one another, making generalizations about food consumption patterns and agricultural practices without the context of cultural differences. In focusing on the Iroquois Confederacy and the Creeks/Cherokee, Herrmann also ignores other Native groups, despite sometimes using evidence from other Indigenous nations to support her arguments. For instance, when discussing postwar assimilation practices with the Iroquois in the North and the Creeks and Cherokee in the South (often jumping from one to another in quick succession), she cites Hendrick Aupaumut's advice to Europeans for dealing successfully with Indigenous groups. But she fails to note that Aupaumut was neither Iroquois, Creek, nor Cherokee, but was in fact Stockbridge Mohican. The Stockbridge Mohicans were a group from Stockbridge, Massachusetts that was already Christianized prior to the outbreak of the American Revolution. They fought on the Patriot side of the war, with disastrous consequences for the Stockbridge Munsee population, and ultimately lost their lands to the people they fought to defend. Without knowledge of this nuance, readers would accept the author's evidence at face value.

Herrmann's strongest chapters are on the Black Loyalists, and her research into the role of food control in both Nova Scotia and Sierra Leone is groundbreaking, but even those chapters have a few curious omissions. Lord Dunmore's Proclamation, issued in 1775 in Virginia, targeted enslaved people held in bondage by rebels, freeing those who were willing to join the British Army, and the chapter discussing it focuses primarily on the roles of enslaved people from the American South. But Herrmann also mentions briefly the Philipsburg Proclamation, issued in 1779 in Westchester County, NY, which freed all people held in bondage by rebel enslavers who could make it to British lines. That proclamation arguably had a much larger impact on the Black Loyalist population, as it also included women, children, and those above military age, thousands of whom streamed into New York City, the primary point of evacuation to Nova Scotia. And yet, Herrmann does not mention at all enslaved people in New York and New Jersey, where slavery was still very active throughout the American Revolution and well into the 19th century. In the chapter on Nova Scotia, Herrmann also mentions that White Loyalists brought enslaved people with them, still held in bondage. Neither Dunmore's nor the Philipsburg proclamations freed people held in bondage by Loyalists, and yet they get only a brief mention.

Her chapters on Indigenous-European relations are extremely useful for other historians researching the period, but would have been improved with additional context on land use in relation to food. Herrmann often references famine, food diplomacy, and victual warfare in these chapters without addressing the impact of land grabs and disease on the ability of Indigenous groups to feed themselves. She references, but does not fully address, the need of European settlers to expand settlement into Indian Country as a motivating factor in war and postwar diplomacy.
Finally, while the focus of the book is specifically on the roles of Indigenous and Black groups in the context of food and warfare, the omission of victual warfare by British and American troops and militias, especially in "foraging" and destroying the foodstuffs of White civilian populations throughout the colonies, seems like a missed opportunity to compare and contrast with the policies and long-term impacts of victual warfare toward Indigenous groups. In all, this book is a worthy addition to the bookshelves of serious scholars of the American Revolution, especially those interested in Indigenous and Black history of this time period, but it also leaves room for future scholars to examine more closely the issues Herrmann raises.

No Useless Mouth: Waging War and Fighting Hunger in the American Revolution, Rachel B. Herrmann. Cornell University Press, 2019. 308 pp., $27.95, paperback, ISBN 978-1501716119.

The Food Historian blog is supported by patrons on Patreon! Patrons help keep blog posts like this one free and available to the public. Join us for awesome members-only content like free digitized cookbooks from my personal collection, e-newsletter, and even snail mail from time to time! Don't like Patreon? Leave a tip!
When you think of rationing in World War II, you may not think of peanuts, but they played an outsized role as a substitute for a lot of otherwise tough-to-find foodstuffs, mainly other vegetable fats. When the United States entered the war in December, 1941, after the Japanese attack on Pearl Harbor, the dynamic of trade in the Pacific changed dramatically. The United States had come to rely on coconut oil from the then-American colonial territory of the Philippines and palm oil from Southeast Asia for everything from cooking and the production of foods like margarine to the manufacture of nitroglycerine and soap. Vegetable oils like coconut, palm, and cottonseed were considered cleaner and more sanitary than animal fats, which had previously been the primary ingredient in soap, shortening, and margarine. But when the Pacific Ocean became a theater of war, all but domestic vegetable oils were cut off. Cottonseed was still viable, but it was considered a byproduct of the cotton industry, not a product in and of itself, and its production was therefore difficult to expand. Soy was growing in importance, but in 1941 production was low. That left a distinctly American legume - the peanut.

Peanuts are neither a pea nor a nut, although like peas they are a legume. Unlike peas, the seed pods grow underground, in tough papery shells. Native to the eastern Andes Mountains of South America, they were likely introduced to Europe by the Spanish. European colonizers then also introduced them to Africa and Southeast Asia. In West Africa, peanuts largely replaced a native groundnut in local diets. They likely came to North America with the slave trade from West Africa (where peanut production may have prolonged the slave trade). Peanuts became a staple crop in the American South largely as a foodstuff for enslaved people and livestock, but the privations of White middle and upper classes during the American Civil War expanded the consumption of peanuts to all levels of society. Union soldiers encountered peanuts during the war and liked the taste. The association of hot roasted peanuts with traveling circuses in the latter half of the 19th century and their use in candies like peanut brittle also helped improve their reputation. Peanuts are high in protein and fats, and were often used as a meat substitute by vegetarians in the late 19th century. Peanut loaf, peanut soup, and peanut breads were common suggestions, although grains and other legumes still held ultimate sway.

George Washington Carver helped popularize peanuts as a crop in the early 20th century. Peanuts are legumes and thus fix nitrogen in the soil. Along with the cultivation of sweet potatoes, Carver saw peanuts as a way to restore soil depleted by decades of cotton farming, giving Black farmers a way to restore the health of their land while also providing nutritious food for their families and a viable cash crop. During the First World War, peanut production expanded as peanut oil was used to make munitions and peanuts were a recommended ration-friendly food. But it was consumers' love of the flavor and crunch of roasted peanuts that really drove post-war production. By the 1930s, the sale of peanuts had skyrocketed. No longer the niche boiled snack food of Southerners or ground into meal for livestock, peanuts were everywhere. Peanut butter and jelly (and peanut butter and mayonnaise) became popular sandwich fillings during the Great Depression.
Roasted peanuts gave popcorn a run for its money at baseball games and other sporting events. Peanut-based candy bars like Baby Ruth and Snickers were soaring in sales. And roasted, salted, shelled peanuts were replacing the more expensive salted almonds at dinner parties and weddings. Peanuts were even included as a "basic crop" in attempts by the federal government to address agricultural price control. They were included in the 1929 Agricultural Marketing Act, the Agricultural Adjustment Act of 1933, and an April, 1941 amendment to the Agricultural Adjustment Act of 1938. Peanuts were included in farm loan support and programs to ensure farmers got a share of defense contracts.

By the U.S. entry into World War II, most peanuts were being used in the production of peanut butter. And while Americans enjoyed them as a treat, their savory applications were ultimately less popular as an everyday food. But their use as a source of high-quality oil was their main selling point during the Second World War. Peanut oil famously fueled a diesel engine demonstrated at the 1900 Paris World's Fair, and its very high smoke point has made it a favorite of cooks around the world. During the Second World War peanut oil was used to produce margarine, used in salad dressings, and used as a butter and lard substitute in cooking and frying. But like other fats, its most important role was in the production of glycerin and nitroglycerine - a primary component in explosives.

Which brings us to the imagery in the above propaganda poster. "Mr. Peanut Goes to War!" the poster cries. Produced by the United States Department of Agriculture, it features an anthropomorphized peanut in helmet and fatigues, carrying a rifle, bayonet fixed, marching determinedly across a battlefield with a tank in the background. Likely aimed at farmers instead of ordinary households, Mr. Peanut of the USDA was nothing like the monocled, top-hatted, suave character Planters introduced in 1916. This Mr. Peanut was tough, determined to do his part and aid in the war effort. The USDA expected farmers (including African American farmers) to do the same.
The Food Historian blog is supported by patrons on Patreon! Patrons help keep blog posts like this one free and available to the public. Join us for awesome members-only content like free digitized cookbooks from my personal collection, e-newsletter, and even snail mail from time to time! Don't like Patreon? Leave a tip!
It's that time of year when people all over the world are thinking about graduating, either from high school or college. The question in so many minds is, "What's next?" High school students are considering what they should major in, whether they should do a summer internship or get a job, and what they want their adult lives to look like. College students are doing much the same - considering whether to attend graduate school, try for an internship, or get an entry-level job. And many of them are considering food history as a career option. When you call yourself "THE" food historian, you get a lot of questions about how to break into the field. After answering lots of individual emails, I've decided to tackle the subject in this blog post. I'll be breaking down what it takes to be a food historian, but first I want to emphasize that making a career out of history can be difficult, and it is not particularly lucrative. Even those with PhDs in their fields have trouble finding jobs. I myself work in the history museum field (more job opportunities, but the salaries are usually low) and do the work of a food historian as a passion project that occasionally pays me. That being said, there are folks who are able to make a living through freelance writing, being history professors (food history and food studies programs are expanding in academia), working in museums, writing books, consulting on films, and even making YouTube videos and podcasts. It's not easy, and it's generally not lucrative, but if you have a passion for history and food, this may just be the route for you. A few other folks have written on this subject, notably Rachel Laudan. But while I think she has some great advice on the work of doing food history, that's not quite the same thing as being a food historian. The Inclusive Historian's Handbook has also written about food history, albeit in a forum meant for public historians and museum professionals. So I thought I'd tackle my own definitions and advice. I should note that this guide is going to be necessarily focused on the field of food history in the United States, because that is my lived experience, but most of the basic advice is applicable beyond the U.S.
What is food history?
First, let's do a little defining. I use the term "food history" in the broadest sense. It encompasses:
Food is the one constant that connects all humans throughout human history. Which is part of why it is so appealing to so many people, including non-historians.
The History of Food History
Professor and food historian Dr. Steven Kaplan of Cornell University has written on the history of the field, but here's my own summary. Although agricultural history has a long and storied past, until quite recently food history was considered an unserious topic of study in academia. Even after the social history revolution of the 1970s, which coincided with a groundswell of public interest in the past thanks to the American Bicentennial, food history was largely considered the purview of museum professionals, reenactors, and non-historians. Even trailblazing scholars like Dr. Jessica B. Harris approached food history from an oblique angle - she has a background as a journalist, her doctoral dissertation was on French language theatre in Senegal, and she was a professor in the English Department at Queens College in New York City for decades. And yet her groundbreaking books helped set the tone for subsequent historians. In fact, many of the food history books published in the last fifty years have been written by non-historians - largely journalists and food writers. And while many fine works have come out of that approach, as someone with a background in both cooking and academic history, I often see missed opportunities in food history books written by non-historians and non-cooking academics alike. So why has food history been deemed so unserious? A couple of reasons. One was that gatekeepers in the ivory tower didn't consider food an important topic. It was such a ubiquitous, everyday thing. It seemed to have little importance in the grand scheme of big personalities and big events. But I think its very ubiquity is part of the reason why food history is so compelling and important. Another reason was that the primary actors throughout food history were not generally wealthy White European men. They were (and often still are) primarily women and people of color - producing and growing food, preparing, cooking, preserving, and serving it. Racism and sexism influenced whose history got told, and whose didn't. You can still see some of this bias in modern food history, which often still focuses on rich White dudes with name recognition. Then, when people began studying social history in the 1960s and '70s, food history largely remained the work of museums, historical societies, and historical reconstructions like Old Sturbridge Village and Colonial Williamsburg. It was popular, and therefore not serious. Real food historians knew better. They persisted, but it was a long slog. Let's take a look at a brief (and very incomplete) timeline of food history in the United States since the Bicentennial:
In 1973, British historian and novelist Reay Tannahill published Food in History, which was so popular it was reprinted multiple times over the following decades.
In 1980, the Culinary Historians of Boston was founded. In 1985, the Culinary Historians of New York was founded.
In 1989, museum professional Sandra (Sandy) Oliver began The Food History News, a print newsletter (a zine, if you will) dedicated to the study and recreation of historical foodways. Largely consumed by reenactors, museum professionals, and food history buffs, the newsletter sadly ceased publication in 2010.
I am lucky enough to have been given a large portion of the print run by a friend, but Sandy's voice is missed in the food history publication sphere.
In 1994, Andrew F. Smith published The Tomato in America: Early History, Culture, and Cookery. He joined the New School faculty in 1996, starting their Food Studies program. He has since gone on to publish dozens of food history books and is the managing editor of the Edible series, which began in 2008 under Reaktion Books and is now distributed in the U.S. by the University of Chicago Press.
In 1998, food and nutrition professor Amy Bentley published Eating for Victory - one of the first history books to tackle food in the United States during World War II.
In 1999, librarian Lynne Olver created the Food Timeline website, dedicated to busting myths and answering food history questions. Lynne sadly passed away in 2015, but in 2020 Virginia Tech adopted the Food Timeline and Lynne's enormous food history library. Lynne herself published some food history research recommendations as well. The Food Timeline is likely responsible for many a budding food historian, as it is incredibly fascinating to fall down the many rabbit holes contained therein.
In 2003, the University of California Press started its food history imprint with Harvey Levenstein's Revolution at the Table. Levenstein is an academically trained social historian and helped reinvent food history's brand as a serious topic for "real" historians to tackle.
In the last 20 years, the study of food history has exploded. Dozens of books are published each year, and more and more people are choosing food as a lens to study all kinds of things, with history at the forefront. Food history has increasingly become a serious field of study, and more and more academically trained historians are bringing their skills to the field, helping shift the historiography. Even so, issues persist. Until very recently, the broad focus of American food history was still very White, and very middle- and upper-class, in large part because those were the folks who published the most cookbooks and magazine articles, who owned and patronized restaurants, who built food processing companies, etc. Low hanging fruit, and all that. The field is diversifying, but the assumption that American food is White Anglo-Saxon persists. Like bias in general, it takes constant work to overcome.
Problems in Food History
Because food history is so popular, and so many non-historians have contributed to the historiography, issues that plague all popular subjects persist. First, food history is FULL of apocryphal stories and legends. Many of these continue because non-historians take primary and secondary sources at face value without critical thinking. Many food history books published by non-historians and/or in the popular press do not contain citations. Sometimes this is a design choice by the publisher and not a reflection on the scholarship of the historian. But sometimes it is because the writer is simply regurgitating legends they found somewhere questionable. Food historians need to delve deeply, think critically, and amass corroborating evidence whenever possible. Mythbusting is an important part of the field. Second, non-historians tend to take foods out of context and make assumptions about people in the past. In the United States, the general public simultaneously romanticizes our food past (yes, there were pesticides before World War II; no, not everyone knew how to cook well) and vilifies difference.
I'm sure you've seen all of the memes and comments making fun of foods of the past (I've tackled a few of them here and here). The idea that tastes or priorities might change boggles the minds of folks who are stuck in the mindset of the present. But historical peoples were diverse, circumstances (like central heating and air conditioning) changed frequently, and both had a big impact on how and why people ate what they did. While we can't excuse moral issues like racism, sexism, and xenophobia, it's important to note that there were always people in the past fighting against those isms, too. Rejection of those mindsets is not a uniquely modern idea. The idea that modern humans are somehow morally better, smarter, more sensitive, etc. than historical peoples is not only wrong, it is dangerous, because it implies that we are somehow so advanced that we do not need to self-critique. Nothing could be further from the truth. Third, many of the research questions I and many food historians get from journalists and students tend to focus on the origin stories of certain foods. Some are provable, most are not, and most confoundingly for the folks who want to know who did it FIRST, sometimes certain types of foods crop up simultaneously in unrelated places. The real issue with these questions is, who cares who made the first eclair? I mean, maybe you do, but what does finding out who made the first eclair tell us about food history? Perhaps you might argue that it was a turning point in the history of French baking and went on to influence baking fashions for decades to come. You might be right. But if your question is "who did it first?" instead of "what are the wider implications of this event?" or "what does this tell us about the time and place in which it was created?" or "how did this influence how we do things today?" you're asking the wrong questions. We ask "who cares?" a lot in the museum field. To be a good educator and communicator of history, you must make it not only understandable, but relevant to your audience. The origin of the tomato is largely unimportant except as an exercise in research unless you can connect it somehow to our modern society and make people care. On the flip side of things, and finally, there is a lot of gatekeeping in academia about food history. I've actually seen academics bemoan the idea of using food as a lens to study broader historical ideas, instead of focusing only on the food. Which is ridiculous. That's like saying the history of clothing has to focus only on the physical clothes themselves, and not the people who wore them, the society that created them, or the political, economic, and/or religious meaning behind them. Food history is meant to illuminate the hows and whys behind what is probably the biggest and most pervasive driving force in human history - the acquisition and consumption of food. Focusing only on individual dishes doesn't improve food history - it dilutes it. Thankfully, many of these problems can be solved by following two simple truths: to be a food historian, you must first be a historian, and you must also know food.
To be a food historian, you must first be a historian
This is a tough one for a lot of folks. The word "historian" conjures up old men with white beards and tweed jackets sitting in book-filled academic offices. But a historian is less a specific person and more a set of skills and experiences. And food history differs from food studies in that it has a primary focus on history.
Food studies is an increasingly popular field, especially for colleges and universities that want to attract students with diverse interests. But in my personal opinion, a lot of the work coming out of food studies programs lacks the rigor of history training. I base this judgment on some of the work I've seen presented at conferences. That doesn't mean there aren't fine folks in food studies, and mediocre folks in food history, but the broadness of the food studies umbrella seems to allow for a looser interpretation of the subject than I'd prefer. History is evidence-based, and the best history not only requires corroborating evidence, but also an examination of what is not present in the historical record, and why. Historians also produce original research, which is the primary difference between a history buff and a historian. History buffs can be subject experts who have read all the secondary sources on a topic, and even the primary sources. But you're not a historian until you're producing original research. The medium for that research can take many forms. The most common are articles, conference papers, and books. But original research can also be presented in documentary films, podcasts, museum exhibits, etc. Public historians are the same as academic historians with one distinct difference: audience. A public historian's audience is the general public, and they must adjust their communications accordingly. Public historians use less jargon and give more context because their audience tends to have less foreknowledge of the subject. Although many academic historians are starting to see the public history light, many still have other academics as their primary audience, which leads to dry, jargon-y work that is often purposely difficult to read and comprehend, designed so that only a few of their colleagues can fully understand it. I mentioned earlier that a lot of food history books have been written by journalists and food writers. On the one hand, this is a great thing. The writing quality is better and more approachable, and journalists in particular can be dogged about tracking down primary sources and following leads. But there are issues with non-historians writing history. Mainly, they miss stuff. Many journalists write food histories as one-offs, moving on to the next, usually unrelated, topic. And food writers tend to focus more on what they know - cooking and eating - rather than what's happening more broadly in the time period they're examining, and how their research fits into that. Because neither group tends to have a breadth of knowledge about the time period or culture they are examining, they often overlook connections, make assumptions that aren't necessarily supported by the evidence, and generally miss a lot of cues that historians trained in that time period would pick up on. That stuff is usually called context. Context is the background knowledge of what is influencing a moment in history. It's the foreknowledge of the history that leads up to a moment, the understanding of the culture in which the moment is taking place, a grasp of the relationships between the historical actors involved in the moment, etc. I'll give you an example: Promontory, Utah, 1869. If you know your history, you'll know that's the date and place where the two sections of the transcontinental railroad were joined. If you didn't know the context around that event, you might consider it a rather unimportant local celebration of the completion of a section of railroad track.
But the context that surrounds it - the technological history of railroads, the political history of the United States emerging from the Civil War, the backdrop of the Indian Wars and the federal government's land grabs and attempted genocide, the economic history of railroad barons and pioneers - is all incredibly important to fully understanding the Promontory Summit event and its significance in American history. Journalists are great at the who, what, where, when, and even how. They're less great at the context. History is more than reciting facts. It is more than discovering new things and rebutting the arguments of historians who came before you. History, especially food history, is about the why. When I was in undergrad, it was the why that drove me. My first introduction was via agricultural history. I was interested in history, but also in the environment, and sustainable agriculture was my gateway into the world of agricultural history. I wanted to know not only how our modern food system came into being, but also why it was the way it was. I was also deeply interested in eating, and therefore in learning how to cook, although it would take until my senior year of college to actually do much of it. A burgeoning interest in collecting vintage cookbooks helped. From there I researched rural sociology and the Progressive Era in graduate school, which sparked my Master's thesis on food in World War I, and the rest is history (pun intended). I'm now obsessed. Thankfully, being a historian is a lot less difficult than a lot of people assume. Like most skills, it just takes practice and time. If you like to read and write, you're curious, you are good at remembering and synthesizing information, and you're a good communicator, history might be your perfect field. It requires a lot of close reading, critical thinking, and the ability to organize information into coherent arguments and compelling stories. History is simultaneously wonderful and horrible because it is constantly expanding - not only thanks to linear time, but also because new evidence is being discovered all the time and new research is being published daily. It's both an incredible opportunity and a daunting task. But with the right area of focus, it becomes not a boring task, but a lifelong passion.
To be a food historian, you must also know food
This is one that some academics struggle with. Like journalists, they can also have an incomplete understanding of their subject matter. I learned to cook not at my grandmother's knee, or even my mother's. I am self-taught, largely because I love to eat and eating mediocre food is a sad chore. But I did not start cooking for myself until late in college, and even then my first attempts were pretty disastrous. While failure is a necessary component of learning any new skill, I dislike it enough to be highly motivated to avoid making the same mistake twice. I'm pretty risk-averse. As an avid reader, I decided that books would be my salvation. I started collecting vintage cookbooks in high school thanks to a mother who passed on a love of both reading and thrifting. I read cookbooks like some people read novels - usually before bed. I got picky with my collecting. If a cookbook didn't have good headnotes, it was out. I wanted scratch cooking, but approachable, with modern measurements and equipment and not too many unfamiliar ingredients.
Cookbooks published between 1920 and 1950 largely fit the bill, and I read lots of them. In grad school I expanded my cookbook reading, thanks in large part to Amazon's Kindle readers, which had hundreds of free public domain cookbooks, all published before the 1920s. Then I discovered the Internet Archive and other historical cookbook repositories with digital collections. All the while I was learning to cook for myself, my roommates, and eventually my boyfriend (now husband). I worked in a French patisserie/coffee shop after college and honed my palate. I tried new recipes often and taught myself to be a good cook and a competent baker. I began to understand historical cooking through recipes. I mentally filed away hundreds of old-fashioned cooking tips. I thought about the best and most efficient ways to do things. I got confident enough to learn to adjust and alter recipes. I experimented. I made predictions and tested them. I learned from my failures and adjusted accordingly. In short - I learned to cook. But I still didn't consider myself a food historian. It seemed too big. I felt too inexperienced, despite all my research, both historical and culinary. Then, I had an epiphany. I attended a food history program with a friend. It had a lecture and a meal component. The lecture was good, but the speaker (who was not an academic historian) mocked a historical foodway I knew to be common in the time period. When I tried to call her on it after her talk, she brushed me off. Then, during the delicious meal, the organizer apologized that one of the dishes didn't turn out quite right, despite following the historical recipe exactly. I knew immediately the step they had missed - a step that probably was not in the recipe and that wasn't common knowledge except to someone who had studied historic cooking extensively. It was then I realized that there I was, a "mere" graduate student, and I had more depth and breadth of knowledge in food history than these two professional presenters. It was then I decided I could officially call myself a food historian. And that led me to write my Master's thesis on food history and launch this website. In the intervening years, I've often caught small (and occasionally not-so-small) food-related errors in academic history books. Assumptions, missed connections, misinterpretations of the primary sources, etc. It taught me that to be a food historian, you have to know agriculture and food varieties, food preparation, how to read a recipe, and historical technologies to really understand the history you're studying. If that seems like a lot, it can be. But like learning to cook, doing the work of historical research is about building on the work that came before you - both the work of other historians, and your own knowledge. We learn by doing, and through practice we improve.
Understanding and starting food history research and writing
So now that we've got our terms straight, and we know we need to be well-versed in both history and food, what exactly do I mean by original research? Original research is based in primary sources and covers a topic or makes an argument that is not already present in the historiography (the published work of other historians). In some fields it is difficult to find original research topics to publish. The American Revolution and the American Civil War are two notable examples: there are few unexamined primary sources, and the historiography is both wide and deep. Food history does not usually have this problem.
In fact, several historians have recently focused on the food history of both the Revolutionary and Civil Wars. Because it has been so understudied, food is one of the few avenues left to explore in these otherwise crowded fields. World War I American home front history, however, is criminally under-studied, especially the field of food history (the Brits are better at it than we are). Which is one of the reasons why I chose WWI as my area of focus. One major pitfall budding historians stumble into is the idea that you do research to prove a theory. Nope. Not with history. You should formulate a question you want to answer to help guide you and get you started. But cherry-picking evidence to support your pet theory is not history. Real historians go where the research takes them. Sometimes your question doesn't have an answer because the evidence doesn't exist, or hasn't been discovered yet. Sometimes your pet theory is wrong. And that's okay! It's all part of the process. Even when you hit a dead end, you don't really fail, because you learned a lot along the way, including that your theory was not correct. And sometimes the research leads you to discover something you weren't even looking for - that's often where the magic happens.
Secondary Sources
Food history research, like all history research, has a couple of levels. The first is secondary source material. This is your historiography - the works in and related to your area of study that other historians have already written and published. Books, journal articles, theses, and dissertations are the best secondary sources because they generally use citations and can therefore be fact-checked. Secondary sources without footnotes should be taken with a large grain of salt. Any history that doesn't tell you where the evidence came from is suspect. I say this as someone who dislikes the work of footnoting intensely. But it's a necessary evil. And while secondary sources are usually written by experienced historians, they aren't always perfectly correct. Sometimes new research has been done since the book you're reading was published, which is why it's important to read the recent research as well as the classics. Sometimes the historian writing the book makes a mistake (it happens). Sometimes the writer has biases they are blind to, which need to be taken into account when absorbing their research. Secondary sources can be both a blessing and a bane. A blessing, because you can build on their research instead of having to do everything from scratch. And a bane, because sometimes someone has already written on your beloved topic! Thankfully, there's almost always room for new interpretations. A good practice to truly understand a secondary source is to write an academic book review of it. Book reviews force you not only to closely analyze the text, but also to understand the author's primary arguments, evaluate their sources, and look at the book within the context of the wider historiography.
Primary Sources
The second level of research is primary sources. This is the historical evidence created in the period you're studying: letters, diaries, newspaper articles, magazines and periodicals, ephemera, photographs, paintings and sketches, laws and lawsuits, wills and inventories, account books and receipts, census records and government reports, etc. Primary sources also include objects, audio recordings (along with oral histories), historic film, etc.
Primary sources generally live in collections held in trust and managed by public entities like museums, historical societies, archives, universities, and public libraries. Some folks like to insist that some primary sources are not primary sources at all - notably oral histories and sometimes even newspaper articles - because the author's memory or understanding of an event may not be accurate. But here's the deal, folks: pretty much no primary source is going to be a 100% accurate account of what is happening in history. The creators all have their biases, faulty memories, assumptions, etc. to contend with. Just because someone was present at a major event doesn't mean they recorded it correctly. It also doesn't mean they saw the whole thing, understood what they saw, or aren't completely lying about their presence there. Nobody has a time machine to go fact-check their accuracy, which is why corroborating evidence is important. ALL primary sources deserve some skepticism and an understanding of the culture that produced them. I should also note that primary sources only exist because someone saved them. And that "someone" historically was middle- and upper-class White men. Which means a lot of historical documentation did NOT get saved. In particular, the work of women and especially people of color was not only not saved, sometimes it was actively destroyed. And even when those sources do exist, they can be hard to find, and are therefore often overlooked or ignored by the historians who have come before you. This is part of the skepticism of primary sources, too. Again, what is not present can be just as important as what is.
Context
And that's the third, often-overlooked level of research: context. We've defined it before, and it can be a major stumbling block for non-historians. Context is when you take all the stuff you've learned from your secondary sources, and the cumulative knowledge of the primary sources you're studying, and put it into action. It's reading between the lines, understanding cultural cues, realizing what your primary sources aren't saying, who's absent, and what topics are being avoided. Context is understanding why historical figures do what they do, not just how. It can be easy to get wrapped up in the primary sources. I get it. They're fascinating! I've dived down many a primary source rabbit hole. But we have to think critically about them. Taking primary sources at face value has gotten a lot of historians into trouble, leading them to make assumptions about the past, to accept one perspective as gospel truth, and to overlook other avenues of research. No one source (or one person) ever has all the answers.
Writing
And finally, the last step of research: the writing. Some folks never make it to this point, just amassing research endlessly. And for history buffs, that's okay. But historians have to leave their mark on the historiography. There are a few important things I've learned in the course of writing my own book. First, get the words on the page. When it comes to writing history, get it on the page first, and then think about your argument, the organization of your article or chapter, the chronology, etc. You can't edit what isn't there, and editing is just as important as writing, if not more so. Everyone needs to be edited, and all works are strengthened by judicious paring and rewriting. You can always edit, cut, rearrange, rewrite, and/or add to anything you write. In fact, you probably should! I'll give an example.
When I'm writing an academic book review, I'll usually write out my first reactions in a way-too-long Word document. Then, once I'm satisfied I've covered all the bases, I'll start a new Word doc and rewrite the whole thing, summarizing, examining my gut reactions, and modulating my tone for the audience I'm writing for. Second, practice makes perfect. The more you write (and read), the better you'll get at it. You'll get better at formulating arguments, explaining things, writing for a specific audience, and communicating overall. So many writers feel shame about their early works. I get it! Sometimes I cringe when I see my early stuff. But sometimes I'm retroactively impressed, too. You've got to put out buds if you want to grow leaves and branches. So write the book reviews, write the blog posts, submit journal articles, etc. Everything you do gets you a step further down the historian road. Third, it's important to know when to stop researching. You need enough research to be well-versed enough to do the work. But perfection is the enemy of done. If you can afford to spend decades researching and refining a single book, by all means please do so. But if you want a career in food history, you're gonna have to be more efficient than that. Finally, and this is the hardest one: you have to learn to kill your darlings. I learned that phrase from a professional writer friend. Just because you're endlessly fascinated by the COOL THING! you found doesn't mean it should go in your book. If it doesn't advance your argument or is otherwise superfluous information, cut it. I'm at that stage now with my own book, and while it's painful, killing your darlings doesn't mean deleting them! Save them for future projects, turn them into blog posts or podcasts or YouTube videos if you like. But getting rid of the extraneous stuff will make your work so much stronger.
Still want to be a food historian?
Whew! You made it this far! You may be feeling daunted by now. That isn't my intent. Like learning to cook, doing the work of history takes practice. Start with a topic or time period you're interested in. Look to see if anyone else has already published anything on that topic, and read it. Expand your reading beyond your immediate topic to related topics. For instance, if you want to study food in the Civil War, read not only about food, but about war, politics, gender, agriculture, race, economics, etc. in the Civil War, too. See if your local public or university library has remote access to places like JSTOR, Newspapers.com, ProQuest, and Project MUSE. See if your local library or historical society has any archival records related to your topic. Scour places like the Internet Archive, the Library of Congress, HathiTrust, and the Digital Public Library of America for other primary sources. Many have been digitized and more are being digitized daily. Start looking, start reading, and start synthesizing what you've learned into writing of your own. If you're thinking of pursuing food history at the graduate level, look for universities with professors whose work you admire and respect, for places with rich food history collections, and for programs that excite you. I worked my way through graduate school, working in museums while getting my master's part-time. I also took two years off between undergrad and graduate school, trying to gain experience in my field before tackling the commitment of grad school. I heartily recommend trying the work of food history before you dive into academia.
Finally, food history can be FUN! If this all seems like a lot of work, then maybe food history isn't for you, or perhaps you'd like to remain a food history buff instead of a food historian. But if the idea of delving into primary sources, devouring secondary source material, learning everything you can about a topic, and then writing it all down to share with others excites you, congratulations! You're already well on your way to becoming a food historian, whether you want to make it your full-time career or just a fun hobby. Whatever you decide - good luck and good hunting! Your friend in food history, Sarah
The short answer? At least in the United States? Yes. Let's look at the history and the reasons why. I post a lot of propaganda posters for World War Wednesday, and although it is implied, I don't point out often enough that they are just that - propaganda. They are designed to alter people's behavioral patterns using a combination of persuasion, authority, peer pressure, and unrealistic portrayals of culture and society. In the last several months of sharing propaganda posters on social media for World War Wednesday, I've gotten a couple of comments on how much they reflect an exclusively White perspective. Although White Anglo-Saxon Protestant culture was the dominant culture in the United States at the time, it was certainly not the only culture. And its dominance was the result of White supremacy and racism. This is reflected in the nutritional guidelines and nutrition science research of the time. The First World War took place during the Progressive Era, under a president who re-segregated federal workplaces that had been integrated since Reconstruction. It was also a time when eugenics was in full swing, and the burgeoning field of nutrition science was using racism as justification for everything from encouraging assimilation among immigrant groups - by decrying their foodways and promoting White Anglo-Saxon Protestant foodways like "traditional" New England and British foods - to encouraging "better babies" to save the "White race" from destruction. Nutrition science research with human subjects used almost exclusively adult White men of middle- and upper-middle-class backgrounds - usually in college. Certain foods, like cow's milk, were promoted heavily as health foods. Notions of purity and cleanliness also influenced negative attitudes about immigrants, African Americans, and rural Americans. During World War II, Progressive-Era-trained nutritionists and nutrition scientists helped usher in a stereotypically New England idea of what "American" food looked like, helping "kill" already-declining regional foodways. Nutrition research, bolstered by War Department funds, helped discover and isolate multiple vitamins during this time period. It's also when the first government nutrition guidelines came out - the Basic 7. Throughout both wars, the propaganda was focused almost exclusively on White, middle- and upper-middle-class Americans. Immigrants and African Americans were the target of some campaigns for changing household habits, usually under the guise of assimilation. African Americans were also the target of agricultural propaganda during WWII.
Although there was plenty of overt racism during this time period, including lynching, race massacres, segregation, Jim Crow laws, and more, most of the racism in nutrition, nutrition science, and home economics came in two distinct types - White supremacy (that is, the belief that White Anglo-Saxon Protestant values were superior to those of every other ethnicity, race, and culture) and unconscious bias. So let's look at some of the foundations of modern nutrition science through these lenses.
Early Nutrition Science
Nutrition science as a field is quite young, especially when compared to other sciences. The first nutrients to be isolated were fats, carbohydrates, and proteins. Fats were the easiest to determine, since fat is visible in animal products and separates easily in liquids like dairy products and plant extracts. The term "protein" was coined in the 1830s. Carbohydrates began to be individually named in the early 19th century, although that term was not coined until the 1860s. Almost from the beginning, nearly all early nutrition research asked what foods could be substituted "economically" for other foods to feed the poor. This period of nutrition science research coincided with the Enlightenment and other pushes to discover, through experimentation, the mechanics of the universe. As such, it was largely limited to highly educated, White European men (although even Wikipedia notes criticism of such a Euro-centric approach). As American colleges and universities, especially those driven by the Hatch Act of 1887, expanded into more practical subjects like agriculture, food and nutrition research improved. American scientists were concerned more with practical applications than with searching for knowledge for knowledge's sake. They wanted to study plant and animal genetics and nutrition in order to apply that information on farms. And the study of human nutrition was meant not only to understand how humans metabolized foods, but also to apply those findings to human health and the economy. But their research was influenced by their own personal biases, conscious and unconscious.
The History of Body Mass Index (BMI)
Body Mass Index, or BMI, is a result of that same early 19th century time period. It was invented by Belgian mathematician Lambert Adolphe Jacques Quetelet in the 1830s and '40s specifically as a "hack" for determining obesity levels across wide swaths of population, not for individuals. Quetelet was a trained astronomer - astronomy being the one field where statistical analysis was then prevalent. Quetelet used statistics as a research tool, publishing in 1835 a book called Sur l'homme et le développement de ses facultés, ou Essai de physique sociale, the English translation of which is usually called A Treatise on Man and the Development of His Faculties. In it, he discusses the use of statistics to determine averages for humanity (mainly, White European men). BMI became part of that statistical analysis. Quetelet named the index after himself - it wasn't until 1972 that researcher Ancel Keys coined the term "Body Mass Index," and as he did so he complained that it was no better or worse than any other relative weight index. Quetelet's work went on to influence several famous people, including Francis Galton, a proponent of social Darwinism and scientific racism who coined the term "eugenics," and Florence Nightingale, who met him in person. As a tool for measuring populations, BMI isn't bad - and the arithmetic behind it is strikingly simple, as the sketch below shows.
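For readers who want to see just how blunt an instrument the index is, here is a minimal sketch of the BMI arithmetic in Python. The formula (weight in kilograms divided by height in meters squared) and the standard modern category cutoffs are well documented; the two example people are hypothetical.

```python
# A minimal sketch of the BMI arithmetic, assuming metric units.
# The category cutoffs below are the standard modern ones;
# both example people are hypothetical.

def bmi(weight_kg: float, height_m: float) -> float:
    """Quetelet's index: weight divided by the square of height."""
    return weight_kg / height_m ** 2

def category(value: float) -> str:
    if value < 18.5:
        return "underweight"
    if value < 25:
        return "normal"
    if value < 30:
        return "overweight"
    return "obese"

# A muscular athlete and a sedentary person with identical height and
# weight get identical scores - the index sees total weight only,
# not body composition.
for name, kg, m in [("muscular athlete", 100, 1.85),
                    ("sedentary person", 100, 1.85)]:
    value = bmi(kg, m)
    print(f"{name}: BMI {value:.1f} -> {category(value)}")
```

Both hypothetical people land at BMI 29.2 and get the identical "overweight" label, which is exactly the problem described below.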
Used on populations, BMI can take statistical height and weight data and give a general idea of the overall health of a population. But when it is used as a tool to measure the health of individuals, it becomes extremely flawed and even dangerous. Quetelet had to fudge the math to make the index work, even with broad populations. And his work was based on the White European males he considered "average" and "ideal." Quetelet was not a nutrition scientist or a doctor - this "ideal" was purely subjective, not scientific. Despite numerous calls to abandon its use, the medical community continues to use BMI as a measure of individual health. Because it is a statistical tool not based on actual measures of health, BMI places people with different body types in overweight and obese categories, even if they have relatively low body fat. It can also tell thin people they are healthy, even when other measurements (activity level, nutrition, eating disorders, etc.) are signaling an unhealthy lifestyle. In addition, fatphobia in the medical community (which is also based on outdated ideas, which we'll get to) has vilified subcutaneous fat, which has less impact on overall health and can even improve lifespans. Visceral fat - the abdominal fat that surrounds your organs - can be more damaging in excess, which is why some scientists and physicians advocate for switching to waist-ratio measurements. So how is this racist? Because it was based on White European male averages, it often punishes women and people of color whose genetics do not conform to Quetelet's ideal. For instance, people with higher muscle mass can often be placed in the "overweight" or even "obese" category, simply because BMI uses an overall weight measure and assumes a percentage of it is fat. Tall people and people with broader-than-"ideal" builds are also not accurately measured.
The History of the Calorie
Although more and more people are moving away from counting calories as a health indicator, for over 100 years the calorie has reigned as the primary measure of food intake efficiency for nutritionists, doctors, and dieters alike. The calorie is a unit of heat measurement that was originally used to describe the efficiency of steam engines. When Wilbur Olin Atwater began his research into how the human body metabolizes food and produces energy, he used the calorie to measure his findings. His research subjects were the White male students at Wesleyan University, where he was a professor. Atwater's research helped popularize the idea of the calorie in broader society, and it became essential learning for nutrition scientists and home economists in the burgeoning field - one of the few scientific avenues of study open to women. Atwater's research helped spur more human trials, usually "Diet Squads" of young middle- and upper-middle-class White men. At the time, many papers and even cookbooks were written about how the working poor could maximize their food budgets for effective nutrition. Socialists and working-class unionists alike feared that by calculating the exact number of calories a working man needed to survive, home economists were helping keep working-class wages down by showing that people could live on little or inexpensive food. Calculating the calories of mixed-food dishes like casseroles, stews, pilafs, etc. was deemed too difficult, so "meat and three" meals were emphasized by home economists.
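To make that calorie arithmetic concrete, here is a minimal sketch using the Atwater general factors - the modern descendants of Atwater's measurements, which assign roughly 4 kcal per gram of protein, 9 per gram of fat, and 4 per gram of carbohydrate. The macronutrient numbers for the "casserole" are hypothetical, chosen only to show why a mixed dish was tedious to tally by hand.

```python
# A minimal sketch of calorie arithmetic using the Atwater general
# factors (roughly 4 kcal/g protein, 9 kcal/g fat, 4 kcal/g
# carbohydrate). The "casserole" macronutrient numbers are
# hypothetical.

ATWATER_KCAL_PER_GRAM = {"protein": 4, "fat": 9, "carbohydrate": 4}

def kcal(macros_g: dict) -> float:
    """Estimate food energy from macronutrient weights in grams."""
    return sum(ATWATER_KCAL_PER_GRAM[name] * grams
               for name, grams in macros_g.items())

# Exactly the kind of mixed-dish calculation home economists found
# too tedious to do by hand for casseroles, stews, and pilafs.
casserole = {"protein": 22, "fat": 18, "carbohydrate": 35}
print(f"{kcal(casserole):.0f} kcal")  # 4*22 + 9*18 + 4*35 = 390
```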
Making "American" FoodEfforts to Americanize and assimilate immigrants went into full swing in the late 19th and early 20th centuries as increasing numbers of "undesirable" immigrants from Ireland, southern Italy, Greece, the Middle East, China, Eastern Europe (especially Jews), Russia, etc. poured into American cities. Settlement workers and home economists alike tried to Americanize with varying degrees of sensitivity. Some were outright racist, adopting a eugenics mindset, believing and perpetuating racist ideas about criminology, intelligence, sanitation, and health. Others took a more tempered approach, trying to convince immigrants to give up the few things that reminded them of home - especially food. These often engaged in the not-so-subtle art of substitution. For instance, suggesting that because Italian olive oil and butter were expensive, they should be substituted with margarine. Pasta was also expensive and considered to be of dubious nutritional value - oatmeal and bread were "better." A select few realized that immigrant foodways were often nutritionally equivalent or even superior to the typical American diet. But even they often engaged in the types of advice that suggested substituting familiar ingredients with unfamiliar ones. Old ideas about digestion also influenced food advice. Pickled vegetables, spicy foods, and garlic were all incredibly suspect and scorned - all hallmarks of immigrant foodways and pushcart operators in major American cities. The "American" diet advocated by home economists was highly influenced by Anglo-Saxon and New England ideals - beef, butter, white bread, potatoes, whole cow's milk, and refined white sugar were the nutritional superstars of this cuisine. Cooking foods separately with few sauces (except white sauce) was also a hallmark - the "meat and three" that came to dominate most of the 20th century's food advice. Rooted in English foodways, it was easy for other Northern European immigrants to adopt. Although French haute cuisine was increasingly fashionable from the Gilded Age on, it was considered far out of reach of most Americans. French-style sauces used by middle- and lower-class cooks were often deemed suspect - supposedly disguising spoiled meat. Post-Civil War, Yankee New England foodways were promoted as "American" in an attempt to both define American foodways (which reflected the incredibly diverse ecosystems of the United States and its diverse populations) and to unite the country after the Civil War. Sarah Josepha Hale's promotion of Thanksgiving into a national holiday was a big part of the push to define "American" as White and Anglo-Saxon. This push to "Americanize" foodways also neatly ignores or vilifies Indigenous, Asian-American, and African American foodways. "Soul food," "Chinese," and "Mexican" are derided as unhealthy junk food. In fact, both were built on foundations of fresh, seasonal fruits, vegetables, and whole grains. But as people were removed from land and access to land, the they adapted foodways to reflect what was available and what White society valued - meat, dairy, refined flour, etc. Asian food in particular was adapted to suit White palates. We won't even get into the term "ethnic food" and how it implies that anything branded as such isn't "American" (e.g. White). Divorcing foodways from their originators is also hugely problematic. American food has a big cultural appropriation problem, especially when it comes to "Mexican" and "Asian" foods. 
As late as the mid-2000s, the USDA website had a recipe for "Oriental salad," although it has since disappeared. Instead, we get "Asian Mango Chicken Wraps," where the ingredients of mango, Napa cabbage, and peanut butter are apparently what make the dish "Asian," rather than any reflection of actual foodways from countries in Asia.
Milk - The Perfect Food
Combining the nutrition research of the 19th century with ideas about purity and sanitation, nutrition scientists and home economists deemed whole cow's milk "the perfect food," as it contained proteins, fats, and carbohydrates all in one package. Despite issues with sanitation throughout the 19th century (milk wasn't regularly pasteurized until the 1920s), milk became a hallmark of nutrition advice throughout the Progressive Era - advice which continues to this day. Throughout the history of nutritional guidelines in the U.S., milk and dairy products have remained a mainstay. But the preponderance of advice about dairy completely ignores that wide swaths of the population are lactose intolerant and/or did not historically consume dairy the way Europeans did. Indigenous Americans and many people of African and Asian descent historically did not consume cow's milk, and their bodies often do not process it well. This fact has been capitalized upon by both historic and modern racists, as milk has become a symbol of the alt-right. Even today, the USDA nutrition guidelines continue to recommend at least three servings of dairy per day, an amount that can cause long-term health problems in communities that do not historically consume large amounts of dairy.
Nutrition Guidelines History
Because Anglo-centric foodways were considered uniquely "American" and also the most wholesome, this style of food persisted in government nutritional guidelines. Government-issued food recommendations and recipes began to be released during the First World War and continued during the Great Depression and World War II. These guidelines and advice generally reinforced the dominant White culture as the most desirable. Vitamins were first discovered as part of research into the causes of what would come to be understood as vitamin deficiencies. Scurvy (vitamin C deficiency), rickets (vitamin D deficiency), beriberi (vitamin B1 or thiamine deficiency), and pellagra (vitamin B3 or niacin deficiency) plagued people around the world in the 19th and early 20th centuries. The first vitamins were identified in the 1910s; the rest followed in the 1930s and '40s. Vitamin fortification took off during World War II. The Basic 7 guidelines were first released during the war and were based on the recent vitamin research. But they also, consciously or not, reinforced white supremacy through food. Confident that they had solved the mystery of the invisible nutrients necessary for human health, American nutrition scientists turned toward reconfiguring them every which way possible. This is the history that gives us Wonder Bread and fortified breakfast cereals and milk. By divorcing vitamins from the foods in which they naturally occur (foods that were often expensive or scarce), nutrition scientists thought they could use equivalents to maintain a healthy diet. As long as people had access to vitamins, carbohydrates, proteins, and fats, it didn't matter how they were delivered. Or so they thought.
This policy of reducing foods to their nutrients and divorcing food from tradition, culture, and emotion dates back to the Progressive Era and continues today, sometimes with disastrous consequences.
Commodities & Nutrition
Divorcing food from culture is one government policy Indigenous people understand well. U.S. treaty violations and land grabs led to the reservation system, which forcibly removed Native people from their traditional homelands, divorcing them from their traditional foodways as well. Post-WWII, the government helped stabilize crop prices by purchasing commodity foods for use in a variety of programs operated by the United States Department of Agriculture (USDA), including the National School Lunch Program, the Special Supplemental Nutrition Program for Women, Infants, and Children (WIC), and the Food Distribution Program on Indian Reservations (FDPIR). For most of these programs, the government purchases surplus agricultural commodities to help stabilize the market and keep prices from falling. It then distributes the foods to low-income groups as a form of food assistance. Commodity foods distributed through the FDPIR program were generally canned and highly processed - high in fat, salt, and sugar and low in nutrients. This forced reliance on commodity foods, combined with generational trauma and poverty, led to widespread health disparities among Indigenous groups, including diabetes and obesity. Which is why I was appalled to find this cookbook the other day. Commodity Cooking for Good Health, published by the USDA in 1995 (1995!), is a joke, but it illustrates how pervasive and long-lasting the false equivalency of vitamins and calories can be. The cookbook starts with an outline of the 1992 Food Pyramid, whose base rests on bread, pasta, cereal, and rice. It then goes on to outline how many servings of each group Indigenous people should be eating, listing 2-3 servings a day for the dairy category but then listing only nonfat dry milk, evaporated milk, and processed cheese as the dairy options. In the fruit group, it lists five different fruit juices as servings of fruit. It has a whole chapter on diabetes and weight loss, as well as encouraging people to count calories. With the exception of a recipe for fry bread, one for chili, and one for Tohono O'odham corn bread, the remainder of the recipes are extremely European. Even the "Mesa Grande Baked Potatoes" are not, as one would assume from the title, a fun take on baked whole potatoes, but rather a mixture of dehydrated mashed potato flakes, dried onion soup mix, evaporated milk, and cheese. You can read the whole cookbook for yourself, but the fact of the matter is that the USDA is largely responsible for poor health on reservations, not only because it provides the unhealthy commodity foods, but also because it was founded in 1862, at the height of the Indian Wars, during the federal government's attempts at genocide and its successful land grabs. Although the Bureau of Indian Affairs (BIA) under the Department of the Interior was largely responsible for the reservation system, the land-grant agricultural college system started by the Morrill Act was literally built on the sale of stolen land. In addition, the USDA has a long history of dispossessing Black farmers, an issue that continues to this day through the denial of farm loans. Thanks to redlining, people of color, especially Black people, often live in segregated school districts whose property taxes are inadequate to cover expenses.
Many children who attend these schools are low-income and rely on free or reduced-price lunch delivered through the National School Lunch Program, which has been used for decades to prop up commodity agriculture. Although school lunch nutrition efforts have improved in recent years, many hot lunches still rely on surplus commodities and provide inadequate nutrition.
Issues That Persist
Even today, the federal nutrition guidelines, administered by the USDA, emphasize "meat and three" style meals accompanied by dairy. And while the recipe section is diversifying, it is still all too often full of Americanized versions of "ethnic" dishes. Many of the dishes are still very meat- and dairy-centric, and short on fresh fruits and vegetables. Some recipes, like this one, seem straight out of 1956. The idea that traditional ingredients should be replaced with "healthy" variations - for instance, always replacing white rice with brown rice or, more recently, cauliflower rice - continues. Many nutritionists also push the Mediterranean Diet as the healthiest in the world, when in fact it is very similar to other traditional diets around the world where people have access to plenty of unsaturated fats, fruits and vegetables, whole grains, lean meats, etc. Even the name "Mediterranean Diet" implies the diets of everyone living along the Mediterranean. So why does "Mediterranean" always mean Italian and Greek food, and never Persian, Egyptian, or Tunisian food? (Hint: the answer is racism.) Old ideas about nutrition, including an emphasis on low-fat foods, "meat and three" style recipes, replacement ingredients (usually poor cauliflower), and artificial sweeteners for diabetics, seem hard to shake for many people. Doctors receive very little training in nutrition, and hospital food is horrific, as I saw when my father-in-law was hospitalized for several weeks in 2019. For a diabetic with problems swallowing, their solution was pancakes with sugar-free syrup, sugar-free gelatin and pudding, and not much else. The modern field of nutrition is also overwhelmingly White, and racism persists, even toward trained nutritionists of color, much less communities of color struggling with health issues caused by generational trauma, food deserts, poverty, and overwork. Our modern food system has huge structural issues that continue to this day. Why is the USDA, which is in charge of promoting agriculture at home and abroad, in charge of federal nutrition programs? Commodity food programs turn vulnerable people into handy props for industrial agriculture and the economy, rather than actually helping them. Federal crop subsidies, insurance, and rules assign far more value to commodity crops than to fruits and vegetables. This government support also makes it easy and cheap for food processors to create ultra-processed, shelf-stable, calorie-dense foods for very little money - often for less than the crops cost to produce. This makes it far cheaper for people to eat ultra-processed foods than fresh fruits and vegetables. The federal government also gives money to agriculture promotion organizations that use federal funds to influence American consumers through advertising (remember the "Got Milk?" or "The Incredible, Edible Egg" marketing? That was your taxpayer dollars at work), regardless of whether or not the foods are actually good for Americans. Nutrition science as a field has a serious study replication problem, and an even more serious communications problem.
Although scientists themselves usually do not make outrageous claims about their findings, the fact that food is such an essential part of everyday life, and that so many Americans are unsure of what is "healthy" and what isn't, means that the media often capitalizes on new studies to make over-simplified announcements to drive viewership.

Key Takeaways

Nutrition science IS a science, and new discoveries are being made every day. But the field as a whole needs to recognize and address the flawed scientific studies and methods of the past, including their racism - conscious or unconscious.

Nutrition scientists are expanding their research into the many variables that challenge the research of the Progressive Era, including gut health, environmental factors, and even genetics. But human research is expensive, and test subjects are rarely diverse. Nutrition science has a particularly bad study replication problem. If the government wants to get serious about nutrition, it needs to invest in new research with diverse subjects that moves beyond the flawed one-size-fits-all rhetoric.

The field of nutrition - including scientists, medical professionals, public health officials, and dietitians - needs to get serious about addressing racism in the field: both their own personal biases and broader institutional and cultural ones. Anyone who is promoting "healthy" foods needs to think long and hard about who their audience is, how they're communicating, and which foods they're branding as "unhealthy" and why.

We also need to address the systemic issues in our food system, including agriculture, food processing, subsidies, and more. In particular, the government agencies in charge of nutrition advice and food assistance need to think long and hard about the role of the federal government in promoting human health and what the priorities REALLY are - human health? Or the economy?

There is no "one size fits all" recommendation for human health. Ever. Especially not when it comes to food. Nutrition guidelines have problems not just with racism, but also with ableism and economics. Not everyone can digest "healthy" foods, whether due to medical issues or medication. Not everyone can get adequate exercise, due to physical, mental, or even economic constraints. And I would argue that most Americans cannot afford the quality and quantity of food they need to be "healthy" by government standards. And that's wrong.

Like with human health, there are no easy solutions to these problems. But recognizing that the problems exist is the first step on the path to fixing them.
Author: Sarah Wassberg Johnson has an MA in Public History from the University at Albany and studies early 20th century food history.