Kalyx Bowler

Daily Assignments

March 18, 2026
For each assignment, create a Google Doc with your responses and email it to Mom and Dad. If you need to ask a question about an assignment, include it in the email.
1. English — The Sixth Extinction

Read this excerpt from Elizabeth Kolbert’s The Sixth Extinction, then choose 2 or 3 of the prompts below to respond to.

Read the excerpt

The Icelandic Institute of Natural History occupies a new building on a lonely hillside outside Reykjavik. The building has a tilted roof and tilted glass walls and looks a bit like the prow of a ship. It was designed as a research facility, with no public access, which means that a special appointment is needed to see any of the specimens in the institute’s collection. These specimens, as I learned on the day of my own appointment, include: a stuffed tiger, a stuffed kangaroo, and a cabinet full of stuffed birds of paradise.

The reason I’d arranged to visit the institute was to see its great auk. Iceland enjoys the dubious distinction of being the bird’s last known home, and the specimen I’d come to look at was killed somewhere in the country—no one is sure of the exact spot—in the summer of 1821. The bird’s carcass was purchased by a Danish count, Frederik Christian Raben, who had come to Iceland expressly to acquire an auk for his collection (and had nearly drowned in the attempt). Raben took the specimen home to his castle, and it remained in private hands until 1971, when it came up for auction in London. The Institute of Natural History solicited donations, and within three days Icelanders contributed the equivalent of ten thousand British pounds to buy the auk back. (One woman I spoke to, who was ten years old at the time, recalled emptying her piggy bank for the effort.) Icelandair provided two free seats for the homecoming, one for the institute’s director and the other for the boxed bird.

Guðmundur Guðmundsson, who’s now the institute’s deputy director, had been assigned the task of showing me the auk. Guðmundsson is an expert on foraminifera, tiny marine creatures that form intricately shaped shells, known as “tests.” On our way to see the bird, we stopped at his office, which was filled with boxes of little glass tubes, each containing a sampling of tests that rattled like sprinkles when I picked it up. Guðmundsson told me that in his spare time he did translating. A few years ago he had completed the first Icelandic rendering of On the Origin of Species. He’d found Darwin’s prose quite difficult—“sentences inside sentences inside sentences”—and the book, Uppruni Tegundanna, had not sold well, perhaps because so many Icelanders are fluent in English.

We made our way to the storeroom for the institute’s collection. The stuffed tiger, wrapped in plastic, looked ready to lunge at the stuffed kangaroo. The great auk—Pinguinus impennis—was standing off by itself, in a specially made Plexiglas case. It was perched on a fake rock, next to a fake egg.

As the name suggests, the great auk was a large bird; adults grew to be more than two and a half feet tall. The auk could not fly—it was one of the few flightless birds of the Northern Hemisphere—and its stubby wings were almost comically undersized for its body. The auk in the case had brown feathers on its back; probably these were black when the bird was alive but had since faded. “UV light,” Guðmundsson said gloomily. “It destroys the plumage.” The auk’s chest feathers were white, and there was a white spot just beneath each eye. The bird had been stuffed with its most distinctive feature—its large, intricately grooved beak—tipped slightly into the air. This lent it a look of mournful hauteur.

Guðmundsson explained that the great auk had been on display in Reykjavik until 2008, when the institute was restructured by the Icelandic government. At that point, another agency was supposed to create a new home for the bird, but various mishaps, including Iceland’s financial crisis, had prevented this from happening, which is why Count Raben’s auk was sitting on its fake rock in the corner of the storeroom. On the rock, there was a painted inscription, which Guðmundsson translated for me: the bird who is here for show was killed in 1821. it is one of the few great auks that still exist.

In its heyday, which is to say, before humans figured out how to reach its nesting grounds, the great auk ranged from Norway over to Newfoundland and from Italy to Florida, and its population probably numbered in the millions. When the first settlers arrived in Iceland from Scandinavia, great auks were so common that they were regularly eaten for dinner, and their remains have been found in the tenth-century equivalent of household trash. While I was in Reykjavik, I visited a museum built over the ruins of what’s believed to be one of the most ancient structures in Iceland—a longhouse constructed out of strips of turf. According to one of the museum’s displays, the great auk was “easy prey” for Iceland’s medieval inhabitants. In addition to a pair of auk bones, the display featured a video re-creation of an early encounter between man and bird. In the video, a shadowy figure crept along a rocky shore toward a shadowy auk. When he drew close enough, the figure pulled out a stick and clubbed the animal over the head. The auk responded with a cry somewhere between a honk and a grunt. I found the video grimly fascinating and watched it play through half a dozen times. Creep, clobber, squawk. Repeat.

As best as can be determined, great auks lived much as penguins do. In fact, great auks were the original “penguins.” They were called this—the etymology of “penguin” is obscure and may or may not be traced to the Latin pinguis, meaning “fat”—by European sailors who encountered them in the North Atlantic. Later, when subsequent generations of sailors met similar-colored flightless birds in the Southern Hemisphere, they used the same name, which led to much confusion, since auks and penguins belong to entirely different families. (Penguins constitute their own family, while auks are members of the family that includes puffins and guillemots; genetic analysis has shown that razorbills are the great auk’s closest living relatives.)

Like penguins, great auks were fantastic swimmers—eyewitness accounts attest to the birds’ “astonishing velocity” in the water—and they spent most of their lives at sea. But during breeding season, in May and June, they waddled ashore in huge numbers, and here lay their vulnerability. Native Americans clearly hunted the great auk—one ancient grave in Canada was found to contain more than a hundred great auk beaks—as did paleolithic Europeans: great auk bones have been found at archaeological sites in, among other places, Denmark, Sweden, Spain, Italy, and Gibraltar. By the time the first settlers got to Iceland, many of its breeding sites had already been plundered and its range was probably much reduced. Then came the wholesale slaughter.

Lured by the rich cod fishery, Europeans began making regular voyages to Newfoundland in the early sixteenth century. Along the way, they encountered a slab of pinkish granite about fifty acres in area, which rose just above the waves. In the spring, the slab was covered with birds, standing, in a manner of speaking, shoulder to shoulder. Many of these were gannets and guillemots; the rest were great auks. The slab, about forty miles off Newfoundland’s northeast coast, became known as the Isle of Birds or, in some accounts, Penguin Island; today it is known as Funk Island. Toward the end of a long transatlantic journey, when provisions were running low, fresh meat was prized, and the ease with which auks could be picked off the slab was soon noted. In an account from 1534, the French explorer Jacques Cartier wrote that some of the Isle of Birds’ inhabitants were “as large as geese.”

They are always in the water, not being able to fly in the air, inasmuch as they have only small wings… with which… they move as quickly along the water as the other birds fly through the air. And these birds are so fat it is marvellous. In less than half an hour we filled two boats full of them, as if they had been stones, so that besides them which we did not eat fresh, every ship did powder and salt five or six barrels full of them.

A British expedition that landed on the island a few years later found it “full of great foules.” The men drove a “great number of the foules” into their ships and pronounced the results to be quite tasty—“very good and nourishing meat.” A 1622 account by a captain named Richard Whitbourne describes great auks being driven onto boats “by hundreds at a time as if God had made the innocency of so poor a creature to become such an admirable instrument for the sustenation of Man.”

Over the next several decades, other uses for the great auk were found besides “sustenation.” (As one chronicler observed, “the great auks of Funk Island were exploited in every way that human ingenuity could devise.”) Auks were used as fish bait, as a source of feathers for stuffing mattresses, and as fuel. Stone pens were erected on Funk Island—vestiges of these are still visible today—and the birds were herded into the enclosures until someone could find time to butcher them. Or not. According to an English seaman named Aaron Thomas, who sailed to Newfoundland on the HMS Boston:

If you come for their Feathers you do not give yourself the trouble of killing them, but lay hold of one and pluck the best of the Feathers. You then turn the poor Penguin adrift, with his skin half naked and torn off, to perish at his leisure.

There are no trees on Funk Island, and hence nothing to burn. This led to another practice chronicled by Thomas.

You take a kettle with you into which you put a Penguin or two, you kindle a fire under it, and this fire is absolutely made of the unfortunate Penguins themselves. Their bodys being oily soon produce a Flame.

It’s been estimated that when Europeans first landed at Funk Island, they found as many as a hundred thousand pairs of great auks tending to a hundred thousand eggs. (Probably great auks produced only one egg a year; these were about five inches long and speckled, Jackson Pollock–like, in brown and black.) Certainly the island’s breeding colony must have been a large one to persist through more than two centuries of depredation. By the late seventeen hundreds, though, the birds’ numbers were in sharp decline. The feather trade had become so lucrative that teams of men were spending the entire summer on Funk, scalding and plucking. In 1785, George Cartwright, an English trader and explorer, observed of these teams: “The destruction which they have made is incredible.” If a stop were not soon put to their efforts, he predicted, the great auk would soon “be diminished to almost nothing.”

Whether the teams actually managed to kill off every last one of the island’s auks or whether the slaughter simply reduced the colony to the point that it became vulnerable to other forces is unclear. (Diminishing population density may have made survival less likely for the remaining individuals, a phenomenon that’s known as the Allee effect.) In any event, the date that’s usually given for the extirpation of the great auk from North America is 1800. Some thirty years later, while working on The Birds of America, John James Audubon traveled to Newfoundland in search of great auks to paint from life. He couldn’t find any, and for his illustration had to make do with a stuffed bird from Iceland that had been acquired by a dealer in London. In his description of the great auk, Audubon wrote that it was “rare and accidental on the banks of Newfoundland” and that it was “said to breed on a rock on that island,” a curious contradiction since no breeding bird can be said to be “accidental.”

Once the Funk Island birds had been salted, plucked, and deep-fried into oblivion, there was only one sizable colony of great auks left in the world, on an island called the Geirfuglasker, or great auk skerry, which lay about thirty miles off southwestern Iceland’s Reykjanes Peninsula. Much to the auk’s misfortune, a volcanic eruption destroyed the Geirfuglasker in 1830. This left the birds one solitary refuge, a speck of an island known as Eldey. By this point, the great auk was facing a new threat: its own rarity. Skins and eggs were avidly sought by gentlemen, like Count Raben, who wanted to fill out their collections. It was in the service of such enthusiasts that the very last known pair of auks was killed on Eldey in 1844.

Before setting out for Iceland, I’d decided that I wanted to see the site of the auk’s last stand. Eldey is only about ten miles off the Reykjanes Peninsula, which is just south of Reykjavik. But getting out to the island proved to be way more difficult to arrange than I’d imagined. Everyone I contacted in Iceland told me that no one ever went there. Eventually, a friend of mine who’s from Iceland got in touch with his father, who’s a minister in Reykjavik, who contacted a friend of his, who runs a nature center in a tiny town on the peninsula called Sandgerði. The head of the nature center, Reynir Sveinsson, in turn, found a fisherman, Halldór Ármannsson, who said he’d be willing to take me, but only if the weather was fair; if it was rainy or windy, the trip would be too dangerous and nausea-inducing, and he wouldn’t want to risk it.

Fortunately, the weather on the day we’d fixed turned out to be splendid. I met Sveinsson at the nature center, which features an exhibit on a French explorer, Jean-Baptiste Charcot, who died when his ship, the infelicitously named Pourquoi-Pas, sank off Sandgerði in 1936. We walked over to the harbor and found Ármannsson loading a chest onto his boat, the Stella. He explained that inside the chest was an extra life raft. “Regulations,” he shrugged. Ármannsson had also brought along his fishing partner and a cooler filled with soda and cookies. He seemed pleased to be making a trip that didn’t involve cod.

We motored out of the harbor and headed south, around the Reykjanes Peninsula. It was clear enough that we could see the snow-covered peak of Snæfellsjökull, more than sixty miles away. (To English speakers, Snæfellsjökull is probably best known as the spot where in Jules Verne’s A Journey to the Center of the Earth the hero finds a tunnel through the globe.) Eldey, being much shorter than Snæfellsjökull, was not yet visible. Sveinsson explained that Eldey’s name means “fire island.” He said that although he’d spent his entire life in the area, he’d never before been out to it. He’d brought along a fancy camera and was shooting pictures more or less continuously.

As Sveinsson snapped away, I chatted with Ármannsson inside the Stella’s small cabin. I was intrigued to see that he had dramatically different colored eyes, one blue and one hazel. Usually, he told me, he fished for cod using a long line that extended six miles and trailed twelve thousand hooks. The baiting of the hooks was his father’s job, and it took nearly two days. A good catch could weigh more than seven metric tons. Often Ármannsson slept on the Stella, which was equipped with a microwave and two skinny berths.

After a while, Eldey appeared on the horizon. The island looked like the base of an enormous column, or like a giant pedestal waiting for an even more gigantic statue. When we got within maybe a mile, I could see that the top of the island, which from a distance appeared flat, was actually tilted at about a ten-degree angle. We were approaching from the shorter end, so we could look across the entire surface. It was white and appeared to be rippling. As we got closer, I realized that the ripples were birds—so many that they seemed to blanket the island—and when we got even closer, I could see that the birds were gannets—elegant creatures with long necks, cream-colored heads, and tapered beaks. Sveinsson explained that Eldey was home to one of the world’s largest colonies of northern gannets—some thirty thousand pairs. He pointed out a pyramid-like structure atop the island. This was a platform for a webcam that Iceland’s environmental agency had set up. It was supposed to stream a live feed of the gannets to bird-watchers, but it had not functioned as planned.

“The birds do not like this camera,” Sveinsson said. “So they fly over it and shit on it.” The guano from thirty thousand gannet pairs has given the island what looks like a coating of vanilla frosting.

Because of the gannets, and perhaps also because of the island’s history, visitors are not allowed to step onto Eldey without special (and hard-to-obtain) permits. When I first learned this, I was disappointed, but when we got right up to the island and I saw the way the sea beat against the cliffs, I felt relieved.

The last people to see great auks alive were around a dozen Icelanders who made the trip to Eldey by rowboat. They set out one evening in June 1844, rowed through the night, and reached the island the following morning. With some difficulty, three of the men managed to clamber ashore at the only possible landing spot: a shallow shelf of rock that extends from the island to the northeast. (A fourth man who was supposed to go with them refused to on the grounds that it was too dangerous.) By this point the island’s total auk population, probably never very numerous, appears to have consisted of a single pair of birds and one egg. On catching sight of the humans, the birds tried to run, but they were too slow. Within minutes, the Icelanders had captured the auks and strangled them. The egg, they saw, had been cracked, presumably in the course of the chase, so they left it behind. Two of the men were able to jump back into the boat; the third had to be hauled through the waves with a rope.

The details of the great auks’ last moments, including the names of the men who killed the birds—Sigurður Ísleifsson, Ketil Ketilsson, and Jón Brandsson—are known because fourteen years later, in the summer of 1858, two British naturalists traveled to Iceland in search of auks. The older of these, John Wolley, was a doctor and an avid egg collector; the younger, Alfred Newton, was a fellow at Cambridge and soon to be the university’s first professor of zoology. The pair spent several weeks on the Reykjanes Peninsula, not far from the site of what is now Iceland’s international airport, and during that time, they seem to have talked to just about everyone who had ever seen an auk, or even just heard about one, including several of the men who’d made the 1844 expedition. The pair of birds that had been killed in that outing, they discovered, had been sold to a dealer for the equivalent of about nine pounds. The birds’ innards had been sent to the Royal Museum in Copenhagen; no one could say what had happened to the skins. (Subsequent detective work has traced the skin of the female to an auk now on display at the Natural History Museum of Los Angeles.)

Wolley and Newton hoped to get out to Eldey themselves. Wretched weather prevented them. “Boats and men were engaged, and stores laid in, but not a single opportunity occurred when a landing would have been practicable,” Newton would later write. “It was with heavy hearts that we witnessed the season wearing away.”

Wolley died shortly after the pair returned to England. For Newton, the experience of the trip would prove to be life-altering. He concluded that the auk was gone—“for all practical purposes therefore we may speak of it as a thing of the past”—and he developed what one biographer referred to as a “peculiar attraction” to “extinct and disappearing faunas.” Newton realized that the birds that bred along Britain’s long coast were also in danger; he noted that they were being gunned down for sport in great numbers.

“The bird that is shot is a parent,” he observed in an address to the British Association for the Advancement of Science. “We take advantage of its most sacred instincts to waylay it, and in depriving the parent of life, we doom the helpless offspring to the most miserable of deaths, that by hunger. If this is not cruelty, what is?” Newton argued for a ban on hunting during breeding season, and his lobbying resulted in one of the first laws aimed at what today would be called wildlife protection: the Act for the Preservation of Sea Birds.

As it happens, Darwin’s first paper on natural selection appeared in print just as Newton was returning home from Iceland. The paper, in the Journal of the Proceedings of the Linnean Society, had—with Lyell’s help—been published in a rush soon after Darwin had learned that a young naturalist named Alfred Russel Wallace was onto a similar idea. (A paper by Wallace appeared in the same issue of the Journal.) Newton read Darwin’s essay very soon after it came out, staying up late into the night to finish it, and he immediately became a convert. “It came to me like the direct revelation of a higher power,” he later recalled, “and I awoke next morning with the consciousness that there was an end of all the mystery in the simple phrase, ‘Natural Selection.’” He had, he wrote to a friend, developed a case of “pure and unmitigated Darwinism.” A few years later, Newton and Darwin became correspondents—at one point Newton sent Darwin a diseased partridge’s foot that he thought might be of interest to him—and eventually the two men paid social calls on each other.

Whether the subject of the great auk ever came up in their conversations is unknown. It is not mentioned in Newton and Darwin’s surviving correspondence, nor does Darwin allude to the bird or its recent demise in any of his other writings. But Darwin had to be aware of human-caused extinction. In the Galápagos, he had personally witnessed, if not exactly a case of extinction in action, then something very close to it.

Darwin’s visit to the archipelago took place in the fall of 1835, nearly four years into the voyage of the Beagle. On Charles Island—now Floreana—he met an Englishman named Nicholas Lawson, who was the Galápagos’s acting governor as well as the warden of a small, rather miserable penal colony. Lawson was full of useful information. Among the facts he related to Darwin was that on each of the islands in the Galápagos the tortoises had different-shaped shells. On this basis, Lawson claimed that he could “pronounce from which island any tortoise may have been brought.” Lawson also told Darwin that the tortoises’ days were numbered. The islands were frequently visited by whaling ships, which carried the huge beasts off as portable provisions. Just a few years earlier, a frigate visiting Charles Island had left with two hundred tortoises stowed in its hold. As a result, Darwin noted in his diary, “the numbers have been much reduced.” By the time of the Beagle’s visit, tortoises had become so scarce on Charles Island that Darwin, it seems, did not see a single one. Lawson predicted that Charles’s tortoise, known today by the scientific name Chelonoidis elephantopus, would be entirely gone within twenty years. In fact, it probably disappeared in fewer than ten. (Whether Chelonoidis elephantopus was a distinct species or a subspecies is still a matter of debate.)

Darwin’s familiarity with human-caused extinction is also clear from On the Origin of Species. In one of the many passages in which he heaps scorn on the catastrophists, he observes that animals inevitably become rare before they become extinct: “we know this has been the progress of events with those animals which have been exterminated, either locally or wholly, through man’s agency.” It’s a brief allusion and, in its brevity, suggestive. Darwin assumes that his readers are familiar with such “events” and already habituated to them. He himself seems to find nothing remarkable or troubling about this. But human-caused extinction is of course troubling for many reasons, some of which have to do with Darwin’s own theory, and it’s puzzling that a writer as shrewd and self-critical as Darwin shouldn’t have noticed this.

In the Origin, Darwin drew no distinction between man and other organisms. As he and many of his contemporaries recognized, this equivalence was the most radical aspect of his work. Humans, just like any other species, were descended, with modification, from more ancient forebears. Even those qualities that seemed to set people apart—language, wisdom, a sense of right and wrong—had evolved in the same manner as other adaptive traits, such as longer beaks or sharper incisors. At the heart of Darwin’s theory, as one of his biographers has put it, is “the denial of humanity’s special status.”

And what was true of evolution should also hold for extinction, since according to Darwin, the latter was merely a side effect of the former. Species were annihilated, just as they were created, by “slow-acting and still existing causes,” which is to say, through competition and natural selection; to invoke any other mechanism was nothing more than mystification. But how, then, to make sense of cases like the great auk or the Charles Island tortoise or, to continue the list, the dodo or the Steller’s sea cow? These animals had obviously not been done in by a rival species gradually evolving some competitive advantage. They had all been killed off by the same species, and all quite suddenly—in the case of the great auk and the Charles Island tortoise over the course of Darwin’s own lifetime. Either there had to be a separate category for human-caused extinction, in which case people really did deserve their “special status” as a creature outside of nature, or space in the natural order had to be made for cataclysm, in which case, Cuvier—distressingly—was right.

Choose 2 or 3 of the following prompts to respond to:

  1. The Irony of Rarity
    Kolbert writes that once the great auk became rare, it faced “a new threat: its own rarity,” because collectors wanted specimens. In a paragraph or two, explain how this works as a kind of trap. Can you think of anything else—not necessarily an animal—where something becoming scarce makes people want it more, which makes it even scarcer?
  2. Captain Whitbourne’s Quote
    One of the historical figures describes great auks as proof that God made “the innocency of so poor a creature to become such an admirable instrument for the sustenation of Man.” What is Whitbourne assuming about the purpose of animals? How does the rest of the chapter challenge that assumption?
  3. Darwin’s Blind Spot
    Kolbert says it’s “puzzling” that Darwin didn’t notice a problem in his own theory. In your own words, explain the contradiction she’s pointing to. Why is human-caused extinction hard to fit into Darwin’s framework? Hint: think about what Darwin said humans are versus what they were actually doing.
  4. The Penguin Name Mix-Up
    Explain, in your own words, how great auks ended up connected to penguins even though they’re not related at all. What does this small story tell you about how humans name and categorize the natural world—do we do it based on what things actually are, or based on what they remind us of?
  5. Creative / Reflective Option
    Kolbert describes watching the museum video of a man clubbing an auk over and over: “Creep, clobber, squawk. Repeat.” Why do you think she watched it six times? Write a short paragraph about a time you couldn’t stop looking at or thinking about something that disturbed you, and try to explain what was pulling you back to it.
2. Health — Food, Glorious Food

Read this chapter from Bill Bryson’s The Body. No written response required today—just read it carefully. We’ll discuss it next time.

Read the chapter

Tell me what you eat, and I will tell you what you are.
—Jean Anthelme Brillat-Savarin, The Physiology of Taste

We all know that if we consume too much beer and cake and pizza and cheeseburgers and all the other things that make life frankly worth living, we will add pounds to our bodies because we have taken in too many calories. But what exactly are these little numerical oddments that are so keen to make us round and wobbly?

The calorie is a strange and complicated measure of food energy. Formally, it’s a kilocalorie, and it is defined as the amount of energy required to heat one kilogram of water by one degree centigrade, but it seems safe to say that no one ever thinks of it in those terms when deciding what foods to eat. Just how many calories each of us needs is pretty much a personal matter. Until 1964, the official guidance in the United States was for thirty-two hundred calories per day for a moderately active man and twenty-three hundred for a similarly disposed woman. Today those inputs have been reduced to about twenty-six hundred calories for a moderately active man and two thousand for a moderately active woman. That’s a big reduction. Over the course of a year, for a man that would be almost a quarter of a million fewer calories.

It probably won’t come as a surprise to hear that in fact the inputs have gone in exactly the other direction. Americans today consume about 25 percent more calories than they did in 1970 (and, let’s face it, we weren’t exactly going without in 1970).

The father of caloric measurement—indeed of modern food science—was the American academic Wilbur Olin Atwater. A devout and kindly man with a walrus mustache and a stout frame that showed he was no stranger to the larder himself, Atwater was born in 1844 in upstate New York, the son of a traveling Methodist preacher, and studied agricultural chemistry at Wesleyan University in Connecticut. On a study trip to Germany, he was introduced to the exciting new concept of the calorie and returned to America with an evangelical urge to bring scientific rigor to the infant science of nutrition. Taking a position as professor of chemistry at his alma mater, he embarked on a series of experiments to test every aspect of food science. Some of these experiments were a touch unorthodox, not to say risky. In one, he ate a fish poisoned with ptomaine to see what effect it would have on him. The effect was that it nearly killed him.

Atwater’s most celebrated project was the building of a contraption he called a respiratory calorimeter. This was a sealed chamber, not much larger than a large cupboard, in which subjects were confined for up to five days while Atwater and his helpers minutely measured various facets of their metabolism—inputs of food and oxygen, outputs of carbon dioxide, urea, ammonia, feces, and so on—and so calculated caloric intake. The work was so exacting it took up to sixteen people to read all the dials and perform the calculations. Most of the subjects were students, though the lab janitor, Swede Osterberg, was also sometimes drafted in; quite how voluntarily is unknown. Wesleyan’s president was mystified by the point of the calorimeter—the calorie was an entirely new concept, after all—and especially appalled at the cost, and ordered Atwater to take a 50 percent pay cut or hire an assistant at his own expense. Atwater chose the latter and, undeterred, worked out the calories and nutritional values of practically all known foods—some four thousand in all. In 1896, he produced his magnum opus, The Chemical Composition of American Food Materials, which remained the last word on diet and nutrition for a generation. For a time he was one of the most famous scientists, of any type, in America.

Much of what Atwater concluded was ultimately wrong, but that wasn’t really his fault. Nobody yet understood the concept of vitamins and minerals or even the need for a balanced diet. To Atwater and his contemporaries, all that made one food superior to another was how well it served as fuel. So he believed that fruits and vegetables provided comparatively little energy and needed to play no part in the average person’s diet. Instead, he suggested that we should eat a lot of meat—two pounds every day, 730 pounds a year. The average American today eats 268 pounds of meat a year, about a third of Atwater’s recommended amount, and most authorities say that is still too much.

Atwater’s most unsettling discovery—to himself as much as to the world at large—was that alcohol was an especially rich source of calories, and thus an efficient fuel. As the son of a clergyman and a teetotaler himself, he was appalled to report it, but as a diligent scientist he felt his first duty was to the truth, however awkward. In consequence, he was swiftly disowned by his own devoutly Methodist university and its already scornful president.

Before the controversy could be resolved, fate intervened. In 1904, Atwater suffered a massive stroke. He lingered for three years without recovering his faculties and died aged sixty-three, but his long efforts secured the calorie’s place at the heart of nutrition science, evidently for all time.

As a measure of dietary intake, the calorie has a number of failings. For one thing, it gives no indication of whether a food is actually good for you or not. The concept of “empty” calories was quite unknown in the early twentieth century. Nor does conventional calorie measurement account for how foods are absorbed as they pass through the body. A great many nuts, for instance, are less completely digested than other foods, which means that they yield fewer calories than they contain. You may eat 170 calories’ worth of almonds, but keep only 130 of them. The other 40 sluice through without, as it were, touching the sides.
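The almond example amounts to a simple absorption ratio; a minimal sketch, using only the passage’s own numbers:

```python
# Net calories from almonds: some calories pass through undigested.
eaten = 170       # calories' worth of almonds consumed
kept = 130        # calories the body actually absorbs

passed_through = eaten - kept      # the calories that "sluice through"
absorption_rate = kept / eaten     # fraction actually retained
print(passed_through)              # 40
print(round(absorption_rate, 2))   # 0.76
```

In other words, the conventional label overstates the almonds’ usable energy by roughly a quarter.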

By whatever means you measure it, we are pretty good at extracting energy from food, not because we have an especially dynamic metabolism but because of a trick we learned a very long time ago: cooking. No one knows even approximately when humans first began cooking food. We have good evidence that our ancestors were utilizing fire 300,000 years ago, but Richard Wrangham of Harvard, who has devoted much of his career to studying the matter, believes that our ancestors mastered fire a million and a half years before that—which is to say long before we were properly human.

Cooking confers all kinds of benefits. It kills toxins, improves taste, makes tough substances chewable, greatly broadens the range of what we can eat, and above all vastly boosts the amount of calories humans can derive from what they eat. It is widely believed now that cooked food gave us the energy to grow big brains and the leisure to put them to use.

But in order to cook food, you also need to be able to gather and prepare it efficiently, and that is what Daniel Lieberman of Harvard believes is at the heart of our becoming modern. “You can’t possibly have a large brain unless you’ve got the energy to fuel it,” he told me when we met in the autumn of 2018. “And in order to fuel it, you need to master hunting and gathering. That’s more challenging than people realize. It’s not just a question of picking berries or digging up tubers; it is a matter of processing foods—making them easier to eat and digest, and safer to eat—and that involves toolmaking and communication and cooperation. That is the essence of what drove the shift from primitive to modern humans.”

In nature, we actually starve pretty easily. We are incapable of deriving nutrition from most parts of most plants. In particular we cannot make use of cellulose, which is what plants primarily consist of. The few plants that we can eat are the ones we know as vegetables. Otherwise we are limited to eating a few botanical end products, such as seeds and fruits, and even many of those are poisonous to us. But we can benefit from a lot more foods by cooking them. A cooked potato, for instance, is about twenty times more digestible than a raw one.

Cooking frees up a lot of time for us. Other primates spend as many as seven hours a day just chewing. We don’t need to eat constantly to ensure our survival. Our tragedy, of course, is that we eat more or less constantly anyway.

The fundamental components of the human diet—the macronutrients: water, carbohydrates, fat, and protein—were recognized nearly two hundred years ago by an English chemist named William Prout, but it was even then clear that some other, more elusive elements were needed to produce a fully healthy diet. No one knew for the longest time exactly what these elements were, but it was evident that in their absence people were likely to suffer a deficiency disease like beriberi or scurvy.

We now know them, of course, as vitamins and minerals. Vitamins are simply organic chemicals—that is, from things that are or were once alive, like plants and animals—while minerals are inorganic and come from soil or water. Altogether there are about forty of these little particles that we must get from our foods because we cannot manufacture them for ourselves.

Vitamins are a surprisingly recent concept. A little over four years after Wilbur Atwater died, a Polish émigré chemist in London, Casimir Funk, came up with the notion of vitamins, though he called them “vitamines,” a contraction of “vital” and “amines” (amines being a type of organic compound). As it turned out, only some vitamins are amines, so the name was later shortened. (Other names were also tried, among them nutramines, food hormones, and accessory food factors, but failed to catch on.) Funk didn’t discover vitamins but merely speculated, correctly, as to their existence. But because no one could produce these strange elements, many authorities refused to accept their reality. Sir James Barr, president of the British Medical Association, dismissed them as “a figment of the imagination.”

The discovery and naming of vitamins didn’t begin until almost the 1920s and has been a checkered affair, to put it mildly. In the beginning, vitamins were named in more or less strict alphabetical order—A, B, C, D, and so on—but then the system began to fall apart. Vitamin B was discovered to be not one vitamin but several, and these were renamed B1, B2, B3, and so on up to B12. Then it was decided that the B vitamins weren’t so diverse after all, so some were eliminated and others reclassified, so that today we are left with six semi-sequential B vitamins: B1, B2, B3, B5, B6, and B12. Other vitamins came and went, so that the scientific literature is filled with a lot of what might be called ghost vitamins—M, P, PP, S, U, and several others. In 1935, a researcher in Copenhagen, Henrik Dam, discovered a vitamin that was central to blood coagulation and called it vitamin K (for the Danish koagulere). The next year, some other researchers came up with vitamin P (for “permeability”). The process hasn’t entirely settled down yet. Biotin, for instance, was for a time called vitamin H, but then became B7. Today it is mostly just called biotin.

Although Funk coined the term “vitamines,” and is thus often given credit for their discovery, most of the real work of determining the chemical nature of vitamins was done by others, in particular Sir Frederick Hopkins, who was awarded the Nobel Prize for his work in 1929—a fact that left Funk permanently embittered.

Even today vitamins are an ill-defined entity. The term describes thirteen chemical oddments that we need to function smoothly but are unable to manufacture for ourselves. Though we tend to think of them as closely related, they mostly have little in common apart from being useful to us. They are sometimes described as “hormones made outside the body,” which is a pretty good definition except that it is only partly true. Vitamin D, one of the most vital of all vitamins, can both be made in the body (where it really is a hormone) or be ingested (which makes it a vitamin again).

A good deal of what we know about vitamins and their mineral cousins is surprisingly recent. Choline, for instance, is a micronutrient you have probably never heard of. It has a central role in making neurotransmitters and keeping your brain running smoothly, but that has only been known since 1998. It is abundant in foods that we don’t generally eat a lot of—liver, Brussels sprouts, and lima beans, for instance—which doubtless explains why it is thought that some 90 percent of us have at least a moderate choline deficiency.

In the case of many micronutrients, scientists don’t know quite how much you need or even what they do for you when you get them. Bromine, for instance, is found throughout the body, but nobody is sure if it is there because the body needs it or is just a kind of accidental passenger. Arsenic is an essential trace element for some animals, but we don’t know if that includes humans. Chromium is definitely needed, but in such small amounts that it becomes toxic quite quickly. Chromium levels fall steadily as we age, but no one knows why they fall or what this indicates.

For nearly all vitamins and minerals, the risk of taking in too much is as great as the risk of getting too little. Vitamin A is needed for vision, for healthy skin, and for fighting infection, so it is vital to have it. Luckily, it is abundant in many common foods, like eggs and dairy products, so it’s easy to get more than enough. But there’s the rub. The recommended daily level is seven hundred micrograms for women and nine hundred for men; the upper limit for both is about three thousand micrograms, and exceeding that regularly can become risky. How many of us could begin to guess even roughly how close we are to getting the balance right? Iron similarly is vital for healthy red blood cells. Too little iron and you become anemic, but too much is toxic, and there are some authorities who believe that quite a number of people may be getting too much of it. Curiously, too much and too little iron produce the same symptom: lethargy. “Too much iron in the form of supplements can accumulate in our tissues causing our organs literally to rust,” Leo Zacharski of Dartmouth-Hitchcock Medical Center in New Hampshire told New Scientist in 2014. “It’s a far stronger risk factor than smoking for all sorts of clinical disorders,” he added.

In 2013, an editorial in the highly respected Annals of Internal Medicine, based on a study led by researchers at Johns Hopkins University, said that nearly everyone in high-income countries was sufficiently well nourished not to require vitamins or other health supplements and that we should stop wasting our money on them. The report came in for some swift and withering criticism, however.

Professor Meir Stampfer of the Harvard Medical School said it was regrettable that “such a poorly done paper would be published in a prominent journal.” According to the Centers for Disease Control, far from having plenty in our diet, some 90 percent of American adults don’t get the recommended daily dose of vitamins D and E and about half don’t get sufficient vitamin A. No less than 97 percent, according to the CDC, don’t get enough potassium, a vital electrolyte, which is particularly alarming because potassium helps to keep your heart beating smoothly and your blood pressure within tolerable limits. Having said that, there is often disagreement over what precisely we do need. In America, the daily recommended dose of vitamin E is fifteen milligrams, for instance, but in the U.K. it is three to four milligrams—a very considerable difference.

What can be said with some confidence is that many people have a faith in health supplements that goes some way beyond the fully rational. Americans can choose from among a truly staggering eighty-seven thousand different dietary supplements and we spend a no less impressive $40 billion a year on them.

The greatest of vitamin controversies was stirred up by the American chemist Linus Pauling (1901–94), who had the distinction of winning not one but two Nobel Prizes (for chemistry in 1954 and for peace eight years later). Pauling believed that massive doses of vitamin C were effective against colds, flu, and even some cancers. He took up to forty thousand milligrams of vitamin C daily (the recommended daily dose is sixty milligrams) and maintained that his large intake of vitamin C had kept his prostate cancer at bay for twenty years. He had no evidence for any of his claims, and all have been pretty well discredited by subsequent studies. Thanks to Pauling, to this day many people believe that taking a lot of vitamin C will help to get rid of a cold. It won’t.

Of all the many things we take in with our foods (salts, water, minerals, and so on), just three need to be altered as they proceed through the digestive tract: proteins, carbohydrates, and fats. Let’s look at them in turn.

Proteins

Proteins are complicated molecules. About a fifth of our body weight is made up of them. In simplest terms, a protein is a chain of amino acids. About a million different proteins have been identified so far, and nobody knows how many more are to be found. They are all made from just twenty amino acids, even though hundreds of amino acids exist in nature that could do the job just as well. Why evolution has wedded us to such a small number of amino acids is one of the great mysteries of biology. For all their importance, proteins are surprisingly ill-defined. Although all proteins are made from amino acids, there is no accepted definition as to how many amino acids you need in a chain to qualify as a protein. All that can be said is that a small but unspecified number of amino acids strung together is a peptide. Ten or twelve strung together is a polypeptide. When a polypeptide begins to get bigger than that, it becomes, at some ineffable point, a protein.

It is a slightly strange fact that we break down all the proteins we consume in order to reassemble them into new proteins, rather as if they were Lego toys. Eight of the twenty amino acids cannot be made in the body and must be consumed in the diet. If they are missing from the foods we eat, then certain vital proteins cannot be made. Protein deficiency is almost never a problem for people who eat meat, but it can be for vegetarians because not all plants provide all the necessary amino acids. It is interesting that most traditional diets in the world are based around combinations of plant products that do provide all the necessary amino acids. So people in Asia eat a lot of rice and soybeans, while indigenous Americans have long combined corn with black or pinto beans. This isn’t just a matter of taste, it seems, but an instinctive recognition of the need for a rounded diet.

Carbohydrates

Carbohydrates are compounds of carbon, hydrogen, and oxygen, which are bound together to form a variety of sugars—glucose, galactose, fructose, maltose, sucrose, deoxyribose (the stuff found in DNA), and so on. Some of these are chemically complex and known as polysaccharides, some are simple and known as monosaccharides, and some are in between and known as disaccharides. Although all are sugars, not all are sweet. Some, like the starches found in pasta and potatoes, are too big to activate the tongue’s sweet detectors. Virtually all carbohydrates in the diet come from plants, with one conspicuous exception: lactose, from milk.

We eat a lot of carbohydrates, but we use them up quickly, so the total amount in your body at any given time is modest—usually less than a pound. The main thing to bear in mind is that carbohydrates, upon being digested, are just more sugar—often quite a lot more. That means that a 150-gram serving of white rice or a small bowl of cornflakes will have the same effect on your blood glucose levels as nine teaspoons of sugar.

Fats

The third member of the trio, fats, are also made up of carbon, hydrogen, and oxygen, but in different proportions. This has the effect of making fat easier to store. When fats are broken down in the body, they are teamed up with cholesterol and proteins into new molecules called lipoproteins, which travel through the body via the bloodstream. Lipoproteins come in two principal types: high density and low density. Low-density lipoproteins are the ones frequently referred to as “bad cholesterol” because they tend to form plaque deposits on the walls of blood vessels. Cholesterol is not as fundamentally evil as we tend to think. Indeed, it is vital to a healthy life. Most of the cholesterol in your body is locked up in your cells, where it is doing useful work. Just a small part—about 7 percent—floats about in the bloodstream. Of that 7 percent, one-third is “good” cholesterol and two-thirds is “bad.”

So the trick with cholesterol is not to eliminate it but to maintain it at a healthy level. One way to do so is to eat a lot of fiber, or roughage. Fiber is the material in fruits, vegetables, and other plant foods that the body cannot fully break down. It contains no calories and no vitamins, but it helps to lower cholesterol and slows the rate at which sugar gets into the bloodstream and is then turned into fat by the liver, among many other benefits.

Carbohydrates and fats are the principal fuel reserves of the body, but they are stored and used in different ways. When the body needs fuel, it tends to burn up the available carbohydrates and store any spare fat. The main point to bear in mind—and you are no doubt well aware of it each time you take your shirt off—is that the human body likes to hold on to its fat. It burns some of the fat we consume for energy, but a good deal of the rest is sent off to tens of billions of tiny storage terminals called adipocytes, which exist all over the body. The upshot of all this is that the human body is designed to take in fuel, use what it needs, and store the rest to call on later as required. That makes it possible for us to be active for hours at a time without eating. Your body below the neck doesn’t do a lot of complicated thinking, and it is only too happy to hold on to any surplus fat you give it. It even rewards you for overeating with a lovely feeling of well-being.

Depending on where the fat ends up, it is known as subcutaneous (beneath the skin) or visceral (around the belly). For complex chemical reasons, visceral fat is much worse for you than the subcutaneous kind.

Fat comes in several varieties. “Saturated fat” sounds greasy and unhealthy, but in fact it is a technical description of carbon-hydrogen bonds rather than how much of it runs down your chin when you bite into it. As a rule, animal fats tend to be saturated and vegetable fats to be unsaturated, but there are many exceptions, and you can’t tell by looking whether a food is high in saturated fat or not. Who would guess, for instance, that an avocado has five times as much saturated fat as a small bag of potato chips? Or that a large latte has more than almost any pastry? Or that coconut oil is almost nothing but saturated fat?

Even more invidious are trans fats, an artificial form of fat made from vegetable oils. Invented by a German chemist in 1902, they were long thought of as a healthy alternative to butter or animal fat, but we now know the opposite to be true. Also known as hydrogenated oils, trans fats are much worse for your heart than any other kind of fat. They raise levels of bad cholesterol, lower levels of good cholesterol, and damage the liver. As Daniel Lieberman has rather chillingly put it, “Trans fats are essentially a form of slow-acting poison.”

As early as the mid-1950s, Fred A. Kummerow, a biochemist at the University of Illinois, reported clear evidence of a link between high intake of trans fats and clogged coronary arteries, but his findings were widely dismissed, not least because of lobbying by the food-processing industry. Not until 2004 did the American Heart Association finally accept that Kummerow was right, and not until 2015—almost sixty years after Kummerow first reported the dangers—did the Food and Drug Administration finally decree trans fats unsafe to eat. Despite their known dangers, it remained legal to add them to foods in America until July 2018.

Finally, we should say a word or two about the most vital of our macronutrients: water. We consume about two and a half quarts of water a day, though we are not generally aware of it because about half is contained within our foods. The conviction that we should all drink eight glasses of water a day is the most enduring of dietary misunderstandings. The idea has been traced to a 1945 paper from the U.S. Food and Nutrition Board, which noted that that was the amount that the average person consumed in a day. “What happened,” Dr. Stanley Goldfarb of the University of Pennsylvania told the BBC radio program More or Less in 2017, “was that people sort of confused the idea that this was the required intake. And the other confusion that occurred was then people said that it is not so much that you should take in eight ounces eight times a day, but that you should consume that in addition to whatever fluid you consume in association with your diet and your meals. And there was never any evidence for that.”

One other enduring myth concerning water intake is the belief that caffeinated drinks are diuretics and make you pee out more than you have taken in. They may not be the most wholesome of options for liquid refreshment, but they do make a net contribution to your personal water balance. Thirst, curiously, is not a reliable indication of how much water you need. People allowed to drink all the water they want after getting very thirsty usually report feeling slaked after drinking only one-fifth the amount they have lost through perspiration.

Drinking too much water can actually be dangerous. Normally, your body manages fluid balance very well, but occasionally people take in so much water that the kidneys cannot get rid of it fast enough and they end up dangerously diluting the sodium levels in their blood, setting off a condition known as hyponatremia. In 2007, a young woman in California named Jennifer Strange died after drinking six quarts of water in three hours in a clearly ill-judged water-drinking competition held by a local radio station. Similarly in 2014, a high school football player in Georgia, complaining of cramps after practice, downed two gallons of water and two of Gatorade and soon afterward fell into a coma and died.

Over a lifetime, we eat about sixty tons of food, which is equivalent, notes Carl Zimmer in Microcosm, to eating sixty small cars. In 1915, the average American spent half his weekly income on food. Today it’s just 6 percent. We live in a paradoxical situation. For centuries, people ate unhealthily out of economic necessity. Now we do it out of choice. We are in the historically extraordinary position that far more people on Earth suffer from obesity than from hunger. To be fair, it doesn’t take much to put on weight. One chocolate chip cookie a week, in the absence of any offsetting extra exercise, will translate into about two pounds of extra weight a year.
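The cookie claim follows from simple arithmetic, if we grant two illustrative figures not stated in the passage: the common rule of thumb that roughly 3,500 surplus calories correspond to one pound of body fat, and a modest cookie of about 135 calories.

```python
# One chocolate chip cookie a week -> about two pounds a year.
# Assumptions (not from the passage): ~3,500 kcal per pound of body fat
# (a common rule of thumb) and ~135 kcal per cookie.
CALS_PER_POUND = 3500
cookie_cals = 135

yearly_surplus = cookie_cals * 52            # 7,020 unoffset calories a year
pounds_gained = yearly_surplus / CALS_PER_POUND
print(round(pounds_gained, 1))               # 2.0
```

Under those assumptions, a single weekly cookie with no offsetting exercise does indeed come to about two pounds a year.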

It took a surprisingly long time to realize that a lot of the things we eat can make us seriously unhealthy. The person most responsible for our enlightenment was a nutritionist from the University of Minnesota named Ancel Keys.

Keys was born in 1904 into a moderately distinguished family in California (his uncle was the movie star Lon Chaney, to whom he bore a striking resemblance). He was a bright but undermotivated child. Professor Lewis Terman of Stanford, who studied intelligence in youngsters (he was responsible for putting “Stanford” in the Stanford-Binet IQ test), declared the young Keys a potential genius, but Keys chose not to fulfill his potential. Instead, he dropped out of school at fifteen and worked at a variety of exotic jobs, from sailor in the merchant navy to a shoveler of bat guano in Arizona. Only then did he belatedly embark on an academic career, but he made up for lost time in a big way, rapidly acquiring degrees in biology and economics from the University of California at Berkeley, a PhD in oceanography from the Scripps Institution in La Jolla, California, and a second PhD, in physiology, from Cambridge University in England. After settling briefly at Harvard, where he became a world authority on high altitude physiology, he was lured to the University of Minnesota to become the founding director of its Laboratory of Physiological Hygiene. There he rapidly became an expert on human nutrition. When America joined the Second World War, the War Department commissioned Keys to devise a lightweight food pack for paratroopers. The result was the imperishable army food known as K rations. The K stood for Keys.

In 1944, as much of Europe faced the prospect of starvation because of the disruptions and privations of war, Keys embarked on what became known as the Minnesota Starvation Experiment. He recruited thirty-six healthy male volunteers—all conscientious objectors—and for six months allowed them just two meager meals a day (one on Sundays) for a total daily intake of about 1,500 calories. Over the six months, the men’s average weight dropped from 152 pounds to 115. The idea of the experiment was to establish how well people could cope with the experience of chronic hunger and how well they would recover afterward. Essentially, it just confirmed what anyone could have guessed at the outset—that chronic hunger made the volunteers irritable, lethargic, and depressed, and left them more susceptible to illness. On the plus side, when their normal diet was resumed, they quickly recovered their lost weight and missing vitality. On the basis of the study, Keys produced a two-volume work, The Biology of Human Starvation, which was highly regarded, though not particularly timely. By the time it came out, in 1950, nearly everyone in Europe was well fed again and starvation was not an issue.

Soon afterward, Keys embarked on the study that would permanently seal his fame. The Seven Countries Study compared the dietary habits and health outcomes of 12,000 men in seven nations: Italy, Greece, the Netherlands, Yugoslavia, Finland, Japan, and the United States. Keys found a direct correlation between levels of dietary fat and heart disease—a conclusion that is hardly surprising now but was revolutionary then. In 1957, with his wife, Margaret, Keys produced a popular book called Eat Well and Stay Well, which promoted what we now know as the Mediterranean diet. The book infuriated the dairy and meat industries, but it made Keys rich and universally famous, and it marked a milestone in the history of dietary science. Before Keys, nutritional studies had been directed almost entirely at combating deficiency diseases. Now, people realized that too much nutrition could be as dangerous as too little.

Keys’s findings have come in for some sharp criticism over the years. One commonly heard complaint is that Keys focused on countries that supported his thesis and ignored those that did not. The French, for example, eat more cheese and drink more wine than almost anybody else on Earth and yet have some of the lowest rates of heart disease. This “French paradox,” as it is known, led Keys to exclude France from the study because it didn’t fit with his findings, critics have claimed. “When Keys didn’t like data,” says Lieberman, “he just eliminated them. By today’s standards he would have been accused and fired for scientific misconduct.”

Keys’s defenders have argued, however, that the French dietary anomaly wasn’t widely noted outside France until 1981, so Keys wouldn’t have known to include it. Whatever else anyone concludes, Keys surely deserves credit for drawing attention to the role of diet in maintaining heart health. And it must be said it did him no harm. Keys devoted himself to a Mediterranean-style diet long before anyone had heard of the term and lived to be a hundred. (He died in 2004.)

Keys’s findings have had a lasting effect on dietary recommendations. The official guidance in most countries is that fats should account for no more than 30 percent of a person’s daily diet, and saturated fats no more than 10 percent. The American Heart Association puts it even lower at 7 percent.

Now, however, we are not quite so sure how solid that advice is. In 2010, two large studies (in The American Journal of Clinical Nutrition and the Annals of Internal Medicine) involving almost a million people in eighteen countries concluded that there was no clear evidence that avoiding saturated fat reduced the risk of heart disease. A similar and more recent study in the British medical journal The Lancet in 2017 found that fat was “not associated with cardiovascular diseases, myocardial infarction, or cardiovascular disease mortality” and that dietary guidelines consequently needed to be reconsidered. Both conclusions have been heatedly disputed by some academics.

The problem with all dietary studies is that people eat foods that have oils, fats, good and bad cholesterol, sugars, salts, and chemicals of every description all mixed together in ways that make it impossible to attribute any particular outcome to any one input, and that is not to mention all the other factors that affect health: exercise, drinking habits, where you carry fat on your body, genetics, and much more.

These days the most frequently cited culprit for dietary concern is sugar. It has been linked to a lot of horrible diseases, notably diabetes, and there is no question that most of us take in way more sugar than we need. The average American consumes twenty-two teaspoons of added sugar a day. For young American men, it’s closer to forty. The World Health Organization recommends a maximum of five.

It doesn’t take much to go over the limit. A single standard-sized can of soda pop contains about 50 percent more sugar than the daily recommended maximum for an adult. One-fifth of all young people in America consume five hundred calories or more a day from soft drinks, which is all the more arresting when you realize that sugar isn’t actually very high in calories—just sixteen per teaspoon. You have to take in a lot of sugar to get a lot of calories. The problem is that we do take in a lot, more or less all the time, often when we don’t even know it. For one thing, nearly all processed foods include added sugar.
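The soda arithmetic in the last two paragraphs can be laid out explicitly, taking only the passage’s own figures (sixteen calories per teaspoon, a WHO maximum of five teaspoons, a can holding about 50 percent more than that):

```python
# Sugar arithmetic using the figures given in the passage.
CALS_PER_TSP = 16
who_max_tsp = 5                     # WHO recommended daily maximum
can_tsp = who_max_tsp * 1.5         # a can: ~50% more than the WHO max
soda_cals_per_day = 500             # heavy young consumers, per the passage

print(can_tsp)                                   # 7.5 teaspoons per can
print(can_tsp * CALS_PER_TSP)                    # 120.0 calories of sugar
print(round(soda_cals_per_day / CALS_PER_TSP))   # 31 teaspoons a day
```

Five hundred soft-drink calories a day thus works out to roughly thirty-one teaspoons of sugar, more than six times the WHO maximum.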

By one estimate, about half the sugar we consume is lurking in foods where we are not even aware of it—in breads, salad dressings, spaghetti sauces, ketchup, and other processed foods that don’t normally strike us as sugary. Altogether about 80 percent of the processed foods we eat contain added sugars. Heinz ketchup is almost one-quarter sugar. It has more sugar per unit of volume than Coca-Cola.

Complicating matters is that there is also a lot of sugar in the good stuff we eat. Your liver doesn’t know whether the sugar you consume comes from an apple or a candy bar. A sixteen-ounce bottle of Pepsi has about thirteen teaspoons of added sugar and no nutritive value at all. Three apples would give you just as much sugar but compensate by also giving you vitamins, minerals, and fiber, not to mention a greater feeling of satiation. That said, even the apples are a lot sweeter than they really need to be. As Lieberman has noted, modern fruits have been selectively bred to be vastly more sugary than they once were. The fruits that Shakespeare ate were, for the most part, probably no sweeter than the modern carrot.

Many of our fruits and vegetables are nutritionally less good for us than they were even in the fairly recent past. Donald Davis, a biochemist at the University of Texas, in 2011 compared the nutritive values of various foods in 1950 with those of our own era and found substantial drops in almost every type. Modern fruits, for instance, are almost 50 percent poorer in iron than they were in the early 1950s, and about 12 percent down in calcium and 15 percent in vitamin A. Modern agricultural practices, it turns out, focus on high yields and rapid growth at the expense of quality.

In the United States, we are left in the bizarre and paradoxical situation that we are essentially the world’s most overfed nation but also one of its most nutritionally deficient ones.

It is hard to know what to make of any of this. According to the Statistical Abstract of the United States, the amount of vegetables eaten by the average American between 2000 and 2010 dropped by thirty pounds. That seems an alarming decline until you realize that the most popular vegetable in America by a very wide margin is the French fry. (It accounts for a quarter of our entire vegetable intake.) These days, eating thirty pounds less “vegetables” may well be a sign of an improved diet.

A striking marker of just how confused nutrition advice can be was a finding by an advisory committee for the American Heart Association that 37 percent of American nutritionists rate coconut oil—which is essentially nothing but saturated fat in liquid form—as a “healthy food.” Coconut oil may be tasty, but it is no better for you than a big scoop of deep-fried butter.

“It is,” says Lieberman, “a reflection of how deficient dietary education can be. People just aren’t always getting the facts. It’s possible for doctors to go through medical school without being taught nutrition. That’s crazy.”

Perhaps nothing is more emblematic of the unsettled state of knowledge on the modern diet than the long and unresolved controversy over salt. Salt is vital to us. There is no question of that. We would die without it. That’s why we have taste buds devoted exclusively to it. Lack of salt is nearly as dangerous to us as lack of water. Because our bodies cannot produce salt, we must consume it in our diets. The problem is in determining how much is the right amount. Take too little and you grow lethargic and weak, and eventually you die. Take too much and your blood pressure soars and you run the risk of heart failure and stroke.

The average American consumes about 3,400 milligrams of sodium a day. It is very difficult not to. A lightish lunch of soup and a sandwich, none of it conspicuously salty, can easily push you over a full day’s recommended limit. Many authorities believe that 3,400 milligrams is way too much and that it vastly increases the risk of heart attack and stroke. The World Health Organization suggests that we consume no more than 2,000 milligrams of sodium a day. But other authorities say that reducing sodium intake to that level has no proven health benefit and may actually be harmful.

One study in Britain estimated that as many as 30,000 people a year died in the U.K. from consuming too much salt over too long a period, but another study concluded that salt did no harm to anyone except those with elevated blood pressure, and yet another concluded that people who ate a lot of salt actually lived longer. A meta-analysis at McMaster University in Canada of 133,000 people in four dozen countries found a link between high salt intake and heart problems only in those with existing hypertension, while low salt intake (less than three thousand milligrams a day) was associated with an increased risk of heart problems in people from both groups. In other words, according to the McMaster study, too little salt is at least as risky as too much.

It is easy to make risk sound scary. It is often written that eating a daily helping of processed meat increases your risk of colorectal cancer by 18 percent, which is doubtless true. But as Julia Belluz of Vox has pointed out, “A person’s lifetime risk of colorectal cancer is about 5 percent, and eating processed meat every day appears to boost a person’s absolute risk of cancer by 1 percentage point, to 6 percent (that’s 18 percent of the 5 percent lifetime risk).” So, put another way, if a hundred people eat a hot dog or bacon sandwich every day, over the course of a lifetime one of them will get colorectal cancer (in addition to the five who would have gotten it anyway). That’s not a risk you may want to take, but it’s not a death sentence.
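The relative-versus-absolute arithmetic in the passage can be checked in a couple of lines. The figures below are the passage’s own (~5 percent lifetime risk, an 18 percent relative increase), rounded as the passage rounds them:

```python
# Figures from the passage: ~5% lifetime colorectal-cancer risk,
# and an 18% *relative* increase from daily processed meat.
baseline = 0.05
relative_increase = 0.18

absolute = baseline * (1 + relative_increase)
print(f"{absolute:.1%}")  # 5.9% -- the passage's "about 6 percent"

# Per 100 lifetime eaters: ~5 cases anyway, plus about 1 extra
print(round(baseline * relative_increase * 100, 1))  # 0.9
```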

It is important to distinguish between probability and destiny. Just because you are obese or a smoker or a couch potato doesn’t mean you are doomed to die before your time, or that if you follow an ascetic regime you will avoid peril. Roughly 40 percent of people with diabetes, chronic hypertension, or cardiovascular disease were fit as a fiddle before they got ill, and roughly 20 percent of people who are severely overweight live to a ripe old age without ever doing anything about it. Just because you exercise regularly and eat a lot of salad doesn’t mean you have bought yourself a better life span. What you have bought is a better chance of having a better life span.

The most prudent option, it seems, is to have a balanced and moderate diet. A sensible approach is, in short, the sensible approach.

3
Science — Carbon Dioxide and Photosynthesis

30 minutes

Part 1: The Cycle (~10 min)

Every time you exhale, you release carbon dioxide. Every green plant around you is pulling that CO₂ out of the air and using it to make food for itself. This is photosynthesis—and it’s essentially the reason there’s oxygen for you to breathe in the first place.

Here’s the basic equation:

Carbon dioxide + Water + Light energy → Glucose + Oxygen

Or, in balanced chemical form: 6CO₂ + 6H₂O + light energy → C₆H₁₂O₆ + 6O₂

In your own words, explain what this means. Don’t just restate the formula—describe what’s actually happening. Where does each ingredient come from? Where does each product go? What’s the “point” of the whole process from the plant’s perspective?

Part 2: The Other Direction (~10 min)

Photosynthesis pulls CO₂ out of the atmosphere. But CO₂ also gets put back in—by animals breathing, by fires burning, by cars running, by decomposition.

Think of this as a balance. Draw or describe a simple diagram showing CO₂ moving back and forth between the atmosphere and living things. Include at least four sources that add CO₂ to the air and at least two processes that remove it. Then answer: what happens to the balance if one side starts outpacing the other? What real-world situation does that describe?
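One way to picture the balance is as a simple budget, with sources on one side and sinks on the other. The numbers below are invented purely for illustration; only the bookkeeping matters:

```python
# Toy carbon budget: sources add CO2 to the air, sinks remove it.
# All figures are made up -- the structure is the point.
sources = {"animal respiration": 5, "fires": 2, "cars": 3, "decomposition": 4}
sinks = {"photosynthesis": 9, "ocean absorption": 4}

net = sum(sources.values()) - sum(sinks.values())
print(net)  # 1 -> positive means CO2 is accumulating in the atmosphere
```

When the two sides match, net is 0 and atmospheric CO₂ holds steady.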

Part 3: One Plant, One Day (~10 min)

Pick a specific plant—a tree outside, a houseplant, a crop plant, whatever you want. Write a paragraph describing what that plant does over the course of a single sunny day, from the perspective of the chemistry you just learned. Where is it getting its CO₂? What’s it doing with the glucose it produces? What happens when the sun goes down—does photosynthesis just stop? (Look that last one up if you’re not sure—the answer is interesting.)

4
Math — Greek Logic and the Art of Proof

30 minutes

Part 1: Deductive Reasoning (~10 min)

Aristotle formalized something called a syllogism—a chain of reasoning where if the first two statements are true, the conclusion must be true. The classic example:

All men are mortal. Socrates is a man. Therefore, Socrates is mortal.

That one’s easy. Try these—for each, decide whether the conclusion is logically valid (meaning it must follow from the premises), and if it isn’t, explain why not:

  1. All birds have wings. Penguins are birds. Therefore, penguins have wings.
  2. All cats have tails. My dog has a tail. Therefore, my dog is a cat.
  3. No fish can breathe air. A whale breathes air. Therefore, a whale is not a fish.
  4. Some athletes are tall. Marcus is tall. Therefore, Marcus is an athlete.
  5. All squares are rectangles. All rectangles have four sides. Therefore, all squares have four sides.
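For the tricky ones, a set picture can make validity concrete. Here is pattern 2 modeled with Python sets; the animal names are made up for illustration:

```python
# Model "All cats have tails" as a subset relation, then show that
# "my dog has a tail" does not force "my dog is a cat."
cats = {"whiskers", "tom"}
things_with_tails = {"whiskers", "tom", "rex"}  # rex is a dog

# Premise 1: all cats have tails (cats is a subset of things_with_tails)
print(cats <= things_with_tails)  # True

# Premise 2: rex has a tail
print("rex" in things_with_tails)  # True

# The conclusion "rex is a cat" does not follow -- rex is a counterexample
print("rex" in cats)  # False
```

An invalid syllogism is one where you can build a world like this in which both premises hold and the conclusion still fails.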

Now write one valid syllogism and one invalid syllogism of your own. Try to make the invalid one tricky—something that sounds right but isn’t.

Part 2: Proof by Contradiction (~10 min)

The Greeks loved a technique called reductio ad absurdum—“reduction to absurdity.” The idea: if you want to prove something is true, assume the opposite and show that it leads to something impossible.

Here’s a famous example. Euclid wanted to prove that there are infinitely many prime numbers. So he assumed the opposite—suppose there are only finitely many primes, and you could make a complete list of all of them. Now multiply every prime on your list together and add 1. The resulting number isn’t divisible by any prime on your list, because dividing it by any of them leaves a remainder of 1. So either this new number is itself prime, or it’s divisible by some prime you missed. Either way, your “complete list” wasn’t complete. Contradiction. Therefore, there must be infinitely many primes.
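You can watch Euclid’s construction work with a few lines of Python. The six starting primes here are just an arbitrary stand-in for the supposedly complete list:

```python
# Euclid's construction: take a supposedly complete list of primes,
# multiply them all together, and add 1.
primes = [2, 3, 5, 7, 11, 13]

product = 1
for p in primes:
    product *= p

n = product + 1
print(n)  # 30031

# Dividing n by any prime on the list leaves remainder 1,
# so none of them divides it.
print([n % p for p in primes])  # [1, 1, 1, 1, 1, 1]

# n isn't prime itself: it factors as 59 * 509, two primes
# the "complete" list missed. Either way, the list was incomplete.
print(n == 59 * 509)  # True
```

Try deleting the `+ 1` and re-running the remainders line to see what changes.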

In your own words, explain why the “add 1” step is the key to the whole argument. What would go wrong if Euclid just multiplied all the primes together without adding 1?

Part 3: The Limits of Logic (~10 min)

Here’s a puzzle the Greeks argued about and never fully resolved. It’s called the Liar’s Paradox:

“This statement is false.”

If it’s true, then what it says must be the case—but it says it’s false. If it’s false, then the opposite of what it says must be true—which means it’s true. It can’t be either.

Write a paragraph about why this is a problem for a system built on logic. Does every statement have to be either true or false? Can you think of any other statements or situations—doesn’t have to be math—where something seems to be both true and not true at the same time, or where the categories just don’t work?

5
Integrated Tech — Audit Your Own Digital Security

30 minutes

Part 1: Password Check (~10 min)

Pick three accounts you actually use (email, gaming, school, whatever). For each one, ask yourself: Am I reusing this password somewhere else? Is it something someone could guess—a pet’s name, a birthday, “password123”? Is two-factor authentication available, and if so, is it turned on?

Then do something about it. Change at least one weak or reused password. If 2FA is available and off, turn it on for at least one account. If you don’t use a password manager, spend a few minutes finding out what one is and how it works.

Part 2: Permissions Audit (~10 min)

Go to the privacy or permissions settings on your phone or main device. Look at which apps have access to your location, camera, microphone, contacts, and photos. For each one, ask: does this app actually need this to do what I use it for?

Revoke at least two permissions that don’t make sense. Delete at least one app you no longer use that still has broad access to your device.

Part 3: Reflection (~10 min)

Write a solid paragraph (5–10 sentences). What did you find when you looked at your own security? Were there things you expected, or anything that surprised you? What specific changes did you make, and why? If you were going to explain one thing you learned today to someone less tech-savvy—a grandparent, a younger sibling—what would it be and how would you explain it?