A surprising food may have been a staple of the real Paleo diet: rotten meat

In a book about his travels in Africa published in 1907, British explorer Arnold Henry Savage Landor recounted witnessing an impromptu meal that his companions relished but that he found unimaginably revolting.

As he coasted down a river in the Congo Basin with several local hunter-gatherers, a dead rodent floated near their canoe. Its decomposing body had bloated to the size of a small pig.

Stench from the swollen corpse left Landor gasping for breath. Unable to speak, he tried to signal his companions to steer the canoe away from the fetid creature. Instead, they hauled the supersize rodent aboard and ate it.
“The odour when they dug their knives into it was enough to kill the strongest of men,” Landor wrote. “When I recovered, my admiration for the digestive powers of these people was intense. They were smacking their lips and they said the [rodent] had provided most excellent eating.”

Starting in the 1500s, European and, later, American explorers, traders, missionaries, government officials and others who lived among Indigenous peoples in many parts of the world wrote of similar food practices. Hunter-gatherers and small-scale farmers everywhere commonly ate putrid meat, fish and fatty parts of a wide range of animals. From arctic tundra to tropical rainforests, native populations consumed rotten remains raw, fermented or cooked just enough to singe off fur and create a more chewable texture. Many groups treated maggots as a meaty bonus.

Descriptions of these practices, which still occur in some present-day Indigenous groups and among northern Europeans who occasionally eat fermented fish, aren’t likely to inspire any new Food Network shows or cookbooks from celebrity chefs.

Case in point: Some Indigenous communities feasted on huge decomposing beasts, including hippos that had been trapped in dug-out pits in Africa and beached whales on Australia’s coast. Hunters in those groups typically smeared themselves with the fat of the animal before gorging on greasy innards. After slicing open animals’ midsections, both adults and children climbed into massive, rotting body cavities to remove meat and fat.

Or consider that Native Americans in Missouri in the late 1800s made a prized soup from the greenish, decaying flesh of dead bison. Animal bodies were buried whole in winter and unearthed in spring after ripening enough to achieve peak tastiness.

But such accounts provide a valuable window into a way of life that existed long before Western industrialization and the war against germs went global, says anthropological archaeologist John Speth of the University of Michigan in Ann Arbor. Intriguingly, no reports of botulism and other potentially fatal reactions to microorganisms festering in rotting meat appear in writings about Indigenous groups before the early 1900s. Instead, decayed flesh and fat represented valued and tasty parts of a healthy diet.
Many travelers such as Landor considered such eating habits to be “disgusting.” But “a gold mine of ethnohistorical accounts makes it clear that the revulsion Westerners feel toward putrid meat and maggots is not hardwired in our genome but is instead culturally learned,” Speth says.

This dietary revelation also challenges an influential scientific idea that cooking originated among our ancient relatives as a way to make meat more digestible, thus providing a rich calorie source for brain growth in the Homo genus. It’s possible, Speth argues, that Stone Age hominids such as Neandertals first used cooking for certain plants that, when heated, provided an energy-boosting carbohydrate punch to the diet. Animals held packets of fat and protein that, after decay set in, rounded out nutritional needs without needing to be heated.
Putrid foods in the diets of Indigenous peoples
Speth’s curiosity about a human taste for putrid meat was originally piqued by present-day hunter-gatherers in polar regions. North American Inuit, Siberians and other far-north populations still regularly eat fermented or rotten meat and fish.

Fermented fish heads, also known as “stinkhead,” are one popular munchy among northern groups. Chukchi herders in the Russian Far East, for instance, bury whole fish in the ground in early fall and let the bodies naturally ferment during periods of freezing and thawing. Fish heads the consistency of hard ice cream are then unearthed and eaten whole.

Speth has suspected for several decades that consumption of fermented and putrid meat, fish, fat and internal organs has a long and probably ancient history among northern Indigenous groups. Consulting mainly online sources such as Google Scholar and universities’ digital library catalogs, he found many ethnohistorical descriptions of such behavior going back to the 1500s. Putrid walrus, seals, caribou, reindeer, musk oxen, polar bears, moose, arctic hares and ptarmigans had all been fair game. Speth reported much of this evidence in 2017 in PaleoAnthropology.

In one recorded incident from late-1800s Greenland, a well-intentioned hunter brought what he had claimed in advance was excellent food to a team led by American explorer Robert Peary. A stench filled the air as the hunter approached Peary’s vessel carrying a rotting seal dripping with maggots. The Greenlander had found the seal where a local group had buried it, possibly a couple of years earlier, so that the body could reach a state of tasty decomposition. Peary ordered the man to keep the reeking seal off his boat.

Miffed at this unexpected rejection, the hunter “told us that the more decayed the seal the finer the eating, and he could not understand why we should object,” Peary’s wife wrote of the encounter.

Even in temperate and tropical areas, where animal bodies decompose within hours or days, Indigenous peoples have appreciated rot as much as Peary’s seal-delivery man did. Speth and anthropological archaeologist Eugène Morin of Trent University in Peterborough, Canada, described some of those obscure ethnohistorical accounts last October in PaleoAnthropology.
Early hominids may have scavenged rotten meat
These accounts undermine some of scientists’ food-related sacred cows, Speth says. For instance, European explorers and other travelers consistently wrote that traditional groups not only ate putrid meat raw or lightly cooked but suffered no ill aftereffects. A protective gut microbiome may explain why, Speth suspects. Indigenous peoples encountered a variety of microorganisms from infancy on, unlike people today who grow up in sanitized settings. Early exposures to pathogens may have prompted the development of an array of gut microbes and immune responses that protected against potential harms of ingesting putrid meat.

That idea requires further investigation; little is known about the bacterial makeup of rotten meat eaten by traditional groups or of their gut microbiomes. But studies conducted over the last few decades do indicate that putrefaction, the process of decay, offers many of cooking’s nutritional benefits with far less effort. Putrefaction predigests meat and fish, softening the flesh and chemically breaking down proteins and fats so they are more easily absorbed and converted to energy by the body.

Given the ethnohistorical evidence, hominids living 3 million years ago or more could have scavenged meat from decomposing carcasses, even without stone tools for hunting or butchery, and eaten their raw haul safely long before fire was used for cooking, Speth contends. If simple stone tools appeared as early as 3.4 million years ago, as some researchers have controversially suggested, those implements may have been made by hominids seeking raw meat and marrow (SN: 9/11/10, p. 8). Researchers suspect regular use of fire for cooking, light and warmth emerged no earlier than around 400,000 years ago (SN: 5/5/12, p. 18).

“Recognizing that eating rotten meat is possible, even without fire, highlights how easy it would have been to incorporate scavenged food into the diet long before our ancestors learned to hunt or process [meat] with stone tools,” says paleoanthropologist Jessica Thompson of Yale University.

Thompson and colleagues suggested in Current Anthropology in 2019 that before about 2 million years ago, hominids were primarily scavengers who used rocks to smash open animal bones and eat nutritious, fat-rich marrow and brains. That conclusion, stemming from a review of fossil and archaeological evidence, challenged a common assumption that early hominids — whether as hunters or scavengers — primarily ate meat off the bone.

Certainly, ancient hominids were eating more than just the meaty steaks we think of today, says archaeologist Manuel Domínguez-Rodrigo of Rice University in Houston. In East Africa’s Olduvai Gorge, butchered animal bones at sites dating to nearly 2 million years ago indicate that hominids ate most parts of carcasses, including brains and internal organs.

“But Speth’s argument about eating putrid carcasses is very speculative and untestable,” Domínguez-Rodrigo says.

Untangling whether ancient hominids truly had a taste for rot will require research that spans many fields, including microbiology, genetics and food science, Speth says.

But if his contention holds up, it suggests that ancient cooks were not turning out meat dishes. Instead, Speth speculates, cooking’s primary value at first lay in making starchy and oily plants softer, more chewable and easily digestible. Edible plants contain carbohydrates, sugar molecules that can be converted to energy in the body. Heating over a fire converts starch in tubers and other plants to glucose, a vital energy source for the body and brain. Crushing or grinding of plants might have yielded at least some of those energy benefits to hungry hominids who lacked the ability to light fires.

Whether hominids controlled fire well enough to cook plants or any other food regularly before around 400,000 to 300,000 years ago is unknown.
Neandertals may have hunted animals for fat
Despite their nutritional benefits, plants often get viewed as secondary menu items for Stone Age folks. It doesn’t help that plants preserve poorly at archaeological sites.

Neandertals, in particular, have a long-standing reputation as plant shunners. Popular opinion views Neandertals as burly, shaggy individuals who huddled around fires chomping on mammoth steaks.

That’s not far from an influential scientific view of what Neandertals ate. Elevated levels of a diet-related form of nitrogen in Neandertal bones and teeth hint that they were committed carnivores, eating large amounts of protein-rich lean meat, several research teams have concluded over the past 30 years.

But consuming that much protein from meat, especially from cuts above the front and hind limbs now referred to as steaks, would have been a recipe for nutritional disaster, Speth argues. Meat from wild, hoofed animals and smaller creatures such as rabbits contains almost no fat, or marbling, unlike meat from modern domestic animals, he says. Ethnohistorical accounts, especially for northern hunters including the Inuit, include warnings about weight loss, ill health and even death that can result from eating too much lean meat.

This form of malnutrition is known as rabbit starvation. Evidence indicates that people can safely consume between about 25 and 35 percent of daily calories as protein, Speth says. Above that threshold, several investigations have indicated that the liver becomes unable to break down chemical wastes from ingested proteins, which then accumulate in the blood and contribute to rabbit starvation. Limits to the amount of daily protein that can be safely consumed meant that ancient hunting groups, like those today, needed animal fats and carbohydrates from plants to fulfill daily calorie and other nutritional needs.
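The protein ceiling described above comes down to simple arithmetic. A minimal sketch, assuming a hypothetical 2,500-kilocalorie daily budget and the standard conversion of about 4 kilocalories per gram of protein (the function name and the example figures are illustrative, not from Speth's work):

```python
# Rough illustration of the "rabbit starvation" protein ceiling.
# Assumes a 2,500-kcal daily budget and ~4 kcal per gram of protein.
DAILY_KCAL = 2500
KCAL_PER_G_PROTEIN = 4

def max_safe_protein_grams(daily_kcal, ceiling=0.35):
    """Grams of protein per day at the upper safe bound (~35% of calories)."""
    return daily_kcal * ceiling / KCAL_PER_G_PROTEIN

# At 35 percent of 2,500 kcal, that works out to roughly 219 g of protein
# per day; every calorie beyond the ceiling must come from fat or carbohydrate.
print(round(max_safe_protein_grams(DAILY_KCAL)))  # → 219
```

On these assumptions, a hunter eating only lean wild meat would hit the ceiling well short of a full day's calories, which is why the accounts describe groups seeking out fat and plant carbohydrates to make up the difference.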

Modern “Paleo diets” emphasize eating lean meats, fruits and vegetables. But that omits what past and present Indigenous peoples most wanted from animal carcasses. Accounts describe Inuit people eating much larger amounts of fatty body parts than lean meat, Speth says. Over the last few centuries, they have favored tongue, fat deposits, brisket, ribs, fatty tissue around intestines and internal organs, and marrow. Internal organs, especially adrenal glands, have provided vitamin C — nearly absent in lean muscle — that prevented anemia and other symptoms of scurvy.

Western explorers noted that the Inuit also ate chyme, the stomach contents of reindeer and other plant-eating animals. Chyme provided at least a side course of plant carbohydrates. Likewise, Neandertals in Ice Age Europe probably subsisted on a fat- and chyme-supplemented diet (SN Online: 10/11/13), Speth contends.

Large numbers of animal bones found at northern European Neandertal sites — often viewed as the residue of ravenous meat eaters — may instead reflect overhunting of animals to obtain enough fat to meet daily calorie needs. Because wild game typically has a small percentage of body fat, northern hunting groups today and over the last few centuries frequently killed prey in large numbers, either discarding most lean meat from carcasses or feeding it to their dogs, ethnographic studies show.

If Neandertals followed that playbook, eating putrid foods might explain why their bones carry a carnivore-like nitrogen signature, Speth suggests. An unpublished study of decomposing human bodies kept at a University of Tennessee research facility in Knoxville called the Body Farm tested that possibility. Biological anthropologist Melanie Beasley, now at Purdue University in West Lafayette, Ind., found moderately elevated tissue nitrogen levels in 10 deceased bodies sampled regularly for about six months. Tissue from those bodies served as a stand-in for animal meat consumed by Neandertals. Human flesh is an imperfect substitute for, say, reindeer or elephant carcasses. But Beasley’s findings suggest that decomposition’s effects on a range of animals need to be studied. Intriguingly, she also found that maggots in the decaying tissue displayed extremely elevated nitrogen levels.

Paleobiologist Kimberly Foecke of George Washington University in Washington, D.C., has also found high nitrogen levels in rotting, maggot-free cuts of beef from animals fed no hormones or antibiotics to approximate the diets of Stone Age creatures (SN: 1/2/19).

Like arctic hunters did a few hundred years ago, Neandertals may have eaten putrid meat and fish studded with maggots, Speth says. That would explain elevated nitrogen levels in Neandertal fossils.

But Neandertal dining habits are poorly understood. Unusually extensive evidence of Neandertal big-game consumption has come from a new analysis of fossil remains at a roughly 125,000-year-old site in northern Germany called Neumark-Nord. There, Neandertals periodically hunted straight-tusked elephants weighing up to 13 metric tons, say archaeologist Sabine Gaudzinski-Windheuser of Johannes Gutenberg University of Mainz in Germany and colleagues.

In a study reported February 1 in Science Advances, her group analyzed patterns of stone-tool incisions on bones of at least 57 elephants from 27 spots near an ancient lake basin where Neandertals lit campfires and constructed shelters (SN: 1/29/22, p. 8). Evidence suggests that Neandertal butchers — much like Inuit hunters — removed fat deposits under the skin and fatty body parts such as the tongue, internal organs, brain and thick layers of fat in the feet. Lean meat from elephants would have been eaten in smaller quantities to avoid rabbit starvation, the researchers argue.

Further research needs to examine whether the Neandertals cooked elephant meat or boiled the bones to extract nutritious grease, Speth says. Mealtime options would have expanded for hominids who could not only consume putrid meat and fat but also heat animal parts over fires, he suspects.

Neandertals who hunted elephants must also have eaten a variety of plants to meet their considerable energy requirements, says Gaudzinski-Windheuser. But so far, only fragments of burned hazelnuts, acorns and blackthorn plums have been found at Neumark-Nord.
Neandertals probably carb-loaded
Better evidence of Neandertals’ plant preferences comes from sites in warm Mediterranean and Middle Eastern settings. At a site in coastal Spain, Neandertals probably ate fruits, nuts and seeds of a variety of plants (SN: 3/27/21, p. 32).

Neandertals in a range of environments must have consumed lots of starchy plants, argues archaeologist Karen Hardy of the University of Glasgow in Scotland. Even Stone Age northern European and Asian regions included plants with starch-rich appendages that grew underground, such as tubers.

Neandertals could also have obtained starchy carbs from the edible, inner bark of many trees and from seaweed along coastlines. Cooking, as suggested by Speth, would have greatly increased the nutritional value of plants, Hardy says. Not so for rotten meat and fat, though Neandertals such as those at Neumark-Nord may have cooked what they gleaned from fresh elephant remains.

There is direct evidence that Neandertals munched on plants. Microscopic remnants of edible and medicinal plants have been found in the tartar on Neandertal teeth (SN: 4/1/17, p. 16), Hardy says.

Carbohydrate-fueled energy helped to maintain large brains, enable strenuous physical activity and ensure healthy pregnancies for both Neandertals and ancient Homo sapiens, Hardy concludes in the January 2022 Journal of Human Evolution. (Researchers disagree over whether Neandertals, which lived from around 400,000 to 40,000 years ago, were a variant of H. sapiens or a separate species.)
Paleo cuisine was tasty
Like Hardy, Speth suspects that plants provided a large share of the energy and nutrients Stone Age folks needed. Plants represented a more predictable, readily available food source than hunted or scavenged meat and fat, he contends.

Plants also offered Neandertals and ancient H. sapiens — whose diets probably didn’t differ dramatically from Neandertals’, Hardy says — a chance to stretch their taste buds and cook up tangy meals.

Paleolithic plant cooking included preplanned steps aimed at adding dashes of specific flavors to basic dishes, a recent investigation suggests. In at least some places, Stone Age people apparently cooked to experience pleasing tastes and not just to fill their stomachs. Charred plant food fragments from Shanidar Cave in Iraqi Kurdistan and Franchthi Cave in Greece consisted of crushed pulse seeds, possibly from starchy pea species, combined with wild plants that would have provided a pungent, somewhat bitter taste, microscopic analyses show.

Added ingredients included wild mustard, wild almonds, wild pistachio and fruits such as hackberry, archaeobotanist Ceren Kabukcu of the University of Liverpool in England and colleagues reported last November in Antiquity.

Four Shanidar food bits date to about 40,000 years ago or more and originated in sediment that included stone tools attributed to H. sapiens. Another food fragment, likely from a cooked Neandertal meal, dates to between 70,000 and 75,000 years ago. Neandertal fossils found in Shanidar Cave are also about 70,000 years old. So it appears that Shanidar Neandertals spiced up cooked plant foods before Shanidar H. sapiens did, Kabukcu says.

Franchthi food remains date to between 13,100 and 11,400 years ago, when H. sapiens lived there. Wild pulses in food from both caves display microscopic signs of having been soaked, a way to dilute poisons in seeds and moderate their bitterness.

These new findings “suggest that cuisine, or the combination of different ingredients for pleasure, has a very long history indeed,” says Hardy, who was not part of Kabukcu’s team.

There’s a hefty dollop of irony in the possibility that original Paleo diets mixed what people in many societies today regard as gross-sounding portions of putrid meat and fat with vegetarian dishes that still seem appealing.

Add beer to the list of foods threatened by climate change

Beer lovers could be left with a sour taste, thanks to the latest in a series of studies mapping the effects of climate change on crops.

Malted barley — a key ingredient in beer including IPAs, stouts and pilsners — is particularly sensitive to warmer temperatures and drought, both of which are likely to increase due to climate change. As a result, average global barley crop yields could drop as much as 17 percent by 2099, compared with the average yield from 1981 to 2010, under the more extreme climate change projections, researchers report October 15 in Nature Plants.
That decline “could lead to, on average, a doubling of price in some countries,” says coauthor Steven Davis, an Earth systems scientist at University of California, Irvine. Global beer consumption would also drop by an average of 16 percent, roughly the amount people in the United States drank in 2011.

The results are based on computer simulations projecting climate conditions, plant responses and global market reactions up to the year 2099. Under the mildest climate change predictions, world average barley yields would still go down by at least 3 percent, and average prices would increase about 15 percent, the study says.

Other crops such as maize, wheat, soy and wine grapes are also threatened by the global rising of average atmospheric temperatures as well as by pests emboldened by erratic weather (SN: 2/8/14, p. 3). But there’s still hope for ale aficionados. The study did not account for technological innovations or genetic tweaks that could spare the crop, Davis says.

310-million-year-old fossil blobs might not be jellyfish after all

What do you get when you flip a fossilized “jellyfish” upside down? The answer, it turns out, might be an anemone.

Fossil blobs once thought to be ancient jellyfish were actually a type of burrowing sea anemone, scientists propose March 8 in Papers in Palaeontology.

From a certain angle, the fossils’ features include what appears to be a smooth bell shape, perhaps with tentacles hanging beneath — like a jellyfish. And for more than 50 years, that’s what many scientists thought the animals were.
But for paleontologist Roy Plotnick, something about the fossils’ supposed identity seemed fishy. “It’s always kind of bothered me,” says Plotnick, of the University of Illinois Chicago. Previous scientists had interpreted one fossil feature as a curtain that hung around the jellies’ tentacles. But that didn’t make much sense, Plotnick says. “No jellyfish has that,” he says. “How would it swim?”

One day, looking over specimens at the Field Museum in Chicago, something in Plotnick’s mind clicked. What if the bell belonged on the bottom, not the top? He turned to a colleague and said, “I think this is an anemone.”

Rotated 180 degrees, Plotnick realized, the fossils’ shape — which looks kind of like an elongated pineapple with a stumpy crown — resembles some modern anemones. “It was one of those aha moments,” he says. The “jellyfish” bell might be the anemone’s lower body. And the purported tentacles? Perhaps the anemone’s upper section, a tough, textured barrel protruding from the seafloor.

Plotnick and his colleagues examined thousands of the fossilized animals, dubbed Essexella asherae, unearthing more clues. Bands running through the fossils match the shape of some modern anemones’ musculature. And some specimens’ pointy protrusions resemble an anemone’s contracted tentacles.
“It’s totally possible that these are anemones,” says Estefanía Rodríguez, an anemone expert at the American Museum of Natural History in New York City who was not involved with the work. The shape of the fossils, the comparison with modern-day anemones — it all lines up, she says, though it’s not easy to know for sure.

Paleontologist Thomas Clements agrees. Specimens like Essexella “are some of the most notoriously difficult fossils to identify,” he says. “Jellyfish and anemones are like bags of water. There’s hardly any tissue to them,” meaning there’s little left to fossilize.
Still, it’s plausible that the blobs are indeed fossilized anemones, says Clements, of Friedrich-Alexander-Universität Erlangen-Nürnberg in Germany. He was not part of the new study but has spent several field seasons at Mazon Creek, the Illinois site where Essexella lived some 310 million years ago. Back then, the area was near the shoreline, Clements says, with nearby rivers dumping sediment into the environment – just the kind of place ancient burrowing anemones may have once called home.

The mystery of Christiaan Huygens’ flawed telescopes may have been solved

17th century scientist Christiaan Huygens set his sights on faraway Saturn, but he may have been nearsighted.

Huygens is known, in part, for discovering Saturn’s largest moon, Titan, and deducing the shape of the planet’s rings. But by some accounts, the Dutch scientist’s telescopes produced fuzzier views than others of the time despite having well-crafted lenses.

That may be because Huygens needed glasses, astronomer Alexander Pietrow proposes March 1 in Notes and Records: the Royal Society Journal of the History of Science.
To make his telescopes, Huygens combined two lenses, an objective and an eyepiece, positioned at either end of the telescope. Huygens experimented with different lenses to find combinations that, to his eye, created a sharp image, eventually creating a table to keep track of which combinations to use to obtain a given magnification. But when compared with modern-day knowledge of optics, Huygens’ calculations were a bit off, says Pietrow, of the Leibniz Institute for Astrophysics Potsdam in Germany.

One possible explanation: Huygens selected lenses based on his flawed vision. Historical records indicate that Huygens’ father was nearsighted, so it wouldn’t be surprising if Christiaan Huygens also suffered from the often-hereditary affliction.

Assuming that’s the reason for the mismatch, Pietrow calculates that Huygens had 20/70 vision: What someone with normal vision could read from 70 feet away, Huygens could read only from 20 feet. If so, that could be why Huygens’ telescopes never quite reached their potential.
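Snellen notation like 20/70 is just a ratio of viewing distances, so the comparison above can be expressed as a one-line calculation (a hypothetical illustration; the function name is not from Pietrow's paper):

```python
# Snellen notation as a ratio: 20/70 vision means detail a normal eye
# resolves at 70 feet, Huygens could resolve only at 20 feet.
def decimal_acuity(test_distance_ft, normal_distance_ft):
    """Decimal visual acuity from a Snellen fraction (20/20 -> 1.0)."""
    return test_distance_ft / normal_distance_ft

# 20/70 corresponds to a decimal acuity of about 0.29, i.e. less than a
# third of normal sharpness at the eyepiece.
print(round(decimal_acuity(20, 70), 2))  # → 0.29
```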

Bizarre metals may help unlock mysteries of how Earth’s magnetic field forms

Weird materials called Weyl metals might reveal the secrets of how Earth gets its magnetic field.

The substances could generate a dynamo effect, the process by which a swirling, electrically conductive material creates a magnetic field, a team of scientists reports in the Oct. 26 Physical Review Letters.

Dynamos are common in the universe, producing the magnetic fields of the Earth, the sun and other stars and galaxies. But scientists still don’t fully understand the details of how dynamos create magnetic fields. And, unfortunately, making a dynamo in the lab is no easy task, requiring researchers to rapidly spin giant tanks of a liquefied metal, such as sodium (SN: 5/18/13, p. 26).
First discovered in 2015, Weyl metals are topological materials, meaning that their behavior is governed by a branch of mathematics called topology, the study of shapes like doughnuts and knots (SN: 8/22/15, p. 11). Electrons in Weyl metals move around in bizarre ways, behaving as if they are massless.

Within these materials, the researchers discovered, electrons are subject to the same set of equations that describes the behavior of liquids known to form dynamos, such as molten iron in the Earth’s outer core. The team’s calculations suggest that, under the right conditions, it should be possible to make a dynamo from solid Weyl metals.

It might be easier to create such dynamos in the lab, as they don’t require large quantities of swirling liquid metals. Instead, the electrons in a small chunk of Weyl metal could flow like a fluid, taking the place of the liquid metal.
The result is still theoretical. But if the idea works, scientists may be able to use Weyl metals to reproduce the conditions that exist within the Earth, and better understand how its magnetic field forms.

50 years ago, atomic testing created otter refugees

Sea otters restocked in old home

When the [Atomic Energy Commission] first cast its eye on the island of Amchitka as a possible site for the testing of underground nuclear explosions, howls of anguish went up; the island is part of the Aleutians National Wildlife Refuge, created to preserve the colonies of nesting birds and some 2,500 sea otters that live there…— Science News, November 9, 1968

Update
The commission said underground nuclear testing would not harm the otters, but the fears of conservationists were well-founded: A test in 1971 killed more than 900 otters on the Aleutian island.
Some otters remained around Amchitka, but 602 otters were relocated in 1965–1972 to Oregon, southeast Alaska, Washington and British Columbia — areas where hunting had wiped them out. All but the Oregon population thrived, and today more than 25,000 otters live near the coastal shores where once they were extinct.

“They were sitting on the precipice,” says James Bodkin, who is a coastal ecologist at the U.S. Geological Survey. “It’s been a great conservation story.”

Skull damage suggests Neandertals led no more violent lives than humans

Neandertals are shaking off their reputation as head bangers.

Our close evolutionary cousins experienced plenty of head injuries, but no more so than late Stone Age humans did, a study suggests. Rates of fractures and other bone damage in a large sample of Neandertal and ancient Homo sapiens skulls roughly match rates previously reported for human foragers and farmers who have lived within the past 10,000 years, concludes a team led by paleoanthropologist Katerina Harvati of the University of Tübingen in Germany.
Males suffered the bulk of harmful head knocks, whether they were Neandertals or ancient humans, the scientists report online November 14 in Nature.

“Our results suggest that Neandertal lifestyles were not more dangerous than those of early modern Europeans,” Harvati says.

Until recently, researchers depicted Neandertals, who inhabited Europe and Asia between around 400,000 and 40,000 years ago, as especially prone to head injuries. Serious damage to small numbers of Neandertal skulls fueled a view that these hominids led dangerous lives. Proposed causes of Neandertal noggin wounds have included fighting, attacks by cave bears and other carnivores, and close-range hunting of large prey animals.

Paleoanthropologist Erik Trinkaus of Washington University in St. Louis coauthored an influential 1995 paper arguing that Neandertals incurred an unusually large number of head and upper-body injuries. Trinkaus recanted that conclusion in 2012, though. All sorts of causes, including accidents and fossilization, could have resulted in Neandertal skull damage observed in relatively small fossil samples, he contended (SN: 5/27/17, p. 13).
Harvati’s study further undercuts the argument that Neandertals engaged in a lot of violent behavior, Trinkaus says.

Still, the idea that Neandertals frequently got their heads bonked during crude, close-up attacks on prey has persisted, says paleoanthropologist David Frayer of the University of Kansas in Lawrence. The new report highlights the harsh reality that, for Neandertals and ancient humans alike, “head trauma, no matter the level of technological or social complexity, or population density, was common.”

Harvati’s group analyzed data for 114 Neandertal skulls and 90 H. sapiens skulls. All of these fossils were found in Eurasia and date to between around 80,000 and 20,000 years ago. One or more head injuries appeared in nine Neandertals and 12 ancient humans. After statistically accounting for individuals’ sex, age at death, geographic locations and state of bone preservation, the investigators estimated comparable levels of skull damage in the two species. Statistical models run by the team indicate that skull injuries affected an average of 4 percent to 33 percent of Neandertals, and 2 percent to 34 percent of ancient humans.

Estimated prevalence ranges that large likely reflect factors that varied from one locality to another, such as resource availability and hunting conditions, the researchers say.
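The raw counts alone hint at why the uncertainty is so wide. As a rough illustration only (not the team's actual model, which also adjusted for sex, age at death, location and bone preservation), a Wilson score interval on the simple injury proportions already spans a broad range:

```python
from math import sqrt

def wilson_ci(hits, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = hits / n
    denom = 1 + z**2 / n
    center = p + z**2 / (2 * n)
    margin = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return ((center - margin) / denom, (center + margin) / denom)

# Skulls with at least one injury, using the study's raw counts
print(wilson_ci(9, 114))   # Neandertals: 9 injured of 114 skulls
print(wilson_ci(12, 90))   # H. sapiens: 12 injured of 90 skulls
```

Even this simplified calculation yields intervals several times wider than the point estimates, and the published model's covariate adjustments widen them further.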

A greater share of the Neandertals with head wounds were under age 30 than was the case among their human counterparts. Neandertals may have suffered more head injuries early in life, the researchers say. It’s also possible that Neandertals died more often from head injuries than Stone Age humans did.

Researchers have yet to establish whether Neandertals experienced especially high levels of damage to body parts other than the head, writes paleoanthropologist Marta Mirazón Lahr of the University of Cambridge in a commentary in Nature accompanying the new study.

50 years ago, researchers discovered a leak in Earth’s oceans

Oceans may be shrinking — Science News, March 10, 1973

The oceans of the world may be gradually shrinking, leaking slowly away into the Earth’s mantle…. Although the oceans are constantly being slowly augmented by water carried up from Earth’s interior by volcanic activity … some process such as sea-floor spreading seems to be letting the water seep away more rapidly than it is replaced.

Update
Scientists traced the ocean’s leak to subduction zones, areas where tectonic plates collide and the heavier of the two sinks into the mantle. It’s still unclear how much water has cycled between the deep ocean and mantle through the ages. A 2019 analysis suggests that sea levels have dropped by as much as 130 meters over the last 230 million years, in part due to Pangea’s breakup creating new subduction zones. Meanwhile, molten rock that bubbles up from the mantle as continents drift apart may “rain” water back into the ocean, scientists reported in 2022. But since Earth’s mantle can hold more water as it cools (SN: 6/13/14), the oceans’ mass might shrink by 20 percent every billion years.

Martian soil may have all the nutrients rice needs

THE WOODLANDS, TEXAS — Martian dirt may have all the necessary nutrients for growing rice, one of humankind’s most important foods, planetary scientist Abhilash Ramachandran reported March 13 at the Lunar and Planetary Science Conference. However, the plant may need a bit of help to survive amid perchlorate, a chemical that can be toxic to plants and has been detected on Mars’ surface (SN: 11/18/20).

“We want to send humans to Mars … but we cannot take everything there. It’s going to be expensive,” says Ramachandran, of the University of Arkansas in Fayetteville. Growing rice there would be ideal, because it’s easy to prepare, he says. “You just peel off the husk and start boiling.”

Ramachandran and his colleagues grew rice plants in a Martian soil simulant made of Mojave Desert basalt. They also grew rice in pure potting mix as well as several mixtures of the potting mix and soil simulant. All pots were watered once or twice a day.

Rice plants did grow in the synthetic Mars dirt, the team found. However, the plants developed slighter shoots and wispier roots than the plants that sprouted from the potting mix and hybrid soils. Even replacing just 25 percent of the simulant with potting mix helped heaps, they found.

The researchers also tried growing rice in soil with added perchlorate. They sourced one wild-type rice variety and two cultivars with a genetic mutation — modified for resilience against environmental stressors like drought — and grew them in Mars-like dirt with and without perchlorate (SN: 9/24/21).

No rice plants grew amid a concentration of 3 grams of perchlorate per kilogram of soil. But when the concentration was just 1 gram per kilogram, one of the mutant lines grew both a shoot and a root, while the wild-type variety managed to grow only a root.

The findings suggest that by tinkering with the successful mutant’s modified gene, SnRK1a, humans might eventually be able to develop a rice cultivar suitable for Mars.

This huge plant eater thrived in the age of dinosaurs — but wasn’t one of them

A new species of hulking ancient herbivore would have dwarfed its relatives.

Fossils found in Poland belong to a new species that roamed during the Late Triassic, a period some 237 million to 201 million years ago, researchers report November 22 in Science. But unlike most of the enormous animals that lived during that time period, this new creature isn’t a dinosaur — it’s a dicynodont.

Dicynodonts are a group of ancient four-legged animals that are related to mammals’ ancestors. They’re a diverse group, but the new species is far larger than any other dicynodont found to date. The elephant-sized creature was more than 4.5 meters long and probably weighed about nine tons, the researchers estimate. Related animals didn’t become that big again until the Eocene, 150 million years later.

“We think it’s one of the most unexpected fossil discoveries from the Triassic of Europe,” says study coauthor Grzegorz Niedzwiedzki, a paleontologist at Uppsala University in Sweden. “Who would have ever thought that there is a fossil record of such a giant, elephant-sized mammal cousin in this part of the world?” He and his team first described some of the bones in 2008; now they’ve made the new species — Lisowicia bojani — official.

The creature had upright forelimbs like today’s rhinoceroses and hippos, instead of the splayed front limbs seen on other Triassic dicynodonts, which were similar to the forelimbs of present-day lizards. That posture would have helped it support its massive body weight.