I don't see how that's surprising, at least if you subscribe to the Endurance Running Hypothesis (and I do). Any prey humans hunted using what we now know as persistence hunting would have been large game. Especially in the hot, tropical climates early humans seem to have thrived in, such meat would have started to spoil, even if hunted fresh, before any such hunter had time to bring it back to the family group, let alone consume it. Meats in general have a very short shelf life, and if we look at the various independently developed preservation methods in human groups around the world, many of them seem to be centered on controlled decomposition by "favourable" organisms (like the maggots in the seal story).
There's also the simple observation that no doubt many of us have already made: animals all around us seem to have no issues with partially decomposed meat. Dogs don't seem to mind it, nor do cats, nor do other primates like chimpanzees. It's obvious that at least in our very recent evolutionary history we had far higher tolerance to spoilage.
Why subscribe to the endurance running hypothesis, though? There are no contemporary examples of humans hunting that way, and hominids were already making tools by the time they had come to subsist on meat. The endurance/persistence hypothesis doesn't even make sense without tools like spears, knives, or arrows, because humans don't kill their prey by biting into them. If they had access to primitive weaponry, why would they waste energy running when they could instead ambush? Even other species like chimps and dolphins, which can't chuck spears, hunt by surrounding their prey. The only reason the hypothesis is even a thing is that people find it confusing that we sweat so much.
If anything, the persistence running hypothesis in some way discredits the idea that hominids relied on rotten meat. If they could catch fresh prey, why eat spoiled meat? Yet our stomach acid has a very low pH, more acidic than even that of cats.
What's more likely, in my opinion, is that hominids began to supplement their diet by eating the leftover kills of larger predators, as well as any rotting carcasses they encountered, and then developed hunting strategies once they began developing tools. Even sharp knives made of stone or volcanic glass would have been an obvious advantage in food procurement, and it wouldn't have taken millions of years to figure out that slashing an animal's throat in the right place would take it down quickly. No need for persistence running at all if a group can corner a mammoth or buffalo or lead it to a dead end.
> If they had access to primitive weaponry, why would they waste energy running when they could instead ambush?
Have you ever fought a dog, a deer or a boar? Fighting means wounds, possibly fatal ones. Spear or no spear, you're going to die in one of those fights. Running, on the other hand, is pretty safe and not that tiring, especially if you've been doing it for years and your weight is close to 50-60kg. The ability to sweat more would be even more advantageous in a warm climate. I sometimes run in 30°C for an hour or an hour and a half. It's hard, but you get used to it. I imagine most animals would just drop by that time.
Would you want to chase animals you wouldn't want to fight? What if they turned around and tried to fight you? You have less tactical advantage than when hunting or ambushing them with stealth, and if you turn to run, maybe you get away, but then you've burned up a lot of your extra stamina. Give me a bow and arrow or spears so I can wound the target before or while chasing it, versus just popping out and hoping it runs away from me instead of at me.
Deer and such will not fight unless there are no other options and they have the necessary energy and verve. Apart from immediate physical entrapment, they do not realize that they likely have no options in the long run (pun fully intended). So they run and keep running until they fall into the trough of energy exhaustion.
The first weapons would have been far more basic and required closer proximity than a bow and arrow or a throwing spear. So, to your point, prudent early humans would have targeted only the weak, sickly or fatigued. A hot and extended hunt would inevitably result in fatigue, so early humans would have learned the tactic naturally.
This is an unusually poorly written Wikipedia article with a very weird style and a lot of unsourced statements. I would be very hesitant to trust it without extensively reading the sources.
In her book "The Old Way", Elizabeth Marshall Thomas describes the Kalahari San she lived with in the 1950s. Despite having bows and arrows, some members of that group would persistence hunt.
Dmitriy Lykov (1940-1981) of the Lykov family, living an isolated life in Siberia, developed a method of persistence hunting, despite, or perhaps because of, the absence of any sort of hunting culture in his tiny, isolated group.
Those are just two very recent examples off the top of my head of people persistence hunting for survival. Maybe you're out of your depth on this subject.
In addition to the direct persistence hunting, also note that most bow and arrow hunting or even spear hunting is, in and of itself, a form of persistence hunting. The expectation that an arrow will take down the animal where it is hit is usually not how it would have worked with more primitive bows except with very small game. The animal would still have to be followed and possibly struck more times to actually bring it down. Bull fighting is a modern extension of this, though in an enclosed space.
>If they had access to primitive weaponry, why would they waste energy running when they could instead ambush?
This is the main point why I don't believe in the endurance running hypothesis. Humans are lazy. Why would they opt to run for days over just waiting at the local watering hole for prey?
I don't think you realize how easy it is if you know how to track: jog for 2-3 hours for antelope, even less for cats. All you need is a club. 'Running' isn't necessary.
Plenty of humans “run for days” for pure enjoyment. Not to say the ancients would be so foolish when energy expenditure is a concern. But run for days, for a purpose, with companions, releasing endorphins throughout, and then topping that off with a nice, rewarding meal full of fat and protein? I don’t see why not.
It’s not easy to stalk animals built to spot predators — especially in the terrain where I’ve seen modern persistence hunters.
Many watering hole types of animals also don’t just fall right over after being struck, particularly with a “rudimentary” weapon. Even if you get a decent shot off, you will still most likely be chasing it for quite a while. Not to mention the dangers of what’s hanging around, looking to take what you (hopefully) managed to score.
This somehow always comes up when anybody mentions human sweating or how good we are at walking long distances. But whoever thinks it's feasible to hunt any animal worth its calories just by jogging behind it has never chased a playful dog. A furry mammal with no sweat glands has way more endurance than any fit man, and let's not talk about top speed. You'd never catch it.
"But when it comes to long distances, humans can outrun almost any animal. Because we cool by sweating rather than panting, we can stay cool at speeds and distances that would overheat other animals. On a hot day, the two scientists wrote, a human could even outrun a horse in a 26.2-mile marathon."
And we know because there are a number of "human versus horse" races over long distances that are occasionally won by humans. Generally, the longer the race, the more often humans win.
What other animals? I used the dog example because it fits perfectly here: they don't sweat, yet it's impossible to catch up to them if they are running away. Even if they overheat, they don't just drop dead; they slow down to a human's walking pace.
And what about running animals that do sweat, like horses? Makes no sense. What makes more sense is to approach from multiple fronts and spear/arrow it down. Plus, all remaining hunter-gatherers hunt with the help of dogs. Sure, you do have to run, a lot. But the endurance side is the least important.
You seem very confident that a human can't catch a dog on a hot day. I hate to be that person, but do you have a source for that?
I've gone running with plenty of different dogs in temperatures of above 30 degrees over the years. Smaller or stockier dogs (like a staffy) fade very fast. Within a couple of kms. Working breeds (retrievers/poodles/kelpies/collies) will leave you in their dust but this is why the endurance hunting hypothesis relies on efficient tracking while running. If they're running on the lead, especially when it's a bit hotter, they seem to lose it after about 10kms. On a cooler day these same dogs would leave me dead.
Of course I'm not talking about pugs. I mean dogs, average dogs, as a model for the wild animals our hunter-gatherer ancestors might have hunted a long time ago.
Again I'm not saying it's "literally" impossible. Just not feasible. Actually, chasing an animal worth hunting to exhaustion sounds more like a feat of strength someone would do as a rite of passage. Something you could do once in a lifetime, rather than a reliable source of food.
As far as I'm aware, every land animal on Earth. (In hot weather. It's very different in the cold. Might also be different if it's humid enough that sweating stops working.)
> I used the dog example because it fits perfectly here: they don't sweat yet it's impossible to catch up to them if they are running away. Even if they overheat they don't just drop dead, they slow down to a human's walking pace.
For an hour or two, sure. Now try doing that for twelve solid hours, on a hot day in the open. The dog will collapse from heatstroke long before a fit, practiced human has to bow out.
Dogs also have unusually high endurance actually, some breeds extremely so like sled dogs (up there in the top three with humans and horses). In general wolves are already pretty high up there endurance wise, and dogs probably had an evolutionary incentive to be able to keep up with us better on top of that.
On short distances all dogs outpace humans. After an hour of running most dogs will just quit. On several occasions I had a very hard time convincing various dogs to get off the ground after an 8-10km run.
My point is: can a human reasonably stay within viewing distance of a dog that is running away from him? The dog can just run at top speed, gain some distance, take a little rest, and maintain that distance at a comfortable pace.
I used to believe in the endurance hunting thing until, playing with my dog, I realized it didn't make sense. They DO get tired, but they take a 1-minute break and it's like they're back at square one.
For the record, I'm in shape and with a pretty light frame (181cm/70kg) and go for 10km/1 hour jogs regularly.
It's only fair to compare an exceptional individual to an exceptional breed. According to Google, a Husky can run 240km in a day. An exceptional Husky might do much better. An average Husky will easily outrun most human runners.
Still, this is an exception to the rule - overall humans outperform almost all animals when it comes to endurance running.
On a steppe? I think so, I don't see why not. An animal running at or close to top speed might only speed up its exhaustion - it's much easier to run 1km at a steady pace than to do 10x100m intervals.
Try to play with your dog for an hour at high intensity. If your dog belongs to a large or very large breed it'll get tired much more quickly than you - assuming you're in somewhat good shape.
I'm not advocating in favor of endurance hunting, I'm only stating that we can out-endure some large-ish animals.
I think you're misunderstanding how endurance hunting works. There is no land-dweller on Earth that can outpace a fit, prepared human in a hot, dry environment indefinitely. Not even a horse; Google it. We're slow in the sprint, but we're monsters of endurance.
(By "fit" I mean "makes a living this way," not "jogs and hits the gym regularly.")
Plus, now you're exhausted and you've chased it for how long? If you've run for the last 18 hours exhausting the gazelle, home is now an 18 hour jog back the other way, and you won't be jogging, you'll be lugging a gazelle with you and you're already utterly exhausted.
Is a gazelle able to jog for 18 hours on a hot day? Or even for 2 hours? Keep in mind that the predator chooses the prey. Humans didn't have to chase gazelles; they could choose something bigger and slower.
Anyone who's tried to go for a significant walk or run with a dog on a hot day can attest that they don't handle the distances and heat humans can. My brother once went on a 24km run in summer with our Labrador x Kelpie and had to stop 5km in and slowly walk back, because the otherwise very fit dog couldn't cope with the heat.
I have no idea whether that's an adaptation to endurance hunting or just to normal ambush hunting over long distances in the hot and humid African wet season, but it's pretty undeniable that dogs don't have more long-distance endurance than a fit man on a hot day.
> Finally, while it is true that the low pH of our stomach acid suggests that our ancestors were not relying on rotten meat as a primary food source, it is important to remember that scavenging for meat was likely an important part of their diet. It is also possible that the ability to run long distances gave our ancestors an advantage in scavenging for meat, as they could cover greater distances in search of carrion.
The point I was making was that our ancestors were relying on scavenged, and probably spoiled meat.
Yes. I know it is rude (is it impolite to use GPT?), but I've been playing with GPT and thought I'd give it a shot to see if anyone noticed, or if HN had any detection.
I really don't think whether the prey is big-game or not matters with respect to "spoiled" meat, however, contra to what some others here are saying, persistence hunting is (or was) a real thing. This paper [1] describes ethnographic examples of persistence hunting; heck, my grandfather described to me running down rabbits as a way of hunting them.
However, I would also argue that a more common approach, at least when there are multiple hunters working together, is to "herd" animals into kill zones such as pit traps, or channels. We have lots of evidence of those in the archeological record, such as here [2] and here [3].
I actually suspect that bacterial "preprocessing" of meat would assist in digestion. Rather famously, it is hard to get everything you need in a raw food diet, and a well known argument in anthropology is that cooking with fire is our way to make food easier to digest (rather than approach the problem the way, say, cows do). Fermentation can also improve the digestibility of foods (increasing bioavailability of calories, nutrients, etc.).
That said, all the pets I've had throw up way more than I do. I think we're ignoring the very real possibility that ancient people were constantly living with some level of gastrointestinal problems.
> I think we're ignoring the very real possibility that ancient people were constantly living with some level of gastrointestinal problems.
We do know that humanity, from its earliest days through the Roman Empire [1] and up to maybe a century or even less ago, has had massive infestations of all kinds of parasites, including gastrointestinal ones.
Would it be possible that these parasites actually had a symbiotic relationship with us, helping to break down spoiled food?
Pets have diet sensitivities too, which can upset their stomachs. My cat used to occasionally throw up but with her current food she has gone for more than a year without any events.
By eating spoiled and contaminated food, ancient people were certainly getting internal parasites too (also something that a big fraction of pets have).
I mean many modern people also live with constant gastrointestinal problems, and I'm not talking about teens eating taco bell and mountain dew every day and wondering why they have bathroom troubles, I'm talking about average people eating a varied and healthy diet with IBS.
You're not wrong, and that's an excellent example. I'm questioning the idea that somehow people were able to consume this food impact-free. I struggle with the common narrative that people were healthier and better off in some dark forgotten past and that we should return to those times for our own well-being. There often seem to be promises that conditions like IBS would stop existing if we did that. But maybe ancient man was cool with having near-constantly runny poos because there was no alternative.
> I struggle with the common narrative that somehow people were healthier and better in some dark forgotten past and we should return to those times for our own well being
I think there's a slightly saner way to put this viewpoint, namely that:
- Humans evolved in a very different environment from which we live now
- Our degree of genetic adaptation to modern living is relatively small, because there's been a blink of an eye for it to happen in
- Even if something newer has a sane rationale for being "healthier" or "better" despite us not having genetically adapted to it, it may not be so: most of our handwaving rationale about complex systems ends up being wrong after careful trials and measurement, and for the rest we may be missing a lot of second-order and third-order effects.
Not that I'm advocating for eating rotten meat. I'm just saying that the system is complicated and has a bunch of set-points learned from very different conditions.
I agree that the system is complicated. But it's hard to make the argument that we were genetically well adapted, or that said adaptation didn't come with sub-optimal tradeoffs for everything we suffered through in the Paleolithic as well. Evolution is very good at building "good enough" systems for an organism to survive in whatever environment it's in, based upon what energy it has available.
I too don't like rejecting things outright, but knowing that, I also don't accept things outright either. We have no reason to believe that eating rotten meat wasn't a choice made out of necessity rather than out of some sort of optimal health.
> We have no reason to believe that eating rotten meat wasn't a choice made out of necessity rather than out of some sort of optimal health.
First, basically every choice could be said to be made out of some kind of necessity.
There's a big middle ground even from this. E.g. "eating rotten meat caused a lot of harms, but it also happened to fulfill a couple of somewhat important functions, and thus there was no evolutionary pressure to meet these requirements in any other way"
Thus, even if eating rotten meat today would be a net negative for many, many reasons (and I strongly suspect this is the case), it makes a lot of sense to think about and study what possible benefits it could have had besides merely being available.
Animals also don't have the social stigma of vomiting that humans do. Sometimes they'll vomit for basically no reason at all, because there's a low "cost" to doing it.
One time my dog and cat both ate some human food which was perfectly fine for them, but the cat didn't like the flavor. So the cat vomited. The dog looked at the cat, looked at the food, looked at the cat, and then vomited also.
Dogs have much shorter GI tracts compared to humans. Just that fact means there's less opportunity for bacteria/toxins from rotted food to affect them.
> It's obvious that at least in our very recent evolutionary history we had far higher tolerance to spoilage.
A few hours after reading this article, I realized that we're learning that exposure to peanut proteins at a certain age prevents peanut allergy.
I wonder if eating rotten meat during formative years similarly confers resistance to the ill effects of such meats. Or more generally, develops our immune response enough to where it's not a problem. (Akin to people in developing nations not having a problem with the tap water)
> It's obvious that at least in our very recent evolutionary history we had far higher tolerance to spoilage.
I wonder what effect that's had on the human immune system. I'd imagine eating spoiled meat containing bacteria, etc. was giving their immune systems a run for their money until a tolerance was built.
This is why you can’t render rotten meat safe by simply cooking it. That said, people still age meat. Small birds, for example, will get left for 4-9 days in a cool space before being butchered. https://magazine.outdoornebraska.gov/2014/01/hanging-pheasan...
Yep, other than a couple of notable exceptions like salmonella, this is the norm for food poisoning. It's intoxication from metabolic byproducts, not infection. It remains one of the biggest lay misunderstandings of food safety, though.
"Just cook it out" plus "your nose can tell" (it can't, spoilage microbes aren't usually illness microbes) are false beliefs that still get people killed.
Curious about sources for this. I see some indication that botulism toxin can be destroyed by cooking, though the bar is pretty high - it needs to be boiled for over 5 minutes. So lower-temperature methods such as grilling the meat may not be sufficient, but something like stewing over many hours seems like it would work fine.
Here's an example with Staph food poisoning[1]. The bacteria dies when you cook it, but the enterotoxins that Staph generates aren't broken down by heat and remain (section 4). It's good to not consume live Staph, but contaminated food can still cause issues despite cooking.
I'm not gonna have a peer reviewed journal article for you or anything. I learned this in the food safety class you have to go through before managing a kitchen in a lot of jurisdictions.
> stewing over many hours seems like would work fine.
Well, I get that that would make sense in that kind of environment. The best way to not hurt your customers is to never even risk it - why would you?
But my agenda is more selfish. I like to cold-smoke meats for adding to stews, based on the knowledge that stewing in >85°C [1] water will destroy the botulism toxin. I could just stay away from cold-smoked meats to be safe; but then I would lose out on some of the tastiest food ever :-)
It can be a confusing topic because there are 3 different things to worry about: the active bacteria, the spores, and the toxin. Of the 3, the spores are the most resistant to heat, and according to the WHO link [1] can survive multiple hours in boiling water. However the toxin seems relatively unstable.
1) Is this what was referred to in the movie "The Menu" where they were taking the tour in the beginning and talking about the 152 day aging process for meat, but the 153rd day would bring chaos?
2) Why is there such a precise known number for that issue? Don't bacteria grow differently? It just seems odd that 152 days would definitely be ok and 153 days would definitely kill.
There does seem to be this notion that being "too hygienic" is a negative thing because it doesn't train your immune system.
That's only kind of true. Exposing a healthy person to small amounts of weakened virus tends to build immunity. Exposing the healthy person to large amounts of the virus overwhelms the immune system and may not even lead to a strong viral immunity.
If I had to guess, eating rotten meat is something that indigenous peoples have adapted to over time (and some individuals died along the way to that adaptation) and they are resistant to the specific things in rotten meat that make modern day humans sick. I think it would be wrong to suggest the rotten meat eaters are somehow better off for doing this, or that they have stronger immune systems in general.
Immune systems aren't like a muscle that you can repeatedly train to get stronger and stronger and stronger ad infinitum.
Immunity also comes at a cost. This isn't talked about often. But it's not "Free" for your body to learn and maintain immunity to specific infections.
Especially in recent years, the hygiene hypothesis is not about a lack of immune training against harmful microbes, but about a lack of encounters with so-called "old friends": harmless microbes.
I'm picturing a guy hauling a roadkill deer in, tossing it on a table and yelling "meat's up!" I mean, sure, they didn't have cars back then, but it's better than letting trash-eating deer meat go to waste.
I can see how easy it would be, for me at least, to assume they had better immune systems from eating the spoiled meat. If true, our modern immune systems are most likely way less effective than theirs were at that time.
I've known dogs that have lived on very varied diets (including most human food sans poisons like grapes), and other dogs that had very strictly controlled diets because they were sensitive to pretty much anything and everything.
Many breeds of modern dogs are very far removed from the lifestyle and habits of their wild brethren.
I would disagree that meat had a short shelf life. It is almost trivial to dry meat out and make jerky that will last a very long time. Doesn't even require advanced tools.
That being said, I still agree with the premise. When you're truly hungry, food is food.
There may be no advantage at all to not being able to tolerate it. It may simply be that there is no longer any advantage to being able to tolerate it, and thus no longer any pressure to select for the trait.
Igunaq is fermented (aged) walrus or seal meat that has been cached away for future use. Meat is usually cached beneath stones or pebbles. Aged walrus meat is extremely high in protein, iron and vitamins. Igunaq has been traditional medicine to keep the digestive system clean, as it flushes away anything in its way. It is also great eating for those who have acquired the taste and can go beyond the smell.
Too fermented, igunaq can be poisonous and can kill people. People have died from eating over-aged meat from walrus and polar bear. These two mammals are very rich in vitamins. The fat is often light green colored when the meat is aged properly. The fat will be darker green if the meat is over-aged or even brownish. Among Inuit it is a delicacy usually eaten with apples. To prepare the meat for eating people find that washing in cold water is better than washing with hot water. Cold water takes away the smell more.
Igloolik and Hall Beach are known to have the best igunaq in Baffin Island. These two communities are blessed with walrus and proper gravel. Meat ages better when it is fermented in loose gravel. It takes time to make good igunaq. One has to store away the meat at the right season when it is not too hot or too cold. Temperature plays a big role.
Cached meat is usually saved for the winter for people to eat, but polar bears are known to steal the cache before people can claim it. Regardless of the weight of the stones used for caching, a polar bear will easily get at it. Polar bears are extremely strong animals.
A common phrase: "I wonder how many polar bears I have fed to date?" Meaning: hoping that the cache has not been eaten by polar bears yet.
As noted, igunaq is good for the digestive system as it cleans it completely of any foreign objects such as viruses and sickness a person may have. A person may experience a natural "high" if they have not eaten aged meat for a while. Men who grew up with igunaq are usually more physically muscular than those who have not. Igloolik, Hall Beach and Cape Dorset have muscular-looking men compared to other communities on Baffin Island. It is believed that igunaq contributes to the physical appearance of the people who eat it.
Igunaq is such a delicacy that people who have no access to it will fly it in from communities that do have good igunaq. Igunaq is often brought in on special occasions such as Christmas for community feasts. Some communities look forward to Inuit organizations having meetings in their communities...igunaq is sure to be part of the feast. Igunaq, when on sale, will sell better than fresh meat. Interestingly, due to its odor, some airlines in the north will not carry igunaq. People often have to disguise it to get the aged meat on a plane to take it home. It is said that if you can get beyond the smell, you'll enjoy the food as it is very nutritious and gives you energy and warmth...and you will be physically ready for your next outing. On a final note, a full stomach will also make you concentrate better. Try it! It's a true Northern experience!
Also Greenland Shark, which “has the longest known lifespan of all vertebrate species (estimated to be between 250 and 500 years)”:
The flesh of the Greenland shark is toxic because of the presence of high concentrations of trimethylamine oxide (TMAO). . . .
Traditionally, this is done by burying the meat in boreal ground for 6–8 weeks, which presses the TMAO out of the meat and also results in partial fermentation. The meat is then dug up and hung up in strips to dry for several more months.[46] It is considered a delicacy in Iceland.
There's also kiviaq:
"a dish made by packing 300 to 500 whole dovekies—beaks, feathers, and all—into the hollowed-out carcass of a seal, stitching it up and sealing it with fat, then burying it under rocks for a few months to ferment. Once it’s dug up and opened, people skin and eat the birds one at a time."
Here is one of the articles I found that wasn't too focused on how gross it is that these people eat this and instead digs into the history and purpose of the food.
https://www.atlasobscura.com/articles/what-is-kiviaq
Don't be! I think that's a completely normal response to something like this. But you aren't writing an article about it. And I think why and how it became a thing is far more interesting than just "Look at this gross food. And here is why it is gross."
>It's obvious that at least in our very recent evolutionary history we had far higher tolerance to spoilage.
I would agree, both in terms of actual immunity and taste response, although generally-speaking eating rotten meat (especially when well-sourced) is not nearly as risky as common wisdom would make it out to be. Many folks in the deeper circles of online carnivoria dabble in "high meat" (fermented rotten meat, a method of preparing meat learned from Inuit peoples who would feed it to their dogs) and swear by it, even if the taste can be unbearable to the uninitiated.
I suspect that a lot of our tolerance of this stuff in prehistory stems from a combination of:
- spices and seasoning were a lot less common, and meat was gamey, not the well-fattened meat we're used to. So this stuff already tasted bland and not amazing as a baseline
- it was preferable to starving, and hunger is the best spice of them all
I recall reading, a long time ago, of a person who was eating just meat and only meat for his diet. He was struggling for a while with it, until he intentionally allowed a small amount of meat to rot for a bit, consumed that, and from then on he did much better on that diet. The principle I believe was in play was that the rotting meat contained a lot of bacteria that was very good at breaking down that meat (it would proliferate the fastest) and that bacteria was taking up residence in his gut microbiome and assisting in digesting non-rotten meat in the future. Sort of like how herbivores will sometimes consume small stones to act as mechanical grinders of plantstuff in their stomachs.
This seems like the most likely way that ancient humans consumed actually-rotten meat, in small doses to edit their gut microbiome (likely encoded in traditions that viewed certain specific rotted foods as a delicacy/luxury - not because it was hard to acquire but because you were only supposed to eat it rarely). To the extent that you see claims that “ancient humans ate large amounts of rotten meat” I suspect they’re conflating that with merely “spoiled by modern food standards” meat, which is much less rotten than it sounds.
I have seen this in many, though. Diet cures a lot of diseases of affluence, but once you start thinking it is a cure-all, you will start seeing enemies everywhere.
I stay far away from the raw vegan community in my town for that reason. I enjoy the food and recipes from time to time, but once COVID struck, a not insignificant chunk of them became intolerable to a boil-head like myself.
It made for diverse groups clumping together under one flag: far-right activists, back-to-stone-age climate fighters, raw vegans and general conspiracy theorists demonstrating together against public health policy.
A couple of nazi salutes, juice cleanses, coffee enemas and some 5G shielding cure covid, who would have thunk.
Here in South Africa we have a thing called biltong, which is basically thick slabs of meat that are cured and marinated for 24 hours or so, then hung up to dry for about a week, depending on the size/dryness that you want.
> To the extent that you see claims that “ancient humans ate large amounts of rotten meat” I suspect they’re conflating that with merely “spoiled by modern food standards” meat, which is much less rotten than it sounds.
The example from TFA suggests you are wrong about that -- at least judging by the rotten meat dripping with maggots:
"In one recorded incident from late-1800s Greenland, a well-intentioned hunter brought what he had claimed in advance was excellent food to a team led by American explorer Robert Peary. A stench filled the air as the hunter approached Peary’s vessel carrying a rotting seal dripping with maggots."
My gut refuses to believe it, but my brain understands that this is probably a lot like cheese -- blue cheese in particular. If you've only had fresh milk, letting it curdle and mold before eating it would probably be unthinkable. Same for fresh cabbage vs. kimchi or sauerkraut.
If you ever get a chance to go to a really nice steakhouse, and eat a 45-day or 60-day dry aged steak, do so. It’s one of the funkiest most delicious things I’ve ever eaten, like a combo of the best blue cheese and the best steak I’ve ever had.
The trick with a dry aged steak is that you don't eat the bits that were directly exposed to the air. The outer layer of the beef protects the inner meat from the bacteria and spoilage that make you sick; that layer is discarded, and you cook the inner part, which has had time to change but not to go bad.
I've eaten a lot of 55-day dry aged steak and it always upsets my stomach a little bit for a day or two. I can consistently count on diarrhea for a bit. Tastes great, though.
Too much Wagyu will do something similar, but it's not really the same -- in that case it's mostly just too much fat/oil. Even a small amount of 30-60 day dry-aged steak will do it to me, and it feels different.
I was unfortunate enough to eat some rotten meat at a restaurant in Tanzania when I was working there. The rot wasn't obvious -- the bad part was hidden under some potatoes, and I only discovered it as I ate.
Took me months to recover from it. Up to that point in my life, I was totally "regular" as they say. It was probably a year before I got back to normal in that respect.
Don't mess with your gut flora if things are going well, imo.
He's very much in agreement, saying that the human pattern was to hunt large game and then bring it home to eat over weeks, evolving a highly acidic stomach as a result to deal with the bacterial load of rotten meat. That pattern was interrupted by the growing scarcity of megafauna, perhaps caused by human hunters.
"Archaeological evidence does not overlook the fact that stone-age humans also consumed plants," adds Dr. Ben-Dor. "But according to the findings of this study plants only became a major component of the human diet toward the end of the era."
So why does rotten meat smell terrible and a steak delicious? Why would we evolve that perception of something so nutritious, something that could prevent me from starving? Why does my nose keep telling me to stay away from rotten meat?
IIUC, the digestive system (and the gut bacteria in it) provide feedback to your taste buds over time.
If the appropriate "paleo diet" bacteria settled in (e.g. using the gradual method to build tolerance), your nose would probably change its mind soon enough.
There is an interesting related point in that cruciferous vegetables and allium roots synthesize specific chemicals which have been correlated with healthy effects. Many of these chemicals have significant amounts of sulfur and tend to smell quite strong not necessarily in a pleasant way. Olfactory associations seem to be complex.
The book The Fat of the Land by Vilhjalmur Stefansson (https://en.wikipedia.org/wiki/Vilhjalmur_Stefansson) describes how the Eskimos he lived with ate rotten fish all the time. It was a delicacy for them.
Kind of like we eat "rotten/spoiled" blue cheese and dry-aged beef.
But yeah, it seems our digestive systems can adapt to it pretty well, given that we've been doing this since long before doctors were around. I suppose the stomach acid kills the bacteria in the rotten meat, unless your immune system is severely weakened.
Not just that: there are compounds toxic to "normal" humans (e.g. cadaverine) that one can adapt to by eating small amounts as a child.
Chukchi babies would get ever more rotten bones to suck on (as pacifiers), presumably for that purpose.
But studies conducted over the last few decades do indicate that putrefaction, the process of decay, offers many of cooking’s nutritional benefits with far less effort. Putrefaction predigests meat and fish, softening the flesh and chemically breaking down proteins and fats so they are more easily absorbed and converted to energy by the body.
That was my first thought seeing the headline. As long as it can be stomached, it makes sense that we'd favor rotten food as a kind of naturally "processed" food that makes it easy to get at the nutrients. Another way to look at it is that putrefaction is basically the same thing that happens once we eat it.
I'm not saying this is false, but it sure is setting off my BS detectors, so I'd be curious for other perspectives.
I was taught in school that Europeans traded and used spices specifically in order to mask the smell of rotting meat, and then learned later as an adult that was total BS. The story had seemed so tantalizing -- we were so primitive just a few centuries ago, thank goodness we're in the modern world! But nope -- rotten meat is just rotten meat and it's revolting period. Spices were expensive, and the people who could afford spices sure as heck weren't eating rotten meat.
And then I read this article and there are a lot of comments by explorers describing eating rotten meat -- but explorers were often trying to highlight just how primitive these "other" peoples are -- look at their disgusting practices, unlike civilized man!
And then the article also repeatedly conflates rotten with fermented, when these are quite different things with different effects. It also ignores the fact that something can be rotten on the outside, but once you cut those parts off, the inside is perfectly fine. That's the whole principle behind dry-aged beef, after all. And even in English, hákarl, which is fermented shark, is often incorrectly referred to as "rotten shark", presumably to increase the shock value for tourists.
And look, I'm an incredibly adventurous eater. I've overcome a lot of things that initially were aversions. But I simply don't see how you can overcome an aversion to actual rotting meat. It would seem to be on par with an aversion to eating feces.
What does seem plausible is that, as we know, fermenting and aging meat is a thing, and that people who aren't used to fermenting can find it disgusting and mistake it for rotting, and that explorers confused the two and sometimes even embellished their accounts for dramatic effect, again to highlight just how primitive these people were. But the idea that humans regularly ate rotten meat -- all I'll say is, extraordinary claims require extraordinary evidence. Do we really have that evidence?
This is a pretty interesting point about a carbohydrate source via animals:
> Western explorers noted that the Inuit also ate chyme, the stomach contents of reindeer and other plant-eating animals. Chyme provided at least a side course of plant carbohydrates.
I recently turned vegetarian, and I feel that (controversial opinion coming ahead) a vegetarian diet (lacto-ovo, that is) is the closest thing to a paleo diet we have today. The reason is that milk fat and egg-yolk fat provide the most realistic balance of protein and fat, which our ancestors previously got exclusively from eating animals whole, consuming not just the flesh but also the organs (just as the article above confirms).
Also, the composition of milk and egg-yolk fat matters here, as it is similar to animal-derived fat (e.g. saturated fat, a favorable omega-3 to omega-6 ratio, the presence of fat-soluble vitamins like vitamin A, etc.).
Extra: with both eggs and milk, it's easy to play with the protein-to-fat ratio, by altering the ratio of whole eggs to egg whites and the degree of skimming of the milk, respectively.
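As a rough illustration of that knob for the egg case (the macro numbers below are common nutrition-label approximations I'm assuming, not figures from the article):

```python
# Assumed per-item macros in grams, typical nutrition-label values:
# one large whole egg ~6 g protein / ~5 g fat; one large egg white ~3.6 g protein / ~0.1 g fat.
WHOLE_EGG = (6.0, 5.0)
EGG_WHITE = (3.6, 0.1)

def egg_blend(whole_eggs, whites):
    """Return (protein_g, fat_g, protein-to-fat ratio) for whole eggs plus extra whites."""
    protein = whole_eggs * WHOLE_EGG[0] + whites * EGG_WHITE[0]
    fat = whole_eggs * WHOLE_EGG[1] + whites * EGG_WHITE[1]
    return protein, fat, protein / fat

print(egg_blend(2, 0))  # two whole eggs alone: ratio ~1.2
print(egg_blend(2, 3))  # adding three whites pushes the ratio up to ~2.2
```

Swapping whites in (or out) moves the ratio without changing the fat profile, which is the point being made above.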
As an aside, most industrial meat is exposed, either intentionally or unintentionally, to lactobacillus strains. Working in kitchens I've had the opportunity to taste spoiling meat that had high amounts of it. When prepared right it is actually quite good. The taste can be similar to dry-aged steak, but juicier and more flavorful. I've never gotten sick, but it certainly seems like a risk, however tasty it may be.
Lactic acid fermentation [1] is the secret to many sausages. In the case of Mettwurst [2], made of raw meat, it is essential. Aged steak, hanged pheasants [3 - with a discussion of bacteria] - even today, the list should be long.
Yeah, you can definitely lacto-ferment meat; it's a major flavor component of the European dry sausages. Doing it safely isn't hard, but a lot of the nuance of the technique goes towards making it actually taste good. It's very easy to lose control of water activity and get out-of-control rancid elements, or get air ingress and grow aerobic bacteria that clearly taste like rot, or end up with too much oxidation of the fat, or the wrong mold.
Definitely pretty advanced-level fermentation stuff, with more to control and more serious consequences than making kimchi or something. Naem and other SE Asian sour sausages are probably the most approachable entry point if you want to give it a swing, though.
>“a gold mine of ethnohistorical accounts makes it clear that the revulsion Westerners feel toward putrid meat and maggots is not hardwired in our genome but is instead culturally learned,” Speth says.
These two statements aren't necessarily in contradiction with one another. The revulsion can be hardwired in our genome, and we can also culturally train ourselves to ignore that revulsion.
I remember watching a video a few years ago about a present day tribe (I believe in SE Asia) that went on a hunt and killed a monkey. Due to the heat by the time the hunting party was back the meat had spoiled but nobody in the village had any issue with diving in and eating it. I would think that up until very recent times in human development this was very common.
I can say that I grew up on "well done" meat. Even as an adult, it's taken years to get used to medium-rare steak, and the texture of raw fish is still really hard for me to get past. It's definitely a learned thing, and the hyper-palatable "foods" we have today don't exactly help things at all.
Sadly for the cause of lolz, the subreddit where folks would trade tips about eating "high meat" and reassure each other that intestinal parasites are "totally natural, like our caveman ancestors had" went private a while back.
It's worth considering that our recent obsession with the "caveman diet" is the extension (and obvious continuation) of the early modern "noble savage" construct. It seems we ran out of culture clashes of colonization and now are trying to colonize the past to process our anxieties and dissatisfactions.
Scavenging explains many of the body traits that humans have. Our good binocular vision lets us see long distances, so prehistoric man may have been able to spot buzzards circling over a carcass miles away. We then had to run toward the carcass, and run fast, to beat the other animals to it; our skeletal structure allows us to run, and the fact that we have little hair allows us to sweat and shed heat, so we can cover long distances quickly. Once we reached the carcass, there were likely other scavengers on it, such as hyenas, and our ability to grasp and throw rocks let us scare them off. And even if little meat was left, we could pick up a rock, smash the bones, and get the marrow out. This explains how we got good long-distance binocular vision, a skeleton built for running, a body that sheds heat through sweating, and hands that can grasp and throw stones and smash bones for marrow.
Fish sauce, dry aged steaks, and bellota ham come to mind. All delicious, and while I’m not sure if you’d call them putrefied meat, it’s not that far off.
There is bacterial action at work giving it that delicious flavor. I don't know where the line is between the bacterial breakdown of meat tissue in jamón and putrefaction; I wanted to highlight that bacterially altered meat is a delicacy today. Maybe people back then were also looking for umami?
> It is interesting to note that humans, uniquely among the primates so far considered, appear to have stomach pH values more akin to those of carrion feeders than to those of most carnivores and omnivores.
"It is interesting to note that humans, uniquely among the primates so far considered, appear to have stomach pH values more akin to those of carrion feeders than to those of most carnivores and omnivores. In the absence of good data on the pH of other hominoids, it is difficult to predict when such an acidic environment evolved. Baboons (_Papio_ spp) have been argued to exhibit the most human–like of feeding and foraging strategies in terms of eclectic omnivory, but their stomachs – while considered generally acidic (pH = 3.7) – do not exhibit the extremely low pH seen in modern humans (pH = 1.5). One explanation for such acidity may be that carrion feeding was more important in humans (and more generally hominin) evolution than currently considered to be the case [...]" - ["The Evolution of Stomach Acidity and Its Relevance to the Human Microbiome" (2015)](https://journals.plos.org/plosone/article?id=10.1371/journal...)