
Posts Tagged ‘diet’


Topic: Eating with the Ancient Greeks

In the beginning there were acorns. Then the ancient Greeks said, “let there be bread and wine.” But this was not enough, and so the ingredients became fruitful and multiplied – pulses, meat, bread and oil. And so it has continued until now, when our kitchen table is full of fats.

This could be the brief history of Mediterranean nutrition. But this history, or to be more precise, this experiment started 4,000 years ago in ancient Greece and keeps evolving.

According to an associate professor of Biology at the Aristotle University of Thessaloniki, this way of eating is not the intelligent invention of some clever Mediterranean people, but the result of constant interaction between the inhabitants of this region and the surrounding natural environment.

In the years before Homer, the situation was dramatic, the professor claims. The Homeric period is regarded as an important moment for the Mediterranean peoples, since this is when they began cultivating grain.

The ancient Arcadians fed mostly on acorns. Later, in the Mycenaean Age, a nutritional revolution began: people introduced bread into their daily meals. Grain was the most important source of proteins and carbohydrates for both the people and the animals of the time.

Homer wrote that the staples of a meal were bread, meat and wine. He never mentioned vegetables, even though he often included details of ancient Greek nutrition in his writings.

The reason Mediterraneans consumed such great amounts of meat may have been the need for fats, which they could not get from any other source.

As for olive oil, which was already known in Homer’s Greece, it was used only as part of ancient Greek rituals – for example, athletes at the Olympic Games anointed their bodies with oil before entering the arena.

Oil was not included in the “Mediterranean trio” until classical times.

Historian Herodotus reports that Athens was the center of olive tree cultivation. Scientists estimate that every adult Athenian consumed, on average, 55 litres of olive oil per year.

Original article:

greekreporter.com


Topic: Ancient Native Americans

Note: I have set in bold the part of the article that makes mention of ancient foods.

 

 

The federal government should fix or drop new regulations that throttle scientific study of America’s heritage.

A rare set of nearly 10,000-year-old human bones found in 1976 on a seaside bluff in La Jolla, Calif., may soon be removed from the custody of the University of California, San Diego, and turned over to the local Kumeyaay Nation tribes. The Kumeyaay have long sought control over the bones, which they contend are the remains of their ancestors. In accordance with new federal regulations, the university has initiated the legal process to transfer the remains to the Kumeyaay in the absence of other claimants. The Kumeyaay have said they may rebury the bones. As some of the oldest human skeletal remains in North America, the bones could help scientists piece together the peopling of the New World. The excellent preservation of the specimens hints that they might contain DNA suitable for analysis with techniques geneticists have recently developed – the results of which could yield crucial insights into where early Americans came from. Such studies may never come to pass.

Some might consider a loss of knowledge an acceptable trade-off to right the historic wrongs that the Kumeyaay and other Native peoples have suffered. Archaeologists and anthropologists of yore treated Native Americans disgracefully, looting their graves and using the remains to argue for the intellectual inferiority of Native Americans to peoples of Caucasian descent. But what makes this case disturbing is that the Kumeyaay claim is based on folklore. The physical evidence indicates that the La Jolla bones are not affiliated with any modern tribe, including the Kumeyaay, who moved into the area only within the past few thousand years. The new federal regulations are blind to this evidence. In effect, they privilege faith over fact.

The original intention of the Native American Graves Protection and Repatriation Act (NAGPRA), passed in 1990, was to facilitate the return of Native American bones and sacred objects to descendants and culturally affiliated groups. NAGPRA sought to balance the rights of Native Americans to reclaim ancestral remains with the right of society as a whole to learn about our collective past. By and large, the law was succeeding. In recent years scientists and representatives of Native peoples have been working together to everyone’s gain.

For example, archaeologist Alston Thoms of Texas A&M University has been consulting with Native Americans about their cooking techniques, to gain insights into the subsistence strategies of people who lived on the South Texas plains thousands of years ago. Members of the Tap Pilam Coahuiltecan Nation – who consider themselves the descendants of those ancient Texans – have, in turn, been learning about ancestral foods and incorporating them into their diet to counter the high rate of diabetes in their population.

Many Native Americans do not object to studies per se but to analyses that destroy remains. Respecting this concern, anthropologist Ventura Pérez of the University of Massachusetts Amherst, who studies violence, has developed techniques for making high-quality replicas of cut marks on bone that leave the skeletal material intact and allow it to be repatriated, while creating a permanent record for future scholars.

To be sure, not all was well. Many tribes worried that museums were stalling on identifying remains to avoid having to return them. In May 2010 the U.S. Department of the Interior responded with regulations that allow tribes to claim even those remains whose affiliation cannot be established scientifically, as long as they were found on or near the tribes’ aboriginal lands. These rules nudge museums to get on with evaluating their collections, but they paint with too broad a brush. They upset the balance that NAGPRA had achieved and foster antagonism, not just between tribes and scientists but also among tribes with conflicting claims. The La Jolla case is just one example. Thousands of remains could be made inaccessible to researchers. In our view, the new regulations should be repealed or, at least, revised to distinguish different classes of unidentified remains.

The colonization of the New World was a watershed in the odyssey that carried Homo sapiens from its African birthplace to the entire globe. The stories of the trailblazers who accomplished that feat deserve to be told. Their remains are the shared patrimony of all Americans and, indeed, all peoples everywhere.

Original article:

archaeologydaily.com

April 19, 2012

 


Topic: Diet of early man

New technologies challenge old ideas about early hominid diets.

Original article:

Oct 13, 2011

eurekalert.org


Topic: Sharing Meat

Bone from the Qesem Cave showing irregular cut marks.

 

Contestants on TV shows like Top Chef and Hell’s Kitchen know that their meat-cutting skills will be scrutinized by a panel of unforgiving judges. Now, new archaeological evidence is getting the same scrutiny by scientists at Tel Aviv University and the University of Arizona.

Their research is providing new clues about how, where and when our communal habits of butchering meat developed, and they’re changing the way anthropologists, zoologists and archaeologists think about our evolutionary development, economics and social behaviors through the millennia.

Presented in the Proceedings of the National Academy of Sciences, new finds unearthed at Qesem Cave in Israel suggest that during the late Lower Paleolithic period (between 400,000 and 200,000 years ago), people hunted and shared meat differently than they did in later times. Instead of a prey’s carcass being prepared by just one or two persons, resulting in clear and repeated cutting marks — the forefathers of the modern butcher — the cut marks on ancient animal bones suggest something else.

Different rules of the game

“The cut marks we are finding are both more abundant and more randomly oriented than those observed in later times, such as the Middle and Upper Paleolithic periods,” says Prof. Avi Gopher of TAU’s Department of Archaeology. “What this could mean is that either one person from the clan butchered the group’s meat in a few episodes over time, or multiple persons hacked away at it in tandem,” he interprets. This finding provides clues as to social organization and structures in these early groups of hunters and gatherers, he adds.

Among human hunters in the past 200,000 years, from southern Africa to upstate New York or sub-arctic Canada, “there are distinctive patterns of how people hunt, who owns the products of the hunt, how carcasses are butchered and shared,” Prof. Gopher says. “The rules of sharing are one of the basic organizing principles of hunter-gatherer cultures. From 200,000 years ago to the present day, the patterns of meat-sharing and butchering run in a long clear line. But in the Qesem Cave, something different was happening. There was a distinct shift about 200,000 years ago, and archaeologists and anthropologists may have to reinterpret hunting and meat-sharing rituals.”

Meat-sharing practices, Prof. Gopher says, can tell present-day archaeologists about who was in a camp, how people dealt with danger and how societies were organized. “The basic logic of butchering large animals has not changed for a long time. Everyone knows how to deal with the cuts of meat, and we see cut marks on bones that are very distinctive and similar, matching even those of modern butchers. It’s the more random slash marks on the bones in Qesem that suggest something new.”
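“Randomly oriented” can be made concrete with simple circular statistics. The sketch below is only an illustration of how one might quantify the idea, not the method used in the Qesem Cave study; the angle values and the alignment measure are assumptions for demonstration.

```python
# Illustrative only: quantifying how aligned a set of cut marks is.
# Cut-mark orientations are axial (a mark at 10 degrees is the same as one at
# 190 degrees), so angles are doubled before averaging. The mean resultant
# length R is near 1 for nearly parallel marks and near 0 for random ones.
import math

def alignment_strength(orientations_deg):
    """Mean resultant length of doubled angles for axial data, in [0, 1]."""
    doubled = [math.radians(2 * a) for a in orientations_deg]
    c = sum(math.cos(t) for t in doubled) / len(doubled)
    s = sum(math.sin(t) for t in doubled) / len(doubled)
    return math.hypot(c, s)

butcher_like = [12, 15, 9, 14, 11, 13]   # hypothetical, nearly parallel marks
qesem_like = [5, 80, 150, 33, 110, 62]   # hypothetical, scattered orientations

print(alignment_strength(butcher_like))  # close to 1.0
print(alignment_strength(qesem_like))    # much lower, i.e. "more random"
```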

Where’s the beef?

The Qesem Cave finds demonstrate that humans were at the top of the food chain during this period, but that they shared meat differently than their later cousins. The TAU excavators and Prof. Mary Stiner of the University of Arizona (Tucson) hypothesize that the Qesem Cave people hunted cooperatively. After the hunt, they carried the highest-quality body parts of their prey back to the cave, where the meat was cut using stone-blade tools and then cooked on the fire.

“We believe this reflects a different way of butchering and sharing. More than one person was doing the job, and it fits our expectations of a less formal structure of cooperation,” says Prof. Gopher. “The major point here is that around 200,000 years ago or before, there was a change in behavior. What does it mean? Time and further excavations may tell.”

Qesem, which means “magic” in Hebrew, was discovered seven miles east of Tel Aviv about nine years ago during highway construction. It is being excavated on behalf of TAU’s Department of Archaeology by Prof. Avi Gopher and Dr. Ran Barkai in collaboration with an international group of experts. The cave contains the remains of animal bones dating back to 400,000 years ago. Most of the remains are from fallow deer, others from wild ancestors of horse, cattle, pig, and even some tortoise. The data that this dig provides has been invaluable: Until now there was considerable speculation as to whether or not people from the late Lower Paleolithic era were able to hunt at all, or whether they were reduced to scavenging, the researchers say.

Original article:

aftau.org

October 2009

 


Topic: Diet and Brain Development

Almost two million years ago, early humans began eating foods such as crocodiles, turtles and fish – a diet that could have played an important role in the evolution of human brains and our footsteps out of Africa, according to new research.

In what is the first evidence of consistent amounts of aquatic foods in the human diet, an international team of researchers has discovered early stone tools and cut-marked animal remains in northern Kenya. The work has just been published in the Proceedings of the National Academy of Sciences (PNAS).

“This site in Africa is the first evidence that early humans were eating an extremely broad diet,” says Dr Andy Herries from the University of New South Wales (UNSW), the only researcher from Australia to have worked with the team. The project represents a collaborative effort with the National Museums of Kenya and is led by David Braun of the University of Cape Town in South Africa and Jack Harris of Rutgers University in the US.

The researchers found evidence of the early humans eating both freshwater fish and land animals at the site in the northern Rift Valley of Kenya. It is thought that small-bodied early Homo would have scavenged the remains of these creatures, rather than hunting for them.

“This find is important because fish in particular has been associated with brain development, and it is after this period that we see smaller-brained hominin species evolving into larger-brained Homo species – Homo erectus – the first hominin to leave Africa,” says Dr Herries, of the School of Medical Sciences.

“A broader diet as suggested by the site’s archaeology may have been the catalyst for brain development and humanity’s first footsteps out of Africa.”

Dr Herries dated the archaeological remains using palaeomagnetism, a technique that identifies the fossilised direction of the Earth’s magnetic field recorded in sediments.

Original article:

6/2/2010

Sciencealert.com.au


Topic: One Subject, Two Views – Last Supper

 

ITHACA, NY. Were the twelve apostles guilty of overeating at the Last Supper? Two brothers—an eating behavior expert and a religious studies scholar—are publishing findings that might make you think twice at your Easter dinner.

Brian and Craig Wansink teamed up to analyze the amount of food depicted in 52 of the best-known paintings of the Last Supper. After indexing the sizes of the foods by the sizes of the average disciple’s head, they found that portion size, plate size, and bread size increased dramatically over the last one thousand years. Overall, the main courses depicted in the paintings grew by 69%, plate size by 66%, and bread size by 23%.
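The indexing step is plain arithmetic: each depicted item is scaled by the average disciple’s head size in the same painting, so portions can be compared across canvases of different sizes, and growth is then a percent change between periods. A minimal sketch of that calculation, using invented measurements rather than the study’s data:

```python
# Minimal sketch of head-indexed portion sizes (hypothetical numbers, not the study's data).

def relative_size(item_area, mean_head_area):
    """Scale a depicted item by the average disciple's head size in the same painting."""
    return item_area / mean_head_area

def percent_growth(early, late):
    """Percent change from an early-period to a late-period relative size."""
    return (late - early) / early * 100

early_main_course = relative_size(item_area=120.0, mean_head_area=100.0)  # ~1.2 head-areas
late_main_course = relative_size(item_area=205.0, mean_head_area=101.0)   # ~2.0 head-areas

print(f"main course grew by about {percent_growth(early_main_course, late_main_course):.0f}%")
```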

The study’s findings will be published in the April 2010 issue of the International Journal of Obesity and released in the online version of the journal on Tuesday, March 23.

“I think people assume that increased serving sizes, or ‘portion distortion,’ is a recent phenomenon,” said Brian Wansink, professor and director of the Cornell Food and Brand Lab. “But this research indicates that it’s a general trend for at least the last millennium.”

“As the most famously depicted dinner of all time, the Last Supper is ideally suited for review,” said Craig Wansink, professor of religious studies at Virginia Wesleyan College.

“The method we used created a natural crossroads between our two divergent fields and a wonderful opportunity to collaborate with my brother,” he added.

Portion size and spatial relationships are familiar topics in Brian Wansink’s work in food and eating behavior. In his book Mindless Eating: Why We Eat More Than We Think, he explores the hidden cues that determine what, when, and how much we eat.

Original article:

Eurekalert.org

03/23/2010


Topic: Who knew – pickles could do so much?

People have been eating pickles ever since the Mesopotamians started making them way back in 2400 B.C.E. Here are some even more important things you should know about them.

1. In the Pacific Islands, natives pickle their foods in holes in the ground lined with banana leaves, and use them as food reserves in case of storms. The pickles are so valuable that they’ve become part of the courting process, helping a man prove he’ll be able to provide for a woman. In Fiji, a guy can’t get a girl without first showing her parents his pickle pits.

2. Cleopatra claimed pickles made her beautiful. (We guess it had more to do with her genes.)

3. The majority of pickle factories in America ferment their pickles in outdoor vats without lids (leaving them subject to insects and bird droppings). But there’s a reason. According to food scientists, the sun’s direct rays prevent yeast and mold from growing in the brine.

4. In the Delta region of Mississippi, Kool-Aid pickles have become ridiculously popular with kids. The recipe’s simple: take some dill pickles, cut them in half, and then soak them in super-strong Kool-Aid for more than a week. According to the New York Times, the sweet vinegar snacks are known to sell out at fairs and delicatessens, and generally go for $0.50 to $1.

5. Not everyone loves a sweet pickle. In America, dill pickles are twice as popular as the sweet variety.

6. The Department of Agriculture estimates that the average American eats 8.5 lbs of pickles a year.

7. When the Philadelphia Eagles thrashed the Dallas Cowboys in sweltering heat in September 2000, many of the players attributed their win to one thing: guzzling down immense quantities of ice-cold pickle juice.

8. If it weren’t for pickles, Christopher Columbus might never have “discovered” America. On his famous 1492 voyage, Columbus rationed pickles to his sailors to keep them from getting scurvy. He even grew cucumbers during a pit stop in Haiti to restock for the rest of the voyage.

9. Speaking of people who get credit for discovering America, when he wasn’t drawing maps and trying to steal Columbus’ thunder, Amerigo Vespucci was a well-known pickle-merchant.

10. Napoleon was also a big fan of pickle power. In fact, he put up the equivalent of $250,000 as a prize to whoever could figure out the best way to pickle and preserve foods for his troops.

11. During the 1893 Chicago World’s Fair, H. J. Heinz used pickle-shaped pins to lure customers to his out-of-the-way booth. By the end of the fair, he’d given out lots of free food, and over 1,000,000 pickle pins.

12. Berrien Springs, Michigan, has dubbed itself the Christmas Pickle Capital of the World. In early December, they host a parade, led by the Grand Dillmeister, who tosses out fresh pickles to parade watchers. 

Original article:

CNN.com 

By Shannon Cothran

08/07/2009


Topic: Man’s Early Diet

Image of what Lucy might have looked like

Peter Ungar, professor of anthropology, will present the team’s findings on Oct. 20 during a presentation at the Royal Society in London, England, as part of a discussion meeting about the first 4 million years of human evolution.

“The Lucy species is among the first hominids to show thickened enamel and flattened teeth,” an indication that hard or abrasive foods, such as nuts, seeds and tubers, might have been on the menu, Ungar said. However, the microwear texture analysis indicates that tough objects, such as grass and leaves, dominated Lucy’s diet.

“This challenges long-held assumptions and leads us to questions that must be addressed using other techniques,” Ungar said. Researchers thought that with the development of thick enamel, robust skulls and large chewing muscles, these species had evolved to eat hard, brittle foods. However, the microwear texture analysis shows that these individuals were not eating such foods toward the end of their lives.

The researchers used a combination of scanning confocal microscopy and scale-sensitive fractal analysis to create a microwear texture analysis of the molars from 19 specimens of A. afarensis, the Lucy species, which lived between 3.9 and 2.9 million years ago, and three specimens of A. anamensis, which lived between 4.1 and 3.9 million years ago. They looked at the complexity and directionality of wear textures in the teeth they examined. Because food interacts with teeth, it leaves behind telltale signs that can be measured. Hard, brittle foods like nuts and seeds tend to leave more complex tooth profiles, while tough foods like leaves generally leave more parallel scratches, which corresponds to directionality.
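As a rough illustration of how those two texture measures map onto diet, the following sketch encodes the rule of thumb stated above. The metric names, scales, and thresholds are assumptions for demonstration; they are not the values or procedure used in the study.

```python
# Toy reading of a dental microwear signature (hypothetical metrics and thresholds).
from dataclasses import dataclass

@dataclass
class MicrowearTexture:
    complexity: float      # roughness across scales; higher = more pitted, "complex"
    directionality: float  # share of wear features with a common orientation, 0..1

def inferred_diet(t: MicrowearTexture) -> str:
    """Crude interpretation following the logic in the text above."""
    if t.complexity > 2.0 and t.directionality < 0.4:
        return "hard, brittle foods (nuts, seeds) likely"
    if t.directionality > 0.6:
        return "tough foods (grasses, leaves) likely"
    return "mixed or ambiguous signal"

# A Lucy-like signature: low complexity, strongly parallel scratches.
print(inferred_diet(MicrowearTexture(complexity=0.9, directionality=0.8)))
```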

“The long-held assumption was that the development of thick enamel, robust skulls and larger chewing muscles marked the beginning of a shift towards hard, brittle foods, such as nuts, seeds and tubers,” Ungar said. “The Lucy species and the species that came before it did not show the predicted trajectory.”

Next they compared the microwear profiles of these two species with microwear profiles from Paranthropus boisei, known as Nutcracker Man, which lived between 2.3 and 1.2 million years ago; P. robustus, which lived between 2 million and 1.5 million years ago; and Australopithecus africanus, which lived between about 3 million and 2.3 million years ago. They also compared the microwear profiles of the ancient hominids to those of modern-day primates that eat different types of diets.

The researchers discovered that microwear profiles of the three east African species, A. afarensis, A. anamensis and P. boisei, differed substantially from the two south African species, P. robustus and A. africanus, both of which showed evidence of diets consisting of hard and brittle food.

“There are huge differences in the size and shape of teeth between the species in eastern Africa, but not in their microwear,” Ungar said. “This opens a whole new set of questions.”

Ungar’s colleagues include Robert S. Scott, assistant professor of anthropology at Rutgers University; Frederick E. Grine, professor of anthropology at Stony Brook University; and Mark F. Teaford, professor of anthropology at Johns Hopkins University.

Original article:

PHYSORG.com

10/22/2009


Why Humans Outlive Apes

Topic: Steak – Let’s hear it for meat eaters

Genetic changes that apparently allow humans to live longer than any other primate may be rooted in a more carnivorous diet. 

These changes may also promote brain development and make us less vulnerable to diseases of aging, such as cancer, heart disease and dementia. 

Chimpanzees and other great apes are genetically similar to humans, yet they rarely live for more than 50 years. Although the average human lifespan has doubled in the last 200 years – due largely to decreased infant mortality related to advances in diet, environment and medicine – even without these improvements, people living high-mortality hunter-forager lifestyles still have twice the life expectancy at birth that wild chimpanzees do.

These key differences in lifespan may be due to genes that humans evolved to adjust better to meat-rich diets, biologist Caleb Finch at the University of Southern California in Los Angeles suggested.

Mmmm … raw, red meat 

The oldest known stone tools manufactured by the ancestors of modern humans, which date back some 2.6 million years, apparently helped butcher animal bones. As our forerunners evolved, they became better at capturing and digesting meat, a valuable, high-energy food, by increasing brain and body size and reducing gut size. 

Over time, eating red meat, particularly raw flesh infected with parasites in the era before cooking, stimulates chronic inflammation, Finch explained. In response, humans apparently evolved unique variants in a cholesterol-transporting gene, apolipoprotein E, which regulates chronic inflammation as well as many aspects of aging in the brain and arteries. 

One variant found in all modern human populations, known as ApoE3, emerged roughly 250,000 years ago, “just before the final stage of evolution of Homo sapiens in Africa,” Finch explained. 

ApoE3 lowers the risk of most aging diseases, specifically heart disease and Alzheimer’s, and is linked with an increased lifespan. 

“I suggest that it arose to lower the risk of degenerative disease from the high-fat meat diet they consumed,” Finch told LiveScience. “Another benefit is that it promoted brain development.” 

Puzzle remains

Curiously, another, more ancient variant of apolipoprotein E, found to a lesser degree in all human populations, is ApoE4, which is linked with high cholesterol, a shortened lifespan and degeneration of the arteries and brain.

“The puzzle is, if ApoE4 is so bad, why is it still present?” Finch asked. “It might have some protective effects under some circumstances. A little bit of data suggests that with hepatitis C, you have less liver damage if you have ApoE4.”

Finch detailed these findings in the December issue of Proceedings of the National Academy of Sciences Early Edition.

Original Article:

Livescience.com

Charles Q Choi

Dec 2009


Topic: Fish

Human jaw. Caption: This is the lower mandible of the 40,000-year-old human skeleton, found in the Tianyuan Cave near Beijing. Analyses of collagen extracted from this bone prove that this individual was a regular consumer of fish.

The isotopic analysis of a bone from one of the earliest modern humans in Asia – the 40,000-year-old skeleton from Tianyuan Cave in the Zhoukoudian region of China, near Beijing – has shown that this individual was a regular fish consumer (PNAS, 07.07.2009). The analysis was carried out by an international team of researchers from the Max Planck Institute for Evolutionary Anthropology in Leipzig, the Graduate University of the Chinese Academy of Sciences and the Institute of Vertebrate Paleontology and Paleoanthropology in Beijing, the University of British Columbia in Vancouver, and Washington University in Saint Louis.

Freshwater fish are a major part of the diet of many peoples around the world, but it has been unclear when fish became a significant part of the year-round diet for early humans. Chemical analysis of the protein collagen, using ratios of the isotopes of nitrogen and sulphur in particular, can show whether such fish consumption was an occasional treat or part of the staple diet.

The isotopic analysis of the diet of one of the earliest modern humans in Asia, the 40,000-year-old skeleton from Tianyuan Cave near Beijing, has shown that at least this individual was a regular fish consumer. Michael Richards of the Max Planck Institute for Evolutionary Anthropology explains: “Carbon and nitrogen isotope analysis of the human and associated faunal remains indicate a diet high in animal protein, and the high nitrogen isotope values suggest the consumption of freshwater fish.” To confirm this inference, the researchers measured the sulphur isotope values of terrestrial and freshwater animals around the Zhoukoudian area and of the Tianyuan human.
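For readers unfamiliar with the notation, stable isotope results such as these are conventionally reported as delta values: the per-mil deviation of the sample’s isotope ratio from a reference standard (atmospheric nitrogen for 15N/14N). A minimal sketch of that calculation follows; the measured collagen ratio is invented for illustration and is not a value from the Tianyuan study.

```python
# Delta notation for stable isotopes: delta = (R_sample / R_standard - 1) * 1000 (per mil).
# The collagen ratio below is hypothetical; the AIR standard value is approximate.

def delta_per_mil(r_sample, r_standard):
    """Per-mil deviation of a sample isotope ratio from a reference standard."""
    return (r_sample / r_standard - 1.0) * 1000.0

R_AIR = 0.0036765        # approximate 15N/14N ratio of atmospheric nitrogen (the standard)
r_collagen = 0.0037200   # hypothetical 15N/14N ratio measured in bone collagen

print(f"delta-15N = {delta_per_mil(r_collagen, R_AIR):+.1f} per mil")
# Higher delta-15N generally indicates a higher trophic level, e.g. heavy reliance
# on animal protein or freshwater fish, as described in the text.
```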

Since fish appeared on the menu of modern humans before consistent evidence for effective fishing gear appeared, fishing at this level must have involved considerable effort. This shift to more fish in the diet likely reflects greater pressure from an expanding population at the time of modern human emergence across Eurasia. “This analysis provides the first direct evidence for the consumption of aquatic resources by early modern humans in China and has implications for early modern human subsistence and demography”, says Richards.

Original work:

Yaowu Hu, Hong Shang, Haowen Tong, Olaf Nehlich, Wu Liu, Chaohong Zhao, Jincheng Yu, Changsui Wang, Erik Trinkaus, Michael P. Richards
Stable Isotope Dietary Analysis of the Tianyuan 1 Early Modern Human
PNAS. July 7, 2009. Vol. 106, No. 27

Reported in EurekAlert

July 7, 2009

