Wednesday 24 April 2013

Epigenetics - An introductory exposition

We may be on the verge of collectively knowing just how much we don't know. Biology stands on the brink of a shift in the understanding of inheritance. The discovery of epigenetics -- hidden influences upon the genes -- could affect every aspect of our lives.


The picture of a genetic makeup that fluctuates by the hour and minute seems at odds with the public perception that genes determine everything from our physical characteristics all the way to our behaviour. Many scientists seem to think that our genes form an immutable blueprint that our cells must forever follow. British research scientist and Oxford professor Susan Greenfield says, "the reductionist genetic train of thought fuels the currently highly fashionable concept of a gene for this or that".

Niles Eldredge, in his book Why We Do It, says: "genes have been the dominant metaphor underlying all manner of human behaviour, from the most basic to animalistic, like sex, up to and including such esoterica as the practise of religion, the enjoyment of music, and the codification of laws and moral strictures... The media are besotted with genes... genes have for over half a century easily eclipsed the outside natural world as the primary driving force of evolution in the minds of evolutionary biologists."

The tools of our consciousness, including our beliefs, thoughts, intentions and actions, often seem to correlate much more strongly with our health, longevity, and happiness than our genes do. Larry Dossey, MD, observes in his much-cited publication "Health Perceptions and Survival: Do Global Evaluations of Health Status Really Predict Mortality?": "Several studies show that what one thinks about one's health is one of the most accurate predictors of longevity ever discovered". Studies show that a committed spiritual practice involving mindfulness can add many years to our lives, regardless of our genetic mix.

As we think our thoughts and feel our feelings, our bodies change and respond with a complex array of shifts; each thought releases a particular mixture of biochemicals in our organs and triggers genetic changes in our cells. Psychologist Ernest Rossi, in his text The Psychobiology of Gene Expression, explores "how our subjective states of mind, consciously motivated behaviour, and our perception of free will can modulate gene expression to optimize health". Nobel prize winner Eric Kandel, MD, believes that in future treatments "social influences will be biologically incorporated in the altered expressions of specific genes in specific nerve cells of specific areas of the brain".


Brain researchers Kempermann and Gage envision a future in which the regeneration of damaged neural networks is a cornerstone of medical treatment, and doctors' prescriptions include "modulations of environmental or cognitive stimuli" and "alterations of physical activity". In other words, doctors of the future will prescribe, instead of (or in addition to) a drug, a particular therapeutic belief or thought, a positive feeling, or an affirmative social activity.


Randy Jirtle discovered that he could make agouti mice produce normal, healthy young by changing the expression of their genes, without making any changes to the mice's DNA, simply by feeding the mothers methyl groups. These molecule clusters are able to inhibit the expression of genes, and sure enough, the methyl groups eventually worked their way through the mothers' metabolism to attach to the agouti genes of the developing embryos.

In Layman's Terms

At the heart of this new field is a simple but contentious idea -- that genes have a memory. This is not the same as 'water memory' or homeopathy; this has empirical scientific foundations. Suppose that the lives of your grandparents -- the air they breathed, the food they ate, even the things they saw -- can directly affect you, decades later, despite your never experiencing these things yourself. And that what you do in your lifetime could in turn affect your grandchildren.

The conventional view is that DNA carries all our heritable information and that nothing an individual does in their lifetime will be biologically passed to their children. To many scientists, epigenetics amounts to a heresy, calling into question the accepted view of the DNA sequence -- a cornerstone on which modern biology sits.

Epigenetics adds a whole new layer to genes beyond the DNA. It proposes a control system of 'switches' that turn genes on or off -- and suggests that things people experience, like nutrition and stress, can control these switches and cause heritable effects in humans.

In a remote town in northern Sweden there is evidence for this radical idea. Lying in Överkalix's parish registries of births and deaths and its detailed harvest records is a secret that confounds traditional scientific thinking. Marcus Pembrey, a Professor of Clinical Genetics at the Institute of Child Health in London, in collaboration with Swedish researcher Lars Olov Bygren, has found evidence in these records of an environmental effect being passed down the generations. They have shown that a famine at critical times in the lives of the grandparents can affect the life expectancy of the grandchildren. This is the first evidence that an environmental effect can be inherited in humans.

In other independent groups around the world, the first hints that there is more to inheritance than just the genes are coming to light. The mechanism by which this extraordinary discovery can be explained is starting to be revealed.

Professor Wolf Reik, at the Babraham Institute in Cambridge, has spent years studying this hidden ghost world. He has found that merely manipulating mice embryos is enough to set off 'switches' that turn genes on or off.

For mothers like Stephanie Mullins, who had her first child by in vitro fertilisation, this has profound implications. It means it is possible that the IVF procedure caused her son Ciaran to be born with Beckwith-Wiedemann Syndrome -- a rare disorder linked to abnormal gene expression. It has been shown that babies conceived by IVF have a three- to four-fold increased chance of developing this condition.

And Reik's work has gone further, showing that these switches themselves can be inherited. This means that a 'memory' of an event could be passed through generations. A simple environmental effect could switch genes on or off -- and this change could be inherited.

His research has demonstrated that genes and the environment are not mutually exclusive but are inextricably intertwined, one affecting the other.

The idea that inheritance is not just about which genes you inherit but whether these are switched on or off is a whole new frontier in biology. It raises questions with huge implications, and means the search will be on to find what sort of environmental effects can affect these switches.

After the tragic events of September 11th 2001, Rachel Yehuda, a psychologist at the Mount Sinai School of Medicine in New York, studied the effects of stress on a group of women who were inside or near the World Trade Center and were pregnant at the time. Produced in conjunction with Jonathan Seckl, an Edinburgh doctor, her results suggest that stress effects can pass down generations. Meanwhile research at Washington State University points to toxic effects -- like exposure to fungicides or pesticides -- causing biological changes in rats that persist for at least four generations.

This work is at the forefront of a paradigm shift in scientific thinking. It will change the way the causes of disease are viewed, as well as the importance of lifestyles and family relationships. What people do no longer just affects themselves, but can determine the health of their children and grandchildren in decades to come. "We are," as Marcus Pembrey says, "all guardians of our genome."

The mechanics of epigenetics


For the materialistically predisposed, adding small methyl groups to specific points of DNA is one of the main ways of turning a gene off. This short video and news report give you a layman's understanding.

Because the information encoded directly in the gene sequence is limited, organisms such as Plasmodium falciparum rely on epigenetic changes to regulate their genes. Epigenetic changes are more flexible than genetic changes and permit rapid yet reversible adaptation, so determining which proteins can be turned off, via the release of the hormones, endorphins and neurochemicals that relate to particular states of mind, is very important.

Epigenetic changes determine which proteins are transcribed. So far, three systems are known to intertwine with one another to silence genes: histone modifications (changes to the histone proteins that are the primary components of chromatin, the packaging around which DNA is wound in chromosomes), RNA-associated silencing, and DNA methylation.
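To make the "switch" idea concrete, here is a minimal, purely illustrative Python sketch. It is not real bioinformatics software: the gene name, the CpG positions, the three-site threshold and the silencing rule are all invented for clarity. The only point it tries to capture is the one above: the DNA sequence itself never changes, while methyl marks and histone acetylation decide whether the gene is expressed.

```python
# Toy model of epigenetic gene silencing: same DNA, different "marks".
# All names, positions and thresholds are invented for illustration only.

from dataclasses import dataclass, field

@dataclass
class Gene:
    name: str
    sequence: str                                         # the DNA itself never changes
    methylated_sites: set = field(default_factory=set)    # positions carrying methyl groups
    histone_acetylated: bool = True                        # acetylated histones keep chromatin "open"

    def is_expressed(self) -> bool:
        # Silenced if the promoter is heavily methylated or the chromatin is closed.
        heavily_methylated = len(self.methylated_sites) >= 3
        return self.histone_acetylated and not heavily_methylated

# Two copies of the identical gene, carrying different epigenetic marks.
gene_active = Gene("GeneX", "ATGCGC...")
gene_silenced = Gene("GeneX", "ATGCGC...",
                     methylated_sites={12, 40, 77},        # methyl groups attached
                     histone_acetylated=False)             # acetyl groups removed

print(gene_active.is_expressed())    # True  -- gene "switched on"
print(gene_silenced.is_expressed())  # False -- same sequence, gene "switched off"
```

The sequence string is identical in both objects; only the layer of marks on top of it differs, which is the sense in which epigenetics adds a control system "beyond the DNA".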


5-methylcytosine is a methylated form of the DNA base cytosine that is involved in the regulation of gene transcription. Older perspectives on its role (which did not consider biofeedback or psychosomatic causes of the chemicals in question), while still true, have been shown not to be holistic enough; it is now known to be a main epigenetic mechanism, with many 5-mC patterns being inherited epigenetically.

Epigenetic changes are an ideal target for disease control, because they are naturally reversible, whereas sequence mutations in DNA are not. This might quiet many opponents of genetic modification, though it would more likely just shift their scrutiny onto epigenetics, unless they come to understand its reversible ramifications, which have previously been a matter of contention.


 DNA Is Not Destiny (source)

"The new science of epigenetics rewrites the rules of disease, heredity, and identity. [.....]

"It was a little eerie and a little scary to see how something as subtle as a nutritional change in the pregnant mother rat could have such a dramatic impact on the gene expression of the baby," Jirtle says. "The results showed how important epigenetic changes could be."

Jirtle continues: "The tip of the iceberg is genomics... The bottom of the iceberg is epigenetics."

Dr. Moshe Szyf of McGill University in Montreal has studied the relationships between rat mothers and their offspring. Some of the mother rats groomed and nurtured their young, and some hardly did at all. Rats that had been groomed as infants showed marked behavioural changes as adults: they were "less fearful and better adjusted than the offspring of the neglectful mothers" (Epigenetics, The Economist). They then acted in similarly nurturing ways towards their own offspring, producing the same epigenetic and behavioural results in the next generation. This shows that epigenetic changes, once started in one generation, can be passed on to the following generations without changes in the genes themselves.

Numerous chemical changes were detected in the rats' brains, and major differences had developed between the nurtured group and the neglected group, especially in the area of the hippocampus involved in stress. A gene that dampens the response to stress showed a greater degree of expression in the well-nurtured rats. The brains of the nurtured rats also showed higher levels of a chemical (acetyl groups) that facilitates gene expression by binding to the protein sheath around the gene, making it easier for the gene to express, as well as higher levels of an enzyme that adds acetyl groups to that protein sheath.

In the non-nurtured rats the changes were quite different: the animals were anxious and fearful. The same gene-repressing substance from Randy Jirtle's work cited above, methyl groups, was much more prevalent in their hippocampi; it bonded to the DNA and inhibited the expression of the gene involved in dampening stress. To test the hypothesis that these two substances were causing epigenetic changes in the rats, the researchers injected the fearful rats with a substance that raised the number of acetyl groups in the hippocampus. Sure enough, the behaviour of the rats changed and they became less fearful and better adjusted.

Determining Nature vs. Nurture



Molecular evidence is finally emerging to inform the long-standing debate

Psychologists, psychiatrists and neuroscientists have jousted for years over how much of our behavior is driven by our genes versus the environments in which we grow up and live. Arguments have persisted because there has been little hard evidence to answer basic questions: How exactly do genes and environment interact to determine whether someone will become depressed, say, or schizophrenic? And can environmental interventions such as drugs or psychotherapy really alleviate disorders that are largely determined by genes?

The article goes on to note that depressed and anti-social behaviour in mice is accompanied by methyl groups sticking to genes, and it extends this research to humans: the brains of schizophrenics also show changes in the methylation of genes, or in the acetylation of their protein sheaths.

Experiments have shown a striking link between our state of mind, such as childhood stress, and later disease. The ACE (Adverse Childhood Experiences) study conducted detailed social, psychological and medical examinations of some 17,000 people over a five-year period. It showed a strong inverse link between emotional wellbeing, health and longevity on the one hand, and early-life stress on the other. People from dysfunctional families were five times more likely to be depressed, three times more likely to smoke, and thirty times more likely to commit suicide, and ailments were much more common among them, with increased rates of obesity, heart disease, lung disease, diabetes, bone fractures, hypertension, and hepatitis. The genetic link between nurturing and gene expression in children is also now being traced: "One recent study suggests that children with a certain version of a gene that produces an enzyme known as MAO-A (which metabolizes neurotransmitters such as serotonin and dopamine) are significantly more likely to become violent—but only if they were mistreated as children. In this way, an aspect of human behavior might be a bit like the body of the Bicyclus butterfly, driven to one form or another by genes that switch in response to environmental cues, one genotype yielding two different phenotypes for two different environments." (Why We Have Misunderstood the Nature-Nurture Debate, Professor Gary Marcus)

And there are many more examples of our beliefs causing epigenetic changes; the most researched is how our perceptions affect disease progression. Gail Ironson, MD, has shown that two factors are striking predictors of how fast HIV progressed in her research subjects. The first was the subject's view of the nature of God: some believed in a punishing God, while others believed in a benevolent God. She observes that "people who view God as judgmental have a CD4 (T-helper) cell decline more than twice the rate of those who don't see God as judgmental, and their viral load increases more than three times faster. For example, a precise statement affirmed by these patients is 'God will judge me harshly one day.' This one item is related to an increased likelihood that the patient will develop an opportunistic infection or die. These beliefs predict disease progression even more strongly than depression." (From: View of God is associated with disease progression in HIV, paper presented at the annual meeting of the Society of Behavioral Medicine, March 22–25, 2006, San Francisco, California; abstract published in Annals of Behavioral Medicine, 2006.)

Spirituality may be viewed as another type of coping. Men and women with HIV studied during the HAART era who endorsed more spirituality after their HIV diagnosis had a slower decline in CD4+ cell counts and better control of VL over 4 years (18). Fitzpatrick et al. (19) followed 901 HIV infected persons for 1 year and found that participation in spiritual activities (e.g., prayer, meditation, affirmations, visualizations) predicted reduced risk of dying, but only in those not on HAART. Another HAART era study found significantly lower mortality over 3 to 5 years for those with HIV who had a spiritual transformation (20). Furthermore, the spiritual belief that "God is merciful" was protective of health over time, whereas the belief that "God is judgmental and punishing and is going to judge me harshly some day" was associated with a faster deterioration of CD4+ cells and poorer control of the HIV virus (21). Thus, view of God may be either helpful or harmful, depending on the nature of that belief.

Rethinking 'the selfish gene' as more a cultural icon

"genes have been the dominant metaphor underlying all manner of human behaviour, from the most basic to animalistic, like sex, up to and including such esoterica as the practise of religion, the enjoyment of music, and the codification of laws and moral strictures... The media are besotted with genes... genes have for over half a century easily eclipsed the outside natural world as the primary driving force of evolution in the minds of evolutionary biologists." (Dawson Church, the genie in your genes)

To fully accept the arguments of Richard Dawkins (author of The Selfish Gene) and his acolytes, one would be forced to conclude that "we do it" solely because our genes are telling us to reproduce more genes; but genes don't drive evolution, argues Eldredge (curator, American Museum of Natural History), especially in social creatures such as humans. In this popular science work, he discusses a "human triangle" of sexual, reproductive, and economic behavior that has increasingly been guided by culture over the past two-and-a-half million years. Furthermore, Eldredge says, Dawkins' gene-centric view "has profoundly bad implications for social theory and its political implementation."

Unexamined beliefs in this or that gene causing this or that condition are part of the foundation of many scientific disciplines. Such assumptions can be found in various publications, like this report aired on NPR: "Scientists today announced that they have found a gene for dyslexia. It's a gene on chromosome six called DCDC2." The New York Times picked this up and ran a story entitled "Findings Support That Dyslexia Disorder Is Genetic". Other media picked up the story, and the legend of the primacy of DNA was reinforced.

The main issue with Dawkins' selfish-gene material and related theories is that it locates the ultimate power over our health in the untouchable realm of molecular structure, rather than in our own conscious actions and decisions.

Dorothy Nelkin, in her much-cited book The DNA Mystique: The Gene as a Cultural Icon, sums up the point: "In a diverse array of popular sources, the gene has become a supergene, an almost supernatural entity that has the power to define identity, determine human affairs, dictate human relationships, and explain social problems. In this construct, human beings in all their complexity are seen as products of a molecular text... the secular equivalent of a soul—the immortal site of the true self and determiner of fate."


Swaying the skeptics

Many people have come up with ways to test epigenetic ideas, and the general trend is supportive. Some of the strongest evidence concerns PTSD in people exposed to the 9/11 attacks and how this has made similar PTSD symptoms more likely in their offspring (Transgenerational effects of posttraumatic stress disorder in babies of mothers exposed to the World Trade Center attacks during pregnancy; ref 2; ref 3). Studies of hunger during historical wartime food shortages suggest the effects can still be evident in descendants a hundred years later; the first such evidence emerged from parish records in northern Sweden.

The remote, snow-swept expanses of northern Sweden are an unlikely place to begin a story about cutting-edge genetic science. The kingdom's northernmost county, Norrbotten, is nearly free of human life; an average of just six people live in each square mile. And yet this tiny population can reveal a lot about how genes work in our everyday lives. In the 1980s, Dr. Lars Olov Bygren, a preventive-health specialist who is now at the prestigious Karolinska Institute in Stockholm, began to wonder what long-term effects the feast and famine years might have had on children growing up in Norrbotten in the 19th century — and not just on them but on their kids and grandkids as well. So he drew a random sample of 99 individuals born in the Overkalix parish of Norrbotten in 1905 and used historical records to trace their parents and grandparents back to birth. By analyzing meticulous agricultural records, Bygren and two colleagues determined how much food had been available to the parents and grandparents when they were young.

Around the time he started collecting the data, Bygren had become fascinated with research showing that conditions in the womb could affect your health not only when you were a fetus but well into adulthood. In 1986, for example, the Lancet published the first of two groundbreaking papers showing that if a pregnant woman ate poorly, her child would be at significantly higher than average risk for cardiovascular disease as an adult. Bygren wondered whether that effect could start even before pregnancy: Could parents' experiences early in their lives somehow change the traits they passed to their offspring?


It was a heretical idea. After all, we have had a long-standing deal with biology: whatever choices we make during our lives might ruin our short-term memory or make us fat or hasten death, but they won't change our genes — our actual DNA. Which meant that when we had kids of our own, the genetic slate would be wiped clean.

What's more, any such effects of nurture (environment) on a species' nature (genes) were not supposed to happen so quickly. Charles Darwin, whose On the Origin of Species celebrated its 150th anniversary in November, taught us that evolutionary changes take place over many generations and through millions of years of natural selection. But Bygren and other scientists have now amassed historical evidence suggesting that powerful environmental conditions (near death from starvation, for instance) can somehow leave an imprint on the genetic material in eggs and sperm. These genetic imprints can short-circuit evolution and pass along new traits in a single generation.


For instance, Bygren's research showed that in Overkalix, boys who enjoyed those rare overabundant winters — kids who went from normal eating to gluttony in a single season — produced sons and grandsons who lived shorter lives. Far shorter: in the first paper Bygren wrote about Norrbotten, which was published in 2001 in the Dutch journal Acta Biotheoretica, he showed that the grandsons of Overkalix boys who had overeaten died an average of six years earlier than the grandsons of those who had endured a poor harvest. Once Bygren and his team controlled for certain socioeconomic variations, the difference in longevity jumped to an astonishing 32 years. Later papers using different Norrbotten cohorts also found significant drops in life span and discovered that they applied along the female line as well, meaning that the daughters and granddaughters of girls who had gone from normal to gluttonous diets also lived shorter lives. To put it simply, the data suggested that a single winter of overeating as a youngster could initiate a biological chain of events that would lead one's grandchildren to die decades earlier than their peers did. How could this be possible? What does it all mean practically?

Meet the Epigenome

The answer lies beyond both nature and nurture. Bygren's data — along with those of many other scientists working separately over the past 20 years — have given birth to the new science of epigenetics. At its most basic, epigenetics is the study of changes in gene activity that do not involve alterations to the genetic code but still get passed down to at least one successive generation. These patterns of gene expression are governed by the cellular material — the epigenome — that sits on top of the genome, just outside it. It is these epigenetic "marks" that tell your genes to switch on or off, to speak loudly or whisper. It is through epigenetic marks that environmental factors like diet, stress and prenatal nutrition can make an imprint on genes that is passed from one generation to the next.

Epigenetics brings both good news and bad. Bad news first: there's evidence that lifestyle choices like smoking and eating too much can change the epigenetic marks atop your DNA in ways that cause the genes for obesity to express themselves too strongly and the genes for longevity to express themselves too weakly. We all know that you can truncate your own life if you smoke or overeat, but it's becoming clear that those same bad behaviors can also predispose your kids — before they are even conceived — to disease and early death.

The good news: scientists are learning to manipulate epigenetic marks in the lab, which means they are developing drugs that treat illness simply by silencing bad genes and jump-starting good ones. In 2004 the Food and Drug Administration (FDA) approved an epigenetic drug for the first time. Azacitidine is used to treat patients with myelodysplastic syndromes (usually abbreviated, a bit oddly, to MDS), a group of rare and deadly blood malignancies. The drug uses epigenetic marks to dial down genes in blood precursor cells that have become overexpressed. According to Celgene Corp. — the Summit, N.J., company that makes azacitidine — people given a diagnosis of serious MDS live a median of two years on azacitidine; those taking conventional blood medications live just 15 months.

Since 2004, the FDA has approved three other epigenetic drugs that are thought to work at least in part by stimulating tumor-suppressor genes that disease has silenced. The great hope for ongoing epigenetic research is that with the flick of a biochemical switch, we could tell genes that play a role in many diseases — including cancer, schizophrenia, autism, Alzheimer's, diabetes and many others — to lie dormant. We could, at long last, have a trump card to play against Darwin.

The funny thing is, scientists have known about epigenetic marks since at least the 1970s. But until the late '90s, epigenetic phenomena were regarded as a sideshow to the main event, DNA. To be sure, epigenetic marks were always understood to be important: after all, a cell in your brain and a cell in your kidney contain the exact same DNA, and scientists have long known that nascent cells can differentiate only when crucial epigenetic processes turn on or turn off the right genes in utero.
More recently, however, researchers have begun to realize that epigenetics could also help explain certain scientific mysteries that traditional genetics never could: for instance, why one member of a pair of identical twins can develop bipolar disorder or asthma even though the other is fine. Or why autism strikes boys four times as often as girls. Or why extreme changes in diet over a short period in Norrbotten could lead to extreme changes in longevity. In these cases, the genes may be the same, but their patterns of expression have clearly been tweaked.

Biologists offer this analogy as an explanation: if the genome is the hardware, then the epigenome is the software. "I can load Windows, if I want, on my Mac," says Joseph Ecker, a Salk Institute biologist and leading epigenetic scientist. "You're going to have the same chip in there, the same genome, but different software. And the outcome is a different cell type." As Terence McKenna said, "culture is your chosen operating system, not a reality".
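In the spirit of Ecker's hardware/software analogy, here is a small, hypothetical Python sketch: one "genome" (the hardware) shared by every cell, two different sets of epigenetic switches (the software), two different cell types as the output. The gene names and the decision rule are made up purely for illustration and are not a biological model.

```python
# Hardware/software analogy in code (illustrative only):
# the GENOME dict is identical in both cells; only which genes the
# epigenome leaves switched on differs, and that decides the cell type.

GENOME = {"neuron_kit": "...", "kidney_kit": "...", "housekeeping": "..."}  # same in every cell

def cell_type(genome: dict, switched_on: set) -> str:
    """Return a (made-up) cell identity based on which genes are left active."""
    active = {gene for gene in genome if gene in switched_on}
    if "neuron_kit" in active:
        return "neuron"
    if "kidney_kit" in active:
        return "kidney cell"
    return "unspecified cell"

print(cell_type(GENOME, switched_on={"housekeeping", "neuron_kit"}))   # neuron
print(cell_type(GENOME, switched_on={"housekeeping", "kidney_kit"}))   # kidney cell
```

Same "chip", different "software", different outcome, which is all the analogy is meant to convey.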


Other recent studies have also shown the power of environment over gene expression. For instance, fruit flies exposed to a drug called geldanamycin show unusual outgrowths on their eyes that can last through at least 13 generations of offspring even though no change in DNA has occurred (and generations 2 through 13 were not directly exposed to the drug). Similarly, according to a paper published last year in the Quarterly Review of Biology by Eva Jablonka (an epigenetic pioneer) and Gal Raz of Tel Aviv University, roundworms fed with a kind of bacteria can feature a small, dumpy appearance and a switched-off green fluorescent protein; the changes last at least 40 generations. (Jablonka and Raz's paper catalogs some 100 forms of epigenetic inheritance)

Can epigenetic changes be permanent? Possibly, but it's important to remember that epigenetics isn't evolution. It doesn't change DNA. Epigenetic changes represent a biological response to an environmental stressor. That response can be inherited through many generations via epigenetic marks, but if you remove the environmental pressure, the epigenetic marks will eventually fade, and the DNA code will — over time — begin to revert to its original programming. That's the current thinking, anyway: that only natural selection causes permanent genetic change.


And yet even if epigenetic inheritance doesn't last forever, it can be hugely powerful. In February 2009, the Journal of Neuroscience published a paper showing that even memory — a wildly complex biological and psychological process — can be improved from one generation to the next via epigenetics. The paper described an experiment with mice led by Larry Feig, a Tufts University biochemist. Feig's team exposed mice with genetic memory problems to an environment rich with toys, exercise and extra attention. These mice showed significant improvement in long-term potentiation (LTP), a form of neural transmission that is key to memory formation. Surprisingly, their offspring also showed LTP improvement, even when the offspring got no extra attention.
All this explains why the scientific community is so nervously excited about epigenetics. In his forthcoming book The Genius in All of Us: Why Everything You've Been Told About Genetics, Talent and IQ Is Wrong, science writer David Shenk says epigenetics is helping usher in a "new paradigm" that "reveals how bankrupt the phrase 'nature versus nurture' really is." He calls epigenetics "perhaps the most important discovery in the science of heredity since the gene."


Geneticists are quietly acknowledging that we may have too easily dismissed an early naturalist who anticipated modern epigenetics — and whom Darwinists have long disparaged. Jean-Baptiste Lamarck (1744-1829) argued that evolution could occur within a generation or two. He posited that animals acquired certain traits during their lifetimes because of their environment and choices. The most famous Lamarckian example: giraffes acquired their long necks because their recent ancestors had stretched to reach high, nutrient-rich leaves.

In contrast, Darwin argued that evolution works not through the fire of effort but through cold, impartial selection. By Darwinist thinking, giraffes got their long necks over millennia because genes for long necks had, very slowly, gained advantage. Darwin, who was 65 years younger than Lamarck, was the better scientist, and he won the day. Lamarckian evolution came to be seen as a scientific blunder. Yet epigenetics is now forcing scientists to re-evaluate Lamarck's ideas.

Exploring Epigenetic Potential

How can we harness the power of epigenetics for good? In 2008 the National Institutes of Health (NIH) announced it would pour $190 million into a multilab, nationwide initiative to understand "how and when epigenetic processes control genes." Dr. Elias Zerhouni, who directed the NIH when it awarded the grant, said at the time — in a phrase slightly too dry for its import — that epigenetics had become "a central issue in biology."

This past October, the NIH grant started to pay off. Scientists working jointly at a fledgling, largely Internet-based effort called the San Diego Epigenome Center announced with colleagues from the Salk Institute — the massive La Jolla, Calif., think tank founded by the man who discovered the polio vaccine — that they had produced "the first detailed map of the human epigenome."

The claim was a bit grandiose. In fact, the scientists had mapped only a certain portion of the epigenomes of two cell types (an embryonic stem cell and another basic cell called a fibroblast). There are at least 210 cell types in the human body — and possibly far more, according to Ecker, the Salk biologist, who worked on the epigenome maps. Each of the 210 cell types is likely to have a different epigenome. That's why Ecker calls the $190 million grant from NIH "peanuts" compared with the probable end cost of figuring out what all the epigenetic marks are and how they work in concert.

Remember the Human Genome Project? Completed in March 2000, the project found that the human genome contains something like 25,000 genes; it took $3 billion to map them all. For some this amounted to a landmark in biology; for shareholders, however, it marked the end of the market boom and of further investment. The human epigenome contains an as yet unknowable number of patterns of epigenetic marks, a number so big that Ecker won't even speculate on it. The number is certainly in the millions. A full epigenome map will require major advances in computing power. When completed, the Human Epigenome Project (already under way in Europe) will make the Human Genome Project look like homework that 15th century kids did with an abacus.

But the potential is staggering. For decades, we have stumbled around massive Darwinian roadblocks. DNA, we thought, was an ironclad code that we and our children and their children had to live by. Now we can imagine a world in which we can tinker with DNA, bend it to our will. It will take geneticists and ethicists many years to work out all the implications, but be assured: the age of epigenetics has arrived.

Tuesday 16 April 2013

Terence McKenna's "Stoned Ape" Theory of Human Evolution

 


Terence McKenna was the first vocal proponent of this theory, which holds that as the North African jungles receded toward the end of the most recent ice age, giving way to grasslands, a branch of our tree-dwelling primate ancestors left the branches and took up life out in the open, following herds of ungulates and nibbling what they could along the way.

Among the new items in their diet were psilocybin-containing mushrooms growing in the dung of these ungulate herds. The changes caused by the introduction of this drug to the primate diet were many. McKenna theorizes, for instance, that synesthesia (the blurring of boundaries between the senses) caused by psilocybin led to the development of spoken language: the ability to form pictures in another person's mind through the use of vocal sounds. About 12,000 years ago, further climate changes removed the mushroom from the human diet, resulting in a new set of profound changes in our species as we reverted to the frankly brutal pre-mushroom primate social structures that psilocybin consumption had modified and/or repressed.


McKenna's theory is necessarily based on a great deal of supposition interpolating between the few fragmentary facts we know about hominid and early human history. In addition, because McKenna (who describes himself as "an explorer, not a scientist") is also a proponent of much wilder suppositions, his more reasonable theories are usually disregarded by the very scientists whose informed criticism is crucial for their development. In a review of his book Food of the Gods, the Village Voice stated, "if only a fraction of McKenna's thoughts are true, he will someday be regarded as the Copernicus of consciousness".


This page links to resources that should help fill in some of the gaps in the theory with data from various sciences, and it will try to address other cultural myths about the unprecedentedly rapid evolution of brain and mind from ape to human.


"The 20th century mind is nostalgic for the paradise that once existed on the mushroom-dotted plains of Africa, where the plant-human symbiosis occurred that pulled us out of the animal body and into the tool-using, culture-making, imagination-exploring creature that we are." - Mckenna


The precedent 

The main difficulty for evolutionary theory as it applies to human origins is the human neocortex. Carl Sofus Lumholtz described the evolution of the human neocortex as "the most dramatic transformation of a major organ of a higher animal in the entire fossil record". A 2009 review of the evolution of the human neocortex by Pasko Rakic stated, "It is, therefore, surprising how little modern research has been done to elucidate how this human difference emerged. It appears that we are sometimes so seduced by similarities between species that we neglect the differences."[35]

Evolutionary theory therefore needs to account for the dramatic emergence of the human neocortex in a very narrow window of time: in about two million years, our ancestors went from being higher primates (hominids) to being true humans.

Human Plant Symbiosis

Mutualistic symbiosis in biology means two distinct life forms deriving mutual benefit from each other's company. An example would be a sucker fish that lives off the plankton on a whale's back: through this relationship both partners gain an evolutionary, survival and biological advantage; the whale gets cleaned (and seems to enjoy the sensory contact), and the fish gets food and the protection of the huge whale. Likewise, as plants have come to depend on humans for the dispersal of seeds and other benefits, McKenna posited that it was very likely we in turn benefited from forest vegetation, including the psychedelic mushrooms.

It was later determined that some mushrooms do rely symbiotically on mammals to aid in spore dissemination: the spores of some fungi can survive passage through our digestive system and germinate after being excreted. These are sometimes referred to as coprophilic mushrooms.


McKenna's view was that humans have been involved in a mutualistic symbiosis with psychedelic mushrooms and their chemical cousins for tens of thousands of years, and that these have been used to catalyze the human imagination, spawning religions, mystical states, artwork, linguistic thinking, spirituality, introspection about the nature of consciousness not possible without such agents, the development of cultures, and many more of the aspects that distinguish us from other primates.[10]

Apes' use of plants as medicine


Gombe Stream National Park was one of the first places where researchers noticed that apes would eat foods they did not appear to like the taste of, or were not able to digest very well. Despite not enjoying these foods, the apes would still selectively go looking for them [1]. Eventually a reddish oil called thiarubrine-A was identified. Neil Towers of the University of British Columbia found that this oil kills bacteria by the dozen, at concentrations just below the level that would make it clinically dangerous [2].

Thus it seemed that even if a food they learned to eat was unpleasant, if it had a positive effect on their wellbeing, health or mind in some way, they would tend to continue eating it, in effect self-medicating through their food choices from the surrounding natural pharmacy [3][4][5].


Since other animals enjoy psychoactive drugs (cats love catnip, and monkeys enjoy alcohol they scrounge from humans), it is only natural to expect chimps to do so as well, and numerous studies have found that if they enjoy the medicinal effects they continue to ingest a substance despite its taste [5][6]. This is sometimes referred to as zoopharmacognosy [7][8][9]. The basic premise of zoopharmacognosy is that animals utilize plant secondary compounds or other non-nutritional substances to medicate themselves. Among primatologists, a major focus of concern about plant secondary compounds in the diet has been how and why primates can cope with their presence.

You are what you think, as well as what you eat



Rather than theorizing that our sudden evolution was merely due to an expanded diet as our ancestors moved around, McKenna argues there is a primary factor that is often overlooked: a select few psychedelic foods whose ingestion, and centuries of experimentation with them, set us on the road to evolving into the humans we are today. Back then, every encounter with a new food would have been treated the same way; whether it was a fruit, a drug or an insect, a great deal of care would at first have been taken.

As our diets expanded, so did our perception of new foods and tastes. Gastronomy was born shortly after our taste for novel pharmacology, which must have preceded it, since the maintenance of health and thought through the regulation of diet is seen in most animals.[10]


 
McKenna explains how the mental changes elicited by psychedelics may have played an even bigger role than nutrition in how we evolved socially and culturally:

"The primate tendency to form dominance heirarchies was temporarily interrupted for about 100,000 years by the psilocybin in the paleolithic diet. This behavioral style of male dominance was chemically interrupted by psilocybin in the diet, so it allowed the style of socialorganization called partnership to emerge, and that that occurred during the period when language, altruism, planning, moral values, esthetics, music and so forth -- everything associated with humanness -- emerged during that period. About 12,000 years ago, the mushrooms left the human diet because they were no longer available, due to climatological change and the previous tendency to form dominance hierarchies re-emerged. So, this is what the historic dilemma is: we have all these qualities that were evolved during the suppression of male dominance that are now somewhat at loggerheads with the tendency of society in a situation of re-established male dominance.

The paleolithic situation was orgiastic and this made it impossible for men to trace lines of male paternity, consequently there was no concept of 'my children' for men. It was 'our children' meaning 'we, the group.' This orgiastic style worked into the effects of higher doses of psilocybin to create a situation of frequent boundary dissolution. That's what sexuality is, on one level, about and it's what psychedelics, on another level, are about. With the termination of this orgiastic, mushroom using style of existence, a very neurotic and repressive social style emerged which is now worldwide and typical of western civilization."
(Terence McKenna: Mushrooms Sex and Society Interview by Philip H. Farber)

 

The evolutionary benefits of novel psychedelics

 

McKenna comments that although his theory focuses mainly on mushrooms, there is far broader scope involving a vast array of other psychoactives.[10]

The mutation-inducing influence of diet on early humans and the effect of exotic metabolites on the evolution of their neurochemistry and culture is still unstudied territory. The early hominids' adoption of an omnivorous diet and their discovery of the power of certain plants were decisive factors in moving early humans out of the stream of animal evolution and into the fast-rising tide of language and culture. Our remote ancestors discovered that certain plants, when self-administered, suppress appetite, diminish pain, supply bursts of sudden energy, confer immunity against pathogens, and synergize cognitive activities. These discoveries set us on the long journey to self-reflection. Once we became tool-using omnivores, evolution itself changed from a process of slow modification of our physical form to a rapid definition of cultural forms by the elaboration of rituals, languages, writing, mnemonic skills, and technology.

 

Potential role in the Evolution of language

 

The ability to generate language is mainly about making new connections between sound, image and symbol, which can be linked to synesthesia. Synesthesia is the transference of one sensory mode to another, where, for example, you see colours or feel emotions for certain numbers. Most people cannot do this without psychedelics, but some people are genetically synesthetic, and in them the synesthesia is usually related to language, which is exactly what psychedelics induce. The process of understanding language can thus be viewed as a process of synesthesia that we are no longer even aware of: we live in a world where abstractions and symbols are as real as anything outside us, a world in which symbols have significance, and that is the basis of language, our ability to perceive meaning through a sort of unconscious synesthesia. Mushrooms were able to trigger these synesthetic experiences in people and essentially became training tools for learning and cognition, teaching us how to associate otherwise meaningless sounds and meaningless visual cues, so that previously meaningless utterances came to carry significance.[38]

 

Catalyzing lower consciousness to higher consciousness


McKenna's contention was that it was not variety in physical food alone that aided the expansion and sudden power of the evolving human mind; various plant alkaloids would have to be involved, among them DMT, psilocybin and harmaline.

In research done back in the 1960s, Roland Fischer experimented by giving students small doses of psilocybin and then testing their visual acuity by moving lines around on a piece of paper. He found that their visual accuracy and awareness of surrounding visual stimuli were greatly improved [11]. Unfortunately, due to its legal status, only limited further tests have been done, but many subjective reports describe the same effect at threshold dosages. If this is the case, then for a species of tree-dwelling primates and hunter-gatherers it would have provided a tremendous advantage in hunting for food and climbing trees. And they would have had to come down out of the trees, out of their comfort zone, to get it, as the only place this miracle hunting food grew was on the forest floor, thus starting the human evolutionary process. The relevance of Fischer's studies has been questioned by skeptics, citing small sample sizes and inconclusive results. The fact that many psychotropic plants in the environment could have conferred an evolutionary advantage on those members of the population that sought them out is not in dispute, however (see zoopharmacognosy above).
 

 

The next major steps for the full evolution of humankind


The three main advantages McKenna identified as being of critical importance to the survival of apes are these: in higher doses, McKenna claims, the mushroom acts as a sexual stimulant, which would make it even more beneficial evolutionarily (it would result in more offspring); at even higher doses the mushroom would have given humans the capacity for self-reflection, which McKenna believed was unique to humans, and the first truly mystical experiences (which, as he believed, were the basis for the foundation of all subsequent religions). Another factor McKenna talked about was the mushroom's potency in promoting linguistic thinking. This would have promoted vocalization, which in turn would have acted to cleanse the brain (based on a theory that vibrations from speaking cause the precipitation of impurities from the brain into the cerebrospinal fluid), which would further mutate our brains.

All of these, according to McKenna, were the most important factors promoting our evolution into Homo sapiens. After this transformation took place, our species would later have begun moving out of Africa to populate the rest of the planet.[10]


McKenna points to many consciousness-catalyzing effects on human development that came with the realization that there were opiate plants that dulled pain, stimulants that gave us boundless energy, psychoactives that enabled deep states of introspection and changes to sensory acuity, and tranquilizing agents to aid sleep and rest. The question becomes not whether ancient humans used such agents (that would have been unavoidable) but how much various cultures did.[10]
 

 

   Sensory

 

Noticeable changes to the auditory, visual, and tactile senses may become apparent around an hour after ingestion. Visually, these shifts in perception include enhancement and contrasting of colors, strange light phenomena (such as auras or "halos" around light sources), increased visual acuity, surfaces that seem to ripple, shimmer, or breathe, complex open- and closed-eye visuals of form constants or images, objects that warp, morph, or change solid colors, a sense of melting into the environment, and trails behind moving objects. Sounds seem to be heard with increased clarity; music, for example, can often take on a profound sense of cadence and depth. Some users experience synesthesia, wherein they perceive, for example, a visualization of color upon hearing a particular sound.[13] Related psychoactives such as cannabis are used for conditions like glaucoma, as well as therapeutically in numerous conditions, including pain, stroke, cancer, obesity, osteoporosis, fertility, neurodegenerative diseases, multiple sclerosis, and inflammatory diseases, among others[26], and further studies have examined its enhancement of visual accuracy and general benefits to the retina at night as well as during the day.[27][28] These effects seem especially pronounced when the subject is moving rather than stationary.[29]


Increased spirituality

In 2006, the United States government funded a randomized, double-blind study at Johns Hopkins University that looked specifically at the spiritual effects of psilocybin. The researchers used pure psilocybin rather than mushrooms (in fact, individual mushroom pieces can vary wildly in psilocybin and psilocin content[14]). The study involved 36 college-educated adults (average age 46) who had never tried psilocybin, had no history of drug use, and had religious or spiritual interests. The participants were closely observed for eight-hour intervals in a laboratory while under the influence of psilocybin[15].

One-third of the participants reported that the experience was the single most spiritually significant moment of their lives and more than two-thirds reported it was among the top five most spiritually significant experiences. Two months after the study, 79% of the participants reported increased well-being or satisfaction; friends, relatives, and associates confirmed this. They also reported anxiety and depression symptoms to be decreased or completely gone. Despite highly controlled conditions to minimize adverse effects, 22% of subjects (8 of 36) had notable experiences of fear, some with paranoia. The authors, however, reported that all these instances were "readily managed with reassurance."[15] 


Roland Griffiths has conducted pioneering research at Johns Hopkins University showing that the correct dose of psilocybin can occasion mystical-type experiences that have substantial and sustained personal meaning and spiritual significance [31]. At 2 months, the volunteers rated the psilocybin experience as having substantial personal meaning and spiritual significance, and attributed to it sustained positive changes in attitudes and behavior consistent with changes rated by community observers. These effects were still apparent even 14 months after ingesting the psilocybin [32][33]. For evolving apes, a plant or fungus producing such a drastic change, with effects still felt 14 months after ingestion, would obviously have generated huge interest and affected their long-term physiology. Other studies of his have shown that the mystical experiences occasioned by psilocybin lead to increases in the personality domain of openness [34], which would greatly affect the perspective of habit-forming apes in prehistory.
 
As Medicine

There have been calls for medical investigation of the use of synthetic and mushroom-derived psilocybin to develop improved treatments for various mental conditions, including chronic cluster headaches,[16] following numerous anecdotal reports of benefits. There are also several accounts of psilocybin mushrooms sending both obsessive-compulsive disorder (OCD) and OCD-related clinical depression (both widespread and debilitating mental health conditions) into complete remission immediately and for up to months at a time, compared to current medications, which often have both limited efficacy[17] and frequent undesirable side-effects.[18] The effect of mushrooms in breaking OCD-like habits would be even more apparent in primates, as animals operate on habits and instincts with less conscious introspection than humans do.
"Developing drugs that are more effective and faster acting for the treatment of OCD is of utmost importance and until recently, little hope was in hand. A new potential avenue of treatment may exist. There are several reported cases concerning the beneficial effects of hallucinogenic drugs (MDMA, psilocybin and LSD), potent stimulators of 5-HT2A and 5-HT2C receptors, in patients with OCD (Brandrup and Vanggaard, 1977, Rapoport, 1987, Moreno and Delgado, 1997) and related disorders such as body dysmorphic disorder (Hanes, 1996)."[19]

Emotional evolution


As with other psychedelics such as LSD, the experience, or "trip", is strongly dependent upon set and setting. A negative environment could easily induce a bad trip, whereas a comfortable and familiar environment allows for a pleasant experience. Many users find it preferable to ingest the mushrooms with friends, people they are familiar with, or people who are also 'tripping', although neither side of this binary is without exception.[18][19] This would make users more socially aware of whom they are emotionally close to, and give them a degree of insight into their emotions that they would not have without the psychedelics.


Archeological evidence

There is some archaeological evidence for their use in ancient times. Several mesolithic rock paintings from Tassili n'Ajjer (a prehistoric North African site identified with the Capsian culture) have been identified by author Giorgio Samorini as possibly depicting the shamanic use of mushrooms, possibly Psilocybe.[20] Hallucinogenic species of Psilocybe have a history of use among the native peoples of Mesoamerica for religious communion, divination, and healing, from pre-Columbian times up to the present day.

Mushroom-shaped statuettes found at archaeological sites seem to indicate that ritual use of hallucinogenic mushrooms is quite ancient.[21] Mushroom stones and motifs have been found in Mayan temple ruins in Guatemala,[22] though there is considerable controversy as to whether these objects indicate the use of hallucinogenic mushrooms or whether they had some other significance with the mushroom shape being simply a coincidence. 
More concretely, a statuette dating from ca. 200 AD and depicting a mushroom strongly resembling Psilocybe mexicana was found in a west Mexican shaft and chamber tomb in the state of Colima. Hallucinogenic Psilocybe were known to the Aztecs as teonanácatl (literally "divine mushroom" - agglutinative form of teó (god, sacred) and nanácatl (mushroom) in Náhuatl) and were reportedly served at the coronation of the Aztec ruler Moctezuma II in 1502. Aztecs and Mazatecs referred to psilocybin mushrooms as genius mushrooms, divinatory mushrooms, and wondrous mushrooms, when translated into English.[22] Bernardino de Sahagún reported ritualistic use of teonanácatl by the Aztecs, when he traveled to Central America after the expedition of Hernán Cortés.


At present, hallucinogenic mushroom use has been reported among a number of groups spanning from central [23] Mexico to Oaxaca, including groups of Nahua, Mixtecs, Mixe, Mazatecs,[24] Zapotecs, and others. 


Current research and other resources

 

Although not often framed in the psychopharmacological context of psychedelic-consuming humans in prehistory, the ever-evolving field of epigenetic inheritance of behavioral traits seems to lend the stoned ape theory a plausibility that genetic-determinism-based ideologies previously did not allow. The extent to which behavioral traits are inherited through changes to gene expression arising from states of mind and perception is still a matter of scientific contention. The most concrete example of such an effect to date is the inheritance of PTSD-related changes found in the children of witnesses to the 9/11 World Trade Center attacks[37], although many others exist.

In his book "Animals and psychedelics: The natural world and the instinct to alter consciousness" [36] Giorgio Samorini "Offers a completely new understanding of the role psychedelics play in the development of consciousness in all species. [...] Rejecting the Western cultural assumption that using drugs is a negative action or the result of an illness, Samorini opens our eyes to the possibility that beings who consume psychedelics--whether humans or animals--contribute to the evolution of their species by creating entirely new patterns of behavior that eventually will be adopted by other members of that species."

Criticism


Critics have charged that the theory focuses only on psilocybin, when there are numerous other psychedelic candidates that could satisfy the same criteria.

Andy Letcher, author of Shroom: A Cultural History of the Magic Mushroom, comments on his blog:


There’s a danger here that if we don’t question ourselves we’ll end up ossifying into a kind of entheogism, replete with its own mythology, founding fathers, saints, orthodoxies and cherished truths. I’m with the brothers McKenna: it behoves us to question.

So, to restate my position: that these strange, daubed figures might indeed depict psilocybin mushrooms, used within a shamanistic context, remains a possibility but one that is far from proven and which rests on several unsupported assertions.[30]


References



[1] Huffman, Michael (2007) 'Current evidence for self-medication in primates: A multidisciplinary perspective', Yearbook of Physical Anthropology 40:171–200

[2] G. H. Neil Towers (1996) 'Leaf-swallowing by chimpanzees: A behavioral adaptation for the control of strongyle nematode infections' - International Journal of Primatology August 1996, Volume 17, Issue 4, pp 475-503

[3] Dale H. Clayton and Nathan D. Wolfe (1993) 'The adaptive significance of self-medication', Volume 8, Issue 2, February 1993, Pages 60–63

[4] Andrew Fowler, Yianna Koutsioni, Volker Sommer (2007) Leaf-swallowing in Nigerian chimpanzees: evidence for assumed self-medication January 2007, Volume 48, Issue 1, pp 73-76

[5] Harold Altshuler (1975) 'Intragastric self-administration of psychoactive drugs by the rhesus monkey', Life Sciences, Volume 17, Issue 6, 15 September, Pages 883–890

[6] Glander KE (1994) Nonhuman primate self-medication with wild plant foods - University of Arizona Press, pp. 239–256.

[7] Huffman, M.A. (2001) 'Self-Medicative Behavior in the African Great Apes: An Evolutionary Perspective into the Origins of Human Traditional Medicine', BioScience 51(8):651-661

[8] Huffman MA et al (1994) 'The diversity of medicinal plant use by chimpanzees in the wild.' Chimpanzee Cultures. Cambridge, MA: Harvard University Press, pp. 129–148.

[9] Rodriguez E et al (1993) 'Zoopharmacognosy: The use of medicinal plants by animals.' In KR Downum, JT Romeo, and H Stafford (eds), Recent Advances in Phytochemistry, vol. 27: Phytochemical Potential of Tropic Plants. New York: Plenum, pp. 89–105.

[10] Terence McKenna (1999) 'Food of the Gods: The Search for the Original Tree of Knowledge: A Radical History of Plants, Drugs, and Human Evolution' - Medical Book Publication

[11] Fischer, Roland; Hill, Richard (1970). "Psilocybin-Induced Contraction of Nearby Visual Space". Agents and Actions 1 (4): 190–197.

[13] Turner, D.M. (1994) 'Psilocybin Mushrooms: The Extraterrestrial Invasion of Earth?' In The Essential Psychedelic Guide, First Printing, September 1994. Panther Press. ISBN 0-9642636-1-0

[14] Stafford PJ. (1992). Psychedelics Encyclopedia. Berkeley, California: Ronin Publishing. ISBN 0-914171-51-8

[15] Griffiths et al 'Psilocybin can occasion mystical-type experiences having substantial and sustained personal meaning and spiritual significance', Psychopharmacology 187(3):268-83. August 2006.

[16] Arran Frood (2007) 'Cluster Busters', Nature Medicine, Volume 13, Number 1, January 2007. Paper endorsed and made public by MAPS.

[17] Christopher Wiegand, M.D. (2006) 'Safety, Tolerability, and Efficacy of Psilocybin in 9 Patients With Obsessive-Compulsive Disorder', J Clin Psychiatry. 2006 Nov;67(11):1735-40.

[18] Stamets, Paul (1996) Psilocybin Mushrooms of the World. Ten Speed Press. ISBN 0898158397.

[19] Simon G. Powell, The Psilocybin Solution: Prelude to a Paradigm Shift

[20] Giorgio Samorini (1992) 'The Oldest Representations of Hallucinogenic Mushrooms in the World', Integration, vol. 2/3, pp. 69-78

[21] John M. Allegro, The Sacred Mushroom and the Cross. Gnostic Media Research & Publishing; 40th Anniversary edition (12 Nov 2009)

[22] Stamets, Paul (1996). Psilocybin Mushrooms of the World. Ten Speed Press. p. 11. ISBN 0898158397.

[23] Stamets, Paul (1996). Psilocybin Mushrooms of the World. Ten Speed Press. p. 7. ISBN 0898158397

[24] Johnson, Jean Bassett (1939). "The Elements of Mazatec Witchcraft". Gothenburg, Sweden: Ethnological Studies, No. 9.

[26] Ben Amar M (2006) 'Cannabinoids in medicine: A review of their therapeutic potential', Journal of Ethnopharmacology 2006 Apr 21;105(1-2):1-25

[27] Stephen Yazulla (2008) 'Endocannabinoids in the retina: From marijuana to neuroprotection', Progress in Retinal and Eye Research 27 (2008) 501–526

[28] Stephen Yazulla (2001) 'Cannabis improves night vision: a case study of dark adaptometry and scotopic sensitivity in kif smokers of the Rif mountains of northern Morocco', Survey of Ophthalmology, Volume 46, Issue 1, July–August 2001, Pages 43–5

[29] Michael Sivak, 'Human Factors and Highway-Accident Causation: Some Theoretical Considerations', Accident Analysis & Prevention, Vol. 13, pp. 614

"Adams et al. [ 19751 have found that static visual acuity is unaffected by alcohol or marijuana intoxication. On the other hand, the results of Brown et al.[1975] indicate a significant effect of alcohol and marijuana on dynamic visual acuity. Thus, dynamic visual acuity has been shown to be more affected by frequently present transient human states (i.e. alcohol and marijuana intoxication) than static visual acuity. Therefore, according to the present rationale, dynamic visual acuity would be rated as more critical to safe driving than static visual acuity. (Obviously, before reaching any firm conclusions, effects of other transient states on both of the skills in question would have to be ascertained.)"

[30] Letcher, A. 'The Selva Pascuala mushroom mural. Or not.' Blog entry, 19 July 2011

[31] Griffiths, Roland R., et al. "Psilocybin can occasion mystical-type experiences having substantial and sustained personal meaning and spiritual significance" Psychopharmacology 187.3 (2006): 268-283.

[32] Griffiths, Roland R., et al. "Mystical-type experiences occasioned by psilocybin mediate the attribution of personal meaning and spiritual significance 14 months later" Journal of Psychopharmacology 22.6 (2008): 621-632.

[33] Griffiths, Roland R., et al. "Psilocybin occasioned mystical-type experiences: immediate and persisting dose-related effects" Psychopharmacology 218.4 (2011): 649-665.

[34] MacLean, Katherine A., Matthew W. Johnson, and Roland R. Griffiths. "Mystical experiences occasioned by the hallucinogen psilocybin lead to increases in the personality domain of openness" Journal of Psychopharmacology 25.11 (2011): 1453-1461.

[35] Rakic, P. 'Evolution of the neocortex: Perspective from developmental biology', Nature Reviews Neuroscience. 2009 October; 10(10): 724–735.

[36] Samorini, Giorgio. Animals and psychedelics: The natural world and the instinct to alter consciousness. Park Street Press, 2002.

[37] Yehuda, Rachel, and Linda M. Bierer. "Transgenerational transmission of cortisol and PTSD risk." Progress in brain research 167 (2007): 121-135.

[38] McKenna, Dennis. "Joe Rogan Experience #298 - Dennis McKenna", January 16, 2013 [video podcast]