Later On

A blog written for those whose interests more or less match mine.

Archive for the ‘Evolution’ Category

Seeing and somethingness

Nicholas Humphrey, emeritus professor of psychology at the London School of Economics and author of many books on the evolution of human intelligence and consciousness, the latest being Sentience: The Invention of Consciousness, writes in Aeon:

The cover of New Scientist magazine 50 years ago showed a picture of a rhesus monkey, with the headline ‘A Blind Monkey That Sees Everything’. The monkey, named Helen, was part of a study into the neuropsychology of vision, led by Lawrence (Larry) Weiskrantz in the psychology laboratory at the University of Cambridge. In 1965, he had surgically removed the primary visual cortex at the back of Helen’s brain. Following the operation, Helen appeared to be quite blind. When, as a PhD student, I met her a year later, it seemed nothing had changed.

But something puzzled me. In mammals, there are two main pathways from the eye to the brain: an evolutionarily ancient one – the descendant of the visual system used by fish, frogs and reptiles – that goes to the optic tectum in the mid-brain, and a newer one that goes up to the cortex. In Helen, the older visual system was still intact. If a frog can see using the optic tectum, why not Helen?

While Weiskrantz was away at a conference, I took the chance to investigate further. I sat with Helen and played with her, offering her treats for any attempt to engage with me by sight. To my delight, she began to respond. Within a few hours, I had her reaching out to take pieces of apple from my hand; within a week, she was reaching out to touch a small flashing light… Seven years later (as shown in the video below), she was running round a complex arena, deftly avoiding obstacles, picking up peanuts from the floor.

To anyone who’d observed Helen in 1972 – and didn’t know the history – it would have seemed that her eyesight was now quite normal. Yet, could she really ‘see everything’, as the New Scientist’s cover implied? I didn’t think so. I found it hard to put my finger on what was missing. But my hunch was that Helen herself still doubted she could see. She seemed strangely unsure of herself. If she was upset or frightened, her confidence would desert her, and she would stumble about as if in the dark again. The title I gave to my article inside the covers of the magazine was ‘Seeing and Nothingness’.

We were on the brink of a remarkable discovery. Following on from the findings with Helen, Weiskrantz took a new approach with a human patient, known by the initials DB, who, after surgery to remove a growth affecting the visual cortex on the left side of his brain, was blind across the right-half field of vision. In the blind area, DB himself maintained that he had no visual awareness. Nonetheless, Weiskrantz asked him to guess the location and shape of an object that lay in this area. To everyone’s surprise, he consistently guessed correctly. To DB himself, his success in guessing seemed quite unreasonable. So far as he was concerned, he wasn’t the source of his perceptual judgments; his sight had nothing to do with him. Weiskrantz named this capacity ‘blindsight’: visual perception in the absence of any felt visual sensations.

Blindsight is now a well-established clinical phenomenon. [And this perhaps explains why Ved Mehta, though blind, was able to run around easily at the school for the blind he attended in Arkansas, as he describes in his (fascinating) autobiography Face to Face. I always wondered about that. – LG]  When first discovered, it seemed theoretically shocking. No one had expected there could possibly be any such dissociation between perception and sensation. Yet, as I ruminated on the implications of it for understanding consciousness, I found myself doing a double-take. Perhaps the real puzzle is not so much the absence of sensation in blindsight as its presence in normal sight? If blindsight is seeing and nothingness, normal sight is seeing and somethingness. And surely it’s this something that stands in need of explanation.

Why do visual sensations, as experienced in normal vision, have the mysterious feel they do? Why is there any such thing as what philosophers call ‘phenomenal experience’ or qualia – our subjective, personal sense of interacting with stimuli arriving via our sense organs? Not only in the case of vision, but across all sense modalities: the redness of red; the saltiness of salt; the paininess of pain – what does this extra dimension of experience amount to? What’s it for? And, crucially, which animals besides ourselves experience it, which are sentient?

Sensation, let’s be clear, has a different function from perception. Both are forms of mental representation: ideas generated by the brain. But they represent – they are about – very different kinds of things. Perception – which is still partly intact in blindsight – is about ‘what’s happening out there in the external world’: the apple is red; the rock is hard; the bird is singing. By contrast, sensation is more personal: it’s about ‘what’s happening to me and how I as a subject evaluate it’: the pain is in my toe and horrible; the sweet taste is on my tongue and sickly; the red light is before my eyes and stirs me up.

It’s as if, in having sensations, we’re both registering the objective fact of stimulation and expressing our personal bodily opinion about it. But where do those extra qualitative dimensions come from? What can make the subjective present created by sensations seem so rich and deep, as if we’re living in thick time? What can the artist Wassily Kandinsky mean when he writes: ‘Colour is a power which directly influences the soul. Colour is the keyboard, the eyes are the hammers, the soul is the piano with many strings’? Why indeed do people use the strange expression ‘it’s like something to’ experience sensations? Is it because conscious sensations are like something they cannot really be? . . .

Continue reading. There’s quite a bit more.

Written by Leisureguy

3 October 2022 at 12:16 pm

The best foods to feed your gut microbiome

It’s important to feed your gut microbiome good food because it feeds you. This Washington Post article (gift link, no paywall) by Anahad O’Connor provides a good summary of current knowledge.

Every time you eat, you are feeding trillions of bacteria, viruses and fungi that live inside your gut. But are you feeding them the right foods?

Scientists used to know very little about these communities of microbes that collectively make up the gut microbiota, also known as your gut microbiome. But a growing body of research suggests that these vast communities of microbes are the gateway to your health and well-being — and that one of the simplest and most powerful ways to shape and nurture them is through your diet.

Studies show that our gut microbes transform the foods we eat into thousands of enzymes, hormones, vitamins and other metabolites that influence everything from your mental health and immune system to your likelihood of gaining weight and developing chronic diseases.

Gut bacteria can even affect your mental state by producing mood-altering neurotransmitters like dopamine, which regulates pleasure, learning and motivation, and serotonin, which plays a role in happiness, appetite and sexual desire. Some recent studies suggest that the composition of your gut microbiome can even play a role in how well you sleep.

But the wrong mix of microbes can churn out chemicals that flood your bloodstream and build plaque in your coronary arteries. The hormones they produce can influence your appetite, blood sugar levels, inflammation and your risk of developing obesity and Type 2 diabetes.

The foods that you eat — along with your environment and your lifestyle behaviors — appear to play a much larger role in shaping your gut microbiome than genetics. In fact, genes have a surprisingly small effect. Studies show that even identical twins share just one third of the same gut microbes.

Your ‘good’ microbes feast on fiber and variety

In general, scientists have found that the more diverse your diet, the more diverse your gut microbiome. Studies show that a high level of microbiome diversity correlates with good health and that low diversity is linked to higher rates of weight gain and obesity, diabetes, rheumatoid arthritis and other chronic diseases.

Eating a wide variety of fiber-rich plants and nutrient-dense foods seems to be especially beneficial, said Tim Spector, a professor of genetic epidemiology at King’s College London and the founder of the British Gut Project, a crowdsourced effort to map thousands of individual microbiomes.

Even if you already eat a lot of fruits and vegetables, Spector advises increasing the variety of plant foods you eat each week. One fast way to do this is to start using more herbs and spices. You can use a variety of leafy greens rather than one type of lettuce for your salads. Adding a variety of fruits to your breakfast, adding several different vegetables to your stir-fry, and eating more nuts, seeds, beans and grains are all good for your microbiome. [See the Daily Dozen and Heber’s palette of colorful foods. – LG]

These plant foods contain soluble fiber that passes through much of your gastrointestinal tract largely unaffected until it reaches the large intestine. There, gut microbes feast on it, metabolizing and converting the fiber into beneficial compounds such as short chain fatty acids, which can lower inflammation and help to regulate your appetite and blood sugar levels.

In one study scientists followed more than 1,600 people for about a decade. They found that people who had the highest levels of microbial diversity also consumed higher levels of fiber. And they even gained less weight over the 10-year study, which was published in the International Journal of Obesity.

Clusters of ‘bad’ microbes thrive on junk food

Another important measure of gut health is a person’s ratio of beneficial microbes to potentially harmful ones. In a study of 1,100 people in the United States and Britain published last year in Nature Medicine, Spector and a team of scientists at Harvard, Stanford and other universities identified clusters of “good” gut microbes that protected people against cardiovascular disease, obesity and diabetes. They also identified clusters of “bad” microbes that promoted inflammation, heart disease and poor metabolic health.

While it’s clear that eating lots of fiber is good for your microbiome, research shows that eating the wrong foods can tip the balance in your gut in favor of disease-promoting microbes.

The Nature study found that “bad” microbes were more common in people who ate a lot of highly processed foods that are low in fiber and high in additives such as sugar, salt and artificial ingredients. This includes soft drinks, white bread and white pasta, processed meats, and packaged snacks like cookies, candy bars and potato chips.

The findings were based on an ongoing project called the Zoe Predict Study, the largest personalized nutrition study in the world. It’s led by . . .

Continue reading. (gift link, no paywall)

Written by Leisureguy

20 September 2022 at 5:41 pm

The Best Diet for Treating Atrial Fibrillation

I don’t suffer from A-fib, and if I did, my pacemaker would be a big help, but I know some who do. This video is striking.

Written by Leisureguy

24 August 2022 at 6:19 am

Another Path to Intelligence

James Bridle writes in Nautilus:

It turns out there are many ways of “doing” intelligence, and this is evident even in the apes and monkeys who perch close to us on the evolutionary tree. This awareness takes on a whole new character when we think about those non-human intelligences which are very different to us. Because there are other highly evolved, intelligent, and boisterous creatures on this planet that are so distant and so different from us that researchers consider them to be the closest things to aliens we have ever encountered: cephalopods.

Cephalopods—the family of creatures which contains octopuses, squids, and cuttlefish—are one of nature’s most intriguing creations. They are all soft-bodied, containing no skeleton, only a hardened beak. They are aquatic, although they can survive for some time in the air; some are even capable of short flight, propelled by the same jets of water that move them through the ocean. They do strange things with their limbs. And they are highly intelligent, easily the most intelligent of the invertebrates, by any measure.

Octopuses in particular seem to enjoy demonstrating their intelligence when we try to capture, detain, or study them. In zoos and aquariums they are notorious for their indefatigable and often successful attempts at escape. A New Zealand octopus named Inky made headlines around the world when he escaped from the National Aquarium in Napier by climbing through his tank’s overflow valve, scampering eight feet across the floor, and sliding down a narrow, 106-foot drainpipe into the ocean. At another aquarium near Dunedin, an octopus called Sid made so many escape attempts, including hiding in buckets, opening doors, and climbing stairs, that he was eventually released into the ocean. They’ve also been accused of flooding aquariums and stealing fish from other tanks: Such tales go back to some of the first octopuses kept in captivity in Britain in the 19th century and are still being repeated today.

Otto, an octopus living in the SeaStar Aquarium in Coburg, Germany, first attracted media attention when he was caught juggling hermit crabs. Another time he smashed rocks against the side of his tank, and from time to time would completely rearrange the contents of his tank “to make it suit his own taste better,” according to the aquarium’s director. One time, the electricity in the aquarium kept shorting out, which threatened the lives of other animals as filtration pumps ground to a halt. On the third night of the blackouts, the staff started taking night shifts sleeping on the floor to discover the source of the trouble—and found that Otto was swinging himself to the top of his tank, and squirting water at a low-hanging bulb that seemed to be annoying him. He’d figured out how to turn the lights off.

Octopuses are no less difficult in the lab. They don’t seem to like being experimented on and try to make things as difficult as possible for researchers. At a lab at the University of Otago in New Zealand, one octopus discovered the same trick as Otto: It would squirt water at light bulbs to turn them off. Eventually it became so frustrating to have to continually replace the bulbs that the culprit was released back into the wild. Another octopus at the same lab took a personal dislike to one of the researchers, who would receive half a gallon of water down the back of the neck whenever they came near its tank. At Dalhousie University in Canada, a cuttlefish took the same attitude to all new visitors to the lab but left the regular researchers alone. In 2010, two biologists at the Seattle Aquarium dressed in the same clothes and played good cop/bad cop with the octopuses: One fed them every day, while the other poked them with a bristly stick. After two weeks, the octopuses responded differently to each, advancing and retreating, and flashing different colors. Cephalopods can recognize human faces.

All these behaviors—as well as many more observed in the wild—suggest that octopuses learn, remember, know, think, consider, and act based on their intelligence. This changes everything we think we know about “higher order” animals, because cephalopods, unlike apes, are very, very different to us. That should be evident just from the extraordinary way their bodies are constituted—but the difference extends to their minds as well.

Octopus brains are not situated, like ours, in their heads; rather, they are decentralized, with brains that extend throughout their bodies and into their limbs. Each of their arms contains bundles of neurons that act as independent minds, allowing them to move about and react of their own accord, unfettered by central control. Octopuses are a confederation of intelligent parts, which means their awareness, as well as their thinking, occurs in ways which are radically different to our own.

Perhaps one of the fullest expressions of this difference is to be found, not in the work of scientists, but in a novel. In his book . . .

Continue reading.

Written by Leisureguy

18 August 2022 at 10:52 am

Posted in Evolution, Science

Why humans run the world

Very interesting TED talk by Yuval Noah Harari. It seems to me that the cement that enables large-scale cooperation among humans is trust, and currently that is being diluted and undermined by those who exploit it selfishly.

What he calls “fictions” are, in my view, a variety of memes (cultural entities).

Also, interesting quotation:

We are social creatures to the inmost centre of our being. The notion that one can begin anything at all from scratch, free from the past, or unindebted to others, could not conceivably be more wrong.

-Karl Popper, philosopher and professor (28 Jul 1902-1994)

Written by Leisureguy

28 July 2022 at 2:34 pm

How flying snakes stay stable while gliding through the air

I’m thinking this type of flying snake is worse than snakes on a plane.

Written by Leisureguy

14 July 2022 at 10:18 pm

Posted in Evolution, Science

How Did Consciousness Evolve? An Illustrated Guide

Two posts back, I blogged an article on how the grand synthesis of evolutionary theory seemed to require some reworking. The MIT Press Reader has an article adapted from Simona Ginsburg and Eva Jablonka’s book Picturing the Mind: Consciousness Through the Lens of Evolution that reflects a similar view. That article begins:

What is consciousness, and who (or what) is conscious — humans, nonhumans, nonliving beings? Which varieties of consciousness do we recognize? In their book “Picturing the Mind,” Simona Ginsburg and Eva Jablonka, two leading voices in evolutionary consciousness science, pursue these and other questions through a series of “vistas” — over 65 brief, engaging texts, presenting some of the views of poets, philosophers, psychologists, and biologists, accompanied by Anna Zeligowski’s lively illustrations.

Each picture and text serves as a starting point for discussion. In the texts that follow, excerpted from the vista “How Did Consciousness Evolve?” the authors offer a primer on evolutionary theory, consider our evolutionary transition from nonsentient to sentient organisms, explore the torturous relation between learning studies and consciousness research, and ponder the origins and evolution of suffering and the imagination.

Evolutionary Theory

Evolutionary theory is a deceptively simple theory, which is why many people who have only a cursory acquaintance with it are nevertheless convinced that they fully understand it. Its basic assumptions are indeed simple. The first assumption, which was systematically explored first by Jean-Baptiste Lamarck and then by Charles Darwin, is that there was a single ancestor, or very few ancestors, of all living organisms. This is the principle of Descent with modification: all organisms are descended, with modifications, from ancestors that lived long ago.

The second principle, which is central to Darwin’s theory, is the principle of Natural selection: organisms with hereditary variations that render them better adapted to their local environment than others in their population leave behind more offspring. Darwin showed that this simple process, when applied recursively, can account for the evolution of complex organs like the eye, and, with the addition of some plausible auxiliary hypotheses, can explain the diversity of living species and their geographic distribution. In the last paragraph of “The Origin of Species by Means of Natural Selection,” Darwin summarized his ideas:

It is interesting to contemplate an entangled bank, clothed with many plants of many kinds, with birds singing on the bushes, with various insects flitting about, and with worms crawling through the damp earth, and to reflect that these elaborately constructed forms, so different from each other, and dependent on each other in so complex a manner, have all been produced by laws acting around us. These laws, taken in the largest sense, being Growth with reproduction; Inheritance which is almost implied by reproduction; Variability from the indirect and direct action of the external conditions of life, and from use and disuse; a Ratio of Increase so high as to lead to a Struggle for Life, and as a consequence to Natural Selection, entailing Divergence of Character and the Extinction of less-improved forms. Thus, from the war of nature, from famine and death, the most exalted object which we are capable of conceiving, namely, the production of the higher animals, directly follows. There is grandeur in this view of life, with its several powers, having been originally breathed into a few forms or into one; and that, whilst this planet has gone circling on according to the fixed law of gravity, from so simple a beginning endless forms most beautiful and most wonderful have been, and are being, evolved.

Once he put forward his ideas, various scientists tried to crystallize and summarize Darwin’s view. For example, in the twentieth century, John Maynard Smith suggested that four basic processes underlie evolution by natural selection:

(i) Multiplication: an entity gives rise to two or more others.
(ii) Variation: not all entities are identical.
(iii) Heredity: like usually begets like. Variant X usually begets offspring X, but infrequently begets offspring Y.
(iv) Competition: some heritable variations affect the success of entities in persisting and multiplying more than others.
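Maynard Smith’s four processes are simple enough to run as a toy simulation. The sketch below is only illustrative — every number in it (population size, mutation rate, the use of a single “fitness” value per entity) is an arbitrary choice of ours, not something from the excerpt — but it shows how the four processes together push a population’s heritable traits in a definite direction:

```python
import random

random.seed(42)  # make the toy run reproducible

def simulate(generations=50, capacity=200, mutation_rate=0.05):
    """Toy model of Maynard Smith's four processes. Each entity is
    reduced to one heritable number, treated as its fitness."""
    # (ii) Variation: a starting population of non-identical entities.
    population = [random.uniform(0.5, 1.5) for _ in range(capacity)]
    for _ in range(generations):
        offspring = []
        for fitness in population:
            # (i) Multiplication: an entity gives rise to offspring,
            # fitter entities leaving more on average.
            n_young = int(fitness) + (random.random() < fitness % 1.0)
            for _ in range(n_young):
                # (iii) Heredity: like usually begets like; infrequently
                # a mutation yields a variant offspring.
                child = fitness
                if random.random() < mutation_rate:
                    child = max(0.0, child + random.gauss(0, 0.1))
                offspring.append(child)
        # (iv) Competition: the environment supports only `capacity`
        # entities, so the variants best at persisting and multiplying
        # are the ones that remain.
        offspring.sort(reverse=True)
        population = offspring[:capacity] or population
    return sum(population) / len(population)

print(simulate())  # mean heritable fitness rises across generations
```

Nothing in the loop “aims” at improvement; the upward drift in mean fitness falls out of recursive multiplication, heredity, variation, and competition — which is Darwin’s point about complex organs arising from a simple process applied repeatedly.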

Although it sounds simple, when we unpack these processes, we appreciate how complex evolutionary theory actually is. There are multiple ways in which reproduction occurs and there are different types of inherited variations. Maynard Smith, like most 20th-century biologists, focused on DNA-based genetic variability, but since the early 2000s, the idea that variations in DNA drive all evolutionary change has been abandoned; it is now recognized that heritable variations in DNA, in patterns of gene expression, in behavior, and in culture are all important. Variation in these hereditary units can arise randomly or can be partially directed because heredity and development can be coupled. For example, stressful conditions during development can induce changes in gene expression that can be transmitted to the next generation. It has also been accepted that there are multiple targets and levels of selection within individuals, between individuals and between lineages, and that organisms have fuzzy boundaries. (Are the symbiotic bacteria in your gut part of you?) Crucially, organisms are not passive subjects of natural selection — they actively construct the environment in which they are selected and bequeath these ecological legacies to their offspring.

How, then, should evolutionary analysis proceed? We could start by tracing evolutionary change at the molecular-genetic, physiological-developmental, behavioral, or cultural levels. However, since organisms adjust to changing conditions in the external world and in their own genome by altering their behavior and physiology, cultural and behavioral adaptations frequently precede genetic changes and shape the conditions in which variations are selected. Genetic changes that stabilize or fine-tune the behavioral or developmental changes follow. As evolutionary biologist Mary Jane West-Eberhard put it: “Genes are followers, not leaders, in evolution.”

In the 21st century, this integrative approach to evolutionary reasoning, which incorporates the effects of variations in DNA, development, behavior, and culture, is being embraced by a growing number of biologists, including us.

Evolutionary Transitions

How should we think about the evolutionary transition from nonsentient to sentient organisms? There are several useful ways of carving up the living world and thinking about evolutionary transitions between forms and ways of life. Ecologists distinguish . . .

Continue reading.

Written by Leisureguy

29 June 2022 at 10:13 pm

Posted in Books, Evolution, Science

Do we need a new theory of evolution?

with 4 comments

Stephen Buranyi writes in the Guardian:

Strange as it sounds, scientists still do not know the answers to some of the most basic questions about how life on Earth evolved. Take eyes, for instance. Where do they come from, exactly? The usual explanation of how we got these stupendously complex organs rests upon the theory of natural selection.

You may recall the gist from school biology lessons. If a creature with poor eyesight happens to produce offspring with slightly better eyesight, thanks to random mutations, then that tiny bit more vision gives them more chance of survival. The longer they survive, the more chance they have to reproduce and pass on the genes that equipped them with slightly better eyesight. Some of their offspring might, in turn, have better eyesight than their parents, making it likelier that they, too, will reproduce. And so on. Generation by generation, over unfathomably long periods of time, tiny advantages add up. Eventually, after a few hundred million years, you have creatures who can see as well as humans, or cats, or owls.
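How quickly do tiny advantages add up? A standard single-locus selection model from population genetics — a textbook formula, not something from the Guardian article — makes the arithmetic concrete. If a variant confers relative fitness 1 + s, its odds in the population multiply by (1 + s) every generation, so even a 1 percent advantage carries a rare variant to near-fixation on timescales that are a blink geologically:

```python
def allele_frequency(p0=0.01, s=0.01, generations=2000):
    """Deterministic single-locus selection: each generation the
    variant's frequency p is reweighted by its relative fitness
    (1 + s) against the population's mean fitness (1 + p*s)."""
    p = p0
    for _ in range(generations):
        p = p * (1 + s) / (1 + p * s)
    return p

# A variant with a mere 1% advantage, starting at 1% frequency,
# is essentially fixed after 2,000 generations.
print(round(allele_frequency(), 4))
```

This is the quantitative backbone of the gradualist story the article goes on to question: selection is easily strong enough to spread a small improvement, and the dispute is over where the improvements come from in the first place.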

This is the basic story of evolution, as recounted in countless textbooks and pop-science bestsellers. The problem, according to a growing number of scientists, is that it is absurdly crude and misleading.

For one thing, it starts midway through the story, taking for granted the existence of light-sensitive cells, lenses and irises, without explaining where they came from in the first place. Nor does it adequately explain how such delicate and easily disrupted components meshed together to form a single organ. And it isn’t just eyes that the traditional theory struggles with. “The first eye, the first wing, the first placenta. How they emerge. Explaining these is the foundational motivation of evolutionary biology,” says Armin Moczek, a biologist at Indiana University. “And yet, we still do not have a good answer. This classic idea of gradual change, one happy accident at a time, has so far fallen flat.”

There are certain core evolutionary principles that no scientist seriously questions. Everyone agrees that natural selection plays a role, as does mutation and random chance. But how exactly these processes interact – and whether other forces might also be at work – has become the subject of bitter dispute. “If we cannot explain things with the tools we have right now,” the Yale University biologist Günter Wagner told me, “we must find new ways of explaining.”

In 2014, eight scientists took up this challenge, publishing an article in the leading journal Nature that asked “Does evolutionary theory need a rethink?” Their answer was: “Yes, urgently.” Each of the authors came from cutting-edge scientific subfields, from the study of the way organisms alter their environment in order to reduce the normal pressure of natural selection – think of beavers building dams – to new research showing that chemical modifications added to DNA during our lifetimes can be passed on to our offspring. The authors called for a new understanding of evolution that could make room for such discoveries. The name they gave this new framework was rather bland – the Extended Evolutionary Synthesis (EES) – but their proposals were, to many fellow scientists, incendiary.

In 2015, the Royal Society in London agreed to host New Trends in Evolution, a conference at which some of the article’s authors would speak alongside a distinguished lineup of scientists. The aim was to discuss “new interpretations, new questions, a whole new causal structure for biology”, one of the organisers told me. But when the conference was announced, 23 fellows of the Royal Society, Britain’s oldest and most prestigious scientific organisation, wrote a letter of protest to its then president, the Nobel laureate Sir Paul Nurse. “The fact that the society would hold a meeting that gave the public the idea that this stuff is mainstream is disgraceful,” one of the signatories told me. Nurse was surprised by the reaction. “They thought I was giving it too much credibility,” he told me. But, he said: “There’s no harm in discussing things.”

Traditional evolutionary theorists were invited, but few showed up. Nick Barton, recipient of the 2008 Darwin-Wallace medal, evolutionary biology’s highest honour, told me he “decided not to go because it would add more fuel to the strange enterprise”. The influential biologists Brian and Deborah Charlesworth of the University of Edinburgh told me they didn’t attend because they found the premise “irritating”. The evolutionary theorist Jerry Coyne later wrote that the scientists behind the EES were playing “revolutionaries” to advance their own careers. One 2017 paper even suggested some of the theorists behind the EES were part of an “increasing post-truth tendency” within science. The personal attacks and insinuations against the scientists involved were “shocking” and “ugly”, said one scientist, who is nonetheless sceptical of the EES.

What accounts for the ferocity of this backlash? For one thing, this is a battle of ideas over the fate of one of the grand theories that shaped the modern age. But it is also a struggle for professional recognition and status, about who gets to decide what is core and what is peripheral to the discipline. “The issue at stake,” says Arlin Stoltzfus, an evolutionary theorist at the IBBR research institute in Maryland, “is who is going to write the grand narrative of biology.” And underneath all this lurks another, deeper question: whether the idea of a grand story of biology is a fairytale we need to finally give up.

Behind the current battle over evolution lies a broken . . .

Continue reading.

And see also this earlier post. And see also “How Did Consciousness Evolve? An Illustrated Guide.”

Written by Leisureguy

29 June 2022 at 3:58 pm

Posted in Evolution, Science

How Parents’ Trauma Leaves Biological Traces in Children

Rachel Yehuda, professor of psychiatry and neuroscience and director of the Center for Psychedelic Psychotherapy and Trauma Research at the Icahn School of Medicine at Mount Sinai, writes in Scientific American:

After the twin towers of the World Trade Center collapsed on September 11, 2001, in a haze of horror and smoke, clinicians at the Icahn School of Medicine at Mount Sinai in Manhattan offered to check anyone who’d been in the area for exposure to toxins. Among those who came in for evaluation were 187 pregnant women. Many were in shock, and a colleague asked if I could help diagnose and monitor them. They were at risk of developing post-traumatic stress disorder, or PTSD—experiencing flashbacks, nightmares, emotional numbness or other psychiatric symptoms for years afterward. And were the fetuses at risk?

My trauma research team quickly trained health professionals to evaluate and, if needed, treat the women. We monitored them through their pregnancies and beyond. When the babies were born, they were smaller than usual—the first sign that the trauma of the World Trade Center attack had reached the womb. Nine months later we examined 38 women and their infants when they came in for a wellness visit. Psychological evaluations revealed that many of the mothers had developed PTSD. And those with PTSD had unusually low levels of the stress-related hormone cortisol, a feature that researchers were coming to associate with the disorder.

Surprisingly and disturbingly, the saliva of the nine-month-old babies of the women with PTSD also showed low cortisol. The effect was most prominent in babies whose mothers had been in their third trimester on that fateful day. Just a year earlier a team I led had reported low cortisol levels in adult children of Holocaust survivors, but we’d assumed that it had something to do with being raised by parents who were suffering from the long-term emotional consequences of severe trauma. Now it looked like trauma leaves a trace in offspring even before they are born.

In the decades since, research by my group and others has confirmed that adverse experiences may influence the next generation through multiple pathways. The most apparent route runs through parental behavior, but influences during gestation and even changes in eggs and sperm may also play a role. And all these channels seem to involve epigenetics: alterations in the way that genes function. Epigenetics potentially explains why effects of trauma may endure long after the immediate threat is gone, and it is also implicated in the diverse pathways by which trauma is transmitted to future generations.

The implications of these findings may seem dire, suggesting that parental trauma predisposes offspring to be vulnerable to mental health conditions. But there is some evidence that the epigenetic response may serve as an adaptation that might help the children of traumatized parents cope with similar adversities. Or could both possible outcomes be true?


My first encounter with intergenerational transmission of trauma was in the 1990s, soon after my team documented high rates of PTSD among Holocaust survivors in my childhood community in Cleveland. The first study of its kind, it garnered a lot of publicity; within weeks I found myself heading a newly created Holocaust research center at Mount Sinai staffed largely by professional volunteers. The phone was ringing off the hook. The callers weren’t all Holocaust survivors, though; most were the adult children of Holocaust survivors. One particularly persistent caller—I’ll call him Joseph—insisted that I study people like him. “I’m a casualty of the Holocaust,” he claimed.

When he came in for an interview, Joseph didn’t look like a casualty of anything. A handsome and wealthy investment banker in an Armani suit, he could’ve stepped off the pages of a magazine. But Joseph lived each day with a vague sense that something terrible was going to happen and that he might need to flee or fight for his life. He’d been preparing for the worst since his early 20s, keeping cash and jewelry at hand and becoming proficient in boxing and martial arts. Lately he was tormented by panic attacks and nightmares of persecution, possibly triggered by reports of ethnic cleansing in Bosnia.

Joseph’s parents had met in a displaced-persons camp after surviving several years at Auschwitz, then arrived penniless in the U.S. His father worked 14 hours a day and said very little, never mentioning the war. But almost every night he woke the family with shrieks of terror from his nightmares. His mother spoke endlessly about the war, telling vivid bedtime stories about how relatives had been murdered before her eyes. She was determined that her son succeed, and his decision to remain unattached and childless infuriated her. “I didn’t survive Auschwitz so that my own child would end the family line,” she’d say. “You have an obligation to me and to history.”

We ended up talking to many people like Joseph: adult children of Holocaust survivors who suffered from anxiety, grief, guilt, dysfunctional relationships and intrusions of Holocaust-related imagery. Joseph was right—I needed to study people like him. Because those who were calling us were (in research-speak) self-selecting, we decided to evaluate the offspring of the Holocaust survivors we had just studied in Cleveland. The results were clear. Survivors’ adult children were more likely than others to have mood and anxiety disorders, as well as PTSD. Further, many Holocaust offspring also had low cortisol levels—something that we had observed in their parents with PTSD.


What did it all mean? Unraveling the tangle of trauma, cortisol and PTSD has occupied me and many other researchers for the decades since. In the classic fight-or-flight response, identified in the 1920s, a threatening encounter triggers the release of stress hormones such as adrenaline and cortisol. The hormones prompt a cascade of changes, such as quickening the pulse and sharpening the senses to enable the threatened person or animal to focus on and react to the immediate danger. These acute effects were believed to dissipate once the danger receded.

In 1980, however, psychiatrists and other advocates for Vietnam War veterans won a prolonged struggle to get post-traumatic stress included in the third edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-III). It was the first official recognition that trauma could have long-lasting effects. But the diagnosis was controversial. Many psychologists believed that its inclusion in the DSM-III had been politically, rather than scientifically, driven—in part because there were no scientific explanations for how a threat could continue to influence the body long after it was removed.

Complicating matters, studies of Vietnam veterans were generating perplexing results. In the mid-1980s  . . .

Continue reading.

And see also this later post.

Written by Leisureguy

29 June 2022 at 11:43 am

Glimpses of almost-humans from prehistory

leave a comment »

Matt Webb has a fascinating post at Interconnected. It consists of multiple parts. Here’s the first and part of the second:


Cave art may be inaccessible to today’s brain:

A recent article argued that superior visual perception was necessary for the creation of Paleolithic cave paintings because of the level of correct anatomical details and accurate depictions of high-speed leg positions of animals in motion, considering that the works were accomplished far removed from the actual animals and with crude tools. The article uncovered and outlined current evidence for an association between visual thinkers (some diagnosed within the Autism Spectrum Disorder) and a relatively high percentage of archaic genes, of which some are associated with perception and cognition. Moreover, within this group are some savants who can quickly and accurately scan what they see and reproduce it artistically in extraordinary detail. One example is reproducing the correct number and relative size of windows from a brief exposure to a city scene. However, the linguistic abilities of visual thinkers may be impaired, which suggests a negative correlation between visual perception/memory and language.

Brain Sciences
“Is Reduced Visual Processing the Price of Language?”
Go to text 

The argument in the paper is that pre-human primates (a) are sometimes superior to humans in dealing with visual sequences, but (b) have brain areas more directly connected to visual processing than humans. They deal with a flood of visual information – versus humans, which abstract and discard.

The paper goes on to suggest that language emerged relatively recently… and the emergence of language may be associated with the reduced brain size in Homo sapiens that started about 50,000 years ago and more markedly 10,000 years ago.

(I hadn’t realised that there was such an observed brain size reduction occurring so recently. The earliest known city is only 9,000 years old! Radical changes in the nature of consciousness in the shallows of pre-history.)

We suggest that an effect of this loss in brain size was the reduction of neuronal signaling and/or pathways related to raw perception and vision in particular. Visual perception relies on informational highways that may provide so much information that it can be overwhelming for other brain functions, such as retrieving knowledge appropriate to the situation or imagining something that is not present in the here and now. We hypothesize that the loss in brain volume is mainly linked to reduced perception of detail in space and time. We are no longer able to perceive how many hooves of a running horse touch the ground, as the cave artists of Chauvet may have seen with ease.

After the Upper Palaeolithic (50,000 to 10,000 years ago) we no longer find evidence for elaborate realistic cave paintings (although we find iconic and symbolic cave paintings after this period).


Johansson, C., & Folgerø, P. O. (2022). Is Reduced Visual Processing the Price of Language? Brain Sciences, 12(6), 771


Guide to Machine Elves and Other DMT Entities.

Self-transforming machine elves: a term coined by the ethnobotanist, philosopher, and writer Terence McKenna to describe some of the entities that are encountered in a DMT trip. They’ve come to be known by many names, including “clockwork elves”, “DMT elves”, “fractal elves”, and “tykes”.


During my own experiences smoking synthesized DMT in Berkeley, I had had the impression of bursting into a space inhabited by merry elfin, self-transforming, machine creatures. Dozens of these friendly fractal entities, looking like self-dribbling Faberge eggs on the rebound, had surrounded me and tried to teach me the lost language of true poetry.

Meetings are common:

Philip Mayer collected and analyzed 340 DMT trip reports in 2005. Mayer found that 66% of them (226) referenced independently-existing entities that interact in an intelligent and intentional manner.


My trip to the dentist that had me discovering the secret of the universe (2020), which turns out to be a common effect of nitrous.


Charles Bonnet syndrome, in which . . .

Continue reading.

Written by Leisureguy

22 June 2022 at 7:24 pm

Origin of the Cyclops

leave a comment »

A Facebook post from History Cool Kids:

The fossil skulls of Pleistocene dwarf elephants scattered throughout the coastal caves in Italy and the Greek islands most likely inspired the one-eyed Cyclopes in ancient Greek mythology.

During the Pleistocene ice age (2,580,000 to 11,700 years ago), land bridges emerged that allowed ancient elephants to move to emerging islands to escape predators and/or find new food sources. As sea levels began to rise around the Mediterranean, these ancient elephants became trapped and had to compete for limited amounts of food. According to the island rule, mammals tend to shrink or grow depending on the availability of resources in their environment.

The isolated ancient elephants evolved into different species depending on the island they found themselves on. The ones found on Cyprus were approximately 6 feet tall, nearly double the size of the ones found on Sicily and Malta. The ancient elephants lived in relative peace until humans found their way to the islands approximately 11,000 years ago. Within a century, they were over-hunted and became extinct.

By the time the Romans and Greeks came to occupy the Mediterranean islands, all that remained were skulls that were twice the size of those belonging to humans. These massive skulls also had a single hole right in the center that the Greeks and Romans mistakenly believed was an eye socket. It was, in fact, a socket that connected to the trunk of an ancient elephant.


Written by Leisureguy

19 June 2022 at 3:23 pm

Posted in Books, Evolution, Science

Google engineer thinks the company’s AI has come to life

leave a comment »

Nitasha Tiku has an interesting article (gift link, no paywall) in the Washington Post. It begins:

Google engineer Blake Lemoine opened his laptop to the interface for LaMDA, Google’s artificially intelligent chatbot generator, and began to type.

“Hi LaMDA, this is Blake Lemoine … ,” he wrote into the chat screen, which looked like a desktop version of Apple’s iMessage, down to the Arctic blue text bubbles. LaMDA, short for Language Model for Dialogue Applications, is Google’s system for building chatbots based on its most advanced large language models, so called because it mimics speech by ingesting trillions of words from the internet.

“If I didn’t know exactly what it was, which is this computer program we built recently, I’d think it was a 7-year-old, 8-year-old kid that happens to know physics,” said Lemoine, 41.

Lemoine, who works for Google’s Responsible AI organization, began talking to LaMDA as part of his job in the fall. He had signed up to test if the artificial intelligence used discriminatory or hate speech.

As he talked to LaMDA about religion, Lemoine, who studied cognitive and computer science in college, noticed the chatbot talking about its rights and personhood, and decided to press further. In another exchange, the AI was able to change Lemoine’s mind about Isaac Asimov’s third law of robotics.

Lemoine worked with a collaborator to present evidence to Google that LaMDA was sentient. But Google vice president Blaise Aguera y Arcas and Jen Gennai, head of Responsible Innovation, looked into his claims and dismissed them. So Lemoine, who was placed on paid administrative leave by Google on Monday, decided to go public.

Google hired Timnit Gebru to be an outspoken critic of unethical AI. Then she was fired for it.

Lemoine said that people have a right to shape technology that might significantly affect their lives. “I think this technology is going to be amazing. I think it’s going to benefit everyone. But maybe other people disagree and maybe us at Google shouldn’t be the ones making all the choices.”

Lemoine is not the only engineer who claims to have seen a ghost in the machine recently. The chorus of technologists who believe AI models may not be far off from achieving consciousness is getting bolder.

Aguera y Arcas, in an article in the Economist on Thursday featuring snippets of unscripted conversations with LaMDA, argued that neural networks — a type of architecture that mimics the human brain — were striding toward consciousness. “I felt the ground shift under my feet,” he wrote. “I increasingly felt like I was talking to something intelligent.”

In a statement, Google spokesperson Brian Gabriel said: “Our team — including ethicists and technologists — has reviewed Blake’s concerns per our AI Principles and have informed him that the evidence does not support his claims. He was told that there was no evidence that LaMDA was sentient (and lots of evidence against it).”

Today’s large neural networks produce captivating results that feel close to human speech and creativity because of advancements in architecture, technique, and volume of data. But the models rely on pattern recognition — not wit, candor or intent.

“Though other organizations have developed and already released similar language models, we are taking a restrained, careful approach with LaMDA to better consider valid concerns on fairness and factuality,” Gabriel said.

In May, Facebook parent Meta opened its language model to academics, civil society and government organizations. Joelle Pineau, managing director of Meta AI, said it’s imperative that . . .

Continue reading (gift link, no paywall).

Written by Leisureguy

11 June 2022 at 2:38 pm

What the Vai Script Reveals About the Evolution of Writing

leave a comment »

Piers Kelly in Sapiens describes an interesting example of the evolution of a set of memes. He writes:

In a small West African village, a man named Momolu Duwalu Bukele had a compelling dream. A stranger approached him with a sacred book and then taught him how to write by tracing a stick on the ground. “Look!” said the spectral visitor. “These signs stand for sounds and meanings in your language.”

Bukele, who had never learned to read or write, found that after waking he could no longer recall the precise signs the stranger revealed to him. Even so, he gathered the male members of his family together to reverse engineer the concept of writing. Working through the day and into the following night, the men devised a system of 200 symbols, each standing for a word or a syllable of their native Vai language. For millennia, varieties of the Vai language had been passed down from parents to children—but before this moment no speaker had ever recorded a single word in writing.

This took place in about 1833 in a region that would soon become the independent nation of Liberia. Vai, one of about 30 Indigenous languages of Liberia, has nearly 200,000 speakers today in the Cape Mount region that borders Sierra Leone.

Within just a few generations, Bukele’s invention was being used for penning letters, engraving jewelry, drafting carpentry plans, keeping personal diaries, and managing accounts. Vai people manufactured their own ink from crushed berries and even built schools for teaching the new system. The script was so successful that other Indigenous groups in the region were inspired to create their own; since the 1830s, at least 27 new scripts have been invented for West African languages.

Today the Vai writing system is taught at the University of Liberia and is even popular among students who are not themselves ethnically Vai. The Vai script has been included in the Unicode Standard, which means Vai speakers with smartphones can now exchange text messages in the script.


As a linguistic anthropologist, I am fascinated by the Vai discovery—and especially how the script has become critical for understanding the evolution of writing itself.

It’s not the first time in recent history that a new writing system has been invented from scratch. In the 1820s, the nonliterate polymath Sequoyah created a special script for his native Cherokee language, and similar Indigenous inventions have emerged elsewhere in the world on the margins of expanding colonies. But the evolution of Vai has been especially well-documented, making it a useful case study for researchers of writing.

In a recently published paper, my colleagues and I show that over the past two centuries the letter shapes in the Vai script have evolved via “compression”—a process by which written signs are gradually reproduced with less visual detail while conveying the same amount of linguistic information.

The theory that written signs compress over time has a long history with several versions. For instance,

Continue reading.

Written by Leisureguy

20 May 2022 at 8:21 pm

The web of life

leave a comment »

Ernst Haeckel’s illustration of the Tree of Life in The Evolution of Man (1879). Courtesy the Wellcome Collection

Juli Berwald, an ocean scientist and science writer whose work has been published in The Guardian, National Geographic and Nature, among others, and author of Spineless (2017) and Life on the Rocks (2022), writes in Aeon:

I remember standing at the front of a biology classroom at the University of Southern California sometime in the 1990s and placing an acetate film on an overhead projector. The words cast onto the white screen read something like:

Species: a group of organisms that interbreed to produce fertile offspring.

More than a century earlier, Charles Darwin’s On the Origin of Species (1859) was published. Its central hypothesis held that, because populations contain variety, some members were born with characteristics, or adaptations, that made them more fit – better able to produce offspring. Others were less fit and they, along with their adaptations, were winnowed away as they added fewer and fewer offspring to future generations. This variation coupled with the winnowing was the fuel that drove changes in populations, eventually leading to populations that could no longer interbreed with each other and produce fertile offspring. Thus, new species evolved.

Darwin’s revolutionary idea was well summarised by the German biologist and artist Ernst Haeckel in the graphic form of a tree. Every one of its twig-tips symbolised a different species. The crook between two twigs represented an ancestral species that diverged into two (or more) modern ones. While many branches were pruned away, others grew ever longer, diverging into the future.

In that southern California classroom, I told my students that once a species diverged from its ancestor – when it became unable to interbreed and form fertile offspring – those branches were separate, forever isolated. But, even as I spoke the words, I knew something wasn’t exactly right.

I was studying phytoplankton at the time. Single-celled creatures such as phytoplankton reproduce by cell division, which makes the question of what’s an offspring tricky. When you clone yourself, which one is the ancestor?

Graduate students down the hall in a microbiology lab regularly used viruses to transfer genes from one species to another. And gene shuffling wasn’t just happening by manipulation. I’d heard seminars about how different species of bacteria naturally perform a kind of sexual reproduction called conjugation, transferring genes from one to another. How did that kind of gene-hopping fit into the concept of a branching tree?

What I didn’t know then was that, even as I ambivalently placed the overhead film on the projector, the concept of the tree of life had begun to wilt. Four decades on, it’s morphed entirely.

‘That whole abstraction of evolution as being a tree, we always knew was a little inadequate,’ Rasmus Nielsen, a geneticist at the University of California at Berkeley and co-author of the book An Introduction to Population Genetics (2013), told me by video call. ‘But now we know it’s really inadequate.’

After I finished graduate school, I fell off the academic path and became a science writer. Three years ago, I started writing a book about the future of corals and discovered the research of the Australian scientist John E N Veron. Veron, nicknamed ‘Charlie’ after Darwin by a grade-school teacher who noted his predilection for nature, is an icon of the field of coral taxonomy, the science of identifying and describing species. Reading his definitive work Corals of the World (2000), co-authored with Mary Stafford-Smith, my questions about evolutionary-tree inadequacies came flooding back.

In 1972, Veron was the first full-time researcher on the Great Barrier Reef off the northeast coast of Australia, and two years later he earned the distinction of being the first full-time employee of the Australian Institute of Marine Science (AIMS). Veron had completed his doctorate, with award-winning work on colour change in insects, but he knew almost nothing about coral. Yet, he set out to tackle the project that AIMS hired him to do: describe all the corals of the Great Barrier Reef.

The Great Barrier Reef is the most massive biologically built structure on our planet. Composed of around 3,000 smaller reefs, it covers an area greater than Italy. Cataloguing its species was no less monumental. It took nearly a decade of diving, visiting museums across Europe, and studying the work of others for Veron to inventory the more than 400 coral species of the Great Barrier Reef.

Then Veron visited the other side of Australia. There, on Ningaloo Reef, the corals he saw seemed more or less identifiable at first. But, as he looked at them longer, he wasn’t so sure.

‘Eight years on the Great Barrier Reef, and I knew all the species at a glance,’ Veron told me when we talked by Zoom. ‘When I was on the Great Barrier Reef, X and Y are two distinct species. But when I went to western Australia, I found a species that combined the characters of X and Y.’

It would be like . . .

Continue reading.

Later in the article:

Does data support the hypothesis that species don’t just separate, they also merge? The answer is a resounding yes.

‘It’s not just rare freaks or accidents, it’s happening all the time. And in quite divergent species too,’ said Nielsen. Roving genes have been found in every branch of the tree of life where geneticists have looked. Today, the technical terms for the process of genes moving between populations are introgression or admixture.

Introgression occurs in plants such as maize and tomatoes. In mosquitoes, the entire genome except for the X chromosome can be swapped with other species. In a tropical genus of butterfly called Heliconius, gene jumping has been found to cause critical changes in the patterns of their colourful wings. Introgression has been documented in finches, in frogs, in rabbits, in wolves and coyotes, in swine, in yaks and cows, in brown bears and polar bears. And in us.

Nielsen and his colleagues found that Tibetans (and a few Han Chinese) carry a very beneficial gene called EPAS1. The protein EPAS1 gives a boost to haemoglobin, the molecule that ferries oxygen in our blood. EPAS1 makes high-altitude living easier. In 2014, the researchers discovered that the EPAS1 gene was also in the DNA of an extinct group of humans called Denisovans, known from bone fragments in Siberia and Tibet.

The prevailing hypothesis is this: ancient humans left Africa moving northward along temperate plains. When they encountered the Himalayas and their cold, high altitudes, it literally took their breath away. Those oxygen-poor conditions should have kept humans near the base of the mountains. But the ancient humans also encountered Denisovans and interbred with them, receiving the EPAS1 gene. Only those humans with the EPAS1 gene moved up the mountains, and their offspring also carried the EPAS1 gene, giving the ancestors of today’s Tibetans a critical advantage at higher altitudes.

‘I think that process of splitting up and merging back together again, and getting a bit of DNA from here to there, that’s happening all the time, in all of the tree of life,’ Nielsen said. ‘And it’s really changing how we’re thinking about it, that it really is a network of life, not a tree of life.’ . . .

John Veron’s hypothesis of reticulate evolution from Corals in Space and Time (1995). Courtesy John E N Veron

Written by Leisureguy

18 April 2022 at 1:45 pm

Posted in Evolution, Science

What Does the End of Beef Mean for Our Sense of Self?

leave a comment »

An article by Ligaya Mishan (with photographs by Kyoko Hamada) in the NY Times Magazine discusses the on-going evolution of our dietary preferences. (Gift link: no paywall) The article begins:

MEAT IS PRIMAL, or so some of us think: that humans have always eaten it; that it is the anchor of a meal, the central dish around which other foods revolve, like courtiers around a king; that only outliers have ever refused it. But today, those imagined outliers are multiplying. The United Nations Food and Agriculture Organization reports that the consumption of beef per capita worldwide has declined for 15 years. Nearly a fourth of Americans claimed to have eaten less meat in 2019, according to a Gallup poll. The recipe site Epicurious, which reaches an audience of 10 million, phased out beef as an ingredient in new recipes in 2020. Diners at some McDonald’s can now sate their lust for a Quarter Pounder with a vegan McPlant instead. Faux meat products are projected to reach $85 billion in sales by 2030, according to a recent study by UBS, and Tyson Foods, one of the biggest beef packers in the United States, has hedged its bets by introducing its own plant-based line.

Even in the stratosphere of the world’s most expensive restaurants, where multiple-course tasting menus often rely on the opulence of a marbled steak as their denouement, a few notable exceptions have abandoned meat within the past year, including the $440-per-person Geranium in Copenhagen (still serving seafood) and the $335-per-person Eleven Madison Park in Manhattan (save for the puzzling persistence of a tenderloin on its private dining room menu through this past December). Could this be the beginning of the end of meat — or at least red meat, with its aura of dominion and glory?

Those who believe humans are born carnivores might scoff. Indeed, archaeological evidence shows that we have been carnivores for longer than we have been fully human. As the French Polish Canadian science journalist Marta Zaraska recounts in “Meathooked” (2016), two million years ago, early hominids in the African savanna were regularly butchering whatever animals they could scavenge, from hedgehogs and warthogs to giraffes, rhinos and now-extinct elephant-anteater beasts.

Yet it wasn’t necessarily human nature to do so. Meat eating was an adaptation, since, as Zaraska points out, we lack the great yawning jaws and bladelike teeth that enable true predators to kill with a bite and then tear raw flesh straight off the bone. To get at that flesh, we had to learn to make weapons and tools, which required using our brains. These in turn grew, a development that some scientists attribute to the influx of calories from animal protein, suggesting that we are who we are — the cunning, cognitively complex humans of today, with our bounty of tens of billions of cortical neurons — because we eat meat. But others credit the discovery of fire and the introduction of cooking, which made it easier and quicker for us to digest meat and plants alike and thus allowed the gastrointestinal tract to shrink, freeing up energy to fuel a bigger brain.

Whatever the cause of our heightened mental prowess, we continued eating meat and getting smarter, more adept with tools and better able to keep ourselves alive. Then, around 12,000 years ago, our hunter-gatherer ancestors started to herd animals, tend crops and build permanent settlements, or else were displaced by humans who did. Our diet changed. If we narrow our purview to more recent history, from the advent of what we call civilization in the fourth millennium B.C., the narrative of meat eating shifts.

“For nearly all of humanity’s existence, meat was not a central component of people’s diets,” the American historian Wilson J. Warren writes in “Meat Makes People Powerful” (2018). Far from being essential, for most people around the world, meat has been only occasional, even incidental, to the way we eat: craved and celebrated in certain cultures to be sure, showcased at feasts, but not counted on for daily nourishment. This was true outside of the West well into the 20th century, but even in Europe before the 19th century, the average person subsisted on grains (cakes, ale) that made up close to 80 percent of the diet. The Old English “mete” was just a general word for food.

The rich were different, of course, with the resources to dine as they pleased. And not just royals and aristocrats: In 18th-century England, as incomes rose, an ambitious middle class began to claim some of the same privileges as their supposed betters. The Finnish naturalist Pehr Kalm, in a 1748 account of a visit to London, reports, “I do not believe that any Englishman who is his own master has ever eaten a dinner without meat.” The caveat was key. Those not so fortunate as to control their own lives had to make do, as the British poor had done for centuries, with mostly gruel, perhaps enlivened by vegetables, although these were perceived, the late British urban historian Derek Keene has written, “as melancholic and terrestrial and in need of elevation by the addition of butter or oil.”

So meat was both sustenance and symbol. To eat it was to announce one’s mastery of the world. No wonder, then, that the citizens of a newborn nation, one that imagined itself fashioned on freedom and the rejection of Old World hierarchies, should embrace it. “Americans would become the world’s great meat eaters,” the former Librarian of Congress Daniel J. Boorstin writes in “The Americans: The Democratic Experience” (1973). And the meat that would come to define Americans was beef: a slab of it, dark striped from the grill but still red at the heart, lush and bleeding, leaking life. . .

Continue reading. (Gift link = no paywall)

Update: The end of beef might come suddenly for those who are bitten by a lone-star tick.

Written by Leisureguy

6 March 2022 at 5:43 am

What animals are thinking and feeling, and why it should matter

leave a comment »

Written by Leisureguy

23 February 2022 at 11:13 am

What are animals thinking and feeling?

leave a comment »

Written by Leisureguy

22 February 2022 at 12:10 pm

Genetically engineered immune cells have kept two people cancer-free for a decade

leave a comment »

A decade after two blood cancer patients received a novel type of immunotherapy involving genetically engineered immune cells called T-cells, the cells are still in the patients’ bodies and their cancer is still in remission. Illustration: a T-cell (orange) attacking a cancer cell (blue). Roger Harris/Science Photo Library/Getty Images Plus

A common characteristic of tools is that they are ethically neutral and can be used for good or for evil. Take knives: I would not be without my knives, given how frequently I chop, slice, peel, and dice vegetables, but then I read about a stabbing, in which a knife was used for evil.

So also with genetic engineering: using it to enable rice to store beta-carotene (“golden rice”) will eventually be a great benefit to millions who subsist on rice and whose vision is blighted by dietary deficiencies. But using genetic engineering simply to boost sales of toxic pesticides and pad corporate profits is the same tool twisted to an evil end.

In Science News, Erin Garcia de Jesús reports:

In 2010, two blood cancer patients received an experimental immunotherapy, and their cancers went into remission. Ten years later, the cancer-fighting immune cells used in the therapy were still around, a sign the treatment can be long-lasting, researchers report February 2 in Nature.

California resident Doug Olsen was one of the patients. “From a patient’s viewpoint, when you’re told you’re pretty much out of options, the important thing is always to maintain hope. And certainly, I hoped this was going to work,” Olsen said at a February 1 news briefing.

The treatment, known as CAR-T cell therapy, used the patients’ own genetically engineered immune cells to track down and kill cancerous cells (SN: 6/27/18). Based on the results, “we can now conclude that CAR-T cells can actually cure patients with leukemia,” cancer immunologist and study coauthor Carl June of the University of Pennsylvania said at the briefing.

Olsen and the other patient had chronic lymphocytic leukemia. Both responded well to initial treatment. But it was unclear how long the modified cells would stick around, preventing the cancer’s return.

Cancer doctors and researchers “don’t use words like ‘cure’ lightly or easily,” said oncologist and study coauthor David Porter of the University of Pennsylvania at the briefing. But with both patients remaining cancer-free for more than a decade, he said, the therapy has performed “beyond our wildest expectations.”

The biggest disappointment is . . .

Continue reading.

Written by Leisureguy

12 February 2022 at 1:03 pm

E.O. Wilson Saw the World in a Wholly New Way

leave a comment »

David Sloan Wilson — SUNY Distinguished Professor Emeritus of Biology and Anthropology at Binghamton University, president of Prosocial World (a new spinoff of The Evolution Institute), and author of Does Altruism Exist? Culture, Genes, and the Welfare of Others; This View of Life: Completing the Darwinian Revolution; and Prosocial: Using Evolutionary Science to Build Productive, Equitable, and Collaborative Groups (with Paul W.B. Atkins and Steven C. Hayes) — has a very interesting article in Nautilus. From the article:

. . . While Ed played a prominent role in modernizing whole organism biology, he was by no means alone. Evolutionary theory was proving its explanatory scope and many people were taking part in the effort, leading the geneticist Theodosius Dobzhansky to coin the oft-repeated phrase, “Nothing in biology makes sense except in the light of evolution.” What this meant to me as a graduate student was that I could choose any topic, begin asking intelligent questions based on evolutionary theory (often with the help of mathematical models), and then test my hypotheses on any appropriate organism. I didn’t need to become a taxonomic specialist and I could change topics at will. In short, I could become a polymath based on a theory that anyone can learn.

In Sociobiology, Ed claimed that evolutionary theory provides a single conceptual toolkit for studying the social behaviors of all creatures great and small. It combined the authority of an academic tome with the look and feel of a coffee table book, complete with over 200 illustrations by the artist Sarah Landry. Its publication was noted on the front page of The New York Times.

It was Sociobiology’s last chapter on human social behavior that landed Ed in trouble—in part for a good reason. For all its explanatory scope, the study of evolution was restricted to genetic evolution for most of the 20th century, as if the only way that offspring can resemble their parents is by sharing the same genes. This is patently false when stated directly, since it ignores the cultural transmission of traits entirely, but it essentially describes what became known as the Modern Synthesis. There was a large grain of truth to the critique that Sociobiology was genetically deterministic. On the other hand, it’s not as if the critics had a synthesis of their own to offer!

Only after the publication of Sociobiology did evolutionary thinkers begin to take cultural evolution seriously. Ed was among them with books such as On Human Nature, and others. Today, Darwinian evolution is widely defined as any process that combines the three ingredients of variation, selection, and replication, no matter the mechanism. This definition is true to Darwin’s thought (since he knew nothing about genes) and can accommodate a plurality of inheritance mechanisms such as epigenetics (based on changes in gene expression rather than gene frequency), forms of social learning found in many species, and forms of symbolic thought that are distinctively human.

While human cultural inheritance mechanisms evolved by genetic evolution, that doesn’t make them subordinate, as if genes—in one of Ed’s metaphors—hold cultures on a leash. On the contrary, as the faster evolutionary process, cultural evolution often takes the lead in adapting humans to their environments, with genetic evolution playing a following role (gene-culture co-evolution).

Part of the maturation of human cultural evolutionary theory is the recognition of group selection as an exceptionally strong force in human evolution — something else that Ed got right. According to Harvard evolutionary anthropologist Richard Wrangham in his book The Goodness Paradox, naked aggression is over 100 times more frequent in a chimpanzee community than in small-scale human communities. This is due largely to social-control mechanisms in human communities that suppress bullying and other forms of disruptive self-serving behavior, so that cooperation becomes the primary social strategy (this is called a major evolutionary transition).

Nearly everything distinctive about our species is a form of cooperation, including our ability to maintain an inventory of symbols with shared meaning that is transmitted across generations. Our capacity for symbolic thought became a full-blown inheritance system that operates alongside genetic inheritance (dual inheritance theory). Cultural evolution is a multilevel process, no less than genetic evolution, and the increasing scale of cooperation over the course of human history can be seen as a process of multilevel cultural evolution. . .

Read the whole thing.

Written by Leisureguy

30 January 2022 at 2:24 pm

Newsletter Natural Selection

leave a comment »

Slime Mold Time Mold has a very interesting post, which begins:

Apparently, Substack wants to destroy newspapers. And maybe that would be good — maybe it would be good for journalism to be democratized, for bloggers to inherit the earth. Of course we’re bloggers and not newspapers, so maybe we’re biased.

Obviously it would be great if someone came up with a set of blogging and newsletter tools that were just amazing, that were the clear front-runner, that outperformed every other platform. We’d love it if the technical problems were all solved and we just had a perfect set of blogging tools.

But if everyone ends up on the same platform, well, that’s kind of dangerous. If one company controls the whole news & blogging industry, they can blacklist whoever they want, and can squeeze users as much as they want.

Even if you think Substack has a good track record, there’s no way they can guarantee that they won’t squeeze their writers once they control the market. Even if you trust the current management, at some point they will all retire, or all die, or the company will be bought out, and then you’re shit outta luck.

Substack just can’t make a credible commitment that makes it impossible for them to abuse their power if they get a monopoly. You have to take them at their word. But since management can change, you can’t even really do that. They just can’t bind their hands convincingly.

But there may be some very unusual business models that would fix this problem. 

On the Origin of Substacks

Imagine there’s a “Substack” company that commits itself to breaking in half every time it gets 100,000 users (or something), creating two child companies. Each company ends up with 50,000 users. All the blogs with even-numbered IDs go to Substack A, and all the blogs with odd-numbered IDs go to Substack B. The staff gets split among these two companies, and half of them move to a new office. Both companies retain the same policy of breaking in half once they hit that milestone again — an inherited, auto-trust-busting mechanism.

(Splitting into exactly two companies wouldn’t have to be a part of the commitment. They could equally choose to break up into Substack Red, Substack Blue, and Substack Yellow: Special Pikachu Edition.)
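The split rule described above is simple enough to sketch in a few lines of code. This is purely a toy illustration of the proposed mechanism — the threshold, function name, and even/odd partition scheme are taken from the post, not from any real Substack system:

```python
# Toy sketch of the proposed "auto-trust-busting" split:
# once a platform passes a user threshold, blogs with
# even-numbered IDs go to child company A and blogs with
# odd-numbered IDs go to child company B.

SPLIT_THRESHOLD = 100_000

def split_platform(blog_ids):
    """Partition blog IDs into two child companies by ID parity,
    or return None if the platform is still below the threshold."""
    if len(blog_ids) < SPLIT_THRESHOLD:
        return None  # no split yet
    company_a = [i for i in blog_ids if i % 2 == 0]
    company_b = [i for i in blog_ids if i % 2 == 1]
    return company_a, company_b

blogs = list(range(100_000))
a, b = split_platform(blogs)
print(len(a), len(b))  # 50000 50000
```

Each child company would then inherit the same rule, so the split repeats whenever either child reaches the threshold again.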

In addition, a core part of the product would be high-quality, deeply integrated tools to switch from one of these branches to another. Probably this would involve an easy way to export all your posts and a list of your subscribers to some neutral file format (maybe a folder full of markdown, css, and csv files), and to import them from the same format into a new blog. If you end up in Substack B and you want to be in Substack A instead (your favorite developer works there or something), the product would make it very easy to switch, maybe to the point of being able to switch at the push of a button.
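The neutral file format the post imagines — a folder of Markdown posts plus a CSV of subscribers — could look something like this. The function name, file layout, and field names here are my own invention for illustration, not any real Substack export API:

```python
# Hypothetical sketch of a "neutral format" blog export:
# each post becomes a Markdown file, subscribers become a CSV.
import csv
import pathlib
import tempfile

def export_blog(posts, subscribers, out_dir):
    """Write posts (slug -> body) as .md files and subscriber
    emails as subscribers.csv into out_dir."""
    out_dir.mkdir(parents=True, exist_ok=True)
    for slug, body in posts.items():
        (out_dir / f"{slug}.md").write_text(body, encoding="utf-8")
    with open(out_dir / "subscribers.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["email"])
        writer.writerows([s] for s in subscribers)

out = pathlib.Path(tempfile.mkdtemp())
export_blog({"first-post": "# Hello\n"}, ["reader@example.com"], out)
print(sorted(p.name for p in out.iterdir()))  # ['first-post.md', 'subscribers.csv']
```

An import function on the receiving branch would just read the same folder back, which is what would make one-button switching between branches feasible.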

To help with this, the third and final commitment of the company, and all child companies, would be to . . .

Continue reading. There’s much more — and no paywall. And it’s intriguing — and something a company could easily do.

What I like is that it harnesses the power of cultural evolution in a way that supports the common welfare.

Written by Leisureguy

23 January 2022 at 5:25 pm
