Later On

A blog written for those whose interests more or less match mine.

Archive for the ‘Evolution’ Category

Mating Contests Among Females, Long Ignored, May Shape Evolution

Jake Buehler writes in Quanta:

As the midday sun hangs over the Scandinavian spruce forest, a swarm of hopeful suitors takes to the air. They are dance flies, and it is time to attract a mate. Zigzagging and twirling, the flies show off their wide, darkened wings and feathery leg scales. They inflate their abdomens like balloons, making themselves look bigger and more appealing to a potential partner.

Suddenly, the swarm electrifies with excitement at the arrival of a new fly, the one they have all been waiting for: a male. It’s time for the preening flock of females to shine.

The flies are flipping the classic drama reenacted across the animal kingdom, in which eager males with dazzling plumage, snarls of antlers or other extraordinary traits compete for a chance to woo a reluctant female. Such competitions between males for the favor of choosy females are enshrined in evolutionary theory as “sexual selection,” with the females’ choices molding the evolution of the males’ instruments of seduction over generations.

Yet it’s becoming clear that this traditional picture of sexual selection is woefully incomplete. Dramatic and obvious reversals of the selection scenario, like that of the dance flies, aren’t often observed in nature, but recent research suggests that throughout the tree of animal life, females jockey for the attention of males far more than was believed. A new study hosted on the preprint server biorxiv.org has found that in animals as diverse as sea urchins and salamanders, females are subject to sexual selection — not as harshly as males are, but enough to make biologists rethink the balance of evolutionary forces shaping species in their accounts of the history of life.

The new work turns a spotlight on a lopsidedness in sexual selection research that may have robbed evolutionary studies on about half of all animal species of important context. Scientists have reported scattered evidence of female sexual selection in the past, but more often they haven’t had reason to look for it. That could now be changing.

“We really don’t know very much compared to how much we’ve worked on the male side of things,” said Tommaso Pizzari, an evolutionary biologist at the University of Oxford who was not involved with the new paper. “Sexual selection in females is still relatively unknown. It’s still barely charted territory.”

The concept of sexual selection dates back to Charles Darwin’s first writings on natural selection — briefly mentioned in The Origin of Species, and then covered more extensively in The Descent of Man — where he detailed reproductive preferences between the sexes as potentially driving evolutionary change. Within the framework of conventional natural selection, it makes sense that individuals prefer fit mates. But a key point of sexual selection is that attractiveness to potential mates can be a criterion for selection in itself, independently of how it affects fitness otherwise. Members of one sex can develop traits and behaviors appealing to the other that directly conflict with survival-driven natural selection. Taken to extremes, this can result in the unwieldy, exceptionally elongated display feathers of some male birds, for example, which are only useful in the mating contests that the males stage.

The Victorian View of Females

Yet from its very beginning, the science focused on males as the objects of sexual selection. Darwin saw females as reluctantly picking mates from gaggles of desperate male suitors. He was open to the idea of sexual selection in either direction, but the intensity of the obvious competitions for mates among males fed the idea that sexual selection happened primarily to males; the females were prizes to be won. Females might be setting the terms of the mating competitions, but it was the males who were truly being reshaped through evolution by those choices.

Darwin’s perspective was typical of his time. Theories about sexual selection were born “in the Victorian era, when you had these certain sexual stereotypes about how women should behave,” said Rebecca Boulton, an evolutionary biologist at the University of Exeter in the U.K. “And so, because the field essentially sprung up at that time, it was like, ‘Of course females aren’t mating with multiple males. Of course they’re coy or choosy.’”

This viewpoint has contributed to a ubiquitous bias in how sexual selection has been investigated in the last century and a half, the researchers behind the new study argue. They estimate that studies of male-male competition and the phenomenon of female choice are 10 times more common than studies targeting the reverse.

“A lot of people are influenced by the culture that they live in and the things that [they] see,” said Salomé Fromonteil, a graduate student in evolutionary biology now at Uppsala University in Sweden and Ludwig Maximilian University in Munich, and lead author on the study. “It’s influenced by what we read, and what they read is that sexual selection works on males primarily.”

There are undeniable exceptions. Some that have caught researchers’ attention are in species with “sex roles” that are flipped from the conventional arrangement, as in the dance flies. Females of the American tropical wading birds called wattled jacanas (Jacana jacana) keep and defend territories rich in male mates. Among the seahorses and other pipefish, males even take on the job of “pregnancy” by internally incubating their young in a specialized pouch.

Still, scientists studying sexual selection have mostly continued to defer to Darwin’s initial observations in the 19th century. It was generally accepted that males — with their propensity for ornaments and courtship displays — experienced greater sexual selection pressures.

“Of course, that’s not how research should be,” said Tim Janicke, an evolutionary biologist at the Center for Functional and Evolutionary Ecology at the University of Montpellier in France, and senior author of the new study. “If the aim is to describe general patterns in nature, we need data-driven syntheses behind this.”

In 2016, Janicke and his team dove into the published literature measuring the strength of sexual selection acting on a variety of animal species and compared those values between the sexes. That study, published in Science Advances, confirmed that . . .

Continue reading. There’s more.

Written by Leisureguy

2 August 2021 at 2:58 pm

Foods Designed to Hijack Our Appetites

This video is quite interesting — and the function of the ileal brake is eye-opening (see this previous post for details).

Companies that manufacture food products (as opposed to whole plant foods, which are grown and harvested, not manufactured) focus on their profits and not your health. Corporations in general have little interest in consumers’ health (cf. cigarette companies).

Written by Leisureguy

26 July 2021 at 11:52 am

Genetic Memory: How We Know Things We Never Learned

Darold Treffert published an interesting article in Scientific American in January 2015. It begins:

I met my first savant 52 years ago and have been intrigued with that remarkable condition ever since. One of the most striking and consistent things in the many savants I have seen is that they clearly know things they never learned.

Leslie Lemke is a musical virtuoso even though he has never had a music lesson in his life. Like “Blind Tom” Wiggins a century before him, his musical genius erupted so early and spontaneously as an infant that it could not possibly have been learned. It came ‘factory installed’. In both cases professional musicians witnessed and confirmed that Lemke and Wiggins somehow, even in the absence of formal training, had innate access to what can be called “the rules” or vast syntax of music.

Alonzo Clemons has never had an art lesson in his life. As an infant, after a head injury, he began to sculpt with whatever was handy–Crisco or whatever–and now is a celebrated sculptor who can mold a perfect specimen of any animal with clay in an hour or less after only a single glance at the animal itself–every muscle and tendon perfectly positioned. He has had no formal training.

To explain the savant, who has innate access to the vast syntax and rules of art, mathematics, music and even language, in the absence of any formal training and in the presence of major disability, “genetic memory,” it seems to me, must exist along with the more commonly recognized cognitive/semantic and procedural/habit memory circuits.

Genetic memory, simply put, is complex abilities and actual sophisticated knowledge inherited along with other more typical and commonly accepted physical and behavioral characteristics. In savants the music, art or mathematical “chip” comes factory installed. In addition to the examples mentioned above, I describe others in my book, Islands of Genius: The Bountiful Mind of the Autistic, Acquired and Sudden Savant.

Genetic memory is not an entirely new concept. In 1940, A.A. Brill quoted Dr. William Carpenter who, in comparing math prodigy Zerah Colburn’s calculating powers to Mozart’s mastery of musical composition, wrote the following:

In each of the foregoing cases, then, we have a peculiar example of the possession of an extraordinary congenital aptitude for certain mental activity, which showed itself at so early a period as to exclude the notion that it could have been acquired by the experience of the individual. To such congenital gifts we give the name of intuitions: it can scarcely be questioned that like the instincts of the lower animals, they are the expressions of constitutional tendencies embodied in the organism of the individuals who manifest them.

Carl Jung used the term “collective unconscious” to define his even broader concept of inherited traits, intuitions and collective wisdom of the past.

Wilder Penfield in his pioneering 1978 book, Mystery of the Mind, also referred to three types of memory. “Animals,” he wrote, “particularly show evidence of what might be called racial memory” (this would be the equivalent of genetic memory). He lists the second type of memory as that associated with “conditioned reflexes” and a third type as “experiential”. The two latter types would be consistent with the terminology commonly applied to “habit or procedural” memory and “cognitive or semantic” memory.

In his 1998 book, The Mind’s Past, Michael Gazzaniga wrote:

The baby does not learn trigonometry, but knows it; does not learn how to distinguish figure from ground, but knows it; does not need to learn, but knows, that when one object with mass hits another, it will move the object … The vast human cerebral cortex is chock full of specialized systems ready, willing and able to be used for specific tasks. Moreover, the brain is built under tight genetic control … As soon as the brain is built, it starts to express what it knows, what it comes with from the factory. And the brain comes loaded. The number of special devices that are in place and active is staggering. Everything from perceptual phenomena to intuitive physics to social exchange rules comes with the brain. These things are not learned; they are innately structured. Each device solves a different problem … the multitude of devices we have for doing what we do are factory installed; by the time we know about an action, the devices have already performed it.

Steven Pinker’s 2003 book, The Blank Slate: The Modern Denial of Human Nature, refutes the “blank slate” theories of human development. Brian Butterworth, in his 1999 book, What Counts: How Every Brain is Hardwired for Math, points out that babies have many specialized innate abilities, including numerical ones that he attributes to a “number module” encoded in the human genome from ancestors 30,000 years ago.

Marshall Nirenberg, from the National Heart Institute, provided insight into the actual DNA/RNA mechanics of this innate knowledge in an article titled “Genetic Memory” published in 1968 in JAMA.

Whether called genetic, ancestral or racial memory, or intuitions or congenital gifts, the concept of a genetic transmission of sophisticated knowledge well beyond instincts, is necessary to explain how prodigious savants can know things they never learned.

We tend to think of ourselves as being born with a magnificent and intricate piece of organic machinery (“hardware”) we call the brain, along with a massive but blank hard drive (memory). What we become, it is commonly believed, is an accumulation and culmination of our continuous learning and life experiences, which are added one by one to memory. But the prodigious savant apparently comes already programmed with a vast amount of innate skill and knowledge in his or her area of expertise–factory-installed “software” one might say–which accounts for the extraordinary abilities over which the savant innately shows mastery in the face of often massive cognitive and other learning handicaps. It is an area of memory function worthy of much more exploration and study.

Indeed recent cases of “acquired savants” or “accidental genius” have convinced me that we all have such factory-installed software. I discussed some of those cases in detail in . . .

Continue reading.

Written by Leisureguy

22 July 2021 at 11:27 am

Self-Medicating Chimps, Pugilistic Shrimp, and Other Remarkable Animals: An Illustrated Guide

MIT Press Reader has an extract from Emmanuelle Pouydebat’s new book. It’s worth clicking the link for the images. Here’s some of the text:

“I feel that there’s nothing more important than to pass on, to my son, the little piece of nothing and everything that I’ve observed — the happiness that comes from watching a dragonfly, spider, frog, lizard, elephant, parrot, mouse, orangutan, or ladybug,” Emmanuelle Pouydebat writes in her new book, “Atlas of Poetic Zoology.” “Each individual creature enriches my own existence boundlessly.”

For Pouydebat, a researcher at the French Museum of Natural History, animals are lyric poets; they discover and shape the world when they sing, dance, explore and reproduce. They are also highly adaptive, having weathered many crises of extinction over millions of years. Her book, a lively and idiosyncratic collection of meditations on 36 extraordinary creatures, invites readers to draw inspiration from their enduring vitality.

In the excerpts featured below, accompanied by striking illustrations by artist Julie Terrazzoni, Pouydebat guides readers through just a fraction of the natural world — one occupied by flightless birds, soaring turtles, self-medicating chimpanzees, pugilistic shrimp, and venomous octopuses.

These great apes are primates, like us. Sure, they’re a little hairier, but they’re hominids. Like us, these creatures are amazing. Like us? It’s a risky comparison. In many respects, chimpanzees are more accomplished. One talent they possess would benefit us, too: they know how to take care of themselves. Since the 1970s, researchers have known that chimpanzees — especially those in Tanzania or Uganda — use medicinal plants. They consume fruits with antimicrobial properties; sometimes they combine them with other substances to reduce the toxicity. Other chimpanzees eat flowers with antibiotic properties or leaves with antiparasitic ones, which act as laxatives or even induce uterine contractions. Chimpanzees also tear off bark and lick the resin to kill internal worms; the compounds, tests in vitro have shown, slow the growth of cancerous cells.

Significantly, practices of self-medication vary between chimpanzee populations. When chimpanzees feel sick, they seek out a particular tree and ingest a few leaves. The bitter leaves contain molecules that are quite effective against plasmodium parasites, which cause malaria. But chimpanzees also consume about ten other kinds of plant to combat these organisms. Thus, in contrast to human beings (who use a small number of substances for warding off malaria), chimpanzees diversify their medical arsenal. What’s more, when making their bed for the night, chimpanzees in Uganda do so in areas where there are fewer mosquitos. Do they choose plants based on their potential for repelling pests, or is softness — and resulting comfort — decisive? We’ll see.

For some time now, chimpanzees’ pharmacopoeia has been the object of study. Indeed, research by Jane Goodall in the 1960s even prompted scientists to reexamine traits thought to be exclusively human. In fact, chimpanzees use an array of tools for different purposes: branches for digging out termites, honey, or marrow; sticks and stones for cracking nuts; and sharpened pieces of wood for spearing galagos (bushbabies); they even make “shoes” to protect their feet when climbing thorny trunks. Using these tools can be complex and require training; some mothers actively show their young the right way. Pedagogy, then, is a practice we share with chimpanzees. Another exciting finding: techniques differ from one population to the next (Uganda, Ivory Coast, Guinea, etc.). Many writers on the subject don’t hesitate to speak of “traditions” and “cultures.” This observation raises another set of questions. Do chimpanzees invent? In Tai National Park, Ivory Coast, generations of chimpanzees were known to use branches to break extremely hard nuts (dura laboriosa). One day, a female member of the group, Eureka, employed a stone for the same purpose and continued to do so in the presence of her companions. And then? Other chimpanzees started doing the same. After a few generations, the entire population had switched tools for cracking nuts, from sticks to stones.

Chimpanzees devise tools, but they’re even better at something else: memorization. The same chimpanzees in Ivory Coast have a geometrical understanding of their territory, which spans twenty-five square kilometers; they move from one spot to another in straight lines, more or less. Even with a limited range of vision — thirty meters, at most — they know where to go to find ripe fruit and avoid danger (including rival chimpanzees!). By remembering topographical features of the landscape and picturing abstract space, they can calculate distance and direction, no matter where they are.

And that’s not the only proof of chimpanzees’ cognitive abilities. In a computer-based test of spatial memory, researchers compared young chimpanzees and university students. The experiment involved clicking numbers one by one, in the right order. At a later stage, the task became more complicated: as soon as subjects clicked the first number, a white square blanked out the other ones; the point was still to press the right series, in order to receive a reward. And the results? The chimpanzees pulled it off 80 percent of the time — that is, twice as often as the students. From an early age, these animals demonstrate highly developed visual memory; it’s almost photographic. This is what enables them to memorize where the best fruit is growing and determine the best path to take — when they don’t prefer to break our cameras instead, or throw all kinds of stuff at us.


Allow me to introduce another evolutionary and adaptive marvel: the parrot that doesn’t fly. The kakapo — which means “night parrot” in Maori — is the heaviest parrot in the world; it can weigh up to 4 kilos and has short wings and feathers that keep it grounded. But birds haven’t always flown. In all likelihood, feathers didn’t develop in order to enable flight so much as to facilitate individual distinctness and communication. In this regard, the kakapo isn’t an anomaly; it’s a living reminder of extinct birds that never flew in the first place.

The kakapo doesn’t fly; it walks. And thanks to its steely claws, it has no difficulty climbing trees. Yet survival proves difficult, despite a lifespan of ninety years and great skill in the art of seduction. What’s this skill? Males make a real impression by “booming.” Picture the kakapo digging a hole in the ground, inflating a sac in its chest, flapping its wings, and bringing forth inimitable screams — plus a mighty, exploding sound to attract the ladies…. It works! The basin Mr. Kakapo has made amplifies the noise. To this end, he has removed any twigs that might get in the way and carefully chosen a resonant location — for instance, a spot between rock walls or tree trunks. For about eight hours, he makes one “boom” after another night after night, for three or four months straight. In the process, he can lose up to half his body weight. Depending on the winds, Madame Kakapo will hear these booms as far as five kilometers away; wherever she may be, the sound reaches her, for her beau makes sure to boom in every direction. Females journey to meet their suitors who will fight to the death to parade around, click their beaks, and brandish outstretched wings.

Unfortunately, the heartwarming part of the spectacle stops there. Booming attracts predators, too. Plus,  . . .

Continue reading. There’s much more.

Written by Leisureguy

15 July 2021 at 1:50 pm

Weird lifeform: Glacier ice worms

Scientists aren’t sure why the segmented worms, each less than an inch long, wriggle to the surface of the glacier late in the day, though they think it may be to feed or to soak up the sun’s rays. – Scott Hotaling

Life finds unexpected niches. Nell Greenfieldboyce reports at NPR:

High up on Mount Rainier in Washington, there’s a stunning view of the other white-capped peaks in the Cascade Range. But Scott Hotaling is looking down toward his feet, studying the snow-covered ground.

“It’s happening,” he says, gesturing across Paradise Glacier.

Small black flecks suddenly appear on the previously blank expanse of white. The glacier’s surface quickly transforms as more and more tiny black creatures emerge. The ice worms have returned, snaking in between ice crystals and shimmering in the sun.

These thread-like worms, each only about an inch long, wiggle up en masse in the summertime, late in the afternoon, to do — what? Scientists don’t know. It’s just one of many mysteries about these worms, which have barely been studied, even though they’re the most abundant critter living up there in the snow and ice.

Billions and billions of inch-long black creatures

“There are so many,” says Hotaling, a researcher at Washington State University. An estimated 5 billion ice worms can live in a single glacier.

“From where we’re standing right now, I can see, five, six, 10 glaciers,” he says. “And if every one hosts that density of ice worms? That is just a massive amount of biomass in a place that is generally biomass-poor.”

For a long time, he says, biologists have written off high-altitude glaciers such as these as basically sterile, lifeless places. Ice worms, however, show that this fragile environment — where the glaciers are vulnerable to climate change and are retreating — is potentially far more complicated.

“If you were going to put a biological mascot on glaciers of the Northwest,” Hotaling says, “it’s an ice worm.”

And yet, with the possible exception of the annual Cordova Iceworm Festival in Alaska, these bizarre worms have generally been either ignored or treated as a mere curiosity.

The National Park Service’s visitors center near Paradise Glacier, for example, has a nice display on alpine wildlife, Hotaling says, “and there is somehow nothing about ice worms. And it is a source of frustration for me.”

He admits that it bothers “probably no one else that comes here.” Many people who hike, ski or work on these mountains have never seen an ice worm despite their abundance, partly because the beasts only come to the surface at certain times of the year, at certain times of day.

Continue reading. There’s quite a bit more and the worms are weird: they can withstand very high levels of ultraviolet light, but they die if they freeze. (They should have thought of that before making their home in a glacier.)

Written by Leisureguy

13 July 2021 at 10:27 am

Why the Delta variant is so contagious

The numerals in this illustration show the main mutation sites of the delta variant of the coronavirus, which is likely the most contagious version. Here, the virus’s spike protein (red) binds to a receptor on a human cell (blue). – Juan Gaertner/Science Source

Michaeleen Doucleff reports at NPR:

After months of data collection, scientists agree: The delta variant is the most contagious version of the coronavirus worldwide. It spreads about 225% faster than the original version of the virus, and it’s currently dominating the outbreak in the United States.

A new study, published online Wednesday, sheds light on why. It finds that the variant grows more rapidly inside people’s respiratory tracts and to much higher levels, researchers at the Guangdong Provincial Center for Disease Control and Prevention reported.

On average, people infected with the delta variant had about 1,000 times more copies of the virus in their respiratory tracts than those infected with the original strain of the coronavirus, the study reported.

In addition, after someone catches the delta variant, the person likely becomes infectious sooner. On average, it took about four days for the delta variant to reach detectable levels inside a person, compared with six days for the original coronavirus variant.

In the study, scientists analyzed . . .

Continue reading.

Written by Leisureguy

11 July 2021 at 7:37 am

Evolution finds amazing solutions: Moustached bat example

Written by Leisureguy

6 July 2021 at 3:09 pm

Posted in Evolution, Science

“Finding the Mother Tree”: Discovering forests’ hidden networks

In Science News Cori Vanchieri reviews Suzanne Simard’s book:

Finding the Mother Tree
Suzanne Simard
Knopf, $28.95

Opening Suzanne Simard’s new book, Finding the Mother Tree, I expected to learn about the old growth forests of the Pacific Northwest. I had an inkling that Simard, a forest ecologist at the University of British Columbia in Vancouver, would walk through her painstaking research to convince logging companies and others that clear-cutting large parcels of land is too damaging for forests to recover. I didn’t expect to be carried along on her very relatable journey through life.

Simard was born in the Monashee Mountains of British Columbia in 1960. Her family of loggers selectively cut trees and dragged them out with horses, leaving plenty still standing. In her first stab at a career, she joined a commercial logging company that clear-cut with large machinery. Her job was to check on seedlings the firm had planted in those areas to restart the forest. The fledgling plants were often yellowed and failing. Simard’s instincts told her those trees were missing the resources that exist within a diverse community of plants, so she set out to see if her hunch was right.

She learned how to do experiments, with close calls with grizzly bears and other mishaps along the way, eventually becoming a tenured professor. She and colleagues discovered that underground networks of fungi among tree roots shuttle carbon and nutrients from tree to tree (SN: 8/9/97, p. 87). Simard seamlessly weaves details of her studies of these networks with her life’s travails: sibling relationships and loss, struggles as a woman in a male-dominated field and her own recovery from a health crisis. Like many women who work outside the home, she felt torn between being with her young daughters and pursuing her professional passions.

Readers will feel for Simard as much as they worry for the forests that are quickly disappearing. Simard presents plenty of evidence and writes enthusiastically to build her analogy of the “mother trees” — the biggest, oldest trees in a forest that nurture those nearby. In her experiments, seedlings planted near a mother tree were much more likely to survive.

“Trees and plants have agency,” she writes. “They cooperate, make decisions, learn and remember — qualities we normally ascribe to sentience, wisdom, intelligence.” Simard encourages logging companies to . . .

Continue reading.

Written by Leisureguy

29 June 2021 at 3:53 pm

This Weirdly Smart, Creeping Slime Is Redefining Our Understanding of Intelligence

Michelle Starr writes at ScienceAlert:

Imagine you’re walking into a forest, and you roll over a fallen log with your foot. Fanning out on the underside, there is something moist and yellow – a bit like something you may have sneezed out, if that something was banana-yellow and spread itself out into elegant fractal branches.

What you’re looking at is the plasmodium form of Physarum polycephalum, the many-headed slime mold. Like other slime molds found in nature, it fills an important ecological role, aiding in the decay of organic matter to recycle it into the food web.

This bizarre little organism doesn’t have a brain, or a nervous system – its blobby, bright-yellow body is just one cell. This slime mold species has thrived, more or less unchanged, for a billion years in its damp, decaying habitats.

And, in the last decade, it’s been changing how we think about cognition and problem-solving.

“I think it’s the same kind of revolution that occurred when people realized that plants could communicate with each other,” says biologist Audrey Dussutour of the French National Center for Scientific Research.

“Even these tiny little microbes can learn. It gives you a bit of humility.”

P. polycephalum – adorably nicknamed “The Blob” by Dussutour – isn’t exactly rare. It can be found in dark, humid, cool environments like the leaf litter on a forest floor. It’s also really peculiar; although we call it a ‘mold’, it is not actually fungus. Nor is it animal or plant, but a member of the protist kingdom – a sort of catch-all group for anything that can’t be neatly categorized in the other three kingdoms.

It starts its life as many individual cells, each with a single nucleus. Then, they merge to form the plasmodium, the vegetative life stage in which the organism feeds and grows.

In this form, fanning out in veins to search for food and explore its environment, it’s still a single cell, but containing millions or even billions of nuclei swimming in the cytoplasmic fluid confined within the bright-yellow membrane.

Cognition without a brain

Like all organisms, P. polycephalum needs to be able to make decisions about its environment. It needs to seek food and avoid danger. It needs to find the ideal conditions for its reproductive cycle. And this is where our little yellow friend gets really interesting. P. polycephalum doesn’t have a central nervous system. It doesn’t even have specialized tissues.

Yet it can solve complex puzzles, like labyrinth mazes, and remember novel substances. The kind of tasks we used to think only animals could perform.

“We’re talking about cognition without a brain, obviously, but also without any neurons at all. So the underlying mechanisms, the whole architectural framework of how it deals with information is totally different to the way your brain works,” biologist Chris Reid of Macquarie University in Australia tells ScienceAlert.

“By providing it with the same problem-solving challenges that we’ve traditionally given to animals with brains, we can start to see how this fundamentally different system might arrive at the same outcome. It’s where . . .

Continue reading. There’s much more.

Later in the article:

Although it’s technically a single-celled organism, P. polycephalum is considered a network, exhibiting collective behavior. Each part of the slime mold is operating independently and sharing information with its neighboring sections, with no centralized processing.

“I guess the analogy would be neurons in a brain,” Reid says. “You have this one brain that’s composed of lots of neurons – it’s the same for the slime mold.”

Written by Leisureguy

16 June 2021 at 1:48 pm

A deep look at a speck of human brain reveals never-before-seen quirks

Nerve cells that resided in a woman’s brain send out message-sending tendrils called axons (shown). A preliminary analysis has turned up some super-strong connections between cells. – Lichtman Lab/Harvard University, Connectomics Team/Google

Laura Sanders writes in Science News:

A new view of the human brain shows its cellular residents in all their wild and weird glory. The map, drawn from a tiny piece of a woman’s brain, charts the varied shapes of 50,000 cells and 130 million connections between them.

This intricate map, named H01 for “human sample 1,” represents a milestone in scientists’ quest to provide evermore detailed descriptions of a brain (SN: 2/7/14).

“It’s absolutely beautiful,” says neuroscientist Clay Reid at the Allen Institute for Brain Science in Seattle. “In the best possible way, it’s the beginning of something very exciting.”

Scientists at Harvard University, Google and elsewhere prepared and analyzed the brain tissue sample. Smaller than a sesame seed, the bit of brain was about a millionth of an entire brain’s volume. It came from the cortex — the brain’s outer layer responsible for complex thought — of a 45-year-old woman undergoing surgery for epilepsy. After it was removed, the brain sample was quickly preserved and stained with heavy metals that revealed cellular structures. The sample was then sliced into more than 5,000 wafer-thin pieces and imaged with powerful electron microscopes.

Computational programs stitched the resulting images back together and artificial intelligence programs helped scientists analyze them. A short description of the resulting view was published as a preprint May 30 to bioRxiv.org. The full dataset is freely available online.

For now, researchers are just beginning to see what’s there. “We have really just dipped our toe into this dataset,” says study coauthor Jeff Lichtman, a developmental neurobiologist at Harvard University. Lichtman compares the brain map to Google Earth: “There are gems in there to find, but no one can say they’ve looked at the whole thing.”

But already, some “fantastically interesting” sights have appeared, Lichtman says. “When you have large datasets, suddenly these odd things, these weird things, these rare things start to stand out.”

One such curiosity concerns . . .

Continue reading. There’s much more.

Written by Leisureguy

13 June 2021 at 12:12 pm

Posted in Evolution, Science

These ferns may be the first plants known to share work like ants

Many of this fern colony’s fan-shaped nest fronds (growing closer to the tree trunk) are sterile, while the thinner strap fronds (sticking up and out from between the nest fronds) lift more of the reproductive load for the colony. – Ian Hutton

Jake Buehler writes in Science News:

High in the forest canopy, a mass of strange ferns grips a tree trunk, looking like a giant tangle of floppy, viridescent antlers. Below these fork-leaved fronds and closer into the core of the lush knot are brown, disk-shaped plants. These, too, are ferns of the very same species.

The ferns — and possibly similar plants — may form a type of complex, interdependent society previously considered limited to animals like ants and termites, researchers report online May 14 in Ecology.

Kevin Burns, a biologist at Victoria University of Wellington in New Zealand, first became familiar with the ferns while conducting fieldwork on Lord Howe Island, an isolated island between Australia and New Zealand. He happened to take note of the local epiphytes — plants that grow upon other plants — and one species particularly caught his attention: the staghorn fern (Platycerium bifurcatum), also native to parts of mainland Australia and Indonesia.

“I realized, God, you know, they never occur alone,” says Burns, noting that some of the larger clusters of ferns were massive clumps made of hundreds of individuals. 

It was soon clear to Burns that “each one of those individuals was doing a different thing.”

He likens the fern colonies to an upside-down umbrella made of plants. Ferns with long, green, waxy “strap” fronds appeared to deflect water to the center of the aggregation, where disk-shaped, brown, spongey “nest” fronds could soak it up.

The shrubby apparatus reminded Burns of a termite mound, with a communal store of resources and the segregation of different jobs in the colony. Scientists call these types of cooperative groups, where overlapping generations live together and form castes to divide labor and reproductive roles, “eusocial.” The term has been used to describe certain insect and crustacean societies, along with two mole rat species as the only mammalian examples (SN: 10/18/04). Burns wondered if the ferns could also be eusocial.

His team’s analysis of frond fertility revealed 40 percent couldn’t reproduce, and the sterile colony members were predominantly nest fronds. This suggests a reproductive division of labor between the nest and strap frond types. Tests of the fronds’ absorbency confirmed that nest fronds sop up more water than strap fronds do. Previous research by other scientists found networks of roots running throughout the colony, which means that nest fronds have the ability to slake strap fronds’ thirst. The fronds divided labor, much like ants and termites.

The team also analyzed genetic samples from 10 colonies on Lord Howe Island and found that eight were composed of genetically identical individuals, while two contained ferns of differing genetic origins. High degrees of genetic relatedness are also seen in colonies of eusocial insects, where many sisters contribute to the survival of the nest.

Taken together, Burns thinks these traits tick many of the boxes for eusociality. That would be a “big deal,” he says.

An assumed requirement for eusocial colonial living is behavioral coordination, because it allows different individuals to work together. But ferns are plants, not animals, which so often coordinate their behaviors. Seeing eusocial living in plants “seems to indicate to me that this type of transition in the evolution of complexity doesn’t require a brain,” Burns says.

The study opens up the . . .

Continue reading. There’s more. Evolution arrives at amazing solutions.

It occurs to me that some meme clusters show eusociality.

Written by Leisureguy

13 June 2021 at 12:04 pm

Evolution unleashed: Revolution in the making?

Kevin Laland, professor of behavioural and evolutionary biology at the University of St Andrews in Scotland, an elected fellow of the Royal Society of Edinburgh, a fellow of the Society of Biology, and co-author (with Tobias Uller) of Evolutionary Causation: Biological and Philosophical Reflections (2019), writes in Aeon:

When researchers at Emory University in Atlanta trained mice to fear the smell of almonds (by pairing it with electric shocks), they found, to their consternation, that both the children and grandchildren of these mice were spontaneously afraid of the same smell. That is not supposed to happen. Generations of schoolchildren have been taught that the inheritance of acquired characteristics is impossible. A mouse should not be born with something its parents have learned during their lifetimes, any more than a mouse that loses its tail in an accident should give birth to tailless mice.

If you are not a biologist, you’d be forgiven for being confused about the state of evolutionary science. Modern evolutionary biology dates back to a synthesis that emerged around the 1940s-60s, which married Charles Darwin’s mechanism of natural selection with Gregor Mendel’s discoveries of how genes are inherited. The traditional, and still dominant, view is that adaptations – from the human brain to the peacock’s tail – are fully and satisfactorily explained by natural selection (and subsequent inheritance). Yet as novel ideas flood in from genomics, epigenetics and developmental biology, most evolutionists agree that their field is in flux. Much of the data implies that evolution is more complex than we once assumed.

Some evolutionary biologists, myself included, are calling for a broader characterisation of evolutionary theory, known as the extended evolutionary synthesis (EES). A central issue is whether what happens to organisms during their lifetime – their development – can play important and previously unanticipated roles in evolution. The orthodox view has been that developmental processes are largely irrelevant to evolution, but the EES views them as pivotal. Protagonists with authoritative credentials square up on both sides of this debate, with big-shot professors at Ivy League universities and members of national academies going head-to-head over the mechanisms of evolution. Some people are even starting to wonder if a revolution is on the cards.

In his book On Human Nature (1978), the evolutionary biologist Edward O Wilson claimed that human culture is held on a genetic leash. The metaphor was contentious for two reasons. First, as we’ll see, it’s no less true that culture holds genes on a leash. Second, while there must be a genetic propensity for cultural learning, few cultural differences can be explained by underlying genetic differences.

Nonetheless, the phrase has explanatory potential. Imagine a dog-walker (the genes) struggling to retain control of a brawny mastiff (human culture). The pair’s trajectory (the pathway of evolution) reflects the outcome of the struggle. Now imagine the same dog-walker struggling with multiple dogs, on leashes of varied lengths, with each dog tugging in different directions. All these tugs represent the influence of developmental factors, including epigenetics, antibodies and hormones passed on by parents, as well as the ecological legacies and culture they bequeath.

The struggling dog-walker is a good metaphor for how EES views the adaptive process. Does this require a revolution in evolution? Before we can answer this question, we need to examine how science works. The best authorities here are not biologists but philosophers and historians of science. Thomas Kuhn’s book The Structure of Scientific Revolutions (1962) popularised the idea that sciences change through revolutions in understanding. These ‘paradigm shifts’ were thought to follow a crisis of confidence in the old theory that arose through the accumulation of conflicting data.

Then there’s Karl Popper, and his conjecture that scientific theories can’t be proven but can be falsified. Consider the hypothesis: ‘All sheep are white.’ Popper maintained that no amount of positive findings consistent with this hypothesis could prove it to be correct, since one could never rule out the chance that a conflicting data-point might arise in the future; conversely, the observation of a single black sheep would decisively prove the hypothesis to be false. He maintained that scientists should strive to carry out critical experiments that could potentially falsify their theories.

While Kuhn and Popper’s ideas are well-known, they remain disputed and contentious in the eyes of philosophers and historians. Contemporary thinking in these fields is better captured by the Hungarian philosopher Imre Lakatos in The Methodology of Scientific Research Programmes (1978):

The history of science refutes both Popper and Kuhn: on close inspection both Popperian crucial experiments and Kuhnian revolutions turn out to be myths.

Popper’s arguments might make logical sense, but they don’t quite map on to how science works in the real world. Scientific observations are susceptible to errors of measurement; scientists are human beings and get attached to their theories; and scientific ideas can be fiendishly complex – all of which makes evaluating scientific hypotheses a messy business. Rather than accepting that our hypotheses might be wrong, we challenge the methodology (‘That sheep’s not black – your instruments are faulty’), dispute the interpretation (‘The sheep’s just dirty’), or come up with tweaks to our hypotheses (‘I meant domesticated breeds, not wild mouflon’). Lakatos called such fixes and fudges ‘auxiliary hypotheses’; scientists propose them to ‘protect’ their core ideas, so that they need not be rejected.

This sort of behaviour is clearly manifest in scientific debates over evolution. Take the idea that new features acquired by an organism during its life can be passed on to the next generation. This hypothesis was brought to prominence in the early 1800s by the French biologist Jean-Baptiste Lamarck, who used it to explain how species evolved. However, it has long been regarded as discredited by experiment – to the point that the term ‘Lamarckian’ has a derogatory connotation in evolutionary circles, and any researchers expressing sympathy for the idea effectively brand themselves ‘eccentric’. The received wisdom is that parental experiences can’t affect the characters of their offspring.

Except they do. The way that genes are expressed to produce an organism’s phenotype – the actual characteristics it ends up with – is affected by chemicals that attach to them. Everything from diet to air pollution to parental behaviour can influence the addition or removal of these chemical marks, which switches genes on or off. Usually these so-called ‘epigenetic’ attachments are removed during the production of sperm and egg cells, but it turns out that some escape the resetting process and are passed on to the next generation, along with the genes. This is known as ‘epigenetic inheritance’, and more and more studies are confirming that it really happens.

Let’s return to the almond-fearing mice. The inheritance of . . .

Continue reading. There’s much more.

Written by Leisureguy

5 June 2021 at 10:22 am

Black cumin’s health benefits and how I use it

Some time back I learned about black cumin, which (as studies have confirmed) has a variety of nutritional benefits, some of particular interest to diabetics and others more generally useful. Evolution has made seeds generally difficult to digest: the fruit acts as bait for an animal, and the seeds pass through its digestive tract unharmed and later sprout where deposited, generally far from the parent plant. Thus the animal unwittingly assists the plant by ensuring wider propagation (plants in general being deficient in mobility, thus the observation that the apple falls not far from the tree). Other seeds reflect other strategies that don’t depend on being consumed. Dandelion, milk thistle, and maple seeds take flight and are dispersed by the wind — they have no fruit because their propagation doesn’t require it. Cockleburs, burdock, and other seeds hitch a ride by sticking to the fur (or clothing) of passing animals.

To be actually digested, seeds in general must be cracked or ground, so (for example) I grind the flaxseed and peppercorns that I eat. Some seeds — sesame seeds, for example — can be ground by chewing; other seeds — cumin, for example — are generally sold already ground.

I tried grinding black cumin in the whirling-blade grinder, which works well for flaxseed, but it seemed that black cumin seeds are too small and tough for that to work. So I bought a pepper mill and filled it with black cumin. You need only a little black cumin to gain its benefits, and this seemed like a good approach. (At the bottom of the mill you see some peppercorns that came with it. I’ll soon work through those and reach the black cumin.)

The three-minute video below explains some of the benefits (and see also this video for how it helps with Hashimoto’s disease, an autoimmune disease that affects the thyroid).

Written by Leisureguy

4 June 2021 at 10:08 am

The cultural iceberg

The image below presents various memes, memes being the atoms from which human culture evolves. A meme is anything you can learn from another person or teach to another person. Everything listed as part of the iceberg meets that definition.

Once a meme is born (by something being taught/learned), it evolves (and quite rapidly — millions of times faster than lifeforms evolve). Meme evolution follows Darwinian logic: just like lifeforms, memes reproduce, and the new generation resembles the previous one with occasional variations (when one person learns from another, the meme from the first is reproduced in the second, who may not do it quite the same way). So while reproduction does result in the reproduced memes being like the original, some variation occurs — and variations that have a survival advantage (e.g., a better or easier way of doing something) tend to replace the original.

Moreover, mutations can arise: a person creates a new song (not yet a meme, since it was neither learned from another nor taught to another), and when they teach the song to others, it becomes a meme. If the song becomes popular, it is learned by many, and the meme thrives and grows.

Not all memes thrive. Human attention is limited, so a kind of natural selection operates on memes. While some memes are widely reproduced (look at how many people speak English), others occupy a small niche (many fewer speak Esperanto) and some go extinct (dialects of ancient Sumerian that today are unknown). The result is the ongoing process of evolution.
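
Those dynamics (copying with variation, plus selection under limited attention) are easy to see in a toy simulation. The sketch below is purely illustrative: the single “appeal” number standing in for a meme, and every parameter value, are invented stand-ins for the example, not measured properties of real memes.

```python
import random

def evolve_memes(generations=50, attention_limit=200, mutation_sd=0.05, seed=1):
    """Toy Darwinian dynamics for memes (illustrative only).

    Each meme is reduced to one number, its "appeal": how readily it is
    learned and passed on. Every generation:
      1. Reproduction with variation: people copy memes in proportion to
         appeal, and each copy differs slightly from its original.
      2. Selection: attention is limited, so only `attention_limit` copies
         survive; the rest go extinct.
    """
    random.seed(seed)
    population = [random.uniform(0.1, 1.0) for _ in range(attention_limit)]
    for _ in range(generations):
        # Imperfect copying: more appealing memes are copied more often.
        copies = random.choices(population, weights=population,
                                k=attention_limit * 2)
        copies = [max(0.01, c + random.gauss(0, mutation_sd)) for c in copies]
        # Limited attention: only the most appealing copies are retained.
        population = sorted(copies, reverse=True)[:attention_limit]
    return sum(population) / len(population)

if __name__ == "__main__":
    print(f"Mean appeal after 50 generations: {evolve_memes():.2f}")
    print(f"Mean appeal after 1 generation: {evolve_memes(generations=1):.2f}")
```

Run it and the mean appeal creeps upward generation after generation, which is all “evolution” means here: imperfect copying plus differential retention, with no designer anywhere in the loop.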

The image includes various things we learn from others; sometimes the meme is not formally taught but is absorbed through osmosis, by simply living in the culture into which one was born. Nevertheless, these are all learned behaviors and attitudes, as shown by how they differ from culture to culture: that variance shows the behaviors and attitudes are not inborn but taught/learned.

To a great degree the memes we host control and direct our actions — we make decisions and live our lives mostly under the sway of the memes we’ve collected (or — another way of looking at it — with which we’ve been infected).

Written by Leisureguy

3 June 2021 at 11:08 am

America Has a Drinking Problem

I have gradually come to recognize that alcohol undermines constancy of purpose. A recovering alcoholic warned me when I was still in college, “Alcohol is sneaky.” He meant that you can think things are going well when they are not, and if alcohol is part of one’s daily diet, I would say that person is at serious risk. In recent years my consumption of alcohol has been minimal. I am not a teetotaler, but I drink very little and most weeks not at all.

Kate Julian writes in the Atlantic:

Few things are more American than drinking heavily. But worrying about how heavily other Americans are drinking is one of them.

The Mayflower landed at Plymouth Rock because, the crew feared, the Pilgrims were going through the beer too quickly. The ship had been headed for the mouth of the Hudson River, until its sailors (who, like most Europeans of that time, preferred beer to water) panicked at the possibility of running out before they got home, and threatened mutiny. And so the Pilgrims were kicked ashore, short of their intended destination and beerless. William Bradford complained bitterly about the latter in his diary that winter, which is really saying something when you consider what trouble the group was in. (Barely half would survive until spring.) Before long, they were not only making their own beer but also importing wine and liquor. Still, within a couple of generations, Puritans like Cotton Mather were warning that a “flood of RUM” could “overwhelm all good Order among us.”

George Washington first won elected office, in 1758, by getting voters soused. (He is said to have given them 144 gallons of alcohol, enough to win him 307 votes and a seat in Virginia’s House of Burgesses.) During the Revolutionary War, he used the same tactic to keep troops happy, and he later became one of the country’s leading whiskey distillers. But he nonetheless took to moralizing when it came to other people’s drinking, which in 1789 he called “the ruin of half the workmen in this Country.”

Hypocritical though he was, Washington had a point. The new country was on a bender, and its drinking would only increase in the years that followed. By 1830, the average American adult was consuming about three times the amount we drink today. An obsession with alcohol’s harms understandably followed, starting the country on the long road to Prohibition.

[Hypocrisy is a serious accusation that should not be lightly made. If an automobile manufacturer — or a typical driver — made a statement opposing speeding or reckless driving, I would not see that as hypocrisy. For a brewer or distiller to state that drinking excessively is bad does not seem hypocritical to me, any more than a restaurateur or grocer stating that gluttony is bad. It seems to me that the author did not think through that accusation. – LG  Postscript: It occurs to me that perhaps people nowadays do not understand how bad hypocrisy is. Perhaps the term has weakened through being used too frequently and/or inappropriately. But hypocrisy is a serious failing indeed, and a hypocrite weakens the social fabric through a basic dishonesty.]

What’s distinctly American about this story is not alcohol’s prominent place in our history (that’s true of many societies), but the zeal with which we’ve swung between extremes. Americans tend to drink in more dysfunctional ways than people in other societies, only to become judgmental about nearly any drinking at all. Again and again, an era of overindulgence begets an era of renunciation: Binge, abstain. Binge, abstain.

Right now we are lurching into another of our periodic crises over drinking, and both tendencies are on display at once. Since the turn of the millennium, alcohol consumption has risen steadily, in a reversal of its long decline throughout the 1980s and ’90s. Before the pandemic, some aspects of this shift seemed sort of fun, as long as you didn’t think about them too hard. In the 20th century, you might have been able to buy wine at the supermarket, but you couldn’t drink it in the supermarket. Now some grocery stores have wine bars, beer on tap, signs inviting you to “shop ’n’ sip,” and carts with cup holders.

Actual bars have decreased in number, but drinking is acceptable in all sorts of other places it didn’t used to be: Salons and boutiques dole out cheap cava in plastic cups. Movie theaters serve alcohol, Starbucks serves alcohol, zoos serve alcohol. Moms carry coffee mugs that say things like this might be wine, though for discreet day-drinking, the better move may be one of the new hard seltzers, a watered-down malt liquor dressed up—for precisely this purpose—as a natural soda.

Even before COVID-19 arrived on our shores, the consequences of all this were catching up with us. From 1999 to 2017, the number of alcohol-related deaths in the U.S. doubled, to more than 70,000 a year—making alcohol one of the leading drivers of the decline in American life expectancy. These numbers are likely to get worse: During the pandemic, frequency of drinking rose, as did sales of hard liquor. By this February, nearly a quarter of Americans said they’d drunk more over the past year as a means of coping with stress.

Explaining these trends is hard; they defy so many recent expectations. Not long ago, Millennials were touted as the driest generation—they didn’t drink much as teenagers, they were “sober curious,” they were so admirably focused on being well—and yet here they are day-drinking White Claw and dying of cirrhosis at record rates. Nor does any of this appear to be an inevitable response to 21st-century life: Other countries with deeply entrenched drinking problems, among them Britain and Russia, have seen alcohol use drop in recent years.

Media coverage, meanwhile, has swung from cheerfully overselling the (now disputed) health benefits of wine to screeching that no amount of alcohol is safe, ever; it might give you cancer and it will certainly make you die before your time. But even those who are listening appear to be responding in erratic and contradictory ways. Some of my own friends—mostly 30- or 40-something women, a group with a particularly sharp uptick in drinking—regularly declare that they’re taking an extended break from drinking, only to fall off the wagon immediately. One went from extolling the benefits of Dry January in one breath to telling me a funny story about hangover-cure IV bags in the next. A number of us share the same (wonderful) doctor, and after our annual physicals, we compare notes about the ever nudgier questions she asks about alcohol. “Maybe save wine for the weekend?” she suggests with a cheer so forced she might as well be saying, “Maybe you don’t need to drive nails into your skull every day?”

What most of us want to know, coming out of the pandemic, is this: Am I drinking too much? And: How much are other people drinking? And: Is alcohol actually that bad?

The answer to all these questions turns, to a surprising extent, not only on how much you drink, but on how and where and with whom you do it. But before we get to that, we need to consider a more basic question, one we rarely stop to ask: Why do we drink in the first place? By we, I mean Americans in 2021, but I also mean human beings for the past several millennia.

Let’s get this out of the way: Part of the answer is “Because it is fun.” Drinking releases endorphins, the natural opiates that are also triggered by, among other things, eating and sex. Another part of the answer is “Because we can.” Natural selection has endowed humans with the ability to drink most other mammals under the table. Many species have enzymes that break alcohol down and allow the body to excrete it, avoiding death by poisoning. But about 10 million years ago, a genetic mutation left our ancestors with a souped-up enzyme that increased alcohol metabolism 40-fold.

This mutation occurred around the time that a major climate disruption transformed the landscape of eastern Africa, eventually leading to widespread extinction. In the intervening scramble for food, the leading theory goes, our predecessors resorted to eating fermented fruit off the rain-forest floor. Those animals that liked the smell and taste of alcohol, and were good at metabolizing it, were rewarded with calories. In the evolutionary hunger games, the drunk apes beat the sober ones.

But even presuming that this story of natural selection is right, it doesn’t explain why, 10 million years later, I like wine so much. “It should puzzle us more than it does,” Edward Slingerland writes in his wide-ranging and provocative new book, Drunk: How We Sipped, Danced, and Stumbled Our Way to Civilization, “that one of the greatest foci of human ingenuity and concentrated effort over the past millennia has been the problem of how to get drunk.” The damage done by alcohol is profound: impaired cognition and motor skills, belligerence, injury, and vulnerability to all sorts of predation in the short run; damaged livers and brains, dysfunction, addiction, and early death as years of heavy drinking pile up. As the importance of alcohol as a caloric stopgap diminished, why didn’t evolution eventually lead us away from drinking—say, by favoring genotypes associated with hating alcohol’s taste? That it didn’t suggests that alcohol’s harms were, over the long haul, outweighed by some serious advantages.

Versions of this idea have recently bubbled up at academic conferences and in scholarly journals and anthologies (largely to the credit of the British anthropologist Robin Dunbar). Drunk helpfully synthesizes the literature, then underlines its most radical implication: Humans aren’t merely built to get buzzed—getting buzzed helped humans build civilization. Slingerland is not unmindful of alcohol’s dark side, and his exploration of when and why its harms outweigh its benefits will unsettle some American drinkers. Still, he describes the book as “a holistic defense of alcohol.” And he announces, early on, that “it might actually be good for us to tie one on now and then.”

Slingerland is a professor at the University of British Columbia who, for most of his career, has specialized in ancient Chinese religion and philosophy. In a conversation this spring, I remarked that it seemed odd that he had just devoted several years of his life to a subject so far outside his wheelhouse. He replied that alcohol isn’t quite the departure from his specialty that it might seem; as he has recently come to see things, intoxication and religion are parallel puzzles, interesting for very similar reasons. As far back as his graduate work at Stanford in the 1990s, he’d found it bizarre that across all cultures and time periods, humans went to such extraordinary (and frequently painful and expensive) lengths to please invisible beings.

In 2012, Slingerland and several scholars in other fields won a big grant to study religion from an evolutionary perspective. In the years since, . . .

Continue reading. There’s much more, and it’s interesting.

Written by Leisureguy

2 June 2021 at 12:29 pm

When Earth was in beta

leave a comment »

TierZoo is a YouTube series that presents the history of life’s evolution on Earth as a giant multiplayer video game, with real lifeforms presented as “builds” and “upgrades.” It’s entertaining and will especially appeal to those who have experience playing video games. Here’s the introduction:

And here’s a sample video on the Cat dynasty tier list:

And here’s an investigation of a specific build:

Written by Leisureguy

30 May 2021 at 9:02 am

Posted in Evolution, Games, Science, Video

Radioactivity May Fuel Life Deep Underground and Inside Other Worlds

leave a comment »

More and more, it seems that if life is possible in an environment, then life is inevitable. Matter wants to live.

Jordana Cepelewicz writes in Quanta:

Scientists poke and prod at the fringes of habitability in pursuit of life’s limits. To that end, they have tunneled kilometers below Earth’s surface, drilling outward from the bottoms of mine shafts and sinking boreholes deep into ocean sediments. To their surprise, “life was everywhere that we looked,” said Tori Hoehler, a chemist and astrobiologist at NASA’s Ames Research Center. And it was present in staggering quantities: By various estimates, the inhabited subsurface realm has twice the volume of the oceans and holds on the order of 10³⁰ cells, making it one of the biggest habitats on the planet, as well as one of the oldest and most diverse.

Researchers are still trying to understand how most of the life down there survives. Sunlight for photosynthesis cannot reach such depths, and the meager amount of organic carbon food that does is often quickly exhausted. Unlike communities of organisms that dwell near hydrothermal vents on the seafloor or within continental regions warmed by volcanic activity, ecosystems here generally can’t rely on the high-temperature processes that support some subsurface life independent of photosynthesis; these microbes must hang on in deep cold and darkness.

Two papers appearing in February by different research groups now seem to have solved some of this mystery for cells beneath the continents and in deep marine sediments. They find evidence that, much as the sun’s nuclear fusion reactions provide energy to the surface world, a different kind of nuclear process — radioactive decay — can sustain life deep below the surface. Radiation from unstable atoms in rocks can split water molecules into hydrogen and chemically reactive peroxides and radicals; some cells can use the hydrogen as fuel directly, while the remaining products turn minerals and other surrounding compounds into additional energy sources.
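(A quick aside of my own, not from the article: the net outcome of water radiolysis is often summarized as 2 H₂O → H₂ + H₂O₂, with short-lived hydroxyl and hydrogen radicals along the way. The hydrogen is what some cells burn directly; the peroxide and radicals are the “remaining products” that react with surrounding minerals and compounds to yield further energy sources.)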

Although these radiolytic reactions yield energy far more slowly than the sun and underground thermal processes, the researchers have shown that they are fast enough to be key drivers of microbial activity in a broad range of settings — and that they are responsible for a diverse pool of organic molecules and other chemicals important to life. According to Jack Mustard, a planetary geologist at Brown University who was not involved in the new work, the radiolysis explanation has “opened up whole new vistas” into what life could look like, how it might have emerged on an early Earth, and where else in the universe it might one day be found.

Hydrogen Down Deep

Barbara Sherwood Lollar set off for university in 1981, four years after the discovery of life at the hydrothermal vents. As the child of two teachers who “fed me on a steady diet of Jules Verne,” she said, “all of this really spoke to the kid in me.” Not only was studying the deep subsurface a way to “understand a part of the planet that had never been seen before, a kind of life that we didn’t understand yet,” but it “clearly was going to trample [the] boundaries” between chemistry, biology, physics and geology, allowing scientists to combine those fields in new and intriguing ways.

Throughout Sherwood Lollar’s training in the 1980s and her early career as a geologist at the University of Toronto in the ’90s, more and more subterranean microbial communities were uncovered. The enigma of what supported this life prompted some researchers to propose that there might be “a deep hydrogen-triggered biosphere” full of cells using hydrogen gas as an energy source. (Microbes found in deep subsurface samples were often enriched with genes for enzymes that could derive energy from hydrogen.) Many geological processes could plausibly produce that hydrogen, but the best-studied ones occurred only at high temperatures and pressures. These included interactions between volcanic gases, the breakdown of particular minerals in the presence of water, and serpentinization — the chemical alteration of certain kinds of crustal rock through reactions with water.

By the early 2000s, Sherwood Lollar, Li-Hung Lin (now at National Taiwan University), Tullis Onstott of Princeton University and their colleagues were finding high concentrations of hydrogen — “in some cases, stunningly high,” Sherwood Lollar said — in water isolated from deep beneath the South African and Canadian crust. But serpentinization couldn’t explain it: The kinds of minerals needed often weren’t present. Nor did the other processes seem likely, because of the absence of recent volcanic activity and magma flows.

“So we began to look and expand our understanding of hydrogen-producing reactions and their relationship to the chemistry and mineralogy of the rocks in these places,” Sherwood Lollar said.

A clue came from their discovery that . . .

Continue reading.

Written by Leisureguy

24 May 2021 at 4:26 pm

Good distinction: “Processed” food v. “Ultra-processed” food

leave a comment »

Most people (I believe) think of things like Cheez Whiz, Oreos, and Pringles as processed foods, but not so much homemade mashed potatoes (though instant mashed-potato flakes are indeed thought of as “processed”) or cooked kale. But, as some love to point out, the mashed potatoes and cooked kale have indeed gone through a process of preparation (washing, chopping, and cooking), and so those, too, are “processed.”

But those are definitely unlike Cheez Whiz, Beyond Beef, and Spam, whose processing is far more invasive and intensive. Foods like these are now called “ultra-processed,” and though at first I resisted the label, I have to admit that I do process the whole foods I eat (by cooking them, for example), so the distinction is a useful one.

Nevertheless, when people refer to “processed foods,” they are almost always talking about ultra-processed foods, and their intention should be honored.

Nicola Temple’s article for BBC prompted these thoughts. She writes:

My first introduction to processed food began as a child in rural Canada, where we grew 90% of what we ate on our seven-acre homestead. After a carefree summer of catching fireflies and frogs and plucking sugar snaps from the vine, late August marked the start of winter preparation.

In the stifling humidity of an Ontario summer, perched on vinyl 1970s kitchen furniture, we topped and tailed, shucked and shelled, and boiled and blanched – processing all of our home grown produce so that it would feed us through the long, cold winter.

Yet, processed food these days has a far more negative connotation. The words conjure images of “cheese” covered polystyrene-like snacks or “just add water” meals with suspicious “flavour” pouches; these are the ultra-processed foods.

Is it fair to paint all processed food with the same brush of disdain? We forget that innovations in food processing have also helped to improve nutrition, reduce food waste and provide us with more leisure time. It is far more complex than to claim all processed food is bad. Processed food has, for better or for worse (and likely both), changed our relationship with food. Long before that, it shaped us as a species.

Our hominin relation, Homo habilis, which lived between 2.4 million and 1.4 million years ago, bears the first evidence of food processing. Unlike its evolutionary predecessors, habilis had relatively small teeth. It is thought that such an evolutionary trend could only begin if food was being manipulated before it reached the mouth. Pounding roots with rocks or slicing thin strips of meat to make it easier to chew could translate to about 5% less chewing. With less strain on the chewing apparatus – the jaws, muscles and teeth – the body can redirect those energetically expensive tissues elsewhere, causing the face to become smaller relative to the overall skull size.

Homo erectus (1.89 mya – 108,000 years ago) and Homo neanderthalensis (400,000-40,000 years ago) had much smaller teeth than one would predict based on their skull sizes. Evolution could only favour such a reduction in tooth size if food had become easier to chew, and this is likely to only have been accomplished through thermal processing – cooking.

Cooked food requires 22% less muscle to chew and it can release energy (calories) that might otherwise be inaccessible in the raw product. As well as arguably putting our ancient ancestors on a trend toward small faces and big bodies, processed food led to a significant gain in leisure time. Less time spent chewing left the mouth free to develop complex oral language. Energy could be directed to growing a bigger brain rather than a heavy-duty chewing mechanism, and cooked food fed that calorie-hungry brain. When I say that processed food has helped shape us as a species, I mean it quite literally.

However, it continues to do so and that is perhaps more worrisome. Ultra-processed foods have certainly been linked to our ever-increasing body size and our cooked, soft diet is ultimately to blame for misaligned teeth. Small face, big body, crooked teeth – perhaps this is not a trend we wish to continue.

What drove our early ancestors to process food – preservation – remains the main driver behind food processing today. Advancements in technology mean we can now flash-freeze produce in the height of the season mere moments after it has been plucked from the earth, locking those essential nutrients up until they are released again months later on some stove top thousands of miles from where the produce was grown.

Yet there have been many other drivers along the way that have forced food innovation. When more seamen died of malnutrition than in battle during the Seven Years War and Napoleonic Wars, the push to find new ways of preserving food drove the development and widespread adoption of canning. In 1912, a change in legislation in the UK made it necessary for the middle classes to give their household servant a half day off each week; this drove the first iterations of the “ready-meal” as middle-class housewives suddenly found themselves having to cook one evening meal each week. . .

Continue reading.

Written by Leisureguy

20 May 2021 at 9:12 am

Sleep Evolved Before Brains. Hydras Are Living Proof.

leave a comment »

Veronique Greenwood writes in Quanta:

The hydra is a simple creature. Less than half an inch long, its tubular body has a foot at one end and a mouth at the other. The foot clings to a surface underwater — a plant or a rock, perhaps — and the mouth, ringed with tentacles, ensnares passing water fleas. It does not have a brain, or even much of a nervous system.

And yet, new research shows, it sleeps. Studies by a team in South Korea and Japan showed that the hydra periodically drops into a rest state that meets the essential criteria for sleep.

On the face of it, that might seem improbable. For more than a century, researchers who study sleep have looked for its purpose and structure in the brain. They have explored sleep’s connections to memory and learning. They have numbered the neural circuits that push us down into oblivious slumber and pull us back out of it. They have recorded the telltale changes in brain waves that mark our passage through different stages of sleep and tried to understand what drives them. Mountains of research and people’s daily experience attest to human sleep’s connection to the brain.

But a counterpoint to this brain-centric view of sleep has emerged. Researchers have noticed that molecules produced by muscles and some other tissues outside the nervous system can regulate sleep. Sleep affects metabolism pervasively in the body, suggesting that its influence is not exclusively neurological. And a body of work that’s been growing quietly but consistently for decades has shown that simple organisms with less and less brain spend significant time doing something that looks a lot like sleep. Sometimes their behavior has been pigeonholed as only “sleeplike,” but as more details are uncovered, it has become less and less clear why that distinction is necessary.

It appears that simple creatures — including, now, the brainless hydra — can sleep. And the intriguing implication of that finding is that sleep’s original role, buried billions of years back in life’s history, may have been very different from the standard human conception of it. If sleep does not require a brain, then it may be a profoundly broader phenomenon than we supposed.

Recognizing Sleep

Sleep is not the same as hibernation, or coma, or inebriation, or any other quiescent state, wrote the French sleep scientist Henri Piéron in 1913. Though all involved a superficially similar absence of movement, each had distinctive qualities, and that daily interruption of our conscious experience was particularly mysterious. Going without it made one foggy, confused, incapable of clear thought. For researchers who wanted to learn more about sleep, it seemed essential to understand what it did to the brain.

And so, in the mid-20th century, if you wanted to study sleep, you became an expert reader of electroencephalograms, or EEGs. Putting electrodes on humans, cats or rats allowed researchers to say with apparent precision whether a subject was sleeping and what stage of sleep they were in. That approach produced many insights, but it left a bias in the science: Almost everything we learned about sleep came from animals that could be fitted with electrodes, and the characteristics of sleep were increasingly defined in terms of the brain activity associated with them.

This frustrated Irene Tobler, a sleep physiologist working at the University of Zurich in the late 1970s, who had begun to study the behavior of cockroaches, curious whether invertebrates like insects sleep as mammals do. Having read Piéron and others, Tobler knew that sleep could be defined behaviorally too.

She distilled a set of behavioral criteria to identify sleep without the EEG. A sleeping animal does not move around. It is harder to rouse than one that’s simply resting. It may take on a different pose than when awake, or it may seek out a specific location for sleep. Once awakened it behaves normally rather than sluggishly. And Tobler added a criterion of her own, drawn from her work with rats: A sleeping animal that has been disturbed will later sleep longer or more deeply than usual, a phenomenon called sleep homeostasis. . .

Continue reading. There’s more.

Written by Leisureguy

18 May 2021 at 4:26 pm

Weird dreams train us for the unexpected, says new theory

leave a comment »

Linda Geddes writes in the Guardian:

It’s a common enough scenario: you walk into your local supermarket to buy some milk, but by the time you get to the till, the milk bottle has turned into a talking fish. Then you remember you’ve got your GCSE maths exam in the morning, but you haven’t attended a maths lesson for nearly three decades.

Dreams can be bafflingly bizarre, but according to a new theory of why we dream, that’s the whole point. By injecting some random weirdness into our humdrum existence, dreams leave us better equipped to cope with the unexpected.

The question of why we dream has long divided scientists. Dreams’ subjective nature, and the lack of any means of recording them, makes it fiendishly difficult to prove why they occur, or even how they differ between individuals.

“While various hypotheses have been put forward, many of these are contradicted by the sparse, hallucinatory, and narrative nature of dreams, a nature that seems to lack any particular function,” said Erik Hoel, a research assistant professor of neuroscience at Tufts University in Massachusetts, US.

Inspired by recent insights into how machine “neural networks” learn, Hoel has proposed an alternative theory: the overfitted brain hypothesis.

A common problem when it comes to training artificial intelligence (AI) is that it becomes too familiar with the data it’s trained on, because it assumes that this training set is a perfect representation of anything it might encounter. Scientists try to fix this “overfitting” by introducing some chaos into the data, in the form of noisy or corrupted inputs.
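As a side note of my own (not from Geddes’s article or Hoel’s paper), here is a minimal sketch of the technique being described: deliberately injecting random noise into a model’s training inputs so that it generalizes beyond a too-narrow training set. It assumes the NumPy and scikit-learn libraries and uses an invented toy regression task; it illustrates the general trick, not Hoel’s actual models.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# A small, repetitive "training set": 200 one-dimensional points,
# analogous to days that are statistically very similar to one another.
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.sin(3 * X).ravel()

# A network trained only on the clean data; a large model fit to a
# narrow training set like this is prone to overfitting.
clean_model = MLPRegressor(hidden_layer_sizes=(100, 100), max_iter=3000,
                           random_state=0).fit(X, y)

# The "dreaming" analogue: the same network trained on a weirded copy of
# the data, with Gaussian noise injected into the inputs.
X_noisy = X + rng.normal(scale=0.1, size=X.shape)
noisy_model = MLPRegressor(hidden_layer_sizes=(100, 100), max_iter=3000,
                           random_state=0).fit(X_noisy, y)

# Compare both models on inputs neither has seen before.
X_new = rng.uniform(-1.0, 1.0, size=(200, 1))
y_new = np.sin(3 * X_new).ravel()
print("trained on clean data:", np.mean((clean_model.predict(X_new) - y_new) ** 2))
print("trained on noisy data:", np.mean((noisy_model.predict(X_new) - y_new) ** 2))

In Hoel’s analogy, dreams play the role of X_noisy here: a corrupted, stranger copy of daily experience that keeps the brain from fitting its limited “training set” too closely.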

Hoel suggests that our brains do something similar when we dream. Particularly as we get older, our days become statistically pretty similar to one another, meaning our “training set” is limited. But we still need to be able to generalise our abilities to new and unexpected circumstances – whether it’s our physical movements and reactions, or our mental processes and understanding. We can’t inject random noise into our brains while we’re awake, because we need to concentrate on the tasks at hand, and perform them as accurately as possible. But sleep is a different matter.

By creating a weirded version of the world, dreams may make our understanding of it less simplistic and more well-rounded. “It is the very strangeness of dreams in their divergence from waking experience that gives them their biological function,” Hoel said.

Already, there’s some evidence from neuroscience research to support this, he argues. For instance, one of the most reliable ways of prompting dreams about something that happens in real life is to repetitively perform a new task, such as learning to juggle, or repeatedly training on a ski simulator, while you are awake. Overtraining on the task triggers this overfitting phenomenon, meaning your brain attempts to generalise beyond its training set while you sleep by creating dreams. This may help explain why we often get better at physical tasks such as juggling, following a good night’s sleep.

Although Hoel’s hypothesis is still untested, an advantage is that . . .

Continue reading. If weird dreams are a survival advantage, they would certainly be favored by natural selection (and thus we would have them today).

Written by Leisureguy

18 May 2021 at 12:35 pm
