Later On

A blog written for those whose interests more or less match mine.

Archive for the ‘Evolution’ Category

Could One Shot Kill the Flu?

Matthew Hutson has an interesting article in the New Yorker:

In 2009, global health officials started tracking a new kind of flu. It appeared first in Mexico, in March, and quickly infected thousands. Influenza tends to kill the very young and the very old, but this flu was different. It seemed to be severely affecting otherwise healthy young adults.

American epidemiologists soon learned of cases in California, Texas, and Kansas. By the end of April, the virus had reached a high school in Queens, where a few kids, returning from a trip to Mexico, had infected a third of the student body. The Mexican government closed its schools and banned large gatherings, and the U.S. considered doing the same. “It was a very scary situation,” Richard Besser, who was then the acting director of the Centers for Disease Control and Prevention, told me. Early estimates suggested that the “swine flu,” as the new strain became known, killed as many as fourteen per cent of those it infected—a case fatality rate more than two hundred times higher than typical seasonal flu. The virus soon spread to more than a hundred and fifty countries, and the Obama Administration considered delaying the start of school until after Thanksgiving, when a second wave could be under way. Manufacturers worried about vaccine supplies. Like most flu vaccines, the one for the swine flu was grown in chicken eggs. “Even if you yell at them, they don’t grow faster,” Tom Frieden, who replaced Besser as the director of the C.D.C., said, at a press conference.

In the end, the world got lucky. The early stats were misleading: although swine flu was extremely contagious, it wasn’t especially deadly. Sometimes the reverse is true. Avian flu, which spread across the world during the winter of 2005-06, is not particularly transmissible but is highly lethal, killing more than half of those it infects. Each flu virus has its own epidemiological profile, determined by its genetic makeup, and flu genes shift every year. Howard Markel, a physician and historian of epidemics who, in the early two-thousands, helped invent the concept of “flattening the curve,” compared influenza’s swappable genetic components to “two wheels of fortune.” A double whammy—ease of spread combined with lethality—could make covid-19, or even the 1918 flu, which killed between forty million and a hundred million people, look like a twenty-four-hour bug.
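[A quick aside from me, not from Hutson’s article: here is a minimal back-of-the-envelope sketch of why early case-fatality estimates can mislead. Every figure below is made up purely for illustration; the point is that dividing deaths by confirmed cases overstates lethality whenever mild infections go uncounted. – LG]

# Toy illustration (hypothetical numbers) of a naive case-fatality rate (CFR)
# versus an infection-fatality rate adjusted for undetected mild cases.

confirmed_cases = 1_000        # hypothetical early case count
deaths = 140                   # hypothetical deaths among those cases
naive_cfr = deaths / confirmed_cases            # 0.14, i.e. 14% -- alarming

# Assume surveillance catches only 1 in 20 infections (mostly the severe ones).
assumed_detection_rate = 0.05
estimated_infections = confirmed_cases / assumed_detection_rate
adjusted_ifr = deaths / estimated_infections    # 0.007, i.e. 0.7%

seasonal_flu_fatality = 0.001  # ~0.1%, a commonly cited ballpark figure

print(f"naive CFR: {naive_cfr:.1%}")
print(f"adjusted IFR: {adjusted_ifr:.2%}")
print(f"naive CFR vs seasonal flu: {naive_cfr / seasonal_flu_fatality:.0f}x")
print(f"adjusted IFR vs seasonal flu: {adjusted_ifr / seasonal_flu_fatality:.0f}x")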

After the swine flu’s relatively harmless nature became apparent, many people asked if the alarm it provoked had been warranted. A Swiss survey found that trust in institutions had decreased. Some scientists and officials accused the World Health Organization of stirring up a “faked” pandemic to justify its budget. But most drew the opposite conclusion from the experience. Trying to prepare for a deadly flu pandemic had left them more alarmed. “There was just a sense of overwhelming relief,” Besser said. “If this had been like 1918, we sure weren’t ready.”

In truth, we’re never fully ready for the flu. We know it’s coming, like the first fall leaf, and yet three times in the past century—in 1918, 1957, and 1968—it has flattened us, killing a million or more each time. Even in ordinary years, the disease infects a billion people around the world, killing hundreds of thousands; one study estimated that it costs the United States economy close to a hundred billion dollars annually. Our primary weapon against the virus, the flu vaccine, is woefully inadequate. Over the last decade and a half in the United States, flu vaccines have prevented illness only forty per cent of the time; in particularly bad years, when vaccines were less fine-tuned to the strains that were circulating, they were only ten-per-cent protective. Today, the coronavirus pandemic is rightfully the object of our most strenuous efforts. And yet, as the infectious-disease specialists David Morens, Jeffrey Taubenberger, and Anthony Fauci wrote in a 2009 article in The New England Journal of Medicine, “we are living in a pandemic era that began around 1918,” when the flu used shipping networks to traverse the world. In the century since, this long, multi-wave flu pandemic has killed roughly as many people as the 1918 outbreak itself.

We’ve controlled a vast number of diseases with vaccination—chicken pox, diphtheria, measles, mumps, polio, rabies, rubella, smallpox, tetanus, typhoid, whooping cough, yellow fever—and, to some degree, we’ve added covid-19 to the list. But the pathogens behind those diseases tend to be relatively static compared with the flu, which returns each year in a vexingly different form. For decades, scientists have dreamed of what some call a “universal” flu vaccine—one that could target many strains of the virus. A universal vaccine would save countless lives not just this year but every year; as those numbers add up, it would become one of the greatest medical breakthroughs in history. Until recently, it’s been beyond the reach of molecular biology. But new technologies are extending our abilities, and researchers are learning how to see through the flu’s disguises. Without knowing it, we’re living on the cusp of a remarkable scientific achievement. One of the world’s longest pandemics could soon be coming to an end.

What we call “the flu” is really plural. Every season, several strains circulate. When it’s summer in one hemisphere, flu infections surge in the other. Virologists at the W.H.O. investigate the viruses and share what they learn with pharmaceutical companies; pharmaceutical researchers then often develop quadrivalent vaccines, which target four separate strains simultaneously. It’s the shotgun approach.

It takes more than six months to design, test, and manufacture a season’s worth of flu vaccine. In that time, a lot can change. Out in the world, strains mutate, jostling for dominance; prevalent varieties fade away, and sleepers come to the fore. Arnold Monto, an epidemiologist at the University of Michigan who has advised the Food and Drug Administration on flu-vaccine targeting, told me that choosing strains to target requires “science and a little bit of art.” The selected flu viruses mutate further as a result of vaccine manufacturing. By the time a needle reaches your arm, there’s a good chance that the vaccine might be off target or obsolete.

 

Each strain of the flu can be seen as plural, too. Morens, Taubenberger, and Fauci explain that “it is helpful to think of influenza viruses not as distinct entities but as eight-member ‘gene teams.’ ” A flu virus, they write, “must sometimes trade away one or more team members to make way for new gene ‘players’ with unique skills.”

The surface of a virus is covered by a forest of . . .

Continue reading.

Written by Leisureguy

22 November 2021 at 4:25 pm

Evolution never stops

This experiment shows evolution in action. It’s odd to me that some people continue to deny that evolution is a fact, but then there are some who deny that the Earth is a globe.

Written by Leisureguy

15 November 2021 at 1:42 pm

Ancient History Shows How We Can Create a More Equal World

David Graeber and David Wengrow are the authors of the forthcoming book, The Dawn of Everything: A New History of Humanity, from which this NY Times essay is adapted. Mr. Graeber died shortly after completing the book. The links are gift links, which bypass the paywall, so you can read the entire essay, which begins:

Most of human history is irreparably lost to us. Our species, Homo sapiens, has existed for at least 200,000 years, but we have next to no idea what was happening for the majority of that time. In northern Spain, for instance, at the cave of Altamira, paintings and engravings were created over a period of at least 10,000 years, between around 25,000 and 15,000 B.C. Presumably, a lot of dramatic events occurred during that period. We have no way of knowing what most of them were. This is of little consequence to most people, since most people rarely think about the broad sweep of human history anyway. They don’t have much reason to. Insofar as the question comes up at all, it’s usually when reflecting on why the world seems to be in such a mess and why human beings so often treat each other badly — the reasons for war, greed, exploitation and indifference to others’ suffering. Were we always like that, or did something, at some point, go terribly wrong?

One of the first people to ask this question in the modern era was the Swiss-French philosopher Jean-Jacques Rousseau, in an essay on the origins of social inequality that he submitted to a competition in 1754. Once upon a time, he wrote, we were hunter-gatherers, living in a state of childlike innocence, as equals. These bands of foragers could be egalitarian because they were isolated from one another, and their material needs were simple. According to Rousseau, it was only after the agricultural revolution and the rise of cities that this happy condition came to an end. Urban living meant the appearance of written literature, science and philosophy, but at the same time, almost everything bad in human life: patriarchy, standing armies, mass executions and annoying bureaucrats demanding that we spend much of our lives filling out forms.

Rousseau lost the essay competition, but the story he told went on to become a dominant narrative of human history, laying the foundations upon which contemporary “big history” writers — such as Jared Diamond, Francis Fukuyama and Yuval Noah Harari — built their accounts of how our societies evolved. These writers often talk about inequality as the natural result of living in larger groups with a surplus of resources. For example, Mr. Harari writes in “Sapiens: A Brief History of Humankind” that, after the advent of agriculture, rulers and elites sprang up “everywhere … living off the peasants’ surplus food and leaving them with only a bare subsistence.”

For a long time, the archaeological evidence — from Egypt, Mesopotamia, China, Mesoamerica and elsewhere — did appear to confirm this. If you put enough people in one place, the evidence seemed to show, they would start dividing themselves into social classes. You could see inequality emerge in the archaeological record with the appearance of temples and palaces, presided over by rulers and their elite kinsmen, and storehouses and workshops, run by administrators and overseers. Civilization seemed to come as a package: It meant misery and suffering for those who would inevitably be reduced to serfs, slaves or debtors, but it also allowed for the possibility of art, technology, and science.

That makes wistful pessimism about the human condition seem like common sense: Yes, living in a truly egalitarian society might be possible if you’re a Pygmy or a Kalahari Bushman. But if you want to live in a city like New York, London or Shanghai — if you want all the good things that come with concentrations of people and resources — then you have to accept the bad things, too. For generations, such assumptions have formed part of our origin story. The history we learn in school has made us more willing to tolerate a world in which some can turn their wealth into power over others, while others are told their needs are not important and their lives have no intrinsic worth. As a result, we are more likely to believe that inequality is just an inescapable consequence of living in large, complex, urban, technologically sophisticated societies.

We want to offer an entirely different account of human history. We . . .

Continue reading. There’s more — and no paywall.

Written by Leisureguy

5 November 2021 at 8:54 am

How do tall trees get water to the top?

Written by Leisureguy

4 November 2021 at 1:08 pm

The weirdness of the world

Perhaps it’s just me, but this morning the world seems very odd and a little off.

When I arise, it’s generally too early to start the day, so I open my laptop and browse my email and then the internet. The first thing that I chanced upon was Cory Doctorow’s review in Medium of Daniel Pinkwater’s Crazy in Poughkeepsie. (Perhaps the title was a clue that things were about to go off-track.)

The review is an entertaining look at the bizarre events and plot in Daniel Pinkwater’s new Young Adult novel. The story was strange enough that I read the Wikipedia entry for Daniel Pinkwater. Some of the things I learned struck me as odd. Here are a few snippets from the article:

Born Manus Pinkwater in Memphis, Tennessee, to Jewish immigrant parents from Poland. He describes his father, Philip Pinkwater, as a “ham-eating, iconoclastic Jew” who was expelled from Warsaw by the decent Jews. 
. . . 
A moment of fame came when he posed as Inspector Fermez LaBouche for the fumetti strip that ran in the final issues of Help! (September 1965); he had been spotted at a party by Terry Gilliam. Pinkwater rode in a Volkswagen convertible to a photo shoot with Gilliam, Robert Crumb, and Help’s creator Harvey Kurtzman—none of the men had any interest in the others. He met a children’s book editor by chance at a party; he invited her to his studio to promote an African artist’s cooperative, and she suggested that he illustrate a book. Pinkwater received a $1,500 advance for his first book, The Terrible Roar (1970), after replying that he would try to write the book himself.

With his wife Jill, Pinkwater published a dog training book and ran an obedience school while living in Hoboken, New Jersey. At the time, he was training to become an art therapist, but found he was unsuited to the work and dropped his studies. However, he attended a meeting of an unspecified cult with a therapy client, and later joined the cult. Pinkwater says “the quality of the [cult’s] rip-off was so minor you could ignore it”, although both he and Jill later left the cult. 
. . .
He adopted the name Daniel in the 1970s after consulting his cult’s guru, who said his true name should begin with a “D”. 
. . .
Pinkwater authored the newspaper comic strip Norb, which was illustrated by Tony Auth. The strip, syndicated by King Features, launched in 70 papers, but received nothing but hate-mail from the readers. Auth and Pinkwater agreed to end the project after 52 weeks. The daily strips were released in a 78-page collection by MU Press in 1992.

Pinkwater was a longtime commentator on All Things Considered on National Public Radio. He regularly reviewed children’s books on NPR’s Weekend Edition Saturday. For several years, he had his own NPR show: Chinwag Theater. Pinkwater was also known to avid fans of the NPR radio show Car Talk, where he has appeared as a (seemingly) random caller, commenting, for example, on the physics of the buttocks (giving rise to the proposed unit of measure of seat size: the Pinkwater), and giving practical advice as to the choice of automobiles. In the early 1990s Pinkwater voiced a series of humorous radio advertisements for the Ford Motor Company.

I wondered whether Pinkwater had concocted his entry as a kind of surrealistic exercise, but on reading one of the source articles linked in the Wikipedia footnotes, it seems to be about right. (That article says that Pinkwater’s father was expelled from Warsaw not so much for eating ham as for being a gangster.)

A little discombobulated, I next read the Doctorow article I had meant to read in the first place. (The Pinkwater article was the first article in the list, and so had caught my eye.) My original goal was an article about the power that large tech companies — YouTube, Twitter, TikTok, Instagram, et al. — have to decide whether to allow content to remain on their platforms.

Consider a content creator who finds that work developed over months or years of effort is suddenly gone, with no good appeal procedure to get it restored. Doctorow’s article is well worth reading. A few snippets:

After Novara’s channel was deleted, the group tried to find out more. The email that Youtube sent announcing the removal was terse (“YouTube said Novara was guilty of ‘repeated violations’ of YouTube’s community guidelines, without elaborating”). Novara editor Gary McQuiggin “filled in a YouTube appeal form,” which disappeared into the ether.

Then McQuiggin went to YouTube’s creator support chatbot, which introduced itself as “Rose.” What happened next is the stuff of Kafka-esque farce:

“I know this is important,” [Rose said,] before the conversation crashed.

The Times’s story quite rightly homes in on the problems of doing content moderation at scale without making errors. It notes that YouTube deletes 2,000 channels every hour to fight “spam, misinformation, financial scams, nudity, hate speech.” . . .

The platforms remove content in the blink of an eye, often through fully automated processes (such as copyright filters). Takedown systems are built without sparing any expense (YouTube’s copyright filter, Content ID, cost the company $100,000,000 and counting).

But the put-back processes — by which these automated judgments are examined and repealed — are slow-moving afterthoughts. If you’re a nightclub owner facing a takedown of the promo material sent to you by next week’s band, the 2.5-year delay you face in getting that content put back up is worse than a joke. . .

The reality is that there is no army of moderators big enough to evaluate 2,000 account deletions per hour. . .

YouTube’s takedown regime has to contend with 500 hours’ worth of new video every minute, in every language spoken. It has to parse out in-jokes and slang, obscure dialects, and even more obscure references.
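[Another aside from me, not Doctorow’s: the scale figures cited above are worth turning into rough arithmetic. The upload and deletion numbers come from the article; everything about reviewers (an eight-hour shift, fifteen minutes per appeal) is my own invented assumption, used only to show the order of magnitude. – LG]

# Back-of-the-envelope sketch of why human review cannot keep up with
# YouTube-scale moderation.

upload_hours_per_minute = 500          # new video per minute, per the article
channel_deletions_per_hour = 2_000     # per the article

video_hours_per_day = upload_hours_per_minute * 60 * 24       # 720,000 hours
reviewer_hours_per_day = 8                                     # assumption
reviewers_just_to_watch = video_hours_per_day / reviewer_hours_per_day

deletions_per_day = channel_deletions_per_hour * 24            # 48,000
minutes_per_appeal = 15                                        # assumption
appeal_staff_needed = deletions_per_day * minutes_per_appeal / 60 / reviewer_hours_per_day

print(f"{reviewers_just_to_watch:,.0f} full-time reviewers just to watch each day's uploads")
print(f"{appeal_staff_needed:,.0f} full-time staff to give every deleted channel a 15-minute review")

[Under those assumptions the answers come out to roughly 90,000 and 1,500 people respectively, before anyone makes a single judgment call. – LG]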

In such a seemingly dystopian system, what do content creators do? For one thing, they face challenges — see “8 Challenges Even Millionaire YouTuber Ali Abdaal Faces,” by Amardeep Parmar. That article describes the strange (to me) work situation of a modern-day content creator. Their lives seem to be an unceasing effort to move faster and do more, while a sword of Damocles hangs overhead: a twitch of a corporate algorithm can delete in an instant all the work they’ve posted.

I myself am involved in this same ecosystem of software, information, creation, and business practices, but at a very low level with little at stake. Having my livelihood depend on such an unstable and slippery amalgam of forces would make me uneasy indeed.

Reading the above collection of weird expressions of human culture — memetic evolution creating very odd results — made me want to read something about nature, something calming and restorative. I happened on the article “Do Not Eat, Touch, Or Even Inhale the Air Around the Manchineel Tree,” by Dan Nosowitz, about a tree that likely played a role in the death of Ponce de Leon. It was interesting, but did nothing to dispel this morning’s fog of weirdness. It did make me recall that natural evolution, just like cultural evolution, can go in strange directions — for example:

 

Written by Leisureguy

3 November 2021 at 10:45 am

Why Don’t Societies See Their Own Collapse Coming?

Umair Haque provides some reasonable answers to the title question:

Continue reading. There’s quite a bit more, and I must say his arguments are persuasive.

Written by Leisureguy

27 October 2021 at 6:45 pm

On the Origin of Minds

Pamela Lyon, an interdisciplinary visiting research fellow at the Southgate Institute for Health, Society and Equity, College of Medicine and Public Health, Flinders University in Adelaide, writes in Aeon:

In On the Origin of Species (1859), Charles Darwin draws a picture of the long sweep of evolution, from the beginning of life, playing out along two fundamental axes: physical and mental. Body and mind. All living beings, not just some, evolve by natural selection in both ‘corporeal and mental endowments’, he writes. When psychology has accepted this view of nature, Darwin predicts, the science of mind ‘will be based on a new foundation’, the necessarily gradual evolutionary development ‘of each mental power and capacity’.

Darwin guessed that life arose from a single ancestral ‘form’, presumed to be single-celled. Soon, scientists in Germany, France and the United States began investigating microscopic organisms for evidence of ‘mental faculties’ (perception, memory, decision-making, learning). All three groups were headed by men destined for eminence. Two of them – Alfred Binet, the psychologist who devised the first practical intelligence test, and Herbert Spencer Jennings, who laid the foundations of mathematical genetics – were persuaded by their research that Darwin was right: behaviour even in microbes suggests that mental as well as physical evolution exists. Max Verworn, a giant of German physiology, was unconvinced.

Thus kicked off a heated debate about the continuity of mental evolution, the idea that what in humans is called ‘mind’ and in other animals is usually called ‘cognition’ developed and changed over millions of years (billions, actually). That debate continues to this day. The rise of behaviourism in the early 1900s, which privileged observable behaviour as the only admissible scientific data, curtailed discussion by taking talk about minds off the table for decades. When the ‘cognitive revolution’ launched mid-century, discontinuity was firmly established. The consensus was that, at some point in evolution (and we might never know when), cognition – poof! – somehow appears in some animals. Before that, behaviour – the only indicator of cognition available without language – would have been entirely innate, machine-like, reflexive. It might have looked cognitively driven but wasn’t. This remains the dominant view, almost entirely on the grounds of its ‘intuitive’ plausibility based on commonsense understanding.

The philosopher Daniel Dennett, among the earliest cognitive philosophers to invoke evolution, dubbed natural selection ‘Darwin’s dangerous idea’ because it showed that the appearance of design in nature requires no designer, divine or otherwise. Like most of his colleagues, philosophical and scientific, Dennett didn’t buy the continuity of mental evolution. However, my view is that this neglected insight of Darwin’s was his most radical idea, one with the potential to induce a full-blown Copernican revolution in the cognitive sciences and to transform the way we see the world and our place in it.

The Copernican revolution turned on a single shift in perspective. For 1,400 years, European scholars agreed with ordinary folk that Earth is the still point around which the heavens revolve. The Ptolemaic model of the cosmos had set the Sun, Moon, stars and planets moving in nested crystalline spheres around Earth. In 1543 Nicolaus Copernicus published a detailed alternative that replaced Earth with the Sun. By setting it in motion around the Sun with other celestial ‘wanderers’, our planet was dethroned as the cosmic centre, and modern astronomy was born.

Similarly, Darwin’s radical idea dethrones human and other brains from their ‘intuitively obvious’ position at the centre of the (Western) cognitive universe. In their place, Darwin sets an evolving, cognising being struggling to survive, thrive and reproduce in predictably and unpredictably changing conditions. This single shift of perspective – from a brain-centred focus, where Homo sapiens is the benchmark, to the facts of biology and ecology – has profound implications. The payoff is a more accurate and productive account of an inescapable natural phenomenon critical to understanding how we became – and what it means to be – human.

What is cognition? Like many mental concepts, the term has no consensus definition, a fact that infuriated William James 130 years ago and occasional others since. This is my definition: Cognition comprises the means by which organisms become familiar with, value, exploit and evade features of their surroundings in order to survive, thrive and reproduce. Here is how I came to it.

As a PhD student in Asian Studies 21 years ago, my research focused on four Buddhist propositions that I aimed to subject to forensic Western philosophical and scientific analysis. Implicit in these propositions is a highly sophisticated Buddhist view of mind: what it is, how it works, what it does under benighted conditions, what it can do with training and practice. I looked for a Western comparator in what was then called cognitive science (in the singular) and found… nothing.

Four cartons of books and a laptop full of articles later, I had a collection of loose, dissonant ideas, and related streams of argument, none of which provided purchase on the experience of having a mind or its role in behaviour. As the neurobiologist Steven Rose observed in 1998 (and nothing much has changed), neuroscience had generated oceans of data but no theory to make sense of them. At the dawn of the 21st century, this struck me as outrageous. It still does.

Cognitive science was then ruled by three (decades-old) reference frames that provided the foundation of the ‘cognitivist’ paradigm: 1) the human brain; 2) a belief that the brain is a computing machine; and 3) the idea that ‘cognition is computation over representations’. The latter tenet doggedly resists easy explanation because the central concepts introduced fresh ambiguities into the field (as if any more were needed). Roughly, it boils down to this: there are identifiable things in the brain that ‘stand in’ for aspects of the world, much how words do in sentences. These bits of information are ‘processed’ according to algorithms yet to be discovered. This is what we call thinking, planning, decision-making, and so on. No mental activity exists over and above the processing of representations; cognition just is this processing.

Biology and evolution, which I assumed must be of utmost importance, were largely absent; so were physiology, emotion and motivation. Researchers who believed the study of animal behaviour had something useful to offer cognitive science were just beginning to publish in the field and were not warmly welcomed. ‘Embodied’ and ‘situated’ cognition were gaining traction but were then more an acknowledgement of the bleeding obvious than a coherent framework. Without criteria for identification, attributions of biological cognition were all over the taxonomic shop.

I still needed a comparator, however. I decided to investigate whether biology held the answers I assumed it must. I opted to start at the rootstock of the tree of life – bacteria – to see if anything conceivably cognitive was going on; 20 years on, I am still mining this rich seam.

My first theoretical guides, recommended by . . .

Continue reading. There’s more.

Written by Leisureguy

24 October 2021 at 4:05 pm

Posted in Evolution, Science

More on the new view of humanity’s social structures

I posted recently about an Atlantic article describing a new take on the cultural evolution of human society, drawing on the work of David Graeber and David Wengrow, particularly their book The Dawn of Everything: A New History of Humanity. The Guardian has an extract from that book that’s worth reading. It begins:

In some ways, accounts of “human origins” play a similar role for us today as myth did for ancient Greeks or Polynesians. This is not to cast aspersions on the scientific rigour or value of these accounts. It is simply to observe that the two fulfil somewhat similar functions. If we think on a scale of, say, the last 3m years, there actually was a time when someone, after all, did have to light a fire, cook a meal or perform a marriage ceremony for the first time. We know these things happened. Still, we really don’t know how. It is very difficult to resist the temptation to make up stories about what might have happened: stories which necessarily reflect our own fears, desires, obsessions and concerns. As a result, such distant times can become a vast canvas for the working out of our collective fantasies.

Let’s take just one example. Back in the 1980s, there was a great deal of buzz about a “mitochondrial Eve”, the putative common ancestor of our entire species. Granted, no one was claiming to have actually found the physical remains of such an ancestor, but DNA sequencing demonstrated that such an Eve must have existed, perhaps as recently as 120,000 years ago. And while no one imagined we’d ever find Eve herself, the discovery of a variety of other fossil skulls rescued from the Great Rift Valley in east Africa seemed to provide a suggestion as to what Eve might have looked like and where she might have lived. While scientists continued debating the ins and outs, popular magazines were soon carrying stories about a modern counterpart to the Garden of Eden, the original incubator of humanity, the savanna-womb that gave life to us all.

Many of us probably still have something resembling this picture of human origins in our mind. More recent research, though, has shown it couldn’t possibly be accurate. In fact, biological anthropologists and geneticists are now converging on an entirely different picture. For most of our evolutionary history, we did indeed live in Africa – but not just the eastern savannas, as previously thought. Instead, our biological ancestors were distributed everywhere from Morocco to the Cape of Good Hope. Some of those populations remained isolated from one another for tens or even hundreds of thousands of years, cut off from their nearest relatives by deserts and rainforests. Strong regional traits developed, so that early human populations appear to have been far more physically diverse than modern humans. If we could travel back in time, this remote past would probably strike us as something more akin to a world inhabited by hobbits, giants and elves than anything we have direct experience of today, or in the more recent past. . .

Continue reading.

Written by Leisureguy

24 October 2021 at 12:34 pm

Human History Gets a Rewrite

William Deresiewicz has an interesting article in the Atlantic, which, month after month, seems to be chockablock with interesting articles. Deresiewicz writes:

Many years ago, when I was a junior professor at Yale, I cold-called a colleague in the anthropology department for assistance with a project I was working on. I didn’t know anything about the guy; I just selected him because he was young, and therefore, I figured, more likely to agree to talk.

Five minutes into our lunch, I realized that I was in the presence of a genius. Not an extremely intelligent person—a genius. There’s a qualitative difference. The individual across the table seemed to belong to a different order of being from me, like a visitor from a higher dimension. I had never experienced anything like it before. I quickly went from trying to keep up with him, to hanging on for dear life, to simply sitting there in wonder.

That person was David Graeber. In the 20 years after our lunch, he published two books; was let go by Yale despite a stellar record (a move universally attributed to his radical politics); published two more books; got a job at Goldsmiths, University of London; published four more books, including Debt: The First 5,000 Years, a magisterial revisionary history of human society from Sumer to the present; got a job at the London School of Economics; published two more books and co-wrote a third; and established himself not only as among the foremost social thinkers of our time—blazingly original, stunningly wide-ranging, impossibly well read—but also as an organizer and intellectual leader of the activist left on both sides of the Atlantic, credited, among other things, with helping launch the Occupy movement and coin its slogan, “We are the 99 percent.”

On September 2, 2020, at the age of 59, David Graeber died of necrotizing pancreatitis while on vacation in Venice. The news hit me like a blow. How many books have we lost, I thought, that will never get written now? How many insights, how much wisdom, will remain forever unexpressed? The appearance of The Dawn of Everything: A New History of Humanity is thus bittersweet, at once a final, unexpected gift and a reminder of what might have been. In his foreword, Graeber’s co-author, David Wengrow, an archaeologist at University College London, mentions that the two had planned no fewer than three sequels.

And what a gift it is, no less ambitious a project than its subtitle claims. The Dawn of Everything is written against the conventional account of human social history as first developed by Hobbes and Rousseau; elaborated by subsequent thinkers; popularized today by the likes of Jared Diamond, Yuval Noah Harari, and Steven Pinker; and accepted more or less universally. The story goes like this. Once upon a time, human beings lived in small, egalitarian bands of hunter-gatherers (the so-called state of nature). Then came the invention of agriculture, which led to surplus production and thus to population growth as well as private property. Bands swelled to tribes, and increasing scale required increasing organization: stratification, specialization; chiefs, warriors, holy men.

Eventually, cities emerged, and with them, civilization—literacy, philosophy, astronomy; hierarchies of wealth, status, and power; the first kingdoms and empires. Flash forward a few thousand years, and with science, capitalism, and the Industrial Revolution, we witness the creation of the modern bureaucratic state. The story is linear (the stages are followed in order, with no going back), uniform (they are followed the same way everywhere), progressive (the stages are “stages” in the first place, leading from lower to higher, more primitive to more sophisticated), deterministic (development is driven by technology, not human choice), and teleological (the process culminates in us).

It is also, according to Graeber and Wengrow, completely wrong. Drawing on a wealth of recent archaeological discoveries that span the globe, as well as deep reading in often neglected historical sources (their bibliography runs to 63 pages), the two dismantle not only every element of the received account but also the assumptions that it rests on. Yes, we’ve had bands, tribes, cities, and states; agriculture, inequality, and bureaucracy, but what each of these were, how they developed, and how we got from one to the next—all this and more, the authors comprehensively rewrite. More important, they demolish the idea that human beings are passive objects of material forces, moving helplessly along a technological conveyor belt that takes us from the Serengeti to the DMV. We’ve had choices, they show, and we’ve made them. Graeber and Wengrow offer a history of the past 30,000 years that is not only wildly different from anything we’re used to, but also far more interesting: textured, surprising, paradoxical, inspiring.

The bulk of the book (which weighs in at more than 500 pages) takes us from the Ice Age to the early states (Egypt, China, Mexico, Peru). In fact, it starts by glancing back before the Ice Age to the dawn of the species. Homo sapiens developed in Africa, but it did so across the continent, from Morocco to the Cape, not just in the eastern savannas, and in a great variety of regional forms that only later coalesced into modern humans. There was no anthropological Garden of Eden, in other words—no Tanzanian plain inhabited by “mitochondrial Eve” and her offspring. As for the apparent delay between our biological emergence, and therefore the emergence of our cognitive capacity for culture, and the actual development of culture—a gap of many tens of thousands of years—that, the authors tell us, is an illusion. The more we look, especially in Africa (rather than mainly in Europe, where humans showed up relatively late), the older the evidence we find of complex symbolic behavior.

That evidence and more—from the Ice Age, from later Eurasian and Native North American groups—demonstrate, according to Graeber and Wengrow, that hunter-gatherer societies were far more complex, and more varied, than we have imagined. The authors introduce us to sumptuous Ice Age burials (the beadwork at one site alone is thought to have required 10,000 hours of work), as well as to monumental architectural sites like Göbekli Tepe, in modern Turkey, which dates from about 9000 B.C. (at least 6,000 years before Stonehenge) and features intricate carvings of wild beasts. They tell us of Poverty Point, a set of massive, symmetrical earthworks erected in Louisiana around 1600 B.C., a “hunter-gatherer metropolis the size of a Mesopotamian city-state.” They describe an indigenous Amazonian society that shifted seasonally between two entirely different forms of social organization (small, authoritarian nomadic bands during the dry months; large, consensual horticultural settlements during the rainy season). They speak of the kingdom of Calusa, a monarchy of hunter-gatherers the Spanish found when they arrived in Florida. All of these scenarios are unthinkable within the conventional narrative.

The overriding point is that hunter-gatherers made choices—conscious, deliberate, collective—about the ways that they wanted to organize their societies: to apportion work, dispose of wealth, distribute power. In other words, they practiced politics. Some of them experimented with agriculture and decided that it wasn’t worth the cost. Others looked at their neighbors and determined to live as differently as possible—a process that Graeber and Wengrow describe in detail with respect to the Indigenous peoples of Northern California, “puritans” who idealized thrift, simplicity, money, and work, in contrast to the ostentatious slaveholding chieftains of the Pacific Northwest. None of these groups, as far as we have reason to believe, resembled the simple savages of popular imagination, unselfconscious innocents who dwelt within a kind of eternal present or cyclical dreamtime, waiting for the Western hand to wake them up and fling them into history.

The authors carry this perspective forward to the ages that saw the emergence of farming, of cities, and of kings. In the locations where it first developed, about 10,000 years ago, agriculture did not take over all at once, uniformly and inexorably. (It also didn’t start in only a handful of centers—Mesopotamia, Egypt, China, Mesoamerica, Peru, the same places where empires would first appear—but more like 15 or 20.) Early farming was typically flood-retreat farming, conducted seasonally in river valleys and wetlands, a process that is much less labor-intensive than the more familiar kind and does not conduce to the development of private property. It was also what the authors call “play farming”: farming as merely one element within a mix of food-producing activities that might include hunting, herding, foraging, and horticulture.

Settlements, in other words, preceded agriculture—not, as we’ve thought, the reverse. What’s more, it took some 3,000 years for . . .

Continue reading. There’s much more.

Written by Leisureguy

23 October 2021 at 2:10 pm

Factory farms of disease: How industrial chicken production is breeding the next pandemic

John Vidal reports in the Guardian:

One day last December, 101,000 chickens at a gigantic farm near the city of Astrakhan in southern Russia started to collapse and die. Tests by the state research centre showed that a relatively new strain of lethal avian flu known as H5N8 was circulating, and within days 900,000 birds at the Vladimirskaya plant were hurriedly slaughtered to prevent an epidemic.

Avian flu is the world’s other ongoing pandemic and H5N8 is just one strain that has torn through thousands of chicken, duck and turkey flocks across nearly 50 countries including Britain in recent years and shows no sign of stopping.

But the Astrakhan incident was different. When 150 workers at the farm were tested, five women and two men were found to have the disease, albeit mildly. It was the first time that H5N8 had been known to jump from birds to humans.

The World Health Organization (WHO) was alerted but, this being at the height of the Covid-19 pandemic, little attention was paid even when Anna Popova, chief consumer adviser to the Russian Federation, went on TV to warn “with a degree of probability” that human-to-human transmission of H5N8 would evolve soon and that work should start immediately on developing a vaccine.

Global attention is fixed on the origins of Covid-19, either in nature or from a laboratory, but eight or more variants of avian flu, all of which are able to infect and kill humans and are potentially more severe than Covid-19, now regularly rattle around the world’s factory farms barely noticed by governments.

There have been no further reports of human H5N8 infections in 2021, but concern last week turned to China, where another type of avian flu known as H5N6 has infected 48 people since it was first identified in 2014. Most cases have been linked to people working with farmed birds, but there has been a spike in recent weeks and more than half of all the people infected have died, suggesting that H5N6 is gathering pace, mutating and extremely dangerous.

WHO and Chinese virologists have been worried enough to call on governments to increase their vigilance. “The likelihood of human-to-human spread is low [but] wider geographical surveillance in the China affected areas and nearby areas is urgently required to better understand the risk and the recent increase of spillover to humans,” said a WHO Pacific-region spokesperson in a statement.

Earlier this month, China’s Centre for Disease Control [CDC] identified several mutations in two recent H5N6 cases. The spread of the H5N6 virus is now a “serious threat” to the poultry industry and human health, said Gao Fu, CDC director, and Shi Weifeng, dean of public health at Shandong First Medical University.

“The zoonotic potential of AIVs [avian influenza viruses] warrants continuous, vigilant monitoring to avert further spillovers that could result in disastrous pandemics,” they say.

Factory farming and disease

The WHO suspects, but has no proof, that Covid-19 is linked to the intensive breeding of animals in south-east Asia’s many barely regulated wildlife farms. Major outbreaks over the past 30 years including Q fever in the Netherlands and highly pathogenic avian influenza outbreaks have been linked with intensive livestock farming.

Governments and the £150bn-a-year poultry and livestock industries emphasise how intensive farming is generally extremely safe and now essential for providing fast-growing populations with protein [though in fact plant-based foods are more efficient and safer – LG], but scientific evidence shows that stressful, crowded conditions drive the emergence and spread of many infectious diseases, and act as an “epidemiological bridge” between wildlife and human infections.

UN bodies, academics and epidemiologists recognise the link between the emergence of highly pathogenic avian influenza viruses and increasingly intensive poultry farming.

According to the UN’s Food and Agriculture Organization (FAO): “Avian influenza viruses are evolving into a large, diverse virus gene pool … A pathogen may turn into a hyper-virulent disease agent; in monocultures involving mass rearing of genetically identical animals that are selected for high feed conversion, an emerging hyper-virulent pathogen will rapidly spread within a flock or herd.” . . .

Continue reading. There’s more, and it’s disturbing.

Also, I highly recommend watching this video of a presentation by Dr. Michael Greger. He describes in the video how factory farms are ideal incubators for new diseases. It’s a talk worth watching, and as you watch, keep in mind that it was given in 2008. Excluding animal-based foods is good for your health both directly — in terms of what the food does to your body — and indirectly — in terms of what raising animals for food does to create new diseases.

Written by Leisureguy

20 October 2021 at 9:30 am

Making a Living: The history of what we call “work”

Aaron Benanav reviews an interesting book in The Nation:

We have named the era of runaway climate change the “Anthropocene,” which tells you everything you need to know about how we understand our tragic nature. Human beings are apparently insatiable consuming machines; we are eating our way right through the biosphere. The term seems to suggest that the relentless expansion of the world economy, which the extraction and burning of fossil fuels has made possible, is hard-wired into our DNA. Seen from this perspective, attempting to reverse course on global warming is likely to be a fool’s errand. But is unending economic growth really a defining feature of what it means to be human?

For the longest part of our history, humans lived as hunter-gatherers who neither experienced economic growth nor worried about its absence. Instead of working many hours each day in order to acquire as much as possible, our nature—insofar as we have one—has been to do the minimum amount of work necessary to underwrite a good life.

This is the central claim of the South African anthropologist James Suzman’s new book, Work: A Deep History, From the Stone Age to the Age of Robots, in which he asks whether we might learn to live like our ancestors did—that is, to value free time over money. Answering that question takes him on a 300-millennium journey through humanity’s existence.

Along the way, Suzman draws amply on what he has learned since the 1990s living and dissertating among the Ju/’hoansi Bushmen of Eastern Namibia, whose ancestral home is in southern Africa’s Kalahari Desert. The Ju/’hoansi are some of the world’s last remaining hunter-gatherers, although few engage in traditional forms of foraging anymore.

Suzman has less to say in Work about his years as the director of corporate citizenship and, later, the global director of public affairs at De Beers, the diamond-mining corporation. He took that job in 2007. Around the same time, in response to a public outcry after the Botswanan government evicted Bushmen from the Kalahari so that De Beers could conduct its mining operations there, the company sold its claim to a deposit to a rival firm, Gem Diamonds, which opened a mine in the Bushmen’s former hunting grounds in 2014. It later shuttered the mine and then sold it in 2019, after reportedly losing $170 million on the venture.

Suzman’s employment with De Beers—a company that has spent vast sums on advertising to convince the world’s middle classes that diamonds, one of the most common gems, are actually among the scarcest—may have left its mark on Work nonetheless. “The principal purpose” of his undertaking, Suzman explains, is “to loosen the claw-like grasp that scarcity economics has held” over our lives and thereby “diminish our corresponding and unsustainable preoccupation with economic growth.” It is an arresting intervention, although one that reveals the limits of both contemporary economics and anthropology as guides to thinking about our era of climate emergency.

For 95 percent of our 300,000-year history, human beings have lived as hunter-gatherers on diets consisting of fruits, vegetables, nuts, insects, fish, and game. Ever since Adam Smith published The Wealth of Nations in 1776, it has largely been taken for granted that staying alive was an all-consuming activity for our ancestors, as well as for the remaining hunter-gatherers who still lived as they did. Latter-day foragers appeared to have been “permanently on the edge of starvation,” Suzman explains, and “plagued by constant hunger.”

This disparaging perspective on the life of the hunter-gatherer found ample support in Western travel narratives and then in ethnographic studies. Explorers treated contemporary foraging peoples as if they were living fossils, artifacts of an earlier era. In reality, these foragers were living in time, not out of it, and trying to survive as best they could under adverse historical conditions. Expanding communities of agriculturalists, like both colonial empires and post-colonial states, had violently pushed most foragers out of their ancestral homelands and into more marginal areas. Western reportage has made it seem as if these dispossessed refugees were living as their ancestors had since time immemorial, when in fact their lives were typically much more difficult.

A countercurrent of thinkers has provided a consistent alternative to this largely contemptuous mainstream perspective. The 18th-century French philosopher Jean-Jacques Rousseau, for example, took the forager to be an unrealizable ideal for modern humans rather than our embarrassing origin story. In the 20th century, anthropologists Franz Boas and Claude Levi-Strauss continued this tradition: They countered racist, stage-based theories of human evolution by showing that foraging peoples possessed complex and intelligent cultures. These thinkers form important precursors to Suzman’s perspective, but, in Work, he sets them aside.

Instead, Suzman focuses on the comparatively recent “Man the Hunter” conference, co-organized by the American anthropologist Richard Lee. That 1966 gathering marked a decisive shift in how anthropologists thought about foragers as economic actors, and this is the point that Suzman wants to emphasize. Lee had been conducting research among the !Kung Bushmen of southern Africa, a people related to the Ju/’hoansi. Lee showed that the !Kung acquired their food through only “a modest effort,” leaving them with more “free time” than people in the advanced industrial societies of the West. The same was likely true, he suggested, of human beings over the largest part of their history.

One implication of this finding is that economists since Adam Smith have been consistently wrong about what Lee’s colleague Marshall Sahlins called “stone age economics.” Using modern research methods, social scientists have confirmed that Lee and Sahlins were largely right (although they may have underestimated foragers’ average work hours). The chemical analysis of bones has demonstrated conclusively that early humans were not constantly teetering on the brink of starvation. On the contrary, they ate well despite having at their disposal only a few stone and wooden implements. What afforded these early humans existences of relative ease and comfort? According to Suzman, the turning point in the history of early hominids came with their capacity to control fire, which gave them access to a “near-limitless supply of energy” and thereby lightened their toils.

Fire predigests food. When you roast the flesh of a woolly mammoth—or, for that matter, a bunch of carrots—the process yields significantly more calories than if the food was left uncooked. The capacity to access those additional calories gave humans an evolutionary advantage over other primates. Whereas chimpanzees spend almost all of their waking hours foraging, early humans got the calories they needed with just a few hours of foraging per day.

Mastering fire thus made for a radical increase in humanity’s free time. Suzman contends that it was this free time that subsequently shaped our species’s cultural evolution. Leisure afforded long periods of hanging around with others, which led to the development of language, storytelling, and the arts. Human beings also gained the capacity to care for those who were “too old to feed themselves,” a trait we share with few other species.

The use of fire helped us become . . .

Continue reading.

Written by Leisureguy

5 October 2021 at 12:15 pm

Single Cells Evolve Large Multicellular Forms in Just Two Years

It’s difficult to deny that evolution happens when it is demonstrated — difficult, but certainly not impossible, as Ken Ham (no relation! at all!) will tell you. But it’s interesting to see this big step toward multicellularity taken in a laboratory setting. Veronique Greenwood writes in Quanta:

To human eyes, the dominant form of life on Earth is multicellular. These cathedrals of flesh, cellulose or chitin usually take shape by following a sophisticated, endlessly iterated program of development: A single microscopic cell divides, then divides again, and again and again, with each cell taking its place in the emerging tissues, until there is an elephant or a redwood where there was none before.

At least 20 times in life’s history — and possibly several times as often — single-celled organisms have made the leap to multicellularity, evolving to make forms larger than those of their ancestors. In a handful of those instances, multicellularity has gone into overdrive, producing the elaborate organisms known as plants, animals, fungi and some forms of algae. In these life forms, cells have shaped themselves into tissues with different functions — cells of the heart muscle and cells of the bloodstream, cells that hold up the stalk of a wheat plant, cells that photosynthesize. Some cells pass their genes on to the next generation, the germline cells like eggs and sperm, and then there are all the rest, the somatic cells that support the germline in its quest to propagate itself.

But compared to the highly successful simplicity of single-celled life, with its mantra of “eat, divide, repeat,” multicellularity seems convoluted and full of perilous commitments. Questions about what circumstances could have enticed organisms to take this fork in the road millions of years ago on Earth — not once but many times — therefore tantalize scientists from game theorists and paleontologists to biologists tending single-celled organisms in the lab.

Now, the biologist William Ratcliff at the Georgia Institute of Technology and his colleagues report that over the course of nearly two years of evolution, they have induced unicellular yeasts to grow into multicellular clusters of immense size, going from microscopic to branching structures visible to the naked eye. The findings illustrate how such a transition can happen, and they imply intriguing future experiments to see whether these structures develop differentiation — whether cells start to play specialized roles in the drama of life lived together.

Incentives to Be Snowflakes

Nearly a decade ago, scientists who study multicellularity were set abuzz by an experiment performed by Ratcliff, Michael Travisano, and their colleagues at the University of Minnesota. Ratcliff, who was doing his doctoral thesis on cooperation and symbiosis in yeasts, had been chatting with Travisano about multicellularity, and they wondered whether it might be possible to evolve yeast into something multicellular. On a whim, they took tubes of yeast growing in culture, shook them, and selected the ones that settled to the bottom fastest to seed a new culture, over and over again for 60 days.

This simple procedure, as they later described in the Proceedings of the National Academy of Sciences, rapidly caused the evolution of tiny clumps — yeasts that had evolved to stay attached to each other, the better to survive the selection pressure exerted by the scientists. The researchers subsequently determined that because of a single mutation in ACE2, a transcription factor, the cells did not break apart after they divided, which made them heavier and able to sink faster.

This change in the cells emerged quickly and repeatedly. In less than 30 transfers, one of the tubes exhibited this clumping; within 60 transfers, all of the tubes were doing it. The researchers dubbed the cells snowflake yeast, after the ramifying shapes they saw under the microscope.
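[An aside from me, not from Greenwood’s article or the researchers: the selection scheme is simple enough to mimic in a toy simulation. The sketch below is a loose caricature under invented parameters (population size, selection fraction, mutation rate and effect sizes), meant only to show how repeatedly keeping the fastest settlers favors ever-larger clusters. – LG]

import random

# Toy model: each individual is a cluster size (cells per clump).
# Bigger clusters are assumed to settle faster. Each "transfer" keeps
# the fastest-settling fraction, which then regrows the culture, with
# occasional mutations that change how readily daughter cells separate.

POPULATION = 1_000
TRANSFERS = 60        # the experiment ran daily transfers for 60 days
KEEP_FRACTION = 0.1   # assumed share of the tube's bottom that is kept

def mutate(size):
    if random.random() < 0.05:               # assumed mutation rate
        size *= random.uniform(0.8, 1.5)     # assumed effect sizes
    return max(size, 1.0)

population = [1.0] * POPULATION               # start fully unicellular

for _ in range(TRANSFERS):
    # Settling speed is taken as proportional to cluster size, plus noise.
    ranked = sorted(population,
                    key=lambda s: s * random.uniform(0.8, 1.2),
                    reverse=True)
    survivors = ranked[: int(POPULATION * KEEP_FRACTION)]
    population = [mutate(random.choice(survivors)) for _ in range(POPULATION)]

print(f"mean cluster size after {TRANSFERS} transfers: "
      f"{sum(population) / POPULATION:.1f}")

[Each run differs because of the randomness, but the mean cluster size reliably climbs well above 1, which is the qualitative point: the selection pressure alone does the work. – LG]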

Snowflake yeast started out as a side project, but it looked like a promising avenue to explore. “That’s been my life for 10 years since then,” Ratcliff said. The work garnered him collaborators like Eric Libby, a mathematical biologist at Umeå University in Sweden, and Matthew Herron, a research scientist at Georgia Tech, where Ratcliff is now a professor. He had joined the varied ecosystem of researchers trying to understand how multicellular life came about.

It’s easy for us, as the vast architectures of cells that we are, to take it for granted that multicellularity is an unqualified advantage. But as far as we can tell from fossils, life seems to have been cheerfully unicellular for its first billion years. And even today, there are far more unicellular organisms than multicellular ones on the planet. Staying together has serious downsides: A cell’s fate becomes tied to those of the cells around it, so if they die, it may die too. And if a cell does become part of a multicellular collective, it may end up as a somatic cell instead of a germ cell, meaning that it sacrifices the opportunity to pass on its genes directly through reproduction.

There are also questions of competition. “Cells of the same species tend to compete for resources,” said Guy Cooper, a theorist at the University of Oxford. “When you stick a bunch of them together, that competition for resources becomes even stronger. That’s a big cost … so you need a benefit that’s equal or greater on the far side for multicellularity to evolve.” . . .

Continue reading. There’s more, including videos.

Written by Leisureguy

22 September 2021 at 5:20 pm

All the Biomass of Earth, in One Graphic

leave a comment »

At Visual Capitalist, Iman Ghosh writes about an infographic created by Mark Belan:

All the Biomass of Earth, in One Graphic

Our planet supports approximately 8.7 million species, of which over a quarter live in water.

But humans can have a hard time comprehending numbers this big, so it can be difficult to really appreciate the breadth of this incredible diversity of life on Earth.

In order to fully grasp this scale, we draw from research by Bar-On et al. to break down the total composition of the living world, in terms of its biomass, and where we fit into this picture.

Why Carbon?

A “carbon-based life form” might sound like something out of science fiction, but that’s what we and all other living things are.

Carbon is used in complex molecules and compounds—making it an essential part of our biology. That’s why biomass, or the mass of organisms, is typically measured in terms of carbon makeup.

In our visualization, one cube represents 1 million metric tons of carbon, and every thousand of these cubes is equal to 1 Gigaton (Gt C).

Here’s how the numbers stack up in terms of biomass of life on Earth: . . .

There’s more, and the full infographic is at the link (click to open it in a new tab, and click again to enlarge the image).

Written by Leisureguy

22 September 2021 at 10:26 am

Where Do Species Come From?

leave a comment »

During the most recent ice age, glaciers divided an ancestral population of crows; one group became all-black carrion crows, the other hooded crows with gray breasts and bodies. Illustrations by François-Nicolas Martinet / Alamy

Ben Crair has a very interesting article in the New Yorker, which begins:

The evolutionary biologist Jochen Wolf was working from home when we first spoke, in April, 2020. Germany was under lockdown, and his lab, at Ludwig Maximilian University, in Munich, had been closed for weeks. Still, a reminder of his research had followed him from the office. “I have a crow nest right in front of me,” Wolf said, from his rooftop terrace. The nest was well hidden at the top of a tall spruce tree. Through the branches, Wolf could see a female crow sitting on her eggs.

Over the years, Wolf had climbed many similar trees to gather genetic material from crow nests. He had also collected samples from falconers whose goshawks hunt the birds. By comparing the genomes of European crows, Wolf wanted to bring fresh data to one of biology’s oldest and most intractable debates. Scientists have named more than a million different species, but they still argue over how any given species evolves into another and do not even agree on what, exactly, a “species” is. “I have just been comparing definitions of species,” Charles Darwin wrote to a friend, three years before he would publish “On the Origin of Species,” in 1859. “It is really laughable to see what different ideas are prominent in various naturalists’ minds.” To an extent, the same holds true today. It is difficult to find a definition of “species” that works for organisms as different as goshawks and spruce trees. Similarly, it can be hard to draw a line between organisms among whom there are only small differences, such as the goshawks in North America, Europe, and Siberia. Are they separate species, subspecies, or simply locally adapted populations of a single type?

Darwin thought that the blurriness of species boundaries was a clue that the living world was not a divine creation but actually changing over time. He encouraged biologists to treat species as “merely artificial combinations made for convenience,” which would never map perfectly onto nature. “We shall at least be freed from the vain search for the undiscovered and undiscoverable essence of the term species,” he wrote. His imprecision, however, did not sit well with all of his successors. One of the most influential evolutionary biologists of the twentieth century, a German-born ornithologist named Ernst Mayr, attacked Darwin for failing “to solve the problem indicated by the title of his work.” Darwin had shown how natural selection honed a species to its niche, but he’d “never seriously attempted a rigorous analysis of the problem of the multiplication of species, of the splitting of one species into two,” Mayr wrote, in 1963. Mayr, who spent much of his career at Harvard, called speciation “the most important single event in evolution,” and proposed reproductive isolation as an “objective yardstick” for understanding it: individuals of a sexually reproducing species could procreate with one another but not with individuals of other species.

For decades, Mayr’s arguments dominated evolutionary thought. But consensus was crumbling by the two-thousands, when Wolf confronted the species problem. Wolf had learned Mayr’s “biological species concept” as a student, but he’d also discovered dozens of competing species concepts with alternative criteria, such as an animal’s form, ecology, evolutionary history, and ability to recognize potential mates. (Philosophers had joined the debate, too, with head-scratching questions about the ontological status of a species.) “The more you looked into it, the more confused you got,” Wolf said. Mayr had written that “the process of speciation could not be understood until after the nature of species and of geographic variation had been clarified.” But, in time, Wolf had come to believe the opposite: the nature of species could not be understood until the process of speciation—the ebb and flow of genetic differences between populations, and the evolution of reproductive isolation—had been clarified. . . 

Continue reading.

Later in the article:

. . . Beginning in the nineteen-eighties, Tautz had spent his career sequencing DNA, focussing on only a few hundred base pairs at a time. He was looking to see whether DNA might solve the species puzzle. Decades earlier, Mayr had argued that reproductive isolation can only develop in geographic isolation, after an impassable physical barrier, such as a mountain range or a river, divides a population in two; without migration the two populations would evolve into different species that could remain separate even when the barrier dried up or crumbled. This model, which Mayr called allopatric, or other-place, speciation, became the textbook standard of speciation, even though plenty of organisms appeared to have evolved without a geographic barrier. Some African lakes, for example, contain hundreds of species of colorful fish called cichlids; it was hard to imagine each species evolving in isolation, but Mayr and other mid-century leaders of evolutionary biology were dismissive of alternative ideas. (“These species have come into contact only after they had evolved,” Mayr wrote, of the fish.) For Tautz, the question was not whether allopatric speciation was valid—everyone agreed it was—but whether it was the only way species could diversify. “The allopatric paradigm was based on a few facts, a lot of faith, and on paradigmatic despots ruling the field,” he wrote.

In the early nineties, one of Tautz’s students, Ulrich Schliewen, brought him samples of cichlids he had collected from two small crater lakes in Cameroon. There were no barriers in either crater lake, and also no fertile hybrid forms. Either each species had evolved somewhere else and invaded the lake before its source population went extinct, as Mayr claimed, or they had all evolved together in their lake, through a process of sympatric, or same-place, speciation. Schliewen and Tautz sequenced a short snippet of mitochondrial DNA from each fish. By comparing the sequences, they could work their way backward to calculate the time of the organisms’ most recent common ancestor. The results indicated that the two ancestors of the twenty different species had lived and diversified within each lake. It was solid evidence of sympatric speciation. . .
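
The back-calculation rests on molecular-clock reasoning: if substitutions accumulate at a roughly constant per-site rate along each lineage, the proportion of differing sites between two sequences, divided by twice that rate, gives the time to their common ancestor. Here is a minimal sketch; the counts and the rate are invented for illustration and are not the values from Schliewen and Tautz's analysis.

```python
# Minimal molecular-clock sketch (illustrative numbers, not the paper's data).
# Assumption: substitutions accumulate at rate r per site per million years on
# each lineage, so two lineages diverge at 2r and t = d / (2r).

def divergence_time_myr(differences, sequence_length, rate_per_site_per_myr):
    d = differences / sequence_length           # proportion of differing sites
    return d / (2 * rate_per_site_per_myr)      # time in millions of years

# Hypothetical example: 12 differences across a 400-base mitochondrial snippet,
# assuming ~1% sequence change per lineage per million years.
t = divergence_time_myr(differences=12, sequence_length=400,
                        rate_per_site_per_myr=0.01)
print(f"estimated time to the most recent common ancestor: ~{t:.1f} million years")
```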

And later:

Konrad Lorenz, a Nobel Prize-winning biologist from Austria, formalized the study of animal behavior, or ethology, in the middle of the twentieth century. Lorenz’s most famous insight came when he hatched and raised a clutch of goslings. Not only did the young birds treat him as their mother, but, when they matured, they sought out human mates. Lorenz realized the geese had no innate sexual preference; rather, they learned to recognize appropriate mates by imprinting on their parents in the first days of their lives.

Crows and most other birds also imprint. Wolf has come to think that this might be the mechanism of their speciation. If carrion crows imprint on carrion crows and hooded crows imprint on hooded crows, then the occasional hybrid, who looks like neither, will be disadvantaged when it comes to finding mates. This process of “social marginalization,” Wolf believes, may be enough to create effective reproductive isolation, even though the birds’ genomes are perfectly compatible. (In genetic terms, the crows’ imprinting ends up putting divergent selection pressure on chromosome eighteen and nowhere else, allowing the rest of the genome to homogenize.) He now wants to find out just how long this mechanism has been operating. Crow bones are common in ancient human trash piles—“eating crow” used to be more than just a figure of speech—and, by sequencing the DNA from these bones, Wolf hopes to determine whether cavemen ate hooded crows in Eastern Europe and carrion crows in Western Europe, reconstructing the birds’ speciation continuum with data points stretching back thousands of years.

The entire article is worth reading.

Written by Leisureguy

21 September 2021 at 5:52 pm

Giant Viruses and the Tree of Life

leave a comment »

Patrick Forterre writes in Inference:

SCIENTISTS HAVE ALWAYS thought viruses much smaller than bacteria. And with good reason. Most bacteriophages are 100 times smaller than the bacteria that they infect. Bacteria can be viewed under an optical microscope; but an electron microscope is required in order to see a viral particle. When giant viruses were discovered in 2003, they came as a surprise. The giant mimivirus, for example, had actually been discovered in 1992, but misidentified as a bacterium—Bradfordcoccus.1 The confusion was understandable. Mimivirus particles are 750 nanometers long—easily visible with an optical microscope; and, what is more, the dye used to reveal bacterial cell walls also stained mimivirus particles.

A number of other monster viruses have been discovered in the last decade.2 Most of them have been isolated and described by Didier Raoult and Jean-Michel Claverie in Marseille. If Marseille is now the Mecca of giant virus research, Vancouver is something of a mini Mecca. It is there that Curtis Suttle and his team isolated and described both Cafeteria roenbergensis and Bodo saltans.3 Most giant viruses observed in the laboratory have been studied in amoebae,4 but giant viruses are found in extraordinarily diverse terrestrial and aquatic environments.5 Some infect algae, and there is some suspicion that the mimivirus infects human cells as well.6 All giant viruses infect eukaryotes.

Viruses closely related to the mimivirus have been grouped into the family Mimiviridae. The other giant viruses have been classified into three families: Molliviridae, Pandoraviridae, and Pithoviridae.

Mimiviridae and Molliviridae produce virions, or viral particles, with a characteristically icosahedral shape. Pandoraviridae and Pithoviridae produce strange ovoid particles that have often been confused for intracellular protists.7 One of the most unusual of the giant viruses is a member of Mimiviridae. Discovered in Brazil, the Tupanvirus contains a virion featuring a gigantic head and an equally gigantic membranous tail. Such a shape is without precedent in the viral world.

Giant viruses contain linear or circular double-stranded DNA that encodes 500 to 2,500 proteins. The Pandoravirus encodes 2,000 genes, only about a tenth as many as a human cell, and, at roughly 2.5 million base pairs, its genome is the largest of any known virus. The mimivirus genome encodes about half that number. Produced by a Pithovirus, the largest known virion is an ovoid particle with a length of 1.5 micrometers and a width of 0.5 micrometers. The size of a virion and the size of its genome are not necessarily correlated. Neither is a good guide to the threshold beyond which a virus counts as giant.8

Five years after giant viruses were discovered, researchers learned that giant viruses can themselves become infected by smaller viruses.9 The virophages that infect them have genomes that code for only about twenty genes. These virophages, unable to infect amoebae by themselves, are transported inside amoebae by their giant virus hosts.10 Once inside, the virophages transcribe and replicate their genes using the machinery of the giant virus, the giant virus then using the amoeba’s machinery to transcribe and replicate its own genes.11 The three known virophages—Mavirus, Sputnik, and Zamilon—happen to infect members of the Mimiviridae family, but virophages targeting other giant viruses are likely to be identified.

Frankenlike

THE DISCOVERY OF giant viruses and their virophages immediately reopened an old question: are viruses alive? Viruses had been excluded from the tree of life because they lacked the machinery needed either to reproduce or to synthesize proteins. A virus must hijack a cell before it can do either. But when scientists realized that viruses are more complex than originally presumed—encoding several thousand genes and becoming infected by other viruses—they began to suspect that viruses might be alive after all. When a virophage infects a Mimiviridae, it seems to become ill, its virions manifesting an abnormal morphology.

How can something be ill if it is not alive?12

Viruses had been excluded from living systems for another reason. They did not seem to share proteins that are universal across the three cellular domains: Archaea, Bacteria, and Eukarya. Yet many giant viruses do encode universal proteins, including RNA polymerase, some aminoacyl tRNA synthetases, and a few proteins involved in protein synthesis or DNA replication. Some phylogenetic analysts now place giant viruses in a fourth monophyletic group somewhere between Archaea and Eukarya.13 For all that, the fact remains that giant viruses lack the capacity to synthesize their own proteins without parasitizing a cell. Purificación López-García and David Moreira have thus disputed the phylogenetic analysis behind this placement, arguing that the giant viruses are nothing more than genetic pickpockets, their genes acquired from a cellular origin in yet another triumph of theft over honest toil.14

Chantal Abergel and Claverie have also argued for the cellular origin of viral genes. But they have noticed, in addition, that most of the genes that giant viruses encode lack homologues in both modern cellular organisms and giant viruses from other families. Giant viruses, they suggest, might have arisen by regressive evolution—features lost instead of gained—from cellular lineages that diverged from modern cellular organisms before the advent of the last universal common ancestor of Archaea, Bacteria, and Eukarya. Claverie predicts that, as new giants are discovered, the distinction between viruses and cells will blur even further.15

Virus, Virion, and Virocell

WHEN IN DOUBT, define. The existence of giant viruses prompted virologists to search for a definition that could encompass the whole range of viruses, from the smallest, with genomes encoding two genes, to the largest, encoding thousands. All viruses produce virions—viral particles consisting of a core of nucleic acid surrounded by a protein shell called the capsid.16 It is the capsid that distinguishes viruses from other mobile genetic elements, such as plasmids. The smallest virus and the smallest plasmid both have one gene coding for a replication protein. The virus has an additional gene that codes for a capsid.17

All virions have at least one capsid. For this reason, Raoult and I initially suggested defining viruses as capsid-encoding organisms.18 Some small virions are formed by one or more DNA- or RNA-binding proteins; others, by several capsid proteins, with a lipid membrane inside or outside the shell. The virions of giant viruses are elaborate structures involving hundreds of proteins and a lipid membrane that is often decorated with polysaccharide extensions. Virions and viruses are not the same thing. Confusion between the two is pervasive. The confusion is easy to understand. Virions can be easily isolated, they are infectious, and they can be photographed.

But they are not viruses.

Claverie was the first to emphasize the distinction.19 Within the cytoplasm of an infected cell, the mimivirus produces a large compartment called a viral factory, where the viral DNA, while being transcribed and replicated, is shielded from the cell’s defense mechanisms. Many RNA and DNA viruses produce viral factories.20 But in the mimivirus, the factory is huge—the size of the infected amoeba’s nucleus. Claverie suggested that the viral factory is the actual virus, and that virions are the equivalent of the spores or gametes of cellular organisms.21

After Claverie published this argument, I observed that bacterial and archaeal viruses do not produce an isolated viral factory inside the cytoplasm of the infected cell: they transform the entire cell into a factory.22 I suggested calling the infected cell a virocell.23 Adopting Claverie’s idea, I argued that . . .

Continue reading. There’s more.

Written by Leisureguy

14 September 2021 at 1:07 pm

Posted in Evolution, Science

The fungal mind: on the evidence for mushroom intelligence

leave a comment »

Fleecy milk-cap fungus (Lactifluus / Lactarius vellereus) on the forest floor in beech woodland in autumn, France, November

Nicholas P Money, professor of biology and Western Program director at Miami University in Oxford, Ohio, whose latest book is Nature Fast and Nature Slow: How Life Works from Fractions of a Second to Billions of Years (2021), writes in Aeon:

Mushrooms and other kinds of fungi are often associated with witchcraft and are the subjects of longstanding superstitions. Witches dance inside fairy rings of mushrooms according to German folklore, while a French fable warns that anyone foolish enough to step inside these ‘sorcerer’s rings’ will be cursed by enormous toads with bulging eyes. These impressions come from the poisonous and psychoactive peculiarities of some species, as well as the overnight appearance of toadstool ring-formations.

Given the magical reputation of the fungi, claiming that they might be conscious is dangerous territory for a credentialled scientist. But in recent years, a body of remarkable experiments has shown that fungi operate as individuals, engage in decision-making, are capable of learning, and possess short-term memory. These findings highlight the spectacular sensitivity of such ‘simple’ organisms, and situate the human version of the mind within a spectrum of consciousness that might well span the entire natural world.

Before we explore the evidence for fungal intelligence, we need to consider the slippery vocabulary of cognitive science. Consciousness implies awareness, evidence of which might be expressed in an organism’s responsiveness or sensitivity to its surroundings. There is an implicit hierarchy here, with consciousness present in a smaller subset of species, while sensitivity applies to every living thing. Until recently, most philosophers and scientists awarded consciousness to big-brained animals and excluded other forms of life from this honour. The problem with this favouritism, as the cognitive psychologist Arthur Reber has pointed out, is that it’s impossible to identify a threshold level of awareness or responsiveness that separates conscious animals from the unconscious. We can escape this dilemma, however, once we allow ourselves to identify different versions of consciousness across a continuum of species, from apes to amoebas. That’s not to imply that all organisms possess rich emotional lives and are capable of thinking, although fungi do appear to express the biological rudiments of these faculties.

Just what are mushrooms? It turns out that this question doesn’t have a simple answer. Mushrooms are the reproductive organs produced by fungi that spend most of their lives below ground in the form of microscopic filaments called hyphae. These hyphae, in turn, branch to form colonies called mycelia. Mycelia spread out in three dimensions within soil and leaf litter, absorbing water and feeding on roots, wood, and the bodies of dead insects and other animals. Each of the hyphae in a mycelium is a tube filled with pressurised fluid, and extends at its tip. The materials that power this elongation are conveyed in little packages called vesicles, whose motion is guided along an interior system of rails by proteins that operate as motors. The speed and direction of hyphal extension, as well as the positions of branch formation, are determined by patterns of vesicle delivery. This growth mechanism responds, second by second, to changes in temperature, water availability, and other opportunities and hardships imposed by the surrounding environment.

Hyphae can detect ridges on surfaces, grow around obstacles, and deploy a patch-and-repair system if they’re damaged. These actions draw upon an array of protein sensors and signalling pathways that link the external physical or chemical inputs to cellular response. The electrical activity of the cell is also sensitive to changes in the environment. Oscillations in the voltage across the hyphal membrane have been likened to nerve impulses in animals, but their function in fungi is poorly understood. Hyphae react to confinement too, altering their growth rate, becoming narrower and branching less frequently. The fungus adapts to the texture of the soil and the anatomy of plant and animal tissues as it pushes ahead and forages for food. It’s not thinking in the sense that a brained animal thinks – but the fundamental mechanisms that allow a hypha to process information are the same as those at work in our bodies.

We tend to associate consciousness and intelligence with the appearance of wilfulness or intentionality – that is, decision-making that results in a particular behavioural outcome. Whether or not humans have free will, we take actions that seem wilful: she finished her coffee, whereas her friend left her cup half full. Fungi express simpler versions of individualistic behaviour all the time. Patterns of branch formation are a good example of their seemingly idiosyncratic nature. Every young fungal colony assumes a unique shape, because the precise timing and positions of branch emergence from a hypha vary. This variation isn’t due to genetic differences, since identical clones from a single parent fungus still create colonies with unique shapes. Although the overall form is highly predictable, its detailed geometry is usually irreproducible. Each mycelium is like a snowflake, with a shape that arises at one place and time in the Universe.

Fungi also show evidence of learning and memory. Working with fungi isolated from grassland soil, German mycologists measured the effect of temperature changes on the growth of mycelia. When heated up quickly for a few hours, the mycelia stopped growing. When the temperature was reduced again, they bounced back from the episode by forming a series of smaller colonies from different spots across the original mycelium.

Meanwhile, a different set of mycelia were exposed to a mild temperature stress before the application of a more severe temperature shock. Colonies that had been ‘primed’ in this way resumed normal growth very swiftly after the severe stress, and continued their smooth expansion, rather than recovering here and there in the form of smaller colonies. This outcome suggests that they . . .

Continue reading. There’s more.

The article concludes:

. . . Fungal expressions of consciousness are certainly very simple. But they do align with an emerging consensus that, while the human mind might be particular in its refinements, it’s typical in its cellular mechanisms. Experiments on fungal consciousness are exciting for mycologists because they’ve made space for the study of behaviour within the broader field of research on the biology of fungi. Those who study animal behaviour do so without reference to the molecular interactions of their muscles; likewise, mycologists can learn a great deal about fungi simply by paying closer attention to what they do. As crucial players in the ecology of the planet, these fascinating organisms deserve our full attention as genuine partners in sustaining a functional biosphere.

Written by Leisureguy

3 September 2021 at 2:38 pm

The Complex Truth About ‘Junk DNA’

leave a comment »

Jake Buehler writes in Quanta:

Imagine the human genome as a string stretching out for the length of a football field, with all the genes that encode proteins clustered at the end near your feet. Take two big steps forward; all the protein information is now behind you.

The human genome has three billion base pairs in its DNA, but only about 2% of them encode proteins. The rest seems like pointless bloat, a profusion of sequence duplications and genomic dead ends often labeled “junk DNA.” This stunningly thriftless allocation of genetic material isn’t limited to humans: Even many bacteria seem to devote 20% of their genome to noncoding filler.
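
The football-field image is easy to verify with back-of-the-envelope arithmetic; the step length below is an assumption, not something from the article.

```python
# Quick check of the football-field analogy (assumed step length, illustrative).
field_length_m = 91.44                 # 100 yards in meters
coding_fraction = 0.02                 # ~2% of the genome encodes proteins
coding_stretch_m = field_length_m * coding_fraction
big_step_m = 0.9                       # assumed length of one "big step"

print(f"protein-coding stretch: ~{coding_stretch_m:.1f} m")                 # ~1.8 m
print(f"roughly {coding_stretch_m / big_step_m:.0f} big steps of walking")  # ~2
```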

Many mysteries still surround the issue of what noncoding DNA is, and whether it really is worthless junk or something more. Portions of it, at least, have turned out to be vitally important biologically. But even beyond the question of its functionality (or lack of it), researchers are beginning to appreciate how noncoding DNA can be a genetic resource for cells and a nursery where new genes can evolve.

“Slowly, slowly, slowly, the terminology of ‘junk DNA’ [has] started to die,” said Cristina Sisu, a geneticist at Brunel University London.

Scientists casually referred to “junk DNA” as far back as the 1960s, but they took up the term more formally in 1972, when the geneticist and evolutionary biologist Susumu Ohno used it to argue that large genomes would inevitably harbor sequences, passively accumulated over many millennia, that did not encode any proteins. Soon thereafter, researchers acquired hard evidence of how plentiful this junk is in genomes, how varied its origins are, and how much of it is transcribed into RNA despite lacking the blueprints for proteins.

Technological advances in sequencing, particularly in the past two decades, have done a lot to shift how scientists think about noncoding DNA and RNA, Sisu said. Although these noncoding sequences don’t carry protein information, they are sometimes shaped by evolution to different ends. As a result, the functions of the various classes of “junk” — insofar as they have functions — are getting clearer.

Cells use some of their noncoding DNA to create a diverse menagerie of RNA molecules that regulate or assist with protein production in various ways. The catalog of these molecules keeps expanding, with small nuclear RNAs, microRNAs, small interfering RNAs, and many more. Some are short segments, typically less than two dozen base pairs long, while others are an order of magnitude longer. Some exist as double strands or fold back on themselves in hairpin loops. But all of them can bind selectively to a target, such as a messenger RNA transcript, to either promote or inhibit its translation into protein.

These RNAs can have substantial effects on an organism’s well-being. Experimental shutdowns of certain microRNAs in mice, for instance, have induced disorders ranging from tremors to liver dysfunction.

By far the biggest category of noncoding DNA in the genomes of humans and many other organisms consists of transposons, segments of DNA that can change their location within a genome. These “jumping genes” have a propensity to make many copies of themselves — sometimes hundreds of thousands — throughout the genome, says Seth Cheetham, a geneticist at the University of Queensland in Australia. Most prolific are the retrotransposons, which spread efficiently by making RNA copies of themselves that convert back into DNA at another place in the genome. About half of the human genome is made up of transposons; in some maize plants, that figure climbs to about 90%.

Noncoding DNA also shows up within the genes of humans and other eukaryotes (organisms with complex cells) in the intron sequences that interrupt the protein-encoding exon sequences. When genes are transcribed, the exon RNA gets spliced together into mRNAs, while much of the intron RNA is discarded. But some of the intron RNA can get turned into small RNAs that are involved in protein production. Why eukaryotes have introns is an open question, but researchers suspect that introns help accelerate gene evolution by making it easier for exons to be reshuffled into new combinations.

A large and variable portion of the noncoding DNA in genomes consists of . . .

Continue reading.

Written by Leisureguy

1 September 2021 at 2:52 pm

Posted in Evolution, Science

What Slime Knows

leave a comment »

Photographs by Alison Pollack

Lacy M. Johnson has an article in Orion illustrated with stunning photos. The article’s subtitle is “There is no hierarchy in the web of life,” and it begins:

IT IS SPRING IN HOUSTON, which means that each day the temperature rises and so does the humidity. The bricks of my house sweat. In my yard the damp air condenses on the leaves of the crepe myrtle tree; a shower falls from the branches with the slightest breeze. The dampness has darkened the flower bed, and from the black mulch has emerged what looks like a pile of snotty scrambled eggs in a shade of shocking, bilious yellow. As if someone sneezed on their way to the front door, but what came out was mustard and marshmallow.

I recognize this curious specimen as the aethalial state of Fuligo septica, more commonly known as “dog vomit slime mold.” Despite its name, it’s not actually a mold—not any type of fungus at all—but rather a myxomycete (pronounced MIX-oh-my-seat), a small, understudied class of creatures that occasionally appear in yards and gardens as strange, Technicolor blobs. Like fungi, myxomycetes begin their lives as spores, but when a myxomycete spore germinates and cracks open, a microscopic amoeba slithers out. The amoeba bends and extends one edge of its cell to pull itself along, occasionally consuming bacteria and yeast and algae, occasionally dividing to clone and multiply itself. If saturated with water, the amoeba can grow a kind of tail that whips around to propel itself; on dry land the tail retracts and disappears. When the amoeba encounters another amoeba with whom it is genetically compatible, the two fuse, joining chromosomes and nuclei, and the newly fused nucleus begins dividing and redividing as the creature oozes along the forest floor, or on the underside of decaying logs, or between damp leaves, hunting its microscopic prey, drawing each morsel inside its gooey plasmodium, growing ever larger, until at the end of its life, it transforms into an aethalia, a “fruiting body” that might be spongelike in some species, or like a hardened calcium deposit in others, or, as with Stemonitis axifera, grows into hundreds of delicate rust-colored stalks. As it transitions into this irreversible state, the normally unicellular myxomycete divides itself into countless spores, which it releases to be carried elsewhere by the wind, and if conditions are favorable, some of them will germinate and the cycle will begin again.

From a taxonomical perspective, the Fuligo septica currently “fruiting” in my front yard belongs to the Physaraceae family, among the order of Physarales, in class Myxogastria, a taxonomic group that contains fewer than a thousand individual species. These creatures exist on every continent and almost everywhere people have looked for them: from Antarctica, where Calomyxa metallica forms iridescent beads, to the Sonoran Desert, where Didymium eremophilum clings to the skeletons of decaying saguaro cacti; from high in the Spanish Pyrenees, where Collaria chionophila fruit in the receding edge of melting snowbanks, to the forests of Singapore, where the aethalia of Arcyria denudata gather on the bark of decaying wood, like tufts of fresh cotton candy.

Although many species are intensely colored . . .

Continue reading. There’s much more.

Later in the article:

It is a single cell that can grow as large as a bath mat, has no brain, no sense of sight or smell, but can solve mazes, learn patterns, keep time, and pass down the wisdom of generations.

Written by Leisureguy

25 August 2021 at 12:32 pm

Genetic patterns offer clues to evolution of homosexuality

leave a comment »

Conservatives — particularly, I believe, evangelicals — think that being homosexual is a choice rather than a genetically determined preference (thus “pray the gay away,” as though the object of prayer can change his/her decision, with, of course, the help of God Almighty, who works night and day to obey all those prayers). See, for example, But I’m a Cheerleader on Amazon Prime Video. Deciding not to be homosexual is akin to deciding not to have, say, brown eyes: it’s not something a person decides.

Sara Reardon in Nature discusses the genetic differences among homosexuals, those who are promiscuous, and the monogamously inclined:

To evolutionary biologists, the genetics of homosexuality seems like a paradox. In theory, humans and other animals who are exclusively attracted to others of the same sex should be unlikely to produce many biological children, so any genes that predispose people to homosexuality would rarely be passed on to future generations. Yet same-sex attraction is widespread in humans, and research suggests that it is partly genetic.

In a study of data from hundreds of thousands of people, researchers have now identified genetic patterns that could be associated with homosexual behaviour, and shown how these might also help people to find different-sex mates and reproduce. The authors say their findings, published on 23 August in Nature Human Behaviour1, could help to explain why genes that predispose people to homosexuality continue to be passed down. But other scientists question whether these data can provide definitive conclusions.

Evolutionary geneticist Brendan Zietsch at the University of Queensland in Brisbane, Australia, and his colleagues used data from the UK Biobank, the US National Longitudinal Study of Adolescent to Adult Health and the company 23andMe, based in Sunnyvale, California, which sequence genomes and use questionnaires to collect information from their participants. The team analysed the genomes of 477,522 people who said they had had sex at least once with someone of the same sex, then compared these genomes with those of 358,426 people who said they’d only had heterosexual sex. The study looked only at biological sex, not gender, and excluded participants whose gender and sex did not match.

In earlier research, the researchers had found that people who’d had at least one same-sex partner tended to share patterns of small genetic differences scattered throughout the genome2. None of these variations seemed to greatly affect sexual behaviour on its own, backing up previous research that has found no sign of a ‘gay gene’. But the collection of variants seemed to have a small effect overall, explaining between 8% and 25% of heritability.
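
The idea of many individually negligible variants adding up to a modest overall signal is usually captured with a polygenic score: each variant's small effect, weighted by how many copies a person carries, is summed across the genome. The sketch below is generic and uses invented effect sizes; it is not the study's actual pipeline.

```python
import random

# Generic polygenic-score sketch (invented numbers, not the study's pipeline).
# Each variant has a tiny effect; a person's score sums effect * allele count.

n_variants = 500
effect_sizes = [random.gauss(0, 0.01) for _ in range(n_variants)]   # tiny effects

def polygenic_score(genotype, effects):
    """genotype[i] is 0, 1, or 2 copies of the effect allele at variant i."""
    return sum(copies * effect for copies, effect in zip(genotype, effects))

person = [random.choice([0, 1, 2]) for _ in range(n_variants)]
print(f"score aggregated from {n_variants} small effects: "
      f"{polygenic_score(person, effect_sizes):+.3f}")
```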

Next, the researchers used a computer algorithm to simulate human evolution over 60 generations. They found that the array of genetic variations associated with same-sex behaviour would have eventually disappeared, unless it somehow helped people to survive or reproduce.
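
A toy version of such a simulation conveys the intuition, though it is only a sketch under simple Wright-Fisher-style assumptions and not the algorithm the authors used: a variant that slightly reduces expected reproduction dwindles over a few dozen generations unless it carries a compensating benefit.

```python
import random

# Toy Wright-Fisher-style sketch (not the study's algorithm): track the frequency
# of a variant whose carriers have relative fitness 1 - cost + benefit.

def simulate(freq=0.05, pop_size=1000, generations=60, cost=0.05, benefit=0.0):
    w = 1.0 - cost + benefit
    for _ in range(generations):
        # Expected frequency after selection, then binomial sampling (drift).
        p = freq * w / (freq * w + (1.0 - freq))
        carriers = sum(random.random() < p for _ in range(pop_size))
        freq = carriers / pop_size
        if freq == 0.0:
            break
    return freq

print("no reproductive benefit:   ", simulate(benefit=0.0))   # usually dwindles toward 0
print("with an offsetting benefit:", simulate(benefit=0.05))  # tends to persist
```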

Overlapping genes

Zietsch and his team decided to test whether these genetic patterns might provide an evolutionary edge by increasing a person’s number of sexual partners. They sorted the participants who had only had heterosexual sex by the number of partners they said they had had, and found that those with numerous partners tended to share some of the markers that the team had found in people who had had a same-sex partner.

The researchers also found that people who’d had same-sex encounters shared genetic markers with people who described themselves as risk-taking and open to new experiences. And there was a small overlap between heterosexual people who had genes linked to same-sex behaviour and those whom interviewers rated as physically attractive. Zietsch suggests that traits such as charisma and sex drive could also share genes that overlap with same-sex behaviour, but he says that those traits were not included in the data, so “we’re just guessing”.

The authors acknowledge . . .

Continue reading.

Written by Leisureguy

24 August 2021 at 2:59 pm

Bad news: New studies hint that the coronavirus may be evolving to become more airborne

leave a comment »

Masks become more important at a time when a segment of the population becomes more resistant to wearing them (a segment that presumably also refuses to wear seatbelts).

Tina Hesman Saey reports in Science News:

Small aerosol particles spewed while people breathe, talk and sing may contain more coronavirus than larger moisture droplets do. And the coronavirus may be evolving to spread more easily through the air, a new study suggests. But there is also good news: Masks can help.

About 85 percent of coronavirus RNA detected in COVID-19 patients’ breath was found in fine aerosol particles less than five micrometers in size, researchers in Singapore report August 6 in Clinical Infectious Diseases. The finding is the latest evidence to suggest that COVID-19 is spread mainly through the air in fine droplets that may stay suspended for hours rather than in larger droplets that quickly fall to the ground and contaminate surfaces.

Similar to that result, Donald Milton at the University of Maryland in College Park and colleagues found that people who carried the alpha variant had 18 times as much viral RNA in aerosols as people infected with less-contagious versions of the virus. That study, posted August 13 at medRxiv.org, has not yet been peer reviewed. It also found that loose-fitting masks could cut the amount of virus-carrying aerosols by nearly half.

In one experiment, the Maryland team grew the virus from the air samples in the lab. That could be evidence that may convince some reluctant experts to embrace the idea that the virus spreads mainly through the air.

The debate over aerosol transmission has been ongoing since nearly the beginning of the COVID-19 pandemic. Last year, 200 scientists wrote a letter to the World Health Organization asking for the organization to acknowledge aerosol spread of the virus (SN: 7/7/20). In April, the WHO upgraded its information on transmission to include aerosols (SN: 5/18/21). The U.S. Centers for Disease Control and Prevention had acknowledged aerosols as the most likely source of spread just a few weeks before.

Previous studies in monkeys have also suggested that more virus ends up in aerosols than in large droplets. But some experts say that direct evidence that the virus spreads mainly through the air is still lacking.

“There’s lots of indirect evidence that the airborne route — breathing it in — is dominant,” says Linsey Marr, a civil and environmental engineer at Virginia Tech in Blacksburg, who studies viruses in the air. She was one of the 200 scientists who wrote to the WHO last year. “‘Airborne’ is a loaded word in infection control circles,” she says, requiring health care workers to isolate patients in special rooms, wear protective equipment and take other costly and resource-intensive measures to stop the spread of the disease. For those reasons, infection control experts have been reluctant to call the coronavirus airborne without especially strong proof.

Most COVID-19 cases have been among . . .

Continue reading. There’s more.

Viral variants that are more successful at infecting people will prevail over those that are less infectious: basic evolution.

Written by Leisureguy

22 August 2021 at 10:42 am
