Later On

A blog written for those whose interests more or less match mine.

Archive for the ‘History’ Category

The case of Norman Douglas: When pederasts are accepted and even lionized

Rachel Hope Cleves, a historian and professor at the University of Victoria, British Columbia, has an interesting and lengthy extract from her book Unspeakable: A Life Beyond Sexual Morality (2020) in Aeon. Let me quote the conclusion:

. . . Popular toleration of pederasty, in Italy and elsewhere, took the form of wilful ignorance. As the American literary theorist Eve Kosofsky Sedgwick pointed out in Epistemology of the Closet (1990), ignorance is not a singular ‘maw of darkness’ but a multiple phenomenon. Ignorance can entail intentional not-knowing, making the closet a performance of silence as a speech act. The Australian anthropologist Michael Taussig called communal expressions of wilful ignorance ‘public secrets’ that rested on ‘active not-knowing’. The experiences of the German photographer Wilhelm von Gloeden demonstrate how such a public secret, or active not-knowing, operated. Gloeden lived in Taormina, in Sicily, from 1878 to his death in 1931. During his decades of residence, he photographed generations of boys, frequently posing them naked in Hellenic ruins, adorned with laurel crowns and other symbols of ancient Greece. Gloeden’s photographs were popular with many early gay activists, including Symonds. The people of Taormina, who benefitted from the tourist trade that Gloeden’s photography brought to their town, also liked him. Gloeden and other foreign men often paid local youths for sexual encounters, an open secret in the community. Locals silenced any journalists, priests and politicians who attempted to criticise Gloeden, since they felt that these criticisms dishonoured the community and threatened their economic wellbeing. As Mario Bolognari, a historian of Taormina, concluded in 2017: ‘having chosen not to see does not imply being blind. It only means having decided that it was preferable not to notice certain things.’

Active not-knowing happens at the intimate level as well as the communal level. Families engage in active not-knowing about sexual wrongdoing in the home. This applies not only to child sexual abuse, but to all sorts of misbehaviours, including adultery, sibling incest and domestic violence. The motivations for active not-knowing are various, ranging from love and loyalty for the offender, to fear of retribution, to a desire to shield the family from public shame. Active not-knowing applies to more than sexual misbehaviour, and extends beyond the family. Friends exercise active not-knowing on behalf of friends, not wanting to risk meaningful relationships. Fans of artists engage in active not-knowing about their idols, motivated by awe and admiration, or by a desire to protect a favourite artwork from scrutiny and rejection. And disempowered people engage in active not-knowing about the powerful, from fear of the consequences that might result from confronting the truth, or from appreciation for the benefits that accrue from maintaining ignorance. Lastly, everyone benefits from silence by avoiding being implicated themselves in the bad thing that they know about.

Many of these ways of not-knowing helped Douglas escape condemnation. Some members of his extended family disowned him because of the abusive way he treated his wife, who was his first cousin and thus their relation as well. But his sons, who witnessed firsthand his sexual encounters with children (and might even, in the case of his older son, have experienced abuse) maintained loyalty to their father and defended him from posthumous accusations. Some writer friends wrote off Douglas after his arrests, but many loved his books and maintained a deliberate ignorance about what actually happened between Douglas and the boys and girls he recounted meeting in the pages of his travel books. The children themselves knew the most about Douglas’s sexual predations, but they had the most to gain financially – and often emotionally – from keeping close to him. There’s almost no evidence of children speaking out against Douglas either during their connections or afterwards, as adults. One exception is a 16-year-old whose complaint led to Douglas’s initial arrest in London in 1916.

The lack of panic about paedophilia during Douglas’s lifetime made it easier for all these people to look the other way, even when he flaunted his predilections. Douglas went so far as to write about how he’d purchased children for sex in his memoir, Looking Back (1933). Very few reviewers took issue with the material, at least until after Douglas’s death, when, freed from the fear of a libel suit, they pointed out how unseemly it was for Douglas to have admitted to such behaviour. The author and former politician Harold Nicolson complained that he was ‘shocked by people who, when past the age of 70, openly avow indulgences which they ought to conceal’. In the eyes of reviewers who wanted to maintain the pretence of active not-knowing, Douglas’s admission might have been a worse crime than the acts themselves, since they implicated the readers by forcing them into a state of knowing.

If Douglas escaped condemnation during his lifetime, he couldn’t escape the assault on his reputation following the intensification of anti-paedophilic sentiment after his death. The shift in public mores during the 1980s towards viewing paedophiles as monsters made it impossible to defend Douglas. He disappeared from literary memory, except as an example of historical villainy – the role he plays in two novels published after the 1980s, Francis King’s The Ant Colony (1991) and Alex Preston’s In Love and War (2014). Most readers would consider that a salutary change and welcome the expulsion of paedophiles from acceptable society. However, the rise of the ‘monster’ discourse doesn’t seem to have made people much more willing to speak out against child sexual abuse in the present.

Looking at the example of Epstein, one can see the same old dynamics of active not-knowing operating among the leadership of the MIT Media Lab (who accepted donations from Epstein) and the scholars who turned a blind eye to his abuse, even after his conviction. The Media Lab didn’t want to lose Epstein’s financial patronage or be shamed by association. Individual scholars might have enjoyed his company (and the company of the girls and young women Epstein surrounded himself with), or they might have wanted funding from him, or feared the consequences to their careers if they spoke out against him. In an even more striking parallel to Douglas, Matzneff wrote and spoke openly about his paedophilia without censure, protected by fellow writers’ and publishers’ unwillingness to disturb the dense network of literary connections in which they all played a role, until one of his victims of abuse, the French publisher Vanessa Springora, broke the silence in 2019.

Is it possible that elevating the paedophile to the status of a monster has in fact, rather than making it easier to speak out against child abuse, made it more imperative for friends, family members and fans to engage in active not-knowing? Who wants to expose someone they love as a monster? More than that, people are inclined to disbelieve tales of extraordinary monstrosity. Who wants to disturb their own situation by making such explosive allegations? The stakes are too high to risk getting it wrong. Maybe it would be easier to counter the problem of child sexual abuse if we were able to acknowledge it as both bad and ordinary. In Douglas’s day, such sex was seen as questionable but mundane. Today, it’s seen as terrible but exceptional. If we could create a world where people agreed that sex between adults and children was not healthy for children, and that many ordinary adults engaged in such behaviour nonetheless, maybe more people would feel empowered to witness and speak out against everyday abuse.

This sort of wilful ignorance that accompanies acceptance is (as I fairly frequently mention) discussed in Daniel Goleman’s interesting book Vital Lies, Simple Truths.

This is also related to what is happening in France, where the acceptability of sexual harassment and rape, particularly by men in positions of power, is losing ground fairly rapidly. See Norimitsu Onishi’s NY Times article “Powerful Men Fall, One After Another, in France’s Delayed #MeToo.” (And the articles to which that report links are worth reading as well.) From the report:

. . . Since the beginning of the year, a series of powerful men from some of France’s most prominent fields — politics, sports, the news media, academia and the arts — have faced direct and public accusations of sexual abuse in a reversal from years of mostly silence. At the same time, confronted with these high-profile cases and a shift in public opinion, French lawmakers are hurrying to set 15 as the age of sexual consent — only three years after rejecting such a law.

The recent accusations have not only led to official investigations, the loss of positions for some men and outright banishment from public life for others. They have also resulted in a rethinking of French masculinity and of the archetype of Frenchmen as irresistible seducers — as part of a broader questioning of many aspects of French society and amid a conservative backlash against ideas on gender, race and postcolonialism supposedly imported from American universities.

. . . Ms. Haas said that France was going through a delayed reaction to #MeToo after a “maturation” period during which many French began to understand the social dimensions behind sexual violence and the concept of consent.

That was especially so, Ms. Haas said, after the testimony in the past year of Adèle Haenel, the first high-profile actress to speak out over abuse, and of Vanessa Springora, whose memoir, “Consent,” documented her abuse by the pedophile writer Gabriel Matzneff.

“The start of 2021 has been a sort of aftershock,” Ms. Haas said. “What’s very clear is that, today in France, we don’t at all have the same reaction that we did four, five years ago to testimonies of sexual violence against well-known people.”

Last month, Pierre Ménès, one of France’s most famous television sports journalists, was suspended indefinitely by his employer after the release of a documentary that exposed sexism in sports journalism, “I’m Not a Slut, I’m a Journalist.”

Just a few years ago, few criticized him for behavior that they now don’t dare defend in public, including forcibly kissing women on the mouth on television and, in front of a studio audience in 2016, lifting the skirt of a female journalist — Marie Portolano, the producer of the documentary.

“The world’s changed, it’s #MeToo, you can’t do anything anymore, you can’t say anything anymore,” Mr. Ménès said in a television interview after the documentary’s release. He said he didn’t remember the skirt incident, adding that he hadn’t been feeling like himself at the time because of a physical illness. . .

There’s more.

Written by LeisureGuy

9 April 2021 at 12:12 pm

The Rite of Spring, by Igor Stravinsky — An animated account of its debut

Written by LeisureGuy

8 April 2021 at 7:57 pm

How an Abstinence Pledge in the ’90s Shamed a Generation of Evangelicals

Clyde Haberman reports in the NY Times:

To the uninitiated, Christianity’s evangelical movement can seem like a monolith that brooks no dissent on certain core issues: Same-sex relationships are sinful, men’s spiritual dominance over women is divinely ordained and, on the political front, Donald J. Trump was an improbable but nonetheless valued protector of the faith.

Not everything is what it appears to be. The movement is in fact rife with division, a reality reinforced last month when Beth Moore, an evangelical writer and teacher with a huge following, formally ended her long affiliation with the Southern Baptist Convention, principally because of its tight embrace of the licentious, truth-challenged Mr. Trump.

It was a rupture several years in the making. As Ms. Moore told Religion News Service, disenchantment took hold when Mr. Trump became “the banner, the poster child for the great white hope of evangelicalism, the salvation of the church in America.” But the former president’s behavior is not the only issue buffeting the evangelical movement. White supremacy, male subjugation of women, a spate of sexual abuse cases, scandals involving prominent figures like Jerry Falwell Jr. — all have combined to undermine the authority of religious leaders and prompt members like Ms. Moore to abandon the Southern Baptist Convention.

Retro Report, which examines through video how the past shapes the present, turns attention to an artifact of religious conservatism from the movement. This is the so-called purity pledge, taken in the main by teenagers who pledged to abstain from sex until they married. Some swore to not so much as kiss another person or even go on a date, for fear of putting themselves on the road to moral failure.

Devotion to this concept took hold in the early ’90s, when fear of AIDS and other sexually transmitted diseases bolstered the evangelical movement’s gospel of teen abstinence. It was a view put forth as God-commanded and had the support of like-minded political leaders, from the White House of Ronald Reagan to that of Mr. Trump.

Many people certainly found lifelong contentment because of having waited for the right mate. But for others, as the Retro Report video shows, the dictates of the purity movement were so emotionally onerous that their adulthoods have been filled with apprehension and, in some instances, physical pain. They are people like Linda Kay Klein, who embraced the movement in her teens but left it in disenchantment at 21, two decades ago.

She described the trauma and the shame she felt this way: “I would find myself in tears and in a ball in the corner of a bed, crying, my eczema coming out, which it does when I’m stressed, and scratching myself till I bled, and having a deep shame reaction.” Ms. Klein found she was far from alone. She collected tales of enduring anxiety in a book, “Pure: Inside the Evangelical Movement That Shamed a Generation of Young Women and How I Broke Free” (Touchstone, 2018). “We went to war with ourselves, our own bodies and our own sexual natures,” she wrote, “all under the strict commandment of the church.”

It was under the aegis of the Southern Baptist Convention that the vow of virginity took distinct form, in True Love Waits, a program begun in 1993. As the movement grew in the ’90s, estimates of teenage adherents reached as high as 2.5 million worldwide. Youngsters wore purity rings, signed purity pledge cards and attended purity balls, with girls dressed in white and escorted by their fathers.

The fundamental message, inspired by a verse from Paul the Apostle’s First Epistle to the Thessalonians, was this: “I am making a commitment to myself, my family and my Creator that I will abstain from sexual activity of any kind before marriage. I will keep my body and my thoughts pure as I trust in God’s perfect plan for my life.”

Separate from religious imperatives, American teenagers in general have become warier of premarital relations — and certainly of unprotected sex. According to the federal government, there were 61.8 births in 1991 for every 1,000 young women in the 15-to-19 age group. By 2018, that figure had dwindled to 17.4, a decline that cut across racial and ethnic lines.

Among those who regarded purity in terms of spiritual enlightenment, few in the ’90s came to be more celebrated than Joshua Harris, a young man who preached that even sex-free dating was a dangerous first step on the slippery slope of a compromised life. His 1997 book “I Kissed Dating Goodbye” sold roughly a million copies. In his writings and speeches, Mr. Harris advocated courtship under the watchful eyes of a couple’s parents.

His message back then, he recalled for Retro Report, was that one should avoid conventional dating just as an alcoholic ought to steer clear of a bar. “It was, like, if you don’t want to have sex,” he said, “then don’t get into these sorts of short-term romantic relationships where there is an expectation to become intimate.”

Controlling teenage hormones, however, is easier said than done. Mr. Harris, who lives in Vancouver, eventually pulled his book from circulation, and has apologized for the role he played in causing anyone feelings of shame, fear and guilt. Today, he no longer considers himself a Christian.

Part of the problem for some critics of the movement is . . .

Continue reading.

Written by LeisureGuy

8 April 2021 at 7:49 pm

The Salmon Sushi Conspiracy

A very interesting story (and debunking):

Written by LeisureGuy

8 April 2021 at 1:02 pm

Go Beyond the Grocery Store With These Seven Innovative Spice Companies

Reina Gattuso writes in Gastro Obscura:

In 2016, Sana Javeri Kadri found herself at a crossroads. After moving from her hometown of Mumbai to California, she wanted to learn more about the historical forces shaping her own identity and experience as a queer woman of color in the United States. A food photographer, Javeri Kadri turned to culinary history to better understand the history of global empire. For more than a century before the British crown officially made India a colony, the British East India Company—a private corporation that had a monopoly over much of the South Asian spice trade—ruled the subcontinent.

Spice trading, Javeri Kadri realized, hasn’t changed much from its colonial roots. Often, the people growing spices are disconnected from the global marketplace by middlemen, who take the lion’s share of the profits. In 2017, following a series of sourcing trips to spice farms in India, Javeri Kadri founded Diaspora Co., a small spice company that directly sells seasonings from South Asian farms to U.S. and global consumers.

Diaspora Co. is one of a number of small companies bent on challenging the colonial legacy of the spice trade. In contrast to large spice companies, some of which have dominated the industry for hundreds of years, these endeavors tend to work directly with local farmers and are owned by people grounded in the cultural and culinary contexts of the spices they sell.

According to Greg Prang, co-founder of Culinary Culture Connections, which partners with South American Indigenous groups and nonprofits to import their products to the U.S., equitable spice sourcing should go beyond a “fair trade” label. It should focus on building relationships with producers and supporting their autonomy over traditional cultural and culinary practices.

“Fair trade is kind of a front for big corporations to say they’re doing something in respect of sustainability,” he says. Prang speaks from experience. He was trained as an anthropologist and worked in consumer research for multinational food companies for years. When corporations talked about leveraging fair trade branding for profit, “I remember laughing and saying, ‘If you don’t believe it, don’t do it.’”

Prang, Javeri Kadri, and others on this list believe in the importance of equitable sourcing—and they sell some tasty spices, too.

Diaspora Co.

Oakland, California

Besides bringing fresh spices to customers, Diaspora Co. states an intention to “redistribute power away from solely the trader and instead empower its farmers, laborers, and the earth,” according to their website. Today, the company directly sources more than a dozen spices from 12 farmers across six Indian states and Sri Lanka, many of whom use organic and regenerative farming methods.

Favorite offerings include sannam chillies; Sri Lankan kandyan cloves that taste like “pine, butterscotch, henna, and allspice”; and the masala dabba, a handmade brass version of the spice box ubiquitous in South Asian kitchens. The company also has a recipe blog and often weighs in on political issues—including a message of solidarity to the current farmers’ movement in India.

Loisa

New York, New York

In July 2020, politically progressive lovers of Latin American food were left with a dilemma. Robert Unanue, the CEO of Goya Foods—the largest producer of Latin American ingredients in the United States—had praised President Trump, despite the President’s record of racist rhetoric and policies targeting the Latinx community. Many boycotters, wondering where to get beloved seasonings, turned to Latinx-owned spice company Loisa.

Founded in 2017 by Kenny Luna and Scott Hattis, and co-owned by food activist Yadira Garcia, Loisa is named for the Spanglish moniker for the Lower East Side. Its two signature products, both organic, are sazón, a classic mix of cumin, coriander, garlic, oregano, and black pepper, and adobo, which is garlic, turmeric, black pepper, and oregano. The company also sells sofrito and rice and bean mixes. Loisa’s site offers vegan and vegetarian recipes for favorite Latin American dishes, and donates 2 percent of its monthly profits to community-based organizations in the greater New York City area.

Fly By Jing

Chengdu, China

Jing Gao’s spice company began out of a suitcase. As a young, European-raised Chinese chef exploring her roots in Chengdu, Gao began serving pop-up dinners out of her home kitchen. These dinner parties grew into a roving global series, with Gao lugging bags full of Chinese spices wherever she travelled. In 2018, she decided to turn the suitcase spice hustle into a full-fledged business. Gao’s first Kickstarter became the highest-funded craft food project in the site’s history, and Fly By Jing was born.

“I was completely blown away by the reception,” Gao writes via email. “It showed me that people were ready and excited to embrace these flavors.” Gao named the company after Chengdu’s “fly” restaurants, hole-in-the-wall joints so tasty that diners flock to them like flies. She also affixed her given name, Jing, to the company title, rather than Jenny, the name she’d gone by for most of her life.

The company’s first product is still its signature: Sichuan Chili Crisp, a spicy, savory sauce that will leave your mouth tingling. The company has expanded with a handful of other offerings, such as doubanjiang, aged fava bean paste, and zhong dumpling sauce, made of soy sauce, garlic, mushrooms, and spices. The company is also one of the few U.S. importers of Tribute Pepper, a mouth-numbing, citrusy chili once given to emperors as tribute.

Culinary Culture Connections

Bellevue, Washington

Greg Prang seems, at first, an unlikely founder of a company that imports small-scale, Indigenous-produced Brazilian spices. For years, . . .

Continue reading. There are more companies listed.

Written by LeisureGuy

6 April 2021 at 12:18 pm

A Tale of Two Tongues: English and Esperanto

Stephanie Tam writes in The Believer:

I. THE ISLANDER 

Ever since Orlando Raola was a boy, he harbored a curiosity that stretched across the seas. Growing up in Havana, Cuba, in the 1960s, he perused the encyclopedia sets of his elementary school and pressed his ear to his shortwave radio to listen to programs on Radio Sweden. Always, he wondered what lay beyond the horizon.

“Having been born on an island, and being an islander by nature, I always had this great curiosity: What is beyond the sea?” Orlando told me. “What is the world out there? I understood early that the only way to communicate with humans is through language, and I was interested in many different cultures.”

Of all the cultures out there, he developed a special fascination with those of the European Nordic countries, compelled by exotic visions of snow-capped mountains and blue-eyed Swedes. In 1981 he joined the Swedish Institute, a public agency devoted to promoting interest in Sweden around the world. Eventually, he decided to learn the language, and the institute shipped him a package containing dictionaries, cassette tapes, reading material, and textbooks.

As he sifted through the contents of the box, he felt overwhelmed. His heart sank as he realized the magnitude of time and effort it would require for him to master Swedish. He would study for years and years—and then what? He would be able to speak to a small sliver of the world. True, he found Swedish culture fascinating. But he was also curious about the cultures of Japan, Hungary, and China.

“Do I have time to learn all of these languages?” he asked himself. “No, there won’t be time.” Sitting amid the piles of books and cassettes, he realized something. What he longed for was not just any language, but a universal language: one that would connect him not just to one people, but to the whole of humanity.

“That day,” he recounted with a slight smile, “that’s the day I became an Esperantist.”

II. THE DREAM OF A UNIVERSAL LANGUAGE

The dream of a universal language traces back millennia. One of our oldest stories about the origins of linguistic difference, the tale of Babel, is recounted in Genesis, the first book of the Torah. In it, men seek to build a tower that reaches the heavens: a rebellion of cosmic dimensions. To stop them, God scatters them into many nations and tongues across the earth. At its heart, Babel is an origin story about human miscommunication—language as a symbol for that which divides us. [1]

The history of universal languages tracks what its inventors believed divided humanity throughout the centuries. In the thirteenth century, the Catalan mystic and poet Ramon Llull developed a language that he believed would convert “infidels” to God’s truth. In his book Ars Magna, he designed a system of disks that could be rotated to combine theological concepts and generate 1,680 logical propositions by which the enterprising missionary might transcend linguistic barriers. (His eventual death at the hands of the Saracens suggests that the infidels felt otherwise.)

During the Enlightenment, the German philosopher and mathematician Gottfried Wilhelm Leibniz also attempted to create a logical language that transcended words. He planned to create a universal language out of symbols and equations that could not only perfectly mirror the mechanics of human intelligence but also calculate new knowledge and resolve disputes, which has led some to believe that his philosophy of mind and language anticipated artificial intelligence. “This language will be the greatest instrument of reason,” he wrote in The Art of Discovery in 1685. “….When there are disputes among persons, we can simply say: Let us calculate, without further ado, and see who is right.”

Each effort to create a language intelligible to the whole of humanity was informed by its creator’s understanding of what could allow or impair communication—conversion, heathenism; rationality, irrationality—and a desire to solve the problems that proliferated among our “natural” languages. In other words, language has always evolved as both a bridge and a barrier.

III. THE OPHTHALMOLOGIST AND THE EDUCATOR

One hundred years before Orlando Raola despaired in front of his box from the Swedish Institute, a young ophthalmologist by the name of Ludovik Lazarus Zamenhof looked with anguish at the rampant anti-Semitism ravaging his hometown of Białystok. Born as a Jew in the Russian Empire in 1859, Zamenhof was acutely aware of the forces that threatened to tear apart the fabric of his society—rising nationalism, ethnic divisions, the formation of nation-states—and that would eventually draw Europe into the first of two world wars.

Zamenhof had grown up believing that all people were part of the same human family, but when he looked around his neighborhood he saw only tribes divided by language. “In Białystok the inhabitants were divided into four distinct elements: Russians, Poles, Germans and Jews; each of these spoke their own language and looked on all the others as enemies,” he recalled. “….the diversity of languages is the first, or at least the most influential, basis for the separation of the human family into groups of enemies.”

In his teens, Zamenhof began work on a language that could serve as a bridge for all cultures. His creation would eventually become known as Esperanto, the world’s most successful “constructed” language. Zamenhof wanted his international language to be easy to learn, so he created a simplified grammar consisting of sixteen rules. There are no gendered nouns—no feminine moon or masculine sun, as is the case in French. Each word ending indicates its part of speech: all adjectives end in a, all nouns in o, all adverbs in e. For instance, Eŭropo (Europe) is the noun; Eŭropa (European) is the adjective. To make a noun plural, one simply adds j to the end of the word; there is also an accusative case, in which words end in n (Eŭropon). That’s about all the rules when it comes to nouns.

Unlike in English, verbs do not change for person or number, and there is only one ending, -as, for verbs in the present indicative: for example, mi estas (I am), vi estas (you are), li/ŝi/ĝi estas (he/she/it is). Verbs do conjugate for present (-as), past (-is), and future (-os) tenses, unlike Chinese and Indonesian, which rely mostly on context. The spelling is phonetic, with each letter corresponding to a single sound—in contrast to many natural languages, which often disappear consonants from words as their pronunciation evolves, like poignant and Worcester in English.

As a universal language, Esperanto was intended to be unaffiliated with any particular nationality or ethnicity. Zamenhof compiled nine hundred root words primarily from Indo-European languages: German, English, French, Italian, Spanish, and Russian. These could in turn be used to create new words, in a compound structure similar to those of languages like Chinese and Turkish. The word for steamship, for example, is vaporŝipo = vapor (steam) + ŝip (ship) + o (noun ending). In this way, vocabulary can be built up from the base of root words with suffixes and affixes: for instance, the verb manĝi (to eat) + the suffix –aĵo (a thing) = manĝaĵo (food). A truly “neutral” language was beyond this well-intentioned polyglot creator (Zamenhof learned nearly a dozen languages over the course of his life), given his European origins and influences; its phonology is essentially Slavic, and its vocabulary derives primarily from Romance languages. But Zamenhof succeeded in creating a language that was simple to pick up. [2] One study among Francophone children found Esperanto an average of ten times faster to learn than English, Italian, or German.
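
Because the grammar described above is so regular, the word-building can be shown mechanically. The following is a minimal, illustrative Python sketch of that morphology, using only the endings and examples given in the article (the bare root spellings such as “Eŭrop” and “est”, and the simple concatenation rule, are assumptions of the sketch rather than a complete account of Esperanto word formation):

```python
# Toy illustration of Esperanto's regular morphology, based on the endings
# and examples described in the article. Deliberately minimal: real
# Esperanto has more affixes and conventions than shown here.

NOUN, ADJ, ADV = "o", "a", "e"            # part-of-speech endings
PLURAL, ACCUSATIVE = "j", "n"             # added after the noun ending
TENSES = {"present": "as", "past": "is", "future": "os"}

def noun(root, plural=False, accusative=False):
    """Build a noun: root + o (+ j for plural) (+ n for accusative)."""
    word = root + NOUN
    if plural:
        word += PLURAL
    if accusative:
        word += ACCUSATIVE
    return word

def verb(root, tense="present"):
    """Conjugate a verb root; the ending depends only on tense, never on person."""
    return root + TENSES[tense]

def compound(*roots):
    """Join roots into a compound noun, e.g. vapor + ŝip -> vaporŝipo."""
    return "".join(roots) + NOUN

if __name__ == "__main__":
    print(noun("Eŭrop"))                     # Eŭropo    (Europe)
    print(noun("Eŭrop", accusative=True))    # Eŭropon   (accusative case)
    print(verb("est"), verb("est", "past"))  # estas estis (am/is, was)
    print(compound("vapor", "ŝip"))          # vaporŝipo (steamship)
    print(compound("manĝ", "aĵ"))            # manĝaĵo   (food, from manĝi + -aĵo)
```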

In 1887, Zamenhof published his language manifesto in a Russian-language pamphlet under the pseudonym Doktoro Esperanto (“Doctor Hopeful”). He referred to his creation simply as the “lingvo internacia” (“international language”). Eventually, though, it came to be known by the name—or, in this case, pseudonym—of its inventor: Esperanto.

Behind Esperanto’s humble linguistic LEGO blocks lay a vast vision. “La interna ideo de Esperanto…,” Zamenhof declared in 1912, “estas: sur neŭtrala lingva fundamento forigi la murojn inter la gentoj…” The core idea of the language was a neutral linguistic foundation to facilitate communication between peoples: in other words, it was intended to create world peace through mutual understanding. The idea was not for Esperanto to supplant natural languages, but to work alongside them as an auxiliary language to bridge nations. The global establishment of this “interna ideo” would be the “fina venko”—the final victory—and the undoing of Babel.

As for Doktoro Esperanto himself, he ceded its evolution to the public, inviting others to take the language into their own hands: “From this day the future of the international language is no longer more in my hands than in the hands of any other friend of this sacred idea. We must now work together in equality… Let us work and hope!”

Even before Zamenhof set to work on Esperanto, the foundation was being laid for a different sort of world language. In 1835, Thomas Babington Macaulay delivered a treatise on Indian education that would have lasting repercussions for the spread of the English language in the British Empire. Macaulay had witnessed the struggles of a small number of British administrators to govern a massive local population. As chairman of the East India Company’s Committee of Public Instruction, he emphasized the need for his fellow colonialists to “form a class who may be interpreters between us and the millions whom we govern—a class of persons Indian in blood and colour, but English in tastes, in opinions, in morals and in intellect.” He supported his argument with glowing praise of the English language and an equally flamboyant savaging of Sanskrit literature:

It is, I believe, no exaggeration to say that all the historical information which has been collected from all the books written in the Sanscrit [sic] language is less valuable than what may be found in the most paltry abridgments used at preparatory schools in England. In every branch of physical or moral philosophy, the relative position of the two nations is nearly the same…. The claims of our own language it is hardly necessary to recapitulate. It stands pre-eminent even among the languages of the West.

For Macaulay, the English language was a way to inject Englishness into the minds and hearts of colonial subjects. Like Zamenhof, he had a vision for language, but it was not of bridging ethnic divisions; it was of building empire. In 1820, the Prussian philosopher and linguist Wilhelm von Humboldt had articulated a view of language as the activity that shaped an individual’s and a nation’s Weltansichten: “The diversity of languages is not a diversity of sounds and signs but a diversity of the views of the world.” However, this was no diversity of equals. Humboldt, like most of his European contemporaries, believed . . .

Continue reading. There’s much more.

See also on this page the section “Language Learning with Esperanto” for more information on the language and resources for learning it.

Written by LeisureGuy

4 April 2021 at 10:35 am

The radical aristocrat who put kindness on a scientific footing

Lydia Syson has an interesting article in Psyche, which begins:

Five years had passed since Czar Alexander II promised the emancipation of the serfs. Trusting in a map drawn on bark with the point of a knife by a Tungus hunter, three Russian scientists set out to explore an area of trackless mountain wilderness stretching across eastern Siberia. Their mission was to find a direct passage between the gold mines of the river Lena and Transbaikalia. Their discoveries would transform understanding of the geography of northern Asia, opening up the route eventually followed by the Trans-Manchurian Railway. For one explorer, now better known as an anarchist than a scientist, this expedition was also the start of a long journey towards a new articulation of evolution and the strongest possible argument for a social revolution.

Prince Peter Kropotkin, the aristocratic graduate of an elite Russian military academy, travelled in 1866 with his zoologist friend Ivan Poliakov and a topographer called Maskinski. Boat and horseback took them to the Tikono-Zadonsk gold mine. From there, they continued with 10 Cossacks, 50 horses carrying three months’ supply of food, and an old Yukaghir nomad guide who’d made the journey 20 years earlier.

Kropotkin and Poliakov – enthusiastic, curious and well-read young men in their 20s – were fired by the prospect of finding evidence of that defining factor of evolution set out by Charles Darwin in On the Origin of Species (1859): competition. They were disappointed. As Kropotkin later wrote:

We saw plenty of adaptations for struggling, very often in common, against the adverse circumstances of climate, or against various enemies, and Polyakoff wrote many a good page upon the mutual dependency of carnivores, ruminants, and rodents in their geographical distribution; we witnessed numbers of facts of mutual support … [but] facts of real competition and struggle between higher animals of the same species came very seldom under my notice, though I eagerly searched for them.

Kropotkin pursued this contradiction for decades. Observation and wide reading convinced him that what he’d seen in Siberia was no exception, but a rule. In the 1860s, he watched a vast exodus of fallow deer gather in their thousands to cross the river Amur at its narrowest point to escape an early snowfall. In 1882, he was fascinated by a crab stuck on its back in a tank in Brighton Aquarium; it was painstakingly rescued by a band of comrades. Kropotkin collected descriptions from all over the world of the sociable behaviours of ants, bees, termites, falcons, swallows, horned larks, migrating birds, gazelles, buffalo, colonies of beavers, squirrels, mice, flocks of seals, herds of wild horses, tribes of dogs, wolf packs, marmots, rats, chinchillas, as well as apes and monkeys. He wrote that:

[A]s we ascend the scale of evolution, we see association growing more and more conscious. It loses its purely physical character, it ceases to be simply instinctive, it becomes reasoned.

It proved impossible for Kropotkin, a man ‘amiable to the point of saintliness’ according to George Bernard Shaw, to dedicate himself entirely to the ‘highest joys’ of scientific discovery, when all around him he saw ‘nothing but misery and struggle for a mouldy bit of bread’, as he put it in his Memoirs of a Revolutionist (1899). In 1872, in Switzerland, he became an anarchist, impressed by the egalitarian fraternity he found among the watchmakers of Jura. Back in Russia, he joined the revolutionary Circle of Tchaikovsky, disseminating underground literature and lecturing to the workers of St Petersburg disguised as Borodin the peasant agitator. His propaganda landed him in prison, but he escaped in 1876 with the help of comrades. By 1883, he was a political prisoner once again, this time in France. This second confinement gave him time to develop his arguments about evolution: he started to address systematically the conflicting interpretations of Darwin emerging in different parts of the world.

In England, the biologist, anthropologist and anatomist Thomas Huxley had quickly emerged as ‘Darwin’s bulldog’. Self-described as sharp of ‘claws and beak’, Huxley was prepared to ‘go to the Stake if requisite’ to defend evolutionary doctrine. His views on human nature and political economy were defined by Thomas Hobbes and Thomas Robert Malthus: life was an endless fight for scarce resources. The libertarian Herbert Spencer likewise applied natural selection to economics, using his infamous coinage the ‘survival of the fittest’ to justify laissez-faire capitalism. Popularly labelled ‘social Darwinism’, this view became gospel for Gilded Age industrialists such as John D Rockefeller. Although Huxley himself didn’t recommend the ‘survival of the fittest’ rule as a basis for morality – quite the reverse – he certainly believed that human beings were brutal and competitive, their sociability merely a recent veneer, rationalised by self-interest.

After Huxley published his pessimistic essay ‘The Struggle for Existence and Its Bearing Upon Man’ (1888) in The Nineteenth Century, an influential Victorian monthly review, Kropotkin was in a good position to launch an attack on Huxley’s idea of nature as a ‘gladiator’s show’. By this time, having been released from prison following an international outcry, Kropotkin was established in England, becoming quite a celebrity in the socialist and anarchist circles that blossomed through the mid-1880s. He promoted his political ideas in the international Left-wing press, and cofounded a London-based journal called Freedom, but made a living writing for scientific periodicals.

Between 1890 and 1915, in a series of interdisciplinary essays, Kropotkin drew on biology, sociology, history, (anti-racist) ethnology and anthropology to argue that species can organise and cooperate to overcome the natural environment and ensure their future survival. In 1902, the first eight essays were brought together in a book entitled Mutual Aid: A Factor of Evolution, an account of mutual support in action across the animal world (from microorganisms to mammals), ancient and modern ‘barbarian’ and ‘savage’ societies, medieval city-states and, finally, among modern humanity.

Kropotkin sought to recover an uncorrupted Darwin, whose metaphors should not be read too literally. But his call to understand compassion as ‘a powerful factor of further evolution’ cleared the way for a very particular political vision: human beings could overcome competitive struggle by voluntarily restructuring and decentralising society along principles of community and self-sufficiency.

Kropotkin became enamoured with mutual aid after reading an 1880 lecture on the subject by the celebrated zoologist Karl Kessler. Like other Russian naturalists at the time, Kessler didn’t deny the struggle for existence, but his own fieldwork in harsh and sparsely populated regions of the Russian empire strongly suggested that ‘the progressive development of the animal kingdom, and especially of mankind, is favoured much more by mutual support than by mutual struggle’. But, as Kropotkin mourned: ‘like so many good things published in the Russian tongue only, that remarkable address remains almost entirely unknown’.

Neither was Kropotkin alone politically. The historian of science Eric Johnson has recently demonstrated that . . .

Continue reading.

Written by LeisureGuy

3 April 2021 at 3:15 pm

Ending the Pandemic and Vaccine Resistance: Modern Questions, Long History

Johns Hopkins School of Public Health interviews Graham Mooney, PhD, associate professor of the History of Medicine at the Johns Hopkins School of Medicine, and Jeremy Greene, MD, PhD, MA, director of Johns Hopkins’ Institute of the History of Medicine.

How do pandemics end?

Jeremy Greene: This question is often left to a relatively optimistic popular imagination that epidemics end with eradication—either [a virus] burns its way through a community and just ends through some sort of natural process, or it is blocked through successful containment strategies and the ability to actually get the reproduction quotient down.

But only a few epidemics in human history have been eradicated through intentional means. And so oftentimes when we tell stories about epidemics ending, what we’re really talking about is the point at which we stopped focusing on them. And that is located in place and in social position.

What did we learn from the 1918 Influenza Pandemic?

JG: The 1918 pandemic is thought to have gone through three major waves. But many historians who have revisited the epidemiology suggest that there are a higher number of deaths from flu and flu-like illnesses that happened in 1919 and 1920. And it may well be that there was a fourth wave and a fifth wave, and they just weren’t perceived. So even pinpointing exactly when the 1918 pandemic ends, it’s easier to pinpoint the moment in which we stopped attending to it as a pandemic than the moment in which there was an absolute freedom from this particular biological scourge throughout the world.

What have we learned from HIV/AIDS?

JG: The HIV/AIDS pandemic, which clearly was understood as a new, emergent threat of infectious disease of global importance when it was first detected in the early 1980s, was described in terms very similar to the way that COVID-19 is described: a new, lethal, frightening infectious agent of significant epidemic spread.

So at what point did the AIDS epidemic stop being an epidemic in the popular imagination? It still kills millions of people each year. We have not cured it. We don’t have a vaccine for it. But it’s become something that we have learned to live with, such that when people talk about AIDS in epidemic terms, they often talk in past tense.

What about polio?

JG: The same could be said even for polio, which we’ve had much more success in developing a vaccine with. But if you look at the question of when the polio epidemics ended, the real question is, for whom, and where?

My colleague Dora Vargha points out that many people still live with the complications of polio, and there are still polio epidemics breaking out in certain parts of the world. So to talk about the polio epidemic in the past tense is not actually historically true. Polio epidemics continue.

When we think about what this means for COVID-19, the real question is, what will happen when enough people are vaccinated within countries like the United States that the attention begins to shift away from calling a pandemic a pandemic, even though it’s still ravaging many parts of the world? And we don’t have a good answer to that question yet.

Can history help explain vaccine resistance?

Graham Mooney: One of the biological ways a pandemic ends is . . .

Continue reading.

Written by LeisureGuy

2 April 2021 at 10:14 am

America’s Immigration Amnesia: Despite recurrent claims of crisis at the border, the United States still does not have a coherent immigration policy

Caitlin Dickerson writes in the Atlantic:

In the early 2000s, Border Patrol agents in the Rio Grande Valley of South Texas were accustomed to encountering a few hundred children attempting to cross the American border alone each month. Some hoped to sneak into the country unnoticed; others readily presented themselves to officials in order to request asylum. The agents would transport the children, who were exhausted, dehydrated, and sometimes injured, to Border Patrol stations and book them into austere concrete holding cells. The facilities are notoriously cold, so agents would hand the children Mylar blankets to keep warm until federal workers could deliver them to child-welfare authorities.

But starting in 2012, the number of children arriving at the border crept up, first to about 1,000 a month, then 2,000, then 5,000. By the summer of 2014, federal officials were processing more than 8,000 children a month in that region alone, cramming them into the same cells that had previously held only a few dozen at a time, and that were not meant to hold children at all.

As the stations filled, the Obama administration scrambled to find a solution. The law required that the children be moved away from the border within 72 hours and placed in the custody of the Department of Health and Human Services, so they could be housed safely and comfortably until they were released to adults willing to sponsor them. But HHS facilities were also overflowing. The department signed new contracts for “emergency-influx shelters,” growing its capacity by thousands of beds within a matter of months. Government workers pulled 100-hour weeks to coordinate logistics. And then, seemingly overnight, border crossings began to drop precipitously. No one knew exactly why.

“The numbers are unpredictable,” Mark Weber, an HHS spokesperson, told me in 2016, just as another child-migration surge was beginning to crest. “We don’t know why a bunch of kids decided to come in 2014, or why they stopped coming in 2015. The thing we do know is these kids are trying to escape violence, gangs, economic instability. That’s a common theme. The numbers have changed over the years, but the themes stayed the same.”

The cycle repeated itself under President Donald Trump in 2019, and is doing so again now. And as border crossings rise and the government rushes to open new emergency-influx shelters, some lawmakers and pundits are declaring that the Biden administration is responsible for the surge. “The #BidenBorderCrisis was caused by the message sent by his campaign & by the measures taken in the early days of his new administration,” Marco Rubio tweeted last week. The administration is “luring children to the border with the promise of letting them in,” Joe Scarborough, the Republican congressman turned cable-television host, told millions of viewers during a recent segment.

But for decades, most immigration experts have viewed border crossings not in terms of surges, but in terms of cycles that are affected by an array of factors. These include the cartels’ trafficking business, weather, and religious holidays as well as American politics—but perhaps most of all conditions in the children’s home countries. A 2014 Congressional Research Service report found that young people’s “motives for migrating to the United States are often multifaceted and difficult to measure analytically,” and that “while the impacts of actual and perceived U.S. immigration policies have been widely debated, it remains unclear if, and how, specific immigration policies have motivated children to migrate to the United States.”

The report pointed out that special protections for children put into place under the Trafficking Victims Protection Reauthorization Act of 2008 may have shifted migration patterns by encouraging parents to send their children alone rather than travel as a family. But it found that blaming any one administration for a rise in border crossings ultimately made no sense—the United States has offered some form of protection to people fleeing persecution since the 1940s, and those rights were expanded more than 40 years ago under the Refugee Act of 1980.

This is not to say that President Joe Biden’s stance on immigration—which has thus far been to discourage foreigners from crossing the border while also declaring that those who do so anyway will be treated humanely—has had no effect on the current trend. Like other business owners, professional human traffickers, known as coyotes, rely on marketing—and federal intelligence suggests that perceived windows of opportunity have been responsible for some of their most profitable years.

For example, border crossings rose in the months before President Trump took office in part because coyotes encouraged people to hurry into the United States before the start of the crackdown that Trump had promised during his campaign. With Trump out of office, some prospective migrants likely feel impelled to seek refuge now, before another election could restore his policies.

But placing blame for the recent increase in border crossings entirely on the current administration’s policies ignores the reality that the federal government has held more children in custody in the past than it is holding right now, and that border crossings have soared and then dropped many times over the decades, seemingly irrespective of who is president.

Given, then, that the movement of unaccompanied minors has long ebbed and flowed—we are now experiencing the fourth so-called surge over the course of three administrations—why do border facilities still appear overwhelmed? The answer, in part, is . . .

Continue reading.

Written by LeisureGuy

30 March 2021 at 1:36 pm

Three groundbreaking journalists saw the Vietnam War differently. It’s no coincidence they were women.

Cambodian Prime Minister Long Boret, center, meets with war correspondent Elizabeth Becker in Cambodia in 1974. (Elizabeth Becker)

Margaret Sullivan writes in the Washington Post:

Frances FitzGerald paid her own way into Vietnam. She was an “on spec” reporter with no editor to guide her, no office to support her, and no promise that anyone would publish what she wrote about the war.

She knocked out her first article on a blue Olivetti portable typewriter she had carried from New York and mailed it the cheap and slow way from a post office in the heart of Saigon’s French quarter to the Village Voice, nearly 9,000 miles away.

It arrived, and on April 21, 1966, the Voice published FitzGerald’s indictment of the chaotic U.S. war policy.

“The result was a highly original piece written in the style of an outsider, someone who asked different questions and admitted when she didn’t have answers,” wrote Elizabeth Becker in her new book, “You Don’t Belong Here: How Three Women Rewrote the Story of War,” which celebrates the work of FitzGerald, Kate Webb and Catherine Leroy.

Becker, a former war correspondent in Cambodia toward the end of the decades-long conflict, wrote about these women in part because she had experienced much of what they did — just a little later, and with appreciation for the paths they’d broken.

“I went through it at the tail end, and they were my role models,” Becker told me last week. She admired them because they had broken gender barriers, endured sexual harassment and been belittled by journalistic peers who thought women had no place near a war zone.

But “I wanted to write more than a ‘breaking the glass ceiling’ book,” said Becker, who has broken a few of her own: It’s likely that, as a stringer in Cambodia in the early 1970s, she was the first woman to regularly report from a war zone for The Washington Post. Later, she became the senior foreign editor at NPR and a New York Times correspondent.

What struck Becker about her subjects went far beyond gender. It was the women’s approach to their work. They were more interested in people than in battlefields, quicker to see the terrible cost of violence to the Vietnamese as well as to Westerners, less likely than many of their male colleagues to swallow the government’s party line.

“They brought this common humanity and an originality to their work,” Becker said.

Remarkably early, FitzGerald clearly described what American officials didn’t want the public to see: the chaos, the lack of sensible purpose.

“For the Embassy here the problem has not been how to deal with the crisis — there is no way to deal with it under U.S. Standard Operating Procedures — but rather how to explain what is happening in any coherent terms,” she wrote in that 1966 article for the Voice. . .

Continue reading. There’s more.

Written by LeisureGuy

28 March 2021 at 6:22 pm

The Real Reason Republicans Couldn’t Kill Obamacare

leave a comment »

Adapted from The Ten Year War: Obamacare and the Unfinished Crusade for Universal Coverage (St. Martin’s Press, 2021), and quoted from the Atlantic:

The Affordable Care Act, the health-care law also known as Obamacare, turns 11 years old this week. Somehow, the program has not merely survived the GOP’s decade-long assault. It’s actually getting stronger, thanks to some major upgrades tucked in the COVID-19 relief package that President Joe Biden signed into law earlier this month.

The new provisions should enable millions of Americans to get insurance or save money on coverage they already purchase, bolstering the health-care law in precisely the way its architects had always hoped to do. And although the measures are temporary, Biden and his Democratic Party allies have pledged to pass more legislation making the changes permanent.

The expansion measures are a remarkable achievement, all the more so because Obamacare’s very survival seemed so improbable just a few years ago, when Donald Trump won the presidency. Wiping the law off the books had become the Republicans’ defining cause, and Trump had pledged to make repeal his first priority. As the reality of his victory set in, almost everybody outside the Obama White House thought the effort would succeed, and almost everybody inside did too.

One very curious exception was Jeanne Lambrew, the daughter of a doctor and a nurse from Maine who was serving as the deputy assistant to the president for health policy. As a longtime Obama adviser, going back to the 2008 transition, Lambrew was among a handful of administration officials who had been most responsible for shaping his health-care legislation and shepherding it through Congress—and then for overseeing its implementation. Almost every other top official working on the program had long since left government service for one reason or another. Lambrew had stayed, a policy sentry unwilling to leave her post.

On that glum November 2016 day following the election, Lambrew decided to gather some junior staffers in her office and pass out beers, eventually taking an informal survey to see who thought Obama’s signature domestic-policy achievement would still be on the books in a year. Nobody did—except Lambrew.

Yes, Republicans had already voted to repeal “Obamacare” several times. But, she knew, they had never done so with real-world consequences, because Obama’s veto had always stood in the way. They’d never had to think through what it would really mean to take insurance away from a hotel housekeeper or an office security guard on Medicaid—or to tell a working mom or dad that, yes, an insurance company could deny coverage for their son’s or daughter’s congenital heart defect.

A repeal bill would likely have all of those effects. And although Republicans could try to soften the impact, every adjustment to legislation would force them to sacrifice other priorities, creating angry constituents or interest groups and, eventually, anxious lawmakers. GOP leaders wouldn’t be able to hold the different camps within their caucuses together, Lambrew believed, and the effort would fail.

All of those predictions proved correct. And that wasn’t because Lambrew was lucky or just happened to be an optimist. It was because she knew firsthand what most of the Republicans didn’t: Passing big pieces of legislation is a lot harder than it looks.

It demands unglamorous, grinding work to figure out the precise contours of rules, spending, and revenue necessary to accomplish your goal. It requires methodical building of alliances, endless negotiations among hostile factions, and making painful compromises on cherished ideals. Most of all, it requires seriousness of purpose—a deep belief that you are working toward some kind of better world—in order to sustain those efforts when the task seems hopeless.

Democrats had that sense of mission and went through all of those exercises because they’d spent nearly a century crusading for universal coverage. It was a big reason they were able to pass their once-in-a-generation health-care legislation. Republicans didn’t undertake the same sorts of efforts. Nor did they develop a clear sense of what they were trying to achieve, except to hack away at the welfare state and destroy Obama’s legacy. Those are big reasons their legislation failed.

Obamacare’s survival says a lot about the differences between the two parties nowadays, and not just on health care. It’s a sign of how different they have become, in temperament as much as ideology, and why one has shown that it’s capable of governing and the other has nearly forgotten how.

Democrats were so serious about health care that they began planning what eventually became the Affordable Care Act more than a decade earlier, following the collapse of Bill Clinton’s reform attempt in the 1990s. The ensuing political backlash, which saw them lose control of both the House and Senate, had left top Democrats in no mood to revisit the issue. But reform’s champions knew that another opportunity would come, because America’s sick health-care system wouldn’t heal itself, and they were determined not to make the same mistakes again.

At conferences and private dinners, on chat boards and in academic journals, officials and policy advisers obsessively analyzed what had gone wrong and why—not just in 1993 and 1994 but in the many efforts at universal coverage that had come before. They met with representatives of the health-care industry as well as employers, labor unions, and consumer advocates. Industry lobbyists had helped kill reform since Harry Truman’s day. Now they were sitting down with the champions of reform, creating a group of “strange bedfellows” committed to crafting a reform proposal they could all accept.

Out of these parallel efforts, a rough consensus on substance and strategy emerged. Democrats would put forward a plan that minimized disruption of existing insurance arrangements, in order to avoid scaring people with employer coverage, and they would seek to accommodate rather than overpower the health-care industry. The proposal would err on the side of less regulation, spending, and taxes—basically, anything that sounded like “big government”—and Democrats would work to win over at least a few Republicans, because that would probably be necessary in Congress.

Proof of concept came in 2006, in Massachusetts, when its Republican governor, Mitt Romney, teamed up with the Democratic state legislature to pass a plan that fit neatly into the new vision. It had the backing of a broad coalition, including insurers and progressive religious organizations. Ted Kennedy, the liberal icon and U.S. senator, played a key role by helping secure changes in funding from Washington that made the plan possible. “My son said something … ‘When Kennedy and Romney support a piece of legislation, usually one of them hasn’t read it,’” Kennedy joked at the signing ceremony, standing at Romney’s side.

Kennedy’s endorsement said a lot about the psychology of Democrats at the time. No figure in American politics was more closely associated with the cause of universal health care and, over the years, he had tried repeatedly to promote plans that looked more like the universal-coverage regimes abroad, with the government providing insurance directly in “single-payer” systems that resembled what today we call “Medicare for All.” But those proposals failed to advance in Congress, and Kennedy frequently expressed regret that, in the early 1970s, negotiations over a more private sector-oriented coverage plan with then-President Richard Nixon had broken down, in part because liberals were holding out for a better deal that never materialized.

Kennedy was not alone in his belief that the champions of universal coverage would have to accept big concessions in order to pass legislation. The liberal House Democrats John Dingell, Pete Stark, and Henry Waxman, veteran crusaders for universal coverage who’d accrued vast power over their decades in Congress, were similarly willing to put up with what they considered second-, third-, and even fourth-best solutions—and they were masters of the legislative process, too. Waxman in particular was an expert at doing big things with small political openings, such as inserting seemingly minor adjustments to Medicaid into GOP legislation, expanding the program’s reach over time. “Fifty percent of the social safety net was created by Henry Waxman when no one was looking,” Tom Scully, who ran Medicare and Medicaid for the Bush administration in the early 2000s, once quipped.

Obama had a similar experience putting together health-care legislation in the Illinois state legislature—where, despite proclaiming his support for the idea of a single-payer system, he led the fight for coverage expansions and universal coverage by working with Republicans and courting downstate, more conservative voters. He also was a master of policy detail, and as president, when it was time to stitch together legislation from different House and Senate versions, he presided over meetings directly (highly unusual for a president) and got deep into the weeds of particular programs.

Obama could do this because the concept of universal coverage fit neatly within . . .

Continue reading. There’s much more.

Later in the column:

Another problem was a recognition that forging a GOP consensus on replacement would have been difficult because of internal divisions. Some Republicans wanted mainly to downsize the Affordable Care Act, others to undertake a radical transformation in ways they said would create more of an open, competitive market. Still others just wanted to get rid of Obama’s law and didn’t especially care what, if anything, took its place.

“The homework that hadn’t been successful was the work to coalesce around a single plan, a single set of specific legislative items that could be supported by most Republicans,” Price told me. “Clearly, looking at the history of this issue, this has always been difficult for us because there are so many different perspectives on what should be done and what ought to be the role of the federal government in health care.”

The incentive structure in conservative politics didn’t help, because it rewarded the ability to generate outrage rather than the ability to deliver changes in policy. Power had been shifting more and more to the party’s most extreme and incendiary voices, whose great skill was in landing appearances on Hannity, not providing for their constituents. Never was that more apparent than in 2013, when DeMint, Senator Ted Cruz of Texas, and some House conservatives pushed Republicans into shutting down the government in an attempt to “defund” the Affordable Care Act that even many conservative Republicans understood had no chance of succeeding.

The failure to grapple with the complexities of American health care and the difficult politics of enacting any kind of change didn’t really hurt Republicans until they finally got power in 2017 and, for the first time, had to back up their promises of a superior Obamacare alternative with actual policy. Their solution was to minimize public scrutiny, bypassing normal committee hearings so they could hastily write bills in the leadership offices of House Speaker Paul Ryan and, after that, Senate Majority Leader Mitch McConnell.

Written by LeisureGuy

28 March 2021 at 4:52 pm

Puddles: Tears, butterflies, and the shootings in Atlanta

leave a comment »

Sabrina Imbler writes in Sierra, the magazine of the Sierra Club:

This past week, I have been trying to figure out if a puddle is a body of water.

According to Wikipedia, a body of water is defined as a significant accumulation of water, such as an ocean, a sea, or a lake. When geographers map out bodies of water, they include oceans and lakes, perhaps even ponds, but not puddles. A puddle is defined as a small accumulation of water on a surface. I have to wonder, is “small” significant? What about “very small”? How much water must you hold to be considered a body of water?

As a mixed Asian American person, I have spent a lifetime trying to understand how small something like an experience can be and still be considered significant. How small I can be and still be significant.

I have been thinking about puddles because they are the only bodies of water I see nowadays. In Brooklyn, where I live, puddles accumulate by sidewalks and surround intersections, meaning you have to look down to know where to step. Sometimes, after rainfall but before the murk and trash set in, you can see a glimmer of yourself, or how you are seen.

Last spring, amid a first wave of lockdowns—after my mom sent me an email cautioning me, an Asian asthmatic, not to cough in public—a man spit at me, maybe. I wasn’t sure. He was standing on a corner and I had just walked past him on the otherwise empty street. His spit landed on my shoe, and I faltered for a second but kept walking. When I looked back, I saw him watching me. When he didn’t say anything, I figured I was assuming too much, that I had been the one to intrude in his pre-planned spitting, that it was ingloriously vain of me to assume that he meant to spit on me. A few blocks away, surrounded by brownstones and shuttered shops—no storefront glass in sight—I looked at myself in a puddle as if this could answer my question. I saw a face mask and a beanie and then the only part of my face that was exposed: my eyes. I returned from my destination—a Japanese restaurant converted into a grocery store—and passed by a mailbox with a directive in Sharpie: Go back to China! As I walked home, I wondered, was this significant?

I have been thinking about puddles this past week because I have been crying, in fits and bursts, leaking enough tears and mucus that I could form a very small, probably insignificant, puddle. I did not cry when I learned about the shooting at the spas in Atlanta—where a white man shot eight people, six of whom were Asian women—but I cried later that night, while I was brushing my teeth. I am not a woman, but I am reminded constantly by strangers that I am seen as a woman, objectified as an Asian woman. I thought about the images I’d seen in past months of Asian elders shoved, assaulted, and slashed, many of whom lived in towns near where my own grandparents live. My grandpa, a 98-year-old man who wears flat caps and speaks mostly in Mandarin these days, walks around his neighborhood for an hour each day. I wondered, should I ask him to stop?

I do not mean to equate my Asian American experience with the experience of the women killed in Atlanta. Asian massage workers face violence, racism, and sexism every day, Elene Lam, the executive director of Butterfly, a support network for Asian and migrant sex workers in Toronto, told The Cut. Their work is stigmatized, precarious, criminalized, and overpoliced, regardless of whether they are sex workers. They may lack legal protections or be excluded from other jobs due to their immigration status or language barriers. “Those women were assumed to be sex workers & therefore not worthy of safety,” tweeted the writer and social worker Kai Cheng Thom in a thread about the shootings. I felt frustrated at the futility of my tears; they were not helping the victims or the families left to grieve the losses of their daughters, mothers, grandmothers.

When I was in high school, I learned that puddles, bereft of flow, could become vectors of disease. Standing water is dangerous because it is a breeding ground for mosquitoes that spread diseases such as malaria and dengue. I did not learn until much later that when Chinese women began immigrating to California in the 19th century, white health professionals and legislators cast these women as a threat to American morality and a contagion to public health. The president of the American Medical Association warned of a (completely fictitious) sexually transmitted disease that was only carried by Chinese women, Mari Uyehara writes in The Nation. In 1875, the US passed the Page Act, which effectively banned Chinese women from immigrating.

Puddles may not be significant to geographers, but they are significant to wildlife, particularly butterflies. Adult butterflies can only consume liquids, which they imbibe through their spiraling proboscises. As caterpillars they feed on leaves, and as adults on nectar, foods rich in sugar but devoid of sodium. Butterflies must seek out sodium elsewhere in liquid form. So they resort to what’s known as puddling, seeking out minerals in water and damp substrates. Shallow puddles are safer havens for such small creatures than the surging currents of rivers or the depths of a pond. Butterflies in Sulawesi, . . .

Continue reading. The conclusion is powerful.

Written by LeisureGuy

28 March 2021 at 11:35 am

Elite panic

leave a comment »

I have observed, as perhaps you have as well, that wealth seems to make people fearful, and that as wealth increases, more and more stringent forms of security are embraced. Rebecca Solnit has an interesting Facebook post on this pathology. She writes:

The marauding hordes of the underclass are a topic of constant fantasy among elites, so much so that two of the sociologists I cited in A Paradise Built in Hell labeled this delusion “elite panic.” It often justifies what you could call the marauding hordes of the overclass — suppressing the people they assume are bestial but also, at some level, acknowledge are legitimately resentful of social inequality, which they [the overclass] are willing to use violence to perpetuate.

In a way the premise of white supremacy is “your imaginary violence is the justification for my real violence,” and here’s Graham trotting that out as “the violence I imagine could happen in extreme situations is my justification for pushing instruments of extreme violence into everyday life.”

Those sociologists also demonstrate that most people are altruistic, generous, resourceful, and helpful in disasters. Note the alignment of racist fantasies here — gangs, cops, white people with weapons of war. But what that violence from elites and authorities is really used for is to maintain the status quo, and there’s a way mass shootings do so, as attacks on women, immigrants, people of color, perceived enemies to be punished by people who have allocated the right to punish unto death.

Sociologist Kathleen Tierney, author of Disasters: A Sociological Approach and director of the University of Colorado’s Natural Hazards Center, gave a riveting talk at the University of California, Berkeley, for the centennial of the 1906 earthquake. In the talk she stated, “Elites fear disruption of the social order, challenges to their legitimacy.” She reversed the image of a panicking public and a heroic minority to describe what she called “elite panic.” She itemized its ingredients as “fear of social disorder; fear of poor, minorities and immigrants; obsession with looting and property crime; willingness to resort to deadly force; and actions taken on the basis of rumor.”

In other words, it is the few who behave badly and the many who rise to the occasion. And those few behave badly not because of facts but because of beliefs: they believe the rest of us are about to panic or become a mob or upend property relations, and in their fear they act out to prevent something that may have only existed in their imaginations. Thus the myth of malevolent disaster behavior becomes something of a self-fulfilling prophecy. Elsewhere she adds, “The media emphasis on lawlessness and the need for strict social control both reflects and reinforces political discourse calling for a greater role for the military in disaster management. Such policy positions are indicators of the strength of militarism as an ideology in the United States.”

From their decades of meticulous research, most of the disaster sociologists have delineated a worldview in which civil society triumphs and existing institutions often fail during disaster. They quietly endorse much of what anarchists like Kropotkin have long claimed, though they do so from a studiously neutral position buttressed by quantities of statistics and carefully avoid prescriptions and conclusions about the larger social order. And yet, they are clear enough that in disaster we need an open society based on trust in which people are free to exercise their capacities for improvisation, altruism, and solidarity. In fact, we need it all the time, only most urgently in disaster.

Written by LeisureGuy

28 March 2021 at 10:57 am

Did the Black Death Rampage Across the World a Century Earlier Than Previously Thought?

leave a comment »

David Perry writes in Smithsonian:

For over 20 years, I’ve been telling the same story to students whenever I teach European history. At some point in the 14th century, the bacterium Yersinia pestis somehow moved out of the rodent population in western China and became wildly infectious and lethal to humans. This bacterium caused the Black Death, a plague pandemic that moved from Asia to Europe in just a few decades, wiping out one-third to one-half of all human life wherever it touched. Although the plague pandemic definitely happened, the story I’ve been teaching about when, where, and the history of the bacterium has apparently been incomplete, at best.

In December, the historian Monica Green published a landmark article, “The Four Black Deaths,” in the American Historical Review that rewrites our narrative of this brutal and transformative pandemic. In it, she identifies a “big bang” that created four distinct genetic lineages that spread separately throughout the world and finds concrete evidence that the plague was already spreading from China to central Asia in the 1200s. This discovery pushes the origins of the Black Death back by over a hundred years, meaning that the first wave of the plague was not a decades-long explosion of horror, but a disease that crept across the continents for over a hundred years until it reached a crisis point.

As the world reels beneath the strains of its own global pandemic, the importance of understanding how humans interact with nature both today and throughout the relatively short history of our species becomes more critical. Green tells me that diseases like the plague and arguably SARS-CoV-2 (before it transferred into humans in late 2019 causing Covid-19) are not human diseases, because the organism doesn’t rely on human hosts for reproduction (unlike human-adapted malaria or tuberculosis). They are zoonotic, or animal diseases, but humans are still the carriers and transporters of the bacteria from one site to the other, turning an endemic animal disease into a deadly human one.

The Black Death, as Monica Green tells me, is “one of the few things that people learn about the European Middle Ages.” For scholars, the fast 14th-century story contained what Green calls a “black hole.” When she began her career in the 1980s, we didn’t really know “when it happened, how it happened, [or] where it came from!” Now we have a much clearer picture.

“The Black Death and other pre-modern plague outbreaks were something everyone learned about in school, or joked about in a Monty Python-esque way. It wasn’t something that most of the general public would have considered particularly relevant to modernity or to their own lives,” says Lisa Fagin Davis, executive director of the Medieval Academy of America. But now, “with the onset of the Covid-19 pandemic, suddenly medieval plagues became relevant to everyone everywhere.”

The project that culminated in Green’s article unfolded over many years. She says that the first step required paleogenetic analysis of known victims of the plague, including a critical study published in 2011. Paleogenetics is the study of preserved organic material—really any part of the body or the microbiome, down to the DNA—of long-dead organisms. This means that if you can find a body, or preferably a lot of bodies, that you’re sure died in the Black Death, you can often access the DNA of the specific disease that killed them and compare it to both modern and other pre-modern strains.

This has paid off in numerous ways. First, as scientists mapped the genome, they put to rest long-lingering doubts about the role Y. pestis played in the Black Death (there was widespread but unsubstantiated speculation that other diseases were at fault), and they began building a dataset that revealed how the bacterium had evolved over time. Green was in London in 2012 just as findings on the London plague cemetery came out, confirming without a doubt both the identity of the bacterium and the specific genetic lineage of the plague that hit London in June 1348. “The Black Death cemetery in London is special because it was created to accommodate bodies from the Black Death,” she says, “and then when [the plague wave] passed, they closed the cemetery. We have the paperwork!”

Green established herself as the foremost expert in medieval women’s healthcare with her work on a medical treatise known as The Trotula. Her careful analysis of manuscript traditions revealed that some of the text was attributable to a southern Italian woman, Trota. Other sections, though, revealed male doctors’ attempts to take over the market for women’s health. It’s a remarkable text that prepared Green for her Black Death project not only by immersing her in the history of medicine, but methodologically as well. Her discipline of philology, the study of the development of texts over time, requires comparing manuscripts to each other, building a stemma, or genealogy of texts, from a parent or original manuscript. She tells me that this is precisely the same skill one needs to read phylogenetic trees of mutating bacteria in order to trace the history of the disease.
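
An aside from me: the parallel Green draws between stemmatics and phylogenetics comes down to the same basic move, grouping “witnesses” (manuscripts, or bacterial genomes) by how many variant readings they share and reading the order of grouping as a rough family tree. Below is a minimal sketch in Python of that grouping step; the witnesses, their readings, and the single-linkage clustering are my own invented illustration, not Green’s data or method.

from itertools import combinations

# Hypothetical "witnesses": four manuscripts (or genomes), each recorded as
# readings at the same five variant sites. Invented purely for illustration.
witnesses = {
    "A": "xyzpq",
    "B": "xyzpr",
    "C": "xwzpq",
    "D": "xwvpq",
}

def distance(a: str, b: str) -> int:
    # Count the sites where two witnesses disagree (a Hamming distance).
    return sum(1 for x, y in zip(a, b) if x != y)

def cluster_distance(c1, c2):
    # Single-linkage distance: the closest pair across the two clusters.
    return min(distance(witnesses[a], witnesses[b]) for a in c1 for b in c2)

# Start with every witness in its own cluster, then repeatedly merge the two
# closest clusters. The order of merges sketches a rough genealogy.
clusters = [[name] for name in witnesses]
while len(clusters) > 1:
    i, j = min(combinations(range(len(clusters)), 2),
               key=lambda ij: cluster_distance(clusters[ij[0]], clusters[ij[1]]))
    print(f"merge {clusters[i]} + {clusters[j]} (distance {cluster_distance(clusters[i], clusters[j])})")
    clusters = [c for k, c in enumerate(clusters) if k not in (i, j)] + [clusters[i] + clusters[j]]

Real stemmatics and phylogenetics use far more elaborate models (weighting changes, allowing reversals, estimating dates), but the underlying intuition, that shared differences imply shared ancestry, is the one Green carried from manuscript traditions to Y. pestis genomes.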

Still, placing the Black Death in 13th-century Asia required more than genetic data. Green needed a . . .

Continue reading.

Written by LeisureGuy

28 March 2021 at 10:13 am

The forgotten medieval fruit with a vulgar name

leave a comment »

Zaria Gorvett writes at BBC of a once-popular fruit now almost forgotten:

In 2011, archaeologists found something unusual in a Roman toilet.

The team were excavating the ancient village of Tasgetium (now Eschenz, Switzerland), ruled by a Celtic king who was personally given the land by Julius Caesar. It was built on the banks of the river Rhine, along what was then an important trade route – and as a result, its remains have been steeped in water ever since. What should have rotted away centuries ago was uncovered in a remarkable state of preservation, protected by the lack of oxygen in the boggy conditions.

It was here that, nestled among the remains of familiar foods such as plums, damsons, cherries, peaches and walnuts in an ancient cesspit, the archaeologists found 19 curiously large seeds. Though they were, let’s say, “deposited” there nearly 2,000 years ago, they almost looked fresh enough to have been found yesterday – except that the fruit they belong to is now so obscure, it can baffle even professional botanists.

The polite, socially acceptable name by which it’s currently known is the medlar. But for the best part of 900 years, the fruit was called the “open-arse” – thought to be a reference to the appearance of its own large “calyx” or bottom. The medlar’s aliases abroad were hardly more flattering. In France, it was variously known as “la partie postérieure de ce quadrupede” (the posterior part of this quadruped), “cu d’singe” (monkey’s bottom), “cu d’ane” (donkey’s bottom), and cul de chien (dog’s bottom)… you get the idea.

And yet, medieval Europe was crazy about this fruit.

The first record of the medlar’s existence is a fragment of Greek poetry from the 7th Century BC. Eventually the fruit is thought to have fallen into the hands of the Romans, who brought it to southern France and Britain. In 800AD, Charlemagne included it on a list of plants that were mandatory in the king’s many gardens, and nearly 200 years later, the English abbot and writer Ælfric of Eynsham first committed its rather rude sobriquet to the public record.

From there, the fruit’s popularity steadily increased. It became a staple of medieval monasteries and royal courtyards, as well as public spaces such as village greens.

It’s featured in Chaucer’s Canterbury Tales, Shakespeare’s Romeo and Juliet, and the two-time queen consort Anne of Brittany’s Book of Hours – a kind of illustrated religious manuscript popular in the Middle Ages. Henry VIII had the medlar planted at Hampton Court, and gifted his French counterpart with large quantities.

The fruit reached its peak in the 1600s when it was widely grown across England – as ordinary as apples, pears, mulberries and quince. From this lofty pinnacle, it underwent a steady decline. It was still widely known until the early 20th Century, though less celebrated. Then in the 1950s it abruptly vanished from the public consciousness altogether.

Once a household name, described by one Roman commentator as amounting “almost to a craze“, now the medlar is primarily grown as a romantic relic from the past – a niche plant for eccentric gardeners and a historical curiosity at palaces and museums.

Just a few decades after it disappeared, it was already mysterious to many greengrocers. In 1989, one American academic wrote that “probably not one in a hundred” botanists had seen a medlar. Today it’s not sold at a single British supermarket. Where there are still plants growing in public spaces, they often go unrecognised and are left to rot on the ground.

What was it about this strange fruit that gripped medieval Europe, and why did it disappear? . . .

Continue reading.

Written by LeisureGuy

28 March 2021 at 10:07 am

The Spectacular Rise of Ornamental Plants

leave a comment »

MIT Press has an article excerpted from George Gessert’s book Green Light: Toward an Art of Evolution:

Aesthetic appeal may have played a role in the domestication of plants and animals, but the rise of pure ornamentals, that is, plants cultivated only for their aesthetic characteristics, is a much later development. Long after the emergence of urban civilization, ornamental and economic uses of plants seem not to have been distinguished. For example, the elegant gardens depicted in Egyptian tombs of the 18th Dynasty (ca. 1415 BCE) consisted, as far as we can tell, of multiple-use plants. Among those that have been identified are date palms, grapes, pomegranates, papyruses, and figs.

A few Egyptian tomb paintings show flowering plants that may have been pure ornamentals, but could just as well have been medicinals. Even blue water lilies, which are ubiquitous in Egyptian art, were more than symbolic and ornamental. The rhizomes of Nymphaea caerulea yield a powerful hallucinogen that the Egyptians probably used to make contact with the gods.

The earliest gardens that seem to have been intended primarily for pleasure were in Mesopotamia. The Gilgamesh epic, which refers to events in 2700 BCE, contains descriptions of what may have been ornamental gardens; however, the first unmistakable evidence of plants cultivated for pleasure is from Assyria. There, kings had hunting preserves and parklike tree plantations. Tiglath-Pileser I, who reigned about 1100 BCE, brought back cedars and box from lands he conquered. Other Assyrian kings left records of parks planted with palms, cypresses, and myrrh.

We do not know what these parks looked like. The first nonutilitarian gardens that can be loosely reconstructed date from the sixth century BCE. The Hanging Gardens of Babylon were created by Nebuchadnezzar, who, the story goes, built them for his Persian wife, who was homesick for the mountains of her childhood. Babylon was situated on a river plain. The terraced gardens, which covered three or four acres, were said to resemble a green mountain. The earliest records of the Hanging Gardens are by the Greek historians Diodorus and Strabo, but no remains have ever been found. However, remnants of Cyrus the Great’s (ca. 585-ca. 529 BCE) garden at Pasargadae still exist. It had trees and shrubs planted symmetrically in plots.

Records of Mesopotamian parks and gardens emphasize trees. Why trees rather than flowers? In the case of Cyrus the Great’s garden, only the remains of trees and shrubs have survived the centuries. Herbaceous plants, if they existed, have vanished. The Greeks, whose records we must rely on for much of our information about Mesopotamian gardens, were not horticulturally advanced, and may have been unduly impressed by the largest, most obvious plants. Still, trees were almost certainly important features of Mesopotamian gardens. Trees provide shade, a necessity in that part of the world, with its intense light and scorching heat.

In addition to their utilitarian value, many trees are architecturally pleasing, and have symbolic and social significance. Like other agricultural peoples, the Mesopotamians cleared land for crops and cut trees for wood. Near towns and cities, groves left uncut may have gradually disappeared because cattle, sheep, and goats grazed and trampled seedlings, allowing no new trees to grow. When forests are reduced to memories, surviving remnants may take on new meanings. Groves can become emblematic of the past, and sacred. They can also become indicators of wealth and worldly power.

The same meanings do not necessarily accrue to smaller flowering plants. Agriculture and herding eliminate many kinds of small plants, but . . .

Continue reading.

Written by LeisureGuy

27 March 2021 at 4:22 pm

Posted in Books, Daily life, Environment, History, Science

A scorching reply to Georgia’s vile new voting law unmasks a big GOP lie

leave a comment »

Greg Sargent writes in the Washington Post:

The 2020 elections in Georgia should have been cause for celebration among everyone, not just Democrats who won the state’s presidential and Senate races. Amid extremely challenging conditions, election officials took smart, public-spirited steps to ensure that as many voters as possible could participate.

And it worked. Turnout was high on Election Day and during the Senate runoffs, especially among African American voters.

That should have been widely cheered. Yet it’s precisely what the state’s Republican officials apparently want to ensure never happens again.

Georgia Republicans just passed a far-reaching voter suppression law that is shockingly blatant in its efforts to restrict voting. It was signed Thursday by Gov. Brian Kemp (R), as one Democratic lawmaker who sought to watch was arrested.

In multiple ways, the measure appears designed to target African American voters, the very voters who drove the 2020 Democratic wins. That complaint is at the core of a new lawsuit filed on Thursday night against the law.

But the lawsuit also exposes — in a fresh way — the appalling dishonesty of Republicans who continue using former president Donald Trump’s lie about the election to justify voter suppression efforts everywhere.

Voter suppression on steroids

Most conspicuously, the new law bars third-party groups from sharing food and water with people waiting in voting lines. It imposes new ID requirements for vote-by-mail, restricts drop boxes for mail ballots and bans mobile voting places, among many other things.

The lawsuit by several voting rights groups — represented by Democratic lawyer Marc Elias — argues that the package unduly burdens the voting rights of all Georgians, disproportionately African Americans, violating the Voting Rights Act and the Constitution.

The lawsuit cites the extremely high voter turnout in the general and runoff elections, facilitated amid a raging pandemic by vote-by-mail, which was used by African American voters at higher rates than White voters.

The law is largely targeted toward that fact, the lawsuit argues. Restrictions on drop boxes and mobile voting units come after both were heavily utilized in Fulton County, a populous, majority-Black area. African Americans are more likely to use drop boxes because they more often work multiple jobs, the suit argues.

Meanwhile, bans on sharing food and water target the fact that voting lines and wait times tend to be longer in African American areas. And Black voters are disproportionately less likely to have the right ID to qualify to vote by mail, the lawsuit argues.

The critical point is that the past election worked, due to the very practices Republicans now want to curb. Organizers distributed food and water, enabling voters to brave lines. Election officials used expanded vote-by-mail, drop boxes and mobile units to facilitate pandemic voting.

“This successful mobilization was widely heralded as crucial in facilitating Black voter turnout,” the lawsuit notes. Which is precisely the problem, the lawsuit argues: What Republicans want to avert is another such “successful mobilization.”

Republicans give away the game

The justification that Republicans themselves offer for these measures gives away the real game here. Defenders say they are needed to ensure the integrity of future elections and boost public confidence in them.

But the elections in Georgia actually were conducted with absolute integrity, and the Republican secretary of state has himself attested to this. That official, Brad Raffensperger, declared the elections “safe” and “secure.”

This caused Raffensperger to become the target of Trump’s rage. But that doesn’t mean what Raffensperger said isn’t true. It is true.

This was confirmed in a statewide audit. Indeed, Raffensperger has attested to the integrity of Georgia elections more generally, declaring: “Georgia’s voting system has never been more secure or trustworthy.”

Which raises the question: Why are these new measures needed, if Georgia elections are already secure and trustworthy? Why, to avert another “successful mobilization.”

As the lawsuit argues, the very fact that GOP election officials confirmed the integrity of Georgia elections shows the measures “serve no legitimate purpose or compelling state interest other than to make absentee, early, and election-day voting more difficult — especially for minority voters.” . . .

Continue reading.

The column concludes:

All this points to a bigger lie. All across the country, Republicans are escalating voter suppression efforts, fake-justified by the lie that the election was stolen from Trump.

In the softer version of this, it’s fake-justified by the notion that many Republican voters believe that to be true and just need their “confidence” restored to ensure future participation.

But the real way to restore such confidence is to tell voters the truth: that the election was an inspiring success amid very difficult conditions — and its outcome was unimpeachably legitimate — precisely because of the integrity of election workers everywhere.

Written by LeisureGuy

26 March 2021 at 2:37 pm

Trump Complains Government Is ‘Persecuting’ Capitol Rioters

leave a comment »

The situation in the US is actively getting more dangerous because Donald Trump is leading and fomenting an already-violent insurrection against the government — against the administration, really — to force someone (Congress, the Georgia governor, Mitch McConnell, Mike Pence… anyone) to provide an election “count” sufficient to put Trump back in the White House. And he’s not going to shut up until he is in the White House. He’ll butt into every situation he can. As we’ve seen, he has zero sense of shame and zero decorum.

Jonathan Chait writes in New York:

One of the most dangerous, long-lasting changes effected by Donald Trump is the rightward extension of the Republican coalition. A wide array of far-right militias and cults was either created or inspired to join the Republican Party by Trump’s racist, paranoid, and authoritarian rhetoric. Now those groups are the subject of regular apologias in party-aligned media.

The new reality was driven home in Trump’s interview with Laura Ingraham Thursday night. At one point, the Fox News host, whose “interview” was more like an exchange of talking points, brought up a new report that the Homeland Security Department will be giving more attention to right-wing domestic extremism. “The idea is to identify people who may, through their social-media behavior, be prone to influence by toxic messaging spread by foreign governments, terrorists, and domestic extremists,” Ingraham noted. “Mr. President, their DHS is going after people who may be your supporters.”

It is worth pausing for a moment to record that Ingraham’s reaction to a description of people “prone to influence by toxic messaging spread by foreign governments, terrorists, and domestic extremists” is hey, they’re talking about us!

Trump, taking the cue, denounced federal authorities for charging his supporters with crimes. “They go after that, I guess you’d call them leaning toward the right … those people, they’re arresting them by the dozens,” he complained.

Ingraham did not follow up by asking who was being arrested by the dozens. But Trump’s answer became clear a few questions later. Ingraham prompted him with a safe question about the security fencing around the Capitol, a precaution even Democrats have deemed excessive long after the insurrection ended.

Rather than simply denounce the fencing, Trump launched into . . .

Continue reading.

Written by LeisureGuy

26 March 2021 at 2:14 pm

When lies come home to roost

leave a comment »

Heather Cox Richardson writes:

Last night, federal prosecutors filed a motion revealing that a leader of the paramilitary group the Oath Keepers claimed to be coordinating with the Proud Boys and another far-right group before the January 6 insurrection.

After former President Donald Trump tweeted that his supporters should travel to Washington, D.C., on January 6 for a rally that “will be wild!,” Kelly Meggs, a member of the Oath Keepers, wrote on Facebook: “He wants us to make it WILD that’s what he’s saying. He called us all to the Capitol and wants us to make it wild!!! Sir Yes Sir!!! Gentlemen we are heading to DC pack your s***!!”

In a series of messages, Meggs went on to make plans with another individual for an attack on the process of counting the electoral votes. On December 25, Meggs told his correspondent that “Trumps staying in, he’s Gonna use the emergency broadcast system on cell phones to broadcast to the American people. Then he will claim the insurrection act…. Then wait for the 6th when we are all in DC to insurrection.”

The Big Lie, pushed hard by Trump and his supporters, was that Trump had won the 2020 election and it had been stolen by the Democrats. Although this was entirely discredited in more than 60 lawsuits, the Big Lie inspired Trump supporters to rally to defend their president and, they thought, their country.

The former president not only inspired them to fight for him; he urged them to send money to defend his election in the courts. A story today by Allan Smith of NBC News shows that as soon as Trump began to ask for funds to bankroll election challenges, supporters who later charged the Capitol began to send him their money. Smith’s investigation found that those who have been charged in the Capitol riot increased their political donations to Trump by about 75% after the election.

In the 19 days after the election, Trump and the Republican National Committee took in more than $207 million, prompted mostly by their claims of election fraud. John Horgan, who runs the Violent Extremism Research Group at Georgia State University, told Smith that “Trump successfully convinced many of his followers that unless they acted, and acted fast, their very way of life was about to come to an end…. He presented a catastrophic scenario whereby if the election was — for him — lost, his followers would suffer as a result. He made action not just imperative, but urgent, convincing his followers that they needed to do everything they could now, rather than later, to prevent the ‘enemy’ from claiming victory.”

And yet, on Monday, Trump’s former lawyer, Sidney Powell, moved to dismiss the Dominion Voting Systems defamation lawsuit against her. Powell helped to craft the Big Lie, and won the president’s attention with her determination to combat the results of the election and restore Trump to the presidency. In January, Dominion sued Powell for $1.3 billion after her allegations that the company was part of an international Communist plot to steal the 2020 presidential election.

On Monday, Powell argued that “no reasonable person would conclude” that her statements about a scheme to rig the election “were truly statements of fact.” Eric Wilson, a Republican political technologist, explained away the Big Lie to NBC News’s Smith: “[T]here are a lot of dumb people in the world…. And a lot of them stormed the Capitol on January 6th.”

And yet, 147 Republicans—8 senators and 139 representatives—signed onto the Big Lie, voting to sustain objections to the counting of the electoral votes on January 6.

So the Republicans are left with increasing evidence that there was a concerted plan to attack the Capitol on January 6, fed by the former president, whose political campaign pocketed serious cash from his declarations that he had truly won the election and that all patriots would turn out to defend his reelection. Those claims were pressed by a lawyer who now claims that no reasonable person would believe she was telling the truth.

The Republicans tied themselves to this mess, and it is coming back to haunt them. President Biden’s poll numbers are high, with a Reuters/Ipsos poll released last Friday showing that 59% of adults approve of Biden’s overall performance. (Remember that Trump never broke 50%). They are happy with his response to the coronavirus pandemic and his handling of the economy.

Rather than trying to pass popular measures to make up the ground they have lost, Republicans are trying to suppress voting. By mid-February, in 43 states, Republicans had introduced 253 bills to restrict voting. Today, Republicans in Michigan introduced 39 more such bills. In at least 8 states, Republicans are trying to gain control over elections, taking power from nonpartisan election boards, secretaries of state, and governors. Had their systems been in place in 2020, Republicans could have overturned the will of the voters.

To stop these state laws, Democrats are trying to pass a sweeping federal voting rights bill, the For the People Act, which would protect voting, make it easier to vote, end gerrymandering, and get dark money out of politics. The bill has already passed the House, but Republicans in the Senate are fighting it with all they’ve got.

Senate Majority Leader Chuck Schumer (D-NY) told them: “This . . .

Continue reading. There’s more.

Written by LeisureGuy

24 March 2021 at 9:06 pm

Saving Collards, the South’s Signature Greens

leave a comment »

Nearly-lost collard green varieties are being preserved and propagated across the country. (All images courtesy of the Heirloom Collard Project.)

Debra Freeman writes in Gastro Obscura about one of my favorite greens. The only place I can readily find it here is Whole Foods, so I buy a couple of bunches on every visit.

Her article begins:

In the American South, many people have fond memories of a pot of collard greens simmering on the stove for hours, seasoned with a ham hock and stirred by a parent or grandparent. Cousins to cauliflower and broccoli, collards are a hearty green known for their robust, slightly bitter taste and the rich, nutritious “pot liquor” they produce when cooked. These greens and their liquor have been lauded for generations, but few in the South know that there’s more than one kind of collard green. Even fewer know that there are dozens of different varieties, and that many are now on the verge of disappearing forever.

That’s where the Heirloom Collard Project comes in. By distributing and growing rare and unique collards, this massive collaboration has created ties between chefs, gardeners, farmers, and seedsmen who hope to preserve the plant’s genetic diversity.

Collards are not native to the United States. Instead, they’re Eurasian in origin, and ancient Romans and Greeks feasted on them thousands of years ago. As for how they became prevalent in the American South, scholars have a number of theories. Collard seeds may have been brought over from Portugal in the 18th century, or from the British Isles to the early colonies. However, the most prevalent theory is that enslaved Africans introduced them to the region, since collard greens were a staple crop in many parts of Africa. Historian John Egerton, in his 1987 book Southern Food, declared that “from Africa with the people in bondage came new foods,” such as okra, black-eyed peas, yams, and collard greens.

Regardless of when or how they arrived stateside, collard greens flourished in Southern gardens. Twenty main varieties, from the Yellow Cabbage collard to the Old Timey Green, established themselves as garden favorites. But after World War II, many Americans moved away from both their farmland and their agricultural lifestyles. One victim of this shift was the collard green. With fewer people farming, variety after variety dropped off the map, leaving only five types that could easily be found—Georgia Green, Champion, Vates, Morris Heading, and Green Glaze.

But five years ago, Ira Wallace and the members of the Seed Savers Exchange asked the USDA for over 60 collard green varieties to plant in Iowa. Wallace, as worker/owner of the cooperatively run Southern Exposure Seed Exchange, had been promoting the versatility and resilience of collards for years. Her inspiration for spearheading the Heirloom Collard Project was a series of photos taken by Edward Davis.

Davis and John Morgan, both geography professors at Emory & Henry College, traversed the South to collect rare heirloom collards between 2003 and 2007. The pair published a book on their quest, Collards: A Southern Tradition from Seed to Table, in 2015. They then gave the dozens of collard varieties they had gathered to the USDA. When Davis shared photos of all of the collards he tracked down, Wallace knew she wanted to help make the seeds widely available once more.

The project has several goals, among them seed preservation, documenting the stories of the still-living seed stewards that Davis and Morgan met while writing their book, and, perhaps most importantly, providing seeds to companies and gardeners interested in growing these storied old varieties.

So far, many have risen to the challenge. That’s according to . . .

Continue reading. There’s quite a bit more.

Written by LeisureGuy

23 March 2021 at 12:35 pm
