Later On

A blog written for those whose interests more or less match mine.

Archive for March 28th, 2021

Since the Civil War, voter suppression in America has had a unique cast.


Heather Cox Richardson writes:

Since the Civil War, voter suppression in America has had a unique cast.

The Civil War brought two great innovations to the United States that would mix together to shape our politics from 1865 onward:

First, the Republicans under Abraham Lincoln created our first national system of taxation, including the income tax. For the first time in our history, having a say in society meant having a say in how other people’s money was spent.

Second, the Republicans gave Black Americans a say in society.

They added the Thirteenth Amendment to the Constitution, outlawing human enslavement except as punishment for crime and, when white southerners refused to rebuild the southern states with their free Black neighbors, in March 1867 passed the Military Reconstruction Act. This landmark law permitted Black men in the South to vote for delegates to write new state constitutions. The new constitutions confirmed the right of Black men to vote.

Most former Confederates wanted no part of this new system. They tried to stop voters from ratifying the new constitutions by dressing up in white sheets as the ghosts of dead southern soldiers, terrorizing Black voters and the white men who were willing to rebuild the South on these new terms to keep them from the polls. They organized as the Ku Klux Klan, saying they were “an institution of chivalry, humanity, mercy, and patriotism” intended “to protect and defend the Constitution of the United States… [and] to aid and assist in the execution of all constitutional laws.” But by this they meant the Constitution before the war and the Thirteenth Amendment: candidates for admission to the Ku Klux Klan had to oppose “Negro equality both social and political” and favor “a white man’s government.”

The bloody attempts of the Ku Klux Klan to suppress voting didn’t work. The new constitutions went into effect, and in 1868 the former Confederate states were readmitted to the Union with Black male suffrage. In that year’s election, Georgia voters put 33 Black Georgians into the state’s general assembly, only to have the white legislators expel them on the grounds that the Georgia state constitution did not explicitly permit Black men to hold office.

The Republican Congress refused to seat Georgia’s representatives that year—that’s the “remanded to military occupation” you sometimes hear about—and wrote the Fifteenth Amendment to the Constitution protecting the right of formerly enslaved people to vote and, by extension, to hold office. The amendment prohibits a state from denying the right of citizens to vote “on account of race, color, or previous condition of servitude.”

So white southerners determined to prevent Black participation in society turned to a new tactic. Rather than opposing Black voting on racial grounds—although they certainly did oppose Black rights on these grounds—they complained that the new Black voters, fresh from their impoverished lives as slaves, were using their votes to redistribute wealth.

To illustrate their point, they turned to South Carolina, where between 1867 and 1876, a majority of South Carolina’s elected officials were African American. To rebuild the shattered state, the legislature levied new taxes on land, although before the war taxes had mostly fallen on the personal property owned by professionals, bankers, and merchants. The legislature then used state funds to build schools, hospitals, and other public services, and bought land for resale to settlers—usually freedpeople—at low prices.

White South Carolinians complained that members of the legislature, most of whom were professionals with property who had usually been free before the war, were lazy, ignorant field hands using public services to redistribute wealth.

Fears of workers destroying society grew potent in early 1871, when . . .

Continue reading.

Written by LeisureGuy

28 March 2021 at 9:24 pm

“I see no color” is not the goal.


Written by LeisureGuy

28 March 2021 at 7:11 pm

What Can We Learn from a Big Boat Stuck in a Canal?


Matt Stoller explains in his current BIG column:

Today I’ll be writing about the big boat stuck in the Suez Canal. This situation is a very simple and dumb disruption to global trade, and it is precisely the simplicity and stupidity at work that lets us peek beneath the glossy sheen of trade happy talk that has fooled us for so long.

First, some housekeeping. A few years ago I wrote a piece in the American Conservative with national security expert Lucas Kunce on private equity and monopolies in the military defense base. Kunce is now running for Senate on an anti-monopoly platform. I don’t tend to mention political candidates in this newsletter, but I’ll put a caveat in there for people who have a bylined article with me about monopoly power. Also, this week I was on the podcast Useful Idiots with Matt Taibbi and Katie Halper to talk about why changing the business model behind big tech is better than censorship.

Finally, my organization is doing an event on health care monopolies this coming Tuesday at 2pm ET. If you are interested, you can RSVP here.

And now…

The Empire State Building Falls into the Suez Canal

In this newsletter, I do a lot of explaining about complicated problems caused by big dumb corporate institutions. I don’t have to do that this time, because the story of the mess in the Suez is so simple. “After years of bitcoin and reddit short selling and credit default swaps and a million other things I don’t understand,” one random person put in a tweet that went viral, “it’s so refreshing to hear that global commerce is in peril because a big boat got stuck in a canal.”

That’s basically the story right there: it’s a big boat and it got stuck in a canal. The ship blocking the Suez, called the Ever Given, weighs 220,000 tons and is as long as the Empire State Building is high. Despite the hilarious nature of the problem, the disruption to world trade is large and serious, costing tens of billions of dollars. And if the ship can’t be dislodged soon, some consumers will once again experience shortages of basic staples like toilet paper.

That said, the reason this disruption to global commerce seems so dumb is because it is. It starts with the ship size itself. Over the last few decades, ships have gotten really, really big, four times the size of what they were 25 years ago, what the FT calls “too big to sail.” The argument behind making such massive boats was efficiency, since you can carry more at a lower cost. The downside of such mega-ships should have been obvious. Ships like this, which are in effect floating islands, are really hard to steer in tight spaces like ports and canals, and if they get stuck, they are difficult to unstick. In other words, the super smart wizard financiers who run global trade made ships that don’t fit in the canals they need to fit into.

The rise of mega-ships is paralleled by the consolidation of the shipping industry itself. In 2000, the ten biggest shipping companies had a 12% market share; by 2019 that share had increased to 82%. This understates the consolidation, because there are alliances among these shippers. The stuck ship is being run by the Taiwanese shipping conglomerate Evergreen, which bought Italian shipping firm Italia Marittima in 1998 and London-based Hatsu in 2002, and is itself part of the OCEAN alliance, which has more than a third of global shipping.

Making ships massive, and combining such massive ships into massive shipping monopolies, is a bad way to run global commerce. We’ve already seen significant problems from big shipping lines helping to transmit financial shocks into trade shocks, such as when Korean shipper Hanjin went under and stranded $14 billion of cargo on the ocean while in bankruptcy. It’s also much harder for small producers and retailers to get shipping space, because large shippers want to deal with large clients. And fewer ports can handle these mega-ships, so such ships induce geographical inequality. Increasingly, we’re not moving ships between cities, we’re moving cities to where the small number of giant shipping lines find it efficient to ship.

Dumb big ships owned by monopolies are the result of dumb big ideas, the physical manifestation of what Thomas Friedman was pushing in the 1990s and 2000s with books such as The Lexus and the Olive Tree and The World Is Flat: the idea that “taking fat out of the system at every joint” was leading towards a more prosperous, peaceful and competitive world. Friedman’s was a finance-friendly perspective, a belief that making us all interdependent with a very thin margin of error would force global cooperation.

Just make ships bigger, went the thinking, until . . .

Continue reading. There’s more.

Written by LeisureGuy

28 March 2021 at 6:29 pm

Three groundbreaking journalists saw the Vietnam War differently. It’s no coincidence they were women.


Cambodian Prime Minister Long Boret, center, meets with war correspondent Elizabeth Becker in Cambodia in 1974. (Elizabeth Becker)

Margaret Sullivan writes in the Washington Post:

Frances FitzGerald paid her own way into Vietnam. She was an “on spec” reporter with no editor to guide her, no office to support her, and no promise that anyone would publish what she wrote about the war.

She knocked out her first article on a blue Olivetti portable typewriter she had carried from New York and mailed it the cheap and slow way from a post office in the heart of Saigon’s French quarter to the Village Voice, nearly 9,000 miles away.

It arrived, and on April 21, 1966, the Voice published FitzGerald’s indictment of the chaotic U.S. war policy.

“The result was a highly original piece written in the style of an outsider, someone who asked different questions and admitted when she didn’t have answers,” wrote Elizabeth Becker in her new book, “You Don’t Belong Here: How Three Women Rewrote the Story of War,” which celebrates the work of FitzGerald, Kate Webb and Catherine Leroy.

Becker, a former war correspondent in Cambodia toward the end of the decades-long conflict, wrote about these women in part because she had experienced much of what they did — just a little later, and with appreciation for the paths they’d broken.

“I went through it at the tail end, and they were my role models,” Becker told me last week. She admired them because they had broken gender barriers, endured sexual harassment and been belittled by journalistic peers who thought women had no place near a war zone.

But “I wanted to write more than a ‘breaking the glass ceiling’ book,” said Becker, who has broken a few of her own: It’s likely that, as a stringer in Cambodia in the early 1970s, she was the first woman to regularly report from a war zone for The Washington Post. Later, she became the senior foreign editor at NPR and a New York Times correspondent.

What struck Becker about her subjects went far beyond gender. It was the women’s approach to their work. They were more interested in people than in battlefields, quicker to see the terrible cost of violence to the Vietnamese as well as to Westerners, less likely than many of their male colleagues to swallow the government’s party line.

“They brought this common humanity and an originality to their work,” Becker said.

Remarkably early, FitzGerald clearly described what American officials didn’t want the public to see: the chaos, the lack of sensible purpose.

“For the Embassy here the problem has not been how to deal with the crisis — there is no way to deal with it under U.S. Standard Operating Procedures — but rather how to explain what is happening in any coherent terms,” she wrote in that 1966 article for the Voice. . .

Continue reading. There’s more.

Written by LeisureGuy

28 March 2021 at 6:22 pm

The Real Reason Republicans Couldn’t Kill Obamacare


Adapted from The Ten Year War: Obamacare and the Unfinished Crusade for Universal Coverage (St. Martin’s Press, 2021), as excerpted in the Atlantic:

The Affordable Care Act, the health-care law also known as Obamacare, turns 11 years old this week. Somehow, the program has not merely survived the GOP’s decade-long assault. It’s actually getting stronger, thanks to some major upgrades tucked in the COVID-19 relief package that President Joe Biden signed into law earlier this month.

The new provisions should enable millions of Americans to get insurance or save money on coverage they already purchase, bolstering the health-care law in precisely the way its architects had always hoped to do. And although the measures are temporary, Biden and his Democratic Party allies have pledged to pass more legislation making the changes permanent.

The expansion measures are a remarkable achievement, all the more so because Obamacare’s very survival seemed so improbable just a few years ago, when Donald Trump won the presidency. Wiping the law off the books had become the Republicans’ defining cause, and Trump had pledged to make repeal his first priority. As the reality of his victory set in, almost everybody outside the Obama White House thought the effort would succeed, and almost everybody inside did too.

One very curious exception was Jeanne Lambrew, the daughter of a doctor and a nurse from Maine who was serving as the deputy assistant to the president for health policy. As a longtime Obama adviser, going back to the 2008 transition, Lambrew was among a handful of administration officials who had been most responsible for shaping his health-care legislation and shepherding it through Congress—and then for overseeing its implementation. Almost every other top official working on the program had long since left government service for one reason or another. Lambrew had stayed, a policy sentry unwilling to leave her post.

On that glum November 2016 day following the election, Lambrew decided to gather some junior staffers in her office and pass out beers, eventually taking an informal survey to see who thought Obama’s signature domestic-policy achievement would still be on the books in a year. Nobody did—except Lambrew.

Yes, Republicans had already voted to repeal “Obamacare” several times. But, she knew, they had never done so with real-world consequences, because Obama’s veto had always stood in the way. They’d never had to think through what it would really mean to take insurance away from a hotel housekeeper or an office security guard on Medicaid—or to tell a working mom or dad that, yes, an insurance company could deny coverage for their son’s or daughter’s congenital heart defect.

A repeal bill would likely have all of those effects. And although Republicans could try to soften the impact, every adjustment to legislation would force them to sacrifice other priorities, creating angry constituents or interest groups and, eventually, anxious lawmakers. GOP leaders wouldn’t be able to hold the different camps within their caucuses together, Lambrew believed, and the effort would fail.

All of those predictions proved correct. And that wasn’t because Lambrew was lucky or just happened to be an optimist. It was because she knew firsthand what most of the Republicans didn’t: Passing big pieces of legislation is a lot harder than it looks.

It demands unglamorous, grinding work to figure out the precise contours of rules, spending, and revenue necessary to accomplish your goal. It requires methodical building of alliances, endless negotiations among hostile factions, and making painful compromises on cherished ideals. Most of all, it requires seriousness of purpose—a deep belief that you are working toward some kind of better world—in order to sustain those efforts when the task seems hopeless.

Democrats had that sense of mission and went through all of those exercises because they’d spent nearly a century crusading for universal coverage. It was a big reason they were able to pass their once-in-a-generation health-care legislation. Republicans didn’t undertake the same sorts of efforts. Nor did they develop a clear sense of what they were trying to achieve, except to hack away at the welfare state and destroy Obama’s legacy. Those are big reasons their legislation failed.

Obamacare’s survival says a lot about the differences between the two parties nowadays, and not just on health care. It’s a sign of how different they have become, in temperament as much as ideology, and why one has shown that it’s capable of governing and the other has nearly forgotten how.

Democrats were so serious about health care that they began planning what eventually became the Affordable Care Act more than a decade earlier, following the collapse of Bill Clinton’s reform attempt in the 1990s. The ensuing political backlash, which saw them lose control of both the House and Senate, had left top Democrats in no mood to revisit the issue. But reform’s champions knew that another opportunity would come, because America’s sick health-care system wouldn’t heal itself, and they were determined not to make the same mistakes again.

At conferences and private dinners, on chat boards and in academic journals, officials and policy advisers obsessively analyzed what had gone wrong and why—not just in 1993 and 1994 but in the many efforts at universal coverage that had come before. They met with representatives of the health-care industry as well as employers, labor unions, and consumer advocates. Industry lobbyists had helped kill reform since Harry Truman’s day. Now they were sitting down with the champions of reform, creating a group of “strange bedfellows” committed to crafting a reform proposal they could all accept.

Out of these parallel efforts, a rough consensus on substance and strategy emerged. Democrats would put forward a plan that minimized disruption of existing insurance arrangements, in order to avoid scaring people with employer coverage, and they would seek to accommodate rather than overpower the health-care industry. The proposal would err on the side of less regulation, spending, and taxes—basically, anything that sounded like “big government”—and Democrats would work to win over at least a few Republicans, because that would probably be necessary in Congress.

Proof of concept came in 2006, in Massachusetts, when its Republican governor, Mitt Romney, teamed up with the Democratic state legislature to pass a plan that fit neatly into the new vision. It had the backing of a broad coalition, including insurers and progressive religious organizations. Ted Kennedy, the liberal icon and U.S. senator, played a key role by helping secure changes in funding from Washington that made the plan possible. “My son said something … ‘When Kennedy and Romney support a piece of legislation, usually one of them hasn’t read it,’” Kennedy joked at the signing ceremony, standing at Romney’s side.

Kennedy’s endorsement said a lot about the psychology of Democrats at the time. No figure in American politics was more closely associated with the cause of universal health care and, over the years, he had tried repeatedly to promote plans that looked more like the universal-coverage regimes abroad, with the government providing insurance directly in “single-payer” systems that resembled what today we call “Medicare for All.” But those proposals failed to advance in Congress, and Kennedy frequently expressed regret that, in the early 1970s, negotiations over a more private sector-oriented coverage plan with then-President Richard Nixon had broken down, in part because liberals were holding out for a better deal that never materialized.

Kennedy was not alone in his belief that the champions of universal coverage would have to accept big concessions in order to pass legislation. The liberal House Democrats John Dingell, Pete Stark, and Henry Waxman, veteran crusaders for universal coverage who’d accrued vast power over their decades in Congress, were similarly willing to put up with what they considered second-, third-, and even fourth-best solutions—and they were masters of the legislative process, too. Waxman in particular was an expert at doing big things with small political openings, such as inserting seemingly minor adjustments to Medicaid into GOP legislation, expanding the program’s reach over time. “Fifty percent of the social safety net was created by Henry Waxman when no one was looking,” Tom Scully, who ran Medicare and Medicaid for the Bush administration in the early 2000s, once quipped.

Obama had a similar experience putting together health-care legislation in the Illinois state legislature—where, despite proclaiming his support for the idea of a single-payer system, he led the fight for coverage expansions and universal coverage by working with Republicans and courting downstate, more conservative voters. He also was a master of policy detail, and as president, when it was time to stitch together legislation from different House and Senate versions, he presided over meetings directly (highly unusual for a president) and got deep into the weeds of particular programs.

Obama could do this because the concept of universal coverage fit neatly within . . .

Continue reading. There’s much more.

Later in the column:

Another problem was a recognition that forging a GOP consensus on replacement would have been difficult because of internal divisions. Some Republicans wanted mainly to downsize the Affordable Care Act, others to undertake a radical transformation in ways they said would create more of an open, competitive market. Still others just wanted to get rid of Obama’s law and didn’t especially care what, if anything, took its place.

“The homework that hadn’t been successful was the work to coalesce around a single plan, a single set of specific legislative items that could be supported by most Republicans,” Price told me. “Clearly, looking at the history of this issue, this has always been difficult for us because there are so many different perspectives on what should be done and what ought to be the role of the federal government in health care.”

The incentive structure in conservative politics didn’t help, because it rewarded the ability to generate outrage rather than the ability to deliver changes in policy. Power had been shifting more and more to the party’s most extreme and incendiary voices, whose great skill was in landing appearances on Hannity, not providing for their constituents. Never was that more apparent than in 2013, when DeMint, Senator Ted Cruz of Texas, and some House conservatives pushed Republicans into shutting down the government in an attempt to “defund” the Affordable Care Act that even many conservative Republicans understood had no chance of succeeding.

The failure to grapple with the complexities of American health care and the difficult politics of enacting any kind of change didn’t really hurt Republicans until they finally got power in 2017 and, for the first time, had to back up their promises of a superior Obamacare alternative with actual policy. Their solution was to minimize public scrutiny, bypassing normal committee hearings so they could hastily write bills in the leadership offices of House Speaker Paul Ryan and, after that, Senate Majority Leader Mitch McConnell.

Written by LeisureGuy

28 March 2021 at 4:52 pm

Puddles: Tears, butterflies, and the shootings in Atlanta


Sabrina Imbler writes in Sierra, the magazine of the Sierra Club:

This past week, I have been trying to figure out if a puddle is a body of water.

According to Wikipedia, a body of water is defined as a significant accumulation of water, such as an ocean, a sea, or a lake. When geographers map out bodies of water, they include oceans and lakes, perhaps even ponds, but not puddles. A puddle is defined as a small accumulation of water on a surface. I have to wonder, is “small” significant? What about “very small”? How much water must you hold to be considered a body of water?

As a mixed Asian American person, I have spent a lifetime trying to understand how small something like an experience can be and still be considered significant. How small I can be and still be significant.

I have been thinking about puddles because they are the only bodies of water I see nowadays. In Brooklyn, where I live, puddles accumulate by sidewalks and surround intersections, meaning you have to look down to know where to step. Sometimes, after rainfall but before the murk and trash sets in, you can see a glimmer of yourself, or how you are seen.

Last spring, amid a first wave of lockdowns—after my mom sent me an email cautioning me, an Asian asthmatic, not to cough in public—a man spit at me, maybe. I wasn’t sure. He was standing on a corner and I had just walked past him on the otherwise empty street. His spit landed on my shoe, and I faltered for a second but kept walking. When I looked back, I saw him watching me. When he didn’t say anything, I figured I was assuming too much, that I had been the one to intrude in his pre-planned spitting, that it was ingloriously vain of me to assume that he meant to spit on me. A few blocks away, surrounded by brownstones and shuttered shops—no storefront glass in sight—I looked at myself in a puddle as if this could answer my question. I saw a face mask and a beanie and then the only part of my face that was exposed: my eyes. I returned from my destination—a Japanese restaurant converted into a grocery store—and passed by a mailbox with a directive in Sharpie: Go back to China! As I walked home, I wondered, was this significant?

I have been thinking about puddles this past week because I have been crying, in fits and bursts, leaking enough tears and mucus that I could form a very small, probably insignificant, puddle. I did not cry when I learned about the shooting at the spas in Atlanta—where a white man shot eight people, six of whom were Asian women—but I cried later that night, while I was brushing my teeth. I am not a woman, but I am reminded constantly by strangers that I am seen as a woman, objectified as an Asian woman. I thought about the images I’d seen in past months of Asian elders shoved, assaulted, and slashed, many of whom lived in towns near where my own grandparents live. My grandpa, a 98-year-old man who wears flat caps and speaks mostly in Mandarin these days, walks around his neighborhood for an hour each day. I wondered, should I ask him to stop?

I do not mean to equate my Asian American experience with the experience of the women killed in Atlanta. Asian massage workers face violence, racism, and sexism every day, Elene Lam, the executive director of Butterfly, a support network for Asian and migrant sex workers in Toronto, told The Cut. Their work is stigmatized, precarious, criminalized, and overpoliced, regardless of whether they are sex workers. They may lack legal protections or be excluded from other jobs due to their immigration status or language barriers. “Those women were assumed to be sex workers & therefore not worthy of safety,” tweeted the writer and social worker Kai Cheng Thom in a thread about the shootings. I felt frustrated at the futility of my tears; they were not helping the victims or the families left to grieve the losses of their daughters, mothers, grandmothers.

When I was in high school, I learned that puddles, bereft of flow, could become vectors of disease. Standing water is dangerous because it is a breeding ground for mosquitoes that spread diseases such as malaria and dengue. I did not learn until much later that when Chinese women began immigrating to California in the 19th century, white health professionals and legislators cast these women as a threat to American morality and a contagion to public health. The president of the American Medical Association warned of a (completely fictitious) sexually transmitted disease that was only carried by Chinese women, Mari Uyehara writes in The Nation. In 1875, the US passed the Page Act, which effectively banned Chinese women from immigrating.

Puddles may not be significant to geographers, but they are significant to wildlife, particularly butterflies. Adult butterflies can only consume liquids, which they imbibe through their spiraling proboscises. They subsist almost entirely on a diet of nectar, rich in sugar but devoid of sodium. Butterflies must seek out sodium elsewhere in liquid form. So they resort to what’s known as puddling, seeking out minerals in water and damp substrates. Shallow puddles are safer havens for such small creatures than the surging currents of rivers or the depths of a pond. Butterflies in Sulawesi, . . .

Continue reading. The conclusion is powerful.

Written by LeisureGuy

28 March 2021 at 11:35 am

Elite panic


I have observed, as perhaps you have as well, that wealth seems to make people fearful, and that as wealth increases, more and more stringent forms of security are embraced. Rebecca Solnit has an interesting Facebook post on this pathology. She writes:

The marauding hordes of the underclass are a topic of constant fantasy among elites, so much so that two of the sociologists I cited in A Paradise Built in Hell labeled this delusion “elite panic.” It often justifies what you could call marauding hordes of the overclass — suppressing the people they assume are bestial but also, at some level, acknowledge are legitimately resentful of social inequality, which they [the overclass] are willing to use violence to perpetuate.

In a way the premise of white supremacy is “your imaginary violence is the justification for my real violence,” and here’s Graham trotting that out as “the violence I imagine could happen in extreme situations is my justification for pushing instruments of extreme violence into everyday life.”

Those sociologists also demonstrate that most people are altruistic, generous, resourceful, and helpful in disasters. Note the alignment of racist fantasies here — gangs, cops, white people with weapons of war. But what that violence from elites and authorities is really used for is to maintain the status quo, and there’s a way mass shootings do so, as attacks on women, immigrants, people of color, perceived enemies to be punished by people who have allocated the right to punish unto death.

Sociologist Kathleen Tierney, author of the book Disasters: A Sociological Approach and director of the University of Colorado’s Natural Hazards Center, gave a riveting talk at the University of California, Berkeley, for the centennial of the 1906 earthquake. In the talk she stated, “Elites fear disruption of the social order, challenges to their legitimacy.” She reversed the image of a panicking public and a heroic minority to describe what she called “elite panic.” She itemized its ingredients as “fear of social disorder; fear of poor, minorities and immigrants; obsession with looting and property crime; willingness to resort to deadly force; and actions taken on the basis of rumor.”

In other words, it is the few who behave badly and the many who rise to the occasion. And those few behave badly not because of facts but because of beliefs: they believe the rest of us are about to panic or become a mob or upend property relations, and in their fear they act out to prevent something that may have only existed in their imaginations. Thus the myth of malevolent disaster behavior becomes something of a self-fulfilling prophecy. Elsewhere she adds, “The media emphasis on lawlessness and the need for strict social control both reflects and reinforces political discourse calling for a greater role for the military in disaster management. Such policy positions are indicators of the strength of militarism as an ideology in the United States.”

From their decades of meticulous research, most of the disaster sociologists have delineated a worldview in which civil society triumphs and existing institutions often fail during disaster. They quietly endorse much of what anarchists like Kropotkin have long claimed, though they do so from a studiously neutral position buttressed by quantities of statistics and carefully avoid prescriptions and conclusions about the larger social order. And yet, they are clear enough that in disaster we need an open society based on trust in which people are free to exercise their capacities for improvisation, altruism, and solidarity. In fact, we need it all the time, only most urgently in disaster.

Written by LeisureGuy

28 March 2021 at 10:57 am

Did the Black Death Rampage Across the World a Century Earlier Than Previously Thought?


David M. Perry writes in Smithsonian:

For over 20 years, I’ve been telling the same story to students whenever I teach European history. At some point in the 14th century, the bacterium Yersinia pestis somehow moved out of the rodent population in western China and became wildly infectious and lethal to humans. This bacterium caused the Black Death, a plague pandemic that moved from Asia to Europe in just a few decades, wiping out one-third to one-half of all human life wherever it touched. Although the plague pandemic definitely happened, the story I’ve been teaching about when and where it happened, and about the history of the bacterium, has apparently been incomplete at best.

In December, the historian Monica Green published a landmark article, “The Four Black Deaths,” in the American Historical Review that rewrites our narrative of this brutal and transformative pandemic. In it, she identifies a “big bang” that created four distinct genetic lineages that spread separately throughout the world and finds concrete evidence that the plague was already spreading from China to central Asia in the 1200s. This discovery pushes the origins of the Black Death back by over a hundred years, meaning that the first wave of the plague was not a decades-long explosion of horror, but a disease that crept across the continents for over a hundred years until it reached a crisis point.

As the world reels beneath the strains of its own global pandemic, the importance of understanding how humans interact with nature both today and throughout the relatively short history of our species becomes more critical. Green tells me that diseases like the plague and arguably SARS-CoV-2 (before it transferred into humans in late 2019 causing Covid-19) are not human diseases, because the organism doesn’t rely on human hosts for reproduction (unlike human-adapted malaria or tuberculosis). They are zoonotic, or animal diseases, but humans are still the carriers and transporters of the bacteria from one site to the other, turning an endemic animal disease into a deadly human one.

The Black Death, as Monica Green tells me, is “one of the few things that people learn about the European Middle Ages.” For scholars, the fast 14th-century story contained what Green calls a “black hole.” When she began her career in the 1980s, we didn’t really know “when it happened, how it happened, [or] where it came from!” Now we have a much clearer picture.

“The Black Death and other pre-modern plague outbreaks were something everyone learned about in school, or joked about in a Monty Python-esque way. It wasn’t something that most of the general public would have considered particularly relevant to modernity or to their own lives,” says Lisa Fagin Davis, executive director of the Medieval Academy of America. But now, “with the onset of the Covid-19 pandemic, suddenly medieval plagues became relevant to everyone everywhere.”

The project that culminated in Green’s article unfolded over many years. She says that the first step required paleogenetic analysis of known victims of the plague, including a critical 2011 study. Paleogenetics is the study of preserved organic material—really any part of the body or the microbiome, down to the DNA—of long-dead organisms. This means that if you can find a body, or preferably a lot of bodies, that you’re sure died in the Black Death, you can often access the DNA of the specific disease that killed them and compare it to both modern and other pre-modern strains.

This has paid off in numerous ways. First, as scientists mapped the genome of the bacterium, they put to rest long-lingering doubts about the role Y. pestis played in the Black Death (there was widespread but unsubstantiated speculation that other diseases were at fault), and they began building a dataset that revealed how the bacterium had evolved over time. Green was in London in 2012 just as findings on the London plague cemetery came out, confirming without a doubt both the identity of the bacterium and the specific genetic lineage of the plague that hit London in June 1348. “The Black Death cemetery in London is special because it was created to accommodate bodies from the Black Death,” she says, “and then when [the plague wave] passed, they closed the cemetery. We have the paperwork!”

Green established herself as the foremost expert in medieval women’s healthcare with her work on a medical treatise known as The Trotula. Her careful analysis of manuscript traditions revealed that some of the text was attributable to a southern Italian woman, Trota. Other sections, though, revealed male doctors’ attempts to take over the market for women’s health. It’s a remarkable text that prepared Green for her Black Death project not only by immersing her in the history of medicine, but methodologically as well. Her discipline of philology, the study of the development of texts over time, requires comparing manuscripts to each other, building a stemma, or genealogy of texts, from a parent or original manuscript. She tells me that this is precisely the same skill one needs to read phylogenetic trees of mutating bacteria in order to trace the history of the disease.

Still, placing the Black Death in 13th-century Asia required more than genetic data. Green needed a . . .

Continue reading.

Written by LeisureGuy

28 March 2021 at 10:13 am

The forgotten medieval fruit with a vulgar name


Zaria Gorvett writes at BBC of a once-popular fruit now almost forgotten:

In 2011, archaeologists found something unusual in a Roman toilet.

The team were excavating the ancient village of Tasgetium (now Eschenz, Switzerland), ruled by a Celtic king who was personally given the land by Julius Caesar. It was built on the banks of the river Rhine, along what was then an important trade route – and as a result, its remains have been steeped in water ever since. What should have rotted away centuries ago was uncovered in a remarkable state of preservation, protected by the lack of oxygen in the boggy conditions.

It was here that, nestled among the remains of familiar foods such as plums, damsons, cherries, peaches and walnuts in an ancient cesspit, the archaeologists found 19 curiously large seeds. Though they were, let’s say, “deposited” there nearly 2,000 years ago, they almost looked fresh enough to have been found yesterday – except that the fruit they belong to is now so obscure, it can baffle even professional botanists.

The polite, socially acceptable name by which it’s currently known is the medlar. But for the best part of 900 years, the fruit was called the “open-arse” – thought to be a reference to the appearance of its own large “calyx” or bottom. The medlar’s aliases abroad were hardly more flattering. In France, it was variously known as “la partie postérieure de ce quadrupède” (the posterior part of this quadruped), “cu d’singe” (monkey’s bottom), “cu d’ane” (donkey’s bottom), and “cul de chien” (dog’s bottom)… you get the idea.

And yet, medieval Europe was crazy about this fruit.

The first record of the medlar’s existence is a fragment of Greek poetry from the 7th Century BC. Eventually the fruit is thought to have fallen into the hands of the Romans, who brought it to southern France and Britain. In 800 AD, Charlemagne included it on a list of plants that were mandatory in the king’s many gardens, and nearly 200 years later, the English abbot and writer Ælfric of Eynsham first committed its rather rude sobriquet to the public record.

From there, the fruit’s popularity steadily increased. It became a staple of medieval monasteries and royal courtyards, as well as public spaces such as village greens.

It’s featured in Chaucer’s Canterbury Tales, Shakespeare’s Romeo and Juliet, and the two-time queen consort Anne of Brittany’s Book of Hours – a kind of illustrated religious manuscript popular in the Middle Ages. Henry VIII had the medlar planted at Hampton Court, and gifted his French counterpart with large quantities.

The fruit reached its peak in the 1600s when it was widely grown across England – as ordinary as apples, pears, mulberries and quince. From this lofty pinnacle, it underwent a steady decline. It was still widely known until the early 20th Century, though less celebrated. Then in the 1950s it abruptly vanished from the public consciousness altogether.

Once a household name, described by one Roman commentator as amounting “almost to a craze”, now the medlar is primarily grown as a romantic relic from the past – a niche plant for eccentric gardeners and a historical curiosity at palaces and museums.

Just a few decades after it disappeared, it was already mysterious to many greengrocers. In 1989, one American academic wrote that “probably not one in a hundred” botanists had seen a medlar. Today it’s not sold at a single British supermarket. Where there are still plants growing in public spaces, they often go unrecognised and are left to rot on the ground.

What was it about this strange fruit that gripped medieval Europe, and why did it disappear? . . .

Continue reading.

Written by LeisureGuy

28 March 2021 at 10:07 am

Does a low-carb/ketogenic diet help diabetes? Or make it worse?


As it turns out, a low-carb/ketogenic diet can reduce the symptoms of diabetes (high blood-glucose readings) while having no effect on the disease itself, just as aspirin can reduce a fever while doing nothing to cure the pneumonia behind it. In fact, it’s even worse than that: a low-carb/ketogenic diet can reduce the symptoms while making the disease itself worse. It is an example of “bending the needle”: responding to a dangerous situation, in which the needle on a gauge has moved into the red zone, by bending the needle so that it no longer points into the red. That is not a solution, and it can lead to disaster.

Watch this brief video (and persist through the awkward metaphors in the middle: he does return to study results).

And for a more detailed explanation of how a low-carb/ketogenic diet has detrimental effects on one’s health:

Written by LeisureGuy

28 March 2021 at 6:42 am
