Later On

A blog written for those whose interests more or less match mine.

Archive for the ‘Memes’ Category

Capitalism and democracy are not synonyms

Heather Cox Richardson:

All day, I have been coming back to this: How have we arrived at a place where 90% of Americans want to protect our children from gun violence, and yet those who are supposed to represent us in government are unable, or unwilling, to do so?

This is a central problem not just for the issue of gun control, but for our democracy itself.

It seems that during the Cold War, American leaders came to treat democracy and capitalism as if they were interchangeable. So long as the United States embraced capitalism, by which they meant an economic system in which individuals, rather than the state, owned the means of production, liberal democracy would automatically follow.

That theory seemed justified by the fall of the Soviet Union in 1991. The crumbling of that communist system convinced democratic nations that they had won, they had defeated communism, their system of government would dominate the future. Famously, in 1992, political philosopher Francis Fukuyama wrote that humanity had reached “the universalization of Western liberal democracy as the final form of human government.” In the 1990s, America’s leaders believed that the spread of capitalism would turn the world democratic as it delivered to them global dominance, but they talked a lot less about democracy than they did about so-called free markets.

In fact, the apparent success of capitalism actually undercut democracy in the U.S. The end of the Cold War was a gift to those determined to destroy the popular liberal state that had regulated business, provided a basic social safety net, and invested in infrastructure since the New Deal. They turned their animosity from the Soviet Union to the majority at home, those they claimed were bringing communism to America. “For 40 years conservatives fought a two-front battle against statism, against the Soviet empire abroad and the American left at home,” right-wing operative Grover Norquist said in 1994. “Now the Soviet Union is gone and conservatives can redeploy. And this time, the other team doesn’t have nuclear weapons.”

Republicans cracked down on Democrats trying to preserve the active government that had been in place since the 1930s. Aided by talk radio hosts, they increasingly demonized their domestic political opponents. In the 1990 midterm elections, a political action committee associated with House Republican whip Newt Gingrich gave to Republican candidates a document called “Language: A Key Mechanism of Control.” It urged candidates to label Democrats with words like “decay,” “failure,” “crisis,” “pathetic,” “liberal,” “radical,” “corrupt,” and “taxes,” while defining Republicans with words like “opportunity,” “moral,” “courage,” “flag,” “children,” “common sense,” “hard work,” and “freedom.” Gingrich later told the New York Times his goal was “reshaping the entire nation through the news media.”

Their focus on capitalism undermined American democracy. They objected when the Democrats in 1993 made it easier to . . .

Continue reading.

Written by Leisureguy

26 May 2022 at 12:03 am

The reinvention of a ‘real man’

Cultural change comes slowly, one person at a time, each change a paradigm shift from one way of understanding how the world works (that is, understanding the interplay and intermeshing of individuals in their cultural matrix) to another way. Because people are to a large extent — in their outlook, their values, their behaviors — an assemblage of memes (patterns learned through imitation and taught the same way), changing a culture means changing those who live within it (and within whom the culture lives). This is slow work, particularly since many if not most will view such a change as a threat almost as real as death: if they become different as a person, the person they now are will no longer exist, and that threat to identity is as frightening as a threat to life, for it is indeed the life of that identity that’s at stake.

Jose A. Del Real reports in the Washington Post about a public health worker who is trying to change the cultural view of manhood (gift link, no paywall).

In BUFFALO, Wyoming —

Bill Hawley believes too many men are unwilling or unable to talk about their feelings, and he approaches each day as an opportunity to show them how.

“There’s my smile,” he says to a leathered cowboy in the rural northeast Wyoming town where he lives.

“I could cry right now thinking about how beautiful your heart is,” he says to a middle-aged male friend at work.

“After our conversation last week, your words came back to me several times,” he tells an elderly military veteran in a camouflage vest. “Make of that what you will, but it meant something to me.”

On paper, Bill is the “prevention specialist” for the public health department in Johnson County, a plains-to-peaks frontier tract in Wyoming that is nearly the size of Connecticut but has a population of 8,600 residents. His official mandate is to connect people who struggle with alcohol and drug abuse, tobacco addiction, and suicidal impulses to the state’s limited social service programs. Part bureaucrat, part counselor, much of Bill’s life revolves around Zoom calls and subcommittees, government acronyms and grant applications.

But his mission extends beyond the drab county building on Klondike Drive where he works. One Wyoming man at a time, he hopes to till soil for a new kind of American masculinity.

His approach is at once radical and entirely routine.

It often begins with a simple question.

“How are you feeling?” Bill asks the man in camouflage, who lives in the Wyoming Veterans’ Home, which Bill visits several times a week. Bill recently convinced him to quit smoking cigarettes.

The man lumbers forward on a walker, oxygen tank attached.

“We can talk about triggers for a hot minute, or six, or 10,” Bill encourages him. “All those things are going to try to sneak up on you and trick you.”

“I’ve got a whole bunch of triggers,” the 72-year-old veteran responds, finally, between violent coughs. “Well they’re called triggers, but they never go away.”

Here in cowboy country, the backdrop and birthplace of countless American myths, Bill knows “real men” are meant to be stoic and tough. But in a time when there are so many competing visions of masculinity — across America and even across Wyoming — Bill is questioning what a real man is anyway.

Often, what he sees in American men is despair.

Across the United States, men accounted for 79 percent of suicide deaths in 2020, according to a Washington Post analysis of new data from the Centers for Disease Control and Prevention, which also shows Wyoming has the highest rate of suicide deaths per capita in the country. A majority of suicide deaths involve firearms, of which there are plenty in Wyoming, and alcohol or drugs are often a factor. Among sociologists, the Mountain West is nicknamed “The Suicide Belt.”

More and more, theories about the gender gap in suicides are focused on the potential pitfalls of masculinity itself.

The data also contains a sociological mystery even the experts are unsure how to explain fully: Of the 45,979 people who died by suicide in the United States in 2020, about 70 percent were White men, who are just 30 percent of the country’s overall population. That makes White men the highest-risk group for suicide in the country, especially in middle age, even as they are overrepresented in positions of power and stature in the United States. That rate has steadily climbed over the past 20 years.

Some clinical researchers and suicidologists are now asking whether there is something particular about White American masculinity worth interrogating further. The implications are significant: On average, there are more than twice as many deaths by suicide as by homicide each year in the United States.

Bill, who is 59 years old and White, is working out his own theory. It has to do with the gap between . . .

Continue reading. (gift link, no paywall)

Written by Leisureguy

23 May 2022 at 11:03 am

What the Vai Script Reveals About the Evolution of Writing

Piers Kelly in Sapiens describes an interesting example of the evolution of a set of memes. He writes:

In a small West African village, a man named Momolu Duwalu Bukele had a compelling dream. A stranger approached him with a sacred book and then taught him how to write by tracing a stick on the ground. “Look!” said the spectral visitor. “These signs stand for sounds and meanings in your language.”

Bukele, who had never learned to read or write, found that after waking he could no longer recall the precise signs the stranger revealed to him. Even so, he gathered the male members of his family together to reverse engineer the concept of writing. Working through the day and into the following night, the men devised a system of 200 symbols, each standing for a word or a syllable of their native Vai language. For millennia, varieties of the Vai language had been passed down from parents to children—but before this moment no speaker had ever recorded a single word in writing.

This took place in about 1833 in a region that would soon become the independent nation of Liberia. Vai, one of about 30 Indigenous languages of Liberia, has nearly 200,000 speakers today in the Cape Mount region that borders Sierra Leone.

Within just a few generations, Bukele’s invention was being used for penning letters, engraving jewelry, drafting carpentry plans, keeping personal diaries, and managing accounts. Vai people manufactured their own ink from crushed berries and even built schools for teaching the new system. The script was so successful that other Indigenous groups in the region were inspired to create their own; since the 1830s, at least 27 new scripts have been invented for West African languages.

Today the Vai writing system is taught at the University of Liberia and is even popular among students who are not themselves ethnically Vai. The Vai script has been included in the Unicode Standard, which means Vai speakers with smartphones can now exchange text messages in the script.
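The Unicode point is concrete enough to check for yourself. As a minimal sketch (assuming a Python environment; the block range U+A500–U+A63F for Vai is my own addition, not stated in the article), the standard `unicodedata` module can list Vai characters by their official names:

```python
import unicodedata

# The Vai script occupies the Unicode block U+A500..U+A63F (assumed here;
# the article says only that Vai is in the Unicode Standard).
VAI_BLOCK = range(0xA500, 0xA640)

# Print the first few code points in the block with their official names.
for cp in list(VAI_BLOCK)[:5]:
    name = unicodedata.name(chr(cp), "<unassigned>")
    print(f"U+{cp:04X}  {name}")
```

Character names in this block begin with “VAI”, which is what lets fonts, keyboards, and messaging apps treat the script as ordinary first-class text rather than images of glyphs.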


As a linguistic anthropologist, I am fascinated by the Vai discovery—and especially how the script has become critical for understanding the evolution of writing itself.

It’s not the first time in recent history that a new writing system has been invented from scratch. In the 1820s, the nonliterate polymath Sequoyah created a special script for his native Cherokee language, and similar Indigenous inventions have emerged elsewhere in the world on the margins of expanding colonies. But the evolution of Vai has been especially well-documented, making it a useful case study for researchers of writing.

In a recently published paper, my colleagues and I show that over the past two centuries the letter shapes in the Vai script have evolved via “compression”—a process by which written signs are gradually reproduced with less visual detail while conveying the same amount of linguistic information.

The theory that written signs compress over time has a long history with several versions. For instance,

Continue reading.

Written by Leisureguy

20 May 2022 at 8:21 pm

When Libertarianism is put into practice, it fails

Libertarians tend to argue from logic, but as Oliver Wendell Holmes, Jr. observed, “The life of the law has not been logic; it has been experience.” Things that make sense logically — for example, that heavy objects fall faster than light objects — often fail the test of experience, and so it is with Libertarianism.

This came to mind when I saw a comment on Facebook that whenever Socialism has been tried, it has failed. I strongly suspect the commenter does not know that a Socialist and a Social Democrat are not the same thing, and that in fact Social Democratic countries have been highly successful (currently more successful than the US by a variety of measures: health, education, happiness, and so on).

It occurred to me that it would be useful to collect links to various accounts of what happens when Libertarianism is put into practice, when it stops being merely a logical idea and becomes a lived experience. Here are some I’ve collected; I’ll add to this list as I come across others:

  • Sears Corporation — It’s been observed that corporations, internally, are run on Socialist principles: centralized planning, shared costs and shared resources, mutual cooperation in terms of achieving the overall goal. In some areas, competition prevails — for example, sales people often compete (though care must be taken: one way to win competitions is to cheat by crippling competitors in one way or another). But one CEO had the insight and fortitude to switch his corporation — Sears — to a Libertarian structure and approach.
  • Von Ormy, “The Freest Little City in Texas” — Government is law as lived: government is the working out of a network of laws. So when an entire city switched to operating on Libertarian principles, the lived experience revealed the shortcomings of the logic.
  • Grafton NH and the Free State Project — You may think that the root cause of Von Ormy’s problems was that it was in Texas (home of power-grid failures, one of which resulted in the deaths of around 700 people), but Libertarianism’s failures are not limited by state. The movement in New Hampshire and its effect on the small town of Grafton show that the problem is Libertarianism, not the state. Here’s a different review of the book.
  • Colorado Springs — This city provides a good example of unenlightened self-interest at work, and shows the god-awful products of that work.

Ayn Rand’s novels show the fictional triumphs of an idea, but it is Quixotic (literally) to confuse fiction with reality and to cling to logic while ignoring experience.

Update: See also “The True History of Libertarianism in America: A Phony Ideology to Promote a Corporate Agenda.”

Written by Leisureguy

18 May 2022 at 11:40 am

Social media as agents in the breakdown of shared understanding

The above image is from an interesting and useful article by the Center for Humane Technology. The article begins:

In our last newsletter, we unpacked why technology is never neutral. Social media is no exception. Social media doesn’t simply reflect society; it shapes society. 

The world we see through social media is distorted, like looking into a funhouse mirror. These distortions are negative externalities of an advertising-driven, engagement-maximizing business model, which affects people and relationships in myriad ways.


  1. The Extreme Emotion Distortion 🥵 occurs because users have access to virtually unlimited amounts of personalized, emotional content; any user can find overwhelming evidence for their deeply held beliefs. This situation creates contradicting “evidence-based” views, resulting in animosity and a fracturing of our collective sensemaking.
  2. The Information Flooding Distortion 🤯 happens as algorithms and bots flood or curate the information users see based on their likelihood to engage with it, resulting in users believing that what is popular (e.g., hashtags, comments, trends) is public consensus, when in fact it can also be a manipulated distortion.
  3. The Micro-Targeting Distortion 🔬 happens as . . .

Continue reading.

Later in the article:


These distortions don’t just affect individuals. Over time these distortions warp society’s perception of reality, breaking down our ability to find shared understanding.

Shared understanding is needed for . . .

It’s important to note that the article is not simply a jeremiad. It includes:


We can uphold open society values by enabling . . .

The whole thing is worth reading.

I highly recommend subscribing to their (free) newsletter, The Catalyst.

Written by Leisureguy

15 May 2022 at 5:49 am

The Strange Afterlife of George Carlin

Dave Itzkoff has a good NY Times article on George Carlin (gift link, no paywall). It begins:

In the closing monologue from a recent episode of his HBO talk show, Bill Maher cataloged a series of social conditions that he suggested were hampering stand-up comedy and imperiling free speech: cancel culture, a perceived increase of sensitivity on college campuses, and Will Smith slapping Chris Rock at the Oscars.

Near the end of his remarks, Maher invoked the comedian George Carlin, a personal hero whose iconoclastic spirit, he seemed to believe, could never thrive in such a thin-skinned and overly entitled era. “Oh, George,” he said, “it’s a good thing you’re dead.”

Carlin, the cantankerous, longhaired sage who used his withering insight and gleefully profane vocabulary to take aim at American hypocrisy, died in 2008. But in the years since, it can feel like he never really left us.

On an almost daily basis, parts of Carlin’s routines rise to the surface of our discourse, and he is embraced by people who span the political spectrum — they may rarely agree with each other, but they are certain that Carlin would agree with them.

Carlin’s rueful 1996 routine about conservatives’ opposition to abortion (“they will do anything for the unborn, but once you’re born, you’re on your own”) became a newly viral phenomenon and was shown on a recent broadcast of the MSNBC program “11th Hour.” A video clip of a Carlin bit about how Americans are ravenous for war (“so we’re good at it, and it’s a good thing we are — we’re not very good at anything else anymore!”) has been tweeted by Representative Ilhan Omar, Democrat of Minnesota. On the right-wing website Breitbart, Carlin has been cited as an expert on bipartisanship (“the word bipartisan usually means some larger-than-usual deception is being carried out”) and hailed as a rebel who didn’t acquiesce to authority.

Carlin is a venerated figure in his chosen field who unites performers as disparate as Joe Rogan and Jim Gaffigan, but he’s also someone whose influence transcends comedy. He is a touchstone shared by . . .

Continue reading (gift link, no paywall).

Written by Leisureguy

11 May 2022 at 10:40 am

Posted in Daily life, Memes, Politics

Trust in science has become increasingly partisan

It is difficult for people to trust what they do not understand, and Republicans, with their rejection of education and embrace of ignorance, understand science hardly at all; moreover, their leaders fervently advocate distrust of science. To exacerbate the problem, those who do understand the science often fail to understand how to communicate it effectively. The result is shown in the graph above, taken from an interesting article by Monica Potts in FiveThirtyEight, which begins:

By September 2021, the scientists and staffers at the Arkansas Game and Fish Commission had gathered enough data to know that the trees in its green-tree reservoirs — a type of hardwood wetland ecosystem — were dying. At Hurricane Lake, a wildlife management area of 17,000 acres, the level of severe illness and death in the timber population was up to 42 percent, especially for certain species of oak, according to a 2014 forest-health assessment. The future of another green-tree reservoir, Bayou Meto, more than 33,000 acres, would look the same if they didn’t act quickly.

There were a lot of reasons the trees were dying, but it was also partly the commission’s fault. Long ago, the Mississippi and Arkansas rivers and their tributaries would have flooded the bayous naturally, filling bottomland forests during the winter months when the trees were dormant and allowing new saplings to grow after the waters receded in the spring. Widespread European settlement and agriculture largely halted the natural flooding, but in the 1950s, the Arkansas Game and Fish Commission began buying bottomland forests for preservation, which it then flooded with a system of levees and other tools.

This made the forests an ideal winter stop for ducks to eat and rest on their annual migration south. Arkansas is a magnet for duck hunters, and the state has issued more than 100,000 permits for duck hunters from Arkansas and out of state for every year since 2014. But it turned out the commission was flooding the reservoirs too early and at levels too high, which was damaging the trees. The ducks that arrive in Arkansas especially love eating the acorns from a certain species of oak — and those oaks are now dying.

Austin Booth, director of the Arkansas Game and Fish Commission, knew that convincing the state’s duck hunters and businesses that there was a serious problem would be tricky. Part of the solution the commission planned to propose to save the trees involved delaying the annual fall flooding, which could mean less habitat for the ducks, fewer ducks stopping in the area and more duck hunters crowded into smaller spaces fighting over targets.

And all the duck hunters would have their own ideas about who to blame for the problem and what the solution should be.

Last September, Booth gave a brief speech that was streamed live on YouTube, outlining the problem. He announced a series of public meetings to begin in the following months. Booth told me that when he began to plan those meetings, he thought of all the government meetings and town halls he’d attended after years working in politics. “I wanted to ratchet down some of the intensity that happens when a government official stands up on a stage and talks down to people,” he said.

Instead, he decided the meetings would be dinners where the Game and Fish staff would eat alongside the people they sought to convince. “I just believe there’s a human component to sitting down and having a meal with someone,” he said. At those dinners, he’d give a brief introduction, then invite people to ask questions of the staff as they ate and mingled. 

At the end of the dinners, Booth said he’d stand up again and ask, “Is there anyone that’s going to walk through that door tonight without their questions answered or comments taken for the record, or with their concerns ignored?” No one, he said, came forward. The four dinners were attended by between 50 and 100 people, according to Booth, but those attendees then spread the word, dampening criticism of the new management system.

What’s interesting about this dinner program is that it began during the COVID-19 pandemic, which also required effective science communication to convince the public to accept changes, major and minor, to their lives. Even before this pandemic, there’s been a long history of resistance to public health measures and new vaccines, and many researchers suspected that could likely be the case with COVID-19 as well. The social scientists who study these issues might have counseled an approach like that employed by the Arkansas Game and Fish Commission, using local messengers who had relationships with the communities in question and who could communicate in less intimidating ways.

But the U.S. did not do that with COVID-19. Instead, rapidly changing information came from only a few sources, usually at the national level and seemingly without much strategy. And as such, many places have seen widespread resistance to public health interventions, like wearing masks and getting the vaccine. 

The intensely local, personal way that Arkansas Game and Fish approached this challenge is difficult, time-consuming and perhaps not always the most practical. But it shows the kind of intensity it takes to communicate an urgent problem, and may provide lessons for how to approach the next big problems — whether that’s another pandemic, an ecological disaster or something bigger and more existential, like climate change.

Before the pandemic, Matthew Motta, a political science professor at Oklahoma State University, and his colleagues Timothy Callaghan, Steven Sylvester, Kristin Lunz Trujillo and Christine Crudo Blackburn studied parents’ hesitancy about giving their kids routine vaccinations, like those for measles, mumps and rubella. Reasons varied, and the most prominent was conspiratorial thinking. Some parents who delayed their children’s vaccines also held strong ideas about . . .

Continue reading. There’s much more.

Written by Leisureguy

11 May 2022 at 9:42 am

Beat out that rhythm on your feet — podorythmie from Quebec

This morning I chanced across an essay in the NY Times by Eric Boodman (gift link, no paywall), a reporter for STAT who has written for The Atlantic, Undark, and other publications.  His essay begins:

When I was 17 or so, I worked evenings at a dentist’s office. At first, it carried the thrill of a secret world: The office building was locked — just me and the janitors and the whir of the autoclave. Then it was stultifying. I worked for only two hours at a time, but those two hours stretched out endlessly, a canvas for my teenage dread and insecurity. The families I was calling with appointment reminders often mistook me for a machine. I was there to develop some kind of work ethic, but all I could think about was the awful, oobleck-like quality of time. I tried singing between calls. I looked for constellations in the ceiling tiles. What I remember working best — what still works, when I feel the trapped-bug flutter of a panic attack starting up — is foot percussion.

It’s a ubiquitous sound in Québécois traditional music, a galloping pattern that musicians beat out with their shoes while playing, giving them a Dick-Van-Dyke-like dynamism. If you wanted to be fancy and ethno-musicological, you’d call it podo-rythmie, from the Greek for “feet” and “rhythm.” If you wanted to be down home and colloquial, it would just be tapage de pieds, or foot tapping. In English, it’s sometimes referred to as “doing feet.” It’s the secret weapon that allows a lone fiddler to make a whole room get up and dance.

At my high school in downtown Montreal, my classmates were . . .

Continue reading (gift link, no paywall).

That sparked my curiosity, so I went to YouTube and first found a set of three short basic instructional videos (first, second, third), and then a video of a conversation (with foot-tapping) between two practitioners, one of whom (Alain Lamontagne) originated the term podorythmie. The conversation video was followed by a demonstration with the two playing (violin and harmonica), with rhythmic foot-tapping.

With those as background, I understood more of an actual performance, such as this 4-minute clip:


Written by Leisureguy

30 April 2022 at 6:59 am

Posted in Daily life, Memes, Music, Video

Why the School Wars Still Rage

In the New Yorker of March 14, 2022, Jill Lepore has an article directly related to the Washington Post article in the previous post. These school wars are, to my mind, exactly why some so strongly resist the introduction of critical thinking skills into the school curriculum: they at some level recognize that their own positions and beliefs will not stand up to critical, reasoned thinking.

Just as some do not want some scientific theories taught, or certain books read or analyzed, they also do not want students to learn thinking skills that might call into question ideas strongly embraced. In particular such parents do not want their own children learning — and even worse, practicing — critical thinking skills.

Those who lack such skills do have strong feelings, and generally they are keenly aware that they have a right to those feelings. They do not understand the benefits of subjecting one’s feelings to questioning and reasoning and logic, particularly when they view those feelings as part of their identity. One advantage of a liberal education is that students routinely subject their own feelings and ideas to this sort of critical thinking. In doing so they become familiar with the process and learn from experience that it is not so threatening or harmful as those who have never tried such an exercise imagine; instead, the exercise leads to the shedding of failed ideas and a deeper understanding of the ideas that survive.

One advantage of manmade physical structures — say, a building or a motorcycle or a loaf of bread — is that when they fail, the failure is physically evident and hard to deny. The failure of a manmade cultural structure — an idea or philosophy — is not physically visible and, for those who have made the idea a part of their identity, impossible to see because of the threat to the self were the idea to fail. The stakes are so high that failure is not an option, and they will cling to the idea and reject every argument — however strong, however obvious — against it, because they feel that if the idea fails they will no longer exist as who they are. That is, growth and change are threats to be avoided, not things to be explored and potentially embraced.

In the list of books I frequently recommend is a book by Daniel Goleman, Vital Lies, Simple Truths: The Psychology of Self-Deception, that explains why and how people will avoid seeing things that cause psychological pain. It’s worth reading; the link is to inexpensive secondhand editions.

Lepore writes:

In 1925, Lela V. Scopes, twenty-eight, was turned down for a job teaching mathematics at a high school in Paducah, Kentucky, her home town. She had taught in the Paducah schools before going to Lexington to finish college at the University of Kentucky. But that summer her younger brother, John T. Scopes, was set to be tried for the crime of teaching evolution in a high-school biology class in Dayton, Tennessee, in violation of state law, and Lela Scopes had refused to denounce either her kin or Charles Darwin. It didn’t matter that evolution doesn’t ordinarily come up in an algebra class. And it didn’t matter that Kentucky’s own anti-evolution law had been defeated. “Miss Scopes loses her post because she is in sympathy with her brother’s stand,” the Times reported.

In the nineteen-twenties, legislatures in twenty states, most of them in the South, considered thirty-seven anti-evolution measures. Kentucky’s bill, proposed in 1922, had been the first. It banned teaching, or countenancing the teaching of, “Darwinism, atheism, agnosticism, or the theory of evolution in so far as it pertains to the origin of man.” The bill failed to pass the House by a single vote. Tennessee’s law, passed in 1925, made it a crime for teachers in publicly funded schools “to teach any theory that denies the story of the Divine Creation of man as taught in the Bible, and to teach instead that man has descended from a lower order of animals.” Scopes challenged the law deliberately, as part of an effort by the A.C.L.U. to bring a test case to court. His trial, billed as the trial of the century, was the first to be broadcast live on the radio. It went out across the country, to a nation, rapt.

A century later, the battle over public education that afflicted the nineteen-twenties has started up again, this time over the teaching of American history. Since 2020, with the murder of George Floyd and the advance of the Black Lives Matter movement, seventeen states have made efforts to expand the teaching of one sort of history, sometimes called anti-racist history, while thirty-six states have made efforts to restrict that very same kind of instruction. In 2020, Connecticut became the first state to require African American and Latino American history. Last year, Maine passed “An Act to Integrate African American Studies into American History Education,” and Illinois added a requirement mandating a unit on Asian American history.

On the blackboard on the other side of the classroom are scrawled what might be called anti-anti-racism measures. Some ban the Times’ 1619 Project, or ethnic studies, or training in diversity, inclusion, and belonging, or the bugbear known as critical race theory. Most, like a bill recently introduced in West Virginia, prohibit “race or sex stereotyping,” “race or sex scapegoating,” and the teaching of “divisive concepts”—for instance, the idea that “the United States is fundamentally racist or sexist,” or that “an individual, by virtue of his or her race or sex, is inherently racist, sexist or oppressive, whether consciously or unconsciously.”

While all this has been happening, I’ve been working on a U.S.-history textbook, so it’s been weird to watch lawmakers try their hands at writing American history, and horrible to see what the ferment is doing to public-school teachers. In Virginia, Governor Glenn Youngkin set up an e-mail tip line “for parents to send us any instances where they feel that their fundamental rights are being violated . . . or where there are inherently divisive practices in their schools.” There and elsewhere, parents are harassing school boards and reporting on teachers, at a time when teachers, who earn too little and are asked to do too much, are already exhausted by battles over remote instruction and mask and vaccine mandates and, not least, by witnessing, without being able to repair, the damage the pandemic has inflicted on their students. Kids carry the burdens of loss, uncertainty, and shaken faith on their narrow shoulders, tucked inside their backpacks. Now, with schools open and masks coming off, teachers are left trying to figure out not only how to care for them but also what to teach, and how to teach it, without losing their jobs owing to complaints filed by parents.

There’s a rock, and a hard place, and then there’s a classroom. Consider the dilemma of teachers in New Mexico. In January, the month before the state’s Public Education Department finalized a new social-studies curriculum that includes a unit on inequality and justice in which students are asked to “explore inequity throughout the history of the United States and its connection to conflict that arises today,” Republican lawmakers proposed a ban on teaching “the idea that social problems are created by racist or patriarchal societal structures and systems.” The law, if passed, would make the state’s own curriculum a crime.

Evolution is a theory of change. But in February—a hundred years, nearly to the day, after the Kentucky legislature debated the nation’s first anti-evolution bill—Republicans in Kentucky introduced a bill that mandates the teaching of twenty-four historical documents, beginning with the 1620 Mayflower Compact and ending with Ronald Reagan’s 1964 speech “A Time for Choosing.” My own account of American history ends with the 2021 insurrection at the Capitol, and “The Hill We Climb,” the poem that Amanda Gorman recited at Joe Biden’s Inauguration. “Let the globe, if nothing else, say this is true: / That even as we grieved, we grew.”

Did we, though? In the nineteen-twenties, the curriculum in question was biology; in the twenty-twenties, it’s history. Both conflicts followed a global pandemic and fights over public education that pitted the rights of parents against the power of the state. It’s not clear who’ll win this time. It’s not even clear who won last time. But the distinction between these two moments is less than it seems: what was once contested as a matter of biology—can people change?—has come to be contested as a matter of history. Still, this fight isn’t really about history. It’s about political power. Conservatives believe they can win midterm elections, and maybe even the Presidency, by whipping up a frenzy about “parents’ rights,” and many are also in it for another long game, a hundred years’ war: the campaign against public education.

Before states began deciding what schools would require—from textbooks to vaccines—they had to require children to attend school. That happened in the Progressive era, early in the past century, when a Progressive strain ran through not only the Progressive Party but also the Republican, Democratic, Socialist, and Populist Parties. Lela and John Scopes grew up in Paducah, but they spent part of their childhood in Illinois, which, in 1883, became one of the first states in the Union to make school attendance compulsory. By 1916, nearly every state had mandated school attendance, usually between the ages of six and sixteen. Between 1890 and 1920, a new high school opened every day.

Some families objected, citing “parental rights,” a legal novelty, but courts broadly upheld compulsory-education laws, deeming free public schooling to be essential to democratic citizenship. “The natural rights of a parent to the custody and control of his infant child are subordinate to the power of the state, and may be restricted and regulated by municipal laws,” the Indiana Supreme Court ruled in 1901, characterizing a parent’s duty to educate his children as a “duty he owes not to the child only, but to the commonwealth.” As Tracy Steffes argues in “School, Society, and State: A New Education to Govern Modern America, 1890-1940” (2012), “Public schooling was not just one more progressive reform among many but a major—perhaps the major—public response to tensions between democracy and capitalism.” Capitalism divided the rich and the poor; democracy required them to live together as equals. Public education was meant to bridge the gap, as wide as the Cumberland.

Beginning in the eighteen-nineties, states also introduced textbook laws, in an attempt to wrest control of textbook publishing from what Progressives called “the book trust”—a conglomerate of publishers known as the American Book Company. Tennessee passed one of these laws in 1899: it established a textbook commission that selected books for adoption. The biology book Scopes used to teach his students was a textbook that Tennessee had adopted, statewide, at a time when it made high school compulsory.

“Each year the child is coming to belong more and more to . . .

Continue reading. There’s much more.

Written by Leisureguy

18 April 2022 at 7:34 am

A History of the Pocket 


Having pockets frees one’s hands, and so even 5,300 years ago people had pockets (as we know from the clothing worn by “Ötzi,” a frozen human found in a glacier). Cynthia Anderson writes a pocket history at Folkwear. It begins:

Every now and again one gets to witness a societal shift up close. As they say, “times they are a-changing,” or at least coming into clearer focus for all to see. As my Grandmother would have said… the flapjack not only has been flipped, but has landed out of the pan with the revealing side up! The realization that things are not exactly as they may have been portrayed is where we are at. The question is what do we do with this peeling back of the veneer? History has many sides, and the truthful telling of the collective experience is the only thing that leads to a truly shared history. We at Folkwear have always felt the need and responsibility to educate ourselves and others about the historical and ethnic patterns we represent and promote. The unassuming pocket could seem like a less controversial place to start. Once again perceptions have been flipped. Knowing is understanding.

The pocket seems like such a simple and humble feature: a hidden yet secure place within one’s clothing to conveniently hold and keep items with you as you go about your daily life. Pockets have been around a long time and, as it turns out, they have a history more interesting and sordid than you might have imagined. How is it that something as practical and hidden as a pocket could be subtly manipulated and denied to half the population throughout history?

When you consider a pocket as a perfect metaphor for something that can be taken for granted, then you can begin to see the privilege it embodies.

The focus of this blog follows the lineage of the European pocket history tree.

Think about it . . . compare the closets of men and women. No matter how formal or casual the garments in each respective closet, there is a huge disparity. That disparity is the sexist and political divide of the obscure pocket. Simply put, pockets allow freedom and choice for one sex and deny the same for the opposite sex.

To get the full pocket evolutionary picture, let’s start at the beginning. The pouch was the progenitor of the pocket.

The oldest proof (so far) of a human sporting a pocket-like feature was a mummified fellow found frozen in the Alps in 1991. Ötzi, or “Iceman,” as he is now known, is thought to have lived around 3,300 BCE. At 5,300 years old, Ötzi was found to be a perfectly preserved and clothed specimen of the ancient world. Ötzi had held his plethora of secrets well, as enthusiastic researchers were to discover. One of the most interesting items Ötzi was wearing was a pouch sewn to his belt. The pouch held a cache of useful items, including a scraper, drill, flint flake, bone awl, and a bit of dried fungus. This link to the ancient world just goes to show how the need to carry about useful things has always been relevant.

The medieval period was a time when at least pouches were equal among the sexes. Men and women in the 13th century carried items in small pouches made of leather or cloth that were tied to their waists by rope. These pouches hung innocently on their outer clothing for the world to see. As societies grew and became more urban-like, crime swiftly followed. Hence, the pouch and its contents were hidden from view. Men wore their pouches tied to the body under their jackets and tunics. Women wore their pouches tied at their waists under their skirts. Slits were cut in clothing to make for easy access to the pouch. This prevented having to disrobe, which in a sense made men and women equal pouch wise. This method of wearing pouches continued for several more centuries.

It was not until the 17th century that the pouch made a significant leap that would have a profound impact on fashion and culture, thus securing a strict unequal divide between the sexes. The modern pocket was born for men, but excluded women. The pocket experience was quite different for men than women. The jackets, waistcoats, and breeches of men had pockets sewn directly into the seams and fabric lining of their clothing much as they still are today. This compact world allowed men to conveniently carry the accouterments that their privilege and status assumed. In turn the freedom of movement in public was allotted to men as well. Men carried money, keys, weapons, tobacco, writing pencils and little note books.

In comparison, women were relegated to relying on pockets with slits or top openings that were tied around the waist, sandwiched between layers of undergarments. According to the Victoria & Albert Museum, the average woman in the 17th century wore . . .

Continue reading.

Written by Leisureguy

15 April 2022 at 9:45 am

Whose Story (and Country) Is This? — On the myth of a “real” America


Rebecca Solnit writes at Literary Hub:

Watching the film Phantom Thread, I kept wondering why I was supposed to be interested in a control freak who is consistently unpleasant to all the people around him. I kept looking at the other characters—his sister who manages his couture business, his seamstresses, eventually the furniture (as a child, I read a very nice story about the romance between two chairs)—wondering why we couldn’t have a story about one of them instead.

Who gets to be the subject of the story is an immensely political question, and feminism has given us a host of books that shift the focus from the original protagonist—from Jane Eyre to Mr. Rochester’s Caribbean first wife, from Dorothy to the Wicked Witch, and so forth. But in the news and political life, we’re still struggling over whose story it is, who matters, and who our compassion and interest should be directed at.

The common denominator of so many of the strange and troubling cultural narratives coming our way is a set of assumptions about who matters, whose story it is, who deserves the pity and the treats and the presumptions of innocence, the kid gloves and the red carpet, and ultimately the kingdom, the power, and the glory. You already know who. It’s white people in general and white men in particular, and especially white Protestant men, some of whom are apparently dismayed to find out that there is going to be, as your mom might have put it, sharing. The history of this country has been written as their story, and the news sometimes still tells it this way—one of the battles of our time is about who the story is about, who matters and who decides.

It is this population we are constantly asked to pay more attention to and forgive even when they hate us or seek to harm us. It is toward them we are all supposed to direct our empathy. The exhortations are everywhere. PBS News Hour featured a quiz by Charles Murray in March that asked “Do You Live in a Bubble?” The questions assumed that if you didn’t know people who drank cheap beer and drove pick-up trucks and worked in factories you lived in an elitist bubble. Among the questions: “Have you ever lived for at least a year in an American community with a population under 50,000 that is not part of a metropolitan area and is not where you went to college? Have you ever walked on a factory floor? Have you ever had a close friend who was an evangelical Christian?”

The quiz is essentially about whether you are in touch with working-class small-town white Christian America, as though everyone who’s not Joe the Plumber is Maurice the Elitist. We should know them, the logic goes; they do not need to know us. Less than 20 percent of Americans are white evangelicals, only slightly more than are Latino. Most Americans are urban. The quiz delivers, yet again, the message that the 80 percent of us who live in urban areas are not America, treats non-Protestant (including the quarter of this country that is Catholic) and non-white people as not America, treats many kinds of underpaid working people (salespeople, service workers, farmworkers) who are not male industrial workers as not America. More Americans work in museums than work in coal, but coalminers are treated as sacred beings owed huge subsidies and the sacrifice of the climate, and museum workers—well, no one is talking about their jobs as a totem of our national identity.

PBS added a little note at the end of the bubble quiz, “The introduction has been edited to clarify Charles Murray’s expertise, which focuses on white American culture.” They don’t mention that he’s the author of the notorious Bell Curve or explain why someone widely considered racist was welcomed onto a publicly funded program. Perhaps the actual problem is that white Christian suburban, small-town, and rural America includes too many people who want to live in a bubble and think they’re entitled to, and that all of us who are not like them are menaces and intrusions who need to be cleared out of the way.

After all, there was a march in Charlottesville, Virginia, last year full of white men with tiki torches chanting “You will not replace us.” Which translates as get the fuck out of my bubble, a bubble that is a state of mind and a sentimental attachment to a largely fictional former America. It’s not everyone in this America; for example, Syed Ahmed Jamal’s neighbors in Lawrence, Kansas, rallied to defend him when ICE arrested and tried to deport the chemistry teacher and father who had lived in the area for 30 years. It’s not all white men; perpetration of the narrative centered on them is something too many women buy into and some admirable men are trying to break out of.

And the meanest voices aren’t necessarily those of the actual rural and small-town. In a story about a Pennsylvania coal town named Hazleton, Fox’s Tucker Carlson recently declared that immigration brings “more change than human beings are designed to digest,” the human beings in this scenario being the white Hazletonians who are not immigrants, with perhaps an intimation that immigrants are not human beings, let alone human beings who have already had to digest a lot of change. Once again a small-town white American narrative is being treated as though it’s about all of us or all of us who count, as though the gentrification of immigrant neighborhoods is not also a story that matters, as though Los Angeles and New York City, both of which have larger populations than many American states, are not America. In New York City, the immigrant population alone exceeds the total population of Kansas (or Nebraska or Idaho or West Virginia, where all those coal miners are).

In the aftermath of the 2016 election, we were told that we needed to  . . .

Continue reading.

Written by Leisureguy

13 April 2022 at 7:25 am

Why the Past 10 Years of American Life Have Been Uniquely Stupid


Jonathan Haidt, a social psychologist at the New York University Stern School of Business and author of The Righteous Mind and co-author of The Coddling of the American Mind, has a lengthy article in the Atlantic, which begins:

What would it have been like to live in Babel in the days after its destruction? In the Book of Genesis, we are told that the descendants of Noah built a great city in the land of Shinar. They built a tower “with its top in the heavens” to “make a name” for themselves. God was offended by the hubris of humanity and said:

Look, they are one people, and they have all one language; and this is only the beginning of what they will do; nothing that they propose to do will now be impossible for them. Come, let us go down, and confuse their language there, so that they will not understand one another’s speech.

The text does not say that God destroyed the tower, but in many popular renderings of the story he does, so let’s hold that dramatic image in our minds: people wandering amid the ruins, unable to communicate, condemned to mutual incomprehension.

The story of Babel is the best metaphor I have found for what happened to America in the 2010s, and for the fractured country we now inhabit. Something went terribly wrong, very suddenly. We are disoriented, unable to speak the same language or recognize the same truth. We are cut off from one another and from the past.

It’s been clear for quite a while now that red America and blue America are becoming like two different countries claiming the same territory, with two different versions of the Constitution, economics, and American history. But Babel is not a story about tribalism; it’s a story about the fragmentation of everything. It’s about the shattering of all that had seemed solid, the scattering of people who had been a community. It’s a metaphor for what is happening not only between red and blue, but within the left and within the right, as well as within universities, companies, professional associations, museums, and even families.

Babel is a metaphor for what some forms of social media have done to nearly all of the groups and institutions most important to the country’s future—and to us as a people. How did this happen? And what does it portend for American life?

The Rise of the Modern Tower

There is a direction to history and it is toward cooperation at larger scales. We see this trend in biological evolution, in the series of “major transitions” through which multicellular organisms first appeared and then developed new symbiotic relationships. We see it in cultural evolution too, as Robert Wright explained in his 1999 book, Nonzero: The Logic of Human Destiny. Wright showed that history involves a series of transitions, driven by rising population density plus new technologies (writing, roads, the printing press) that created new possibilities for mutually beneficial trade and learning. Zero-sum conflicts—such as the wars of religion that arose as the printing press spread heretical ideas across Europe—were better thought of as temporary setbacks, and sometimes even integral to progress. (Those wars of religion, he argued, made possible the transition to modern nation-states with better-informed citizens.) President Bill Clinton praised Nonzero’s optimistic portrayal of a more cooperative future thanks to continued technological advance.

The early internet of the 1990s, with its chat rooms, message boards, and email, exemplified the Nonzero thesis, as did the first wave of social-media platforms, which launched around 2003. Myspace, Friendster, and Facebook made it easy to connect with friends and strangers to talk about common interests, for free, and at a scale never before imaginable. By 2008, Facebook had emerged as the dominant platform, with more than 100 million monthly users, on its way to roughly 3 billion today. In the first decade of the new century, social media was widely believed to be a boon to democracy. What dictator could impose his will on an interconnected citizenry? What regime could build a wall to keep out the internet?

The high point of techno-democratic optimism was arguably 2011, a year that began with the Arab Spring and ended with the global Occupy movement. That is also when Google Translate became available on virtually all smartphones, so you could say that 2011 was the year that humanity rebuilt the Tower of Babel. We were closer than we had ever been to being “one people,” and we had effectively overcome the curse of division by language. For techno-democratic optimists, it seemed to be only the beginning of what humanity could do.

In February 2012, as he prepared to take Facebook public, Mark Zuckerberg reflected on those extraordinary times and set forth his plans. “Today, our society has reached another tipping point,” he wrote in a letter to investors. Facebook hoped “to rewire the way people spread and consume information.” By giving them “the power to share,” it would help them to “once again transform many of our core institutions and industries.”

In the 10 years since then, Zuckerberg did exactly what he said he would do. He did rewire the way we spread and consume information; he did transform our institutions, and he pushed us past the tipping point. It has not worked out as he expected.

Things Fall Apart

Historically, civilizations have relied on shared blood, gods, and enemies to counteract the tendency to split apart as they grow. But what is it that holds together large and diverse secular democracies such as the United States and India, or, for that matter, modern Britain and France?

Social scientists have identified at least three major forces that collectively bind together successful democracies: social capital (extensive social networks with high levels of trust), strong institutions, and shared stories. Social media has weakened all three. To see how, we must understand how social media changed over time—and especially in the several years following 2009.

In their early incarnations, platforms such as Myspace and Facebook were relatively harmless. They allowed users to create pages on which to post photos, family updates, and links to the mostly static pages of their friends and favorite bands. In this way, early social media can be seen as just another step in the long progression of technological improvements—from the Postal Service through the telephone to email and texting—that helped people achieve the eternal goal of maintaining their social ties.

But gradually, social-media users became more comfortable sharing intimate details of their lives with strangers and corporations. As I wrote in a 2019 Atlantic article with Tobias Rose-Stockwell, they became more adept at putting on performances and managing their personal brand—activities that might impress others but that do not deepen friendships in the way that a private phone conversation will.

Once social-media platforms had trained users to spend more time performing and less time connecting, the stage was set for the major transformation, which began in 2009: the intensification of viral dynamics.

Before 2009, Facebook had given users a simple timeline––a never-ending stream of content generated by their friends and connections, with the newest posts at the top and the oldest ones at the bottom. This was often overwhelming in its volume, but it was an accurate reflection of what others were posting. That began to change in 2009, when Facebook offered users a way to publicly “like” posts with the click of a button. That same year, Twitter introduced something even more powerful: the “Retweet” button, which allowed users to publicly endorse a post while also sharing it with all of their followers. Facebook soon copied that innovation with its own “Share” button, which became available to smartphone users in 2012. “Like” and “Share” buttons quickly became standard features of most other platforms.

Shortly after its “Like” button began to produce data about what best “engaged” its users, Facebook developed algorithms to bring each user the content most likely to generate a “like” or some other interaction, eventually including the “share” as well. Later research showed that posts that trigger emotions––especially anger at out-groups––are the most likely to be shared.
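Haidt’s description of engagement-optimized ranking can be caricatured in a few lines of Python. This is a deliberately simplified sketch, not Facebook’s actual algorithm: the interaction weights and post data below are invented for illustration, reflecting only the later research he cites showing that shares and out-group anger predict virality more strongly than ordinary likes.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    angry_reactions: int

def engagement_score(post: Post) -> float:
    # Hypothetical weights: reshares and anger-driven reactions
    # count for more than simple likes.
    return post.likes * 1.0 + post.shares * 5.0 + post.angry_reactions * 3.0

def rank_feed(posts: list[Post]) -> list[Post]:
    # Replace the chronological timeline with an engagement-sorted one.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Vacation photos", likes=120, shares=2, angry_reactions=0),
    Post("Outrage at the out-group", likes=40, shares=30, angry_reactions=50),
])
# The outrage post (score 340) now outranks the vacation photos (score 130).
```

The point of the toy model is structural: once the feed is sorted by predicted engagement rather than by time, whatever content best triggers interaction, however unrepresentative, rises to the top.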

By 2013, social media had become a new game, with dynamics unlike those in 2008. If you were skillful or lucky, you might create a post that would “go viral” and make you “internet famous” for a few days. If you blundered, you could find yourself buried in hateful comments. Your posts rode to fame or ignominy based on the clicks of thousands of strangers, and you in turn contributed thousands of clicks to the game.

This new game encouraged dishonesty and mob dynamics: Users were guided not just by their true preferences but by their past experiences of reward and punishment, and their prediction of how others would react to each new action. One of the engineers at Twitter who had worked on the “Retweet” button later revealed that he regretted his contribution because it had made Twitter a nastier place. As he watched Twitter mobs forming through the use of the new tool, he thought to himself, “We might have just handed a 4-year-old a loaded weapon.”

As a social psychologist who studies emotion, morality, and politics, I saw this happening too. The newly tweaked platforms were almost perfectly designed to bring out our most moralistic and least reflective selves. The volume of outrage was shocking.

It was just this kind of twitchy and explosive spread of anger that James Madison had tried to protect us from as he was drafting the U.S. Constitution. The Framers of the Constitution were excellent social psychologists. They knew that democracy had an Achilles’ heel because it depended on the collective judgment of the people, and democratic communities are subject to “the turbulency and weakness of unruly passions.” The key to designing a sustainable republic, therefore, was to build in mechanisms to slow things down, cool passions, require compromise, and give leaders some insulation from the mania of the moment while still holding them accountable to the people periodically, on Election Day.

The tech companies that enhanced virality from 2009 to 2012 brought us deep into Madison’s nightmare. Many authors quote his comments in “Federalist No. 10” on the innate human proclivity toward “faction,” by which he meant our tendency to divide ourselves into teams or parties that are so inflamed with “mutual animosity” that they are “much more disposed to vex and oppress each other than to cooperate for their common good.”

But that essay continues on to a less quoted yet equally important insight, about democracy’s vulnerability to triviality. Madison notes that . . .

Continue reading. There’s much more.

Written by Leisureguy

11 April 2022 at 2:45 pm

Why Christopher Alexander Still Matters


Michael W. Mehaffy writes in Planetizen:

This week [week of March 22, 2022 – LG] came news of the passing of Christopher Alexander, widely described as one of the most influential architects and urbanists of the last half-century. Robert Campbell, the Pulitzer Prize-winning architecture critic for the Boston Globe, probably spoke for many when he observed that Alexander “had an enormous, critical influence on my life and work, and I think that’s true of a whole generation of people.”

Certainly, a remarkably diverse group of architects, urban planners, and researchers claims to have been influenced by Alexander, including Rem Koolhaas, Andrés Duany, Bill Hillier, and many more. Many other widely known theorists and authors were influenced by him too, including Stewart Brand, author of How Buildings Learn: What Happens After They’re Built and The Whole Earth Catalog.

Alexander’s influence also extended far beyond architecture and urbanism. Ward Cunningham, inventor of the wiki (the technology behind Wikipedia), credits Alexander with directly inspiring that innovation, as well as the pattern languages of programming and the even more widespread Agile methodology. Will Wright, creator of the popular games SimCity and The Sims, also credits Alexander as a major influence, as do the musicians Brian Eno and Peter Gabriel. Apple’s Steve Jobs was also said to be a fan.

Moreover, one can also find direct applications of Alexander’s pattern language methodology in product design, engineering, anthropology, sociology, biology, and many other fields. In fact, the chances are that every day, you are using—in your iPhone, on your computer, when you use Wikipedia or Google, or in countless other ways—some form of Alexander-inspired technology.

What is it about Alexander’s work that has made it so useful for all these fields? Moreover, what does his work say about where we are today in urbanism and architecture, and in design and technology more broadly—and where we need to go?

Most people who know Alexander’s work are most familiar with his 1977 book (written with six student co-authors), A Pattern Language: Towns, Buildings, Construction. For many people, this “pattern language” methodology offered a very helpful and appealing tool to help to organize a design process, and moreover, to create a web-like interrelation between the elements of a design. But there was a deeper set of ideas behind this methodology—one that many people found to be revolutionary and inspirational.
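For readers who haven’t encountered the method, a pattern in Alexander’s sense can be sketched as a small data structure: each pattern names a recurring problem, states a solution, and links to the larger and smaller patterns it works with, which is what gives the language its web-like interrelation. The pattern names below paraphrase entries from A Pattern Language; the Python representation is an illustration of the idea, not anything Alexander wrote.

```python
from dataclasses import dataclass, field

@dataclass
class Pattern:
    name: str
    problem: str
    solution: str
    # Links to larger and smaller patterns form the web of the language.
    larger: list[str] = field(default_factory=list)
    smaller: list[str] = field(default_factory=list)

window_place = Pattern(
    name="Window Place",
    problem="A room without a comfortable spot near a window feels dead.",
    solution="In every room where people linger, make a place to sit by a window.",
    larger=["Light on Two Sides of Every Room"],
    smaller=["Low Sill", "Built-in Seats"],
)
```

Software engineers will recognize the template: the “design patterns” literature and the wiki both borrowed exactly this problem–solution–links structure from Alexander.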

For Alexander, we’ve been getting some things very wrong—most centrally in our approach to the relationship between technology and life. In fact, we’ve embraced a deathly form of technology, one that is killing the planet, and certainly killing human habitats. (That goes for financial technology as well as planning and design technology, in addition to other kinds of technology.) The core problem is that we have failed to understand the living processes going on all around us, and instead of supporting them, our more mechanically oriented technology is destroying them. This need not be so, however, if we understand the kinds of mistakes we are making today.

Alexander made a contrast with the remarkably robust and beautiful structures of past societies. Whatever their faults, we can see that there was some kind of “unself-conscious process” at work in creating the richness and beauty of these cultures, and we can learn much from that process.  Indeed, it is now critical that we recapture this life-supporting process, albeit in a necessarily more self-conscious form. Our goal now must be to recapture a “timeless way of building”—and a timeless kind of technology that is more supportive of life.

Alexander believed that a “timeless way of building” is not . . .

Continue reading.

Written by Leisureguy

6 April 2022 at 7:24 pm

Laurie Penny on The Sexual Revolution


Nathan Robinson interviews Laurie Penny in Current Affairs:

Laurie Penny is a journalist and activist who has written seven books, including Unspeakable Things: Sex, Lies and Revolution; Bitch Doctrine: Essays for Dissenting Adults; and, most recently, Sexual Revolution: Modern Fascism and the Feminist Fightback. Penny came on the Current Affairs podcast to talk to editor in chief Nathan J. Robinson about the “sexual revolution” of Penny’s book. This interview has been condensed and edited for clarity and grammar.


Robinson: I thought we could start with the positive, before we get to pervasive sexual authoritarianism and patriarchy. In many ways, your book celebrates changes that have occurred in the last decade. You’ve been a political writer now for over ten years. There are positives in the world. You point out the increasing visibility of queer, nonbinary, and trans people these days, and the #MeToo movement. The book is about a sexual revolution taking place now, about women asserting autonomy and demanding consent.


It’s about a sexual revolution that is already happening. That’s the key. And thanks for opening with that—it’s one of the hardest things to get across to people. This is an exercise in pointing out the obvious. There is a slow-moving sea change happening in gendered power relations. It’s been building for decades now. And it has to do with economics; it has to do with imbalances and rebalances in structural violence and how power is organized and operates. And the reaction to that sexual revolution explains a great deal of modern politics.

It’s surprising to me that it hasn’t been directly named. The rise of the far right around the world, particularly around the global north, the rise of misogyny, and changes in how progressive politics are countered—a lot of that can be explained as a reaction and a backlash to the enormous changes in relations between the genders and in economic relations, which are experienced in very gendered ways.

This is one of the things that I point out in the book: a lot of the changes in people’s lives—even if you just look at millennials, and how our expectations in life have been very different from how our lives have turned out—are not, at their root, problems of sex and gender, but they’re experienced in highly gendered ways. American Twitter is great for concisely naming political phenomena, like the idea of the “failson.” I find it absolutely fascinating. Hundreds of thousands of young men are failsons. They’ve just had this failure to launch. They’re people who in previous generations would have found their niche; they would have found a job they could tolerate that would pay them a decent amount; they would have found a place to live; they would have been able to form relationships. But there’s just this sense that there’s no longer room or opportunity in their lives to access all of the things that they’re still raised to feel entitled to, some of which are things everyone should be entitled to. That’s an economic issue experienced in highly gendered ways.


So these are young men who essentially don’t live up to what they feel like they ought to have. Perhaps they achieve a lower level of status than their parents did. And they experience that as a kind of humiliation. And then they desire something that will affirm them as important or give them some sense of power.


Absolutely. Status, pride, and dignity. Neoliberalism does not make room for the concept of human dignity. And that is an enormous problem especially among centrists and traditional American liberals. One of the reasons that they found it so hard to understand Trump is that what Trump offered a certain kind of person who voted for him was pride and a reason to feel that their dignity mattered. There were all kinds of toxic heuristics behind that. But the fact is that everyday life under late stage capitalism—which I insist on saying, even though it annoys people so much—but our lives and the lives of young and not-so-young men under late stage capitalism, are full of indignities and everyday humiliations. And so the one place they feel they can control or demand a restoration of status is in their relationships with women and within sexuality, which has become incredibly freighted by this howling vortex of straight male need. It’s about politics, and it’s about gender. I was trained as a journalist to think that those were two separate things. You had your political writers, and then you had the people who wrote for the women’s pages. And that has actually gone away—not so much with the women’s section anymore. But it’s almost like the politics of sex and gender are real politics.

You asked me about what is positive about the sexual revolution. A great deal. If you look at women’s lives and the real options that women have today, even a generation ago, those things were just slightly out of reach.


Well, let’s go a little deeper. You said there have been profound changes in relations between the genders. Could you describe what you think are the most important of those changes?


There are enormous economic and social changes happening in the balance of power between men and women in most parts of the world. (When I say men and women, I am referring to political categories, the broad gender binary.) Let’s take an example. One thing that lawmakers and news outlets are suddenly paying attention to in an extremely panicked way is the plummeting birth rate across the global north and beyond. Birth rates have been dropping steadily over the past several decades. But over the past seven or eight years, they’ve really started to go down very fast. And in the last two years, they just fell off a cliff because women are not having babies anymore. And there are lots of reasons for that. I find it amusing reading. Some articles attempt to be objective about this, the collapse in birth rates and the change in women’s employment—that’s the way the BBC and the New York Times put it—increasing opportunities for women at work, as if the only reason that women weren’t having babies was because they would rather have a career.

But there are other things that have changed. The first thing that has changed is that i

Continue reading.

Written by Leisureguy

1 April 2022 at 2:10 pm

There Is No Liberal World Order

leave a comment »

Anne Applebaum has a strong essay in the Atlantic. It begins:

In February 1994, in the grand ballroom of the town hall in Hamburg, Germany, the president of Estonia gave a remarkable speech. Standing before an audience in evening dress, Lennart Meri praised the values of the democratic world that Estonia then aspired to join. “The freedom of every individual, the freedom of the economy and trade, as well as the freedom of the mind, of culture and science, are inseparably interconnected,” he told the burghers of Hamburg. “They form the prerequisite of a viable democracy.” His country, having regained its independence from the Soviet Union three years earlier, believed in these values: “The Estonian people never abandoned their faith in this freedom during the decades of totalitarian oppression.”

But Meri had also come to deliver a warning: Freedom in Estonia, and in Europe, could soon be under threat. Russian President Boris Yeltsin and the circles around him were returning to the language of imperialism, speaking of Russia as primus inter pares—the first among equals—in the former Soviet empire. In 1994, Moscow was already seething with the language of resentment, aggression, and imperial nostalgia; the Russian state was developing an illiberal vision of the world, and even then was preparing to enforce it. Meri called on the democratic world to push back: The West should “make it emphatically clear to the Russian leadership that another imperialist expansion will not stand a chance.”

At that, the deputy mayor of St. Petersburg, Vladimir Putin, got up and walked out of the hall.

Meri’s fears were at that time shared in all of the formerly captive nations of Central and Eastern Europe, and they were strong enough to persuade governments in Estonia, Poland, and elsewhere to campaign for admission to NATO. They succeeded because nobody in Washington, London, or Berlin believed that the new members mattered. The Soviet Union was gone, the deputy mayor of St. Petersburg was not an important person, and Estonia would never need to be defended. That was why neither Bill Clinton nor George W. Bush made much attempt to arm or reinforce the new NATO members. Only in 2014 did the Obama administration finally place a small number of American troops in the region, largely in an effort to reassure allies after the first Russian invasion of Ukraine.

Nobody else anywhere in the Western world felt any threat at all. For 30 years, Western oil and gas companies piled into Russia, partnering with Russian oligarchs who had openly stolen the assets they controlled. Western financial institutions did lucrative business in Russia too, setting up systems to allow those same Russian kleptocrats to export their stolen money and keep it parked, anonymously, in Western property and banks. We convinced ourselves that there was no harm in enriching dictators and their cronies. Trade, we imagined, would transform our trading partners. Wealth would bring liberalism. Capitalism would bring democracy—and democracy would bring peace.

After all, it had happened before. Following the cataclysm of 1939–45, Europeans had indeed collectively abandoned wars of imperial, territorial conquest. They stopped dreaming of eliminating one another. Instead, the continent that had been the source of the two worst wars the world had ever known created the European Union, an organization designed to find negotiated solutions to conflicts and promote cooperation, commerce, and trade. Because of Europe’s metamorphosis—and especially because of the extraordinary transformation of Germany from a Nazi dictatorship into the engine of the continent’s integration and prosperity—Europeans and Americans alike believed that they had created a set of rules that would preserve peace not only on their own continents, but eventually in the whole world.

This liberal world order relied on the mantra of “Never again.” Never again would there be genocide. Never again would large nations erase smaller nations from the map. Never again would we be taken in by dictators who used the language of mass murder. At least in Europe, we would know how to react when we heard it.

But while we were happily living under the illusion that “Never again” meant something real, the leaders of Russia, owners of the world’s largest nuclear arsenal, were reconstructing an army and a propaganda machine designed to facilitate mass murder, as well as a mafia state controlled by a tiny number of men and bearing no resemblance to Western capitalism. For a long time—too long—the custodians of the liberal world order refused to understand these changes. They looked away when Russia “pacified” Chechnya by murdering tens of thousands of people. When Russia bombed schools and hospitals in Syria, Western leaders decided that that wasn’t their problem. When Russia invaded Ukraine the first time, they found reasons not to worry. Surely Putin would be satisfied by the annexation of Crimea. When Russia invaded Ukraine the second time, occupying part of the Donbas, they were sure he would be sensible enough to stop.

Even when the Russians, having grown rich on the kleptocracy we facilitated, bought Western politicians, funded far-right extremist movements, and ran disinformation campaigns during American and European democratic elections, the leaders of America and Europe still refused to take them seriously. It was just some posts on Facebook; so what? We didn’t believe that we were at war with Russia. We believed, instead, that we were safe and free, protected by treaties, by border guarantees, and by the norms and rules of the liberal world order.

With the third, more brutal invasion of Ukraine, the vacuity of those beliefs was revealed. The Russian president openly denied the existence of a legitimate Ukrainian state: “Russians and Ukrainians,” he said, “were one people—a single whole.” His army targeted civilians, hospitals, and schools. His policies aimed to create refugees so as to destabilize Western Europe. “Never again” was exposed as an empty slogan while a genocidal plan took shape in front of our eyes, right along the European Union’s eastern border. Other autocracies watched to see what we would do about it, for Russia is not the only nation in the world that covets its neighbors’ territory, that seeks to destroy entire populations, that has no qualms about the use of mass violence. North Korea can attack South Korea at any time, and has nuclear weapons that can hit Japan. China seeks to eliminate the Uyghurs as a distinct ethnic group, and has imperial designs on Taiwan.

We can’t turn the clock back to 1994, to see what would have happened had we heeded Lennart Meri’s warning. But we can face the future with honesty. We can name the challenges and prepare to meet them.

There is no natural liberal world order, and there are no rules without someone to enforce them. Unless democracies  . . .

Continue reading. She offers some excellent specific steps to undertake.

Written by Leisureguy

31 March 2022 at 5:17 pm

Inside Starling Lab, a moonshot project to preserve the world’s most important information

leave a comment »

Interesting project — and important, as more and more of our history and culture becomes digital or digitized, and as AI-assisted alteration and fakery become better and easier. Katharine Schwab writes at Fast Company:

When the British army liberated the Bergen-Belsen concentration camp in April 1945, they found horrors so shocking that a journalist’s eyewitness reports to the BBC were held for days because their veracity was in doubt.

“We lived among heaps of bodies,” says Anita Lasker-Wallfisch, a survivor of the camp whose firsthand experience at both Bergen-Belsen and Auschwitz is now memorialized in a 130-minute video testimony. In the 1998 video, she tells an interviewer from the USC Shoah Foundation, a nonprofit dedicated to preserving the memories of genocide survivors, about how playing the cello in the Auschwitz orchestra helped her endure one of the most horrific atrocities in human history.

Lasker-Wallfisch’s recollections have now become the first test case for an ambitious project to preserve the foundation’s archive of 56,000 audio-visual testimonies through a radical means: the blockchain. While most oral histories are stored in more traditional ways—on hard drives, for example, or in the cloud—the digital file of Lasker-Wallfisch’s testimony is also being archived using a decentralized web protocol, creating extra redundancies in an effort to preserve her account on the internet for the long term. Right now, her testimony lives on dozens of different servers. One day, it may live on thousands.

The foundation’s move to the blockchain is in partnership with Starling Lab, a nonprofit academic research center that’s on a mission to use decentralized ledgers to help preserve historical data of importance to humanity. Its lofty goal is to restore integrity both to data and to the internet itself—starting with some of the most precious information we have.

For the past three years, the lab’s founding director, Jonathan Dotan, has been developing a set of technologies, called the Starling Framework, that aims to maintain the integrity of a piece of information as it is captured and stored. Now, the lab is working with the USC Shoah Foundation to upload the nonprofit’s interviews from survivors and witnesses of 14 genocides and episodes of mass violence to a decentralized storage system. Each testimony is first checked to make sure the file’s data hasn’t degraded over its lifetime. It’s then given a unique content identifier—called a hash—that refers to both the image and its corresponding metadata, which includes where and when the testimony was taken. The storage system that Starling uses, called Filecoin, is built on a blockchain that requires data providers to constantly prove that they hold the same data that they were originally tasked with storing—ensuring that information hasn’t been tampered with.
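Filecoin’s actual content identifiers (CIDs) and its storage proofs are considerably more elaborate than this, but the core idea described above — binding a file and its metadata to a single hash so that any tampering with either is detectable — can be sketched in a few lines of Python. (The `content_id` helper and the sample metadata below are illustrative assumptions, not Starling’s actual code.)

```python
import hashlib
import json

def content_id(payload: bytes, metadata: dict) -> str:
    """Derive one hash over both the file bytes and its metadata,
    so that altering either changes the identifier."""
    # Canonicalize the metadata so the same fields always hash identically.
    meta_bytes = json.dumps(metadata, sort_keys=True).encode("utf-8")
    h = hashlib.sha256()
    h.update(payload)
    h.update(meta_bytes)
    return h.hexdigest()

# A stand-in for the testimony video file and its capture metadata.
testimony = b"...video file bytes..."
meta = {"location": "London", "recorded": "1998"}

cid = content_id(testimony, meta)

# A storage provider holding the same bytes can reproduce the identifier...
assert content_id(testimony, meta) == cid
# ...while any alteration to the file or its metadata is detectable.
assert content_id(testimony + b"tampered", meta) != cid
```

In a system like Filecoin, providers must additionally prove on an ongoing basis that they still physically store the data, not merely that they once knew its hash; the sketch above captures only the integrity-check half of that arrangement.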

A low-resolution copy of the foundation’s archive has already been uploaded to four Filecoin data providers. Starling and the foundation are currently in the midst of uploading a high-resolution copy to 20 storage providers—a 15-week-long process. (Starling and the foundation are also experimenting with how new testimonies can be embedded with a content ID and stored on the blockchain as they are filmed.)

The ultimate goal, says Dotan, is . . .

Continue reading.

Written by Leisureguy

30 March 2022 at 11:54 am

Latin and Greek Are Finding A Voice At Oxford

leave a comment »

Bijan Omrani has an interesting article in Medium:

In Oxford, Latin is everywhere. Latin mottoes, memorials and inscriptions greet you at every turn throughout the city and its colleges. Every day, during term time, Latin graces are said in dining halls. Students are still admitted to the University and receive their degrees in Latin ceremonies. On occasion, there are Latin services and Latin sermons at the University Church. And at the high point of the University year, the Encaenia Ceremony, honorary doctorates are awarded — after a traditional spread of strawberries, peaches, and champagne — with grand Latin orations in praise of the honorands in the Sheldonian Theatre.

But, with this ubiquity of Latin in mind, how far would one get in Oxford if one actually attempted to speak Latin, or at least wanted to learn how to speak it?

Certainly, in my time as a Classics undergraduate there in the late 1990s, the answer would have been not very far. For all of the Latin traditions and ceremonial around us, in our daily studies Latin was seen as a language strictly to be read, and, on occasion, laboriously written in the course of exercises. The idea that it might be taught as a spoken language was unheard of, and had it been suggested, it would without doubt have been dismissed as being too eccentric even for Oxford, the proverbial home of lost causes.

My undergraduate self would therefore have been astonished to see me, twenty years on, sitting in an online classroom based out of Oxford, discussing the Aeneid in Latin. Although I have spent time as a Latin teacher, my halting attempts at speaking the language are far outstripped by graduate and undergraduate students, whose fluency and ease both in reading and speaking the language astonish me, brought up, as I was, in the traditional ways.

This online reading class is one of the activities run by the Oxford Ancient Languages Society (the OALS, formerly the Oxford Latinitas Project). It is the fruit of an increasing change in attitude both amongst students and academics in the United Kingdom towards the value of using spoken Latin as a means of improving proficiency in the language, as well as widening its accessibility.

“I had originally been teaching in the customary way,” said Dr. Melinda Letts, College Tutor in Latin and Ancient Greek at Jesus College Oxford, and a Senior Member of the OALS. “But I became quite troubled by the scope and nature of the need among undergraduates in the field of Latin and Greek language tuition. For a start, the A-levels [the end of school exam taken in the UK by 18-year-olds] seemed to be delivering different sets of skills, so that many students who had Latin and Greek A-levels still needed a great deal of language tuition to help them read texts fluently. At the same time, the numbers of undergraduates who needed to learn the languages from scratch had also steadily increased in recent years. This is because Oxford’s welcome efforts to encourage people from a much wider variety of backgrounds to apply began to bear fruit. Ancient languages are taught mainly in private schools these days, so widening access means we need to teach more and more students the languages. The interest in Classics is not confined to those who had the chance to go to private schools and learn the languages from an early age; the interest is wide, as we can see from the numbers of people applying from a great variety of schools, but many more students at University now have to learn from scratch.

“I had been teaching the languages to these students in the so-called ‘traditional way’ — which is really a misnomer, since it’s a comparatively recent invention. I hated seeing the students whom I had met during the admissions interviews who had been full of excitement about reading ancient literature end up after a term or two seriously struggling with, sometimes even weeping over, the volume of text they had to read on the course, and the level of difficulty. These are clever, highly motivated young people, yet it was clear to me that the task was extremely difficult for many of them. I couldn’t bear teaching something that made the students feel so frustrated and anxious. Yet the languages must continue to be taught. I was still passionate about making sure that students had the best possible language capabilities so that they can develop their own independent responses to the ancient texts. Otherwise, they will end up being dependent on using translations which, because of the socio-economic profile that has traditionally defined classicists, may only reflect an essentially narrow and privileged background. I want students to have a broad set of perspectives, and to have that we need more students who can read the ancient texts confidently for themselves.”

Letts tried to add active-language pedagogy to her training, but with limited success. “A lot of my questions in Latin ended up being written on the board instead of spoken,” she says. The turning point came in 2017, when she encountered Jenny Rallens, Brian Lapsa, and Lewis Scarpellino, and the student organization they had founded: the Oxford Latinitas Project. “They had experience of learning Latin through speaking the spoken language, and had started to run classes and events to get the group known. One event was a Septimana Latina — a week-long spoken Latin trip to Italy held during the Easter Vacation 2018. I went on this, and was so impressed that I recommended others to go on it as well.”

One of Melinda’s students whose whole outlook on the language was changed by the Oxford Septimana was . . .

Continue reading.

Written by Leisureguy

29 March 2022 at 5:21 pm

Posted in Daily life, Education, Memes

How to bully-proof your kids for life

leave a comment »

In the Guardian Joanna Moorhead reviews Bully-Proof Kids: Practical Tools to Help Your Child to Grow Up Confident, Resilient and Strong, by Stella O’Malley:

What is bullying?

It’s a sustained pattern of aggression by a person with more power, targeting someone with less power. The key, says Stella O’Malley, author of a ground-breaking new book, Bully-Proof Kids, is that it’s repeated behaviour. But beneath this simple definition lies a complex, multilayered situation that can be exceptionally tricky to unpack. What is the power, and where does it come from? With children, says O’Malley, it’s often that they have more social status, or have been led to believe they do.

One very big issue, which she returns to time and again in her book and in our conversation, is that bullying is always about more than what’s going on with two people: the bully and the target. What about the children O’Malley calls “wingmen”, the bully’s supporters, the kids who think the bully is the bee’s knees and want to stay in their favour? What’s happening with the kids watching silently – the bystanders? Who is seeing what’s happening, when it all starts to kick off, and getting out fast? Who’s calling out the injustice? To understand bullying, you have to see the whole picture.

Because, says O’Malley, bullying is about absolutely everyone in the group, room, office or playground; even the bystanders – those who do or say nothing when bullying is taking place – because, as the German theologian Dietrich Bonhoeffer said, not to speak is to speak; not to act is to act.

Could any kid be a bully – and a target?

Yes, says O’Malley. “I’ve never met anyone who I’ve thought could never be a bully, or anyone who could never be a target. The truth is that every single one of us has our shadow side – and it’s not until we can acknowledge it that we become better people.”

Bullying behaviour in children, says O’Malley, tends to be very animalistic, with a strong instinct in many kids to join the pack. “Maybe it takes 18 years to civilise ourselves, and that’s where parents come in. There’s so much we can do to make a difference.”

What can parents do?

The most important thing, says O’Malley, is to pay attention to your child, so you can work out what their vulnerabilities are. You know your child better than anyone: what are their emotional needs? Do they need love and belonging, or crave power, status and recognition from others? The first of these could be a passive, gentle child who might be more vulnerable to bullying, or to being recruited by a bully to be one of their supporters. Similarly, if your child needs power and recognition – and that’s a great cocktail for success in many sectors – it can easily trigger bullying behaviour, and as a parent you need to be aware of that, and active in how you manage it.

“A kid like this has wonderful strengths, but they need to learn empathy,” she says. “If you can nurture a sense of kindness in that child, help them understand how others are feeling, you’ll be combatting their bullying tendencies. Every child, every human being, has their flaws. Bullying has become demonised, but children can easily tip into it, and we need to help them out of it.” And the good thing, says O’Malley, is that it’s relatively easy to help a primary-school-age child out of being a bully. “They’re primed to be told how to behave, and they can learn to be different.”

What if the parents are bullies themselves?

Sometimes you come across a family where everyone is a bully: the parents, the older siblings, who bully the younger kids, and the younger kids, who bully others at school. These families are very difficult to help; but they’re also quite rare. What’s much more common is individualistic behaviour by parents that could set their child up to be a bully.

What sort of behaviour should I look out for?

There’s a certain sort of kid, warns O’Malley, who goes to school with a . . .

Continue reading.

Written by Leisureguy

22 March 2022 at 1:43 pm

War sent America off the rails 19 years ago. Could another one bring it back?

leave a comment »

Mission accomplished? Not quite. In this May 2003 photo, George W. Bush declares the end of major combat in Iraq as he speaks aboard an aircraft carrier off the California coast. The war dragged on for many years after that. (AP Photo/J. Scott Applewhite, File)

The US invasion of Iraq was an act of hubris that killed hundreds of thousands and cost hundreds of billions of dollars and left a stain on the US that persists to this day. Jason Opal, Associate Professor of History and Chair, History and Classical Studies, McGill University, writes in The Conversation:

At the start of 2022, the right to vote, the rule of law and even the existence of facts seemed to be in grave peril in the United States.

Explanations for this crisis ranged from the decades-long decline of the American middle class to the more recent rise of social media and its unique capacity to spread lies.

In truth, many factors were at play, but the most direct cause of America’s harrowing descent — the one event that arguably set the others in motion — began 19 years ago.

War by choice

On March 19, 2003, George W. Bush and his neoconservative brain trust launched the Iraq war because of the alleged threat of Saddam Hussein’s mothballed weapons [and many pointed out that this threat was fictitious – LG]. Bush and his advisers believed in using military force to spread American political and economic might around the globe.

It was an ideology both foolish and fanatical, the pet project of a tiny circle of well-connected warmongers. Bush himself had lost the popular vote in 2000 and was slumping in the polls before Sept. 11, 2001.

But no one wanted to look weak after the terrorist attacks, and so, in one of the last bipartisan gestures of the past two decades, U.S. senators from Hillary Clinton to Mitch McConnell voted for war in the Middle East.

Having sold the invasion with bad faith and bluster, the neocons planned it with hubris and incompetence. Against the professional advice of the U.S. military, they sought to destroy Saddam Hussein’s regime with minimal ground forces, whereupon they would dismantle the Iraqi state and invite private contractors to somehow rebuild the place.

At first, their fantasies swept to victory. But by 2004, the country they had shattered began to lash out at both the invaders and itself, and by 2006 the singular disaster of our times began to spread.

Butterfly effects

Some two million Iraqis decamped to Syria and Jordan and even more fled to places within Iraq, where the ghoulish seeds of ISIS began to grow.

When ISIS spread following the U.S. withdrawal from Iraq in 2011, a second wave of refugees sought shelter in Europe. This stoked nationalism and helped propel Brexit to a stunning win in the United Kingdom. . .

Continue reading. There’s more. 

The US started the sequence, and the dominoes continued to topple in turn. Karl Rove famously said that the Bush administration created its own reality, but he failed to recognize what a slipshod job it was doing.

Written by Leisureguy

20 March 2022 at 2:15 pm

Jerry Cans: The True Secret Weapon of WWII

leave a comment »

Military technology is not always high tech, but good design is always important. I found this video quite interesting. Note how badly one can bungle copying an excellent design.

Written by Leisureguy

18 March 2022 at 7:09 pm
