Later On

A blog written for those whose interests more or less match mine.

Archive for the ‘History’ Category

The American Abyss: Fascism, Atrocity, and What Comes Next

Timothy Snyder, Levin Professor of History at Yale University and the author of histories of political atrocity including “Bloodlands” and “Black Earth,” as well as “On Tyranny,” writes in the NY Times Magazine on America’s turn toward authoritarianism and on the mechanisms and failures that brought the US political system to its current state of wreckage:

When Donald Trump stood before his followers on Jan. 6 and urged them to march on the United States Capitol, he was doing what he had always done. He never took electoral democracy seriously nor accepted the legitimacy of its American version.

Even when he won, in 2016, he insisted that the election was fraudulent — that millions of false votes were cast for his opponent. In 2020, in the knowledge that he was trailing Joseph R. Biden in the polls, he spent months claiming that the presidential election would be rigged and signaling that he would not accept the results if they did not favor him. He wrongly claimed on Election Day that he had won and then steadily hardened his rhetoric: With time, his victory became a historic landslide and the various conspiracies that denied it ever more sophisticated and implausible.

People believed him, which is not at all surprising. It takes a tremendous amount of work to educate citizens to resist the powerful pull of believing what they already believe, or what others around them believe, or what would make sense of their own previous choices. Plato noted a particular risk for tyrants: that they would be surrounded in the end by yes-men and enablers. Aristotle worried that, in a democracy, a wealthy and talented demagogue could all too easily master the minds of the populace. Aware of these risks and others, the framers of the Constitution instituted a system of checks and balances. The point was not simply to ensure that no one branch of government dominated the others but also to anchor in institutions different points of view.

In this sense, the responsibility for Trump’s push to overturn an election must be shared by a very large number of Republican members of Congress. Rather than contradict Trump from the beginning, they allowed his electoral fiction to flourish. They had different reasons for doing so. One group of Republicans is concerned above all with gaming the system to maintain power, taking full advantage of constitutional obscurities, gerrymandering and dark money to win elections with a minority of motivated voters. They have no interest in the collapse of the peculiar form of representation that allows their minority party disproportionate control of government. The most important among them, Mitch McConnell, indulged Trump’s lie while making no comment on its consequences.

Yet other Republicans saw the situation differently: They might actually break the system and have power without democracy. The split between these two groups, the gamers and the breakers, became sharply visible on Dec. 30, when Senator Josh Hawley announced that he would support Trump’s challenge by questioning the validity of the electoral votes on Jan. 6. Ted Cruz then promised his own support, joined by about 10 other senators. More than a hundred Republican representatives took the same position. For many, this seemed like nothing more than a show: challenges to states’ electoral votes would force delays and floor votes but would not affect the outcome.

Yet for Congress to traduce its basic functions had a price. An elected institution that opposes elections is inviting its own overthrow. Members of Congress who sustained the president’s lie, despite the available and unambiguous evidence, betrayed their constitutional mission. Making his fictions the basis of congressional action gave them flesh. Now Trump could demand that senators and congressmen bow to his will. He could place personal responsibility upon Mike Pence, in charge of the formal proceedings, to pervert them. And on Jan. 6, he directed his followers to exert pressure on these elected representatives, which they proceeded to do: storming the Capitol building, searching for people to punish, ransacking the place.

Of course this did make a kind of sense: If the election really had been stolen, as senators and congressmen were themselves suggesting, then how could Congress be allowed to move forward? For some Republicans, the invasion of the Capitol must have been a shock, or even a lesson. For the breakers, however, it may have been a taste of the future. Afterward, eight senators and more than 100 representatives voted for the lie that had forced them to flee their chambers.

Post-truth is pre-fascism, and Trump has been our post-truth president. When we give up on truth, we concede power to those with the wealth and charisma to create spectacle in its place. Without agreement about some basic facts, citizens cannot form the civil society that would allow them to defend themselves. If we lose the institutions that produce facts that are pertinent to us, then we tend to wallow in attractive abstractions and fictions. Truth defends itself particularly poorly when there is not very much of it around, and the era of Trump — like the era of Vladimir Putin in Russia — is one of the decline of local news. Social media is no substitute: It supercharges the mental habits by which we seek emotional stimulation and comfort, which means losing the distinction between what feels true and what actually is true.

Post-truth wears away the rule of law and invites a regime of myth. These last four years,  . . .

Continue reading. There’s much more — it’s a long article — and at the link you can also listen to it (30 minutes at normal speed).

Written by LeisureGuy

16 January 2021 at 1:56 pm

The Woman Whose Invention Helped Win a War — and Still Baffles Weathermen

Our common culture has long had a blind spot regarding women — their identity, experience, and achievements — and to some extent it is an active blind spot: efforts are made to hide and erase knowledge of women’s accomplishments and of women’s valid experience (cf. Harvey Weinstein and how the culture covered for his offenses).

Irena Fischer-Hwang writes in Smithsonian Magazine:

On June 4, 2013, the city of Huntsville, Alabama was enjoying a gorgeous day. Blue skies, mild temperatures. Just what the forecasters had predicted.

But in the post-lunch hours, meteorologists started picking up what seemed to be a rogue thunderstorm on the weather radar. The “blob,” as they referred to it, mushroomed on the radar screen. By 4 PM, it covered the entire city of Huntsville. Strangely, however, the actual view out of people’s windows remained a calm azure.

The source of the blob turned out to be not a freak weather front, but rather a cloud of radar chaff, a military technology used by nations all across the globe today. Its source was the nearby Redstone Arsenal, which, it seems, had decided that a warm summer’s day would be perfect for a completely routine military test.

More surprising than the effect that radar chaff has on modern weather systems, though, is the fact that its inventor’s life’s work was obscured by the haze of a male-centric scientific community’s outdated traditions.

The inventor of radar chaff was a woman named Joan Curran.

Born Joan Strothers and raised in Swansea on the coast of Wales, she matriculated at the University of Cambridge’s Newnham College in 1934. Strothers studied physics on a full scholarship and enjoyed rowing in her spare time. Upon finishing her degree requirements in 1938, she went to the University’s preeminent Cavendish Laboratory to begin a doctorate in physics.

At the Cavendish, Strothers was assigned to work with a young man named Samuel Curran. For two years, Strothers got along swimmingly with her new lab partner. But with international conflict brewing in Europe, in 1940 the pair was transferred twice to work on military research, and ended up at Exeter.

There, the two developed proximity fuses to destroy enemy planes and rockets. There also, Strothers married Sam and took on his last name, becoming Joan Curran. Shortly after their wedding in November, the Currans transferred to the Telecommunications Research Establishment (TRE) in the autumn of 1940. Curran joined a team led by British physicist and scientific military intelligence expert R.V. Jones that was developing a method to conceal aircraft from enemy radar detection.

The idea, Jones later explained in his book Most Secret War, was simple. Radar detectors measure the reflection of radio waves of a certain wavelength off of incoming objects. As it turns out, thin metal strips can resonate with incoming waves, and also re-radiate the waves. Under the right conditions, the re-radiated waves create the sonic impression of a large object when in reality, there is none—hence, the blob in Alabama.

This property means that a few hundred thin reflectors could, together, reflect as much energy as a heavy British bomber plane would. A collection of strips might conceal the exact location of an aircraft during a raid behind a large cloud of signal, or even lead the enemy to believe they were observing a major attack when in reality there were only one or two planes.

By the time Pearl Harbor was attacked in 1941, Curran was nearly a year into painstaking experiments on using metals to reflect radar signals. She had tried a seemingly countless number of sizes and shapes, from singular wires to metal leaflets the size of notebook paper. The leaflets had been a particularly interesting idea, since they could do double-duty as propaganda sheets with text printed on them.

In 1942, Curran finally settled on reflectors that were about 25 centimeters long and 1.5 centimeters wide. The reflectors were aluminized paper strips bundled into one-pound packets and intended to be thrown out of the leading aircraft. When defenestrated from a stream of bombers once every minute, they could produce “the radar equivalent of a smokescreen,” according to Jones.
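A rough calculation makes the resonance idea concrete. What follows is a minimal sketch (mine, not from the article), assuming the standard half-wave rule of thumb for chaff: a strip reflects most strongly when its length is roughly half the radar wavelength. Working backward from Curran’s 25-centimeter strips gives the radar band they would have answered.

```python
# Minimal sketch, assuming the half-wave resonance rule of thumb for chaff
# (strip length roughly half the radar wavelength). Not taken from the article.

C = 299_792_458  # speed of light, in meters per second

def resonant_frequency_hz(strip_length_m: float) -> float:
    """Return the frequency whose half-wavelength matches the strip length."""
    wavelength_m = 2.0 * strip_length_m
    return C / wavelength_m

strip_length_m = 0.25  # Curran's reflectors were about 25 cm long
freq_hz = resonant_frequency_hz(strip_length_m)
print(f"25 cm strips resonate near {freq_hz / 1e6:.0f} MHz "
      f"(wavelength of about {2 * strip_length_m * 100:.0f} cm)")
# Prints roughly 600 MHz, i.e. a radar wavelength of about half a meter.
```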

In 1943, the reflector strips were put to a serious military test when the Allies launched Operation Gomorrah on Hamburg, Germany. Operation Gomorrah was a brutal campaign of air raids that lasted over a week, destroyed most of the city and resulted in almost 40,000 civilian deaths. But with rates of only 12 aircraft losses out of 791 on one evening’s bombing raid, the campaign was a major victory for the Allies, in large part due to Curran’s reflectors.

Perhaps most notably, radar chaff was used as part of a large-scale, elaborate diversion on June 5, 1944 to prevent German forces from knowing exactly where the Allied invasion into Nazi-held continental Europe would begin. Deployed on the eve of what would become known as D-Day, two radar chaff drops, Operations Taxable and Glimmer, were combined with hundreds of dummy parachutists to draw German attention towards the northernmost parts of France, and away from the beaches of Normandy.

Curran went on to work on . . .

Continue reading. There’s more.

Later in the article:

“We don’t know how many women were working in the labs of famous male scientists, or how many discoveries women contributed to, because for centuries men did a very good job hiding the achievements of women,” Wade remarked in an email.

Written by LeisureGuy

13 January 2021 at 9:45 am

20 Lessons from the 20th Century About How to Defend Democracy from Authoritarianism: A Timely List from Yale Historian Timothy Snyder

This post on Open Culture is worth reading in full, but let me extract just this:

Timothy Snyder, Housum Professor of History at Yale University, is one of the foremost scholars in the U.S. and Europe on the rise and fall of totalitarianism during the 1930s and 40s. Among his long list of appointments and publications, he has won multiple awards for his recent international bestsellers Bloodlands: Europe between Hitler and Stalin and last year’s Black Earth: The Holocaust as History and Warning. That book in part makes the argument that Nazism wasn’t only a German nationalist movement but had global colonialist origins—in Russia, Africa, and in the U.S., the nation that pioneered so many methods of human extermination, racist dehumanization, and ideologically-justified land grabs.

. . .

Rather than making a historical case for viewing the U.S. as exactly like one of the totalitarian regimes of WWII Europe, Snyder presents 20 lessons we might learn from those times and use creatively in our own where they apply. In my view, following his suggestions would make us wiser, more self-aware, proactive, responsible citizens, whatever lies ahead. Read Snyder’s lessons from his Facebook post below and consider ordering his latest book On Tyranny: Twenty Lessons from the Twentieth Century:

1. Do not obey in advance. Much of the power of authoritarianism is freely given. In times like these, individuals think ahead about what a more repressive government will want, and then start to do it without being asked. You’ve already done this, haven’t you? Stop. Anticipatory obedience teaches authorities what is possible and accelerates unfreedom.

2. Defend an institution. Follow the courts or the media, or a court or a newspaper. Do not speak of “our institutions” unless you are making them yours by acting on their behalf. Institutions don’t protect themselves. They go down like dominoes unless each is defended from the beginning.

3. Recall professional ethics. When the leaders of state set a negative example, professional commitments to just practice become much more important. It is hard to break a rule-of-law state without lawyers, and it is hard to have show trials without judges.

4. When listening to politicians, distinguish certain words. Look out for the expansive use of “terrorism” and “extremism.” Be alive to the fatal notions of “exception” and “emergency.” Be angry about the treacherous use of patriotic vocabulary.

5. Be calm when the unthinkable arrives. When the terrorist attack comes, remember that all authoritarians at all times either await or plan such events in order to consolidate power. Think of the Reichstag fire. The sudden disaster that requires the end of the balance of power, the end of opposition parties, and so on, is the oldest trick in the Hitlerian book. Don’t fall for it.

6. Be kind to our language. Avoid pronouncing the phrases everyone else does. Think up your own way of speaking, even if only to convey that thing you think everyone is saying. (Don’t use the internet before bed. Charge your gadgets away from your bedroom, and read.) What to read? Perhaps “The Power of the Powerless” by Václav Havel, 1984 by George Orwell, The Captive Mind by Czesław Milosz, The Rebel by Albert Camus, The Origins of Totalitarianism by Hannah Arendt, or Nothing is True and Everything is Possible by Peter Pomerantsev.

7. Stand out. Someone has to. It is easy, in words and deeds, to follow along. It can feel strange to do or say something different. But without that unease, there is no freedom. And the moment you set an example, the spell of the status quo is broken, and others will follow.

8. Believe in truth. To abandon facts is to abandon freedom. If nothing is true, then no one can criticize power, because there is no basis upon which to do so. If nothing is true, then all is spectacle. The biggest wallet pays for the most blinding lights.

9. Investigate. Figure things out for yourself. Spend more time with long articles. Subsidize investigative journalism by subscribing to print media. Realize that some of what is on your screen is there to harm you. Learn about sites that investigate foreign propaganda pushes.

10. Practice corporeal politics. Power wants your body softening in your chair and your emotions dissipating on the screen. Get outside. Put your body in unfamiliar places with unfamiliar people. Make new friends and march with them.

11. Make eye contact and small talk. This is not just polite. It is a way to stay in touch with your surroundings, break down unnecessary social barriers, and come to understand whom you should and should not trust. If we enter a culture of denunciation, you will want to know the psychological landscape of your daily life.

12. Take responsibility for the face of the world. Notice the swastikas and the other signs of hate. Do not look away and do not get used to them. Remove them yourself and set an example for others to do so.

13. Hinder the one-party state. The parties that took over states were once something else. They exploited a historical moment to make political life impossible for their rivals. Vote in local and state elections while you can.

14. Give regularly to good causes, if you can. Pick a charity and set up autopay. Then you will know that you have made a free choice that is supporting civil society helping others doing something good.

15. Establish a private life. Nastier rulers will use what they know about you to push you around. Scrub your computer of malware. Remember that email is skywriting. Consider using alternative forms of the internet, or simply using it less. Have personal exchanges in person. For the same reason, resolve any legal trouble. Authoritarianism works as a blackmail state, looking for the hook on which to hang you. Try not to have too many hooks.

16. Learn from others in other countries. Keep up your friendships abroad, or make new friends abroad. The present difficulties here are an element of a general trend. And no country is going to find a solution by itself. Make sure you and your family have passports.

17. Watch out for the paramilitaries. When the men with guns who have always claimed to be against the system start wearing uniforms and marching around with torches and pictures of a Leader, the end is nigh. When the pro-Leader paramilitary and the official police and military intermingle, the game is over.

18. Be reflective if you must be armed. If you carry a weapon in public service, God bless you and keep you. But know that evils of the past involved policemen and soldiers finding themselves, one day, doing irregular things. Be ready to say no. (If you do not know what this means, contact the United States Holocaust Memorial Museum and ask about training in professional ethics.)

19. Be as courageous as you can. If none of us is prepared to die for freedom, then all of us will die in unfreedom.

20. Be a patriot. The incoming president is not. Set a good example of what America means for the generations to come. They will need it.

Written by LeisureGuy

11 January 2021 at 5:18 pm

Arnold Schwarzenegger points out similarities between Capitol Hill insurrection and Austria’s Kristallnacht

Written by LeisureGuy

10 January 2021 at 11:27 am

A 25-Year-Old Bet Comes Due: Has Tech Destroyed Society?

In Wired, Steven Levy writes about a long bet that has come due. It also serves as a litmus test for one’s own view of societal progress over the past quarter-century: is the average person of today better off than the average person was in 1995? The article begins:

ON MARCH 6, 1995, WIRED’s executive editor and resident techno-optimist Kevin Kelly went to the Greenwich Village apartment of the author Kirkpatrick Sale. Kelly had asked Sale for an interview. But he planned an ambush.

Kelly had just read an early copy of Sale’s upcoming book, called Rebels Against the Future. It told the story of the 19th-century Luddites, a movement of workers opposed to the machinery of the Industrial Revolution. Before their rebellion was squashed and their leaders hanged, they literally destroyed some of the mechanized looms that, they believed, reduced them to cogs in a dehumanizing engine of mass production.

Sale adored the Luddites. In early 1995, Amazon was less than a year old, Apple was in the doldrums, Microsoft had yet to launch Windows 95, and almost no one had a mobile phone. But Sale, who for years had been churning out books complaining about modernity and urging a return to a subsistence economy, felt that computer technology would make life worse for humans. Sale had even channeled the Luddites at a January event in New York City where he attacked an IBM PC with a 10-pound sledgehammer. It took him two blows to vanquish the object, after which he took a bow and sat down, deeply satisfied.

Kelly hated Sale’s book. His reaction went beyond mere disagreement; Sale’s thesis insulted his sense of the world. So he showed up at Sale’s door not just in search of a verbal brawl but with a plan to expose what he saw as the wrongheadedness of Sale’s ideas. Kelly set up his tape recorder on a table while Sale sat behind his desk.

The visit was all business, Sale recalls. “No eats, no coffee, no particular camaraderie,” he says. Sale had prepped for the interview by reading a few issues of WIRED—he’d never heard of it before Kelly contacted him—and he expected a tough interview. He later described it as downright “hostile, no pretense of objective journalism.” (Kelly later called it adversarial, “because he was an adversary, and he probably viewed me the same way.”) They argued about the Amish, whether printing presses denuded forests, and the impact of technology on work. Sale believed it stole decent labor from people. Kelly replied that technology helped us make new things we couldn’t make any other way. “I regard that as trivial,” Sale said.

Sale believed society was on the verge of collapse. That wasn’t entirely bad, he argued. He hoped the few surviving humans would band together in small, tribal-style clusters. They wouldn’t be just off the grid. There would be no grid. Which was dandy, as far as Sale was concerned.

“History is full of civilizations that have collapsed, followed by people who have had other ways of living,” Sale said. “My optimism is based on the certainty that civilization will collapse.”

That was the opening Kelly had been waiting for. In the final pages of his Luddite book, Sale had predicted society would collapse “within not more than a few decades.” Kelly, who saw technology as an enriching force, believed the opposite—that society would flourish. Baiting his trap, Kelly asked just when Sale thought this might happen.

Sale was a bit taken aback—he’d never put a date on it. Finally, he blurted out 2020. It seemed like a good round number.

Kelly then asked how, in a quarter century, one might determine whether Sale was right.

Sale extemporaneously cited three factors: an economic disaster that would render the dollar worthless, causing a depression worse than the one in 1930; a rebellion of the poor against the monied; and a significant number of environmental catastrophes.

“Would you be willing to bet on your view?” Kelly asked.

“Sure,” Sale said.

Then Kelly sprang his trap. He had come to Sale’s apartment with a $1,000 check drawn on his joint account with his wife. Now he handed it to his startled interview subject. “I bet you $1,000 that in the year 2020, we’re not even close to the kind of disaster you describe,” he said.

Sale barely had $1,000 in his bank account. But he figured that if he lost, a thousand bucks would be worth much less in 2020 anyway. He agreed. Kelly suggested they both send their checks for safekeeping to William Patrick, the editor who had handled both Sale’s Luddite book and Kelly’s recent tome on robots and artificial life; Sale agreed.

“Oh, boy,” Kelly said after Sale wrote out the check. “This is easy money.”

Twenty-five years later, the once distant deadline is here. We are locked down. Income inequality hasn’t been this bad since just before the Great Depression. California and Australia were on fire this year. We’re about to find out how easy that money is. As the time to settle approached, both men agreed that Patrick, the holder of the checks, should determine the winner on December 31. Much more than a thousand bucks was at stake: The bet was a showdown between two fiercely opposed views on the nature of progress. In a time of climate crisis, a pandemic, and predatory capitalism, is optimism about humanity’s future still justified? Kelly and Sale each represent an extreme side of the divide. For the men involved, the bet’s outcome would be a personal validation—or repudiation—of their lifelong quests.

Continue reading. There’s much more (including the judge’s decision), and it’s interesting.

Written by LeisureGuy

10 January 2021 at 10:54 am

Podcast: Bill Moyers and Heather Cox Richardson

The podcast can be downloaded from this post on BillMoyers.com. The transcript begins:

ANNOUNCER: Welcome to Moyers on Democracy. President Trump urged his followers to come to Washington for a “big protest” on January 6th. He wanted their help in reversing the results of the election he lost. “Be there,” he said. “(It) will be wild.” And they came. By the thousands, they came, and sure enough, it was not only “wild,” as the President had promised, it was worse. Much worse. The protesters became a mob, stormed the US Capitol, drove the vice president and members of the House and Senate out of their chambers, and turned a day meant for celebrating democracy into a riot that sought to overturn a free and fair election. Across the country and around the world people watched, horrified, dumbfounded and disbelieving, as an insurrection incited by the president of the United States and his Republican enablers struck at the very centerpiece of American governance. Here’s Bill Moyers, to talk about that day with the historian Heather Cox Richardson.

BILL MOYERS: Good morning Heather, glad you could join me.

HEATHER COX RICHARDSON: It’s always a pleasure.

BILL MOYERS: It’s the morning after what happened in Washington, the insurrection. Did you believe your eyes when you were watching those events unfold on the screen?

HEATHER COX RICHARDSON: I believed them and I wept. And I am not exaggerating. Seeing that Confederate flag, which had never flown in the Capitol during the Civil War, and it had never flown in the Capitol in the 1870s, and it had never flown in the Capitol during the second rise of KKK in the 1920s, going through our people’s government house in 2021– the blow that that means for those of us who understand exactly what was at stake in the Confederacy. That image for me, of the flag being carried through the halls was, I think, my lowest moment as an American.

BILL MOYERS: Interesting because I kept seeing the flags all afternoon: the Confederate flag, American flags flying upside down. Flags with the name “Jesus” on them, “Jesus saves,” “Jesus 2020.” A big, burly protester carrying a flag on a baseball bat that seemed as big as his arms. He paused long enough just to give the camera and us a middle finger. Joe Biden keeps saying, this isn’t America. It’s not who we are, but it is America. This kind of character and this kind of conflict and this kind of meanness are a big part of our history. Is there any hope for Biden’s aspiration to unite us again?

HEATHER COX RICHARDSON: These people have always been in our society. And they always will be in our society. What makes this moment different is that we have a president who is actively inciting them in order to destroy our democracy. We certainly have had presidents who incited these sorts of people before for one end or another. But at the end of the day, every president until now has believed in democracy. And this one does not. He wants to get rid of democracy and replace it with an oligarchy that puts him and his family at the top. The same sort of way that we have oligarchies in Russia now, for example. Biden cannot combat these people alone. This is a moment for Americans who care about our democracy and who care about returning to our fundamental principles. And finally, making them come to life to speak up, to push back, to insist on accountability and to recognize that we are, in fact, struggling for the survival of our country, not simply talking about, “Oh, I like this politician” or, “I like that politician.” And if we do that, will we win? Absolutely. But making people do that and getting people to understand how important that is is going to be a battle. And it’s one that, by the way, we’ve been in before, and lost. This is the same sort of battle we fought at the end of Reconstruction, when most Americans sort of went “Whatever.” And we ended up with a one-party state in the American South for generations. And that is exactly the sort of thing that they are trying to make happen across America itself.

BILL MOYERS: What do you think happens to those we saw on the screen yesterday, those who invaded the Capitol, the core of our congressional system? What do you think happens to them when they discover that Trump and the Republican Party have been lying to them? That the election wasn’t rigged, it wasn’t a hoax. What do they do?

HEATHER COX RICHARDSON: A lot of them will never realize that. You know your psychological studies. A lot of what we used to call brainwashing can’t be undone and won’t be undone. And they will go to their graves believing that this was a stolen election. But some, and you could see them on their faces yesterday, some people sort of went, “Well, wait a minute. This was supposed to be the storm. We were supposed to be having a revolution. And it didn’t happen. We got into the Capitol building. We did our part, and there was nobody there to greet us and to help us take over.” And what’s interesting in a moment like that is there are two things to do: you can go deeper into your delusion, or you can turn on the people who took you there in a really powerful and passionate way. And this is one of the reasons this moment is so fraught is a lot of people might be waking up and going, “Wait a minute. They lied to us. They changed their minds last night and they made Biden president.” And you can see if you’re watching QAnon. They’re sort of saying, “Well, wait a minute. I’m sure Trump has an even deeper plan.” Which, of course, puts him in a bind because he can’t now say, “Oh, never mind. I didn’t mean this,” because then he’s going to lose their loyalty. So, we’re in this fraught moment. But I think people will either go ahead and continue to believe and this will be a rump group that we are going to have to be dealing with for many, many years. Or some of them will become some of our most vocal opponents of people like Trump.

BILL MOYERS: Seventy million people are not really a rump group, are they? They constitute a sizable portion of the American population. You think they’ll drift away, those who are just seeing Trump as a sort of spokesman for their grievances and someone who could put the establishment on notice? Or are they in this for the long run?

HEATHER COX RICHARDSON: I think it’s really important to distinguish between

Continue reading. Or go to the link and listen (or download the audio file).

Written by LeisureGuy

8 January 2021 at 1:07 pm

What New Science Techniques Tell Us About Ancient Women Warriors

The past is a foreign country; they do things differently there.
— L.P. Hartley, writer (30 Dec 1895-1972)

A NY Times article suggests how people attempt to project their own cultural and social conventions onto other societies even when it is totally inappropriate. To be fair, such projection is generally done from ignorance rather than ill will, though ill will quickly arises if the conventions are questioned. I think this is because people construct their identities from memes, generally taken from their cultural/social environment, so those conventions tend to be viewed as natural law with a heavy moral overlay. To deny them can feel to some as if their identity is in danger.

Of course, culture and social convention are subject to evolution and thus change over time. As an example of a change in social/cultural conventions, take the author of the article mentioned below, Mx. Annalee Newitz. Some decades back we went through a cultural shift away from the requirement that the marital status of women must be signified in the honorific: back then one had to use “Miss” for unmarried women, “Mrs.” for married women. (Men, of course, were called “Mr.” regardless of their marital status.)

The inequality was obvious, so in a relatively short period of time, the honorific “Ms.” (pronounced “mizz”) became common, readily adopted because it solved the problem of knowing which honorific to use when you did not know the woman’s marital status. (“Miss” and “Mrs.” were outliers among honorifics in requiring knowledge of marital status, since other honorifics — Mr., Capt., Dr., Rev., Prof., etc. — required no knowledge of marital status.) Indeed, in Southern speech, “mizz” was long since commonly used for both “Miss” and “Mrs.”

“Mx.” eases the burden of knowledge one step further: “Mx.” (pronounced “mix”) is an honorific that applies to a person without regard to gender — in effect, it is the honorific equivalent of “human” or “person.”

I think it will prove quite useful and will be quickly adopted by those whose names are ambiguous as regards gender and who thus frequently get the wrong honorific (“Mr.” when “Ms.” is right, or “Ms.” when “Mr.” is right — “Mx.” finesses the problem altogether). I’m thinking of names like Shirley (remember the sports columnist Shirley Povich?), Pat, Robin, Leslie, Sandy, Kim, Marion (John Wayne’s real name), Charlie, Evelyn, Sue, and so on.

So “Mx.” is the honorific equivalent of “human” or “person”: no comment regarding gender, but showing respect as a person.

Mx. Newitz writes:

Though it’s remarkable that the United States finally is about to have a female vice president, let’s stop calling it an unprecedented achievement. As some recent archaeological studies suggest, women have been leaders, warriors and hunters for thousands of years. This new scholarship is challenging long-held beliefs about so-called natural gender roles in ancient history, inviting us to reconsider how we think about women’s work today.

In November a group of anthropologists and other researchers published a paper in the academic journal Science Advances about the remains of a 9,000-year-old big-game hunter buried in the Andes. Like other hunters of the period, this person was buried with a specialized tool kit associated with stalking large game, including projectile points, scrapers for tanning hides and a tool that looked like a knife. There was nothing particularly unusual about the body — though the leg bones seemed a little slim for an adult male hunter. But when scientists analyzed the tooth enamel using a method borrowed from forensics that reveals whether a person carries the male or female version of a protein called amelogenin, the hunter turned out to be female.

With that information in hand, the researchers re-examined evidence from 107 other graves in the Americas from roughly the same period. They were startled to discover that out of 26 graves with hunter tools, 10 belonged to women. Bonnie Pitblado, an archaeologist at the University of Oklahoma, Norman, told Science magazine that the findings indicate that “women have always been able to hunt and have in fact hunted.” The new data calls into question an influential dogma in the field of archaeology. Nicknamed “man the hunter,” this is the notion that men and women in ancient societies had strictly defined roles: Men hunted, and women gathered. Now, this theory may be crumbling.

While the Andean finding was noteworthy, this was not the first female hunter or warrior to be found by re-examining old archaeological evidence using fresh scientific techniques. Nor was this sort of discovery confined to one group, or one part of the world.

Three years ago, scientists re-examined the remains of a 10th-century Viking warrior excavated in Sweden at the end of the 19th century by Hjalmar Stolpe, an archaeologist. The skeleton had been regally buried at the top of a hill, with a sword, two shields, arrows and two horses. For decades, beginning with the original excavation, archaeologists assumed the Viking was a man. When researchers in the 1970s conducted a new anatomical evaluation of the skeleton, they began to suspect that the Viking was in fact a woman. But it wasn’t until 2017, when a group of Swedish archaeologists and geneticists extracted DNA from the remains, that the sex of the warrior indeed proved to be female.

The finding led to controversy over whether the skeleton was really a warrior, with scholars and pundits protesting what they called revisionist history. Although the genetic sex determination thus was indisputable (the bones of the skeleton had two X chromosomes), these criticisms led the Swedish researchers to examine the evidence yet again, and present a second, more contextual analysis in 2019. Their conclusion again was that the person had been a warrior.

The naysayers raised fair points. In archaeology, as the researchers admitted, we can’t always know why a community buried someone with particular objects. And one female warrior does not mean that many women were leaders, just as the reign of Queen Elizabeth I was not part of a larger feminist movement.

Challenges to “man the hunter” have emerged in new examinations of the early cultures of the Americas as well. In the 1960s, an archaeological dig uncovered in the ancient city of Cahokia, in what is now southwestern Illinois, a 1,000-to-1,200-year-old burial site with two central bodies, one on top of the other, surrounded by other skeletons. The burial was full of shell beads, projectile points and other luxury items. At the time, the archaeologists concluded that this was a burial of two high-status males flanked by their servants.

But in 2016 archaeologists conducted a fresh examination of the grave. The two central figures, it turned out, were a male and a female; they were surrounded by other male-female pairs. Thomas Emerson, who conducted the study with colleagues from the Illinois State Archaeological Survey at the University of Illinois, alongside scientists from other institutions, said the Cahokia discovery demonstrated the existence of male and female nobility. “We don’t have a system in which males are these dominant figures and females are playing bit parts,” as he put it.

Armchair history buffs love to obsess about  . . .

Continue reading.

Written by LeisureGuy

2 January 2021 at 4:54 pm

American Death Cult: Why has the Republican response to the pandemic been so mind-bogglingly disastrous?

Jonathan Chait wrote this back in July 2020 in New York magazine. And just a reminder: the US as of today has seen 20 million cases and more than 346,000 deaths due to Covid-19.

Last October, the Nuclear Threat Initiative and the Johns Hopkins Center for Health Security compiled a ranking system to assess the preparedness of 195 countries for the next global pandemic. Twenty-one panel experts across the globe graded each country in 34 categories composed of 140 subindices. At the top of the rankings, peering down at 194 countries supposedly less equipped to withstand a pandemic, stood the United States of America.

It has since become horrifyingly clear that the experts missed something. The supposed world leader is in fact a viral petri dish of uncontained infection. By June, after most of the world had beaten back the coronavirus pandemic, the U.S., with 4 percent of the world’s population, accounted for 25 percent of its cases. Florida alone was seeing more new infections a week than China, Japan, Korea, Vietnam, Thailand, Malaysia, Indonesia, the Philippines, Australia, and the European Union combined.

During its long period of decline, the Ottoman Empire was called “the sick man of Europe.” The United States is now the sick man of the world, pitied by the same countries that once envied its pandemic preparedness — and, as recently as the 2014 Ebola outbreak, relied on its expertise to organize the global response.

Our former peer nations are now operating in a political context Americans would find unfathomable. Every other wealthy nation in the world has successfully beaten back the disease, at least significantly, and at least for now. New Zealand’s health minister was forced to resign after allowing two people who had tested positive for COVID-19 to attend a funeral. The Italian Parliament heckled Prime Minister Giuseppe Conte when he briefly attempted to remove his mask to deliver a speech. In May — around the time Trump cheered demonstrators into the streets to protest stay-at-home orders — Boris Johnson’s top adviser set off a massive national scandal, complete with multiple calls for his resignation, because he’d been caught driving to visit his parents during lockdown. If a Trump official had done the same, would any newspaper even have bothered to publish the story?

It is difficult for us Americans to imagine living in a country where violations so trivial (by our standards) provoke such an uproar. And if you’re tempted to see for yourself what it looks like, too bad — the E.U. has banned U.S. travelers for health reasons.

The distrust and open dismissal of expertise and authority may seem uniquely contemporary — a phenomenon of the Trump era, or the rise of online misinformation. But the president and his party are the products of a decades-long war against the functioning of good government, a collapse of trust in experts and empiricism, and the spread of a kind of magical thinking that flourishes in a hothouse atmosphere that can seal out reality. While it’s not exactly shocking to see a Republican administration be destroyed by incompetent management — it happened to the last one, after all — the willfulness of it is still mind-boggling and has led to the unnecessary sickness and death of hundreds of thousands of people and the torpedoing of the reelection prospects of the president himself. Like Stalin’s purge of 30,000 Red Army members right before World War II, the central government has perversely chosen to disable the very asset that was intended to carry it through the crisis. Only this failure of leadership and management took place in a supposedly advanced democracy whose leadership succumbed to a debilitating and ultimately deadly ideological pathology.

How did this happen? In 1973, Republicans trusted science more than religion, while Democrats trusted religion more than science. The reverse now holds true. In the meantime, working-class whites left the Democratic Party, which has increasingly taken on the outlook of the professional class with its trust in institutions and empiricism. The influx of working-class whites (especially religiously observant ones) has pushed Republicans toward increasingly paranoid varieties of populism.

This is the conventional history of right-wing populism — that it was a postwar backlash against the New Deal and the Republican Party’s inability or unwillingness to roll it back. The movement believed the government had been subverted, perhaps consciously, by conspirators seeking to impose some form of socialism, communism, or world government. Its “paranoid style,” so described by historian Richard Hofstadter, became warped with anti-intellectualism, reflecting a “conflict between businessmen of certain types and the New Deal bureaucracy, which has spilled over into a resentment of intellectuals and experts.” Its followers seemed prone to “a disorder in relation to authority, characterized by an inability to find other modes for human relationship than those of more or less complete domination or submission.” Perhaps this sounds like someone you’ve heard of.

But for all the virulence of conservative paranoia in American life, without the sanction of a major party exploiting and profiting from paranoia, and thereby encouraging its growth, the worldview remained relatively fringe. Some of the far right’s more colorful adherents, especially the 100,000 reactionaries who joined the John Birch Society, suspected the (then-novel, now-uncontroversial) practice of adding small amounts of fluoride to water supplies to improve dental health was, in fact, a communist plot intended to weaken the populace. Still, the far right lacked power. Republican leaders held Joe McCarthy at arm’s length; Goldwater captured the nomination but went down in a landslide defeat. In the era of Sputnik, science was hardly a countercultural institution. “In the early Cold War period, science was associated with the military,” says sociologist Timothy O’Brien who, along with Shiri Noy, has studied the transformation. “When people thought about scientists, they thought about the Manhattan Project.” The scientist was calculating, cold, heartless, an authority figure against whom the caring, feeling liberal might rebel. Radicals in the ’60s often directed their protests against the scientists or laboratories that worked with the Pentagon.

But this began to change in the 1960s, along with everything else in American political and cultural life. New issues arose that tended to pit scientists against conservatives. Goldwater’s insouciant attitude toward the prospect of nuclear war with the Soviets provoked scientists to explain the impossibility of surviving atomic fallout and the formation of Scientists and Engineers for Johnson-Humphrey. New research by Rachel Carson about pollution and by Ralph Nader on the dangers of cars and other consumer products made science the linchpin of a vast new regulatory state. Business owners quickly grasped that stopping the advance of big government meant blunting the cultural and political authority of scientists. Expertise came to look like tyranny — or at least it was sold that way.

One tobacco company conceded privately in 1969 that it could not directly challenge the evidence of tobacco’s dangers but could make people wonder how solid the evidence really was. “Doubt,” the memo explained, “is our product.” In 1977, the conservative intellectual Irving Kristol urged business leaders to steer their donations away from public-interest causes and toward the burgeoning network of pro-business foundations. “Corporate philanthropy,” he wrote, “should not be, cannot be, disinterested.” The conservative think-tank scene exploded with reports questioning whether pollution, smoking, driving, and other profitable aspects of American capitalism were really as dangerous as the scientists said.

The Republican Party’s turn against science was slow and jagged, as most party-identity changes tend to be. The Environmental Protection Agency had been created under Richard Nixon, and its former administrator, Russell Train, once recalled President Gerald Ford promising to support whatever auto-emissions guidelines his staff deemed necessary. “I want you to be totally comfortable in the fact that no effort whatsoever will be made to try to change your position in any way,” said Ford — a pledge that would be unimaginable for a contemporary Republican president to make. Not until Ronald Reagan did Republican presidents begin letting business interests overrule experts, as when his EPA used a “hit list” of scientists flagged by industry as hostile. And even Reagan toggled between giving business a free hand and listening to his advisers (as he did when he signed a landmark 1987 agreement to phase out substances that were depleting the ozone layer and a plan the next year to curtail acid rain).

The party’s rightward tilt accelerated in the 1990s. “With the collapse of the Soviet Union, Cold Warriors looked for another great threat,” wrote science historians Naomi Oreskes and Erik Conway. “They found it in environmentalism,” viewing climate change as a pretext to impose government control over the whole economy. Since the 1990s was also the decade in which scientific consensus solidified that greenhouse-gas emissions were permanently increasing temperatures, the political stakes of environmentalism soared.

The number of books criticizing environmentalism increased fivefold over the previous decade, and more than 90 percent cited evidence produced by right-wing foundations. Many of these tracts coursed with the same lurid paranoia as their McCarthy-era counterparts. This was when the conspiracy theory that is currently conventional wisdom on the right — that scientists across the globe conspired to exaggerate or falsify global warming data in order to increase their own power — first took root.

This is not just a story about elites. About a decade after business leaders launched their attack on science from above, a new front opened from below: Starting in the late 1970s,  . . .

Continue reading.

Written by LeisureGuy

1 January 2021 at 4:55 pm

How To Get Away with Murder: Live in ancient Rome

Emma Southon writes in History Today:

In 176 BC a strange but revealing murder case came before the Roman praetor, M. Popillius Laenas. A woman, unnamed in the sources, was brought before the court on the charge of murdering her mother by bludgeoning her with a club. The woman happily confessed to the monstrous act of matricide. Her fate, then, seemed sealed when she entered Laenas’ court; but she introduced a defence that was as irrefutable as the wickedness of the killing of a parent. She claimed that the deed had been a crime of grief-fuelled vengeance resulting from the deaths of her own children. They, she said, had been deliberately poisoned by her mother simply to spite her and her own actions were therefore justified.

This defence caused the entire system to grind to a halt. The situation was an appalling paradox. In Roman culture, parricide was a crime that provoked a unique horror; there was nothing worse than murdering a parent. The typical punishment was a bizarre form of the death penalty, which involved the perpetrator being sewn into a sack with a monkey, a snake, a dog and a chicken and then thrown into the Tiber to drown. The purpose of the animals is unclear; the purpose of the sack was to deprive the murderer of the air and water, and prevent their bones from touching and defiling the earth. It was impossible to imagine a confessed parricide being left unpunished. Rome, however, had a predominantly self-help justice system, where private families and individuals investigated and punished slights against themselves. It was not the role of the state, particularly during the time of the Republic (510-27 BC), to interfere with such private matters as a vengeance killing within the family. The right independently to enact justice, especially when avenging the death of your own children, was central to the Roman conception of a just world. It was, therefore, equally impossible to imagine such a killing being punished.

For Laenas, the situation was a nightmare. For most of Republican history there was no formal law criminalising homicide: the Roman government was so deliberately decentralised that it did not see itself as a state which was harmed by private homicide. The murder of a private person did not affect the various magistrates’ power, and therefore the state need not interfere.

Therefore, if he punished a woman who had acted, in the depths of her grief for her children, to justly avenge their murder, then he would be passing judgment on all such killings and suggesting that vengeance killings were criminal. This could not be countenanced.

There was, however, one major exception to this rule:  . . .

Continue reading. There’s much more.

Written by LeisureGuy

1 January 2021 at 3:11 pm

The Endgame of the Reagan Revolution

Heather Cox Richardson writes a good summary of modern American political history:

And so, we are at the end of a year that has brought a presidential impeachment trial, a deadly pandemic that has killed more than 338,000 of us, a huge social movement for racial justice, a presidential election, and a president who has refused to accept the results of that election and is now trying to split his own political party.

It’s been quite a year.

But I had a chance to talk with history podcaster Bob Crawford of the Avett Brothers yesterday, and he asked a more interesting question. He pointed out that we are now twenty years into this century, and asked what I thought were the key changes of those twenty years. I chewed on this question for a while and also asked readers what they thought. Pulling everything together, here is where I’ve come out.

In America, the twenty years since 2000 have seen the end game of the Reagan Revolution, begun in 1980.

In that era, political leaders on the right turned against the principles that had guided the country since the 1930s, when Democratic President Franklin Delano Roosevelt guided the nation out of the Great Depression by using the government to stabilize the economy. During the Depression and World War Two, Americans of all parties had come to believe the government had a role to play in regulating the economy, providing a basic social safety net and promoting infrastructure.

But reactionary businessmen hated regulations and the taxes that leveled the playing field between employers and workers. They called for a return to the pro-business government of the 1920s, but got no traction until the 1954 Brown v. Board of Education decision, when the Supreme Court, under the former Republican governor of California, Earl Warren, unanimously declared racial segregation unconstitutional. That decision, and others that promoted civil rights, enabled opponents of the New Deal government to attract supporters by insisting that the country’s postwar government was simply redistributing tax dollars from hardworking white men to people of color.

That argument echoed the political language of the Reconstruction years, when white southerners insisted that federal efforts to enable formerly enslaved men to participate in the economy on terms equal to white men were simply a redistribution of wealth, because the agents and policies required to achieve equality would cost tax dollars and, after the Civil War, most people with property were white. This, they insisted, was “socialism.”

To oppose the socialism they insisted was taking over the East, opponents of black rights looked to the American West. They called themselves Movement Conservatives, and they celebrated the cowboy who, in their inaccurate vision, was a hardworking white man who wanted nothing of the government but to be left alone to work out his own future. In this myth, the cowboys lived in a male-dominated world, where women were either wives and mothers or sexual playthings, and people of color were savage or subordinate.

With his cowboy hat and western ranch, Reagan deliberately tapped into this mythology, as well as the racism and sexism in it, when he promised to slash taxes and regulations to free individuals from a grasping government. He promised that cutting taxes and regulations would expand the economy. As wealthy people—the “supply side” of the economy—regained control of their capital, they would invest in their businesses and provide more jobs. Everyone would make more money.

From the start, though, his economic system didn’t work. Money moved upward, dramatically, and voters began to think the cutting was going too far. To keep control of the government, Movement Conservatives at the end of the twentieth century ramped up their celebration of the individualist white American man, insisting that America was sliding into socialism even as they cut more and more domestic programs, insisting that the people of color and women who wanted the government to address inequities in the country simply wanted “free stuff.” They courted social conservatives and evangelicals, promising to stop the “secularization” they saw as a partner to communism.

After the end of the Fairness Doctrine in 1987, talk radio spread the message that Black and Brown Americans and “feminazis” were trying to usher in socialism. In 1996, that narrative got a television channel that personified the idea of the strong man with subordinate women. The Fox News Channel told a story that reinforced the Movement Conservative narrative daily until it took over the Republican Party entirely.

The idea that people of color and women were trying to undermine society was enough of a rationale to justify keeping them from the vote, especially after Democrats passed the Motor Voter law in 1993, making it easier for poor people to register to vote. In 1997, Florida began the process of purging voter rolls of Black voters.

And so, 2000 came.

In that year, the presidential election came down to the electoral votes in Florida. Democratic candidate Al Gore won the popular vote by more than 540,000 votes over Republican candidate George W. Bush, but Florida would decide the election. During the required recount, Republican political operatives led by Roger Stone descended on the election canvassers in Miami-Dade County to stop the process. It worked, and the Supreme Court upheld the end of the recount. Bush won Florida by 537 votes and, thanks to its electoral votes, became president. Voter suppression was a success, and Republicans would use it, and after 2010, gerrymandering, to keep control of the government even as they lost popular support.

Bush had promised to unite the country, but his installation in the White House gave new power to the ideology of the Movement Conservative leaders of the Reagan Revolution. He inherited a budget surplus from his predecessor Democrat Bill Clinton, but immediately set out to get rid of it by cutting taxes. A balanced budget meant money for regulation and social programs, so it had to go. From his term onward, Republicans would continue to cut taxes even as budgets operated in the red, the debt climbed, and money moved upward.

The themes of Republican dominance and tax cuts were the backdrop of the terrorist attack of September 11, 2001. That attack gave the country’s leaders a sense of mission after the end of the Cold War and, after launching a war in Afghanistan to stop al-Qaeda, they set out to export democracy to Iraq. This had been a goal for Republican leaders since the Clinton administration, in the belief that the United States needed to spread capitalism and democracy in its role as a world leader. The wars in Afghanistan and Iraq strengthened the president and the federal government, creating the powerful Department of Homeland Security, for example, and leading Bush to assert the power of the presidency to interpret laws through signing statements.

The association of the Republican Party with patriotism enabled Republicans in this era to call for increased spending for the military and continued tax cuts, while attacking Democratic calls for domestic programs as wasteful. Increasingly, Republican media personalities derided those who called for such programs as dangerous, or anti-American.

But while Republicans increasingly looked inward to their party as the only real Americans and asserted power internationally, changes in technology were making the world larger. The Internet put the world at our fingertips and enabled researchers to decode the human genome, revolutionizing medical science. Smartphones made communication easy. Online gaming created communities and empathy. And as many Americans were increasingly embracing rap music and tattoos and LGBTQ rights, as well as recognizing increasing inequality, books were pointing to the dangers of power concentrating at the top of societies. In 1997, J.K. Rowling began her exploration of the rise of authoritarianism in her wildly popular Harry Potter books, but her series was only the most famous of a number of books in which young people conquered a dystopia created by adults.

In Bush’s second term, his ideology created a perfect storm. His . . .

Continue reading. There’s much more.

Plato in Sicily: Philosophy in Practice

leave a comment »

Nick Romeo, a journalist and author who teaches philosophy for Erasmus Academy, and Ian Tewksbury, a Classics graduate student at Stanford University, write in Aeon:

In 388 BCE, Plato was nearly forty. He had lived through an oligarchic coup, a democratic restoration, and the execution of his beloved teacher Socrates by a jury of his fellow Athenians. In his youth, Plato seriously contemplated an entry into Athens’ turbulent politics, but he determined that his envisioned reforms of the city’s constitution and educational practices were vanishingly unlikely to be realised. He devoted himself instead to the pursuit of philosophy, but he retained a fundamental concern with politics, ultimately developing perhaps the most famous of all his formulations: that political justice and human happiness require kings to become philosophers or philosophers to become kings. As Plato approached the age of forty, he visited Megara, Egypt, Cyrene, southern Italy, and, most consequentially of all, the Greek-speaking city-state of Syracuse, on the island of Sicily.

In Syracuse, Plato met a powerful and philosophically-minded young man named Dion, the brother-in-law of Syracuse’s decadent and paranoid tyrant, Dionysius I. Dion would become a lifelong friend and correspondent. This connection brought Plato to the inner court of Syracuse’s politics, and it was here that he decided to test his theory that if kings could be made into philosophers – or philosophers into kings – then justice and happiness could flourish at last.

Syracuse had a reputation for venality and debauchery, and Plato’s conviction soon collided with the realities of political life in Sicily. The court at Syracuse was rife with suspicion, violence and hedonism. Obsessed with the idea of his own assassination, Dionysius I refused to allow his hair to be cut with a knife, instead having it singed with coal. He forced visitors – even his son Dionysius II and his brother Leptines – to prove that they were unarmed by having them stripped naked, inspected and made to change clothes. He slew a captain who’d had a dream of killing him, and he put to death a soldier who handed Leptines a javelin to sketch a map in the dust. This was an inauspicious candidate for the title of philosopher-king.

Plato’s efforts did not fare well. He angered Dionysius I with his philosophical critique of the lavish hedonism of Syracusan court life, arguing that, instead of orgies and wine, one needed justice and moderation to produce true happiness. However sumptuous the life of a tyrant might be, if it was dominated by insatiable grasping after sensual pleasures, he remained a slave to his passions. Plato further taught the tyrant the converse: a man enslaved to another could preserve happiness if he possessed a just and well-ordered soul. Plato’s first visit to Sicily ended in dark irony: Dionysius I sold the philosopher into slavery. He figured that if Plato’s belief were true, then his enslavement would be a matter of indifference since, in the words of the Greek biographer Plutarch, ‘he would, of course, take no harm of it, being the same just man as before; he would enjoy that happiness, though he lost his liberty.’

Fortunately, Plato was soon ransomed by friends. He returned to Athens to found the Academy, where he likely produced many of his greatest works, including The Republic and The Symposium. But his involvement in Sicilian politics continued. He returned to Syracuse twice, attempting on both later trips to influence the mind and character of Dionysius II at the urging of Dion.

These three episodes are generally omitted from our understanding of Plato’s philosophy or dismissed as the picaresque inventions of late biographers. However, this is a mistake that overlooks the philosophical importance of Plato’s Italian voyages. In fact, his three trips to Sicily reveal that true philosophical knowledge entails action; they show the immense power of friendship in Plato’s life and philosophy; and they suggest that Plato’s philosopher-king thesis is not false so much as incomplete.

These key events are cogently expressed in Plato’s often-overlooked Seventh Letter. The Seventh Letter has proved an enigma for scholars since at least the great German philologists of the 19th century. While the majority of scholars have accepted its authenticity, few have given its theory of political action a prominent place in the exegesis of Plato. In the past three decades, some scholars have even moved to write it out of the Platonic canon, with the most recent Oxford commentary terming it The Pseudo-Platonic Seventh Letter (2015). Each age has its own Plato, and perhaps given the apolitical quietism of many academics, it makes sense that contemporary academics often neglect Plato’s discussion of political action. Nonetheless, most scholars – even those who wished it to be a forgery – have found the letter authentic, based on historical and stylistic evidence. If we return to the story of Plato’s Italian journeys, which Plato himself tells in The Seventh Letter, we’re able to resurrect the historical Plato who risked his life in order to unite philosophy and power.

While The Seventh Letter focuses on the story of Plato’s three voyages to Syracuse, it begins with a brief synopsis of his early life. Like most members of the Athenian elite, he made politics and public life his first ambition. In Plato’s 20s, however, Athens underwent a series of violent revolutions, culminating in the restoration of the democracy and the execution of his teacher Socrates in 399 BCE. ‘Whereas at first I had been full of zeal for public life,’ Plato wrote, ‘when I noted these changes and saw how unstable everything was, I became in the end quite dizzy.’ He decided that the time was too chaotic for meaningful action, but he didn’t abandon the desire to engage in political life. Instead, in his own words, he was ‘waiting for the right time’. He was also waiting for the right friends.

When Plato first arrived in Sicily, a trip that likely took more than a week by boat on the rough and dangerous Mediterranean, he immediately noticed the islanders’ extravagant way of life. He was struck by their ‘blissful life’, one ‘replete … with Italian feasts’, where ‘existence is spent in gorging food twice a day and never sleeping alone at night.’ No one can become wise, Plato believed, if he lives a life primarily focused on sensual pleasure. Status-oriented hedonism creates a society devoid of community, one in which the stability of temperance is sacrificed to the flux of competitive excess. Plato writes:

Nor could any State enjoy tranquility, no matter how good its laws when its men think they must spend their all on excesses, and be easygoing about everything except the drinking bouts and the pleasures of love that they pursue with professional zeal. These States are always changing into tyrannies, or oligarchies, or democracies, while the rulers in them will not even hear mention of a just and equitable constitution.

Though the Syracusan state was in disarray, Plato’s friend Dion offered him a unique opportunity to influence the Sicilian kings. Dion didn’t partake in the ‘blissful life’ of the court. Instead, according to Plato, he lived ‘his life in a different manner’, because he chose ‘virtue worthy of more devotion than pleasure and all other kinds of luxury’. While today we might not associate friendship with political philosophy, many ancient thinkers understood the intimate connection between the two. Plutarch, a subtle reader of Plato, expresses this link nicely:

[L]ove, zeal, and affection … which, though they seem more pliant than the stiff and hard bonds of severity, are nevertheless the strongest and most durable ties to sustain a lasting government.

Plato saw in Dion ‘a zeal and attentiveness I had never encountered in any young man’. The opportunity to extend these bonds to the summit of political power would present itself 20 years later, after Plato escaped slavery and Dionysius I had died.

Dionysius II, the elder tyrant’s son, also didn’t appear likely to become a philosopher king. Although Dion wanted his brother-in-law Dionysius I to give Dionysius II a liberal education, the older king’s fear of being deposed made him reluctant to comply. He worried that if his son received a sound moral education, conversing regularly with wise and reasonable teachers, he might overthrow him. So Dionysius I kept Dionysius II confined and uneducated. As he grew older, courtiers plied him with wine and women. Dionysius II once held a 90-day long drunken debauch, refusing to conduct any official business: ‘drinking, singing, dancing, and buffoonery reigned there without control,’ Plutarch wrote.

Nonetheless, Dion used all his influence to persuade the young king to invite Plato to Sicily and place himself under the guidance of the Athenian philosopher. Dionysius II began sending Plato letters urging him to visit, and Dion as well as various Pythagorean philosophers from southern Italy added their own pleas. But Plato was nearly 60 years old, and his last experience in Syracusan politics must have left him reluctant to test fate again. Not heeding these entreaties would have been an easy and understandable choice.

Dion wrote to Plato that this was  . . .

Continue reading. There’s more.

Later in the article:

He writes in The Seventh Letter:

I set out from home … dreading self-reproach most of all; lest I appear to myself only theory and no deed willingly undertaken … I cleared myself from reproach on the part of Philosophy, seeing that she would have been disgraced if I, through poorness of spirit and timidity, had incurred the shame of cowardice …

This reveals a conception of philosophy in which ‘theory’ is damaged by a lack of corresponding ‘deed’. The legitimacy of philosophy requires the conjunction of knowledge and action.

Written by LeisureGuy

30 December 2020 at 1:15 pm

536 CE: The worst year in history

leave a comment »

Ann Gibbons writes in Science:

Ask medieval historian Michael McCormick what year was the worst to be alive, and he’s got an answer: “536.” Not 1349, when the Black Death wiped out half of Europe. Not 1918, when the flu killed 50 million to 100 million people, mostly young adults. But 536. In Europe, “It was the beginning of one of the worst periods to be alive, if not the worst year,” says McCormick, a historian and archaeologist who chairs the Harvard University Initiative for the Science of the Human Past.

A mysterious fog plunged Europe, the Middle East, and parts of Asia into darkness, day and night—for 18 months. “For the sun gave forth its light without brightness, like the moon, during the whole year,” wrote Byzantine historian Procopius. Temperatures in the summer of 536 fell 1.5°C to 2.5°C, initiating the coldest decade in the past 2300 years. Snow fell that summer in China; crops failed; people starved. The Irish chronicles record “a failure of bread from the years 536–539.” Then, in 541, bubonic plague struck the Roman port of Pelusium, in Egypt. What came to be called the Plague of Justinian spread rapidly, wiping out one-third to one-half of the population of the eastern Roman Empire and hastening its collapse, McCormick says.

Historians have long known that the middle of the sixth century was a dark hour in what used to be called the Dark Ages, but the source of the mysterious clouds has long been a puzzle. Now, an ultraprecise analysis of ice from a Swiss glacier by a team led by McCormick and glaciologist Paul Mayewski at the Climate Change Institute of The University of Maine (UM) in Orono has fingered a culprit. At a workshop at Harvard this week, the team reported that a cataclysmic volcanic eruption in Iceland spewed ash across the Northern Hemisphere early in 536. Two other massive eruptions followed, in 540 and 547. The repeated blows, followed by plague, plunged Europe into economic stagnation that lasted until 640, when another signal in the ice—a spike in airborne lead—marks a resurgence of silver mining, as the team reports in Antiquity this week.

To Kyle Harper, provost and a medieval and Roman historian at The University of Oklahoma in Norman, the detailed log of natural disasters and human pollution frozen into the ice “give[s] us a new kind of record for understanding the concatenation of human and natural causes that led to the fall of the Roman Empire—and the earliest stirrings of this new medieval economy.”

Ever since tree ring studies in the 1990s suggested the summers around the year 540 were unusually cold, researchers have hunted for the cause. Three years ago polar ice cores from Greenland and Antarctica yielded a clue. When a volcano erupts, it spews sulfur, bismuth, and other substances high into the atmosphere, where they form an aerosol veil that reflects the sun’s light back into space, cooling the planet. By matching the ice record of these chemical traces with tree ring records of climate, a team led by Michael Sigl, now of the University of Bern, found that nearly every unusually cold summer over the past 2500 years was preceded by a volcanic eruption. A massive eruption—perhaps in North America, the team suggested—stood out in late 535 or early 536; another followed in 540. Sigl’s team concluded that the double blow explained the prolonged dark and cold.

Mayewski and his interdisciplinary team decided to look for the same eruptions in an ice core drilled in 2013 in the Colle Gnifetti Glacier in the Swiss Alps. The 72-meter-long core entombs more than 2000 years of fallout from volcanoes, Saharan dust storms, and human activities smack in the center of Europe. The team deciphered this record using a new ultra–high-resolution method, in which a laser carves 120-micron slivers of ice, representing just a few days or weeks of snowfall, along the length of the core. Each of the samples—some 50,000 from each meter of the core—is analyzed for about a dozen elements. The approach enabled the team to pinpoint storms, volcanic eruptions, and lead pollution down to the month or even less, going back 2000 years, says UM volcanologist Andrei Kurbatov. . .

Continue reading. There’s more, including an interesting chart.

It’s worth noting that nature can dish out severe catastrophes with no warning. Humans should really try to get along, because the future will certainly, at some point or other, bring additional great catastrophes that will make the pandemic seem like a walk in the park.

Written by LeisureGuy

27 December 2020 at 6:54 pm

The long history of *

leave a comment »

The site Shady Characters has an interesting post that begins:

The asterisk is old. Really old. Granted, it is not 5,000 years old, as Robert Bringhurst claims in the otherwise impeccable Elements of Typographic Style[1] (Bringhurst confuses it with a star-like cuneiform mark that represents “deity” or “heaven”[2]), but it has more than two millennia under its belt nonetheless. I go into greater detail in the Shady Characters book, but the abridged version of the asterisk’s origin story goes something like this.


In the third century BCE, at Alexandria in Egypt, a librarian named Zenodotus was struggling to edit the works of Homer into something approaching their original form. I say a librarian, but really Zenodotus was the librarian, the first in a long line to be employed at Alexandria by the Ptolemaic pharaohs.[3] Many spurious additions, deletions and alterations had been made to the Odyssey and Iliad since the time of their composition, but Zenodotus lacked the tools to deal with them. As such, he started drawing a short dash (—) in the margin beside each line he considered to be superfluous, and, in doing so, inaugurated the field of literary criticism.[4] The mark was named the obelos, or “roasting spit”; in the seventh century, Isidore of Seville captured its essence when he wrote that “like an arrow, it slays the superfluous and pierces the false”.[5]

The asterisk, in turn, was created by one of Zenodotus’s successors. In the second century BCE, Aristarchus of Samothrace introduced an array of new critical symbols: the diple (>) called out noteworthy features in the text; the diple periestigmene (⸖) marked lines where Aristarchus disagreed with Zenodotus’s edits; and, finally, the asteriskos (※), or “little star”, denoted duplicate lines.[6,7] Occasionally, Aristarchus paired an asterisk and obelus to indicate lines that belonged elsewhere in the poem.[8]

Thus the asterisk was born. And right from the beginning, it came with a warning: a text with an asterisk attached to it is not the whole story.


Having survived the intervening millennia with its visual form largely intact, by the medieval period the asterisk had moved into a new role as an “anchor” for readers’ notes: where a reader wanted to link a note scribbled in the margin to a particular passage in the text, a pair of asterisks would do the trick. Later, in printed books, authors used the asterisk to call out their own asides.[9]

By the twentieth century, the asterisk had become the de facto leader of the footnote clan. In 1953, a lexicographer named Eric Partridge explained that “the following are often used”: ‘*’, ‘†’, ‘**’, ‘‡’ or ‘††’, ‘***’ or ‘⁂’, and finally ‘†††’.[10] Things have calmed down a little since Partridge’s time, but ‘*’, ‘†’, and ‘‡’ are still relatively common and even ‘§’, ‘||’ and ‘¶’ appear on occasion. Should a writer’s penchant for footnotes extend past five or six per page, lettered or numbered notes may be a better option and, indeed, the frequency of typographic footnote markers does seem to have waned over the past few decades.


Yet even as the asterisk is used less often as a footnote marker, its implied meaning — that there is more here than meets the eye — is as strong as ever. For American newspapers, merely to use the word “asterisk” is to tarnish its subject by association; for American sports writers, doubly so.

It all goes back to 1961, and a baseball establishment unwilling to . . .

Continue reading.

Written by LeisureGuy

27 December 2020 at 7:22 am

Posted in Daily life, History, Writing

How U.S. Cities Lost Control of Police Discipline

leave a comment »

It doesn’t have to be the way it is. In the NY Times Kim Barker, Michael H. Keller, and Steve Eder report:

It took Portland, Ore., almost $1 million in legal fees, efforts by two mayors and a police chief, and years of battle with the police union to defend the firing of Officer Ron Frashour — only to have to bring him back. Today, the veteran white officer, who shot an unarmed Black man in the back a decade ago, is still on the force.

Sam Adams, the former mayor of Portland, said the frustrated disciplinary effort showed “how little control we had” over the police. “This was as bad a part of government as I’d ever seen. The government gets to kill someone and get away with it.”

After the death of George Floyd at the hands of Minneapolis officers in May spurred huge protests and calls for a nationwide reset on law enforcement, police departments are facing new state laws, ballot proposals and procedures to rein in abusive officers. Portland and other cities have hired new chiefs and are strengthening civilian oversight. Some municipal leaders have responded faster than ever to high-profile allegations of misconduct: Since May, nearly 40 officers have been fired for use of force or racist behavior.

But any significant changes are likely to require dismantling deeply ingrained systems that shield officers from scrutiny, make it difficult to remove them and portend roadblocks for reform efforts, according to an examination by The New York Times. For this article, reporters reviewed hundreds of arbitration decisions, court cases and police contracts stretching back decades, and interviewed more than 150 former chiefs and officers, law enforcement experts and civilian oversight board members.

While the Black Lives Matter protests this year have aimed to address police violence against people of color, another wave of protests a half-century ago was exploited to gain the protections that now often allow officers accused of excessive force to avoid discipline.

That effort took off in Detroit, partly as a backlash to the civil rights movement of the 1960s, when police officers around the country — who at times acted as instruments of suppression for political officials or were accused of brutality in quelling unrest — felt vulnerable to citizen complaints.

Newly formed police unions leveraged fears of lawlessness and an era of high crime to win disciplinary constraints, often far beyond those of other public employees. Over 50 years, these protections, expanded in contracts and laws, have built a robust system for law enforcement officers. As a result, critics said, officers empowered to protect the public instead were protected from the public.

In many places, the union contract became the ultimate word. The contract overrode the city charter in Detroit. The contract can beat state law in Illinois. The contract, for years, has stalled a federal consent decree in Seattle.

Many police contracts and state laws allow officers to appeal disciplinary cases to an arbitrator or a review board, giving them final say. Arbitrators reinstate about half of the fired officers whose appeals they consider, according to separate reviews of samplings of cases by The Times and a law professor. Some arbitrators referred to termination as “economic capital punishment” or “economic murder.”

Disciplinary cases often fall apart because of contractual or legal standards requiring that departments show a record of comparable discipline: A past decision not to fire makes it harder to fire anyone else.

Because many departments don’t disclose disciplinary action for police misconduct and there is no public centralized record-keeping system, it is difficult to determine how many cases are pursued against officers, and the outcomes.

And police chiefs acknowledge that they don’t always seek the discipline they think is warranted. That can lead to problem officers remaining on the streets. Rather than gamble on arbitration, some chiefs allow officers to quit or opt for financial settlements, which can enable them to move on to other departments with seemingly unblemished records.

“You would pay them to leave,” said Roger Peterson, the former police chief in Rochester, Minn., who said he had negotiated such payments for about a dozen officers during his 19-year tenure. “It stunk.”

Union leaders defend the disciplinary protections, saying that police work is difficult [unlike all other jobs, which are easy? – LG], and that rules help ensure that chiefs don’t impose discipline because of political pressure or personal biases. Public outcry, they said, can unfairly influence a city’s decision to fire an officer accused of excessive force. Will Aitchison, the union lawyer who represented Officer Frashour in Portland, said the arbitration process protected officers like him who were fired because of “political expediency.”

“Nobody wants a bad cop,” said Brian Marvel, a San Diego police officer and the president of California’s largest law enforcement labor organization. “Good cops want bad cops out as bad as anybody else. But we still have to protect the due-process rights of all our members.”

Even so, many leaders argue that the protections handcuff them. Eric Melancon, chief of staff to the Baltimore police commissioner, drew a direct line between the laws from decades ago and the difficulties today.

“If George Floyd were to happen in Baltimore city,” he told a state policing commission, “we would not be able to terminate those officers.”

In the summer of 1967, civil unrest simmered in more than 150 cities nationwide. Detroit caught fire.

Black residents saw the almost all-white police force as . . .

Continue reading.

Written by LeisureGuy

24 December 2020 at 3:30 pm

Why rulers and leaders don’t seem to see the right decisions so clearly as you and I

leave a comment »

Written by LeisureGuy

10 December 2020 at 1:00 pm

Russia’s long-term disinformation campaign against the US

leave a comment »

David Troy has a lengthy Facebook post on Russia’s disinformation war against the US, a war that has successfully enlisted support from a great number on the Right, who participate in and amplify the disinformation without understanding on whose behalf they are working or the impact on the US.

He writes:

Russia’s current disinformation attacks on the west accelerated in late 2012, consisting of three phases:

1) amplify leaked intelligence harmful to US, EU, and NATO,
2) identify and exacerbate real existing divisions in US society,
3) recruit and amplify Americans and other westerners to drive conflict.

Phase 1 involved Assange, Snowden, Manning and others; one can argue about whether and when they became witting vs. unwitting accomplices, and it doesn’t matter for this discussion. Phase 2 involved IRA planting content and driving conflicts. Phase 3 ran parallel to the entire operation but is now the principal driver. Russia need do nothing now but amplify western voices to advance its geopolitical agenda.

Even today people misunderstand what happened. The “interference” that occurred in 2016 was not a discrete event with a beginning and end, and it wasn’t limited to 2016. The operations began in 2012 with people like Christine Assange, Roseanne Barr, Caitlin Johnstone and Cassandra Fairbanks, planting seeds that would grow later. Several dozen others were seeded, to grow audiences and cultivate “points of view.”

By 2016 they had a mighty circus underway, with angry opposing voices, along with slogans and chants. To the public, including the media, this seemed home grown. “America is built on original sin, and this is just payback time,” we thought. “We are not the country we pretend to be,” we thought. In fact we are looking at ourselves through a funhouse mirror, grotesquely amplifying our least flattering features.

Those of us studying disinformation networks and cults in detail know what has actually occurred. And most likely, you won’t read about it in the papers soon. That’s not for lack of trying on our part. It’s because these manipulations are difficult to report, even as it’s possible to understand them from an intelligence and analysis point of view. Some of them are so complex and arcane as to defy both imagination and comprehension.

The case of Alger Hiss is instructive. He stood accused of espionage in 1948, for activities he had undertaken in the decade prior. The statute of limitations for an espionage conviction had run out, however, so in 1950 he was instead convicted of perjury for lying under oath in his testimony. He served three and a half years in jail. For decades, the debate over Hiss’ guilt or innocence raged. Hardcore leftists accused Richard Nixon of manufacturing a typewriter to frame Hiss.

In fact, an intelligence program called VENONA had captured information that could have proved Hiss’ guilt. But it was not revealed at that time because it would have exposed the fact that Russia’s (weak) encryption practices had been broken. So its existence, along with information that would have implicated Hiss’ entire spy cell, was kept secret until 1995 — fifteen years after the program was terminated.

Many a political debate in the 1970s could have been quashed had the VENONA information been known at that time. As it turned out, the information eventually released showed Hiss to most likely be codename ALES, an active Soviet GRU agent. But even today, there are some who have alternative explanations for the identity of ALES. Most people believe he was Hiss.

So here we sit 80 years after the fact and still not totally certain about the facts of a GRU operation. I am here to tell you in very certain terms that we are living through this again now. There are active GRU operations happening in our country now that are leading us into pointless political debates, that make us think less of ourselves and of our country — and we may not learn about them in any detail for a long time.

Will it take 55 years, as it did with VENONA? Maybe. I hope not. Imagine in 2070 finally learning more of the actual truth about what happened in 2015 or 2020. That will be a reckoning, no? For my part I’ll be 99 years old, or most likely dead. Many of you will be long gone as well.

Imagine spending the rest of your life battling manufactured demons and engineered information operations designed to outlast our ability to detect and report on them. I’m sad to report that is the reality we inhabit.

The last four years have felt like a war. This is because we are in one. We even have 300,000 dead to show for it. It is tempting to think that we have voted our way out of this war. I don’t think that’s the case.

Much has been made of the threat posed by disinformation. Less has been said about the war. What we will find next year is that we are no less exhausted and exasperated, because the war, which was undeclared, has not ended. It hasn’t even been articulated. It just is. We are in a forever war with ourselves, driven by powers who wish to keep it that way.

So what is to be done? We must close the gap between intelligence gathering and journalistic reporting. 55 years is too long to learn of the existence of intelligence operations. Indeed, Daniel Patrick Moynihan said the same when he successfully argued for the VENONA disclosures in 1995. Why *does* it take us so long to report on intelligence matters?

We have, for decades, over-relied on secrecy, which Moynihan argued has made us more prone to belief in conspiracy theories and distrust in government. He was right. We shouldn’t be keeping secrets forever, and we should have a schedule and a process that leans towards disclosure. It should not be possible for a government bureaucrat to hide things away inside the walls of secret databases to protect themselves or crony political interests.

Secondly, journalism is suffering from a perfect storm of exploitable failures. “News” is now expected to make a profit. That has led editors to prefer bite-sized, clickbait stories that are easier to report and monetize on a regular basis. Harder investigative work is all but impossible in this climate, and often only happens in partnership with research groups who have other funding or motivations.

Modern intelligence operations are also impossibly complex, involving shady characters, shell corporations, cutouts, LARPs, covert communications, sock puppet accounts — and cults. We have found that cults are being weaponized at every level. Human intelligence, more than data, is the bedrock of any analysis effort.

Decoding these networks requires time and skills journalists simply don’t have and aren’t usually interested in developing. And importantly, no one is paying them to develop such skills.

The end result is that complex information operations from 10-15 years ago are *still* effectively unreported. When, if ever now, in this era of underfunded journalism manned with inexperienced reporters, will we learn about the operations taking place today?

Or will we live our lives having pointless debates about engineered conflicts? I think this is likely right now. While I do think history will eventually reckon with the truth of this era — that it was a time of manufactured conflict, rooted in an abuse of secrecy, exploited by cynical actors on a hopelessly naïve public — we must also be careful not to lurch too far in the other direction.

There are voices on the “right” (I use quotes because there is so much god-damn play acting) shouting “declassify!”, which actually means: “release information that damages my political enemies.”
And some on the “left” (same caveat) say we should have no secrets at all, yada yada. Neither of these positions is sane. We have operational capacity that must be protected to maintain peace and stability in the world. We can find balance.

But we need not protect execrable bad faith play actors and cults. In fact, we need urgently to expose these people and deplatform them. People acting in good faith have a right to be heard. Those acting in bad faith do not. Those harming people should be stopped urgently and brought to justice.

Beyond that, we must make it unproductive to wage harmful information attacks. There are ways we can strengthen our population which I will cover another time, but journalists can play a role by learning to work more like intelligence gatherers. Some are rising to this challenge, but it is slow work, wedged between other projects that pay the bills.

While much has been made of the relative “security” of the 2020 election, the fact is that Russia interfered in 2016, 2018, and 2020 by permanently altering the social structure of American society. This is the legacy of this round of information operations and we will need to pursue a course of healing — in addition to exposing and banishing ongoing information operations.

So how long will it be before we learn the truth about the fake debates we are having now, at the hands of information operations? A while longer, I’m afraid. But we can probably do better than 55 years.

Perhaps if we muster all our will we can cut it down to 5 years, but this is the immunization we all desperately need and it can’t come a moment too soon. In the meantime, please be aware of all we don’t know, and love your neighbors — all of them; it’s the only antidote.

He notes in passing:

Interestingly, the GRU has been the GRU since Soviet times. Unlike the KGB, which was reconstituted into the FSB and had some different components mixed in, the GRU has by comparison had the same tasking and culture since its inception in 1918.

See this Wikipedia article on the GRU.

Written by LeisureGuy

10 December 2020 at 10:58 am

Five Books on Julius Caesar, genocidal maniac

leave a comment »

Peter Stothard recommends five books on Julius Caesar:

Julius Caesar was a populist politician and general of the late Roman Republic who immortalized himself not only by his beautiful writing about his military exploits, but also by the manner of his death. Here, British journalist and critic Peter Stothard, author of The Last Assassin, chooses five books to help you understand both the man and what motivated him and some of the people who have been inspired by him in the 2,000 years since he died.

Interview by Benedict King

Perhaps, before we discuss your selection of books about Julius Caesar, you might briefly outline who Caesar was. As a non-Classicist, I think he conquered Gaul and Britain, and brought the Roman Republic to an end by crossing the Rubicon. He was then assassinated and said: ‘Et tu, Brute?’

Yes, he did conquer Gaul—between 58 and 50 BC—killing maybe a million Gauls in the process, also getting too rich and too powerful for traditional Roman politics to cope with him. No, he didn’t conquer Britain—even though his skill as a self-propagandist has often led people to think that he did. He had two goes at invading Britain, 55 and 54 BC, and was knocked back both times—more by the weather than the Britons.

And yes, he did cross the Rubicon, which was a shallow stream between Gaul and Italy. By crossing it with his army, in January 49 BC, he broke the rules designed to keep victorious armies away from Rome, began a civil war and gave the world a new term for an act from which you couldn’t go back.

Four years later, he might have said something like, ‘Et tu Brute,’ when he saw that one of his assassins on the Ides of March was the much loved son of his mistress. But, if he did, it would have probably been in Greek. It was quite usual for educated Romans to speak Greek. More importantly, he was a great writer in plain and elegant Latin. With words he established his place in the minds of his fellow Romans and of millions of people later by saying what he’d done—just as his death defined him for other writers.

By being assassinated he set a standard for thinking about the motives and consequences of assassination. For Romans, how you died was a very important summation of how you had lived. His death cemented what he’d written about what he had done. And the consequences of his death meant that no one ever forgot him.

Your book, The Last Assassin, deals with the pursuit of Julius Caesar’s assassins by his supporters, most notably his adopted son, Octavian, who would go on to become Emperor Augustus. What does that campaign to get back at his assassins tell us about the early establishment of his myth and reputation?

Caesar had many friends, as people who get to the top always do. But it turned out that some of those friends, for various reasons, were also his greatest enemies, so much so that they were prepared to kill him.

They each had slightly different motives, some of which are related to aspects of Caesar’s own character. Some hated him because they hadn’t become as rich under his watch as they felt he’d promised them they would be, or they’d hoped to be. One of them didn’t like him because he’d slept with his wife. Some didn’t like him because he pardoned them and made them feel, by his famous clemency, that somehow he was holding that over them. They felt ashamed of having been pardoned.

Others killed him because they were jealous of other people who hadn’t been as close to Caesar in the hard days in Gaul, but who seemed to have done almost as well as they had. There were lots of different personal reasons. One of them was upset that Caesar had stolen some lions he had planned to put in a circus show.

But they all had this fear that Caesar, even if he wasn’t yet a tyrant in 44 BC, was going to become a tyrant and a single autocratic ruler of Rome. There had been brief periods in Roman history when there had been single autocratic rulers before, but the assassins had this idea that he was going to be different. They couldn’t know that, of course, but they thought he would become a kind of hereditary monarch and impose a different kind of tyranny that they wouldn’t be able to get rid of.

So, they argued amongst themselves, probably suppressing their personal motivations, as to whether it was the right thing to kill a man like Caesar, who had done a great deal for Rome, but who was now on the brink, or over the brink, of establishing a tyranny. Sophisticated arguments were brought to bear about whether they should kill him, or whether the civil war that would probably follow from his death would be even worse.

So, there were these discussions about the evil consequences of tyranny versus a civil war. That discussion was conducted at quite a high philosophical level, but was brought together with a whole lot of those personal motivations for killing him. The philosophical arguments and the individual personal motivations taken together address the issue of who Caesar was.

Let’s move on to the books you’re recommending about Julius Caesar. First up is Et Tu, Brute?: the Murder of Caesar and Political Assassination by Greg Woolf. Tell us about why you’ve chosen this one.

Having to choose five books about Julius Caesar has been a great challenge. Caesar is someone whom you have to look at through many different lenses and prisms. He is not an easy character to see straight up. Looking at him might be compared to looking at the sun. He wasn’t the sun, except to some of his most extreme admirers. But if you try to look at him from one sole direction, it is rather blinding. So, the books I’ve chosen—and Greg Woolf is a very good introduction to this—try to look around Julius Caesar, to look at the ways different people saw him at the time and have seen him since. Woolf’s is a good account of how Caesar got to the Ides of March and what happened on the day. It’s quick and short and a very good start. But there’s also a long section on how the assassination reverberated through history, across Europe and across the Atlantic.

If he didn’t say ‘Et Tu, Brute?’ what did he say?

‘Et tu, Brute?’ was one of Shakespeare’s many contributions. If he said something like it, it is more likely he said the Greek words, ‘kai su, teknon’, which means ‘and you, my child’ and has been variously interpreted to mean ‘even you, who I’ve loved so much’ and ‘even you, the son of my mistress’ or ‘you, too, are going to be assassinated in your turn.’ Maybe it meant ‘I’ll see you in hell’ or a version of ‘up yours, Brutus.’ The Greek phrase has been interpreted in many different ways, and Shakespeare’s ‘Et Tu, Brute?’ was just a convenient way of saying what a Roman might have said.

And just before we get on to the next book: we all know how Caesar died, but where did he come from? Was he born into a senatorial Roman family or did he pull himself up by his bootstraps?

He was born into a good family. All the people we’re talking about in the story, all Caesar’s assassins, were part of the elite, if you like, although the man that I have recently become most interested in, Cassius Parmensis, the last surviving assassin, wasn’t one of the top ones, which in some ways made his eyes a good lens through which to watch the action.

Caesar was a member of one of the elite families which had been rivals, squabbled and cooperated with each other, and fought against each other for hundreds of years, and had made Rome the extraordinary conqueror of so much. Gradually, it turned out that the bigger Rome’s empire, and the bigger the army its generals had, the more impossible it was to control them from the centre. So, Caesar, out in Gaul, with a lot of legions, was a lot more powerful than the Senate, which was supposed to be his master. So the system risked toppling over under its own weight.

But there were still people who thought they could prop it up, that the problem was not the system but Caesar himself. These people were also within the elite—not among the people or the army, who largely loved Caesar, as the assassins found to their cost. These killers thought that, if they could just get rid of Caesar, they could go back to divvying up power in Rome between themselves, as they’d always done.

Let’s move on to American Caesar: Douglas MacArthur 1880-1964 by William Manchester. This is the life of the American general Douglas MacArthur, who was the ruler of occupied Japan after the Second World War. Why have you chosen this book?

This book is a great example of . . .

Continue reading.

Written by LeisureGuy

30 November 2020 at 3:49 pm

Posted in Books, History

A History of Philosophy from Ancient Greece to Modern Times (81 Video Lectures)

leave a comment »

Philosophy is endlessly fascinating, and I think this series will provide a good foundation. Watch one a day, starting today, and well before Valentine’s Day — well, 4 days before — you’ll have a better idea of what philosophy has been in the West.

This is via OpenCulture, which notes:

You can watch 81 video lectures tracing the history of philosophy, moving from Ancient Greece to modern times. Arthur Holmes presented this influential course at Wheaton College for decades, and now it’s online for you. The lectures are all streamable above, or available through this YouTube playlist.

Philosophers covered in the course include: Plato, Aquinas, Hobbes, Descartes, Spinoza, Hume, Kant, Hegel, Nietzsche, Sartre and more.

A History of Philosophy has been added to our list of Free Online Philosophy courses, a subset of our meta collection, 1,500 Free Online Courses from Top Universities.

Written by LeisureGuy

22 November 2020 at 11:04 am

How to Build a State

leave a comment »

As David Anthony points out in his (wonderful) book The Horse, the Wheel, and Language, the invention of the wheel, far from being the caveman-cartoon idea of a massive stone disk, had to await urbanization and occupational specialization. The difficulty is not simply the wheel, but a load-bearing axle that allows low-friction rotation. (He pins down the time of the invention pretty closely, as an evolutionary step from using rollers to move heavy loads.)

As Anton Howes points out, putting a state into operation similarly requires a social infrastructure that took time to develop:

I’ve been a little quieter than usual lately, largely as I’ve been trying to write up some tricky parts of my next book. But I did recently publish a piece for a new online magazine called Works in Progress, entitled “How to Build a State”. With the piece, I wanted to convey the basic model of how we should think about what states could and could not do just a few centuries ago:

Suppose yourself transported to the throne of England in 1500, and crowned monarch. Once you bored of the novelty and luxuries of being head of state, you might become concerned about the lot of the common man and woman. Yet even if you wanted to create a healthcare system, or make education free and universal to all children, or even create a police force (London didn’t get one until 1829, and the rest of the country not til much later), there is absolutely no way you could succeed.

For a start, you would struggle to maintain your hold on power. Fund schools you say? Somebody will have to pay. The nobles? Well, try to tax them — in many European states they were exempt from taxation — and you might quickly lose both your throne and your head. And supposing you do manage to tax them, after miraculously stamping out an insurrection without their support, how would you even begin to go about collecting it? There was simply no central government agency capable of raising it.

It’s a basic point, perhaps, but it has lots of interesting implications, not least that monarchs were heavily reliant on making deals with soldiers, religious leaders, and assorted other groups. Appreciating the way states worked in the past — and the many constraints upon them — is fundamental to understanding the kinds of policies they pursued.

Indeed, the rest of the piece provides a framework for some of the other things I’ve been writing about recently, like the emergence of patents for invention, and the birth of the joint-stock business corporation. Both involved inventors exploiting monarchs’ desire for quick and easy cash, with monarchs exercising their prerogative rights to grant monopolies in exchange for a cut of the proceeds. No parliament needed.

And the framework helps to explain how patents came to be corrupted, far beyond the simple encouragement of new inventions or industries. Patents were soon being used to grant exemptions from certain laws and regulations, or even to oversee their enforcement. In 1594, one courtier obtained a 21-year privilege to regulate the quality of ale and beer used to make vinegar (alegar and beeregar, to be precise). In his petition, he alleged that the vinegars were being made from corrupt materials, so needed proper oversight. But in practice, when he obtained the patent, the courtier simply licensed all of the existing manufacturers to continue exactly what they had been doing before, but paying him fourpence per barrel.

Similar patents were granted for the “regulation” of  . . .

Continue reading.

Written by LeisureGuy

14 November 2020 at 12:33 pm

Where loneliness can lead: Hannah Arendt enjoyed her solitude, but . . .

leave a comment »

Samantha Rose Hill, assistant director of the Hannah Arendt Center for Politics and Humanities, visiting assistant professor of politics at Bard College in New York, and associate faculty at the Brooklyn Institute for Social Research in New York City, writes in Aeon:

What prepares men for totalitarian domination in the non-totalitarian world is the fact that loneliness, once a borderline experience usually suffered in certain marginal social conditions like old age, has become an everyday experience …
– From The Origins of Totalitarianism (1951) by Hannah Arendt

‘Please write regularly, or otherwise I am going to die out here.’ Hannah Arendt didn’t usually begin letters to her husband this way, but in the spring of 1955 she found herself alone in a ‘wilderness’. After the publication of The Origins of Totalitarianism, she was invited to be a visiting lecturer at the University of California, Berkeley. She didn’t like the intellectual atmosphere. Her colleagues lacked a sense of humour, and the cloud of McCarthyism hung over social life. She was told there would be 30 students in her undergraduate classes: there were 120, in each. She hated being on stage lecturing every day: ‘I simply can’t be exposed to the public five times a week – in other words, never get out of the public eye. I feel as if I have to go around looking for myself.’ The one oasis she found was in a dockworker-turned-philosopher from San Francisco, Eric Hoffer – but she wasn’t sure about him either: she told her friend Karl Jaspers that Hoffer was ‘the best thing this country has to offer’; she told her husband Heinrich Blücher that Hoffer was ‘very charming, but not bright’.

Arendt was no stranger to bouts of loneliness. From an early age, she had a keen sense that she was different, an outsider, a pariah, and often preferred to be on her own. Her father died of syphilis when she was seven; she faked all manner of illnesses to avoid going to school as a child so she could stay at home; her first husband left her in Berlin after the burning of the Reichstag; she was stateless for nearly 20 years. But, as Arendt knew, loneliness is a part of the human condition. Everybody feels lonely from time to time.

Writing on loneliness often falls into one of two camps: the overindulgent memoir, or the rational medicalisation that treats loneliness as something to be cured. Both approaches leave the reader a bit cold. One wallows in loneliness, while the other tries to do away with it altogether. And this is in part because loneliness is so difficult to communicate. As soon as we begin to talk about loneliness, we transform one of the most deeply felt human experiences into an object of contemplation, and a subject of reason. Language fails to capture loneliness because loneliness is a universal term that applies to a particular experience. Everybody experiences loneliness, but they experience it differently.

As a word, ‘loneliness’ is relatively new to the English language. One of the first uses was in William Shakespeare’s tragedy Hamlet, which was written around 1600. Polonius beseeches Ophelia: ‘Read on this book, that show of such an exercise may colour your loneliness.’ (He is counselling her to read from a prayer book, so no one will be suspicious of her being alone – here the connotation is of not being with others rather than any feeling of wishing that she was.)

Throughout the 16th century, loneliness was often evoked in sermons to frighten churchgoers from sin – people were asked to imagine themselves in lonely places such as hell or the grave. But well into the 17th century, the word was still rarely used. In 1674, the English naturalist John Ray included ‘loneliness’ in a list of infrequently used words, and defined it as a term to describe places and people ‘far from neighbours’. A century later, the word hadn’t changed much. In Samuel Johnson’s A Dictionary of the English Language (1755), he described the adjective ‘lonely’ solely in terms of the state of being alone (the ‘lonely fox’), or a deserted place (‘lonely rocks’) – much as Shakespeare used the term in the example from Hamlet above.

Until the 19th century, loneliness referred to an action – crossing a threshold, or journeying to a place outside a city – and had less to do with feeling. Descriptions of loneliness and abandonment were used to rouse the terror of nonexistence within men, to get them to imagine absolute isolation, cut off from the world and God’s love. And in a certain way, this makes sense. The first negative word spoken by God about his creation in the Bible comes in Genesis after he made Adam: ‘And the Lord God said, “It is not good that man is alone; I shall make him a helpmate opposite him.”’

In the 19th century, amid modernity, loneliness lost its connection with religion and began to be associated with secular feelings of alienation. The use of the term began to increase sharply after 1800 with the arrival of the Industrial Revolution, and continued to climb until the 1990s until it levelled off, rising again during the first decades of the 21st century. Loneliness took up character and cause in Herman Melville’s ‘Bartleby, the Scrivener: A Story of Wall Street’ (1853), the realist paintings of Edward Hopper, and T S Eliot’s poem The Waste Land (1922). It was engrained in the social and political landscape, romanticised, poeticised, lamented.

But in the middle of the 20th century, Arendt approached loneliness differently. For her, it was both something that could be done and something that was experienced. In the 1950s, as she was trying to write a book about Karl Marx at the height of McCarthyism, she came to think about loneliness in relationship to ideology and terror. Arendt thought the experience of loneliness itself had changed under conditions of totalitarianism:

What prepares men for totalitarian domination in the non-totalitarian world is the fact that loneliness, once a borderline experience usually suffered in certain marginal social conditions like old age, has become an everyday experience of the ever-growing masses of our century.

Totalitarianism in power found a way to crystallise the occasional experience of loneliness into a permanent state of being. Through the use of isolation and terror, totalitarian regimes created the conditions for loneliness, and then appealed to people’s loneliness with ideological propaganda.

Before Arendt left to teach at Berkeley, she’d published an essay on ‘Ideology and Terror’ (1953) dealing with isolation, loneliness and solitude in a Festschrift for Jaspers’s 70th birthday. This essay, alongside her book The Origins of Totalitarianism, became the foundation for her oversubscribed course at Berkeley, ‘Totalitarianism’. The class was divided into four parts: the decay of political institutions, the growth of the masses, imperialism, and the emergence of political parties as interest-group ideologies. In her opening lecture, she framed the course by reflecting on how the relationship between political theory and politics has become doubtful in the modern age. She argued that there was an increasing, general willingness to do away with theory in favour of mere opinions and ideologies. ‘Many,’ she said, ‘think they can dispense with theory altogether, which of course only means that they want their own theory, underlying their own statements, to be accepted as gospel truth.’

Arendt was referring to the way in which ‘ideology’ had been used as a desire to divorce thinking from action – ‘ideology’ comes from the French idéologie, and was first used during the French Revolution, but didn’t become popularised until the publication of Marx and Friedrich Engels’s The German Ideology (written in 1846) and later Karl Mannheim’s Ideology and Utopia (1929), which she reviewed for Die Gesellschaft in 1930.

In 1958, a revised version of ‘Ideology and Terror’ was added as a new conclusion to the second edition of The Origins of Totalitarianism.

Origins is a 600-page work divided into three sections on . . .

Continue reading. There’s much more, and it’s quite relevant today in the US, as more people feel isolated and lonely because of the pandemic lockdowns and as totalitarian pressures emerge from the Right.

Written by LeisureGuy

12 November 2020 at 12:37 pm
