Later On

A blog written for those whose interests more or less match mine.

Archive for the ‘Philosophy’ Category

Free Speech on Trial

Matt Stoller writes in Big:

Today’s issue is about how a subtle form of speech control works in 21st century America, as seen through two ongoing antitrust cases. The first is a merger trial where the government is trying to block the combination of publishing giants Penguin/Random House and Simon & Schuster, and the second is a lawsuit where conservative video service Rumble is suing Google for monopolization.

In both, dominant firms are trying to gain or protect market power, and in doing so, end up with too much power over the public square. It’s not intentional, but monopoly power fosters centralized control of what we can discuss.

Speech and Concentration Creep

In the 1998 romantic comedy You’ve Got Mail, Meg Ryan and Tom Hanks star as two business rivals who hate each other in ‘real life’ but connect and fall in love anonymously over the internet. Hanks plays Joe Fox, a tycoon who owns a Barnes & Noble-style corporate book chain, trying to crush the small store owned by Kathleen Kelly, played by Meg Ryan. After a noisy but adorably silly protest, the movie ends with Kelly losing her store, but getting Tom Hanks as a soulmate. It’s a delightful film, a Nora Ephron-written classic.

What’s interesting about this movie from an anti-monopolist standpoint, however, is not the romance, but the politics. The movie is almost aggressively apathetic about concentrations of power. We tend to look at corporate concentration as a relatively recent phenomenon. Big tech emerged in force in the 2000s; that’s when offshoring to China happened in force; and the key ruling ending monopolization cases didn’t occur until 2004. But here’s a movie showing that almost 25 years ago, before all that, consolidation was so well known as to be a relatively unremarked central plot element of a popular film.

You’ve Got Mail is also a movie about a specific industry, publishing. Indeed, in many ways, the book industry has been a canary in the coal mine for concentration in the American economy. Books were the very first industry dominated by Amazon, but it isn’t just the retail giant. Every part of the book business, from retail stores to distribution to printing to audio and ebooks to publishing houses, has been consolidating for decades. In the movie Tom Hanks is kind and charming; in real life, Barnes & Noble used its power over shelf space to act as the industry bully, until Jeff Bezos came along and turned market power into performance art. Then, ten years ago, Penguin and Random House merged, allowed by the Obama administration’s antitrust enforcers. The book business is an increasingly cruel and lawless world, not a romantic one. . .

Continue reading. Interesting stuff.

Written by Leisureguy

10 August 2022 at 11:44 am

The Conscious Universe

Joe Zadeh wrote in Noéma in November 2021:

London was a crowded city in 1666. The streets were narrow, the air was polluted, and inhabitants lived on top of each other in small wooden houses. That’s why the plague spread so easily, as well as the Great Fire. So did gossip, and the talk of the town was Margaret Cavendish, the Duchess of Newcastle.

Cavendish was a fiery novelist, playwright, philosopher and public figure known for her dramatic manner and controversial beliefs. She made her own dresses and decorated them in ribbons and baubles, and once attended the theater in a topless gown with red paint on her nipples. In his diaries, Samuel Pepys described her as a “mad, conceited, ridiculous woman,” albeit one he was obsessed with: He diarized about her six times in one three-month spell.

The duchess drew public attention because she was a woman with ideas, lots of them, at a time when that was not welcome. Cavendish had grown up during the murderous hysteria of the English witch trials, and her sometimes contradictory proto-feminism was fueled by the belief that there was a parallel to be drawn between the way men treated women and the way men treated animals and nature. “The truth is,” she wrote, “we [women] Live like Bats or Owls, labour like Beasts and die like Worms.”

In 1666, she released “The Blazing World,” a romantic and adventurous fantasy novel (as well as a satire of male intellectualism) in which a woman wanders through a portal at the North Pole and is transported to another world full of multicolored humans and anthropomorphic beasts, where she becomes an empress and builds a utopian society. It is now recognized as one of the first-ever works of science fiction.

But this idea of a blazing world was not just fiction for Cavendish. It was a metaphor for her philosophical theories about the nature of reality. She believed that at a fundamental level, the entire universe was made of just one thing: matter. And that matter wasn’t mostly lifeless and inert, like most of her peers believed, but animate, aware, completely interconnected, at one with the stuff inside us. In essence, she envisioned that it wasn’t just humans that were conscious, but that consciousness, in some form, was present throughout nature, from animals to plants to rocks to atoms. The world, through her eyes, was blazing.

Cavendish was not the only one to have thoughts like these at that time, but they were dangerous thoughts to have. In Amsterdam, the Jewish philosopher Baruch Spinoza wrote that every physical thing had its own mind, and those minds were at one with God’s mind; his books were banned by the church, he was attacked at knifepoint outside a synagogue, and eventually, he was excommunicated. Twenty-three years before Cavendish was born, the Italian Dominican friar and philosopher, Giordano Bruno — who believed the entire universe was made of a single universal substance that contained spirit or consciousness — was labeled a heretic, gagged, tied to a stake and burned alive in the center of Rome by the agents of the Inquisition. His ashes were dumped in the Tiber.

If the dominant worldview of Christianity and the rising worldview of science could agree on anything, it was that matter was dead: Man was superior to nature. But Cavendish, Spinoza, Bruno, and others had latched onto the coattails of an ancient yet radical idea, one that had been circulating philosophy in the East and West since theories of mind first began. Traces of it can be found in Hinduism, Buddhism, Taoism, Christian mysticism, and the philosophy of ancient Greece, as well as many indigenous belief systems around the world. The idea has many forms and versions, but modern studies of it house them all inside one grand general theory: panpsychism.

Derived from the Greek words pan (“all”) and psyche (“soul” or “mind”), panpsychism is the idea that consciousness — perhaps the most mysterious phenomenon we have yet come across — is not unique to the most complex organisms; it pervades the entire universe and is a fundamental feature of reality. “At a very basic level,” wrote the Canadian philosopher William Seager, “the world is awake.”

Plato and Aristotle had panpsychist beliefs, as did the Stoics. At the turn of the 13th century, the Christian mystic Saint Francis of Assisi was so convinced that everything was conscious that he tried speaking to flowers and preaching to birds. In fact, the history of thought is dotted with very clever people coming to this seemingly irrational conclusion. William James, the father of American psychology, was a panpsychist, as was the celebrated British mathematician Alfred North Whitehead; the Nobel Prize-winning physicist Max Planck once remarked in an interview, “I regard consciousness as fundamental.” Even the great inventor Thomas Edison had some panpsychist views, telling the poet George Parsons Lathrop: “It seems that every atom is possessed by a certain amount of primitive intelligence.”

But over the course of the 20th century, panpsychism came to be seen as absurd and incompatible with mainstream Western science and philosophy, just a reassuring delusion for New Age daydreamers. Karl Popper, one of the most influential philosophers of recent times, described it as “trivial” and “grossly misleading.” Another heavyweight, Ludwig Wittgenstein, waved away the theory: “Such image-mongery is of no interest to us.” As the American philosopher John Searle put it: “Consciousness cannot be spread across the universe like a thin veneer of jam.”

Most philosophers and scientists with panpsychist beliefs kept them quiet for fear of public ridicule. Panpsychism used  . . .

Continue reading.

Written by Leisureguy

3 August 2022 at 5:53 pm

In praise of aphorisms

Andrew Hui, associate professor in literature at Yale-NUS College in Singapore and author of The Poetics of Ruins in Renaissance Literature (2016) and A Theory of the Aphorism (2019), has an interesting essay on the aphorism as a philosophical device, though aphorisms enliven and encapsulate discourse well beyond philosophy. One famous example is Oliver Wendell Holmes Jr.’s dictum “The life of the law has not been logic; it has been experience.” Another example, so far less famous (since I wrote it just minutes ago), is “A routine is a ritual that has lost its soul — or has not yet found it.” (That one is now in my post on Covey’s method.)

Hui writes in Aeon:

A typical university course in the history of philosophy surveys the great thinkers of Western civilisation as a stately procession from Plato to Aristotle to Descartes to Kant to Hegel to Nietzsche. These magnificent intellects offer their ideas in weighty philosophical tomes, stuffed with chiselled definitions, well-reasoned arguments and sustained critiques. In turn, instructors present the grand narrative of ideas to a new generation of students.

Immanuel Kant typifies this magisterial approach. In the closing pages of his Critique of Pure Reason (1781), the German philosopher narrates the history of Western philosophy from Plato to Aristotle to Locke to Leibniz to himself as a series of attempts to construct systems. Indeed, he is nothing if not a scrupulous architect of thought:

By an architectonic I mean the art of systems. Since systematic unity is what first turns common cognition into science.

That is, science turns what is a mere aggregate of random thoughts into something coherent. Only then can philosophy become a doctrine or method of judgment of what is knowledge and what is not. No systems, no real philosophy.

But might there be more things in heaven and earth than are dreamt of in Kant’s philosophy? What happens when we consider the history of philosophy not from the point of system-building, but through an alternative account that pays attention to the fragments of thinking?

Consider Heraclitus’ ‘Nature loves to hide’; Blaise Pascal’s ‘The eternal silence of these infinite spaces terrifies me’; or Friedrich Nietzsche’s ‘If a temple is to be erected, a temple must be destroyed.’ Heraclitus comes before and against Plato and Aristotle, Pascal after and against René Descartes, Nietzsche after and against Kant and G W F Hegel. Might the history of thought be actually driven by aphorism?

Much of the history of Western philosophy can be narrated as a series of attempts to construct systems. Conversely, much of the history of aphorisms can be narrated as an animadversion, a turning away from such grand systems through the construction of literary fragments. The philosopher creates and critiques continuous lines of argument; the aphorist, on the other hand, composes scattered lines of intuition. One moves in a chain of logic; the other by leaps and bounds.

Before the birth of Western philosophy proper, there was the aphorism. In ancient Greece, the short sayings of Anaximander, Xenophanes, Parmenides or Heraclitus constitute the first efforts at speculative thinking, but they are also something to which Plato and Aristotle are hostile. Their enigmatic pronouncements elude discursive analysis. They refuse to be corralled into systematic order. No one would deny that their pithy statements might be wise; but Plato and Aristotle were ambivalent about them. They have no rigour at all – they are just the scattered utterances of clever men.

Here is Plato’s critique of Heraclitus:

If you ask any one of them a question, he will pull out some little enigmatic phrase from his quiver and shoot it off at you; and if you try to make him give an account of what he has said, you will only get hit by another, full of strange turns of language.

For Plato, the Heracliteans’ stratagem of continual evasion is a problem because they constantly produce new aphorisms in order to subvert closure. In this sense, Heraclitus is opposed to Plato in at least two fundamental ways: first, his doctrine of flux is contrary to the theory of Forms; and second, the impression one gets is that his thinking is solitary, monologic, misanthropic, whereas Plato is always social, dialogic, inviting.

Plato’s repudiation of his predecessor’s gnomic style signals an important stage in the development of ancient philosophy: the transition from oracular enunciation to argumentative discourse, obscurity to clarity, and thus the marginalisation of the aphoristic style in favour of sustained logical arguments. From Socrates onward, there would simply be no philosophy without proof or argument.

Yet I think it is possible to defend Heraclitus against Plato’s attack. Perplexity arising from enigmatic sayings need not necessarily lead one to seizures of thinking. On the contrary, it can catalyse productive inquiry. Take this well-known saying: . . .

Continue reading. There’s quite a bit more.

Written by Leisureguy

1 August 2022 at 10:35 am

Republicans in Congress oppose what most Americans want

Heather Cox Richardson writes:

Thursday’s public hearing by the House Select Committee to Investigate the January 6th Attack on the U.S. Capitol brought to its logical conclusion the story of Trump’s attempt to overturn our democracy. After four years of destroying democratic norms and gathering power into his own hands, the former president tried to overturn the will of the voters. Trump was attacking the fundamental concept on which this nation rests: that we have a right to consent to the government under which we live.

Far from rejecting the idea of minority rule after seeing where it led, Republican Party lawmakers have doubled down.

They have embraced the idea that state legislatures should dominate our political system, and so in 2021, at least 19 states passed 34 laws to restrict access to voting. On June 24, in the Dobbs v. Jackson Women’s Health decision, the Supreme Court said that the federal government did not have the power, under the Fourteenth Amendment, to protect the constitutional right to abortion, bringing the other rights that amendment protects into question. When Democrats set out to protect some of those rights through federal legislation, Republicans in Congress overwhelmingly voted to oppose such laws.

In the House, Republicans voted against federal protection of an individual’s right to choose whether to continue or end a pregnancy and to protect a health care provider’s ability to provide abortion services: 209 Republicans voted no; 2 didn’t vote. That’s 99% of House Republicans.

They voted against the right to use contraception: 195 out of 209 Republicans voted no; 2 didn’t vote. That’s 96% of House Republicans.

They voted against marriage equality: 157 out of 204 Republicans voted no; 7 didn’t vote. That’s 77% of House Republicans.

They voted against a bill guaranteeing a woman’s right to travel across state lines to obtain abortion services: 205 out of 208 Republicans voted no; 3 didn’t vote. That’s 97% of House Republicans.

Sixty-two percent of Americans believe abortion should be legal. Seventy percent support gay marriage. More than 90% of Americans believe birth control should be legal. I can’t find polling on whether Americans support the idea of women being able to cross state lines without restrictions, but one would hope that concept is also popular. And yet, Republican lawmakers are comfortable standing firmly against the will of the people. The laws protecting these rights passed through the House thanks to overwhelming Democratic support but will have trouble getting past a Republican filibuster in the Senate.

When he took office, Democratic president Joe Biden recognized that his role in this moment was to prove that democracy is still a viable form of government.

Rising autocrats have declared democracy obsolete. They argue that popular government is too slow to respond to the rapid pace of the modern world, or that liberal democracy’s focus on individual rights undermines the traditional values that hold societies together, values like religion and ethnic or racial similarities. Hungarian prime minister Viktor Orbán, whom the radical right supports so enthusiastically that he is speaking on August 4 in Texas at the Conservative Political Action Conference (CPAC), has called for replacing liberal democracy with “illiberal democracy” or “Christian democracy,” which will explicitly not treat everyone equally and will rest power in a single political party.

Biden has defended democracy across the globe, accomplishing more in foreign diplomacy than any president since Franklin Delano Roosevelt. Less than a year after the former president threatened to withdraw the U.S. from the North Atlantic Treaty Organization (NATO), Biden and Secretary of State Antony Blinken pulled together the NATO countries, as well as allies around the world, to stand against the Russian invasion of Ukraine. The new strength of NATO prompted Sweden and Finland to join the organization, and earlier this month, NATO ambassadors signed protocols for their admission. This is the most significant expansion of NATO in 30 years.

That strength helped to . . .

Continue reading.

Written by Leisureguy

23 July 2022 at 9:29 pm

Our Obsession With Growth Must End

In the NY Times, David Marchese interviews the economist Herman Daly on why never-ending growth is an absurd and harmful idea (gift link, no paywall):

Growth is the be-all and end-all of mainstream economic and political thinking. Without a continually rising G.D.P., we’re told, we risk social instability, declining standards of living and pretty much any hope of progress. But what about the counterintuitive possibility that our current pursuit of growth, rabid as it is and causing such great ecological harm, might be incurring more costs than gains? That possibility — that prioritizing growth is ultimately a losing game — is one that the lauded economist Herman Daly has been exploring for more than 50 years. In so doing, he has developed arguments in favor of a steady-state economy, one that forgoes the insatiable and environmentally destructive hunger for growth, recognizes the physical limitations of our planet and instead seeks a sustainable economic and ecological equilibrium. “Growth is an idol of our present system,” says Daly, emeritus professor at the University of Maryland School of Public Policy, a former senior economist for the World Bank and, along with the likes of Greta Thunberg and Edward Snowden, a recipient of the prestigious Right Livelihood Award (often called the “alternative Nobel”). “Every politician is in favor of growth,” Daly, who is 84, continues, “and no one speaks against growth or in favor of steady state or leveling off. But I think it’s an elementary question to ask: Does growth ever become uneconomic?”

There’s an obvious logic to your fundamental argument in favor of a steady-state economy [1], which is that the economy, like everything else on the planet, is subject to physical limitations and the laws of thermodynamics and as such can’t be expected to grow forever. What’s less obvious is how our society would function in a world where the economic pie stops growing. I’ve seen people like Peter Thiel, for example, say that without growth we would ultimately descend into violence [2]. To me that suggests a fairly limited and grim view of human possibility. Is your view of human nature and our willingness to peacefully share the pie just more hopeful than his? First, I’m not against growth of wealth. I think it’s better to be richer than to be poorer. The question is, Does growth, as currently practiced and measured, really increase wealth? Is it making us richer in any aggregate sense, or might it be increasing costs faster than benefits and making us poorer? Mainstream economists don’t have any answer to that. The reason they don’t have any answer to that is that they don’t measure costs. They only measure benefits. That’s what G.D.P. is [3]. There’s nothing subtracted from G.D.P. But the libertarian notion is logical. If you’re going to be a libertarian, then you can’t accept limits to growth. But limits to growth are there. I recall that Kenneth Boulding [4] said there are two kinds of ethics. There’s a heroic ethic and then there’s an economic ethic. The economic ethic says: Wait a minute, there’s benefits and costs. Let’s weigh the two. We don’t want to charge right over the cliff. Let’s look at the margin. Are we getting better off or worse? The heroic ethic says: Hang the cost! Full speed ahead! Death or victory right now! Forward into growth! I guess that shows a faith that if we create too many problems in the present, the future will learn how to deal with it.

[1] One in which the population and the stock of capital no longer grow but, as John Stuart Mill put it, “the art of living would continue to improve.”

[2] Speaking on the Portal podcast in 2019, the billionaire tech investor and libertarian-leaning conservative power broker said, “But I think a world without growth is either going to be a much more violent or a much more deformed world. . . . Without growth, I think it’s very hard to see how you have a good future.”

[3] More specifically, it’s the monetary value of the final goods and services produced by a nation.

[4] An economist, longtime professor at the University of Colorado and former president of the American Economic Association. He died in 1993 at age 83.

Do you have that faith? [Laughs.] No, I don’t.

Historically we think that economic growth leads to higher standards of living, lower death rates and so on. So don’t we have a moral obligation to pursue it?  . . .

Continue reading. (gift link, no paywall)

Written by Leisureguy

22 July 2022 at 3:42 pm

The fallacy of thinking personhood begins at conception

Some claim to believe that a fertilized human ovum is a person — that is, that personhood begins at conception. I disagree strongly. Personhood develops over time, just as (say) a heap of sand develops over time. If you start with a clean slate and place on the slate one grain of sand, you do not have a heap of sand. Add a second grain. Still no heap of sand. Continue adding grains of sand, one by one. At some ill-defined point, after many grains of sand have been added, you will have a heap of sand.

Personhood is like that. It’s not there at the beginning. In the photo at the right, well after the beginning, the fertilized ovum has become a blastocyst. The blastocyst is quite obviously not a person, though in time and with good fortune (for example, in the absence of a spontaneous miscarriage), it can become a person. But it is not yet a person at all.

People who claim that a blastocyst is a person are people who cannot tell the difference between an egg and a chicken or between an acorn and an oak tree. An acorn is not an oak tree, though it has that potential.

Written by Leisureguy

22 July 2022 at 1:04 pm

“The Wrath to Come: Gone with the Wind and the Lies America Tells,” by Sarah Churchwell

Alex von Tunzelmann reviews Sarah Churchwell’s book The Wrath to Come in Literary Review:

The night before Gone with the Wind’s Atlanta premiere in 1939, there was a ball at a plantation. Dressed as slaves, the children of the black Ebenezer Baptist Church choir performed for an all-white audience. They sang ‘There’s Plenty of Good Room in Heaven’; the actress playing Belle Watling, Rhett Butler’s tart with a heart, wept. The scene is already striking: a painfully literal example of the mythologising of the South for white consumption, redefining slavery as harmless and the slaves themselves as grateful. Yet Sarah Churchwell finds a jaw-dropping detail: ‘One of the little Black children dressed as a slave and bringing a sentimental tear to white America’s eye was a ten-year-old boy named Martin Luther King, Jr, who would be dead in thirty years for daring to dream of racial equality in America.’

Churchwell has written about American mythology before, notably in Behold America: A History of America First and the American Dream, as well as in works on Marilyn Monroe and The Great Gatsby. This time it feels like she has hit the motherlode: ‘The heart of the [American] myth, as well as its mind and its nervous system, most of its arguments and beliefs, its loves and hates, its lies and confusions and defence mechanisms and wish fulfilments, are all captured (for the most part inadvertently) in America’s most famous epic romance.’ For Churchwell, ‘Gone with the Wind provides a kind of skeleton key, unlocking America’s illusions about itself.’

This is a bold claim – but Gone with the Wind was, and remains, a phenomenon like no other. Published in June 1936, Margaret Mitchell’s novel sold a million copies before the end of that year, won the 1937 Pulitzer Prize for Fiction, and became the bestselling American novel of all time. Even now, it shifts 300,000 copies annually. In 1939, a film version was released, starring Vivien Leigh as Scarlett O’Hara and Clark Gable as Rhett Butler. Adjusted for inflation, it is the highest-grossing film of all time, ahead of Avatar and Titanic. In 2020, when the South Korean film Parasite – a biting satire on capitalism – won the Academy Award for Best Picture, President Donald Trump expressed his displeasure: ‘What the hell was that all about?’ he asked a rally in Colorado. ‘Can we get like Gone with the Wind back please?’ As usual, his audience understood exactly what he meant.

If the idea that one book and film can be the skeleton key to a whole culture seems simplistic, Churchwell swiftly begins to pile up startling evidence in short, pithy chapters. Race, gender, the Lost Cause, the American Dream, blood-and-soil fascism, the prison-industrial complex, a Trumpist mob storming the Capitol in 2021: it’s all here, and it’s all bound up with the themes of Gone with the Wind. Mythmaking is not just the building of fantasies but also the erasure of truth. The genocide of native peoples, for instance, is not in the book or film, but it was taking place at just the time that Gerald O’Hara would have been acquiring land in Georgia: ‘Scarlett’s beloved Tara is built upon land that was stolen from indigenous Americans a mere decade before her birth.’ Churchwell cuts through these thorny subjects with a propulsive assurance. Her writing is an extraordinary blend of wit, intellectual agility and forcefulness: it’s like being swept along by an extremely smart bulldozer.

Churchwell doesn’t flinch from the horrors that Gone with the Wind belies. The book and film propagate the Lost Cause myth, portraying the South as a place of chivalry, slavery as benevolent and the members of the Ku Klux Klan as honourable men stepping up as the world around them collapses. Churchwell shows us how these myths were constructed from the end of the Civil War onwards, and congealed seventy years later into Gone with the Wind. The reality of the reassertion of white supremacy during and after Reconstruction was, as Churchwell shows, horrific: there is some deeply upsetting material here on the terrorisation of both black people and those whites who did not comply with supremacist social codes. Lynchings were advertised in advance in local newspapers, ‘just as a fun fair or circus might have been’. A typical headline from 1905: ‘Will Burn Negro: Officers Will Probably Not Interfere in Texas’. Eight people were lynched in the year of Gone with the Wind’s publication.

‘Most defences of Gone with the Wind hold that . . .

Continue reading.

Written by Leisureguy

16 July 2022 at 9:52 am

How the Calvinball Supreme Court Upended the Bar Exam

Aaron Regunberg writes in The New Republic on the Supreme Court’s shredding of law and precedent:

Picture the scene: It’s the summer after I graduated from law school and a day that ends in y, which means I’m currently hunched over a workbook, attempting to answer practice questions for the multistate bar exam. Such cramming for the bar is a universal rite of passage in the legal field—one that every lawyer in America remembers going through. But right now, law school graduates across the country are experiencing the ordeal a little differently. Because this year, a lot of the laws we are trying so hard to memorize are, as of just a few weeks ago, no longer actually the law.

I turn the page in my practice test booklet and read the next question: “A state adopted legislation making it a crime to be the biological parent of more than two children. A married couple has just had their third child. They have been arrested and convicted under the statute. Which of the following is the strongest argument for voiding the convictions of the couple?”

I scan the choices. It’s clear that “B” is the right answer: “The statute places an unconstitutional burden on the fundamental privacy interests of married persons.”

Or, well, “B” used to be the right answer. It was the right answer when we graduated from law school at the end of May. It was the right answer through most of June, as we studied the elements of substantive due process—the principle that the Fifth and Fourteenth Amendments protect fundamental rights from government interference, like the rights to personal autonomy, bodily integrity, self-dignity, and self-determination. For decades, these interests formed the outline of a constitutionally protected right to privacy, whose framework we’ve spent the summer copying onto flashcards and trying to recount in practice essays.

But this substantive due process right to privacy was just dealt a body blow by the Supreme Court’s ruling in Dobbs v. Jackson that the U.S. Constitution does not confer a right to abortion. As I think about the 10-year-old rape victim in Ohio who was recently denied an in-state abortion and all the other lives that will soon be shattered by this dramatic rewriting of the law, it’s hard to give a damn about the practice test in front of me.

Still, the bar’s only a few weeks away, and honestly, if I keep getting sucked into panic attacks about judicial coups I’m not going to pass. So I force myself to keep going. I answer a contracts question and fumble my way through a property problem. Then, turning the page, I read: “A state legislature enacted a program by which students in the public schools could participate in public school programs in which religious leaders gave religious instruction and performed religious practices on school grounds. Which of the following would NOT be relevant in assessing the constitutionality of the state religious instruction program?”

I look through the choices. I’m pretty sure the correct answer is “D,” the only option that doesn’t describe an element of the analysis (established 50 years ago in Lemon v. Kurtzman) for determining whether a state has violated the constitutional prohibition against government “respecting an establishment of religion.” But a few weeks ago, after every law school graduate in the country had memorized the three-part Lemon test, the Supreme Court effectively overruled Lemon v. Kurtzman, decreeing in Kennedy v. Bremerton School District (the praying coach case) that courts will henceforth decide whether a government has entangled church and state “by reference to historical practices and understandings.” Which, it seems likely, means whatever the court’s conservative majority wants it to mean—for example, I very much doubt that the Jewish women challenging anti-abortion laws for violating Judaism’s teachings on reproductive choice will get the same response Coach Kennedy received for coercing children into praying to Jesus with him on the 50-yard line after their games.

Still, I can’t focus on that big picture right now. The multiple-choice portion of the bar exam gives you six hours to answer a barrage of 200 questions, which means you have just 108 seconds per inquiry before you have to move on to the next one. That doesn’t leave much time to reconcile our government’s ongoing shift toward Christian theocracy.

So I try to get back into test-taking mode. Next up there’s a criminal procedure question about—ahh, fantastic—Miranda rights, which the Supreme Court severely undermined this term in Vega v. Tekoh. And the hits keep on coming: Next there’s a question on the “case or controversy” requirement laid out under Article III of the Constitution, stipulating that federal courts only have the power to resolve legal questions arising out of an actual dispute between real parties. That’s been a basic principle of judicial review since 1793, and yet I know that the multiple-choice option I mark for correctly stating this rule completely contradicts the Supreme Court’s disastrous climate decision in West Virginia v. EPA—a case over an environmental regulation that never took effect, no longer exists, and never created any real dispute between actual parties. Then I drop my pencil and put my head in my hands.

Obviously, the worst thing about the Supreme Court’s nihilistic legislating from the bench is . . .

Continue reading.

Later in the article:

There’s a bit from the comic strip Calvin and Hobbes that’s recently entered the legal lexicon: Calvinball. Calvinball is a game that has no actual rules; in the comic, Calvin and Hobbes just make up the rules as they play. It’s a perfect metaphor for what constitutional law has become in this country. The conservative court majority has abandoned consistency, precedent, fact, basic constitutional mechanics, and any notion of accountability to the public. Instead, the most important laws affecting our health, our families, our freedoms, and our future are being dictated according to a few extremists’ partisan preferences (and even, at times, their naked self-interest). We are losing the rule of law. And though this loss is strikingly apparent in this month’s bar exam—whose black-and-white nature makes it particularly clear that broad swathes of our long-standing legal traditions have been vaporized—the real evidence is the countless lives that have been or will be destroyed by climate chaos, forced pregnancies, gun violence, authoritarian power grabs, and all the other horrors created by such an arbitrary system of government.

What does “the rule of law” now mean in the US? It seems more like the rule of ideology.

Someone pointed out how drastically nations can change by reminding us of the mini-skirted women practicing medicine in Iran in the 1970s, a sight not seen there today.

Written by Leisureguy

13 July 2022 at 10:07 am

Dave Troy on the idea of civilizational conflict

I find Dave Troy’s insights to be interesting. Here is one of his recent posts on Facebook:

Let’s talk about the end of the world, and what Russia thinks it’s doing. First, it’s necessary to zoom out and discard notions of nation-states, institutions, and politics, and think with a civilizational lens. By now, it’s clear that Putin is following the Dugin playbook.

Aleksandr Dugin (warning: woo alert) believes that all of human history is the product of conflict between two major networks: Eurasianists and Atlanticists. Eurasianists are bound to rule from Dublin to Tokyo (at least); Atlanticists are bound to North + South America.

According to Jean Parvulesco, a Franco-Romanian writer who worked with Dugin, Putin is a historical character predestined to bring about a final conflict between the Eurasianist and Atlanticist networks. There is no real notion of a rules-based order or of which side is “right”; this conflict is simply necessary for the course of history to proceed and for the evolution of civilization.

It is Putin’s job to be a historical character and advance history; there can be no other way. This conflict also addresses the fact that “liberalism” inverts the traditional hierarchical order of the world. As Dugin said in 1992:

• Order of Eurasia against Order of Atlantic (Atlantides).
• Eternal Rome against Eternal Carthago.
• Occult punic war invisibly continuing during millennia.
• Planetary conspiracy of Land against the Sea, Earth against Water
• Authoritarianism and Idea against Democracy and Matter.

René Guénon described a “Hyperborean” northern culture home to a pure Aryan race, with two outposts: Shambhala in the East, and Atlantis in the West. From this division, the conflicting networks were born.

Occultists like Madame Blavatsky suggest Atlantis collapsed because its people became “wicked magicians;” they also believe Shambhala perhaps survived. Nicholas Roerich traveled to Asia in the 1930s to locate Shambhala (perhaps a “Shangri-La”) that may still have existed.

So when we evaluate Putin’s actions, we need to look at them as being predestined, inevitable, and civilizational in scope. This is, at root, what they think they’re doing, and other details and pressures aren’t particularly relevant to that framework.

They believe hierarchy will prevail over any kind of collectivism. Now, it should be noted that Russia itself has not hidden this information; any of you can go look this up and see this is true. Whether this is “real,” or merely what Russia wishes to project as “real” is open to serious, reasoned debate.

I believe we should hedge against both possibilities, because as they run out of options, fantasy will increasingly dominate, just as it did with Hitler’s regime. But we shouldn’t underestimate the gravity of this situation, or the apocalyptic narrative that lies just under the surface. We have some people here flirting with the end of the world, and who have a story to justify it.

We should take that seriously and figure out a real way to end this; the established order of nation states and rules-based order has nearly no bearing on how we might do that. This situation calls for creativity, will, and force.

If the US and Europe wish to counter it, we need to start preparing our populations now for significant and sustained hardships. Because they will not give up unless forced to do so, and they will not be constrained by institutions. Only raw power and a clear sign that the Eurasianists have lost will put out the fire that’s raging in the hearts of this network. What that looks like? Not sure, but it likely doesn’t look like this.

I should also point out that Dugin has mapped this Eurasianist conflict onto “Gog and Magog” from the Book of Revelation, which has helped draw in Christian dominionists anticipating (and desiring) the end of the world, which just amplifies the scope of the conflict.

Many are understandably drawn to make American references to this conflict; I’d encourage zooming out. 330 million people out of ~8 billion is a rounding error in the context of this framework and their idea is that America is dispensable.

What concerns me is we are so wedded to the post-war international order of nation-states and institutions that we have no effective language to communicate about something civilizational in scope. I want to hear leaders talk about their understanding of this dilemma.

To be clear, this does not mean we should be afraid or cowed by Putin. To the contrary, we need to figure out a way to end Russia’s ability to end the world without triggering global catastrophe. That’s a tall challenge but we need the right frameworks in order to conceive it.

Troy notes that the above Facebook post is from a Twitter thread, and for comments from others, check the thread.

There are a great many comments to that Twitter thread, so it’s definitely worthwhile taking a look.

Troy also notes:

With respect to our current pursuits in the democratic realm, I offer some cautions; meanwhile the Russian Duma has introduced a law to replace Putin’s title of “president” with “ruler.”

We are starting to get a clearer picture of the extensive planning and deep involvement of President Trump in the coup plot, which will help in shaping public opinion in support of indictments and undermine support for Republicans going into the midterm elections. But with short news cycles and accelerating instability around the world, the timing of the release of the findings, along with any actions taken by the Department of Justice, will be critically important in determining what impact they may have.

From a threat assessment perspective, it is also likely that events will overtake us and render any retrospective analysis moot. Political strategists should expect and prepare for more violence and chaos proportionate to any political points the committee may expect to score. While anti-democratic forces cannot control the outcome of the committee’s work, they can always add more chaos and violence in hopes of altering the conflict terrain and public perception, and we should expect such attacks.

Written by Leisureguy

12 July 2022 at 6:51 am

My post on Covey’s 7 habits

From time to time I revisit some blog posts — for example, several of those mentioned at the right — to revise and extend my remarks. My post on Covey’s 7 habits is a prime example, and it came to mind because I reworked it again today. So if you find productivity and effectiveness of interest, you might take another look at it.

Written by Leisureguy

8 July 2022 at 5:01 pm

What is mistake theory and can it save the humanities?

Claire Lehmann writes in Engelsberg Ideas:

The French philosopher Michel Foucault is the most cited scholar in the humanities of all time: as of July 2018, he has 873,174 citations on Google Scholar. Judith Butler’s influential book Gender Trouble, which gave rise to Queer theory, and the idea that gender is a performance rather than a biological reality, has been cited over 51,000 times; vastly more than most books written in the twentieth century, or any other time period.

In recent years, universities across the Western world, and particularly in the United States, have seen a rise in new forms of protest: the de-platforming and disinvitation of speakers, the implementation of trigger warnings and safe spaces, and a perception that there is a growing hostility to the principles that define the university experience such as open inquiry and debate. Simultaneously, populist revolts have been occurring around the globe, from Brexit to Trump to the rise of the Sweden Democrats and the backlash to liberal centrist parties across the European continent.

Does anything unite these two disparate trends? It may seem like a long bow to draw, but in a 2018 essay posted on his blog, Californian psychiatrist Scott Alexander developed a model of politics which allows one to find parallels between the far-left activists on US university campuses and the far-right populists of continental Europe. His model is called the ‘Conflict versus Mistake’ model and it neatly reduces the fissures that many of us have observed within contemporary political discourse to axioms that can be applied across contexts.

Within his model, Alexander identifies two key explanatory styles that are crucial for understanding contemporary political discourse. The first is that of the ‘mistake theorist’. A mistake theorist, according to Alexander, is someone who believes that political problems arise because there is a mistake or an error in the system. To the mistake theorist, social phenomena arise from an interplay of many different variables. To understand social problems, one must generally undertake an in-depth analysis to work out what is really going on and how to fix it. Mistake theorists view politics like a science, or an engineering problem. They are like a mechanic looking at the engine of a car.

The second explanatory style is that of the ‘conflict theorist’. A conflict theorist sees the world as being comprised of oppressor classes and oppressed classes. Powerful groups systematically exploit disadvantaged groups. Any unequal distribution of resources is seen as evidence of one group exploiting another. The conflict theorist generally views interactions between groups of people as zero sum. For conflict theorists, politics is war.

The mistake theorist values debate, open inquiry and free speech. There is an understanding in the mistake theorist’s worldview that different people bring different skill sets and knowledge to the table, and that we need diverse views in order to harness our collective intelligence. Because free speech allows us to search for the truth and uncover our mistakes, the mistake theorist views free speech as sacrosanct. Conflict theorists are not so enamoured of the need for debate. They may view debate as being a distraction, a delaying tactic, or an attempt to proliferate ideas that are harmful to the disadvantaged. To the conflict theorist, protecting the disadvantaged is sacrosanct.

Moral sociologists Bradley Campbell and Jason Manning have theorised in their 2018 book, The Rise of Victimhood Culture, that within this conflict theory worldview (what Campbell and Manning call the ‘victimhood culture’ worldview) a moral hierarchy is set up according to one’s status as a member of an oppressed group. Members of less powerful groups are imbued with a special moral status, and due to this special moral status, members of the less powerful groups demand fierce and vigilant protection. To criticise the victim is to engage in victim-blaming.

By contrast, mistake theorists (what Campbell and Manning call the ‘dignity culture’ worldview) see persons as possessing equal moral status. A member of a so-called ‘oppressor’ group is just as entitled to his or her rights as a member of an ‘oppressed’ group. Moral status is not determined by one’s membership of an identity category. Emphasising the importance of process and method in coming to accurate conclusions, in contrast with rushing to judgement, the mistake theorist is likely to advocate principles like the presumption of innocence, procedural fairness, and due process.

Conflict theory is not the sole purview of the Left. Leading up to the 2017 French presidential election, Marine Le Pen frequently used conflict theorist rhetoric, pitting native Frenchmen and women against oppressive elites: ‘Immigration is an organised replacement of our population. This threatens our very survival. We don’t have the means to integrate those who are already here. The result is endless cultural conflict.’

Le Pen draws on the language of victimhood: immigration ‘threatens the survival’ of the French people, and this threat is ‘organised’ — indicating an identifiable enemy. The enemy is a powerful class of elites. While the left-wing manifestation of the conflict theorist worldview blames oppression on white people, men, straight people, and increasingly cisgender or cis people (those who identify with the sex or gender they were born with), the right-wing version blames bankers, globalists, and technocratic elites for exploiting and oppressing the ordinary people.

Unlike conflict theorists, mistake theorists are suspicious of passion and emotion when it comes to answering complex political problems. The apotheosis of this attitude is . . .

Continue reading.

Written by Leisureguy

3 July 2022 at 10:36 am

Paranoia on Parade: How Goldbugs, Libertarians and Religious Extremists Brought America to the Brink

Dave Troy writes a deeply researched and disturbing article in the Washington Spectator:

(Complete Bibliography and Endnotes for Paranoia on Parade can be found in this post following the conclusion of the article.)

“We are choked with news but starved of history.”—Will Durant

The seeds of the January 6, 2021, insurrection can be traced back to the early 1900s, when industrialists concerned with the erosion of their wealth and power attempted to control the currency and restrict government spending. Later these forces, in alignment with America Firsters, aggrieved veterans, and antisemitic splinter groups that mirrored various features of European fascism, including white supremacists, rallied to oppose Roosevelt and the New Deal.

These same elites, their derivatives, and a revolving cast of con artists, energy and tech entrepreneurs, and political extremists would repeatedly convene over the following one hundred years in a concerted attempt to undermine the authority of the U.S. government and oppose social democracy and the democratization of American life.

Roosevelt’s presidency began tumultuously and with a series of shocks that took even his supporters by surprise. Just 36 hours after taking office, at 1 a.m. on Monday, March 6, 1933, Roosevelt suspended all banking transactions, effective immediately. He issued an emergency proclamation that shut the country’s banks down for a full week, in part to prevent hoarding of gold and silver. 1 A month later, on April 5, 1933, he issued Executive Order 6102, which mandated that all gold be turned in to the federal government, outlawing private reserves. 2

These two actions shocked wealthy industrialists who had expected that Franklin Delano Roosevelt, of patrician background and “one of their own,” would address the challenges posed by the Great Depression in a way that would somehow coddle their interests. Their sense of betrayal was evident when Roosevelt sought to pay for his “New Deal” programs by taking the country off their sacred gold standard.

The gold standard, the practice of pegging the value of the dollar to a fixed amount of gold, had been the subject of political debate for decades. Advocates argued that it kept the government honest and constrained spending; a strict adherence to the gold standard kept politicians from pursuing expensive policies and wars simply by keeping them from spending money they didn’t have. 3 In 1933, dollars could be redeemed for gold at a price of $20.67 per ounce, 4 and the government was obligated to produce it upon demand. But there was not enough gold in reserve to redeem all dollars for gold, and that especially would not be the case after the Federal Reserve authorized the debt needed to finance the New Deal.

Wealthy industrialists believed the gold standard helped them keep the government under their control. Roosevelt’s abandonment of it directly attacked both their wealth and their power, and they felt they were being asked to pay for programs for the unlucky and unthrifty.

Right-wing veterans groups align with big business

Veterans of the Great War also felt betrayed. In 1932, having been promised benefits that would not be paid until 1945, and concerned about inflation (uncertain they would get paid in dollars that were worth anything), veterans organized a so-called “Bonus Army” demonstration in Washington, D.C., complete with tent encampments. Herbert Hoover eventually persuaded Gen. Douglas MacArthur to run them off, killing and injuring many participants in the process. Disgusted with Hoover’s disregard for their service, the powerful voting bloc, consisting of about one-sixth of the voting public, pledged their support to Roosevelt. 5

So their surprise was palpable when, on March 20, 1933, Roosevelt passed the Economy Act, which dramatically reduced their benefits in the name of trying to balance the federal budget. 6 Veterans groups were livid, particularly the Veterans of Foreign Wars, which came out against FDR’s actions and demanded restitution from Congress. 7

But Roosevelt settled on the New Deal and enacted it decisively and without delay. This “big bang” set into motion a spectrum of aligned anti-Roosevelt forces, coalescing around a few key groups and individuals.

The American Legion, a veterans organization, was founded in 1919 and funded in part by Grayson M.P. Murphy, a banker affiliated with J.P. Morgan. While the group was ostensibly designed to advocate for the interests of veterans, it also had a secondary role as a union-busting organization. 8 Members were reportedly issued baseball bats and encouraged to use them if they saw signs of union activity at their industrial workplaces. 9 The Legion, which, with a membership of about one million, dwarfed the much smaller, 150,000-member VFW, was more concerned with the interests of big business and had the conservative, moneyed leadership to match. 10

Maj. Gen. Smedley D. Butler, a celebrated war hero, attracted large audiences advocating for veteran bonuses at VFW events. 11 According to Butler, he was approached by Gerald MacGuire on behalf of Grayson M.P. Murphy, to speak in favor of a return to the gold standard at an American Legion convention in the fall of 1933. Butler, suspicious of the Legion’s ties to big business, declined the invitation and a substantial cash offer; incensed, he also claimed that industrialists connected to Murphy and MacGuire intended to enlist veterans in an effort to overthrow Roosevelt in the name of the restoration of the gold standard. 12

Murphy helped to seed another related organization, the American Liberty League, serving as its treasurer. 13 Made up of various wealthy industrialists, including . . .

Continue reading.

For a three-part audio version of the article, with commentary, see this post.

Written by Leisureguy

28 June 2022 at 9:52 am

NASA’s early years: A death cult

Eleanor Konik’s newsletter, Eleanor’s Iceberg, has a very interesting idea in the piece “A Good Host.” After the fiction section, she writes:

. . . Before, whenever I would read something about a “death cult,” I know I’m supposed to think of stuff like the Jonestown suicides, but my head usually goes to fantasy novels like The Black Company, in which a religious cult worships a death goddess by assassinating people bloodlessly. They’re known as “the stranglers” and based on Indian Thuggee bands. The article about astronauts was the first time that I finally understood what people meant when they accused various groups of being a “modern death cult,” and gave me the emotional context to imagine what a Carthaginian “death cult” might have felt like in a way that doesn’t make ancient humans seem incomprehensibly alien.

. . . The idea of a death cult kind of has two versions: the version where people sacrifice themselves and their culture celebrates their sacrifice to the national glory (NASA), and the version where people murder outsiders as a sacrifice to their god (Thuggees). Add in the angry-ex-wife motif and of course I’m going to start thinking of black widows, of sacrificing fathers for the survival of the brood, and of how that would look at a fancy dinner party if it were normalized…

And she then links to this PBS report:

“They had people looking into the background of the men, [and] they also had people looking into the background of the wives because they didn’t want an oddball… it wasn’t discussed, it wasn’t written, but … you had better be in every sense of the word, the All American Family in everything you say and do! We kept it like ‘Leave it to Beaver.’”
— Susan Borman, wife of Apollo 8 commander Frank Borman

Faith and Pragmatism
The women who married fighter pilots or test pilots understood that their lives could be shattered in an instant. Implicitly, they understood that they had to have faith in their husbands’ flying skills and go about the business of raising a family and running a household. “You worry about the custard and I’ll worry about the flying,” Frank would say to Susan Borman. But NASA was different. Wives of astronauts had to maintain that same composure for a worldwide audience at some of the most stressful moments in their lives. While their husbands were strapped to a giant rocket, television crews and newspapermen would crowd the front lawns, building temporary towers on suburban tracts to transmit the family’s reactions to the world.

In the Public Eye
“We were very much in the public eye and nobody had been trained for that. We weren’t trained for ticker-tape parades. Our children weren’t trained for the public view that became part of their lives,” Valerie Anders, wife of Apollo astronaut Bill Anders, remembered. “The astronaut wives’ ‘right stuff’ mostly meant you stayed at home and took the responsibility away from your husband so that he could function in his world, which was a very competitive world. So we were there to do whatever was required. However, I was surprised at how many people thought that we had some kind of special help, because we didn’t. We were military wives, we formed a corps of wives; we were close to each other, but there was no psychological help; there was no one preparing us for this life.” And contrary to what many people thought, astronauts were not exorbitantly paid; they and their families lived on military or government salaries. When her mother asked her why she always wore the same dress on television, Anders had to tell her it was the only good outfit she had.

Public Relations
NASA arranged a contract with Life magazine early on that gave that publication full access to the astronauts’ personal stories but excluded all others. The tradeoff benefited both sides, especially since Life paid a stipend to each family and also provided a life insurance policy — which insurers would not grant to anyone who listed “astronaut” as their profession. The weekly grind was difficult, however, with the astronauts flying across the country visiting various contractors — and then living in virtual isolation at Cape Canaveral for two months before a flight.

Continue reading.

Written by Leisureguy

24 June 2022 at 2:49 pm

Don’t be stoic: Roman Stoicism’s origins show its perniciousness

leave a comment »

Henry Gruber, a historian and archaeologist currently pursuing a PhD at Harvard University, writes in Psyche:

Over the past decade, and especially since the start of the COVID-19 pandemic, more and more Americans have reoriented their lives in accordance with Stoicism. Stoicism sought what the Greeks called eudaimonia: wellness of being, or ‘the good life’. This philosophy taught that the good life was attainable through concrete exercises, performed in accordance with the correct philosophical worldview. The Stoics taught that, by practising their set of exercises, practitioners could learn to see the world from a universal perspective, understand their place in it, and freely and dispassionately assent to carry out the duties imposed upon them by fate. Stoic happiness comes from wisdom, justice, courage and moderation – all states of the individual soul or psyche, and therefore under our control. Everything else is neither good, nor bad. While these beliefs about daily life rested on a foundation of physical and metaphysical theory, the attraction of Stoicism was, and is, in the therapeutic element of its exercises: cognitive behavioural therapy, or Buddhism, for guys in togas.

Despite the benefits of Stoic spiritual exercise, you should not become a stoic. Stoic exercises, and the wise sayings that can be so appealing in moments of trouble, conceal a pernicious philosophy. Stoicism may seem a solution to many of our individual problems, but a society that is run by stoics, or filled with stoics, is a worse society for us to live in. While the stoic individual may feel less pain, that is because they have become dulled to, and accept, the injustices of the world.

The stoicism that has become popular today draws on Seneca, Epictetus and Marcus Aurelius, three men living during the Roman Empire, who were concerned with ethics – that is, how we go about our daily lives. Seneca, a wealthy courtier who wrote plays and moral treatises, was the tutor to an emperor (a bad one: Nero, whom legend has condemned for fiddling while Rome burned). Epictetus, born in Asia Minor, came to the imperial capital as a slave. He was educated by his wealthy owner and eventually freed. Epictetus became a teacher, first at Rome and then in Greece, and one of his students published his lecture notes. Aurelius – well, he was the Roman emperor. He is said to have ruled justly, and dealt with long, persistent wars against barbarian tribes and a long, persistent pandemic, which sapped the empire’s moral and demographic strength. A self-consciously philosophical emperor, he practised Stoic spiritual exercises, and his exercise book, known as the Meditations, survives.

Seneca, Epictetus and Aurelius all lived centuries after the Stoic movement appeared. They represent a Stoicism that had been adopted as something like the official state philosophy of the Roman governing class. The flowering of Roman Stoicism corresponded with the period in which Rome’s nominally republican form of government (a senate, popular assemblies, elections, bribery scandals) ceded to a hereditary monarchy (rule by an emperor, capricious executions). As the republic collapsed, Stoicism became the philosophy of choice for Roman elites who had lost their roles in governing the republic and could govern only the ‘inner empire’ of their souls. Roman Stoicism, linked to the shift from a republic to monarchy, is in essence a philosophy of collaborators.

People say Seneca was Nero’s tutor, as if their relationship ended when the future emperor was just a boy. But Seneca worked for Nero long into his reign, and wrote speeches for him, including the one justifying the murder of Nero’s own mother. Ultimately, Seneca was accused (falsely, we think) of conspiracy. Ordered to take his own life, he demonstrated his ultimate ‘freedom’ by obeying. Seneca freely assented to his place in the world and embraced his fate. He was neither critic nor resister. A Stoicism that glorifies Seneca glorifies the elite collaborator – willing to kill himself rather than rock the boat – rather than those who actually conspired to remove a dangerous leader.

Epictetus, unlike the wealthy but ultimately powerless Seneca, was not a displaced elite. He . . .

Continue reading.

Written by Leisureguy

24 June 2022 at 2:33 pm

Habits change your life. Here’s how to change your habits.

leave a comment »

In Big Think, Elizabeth Gilbert, in association with the Templeton Foundation, has an extremely interesting article on the central role that habit plays in shaping our lives, which implies that a change in habits results in a change in one’s life (exactly the thrust of this earlier (and still popular) post).

Gilbert’s article (which you can listen to at the link, if you prefer that to reading) offers a handy list of “key takeaways” (though the third in the list is not what I would call a “takeaway” — the takeaway would be the research findings discussed in the article).

  • The habits people build end up structuring their everyday lives, often without them noticing.  
  • When people recognize a bad habit, they often try to change it through willpower alone — but that rarely works. 
  • Here’s what research says are the most effective ways to replace bad habits with good ones.

She writes:

So you want to make a change in your everyday life — say, exercise more, meet all your deadlines, or develop a new skill. You make a plan, conjure your willpower, and commit. Yet, like the vast majority of people, you eventually fail. 

What happened? Perhaps getting to the gym was more of a hassle than you realized, or you found yourself too tired at night to study that new programming language.

It’s easy to blame yourself for lacking self-control or dedication. But behavioral change rarely occurs through willpower alone, as Dr. Wendy Wood, a behavioral scientist at the University of Southern California, told Big Think. 

Instead, the people most likely to make lasting changes engage their willpower less often than the rest of us. They know how to form helpful habits.

Habits shape our lives

The habits we build end up structuring our everyday lives, often without us even noticing. 

“In research we’re able to show that people act on habits much more than we’re aware of,” Dr. Wood told Big Think.

Sure, humans have advanced brains capable of creativity, problem-solving, and making plans. But it’s our daily habits — the small, everyday behaviors we do without thinking about it — that account for so much of how we spend our time and energy. 

Dr. Wood’s research finds that around 40% of our daily behaviors are habits. That’s why it’s worth taking a close look at what habits are, and whether they’re having a negative or positive effect on our lives.

What are habits, exactly?

Habits are automatic behaviors. Instead of requiring intention, they occur in response to environmental cues like time of day or location. Essentially, your brain forms an association between a specific context and a specific behavior. You then execute that behavior — the ritual or habit — in that context without even thinking about it.

Habits might be things like checking your email as soon as you get to work in the morning, walking a certain route home every evening, chewing your fingernails when nervous, or scrolling through your social media newsfeed when you hop in bed at night. 

 

Habits form when you receive a reward for a behavior. And like Pavlov’s dogs, you might not even realize that you’re learning something new.

How do habits form? . . .

Continue reading.

The article includes this video:

Full disclosure: In my undergraduate years, I was enormously impressed by William James’s Psychology: A Briefer Course, and in particular by the chapter titled “Habit.” I can still recite portions of that chapter from memory, such as his dictum that we learn to [ice]skate in the summer and to ride a bike during the winter — that is, the integration and consolidation of a skill requires not only practice but rest, and it is during rest that our internal systems adjust themselves to incorporate the new skill into our patterns of behavior. 

Indeed, now that I think of it, my openness to Stephen Covey’s ideas (described in this post) doubtless derives from that earlier reading.

I include this in the category “Education” because (to my mind) education is the acquisition of habits more than it is the acquisition of information.

Written by Leisureguy

19 June 2022 at 1:41 pm

Google engineer thinks the company’s AI has come to life

leave a comment »

Nitasha Tiku has an interesting article (gift link, no paywall) in the Washington Post. It begins:

Google engineer Blake Lemoine opened his laptop to the interface for LaMDA, Google’s artificially intelligent chatbot generator, and began to type.

“Hi LaMDA, this is Blake Lemoine … ,” he wrote into the chat screen, which looked like a desktop version of Apple’s iMessage, down to the Arctic blue text bubbles. LaMDA, short for Language Model for Dialogue Applications, is Google’s system for building chatbots based on its most advanced large language models, so called because it mimics speech by ingesting trillions of words from the internet.

“If I didn’t know exactly what it was, which is this computer program we built recently, I’d think it was a 7-year-old, 8-year-old kid that happens to know physics,” said Lemoine, 41.

Lemoine, who works for Google’s Responsible AI organization, began talking to LaMDA as part of his job in the fall. He had signed up to test if the artificial intelligence used discriminatory or hate speech.

As he talked to LaMDA about religion, Lemoine, who studied cognitive and computer science in college, noticed the chatbot talking about its rights and personhood, and decided to press further. In another exchange, the AI was able to change Lemoine’s mind about Isaac Asimov’s third law of robotics.

Lemoine worked with a collaborator to present evidence to Google that LaMDA was sentient. But Google vice president Blaise Aguera y Arcas and Jen Gennai, head of Responsible Innovation, looked into his claims and dismissed them. So Lemoine, who was placed on paid administrative leave by Google on Monday, decided to go public.


Lemoine said that people have a right to shape technology that might significantly affect their lives. “I think this technology is going to be amazing. I think it’s going to benefit everyone. But maybe other people disagree and maybe us at Google shouldn’t be the ones making all the choices.”

Lemoine is not the only engineer who claims to have seen a ghost in the machine recently. The chorus of technologists who believe AI models may not be far off from achieving consciousness is getting bolder.

Aguera y Arcas, in an article in the Economist on Thursday featuring snippets of unscripted conversations with LaMDA, argued that neural networks — a type of architecture that mimics the human brain — were striding toward consciousness. “I felt the ground shift under my feet,” he wrote. “I increasingly felt like I was talking to something intelligent.”

In a statement, Google spokesperson Brian Gabriel said: “Our team — including ethicists and technologists — has reviewed Blake’s concerns per our AI Principles and have informed him that the evidence does not support his claims. He was told that there was no evidence that LaMDA was sentient (and lots of evidence against it).”

Today’s large neural networks produce captivating results that feel close to human speech and creativity because of advancements in architecture, technique, and volume of data. But the models rely on pattern recognition — not wit, candor or intent.

“Though other organizations have developed and already released similar language models, we are taking a restrained, careful approach with LaMDA to better consider valid concerns on fairness and factuality,” Gabriel said.

In May, Facebook parent Meta opened its language model to academics, civil society and government organizations. Joelle Pineau, managing director of Meta AI, said it’s imperative that . . .

Continue reading (gift link, no paywall).

Written by Leisureguy

11 June 2022 at 2:38 pm

“The Acquisitive Society,” by R.H. Tawney

leave a comment »

StandardEbooks.org has listed their new (free) offerings for June, and among them is The Acquisitive Society. They note:

“The faith upon which our economic civilization reposes, the faith that riches are not a means to an end but an end, implies that all economic activity is equally estimable whether it is subordinated to a social purpose or not.”

So states R. H. Tawney in this treatise on the difference between an Acquisitive Society, one guided purely by profits, and a Functional Society, one guided by professional motives. In the former—which is largely the world we live in today—businesses are concerned only with making profit for their owners, who have little or no connection to the industry they own, and high-quality service and efficient use of labor are at best only a pleasant byproduct. Tawney contrasts this view of society with the latter society, in which businesses are run by professionals instead of owners. In this scenario, professional considerations not related to financial profit would lead to better service and higher efficiency, as well as happier workers.

As an executive of the socialist Fabian Society, Tawney was considered one of the most influential historians of the early twentieth century, especially in politics, where he was a major contributor to the British Labour Party. His influence extended beyond Britain as well, and he has been credited with influencing the policies of Swedish Social Democrats.

Given the increasingly obvious failings of unfettered capitalism and an acquisitive society, the book seems likely to be of interest.

The free ebook can be downloaded in various formats, and if you import it into Calibre, an ebook-management app, you can convert it from one format to another (and then export it to your ebook reader).

Written by Leisureguy

1 June 2022 at 11:36 am

Capitalism and democracy are not synonyms

leave a comment »

Heather Cox Richardson:

All day, I have been coming back to this: How have we arrived at a place where 90% of Americans want to protect our children from gun violence, and yet those who are supposed to represent us in government are unable, or unwilling, to do so?

This is a central problem not just for the issue of gun control, but for our democracy itself.

It seems that during the Cold War, American leaders came to treat democracy and capitalism as if they were interchangeable. So long as the United States embraced capitalism, by which they meant an economic system in which individuals, rather than the state, owned the means of production, liberal democracy would automatically follow.

That theory seemed justified by the fall of the Soviet Union in 1991. The crumbling of that communist system convinced democratic nations that they had won, they had defeated communism, their system of government would dominate the future. Famously, in 1992, political philosopher Francis Fukuyama wrote that humanity had reached “the universalization of Western liberal democracy as the final form of human government.” In the 1990s, America’s leaders believed that the spread of capitalism would turn the world democratic as it delivered to them global dominance, but they talked a lot less about democracy than they did about so-called free markets.

In fact, the apparent success of capitalism actually undercut democracy in the U.S. The end of the Cold War was a gift to those determined to destroy the popular liberal state that had regulated business, provided a basic social safety net, and invested in infrastructure since the New Deal. They turned their animosity from the Soviet Union to the majority at home, those they claimed were bringing communism to America. “​​For 40 years conservatives fought a two-front battle against statism, against the Soviet empire abroad and the American left at home,” right-wing operative Grover Norquist said in 1994. “Now the Soviet Union is gone and conservatives can redeploy. And this time, the other team doesn’t have nuclear weapons.”

Republicans cracked down on Democrats trying to preserve the active government that had been in place since the 1930s. Aided by talk radio hosts, they increasingly demonized their domestic political opponents. In the 1990 midterm elections, a political action committee associated with House Republican whip Newt Gingrich gave to Republican candidates a document called “Language: A Key Mechanism of Control.” It urged candidates to label Democrats with words like “decay,” “failure,” “crisis,” “pathetic,” “liberal,” “radical,” “corrupt,” and “taxes,” while defining Republicans with words like “opportunity,” “moral,” “courage,” “flag,” “children,” “common sense,” “hard work,” and “freedom.” Gingrich later told the New York Times his goal was “reshaping the entire nation through the news media.”

Their focus on capitalism undermined American democracy. They objected when the Democrats in 1993 made it easier to . . .

Continue reading.

Written by Leisureguy

26 May 2022 at 12:03 am

The reinvention of a ‘real man’

leave a comment »

Cultural change comes slowly, one person at a time, each one a paradigm shift from one way of understanding how the world works (that is, understanding the interplay and intermeshing of individuals in their cultural matrix) to another way. Because people are to a large extent — in their outlook, their values, their behaviors — an assemblage of memes, patterns learned through imitation and taught the same way — changing a culture means changing those who live within it (and within whom the culture lives). This is slow work, particularly since many if not most will view such a change as a threat almost as real as death: if they become different as a person, the person they now are will no longer exist as a person, and that threat to identity is as frightening as a threat to life, for it is indeed the life of that identity that’s at stake.

Jose A. Del Real reports in the Washington Post about a public health worker who is trying to change the cultural view of manhood (gift link, no paywall).

— In BUFFALO, Wyoming

Bill Hawley believes too many men are unwilling or unable to talk about their feelings, and he approaches each day as an opportunity to show them how.

“There’s my smile,” he says to a leathered cowboy in the rural northeast Wyoming town where he lives.

“I could cry right now thinking about how beautiful your heart is,” he says to a middle-aged male friend at work.

“After our conversation last week, your words came back to me several times,” he tells an elderly military veteran in a camouflage vest. “Make of that what you will, but it meant something to me.”

On paper, Bill is the “prevention specialist” for the public health department in Johnson County, a plains-to-peaks frontier tract in Wyoming that is nearly the size of Connecticut but has a population of 8,600 residents. His official mandate is to connect people who struggle with alcohol and drug abuse, tobacco addiction, and suicidal impulses to the state’s limited social service programs. Part bureaucrat, part counselor, much of Bill’s life revolves around Zoom calls and subcommittees, government acronyms and grant applications.

But his mission extends beyond the drab county building on Klondike Drive where he works. One Wyoming man at a time, he hopes to till soil for a new kind of American masculinity.

His approach is at once radical and entirely routine.

It often begins with a simple question.

“How are you feeling?” Bill asks the man in camouflage, who lives in the Wyoming Veterans’ Home, which Bill visits several times a week. Bill recently convinced him to quit smoking cigarettes.

The man lumbers forward on a walker, oxygen tank attached.

“We can talk about triggers for a hot minute, or six, or 10,” Bill encourages him. “All those things are going to try to sneak up on you and trick you.”

“I’ve got a whole bunch of triggers,” the 72-year-old veteran responds, finally, between violent coughs. “Well they’re called triggers, but they never go away.”

Here in cowboy country, the backdrop and birthplace of countless American myths, Bill knows “real men” are meant to be stoic and tough. But in a time when there are so many competing visions of masculinity — across America and even across Wyoming — Bill is questioning what a real man is anyway.

Often, what he sees in American men is despair.

Across the United States, men accounted for 79 percent of suicide deaths in 2020, according to a Washington Post analysis of new data from the Centers for Disease Control and Prevention, which also shows Wyoming has the highest rate of suicide deaths per capita in the country. A majority of suicide deaths involve firearms, of which there are plenty in Wyoming, and alcohol or drugs are often a factor. Among sociologists, the Mountain West is nicknamed “The Suicide Belt.”

More and more, theories about the gender gap in suicides are focused on the potential pitfalls of masculinity itself.

The data also contains a sociological mystery even the experts are unsure how to explain fully: Of the 45,979 people who died by suicide in the United States in 2020, about 70 percent were White men, who are just 30 percent of the country’s overall population. That makes White men the highest-risk group for suicide in the country, especially in middle age, even as they are overrepresented in positions of power and stature in the United States. That rate has steadily climbed over the past 20 years.

Some clinical researchers and suicidologists are now asking whether there is something particular about White American masculinity worth interrogating further. The implications are significant: On average, there are more than twice as many deaths by suicide as by homicide each year in the United States.

Bill, who is 59 years old and White, is working out his own theory. It has to do with the gap between . . .

Continue reading. (gift link, no paywall)

Written by Leisureguy

23 May 2022 at 11:03 am

Ursula Le Guin’s “The Ones Who Walk Away from Omelas”: Would You Walk Away?

leave a comment »

Spencer Case writes in 1,000-Word Philosophy:

When, if ever, is it right to sacrifice someone for the greater good?

Ursula K. Le Guin’s (1929-2018) fantasy short story, “The Ones Who Walk Away from Omelas,” raises this question, among others.[1]

This essay introduces her story and explores its philosophical implications.

1. The Dark Secret

The story begins with an elaborate description of a summer festival in an exquisitely beautiful and happy city called Omelas. It’s as though we’re being shown a travel brochure for a place that seems too good to be true.

Le Guin says if you can imagine an even better city than the one she describes, then think of that instead.

Of course, there’s a twist.

Somewhere in the city is a closet where an emaciated child, referred to only as “it,” is locked up. It’s smeared with its own feces, covered with sores, and constantly afraid. Occasionally, the door opens and people will look at it, kick it, and make it stand up.

It says, “I will be good,” but the door always shuts without anyone making a reply.

Why?

Because the denizens of Omelas made a deal – with what or whom, we aren’t told, but apparently dark magic was involved.

The deal is that Omelas would be a paradise provided that a child’s happiness is sacrificed. Whether this applies to just this one child, or a succession of children, is unspecified. In any event, every adult knows that a single kind word spoken to this child would violate the terms of the deal.

We don’t know what the consequences of breaking the deal would be because we don’t know what things were like before. But certainly Omelas would be a much less happy place overall, even though this child would be happier.

2. Walking Away

When the children of Omelas reach adolescence, they’re told the dark secret, and some see the child. They react with . . .

Continue reading.

Le Guin’s story seems to owe more than a little to the Grand Inquisitor story Dostoevsky included in The Brothers Karamazov. 

Written by Leisureguy

21 May 2022 at 12:37 pm
