Later On

A blog written for those whose interests more or less match mine.

Archive for the ‘History’ Category

The Town That Went Feral

In the New Republic, Patrick Blanchfield reviews a brief history of an effort to put libertarianism into practice in Grafton, NH. (Like all previous attempts, it was an utter failure, and for the same reason: a reliance on mere logic, with no consideration given to experience — as Oliver Wendell Holmes Jr. observed, “The life of the law has not been logic; it has been experience.”)

The review begins:

In its public-education campaigns, the U.S. National Park Service stresses an important distinction: If you find yourself being attacked by a brown or grizzly bear, YES, DO PLAY DEAD. Spread your arms and legs and cling to the ground with all your might, facing downward; after a few attempts to flip you over (no one said this would be easy), the bear will, most likely, leave. By contrast, if you find yourself being attacked by a black bear, NO, DO NOT PLAY DEAD. You must either flee or, if that’s not an option, fight it off, curved claws and 700-psi jaws and all.

But don’t worry—it almost never comes to this. As one park service PSA noted this summer, bears “usually just want to be left alone. Don’t we all?” In other words, if you encounter a black bear, try to look big, back slowly away, and trust in the creature’s inner libertarian. Unless, that is, the bear in question hails from certain wilds of western New Hampshire. Because, as Matthew Hongoltz-Hetling’s new book suggests, that unfortunate animal may have a far more aggressive disposition, and relate to libertarianism first and foremost as a flavor of human cuisine.

Hongoltz-Hetling is an accomplished journalist based in Vermont, a Pulitzer nominee and George Polk Award winner. A Libertarian Walks Into a Bear: The Utopian Plot to Liberate an American Town (and Some Bears) sees him traversing rural New England as he reconstructs a remarkable, and remarkably strange, episode in recent history. This is the so-called Free Town Project, a venture wherein a group of libertarian activists attempted to take over a tiny New Hampshire town, Grafton, and transform it into a haven for libertarian ideals—part social experiment, part beacon to the faithful, Galt’s Gulch meets the New Jerusalem. These people had found one another largely over the internet, posting manifestos and engaging in utopian daydreaming on online message boards. While their various platforms and bugbears were inevitably idiosyncratic, certain beliefs united them: that the radical freedom of markets and the marketplace of ideas was an unalloyed good; that “statism” in the form of government interference (above all, taxes) was irredeemably bad. Left alone, they believed, free individuals would thrive and self-regulate, thanks to the sheer force of “logic,” “reason,” and efficiency. For inspirations, they drew upon precedents from fiction (Ayn Rand loomed large) as well as from real life, most notably a series of micro-nation projects ventured in the Pacific and Caribbean during the 1970s and 1980s.

None of those micro-nations, it should be observed, panned out, and things in New Hampshire don’t bode well either—especially when the humans collide with a newly brazen population of bears, themselves just “working to create their own utopia,” property lines and market logic be damned. The resulting narrative is simultaneously hilarious, poignant, and deeply unsettling. Sigmund Freud once described the value of civilization, with all its “discontents,” as a compromise product, the best that can be expected from mitigating human vulnerability to “indifferent nature” on one hand and our vulnerability to one another on the other. Hongoltz-Hetling presents, in microcosm, a case study in how a politics that fetishizes the pursuit of “freedom,” both individual and economic, is in fact a recipe for impoverishment and supercharged vulnerability on both fronts at once. In a United States wracked by virus, mounting climate change, and ruthless corporate pillaging and governmental deregulation, the lessons from one tiny New Hampshire town are stark indeed.

“In a country known for fussy states with streaks of independence,” Hongoltz-Hetling observes, “New Hampshire is among the fussiest and the streakiest.” New Hampshire is, after all, the Live Free or Die state, imposing neither an income nor a sales tax, and boasting, among other things, the highest per capita rate of machine gun ownership. In the case of Grafton, the history of Living Free—so to speak—has deep roots. The town’s Colonial-era settlers started out by ignoring “centuries of traditional Abenaki law by purchasing land from founding father John Hancock and other speculators.” Next, they ran off Royalist law enforcement, come to collect lumber for the king, and soon discovered their most enduring pursuit: the avoidance of taxes. As early as 1777, Grafton’s citizens were asking their government to be spared taxes and, when they were not, just stopped paying them.

Nearly two and a half centuries later, Grafton has become something of a magnet for seekers and quirky types, from adherents of the Unification Church of the Reverend Sun Myung Moon to hippie burnouts and more. Particularly important for the story is one John Babiarz, a software designer with a Krusty the Klown laugh, who decamped from Big-Government-Friendly Connecticut in the 1990s to homestead in New Hampshire with his equally freedom-loving wife, Rosalie. Entering a sylvan world that was, Hongoltz-Hetling writes, “almost as if they had driven through a time warp and into New England’s revolutionary days, when freedom outweighed fealty and trees outnumbered taxes,” the two built a new life for themselves, with John eventually coming to head Grafton’s volunteer fire department (which he describes as a “mutual aid” venture) and running for governor on the libertarian ticket.

Although John’s bids for high office failed, his ambitions remained undimmed, and in 2004 he and Rosalie connected with . . .

Continue reading.

Written by Leisureguy

26 November 2022 at 6:15 pm

The origins of the US Thanksgiving holiday

Heather Cox Richardson has a nice explainer on the origins of US Thanksgiving. She writes:

The past week has brought seven mass shootings in the United States. Twenty-two people have been killed and 44 wounded. I’ll have more to say later about our epidemic of gun violence, but tonight, on the night before Thanksgiving, when I traditionally post the story of the holiday’s history, I simply want to acknowledge the terrible sorrow behind tomorrow’s newly empty chairs.

Thanksgiving itself came from a time of violence: the Civil War.

The Pilgrims and the Wampanoags did indeed share a harvest celebration together at Plymouth in fall 1621, but that moment got forgotten almost immediately, overwritten by the long history of the settlers’ attacks on their Indigenous neighbors.

In 1841 a book that reprinted the early diaries and letters from the Plymouth colony recovered the story of that three-day celebration in which ninety Indigenous Americans and the English settlers shared fowl and deer. This story of peace and goodwill among men who by the 1840s were more often enemies than not inspired Sarah Josepha Hale, who edited the popular women’s magazine Godey’s Lady’s Book, to think that a national celebration could ease similar tensions building between the slaveholding South and the free North. She lobbied for legislation to establish a day of national thanksgiving.

And then, on April 12, 1861, southern soldiers fired on Fort Sumter, a federal fort in Charleston Harbor, and the meaning of a holiday for giving thanks changed.

Southern leaders wanted to destroy the United States of America and create their own country, based not in the traditional American idea that “all men are created equal,” but rather in its opposite: that some men were better than others and had the right to enslave their neighbors. In the 1850s, convinced that society worked best if a few wealthy men ran it, southern leaders had bent the laws of the United States to their benefit, using them to protect enslavement above all.

In 1860, northerners elected Abraham Lincoln to the presidency to stop rich southern enslavers from taking over the government and using it to cement their own wealth and power. As soon as he was elected, southern leaders pulled their states out of the Union to set up their own country. After the firing on Fort Sumter, Lincoln and the fledgling Republican Party set out to end the slaveholders’ rebellion.

The early years of the war did not go well for the U.S. By the end of 1862, the armies still held, but people on the home front were losing faith. Leaders recognized the need both to acknowledge the suffering and to keep Americans loyal to the cause. In November and December, seventeen state governors declared state thanksgiving holidays.

New York governor Edwin Morgan’s widely reprinted proclamation about the holiday reflected that the previous year “is numbered among the dark periods of history, and its sorrowful records are graven on many hearthstones.” But this was nonetheless a time for giving thanks, he wrote, because . . .

Continue reading.

Written by Leisureguy

24 November 2022 at 9:29 am

When Lethal Weapons Grew on Trees

Man on beach in primitive apparel, aiming a bow with an arrow whose length is greater than the height of the man.
Tanimbar islander in leather armor with a very large bow and arrow, Dutch East Indies. Source unknown.

Kris De Decker in Low-Tech Magazine (subtitle: Doubts on progress and technology) takes a look at what we know of the history of the bow and arrow. The (lengthy and well-illustrated) article begins:

Many bows and arrows ago

The bow is one of humanity’s most essential and fascinating technologies, perhaps only eclipsed by the controlled use of fire. Despite endless academic speculation on the subject for almost 200 years, we don’t know when archery originated. [1] Bows and arrows were made from organic materials, which do not preserve for long. The oldest archaeological finds come from peat bogs, glaciers, and water-logged lake sediments – oxygen-free environments that prevent organic materials from decaying. [2] In the 1930s, in Stellmoor, Germany, archaeologists found roughly 100 arrow shafts dated to between 8,000 and 10,000 BC. [3] The oldest bow came to light in the 1940s in Holmegaard, Denmark. Scientists dated it to between 6,500 and 7,000 BC.

The bow and arrow are much older than these records indicate. One reason is that prehistoric bows were of a very sophisticated design, a point we return to later. Second, archaeologists have unearthed much older projectile points. The arrowhead is the only part of the bow and arrow made of inorganic material and thus preserves much longer. However, it can be hard to distinguish arrowheads from projectile points used with other weapons, most notably the spearthrower or atlatl. [4-5] While keeping this in mind, some studies have pushed back the date for the first bow and arrow use to between 35,000 and 70,000 years ago. [6] But even arrowheads cannot tell us the whole story because fire-hardened wooden points may have preceded bone and stone points.

Human powered springs

In mechanical terms, the bow is a spring made up of two flexible, elastic limbs held under tension by a string. When the archer pulls the string back, energy accumulates in the bow. When the archer releases the string, the energy transmits to the arrow, which flies out of the bow. The bow is a highly efficient technology: the arrow’s kinetic energy (usable energy) is close to the total energy expended. [7][8] Arrows are also very efficient, much more so than bullets: they lose little speed in flight and require little energy to penetrate a target. [9]

The bow and arrow is a missile (or ranged) weapon for striking from a distance. Simple missile weapons are launched using unassisted bodily force, for example, thrown stones, throw sticks, or hand-cast spears (“javelins”). Complex missile weapons interpose a launcher between the human and the missile. Such weapon systems include the bow as well as the sling, the blowgun, the spearthrower, and the firearm. [4] In the hands of a skillful and muscular archer, the (pre)historical bow was a powerful and accurate weapon. The firearm replaced the bow because it was easier to use, not because it was technically superior. [9]

Diversity of bow designs

Our forebears have used the bow and arrow on every continent except Australia (where spearthrower and throw stick prevailed) and Antarctica. The large geographical distribution and long history led to a wide diversity of bow designs determined by the local circumstances – the available materials and tools, the landscape, the climate, the use of the weapon, the social context, and so on. All bows consisted of a stave and a string, but the materials, dimensions, forms, shooting styles, and other features varied considerably. [10-11] That is not the case with modern firearms, which are the same everywhere.

Essentially, there are two types of bows, opposites on a scale: the . . .

Continue reading. There’s much more.

Written by Leisureguy

23 November 2022 at 5:18 pm

How Coffee Fueled Revolutions—and Revolutionary Ideas

Jessica Pearce Rotondi writes in History:

Sultan Murad IV decreed death to coffee drinkers in the Ottoman Empire. King Charles II dispatched spies to infiltrate London’s coffeehouses, which he saw as the original source of “false news.” During the Enlightenment, Voltaire, Rousseau and Isaac Newton could all be found talking philosophy over coffee. The cafés of Paris sheltered revolutionaries plotting the storming of the Bastille and, later, served as the places where authors like Simone de Beauvoir and Jean-Paul Sartre plotted their latest books.

History is steeped in ideas sparked over cups of coffee. Here’s a rundown of the revolutionary power of the commonplace café.

The First Coffee House Opens in the Ottoman Empire

Coffee houses began in the Ottoman Empire. Since liquor and bars were off-limits to most practicing Muslims, coffeehouses provided an alternative place to gather, socialize and share ideas. Coffee’s affordability and egalitarian structure—anyone could come in and order a cup—eroded centuries of social norms. Not everyone was pleased by this change.

In 1633, Sultan Murad IV decreed that the consumption of coffee was a capital offense. Murad IV’s brother and uncle had been killed by janissaries, infantry units who were known to frequent cafés. The sultan was so dedicated to catching coffee sippers in the act that he allegedly disguised himself as a commoner and prowled Istanbul, decapitating offenders with his hundred-pound broadsword.

Ottoman sultans issued and retracted coffeehouse bans well into the 18th century to prevent the gathering of dissidents. But by then, . . .

Continue reading.

Written by Leisureguy

22 November 2022 at 8:11 pm

The library of Alexandria and its reputation

Peter Gainsford writes in the Kiwi Hellenist:

Many people are aware that the library of Alexandria is hugely overblown. Sure, there’ll always be people insisting that it was a magical place that held the secrets of Göbekli Tepe, Doggerland, and blond blue-eyed Europeans building pyramids in Mexico and Bolivia: there’s no point engaging with people like that. The thing is, pretty much everyone has heard of it.

Last week the History subreddit paid some attention to a piece I wrote in 2015 dispelling some myths about the Alexandrian library. Which is nice. Some people misread it and thought I was claiming it was true that ‘the burning of the library of Alexandria was “the most destructive fire in the history of human culture”’. That’s a pity, but understandable. (One reader was angry at my claiming to be a Kiwi and a hellenist: that was entertaining.)

On a more serious note, several readers pointed out that there were other library losses in history that were far more destructive. And that’s absolutely correct. Any time books are destroyed that don’t exist in other copies in other libraries, that’s a catastrophic and irreversible loss.

You can argue about whether specific incidents belong in this category. The destruction of the House of Wisdom in Baghdad in 1258 didn’t exactly put an end to the Abbasid knowledge economy and book culture, any more than the Alexandrian fire did in hellenised Egypt.

But some tragedies really are catastrophically destructive. The fire at the . . .

Continue reading.

Written by Leisureguy

22 November 2022 at 6:53 pm

Posted in Books, Daily life, History

Source of the problem

Written by Leisureguy

21 November 2022 at 4:48 pm

A Revised “Ostrom’s Design Principles for Collective Governance of the Commons”

Christopher Allen writes at Life with Alacrity:

The traditional economic definition of “the commons” is those resources that are held in common and not privately owned. This is closely related to the economic concept of public goods, which are goods that are both non-excludable (in that individuals cannot be effectively excluded from use) and non-rivalrous (where use by one individual does not reduce availability to others).

My own personal definition for the commons is broader — any regenerative, self-organizing complex system that can be drawn upon for deep wealth. These can include traditional commons, such as lumber, fish, etc., but can also include other regenerative systems such as communities, markets, intellectual property, etc.

In 2009, Elinor Ostrom received the Nobel Prize in Economics for her “analysis of economic governance, especially the commons.” In that work, she listed 8 principles for effectively guarding against the tragedy of the commons.

However, I’ve found her original words — as well as many adaptations I’ve seen since — to be not very accessible. Also, since the original release of the list of 8 principles, research has resulted in updates and clarifications to her original list.

I also wanted to generalize her principles for broader use given my broader definition of the commons, and apply them to everything from how to manage an online community to how a business should function with competitors. I went to the original version, more contemporary research, and a number of modern adaptations to define a new set of design principles.

My first draft divided up her 8 principles into 10. I then ran this list by a number of experts in this field, and here is my second draft based on their feedback. This draft in fact breaks up her principles into 12 different ones, but I have retained her old numbering system, as a large number of works refer to the original 8. In addition, there appear to be some differences of opinion on number 8, so I’ve included two variations.

Ostrom’s Design Principles for Collective Governance of the Commons


How to Avoid the Tragedy of the Commons within Self-Organizing Systems

1A. DEFINE AUTHORIZED USE: The community of those who have the right to use the common resource is clearly defined.

1B. DEFINE COMMONS BOUNDARIES: The boundaries of the commons are clearly defined so as to separate the usage rules from the larger environment.

2A. MAKE COSTS PROPORTIONAL: Costs for using and maintaining the commons are proportional to the benefits that those users receive from the commons.

2B. PAY ALL COSTS: People that use the commons keep costs inside the local system as much as possible. They do not externalize costs either to neighbors or future generations.

3A. DECIDE . . .

Continue reading.

See also a very interesting article by Eula Biss in the New Yorker, “The Theft of the Commons” (no paywall). It begins:

On the train to Laxton I was facing backward, heading south from Scotland, with the fields of England rushing away from me. I searched their dark creases and their uneven hedges for something I didn’t know how to see, something I wasn’t even certain was visible. I was trying to locate the origins of private property, a preposterous pursuit. There in those hedges, I was looking for a living record of enclosure, the centuries-long process by which land once collectively worked by the landless was claimed by the landed. That land already belonged to the landed, in the old sense of ownership, but it had always been used by the landless, who belonged to the land. The nature of ownership changed within the newly set hedges of an enclosed field, where the landowner now had the exclusive right to dictate how the land was used, and no one else belonged there.

From my backward-facing seat, I saw a long stone wall on the crest of a cliff. “The Wall,” John Berger writes, “is the front line of what, long ago, was called the Class War.” Walls, fences, hedges, and ditches were all used to mark the boundaries of enclosed land, so that sheep could be kept there, or some other profit could be pursued. Enclosure is how nearly all the agricultural land in Britain came to be owned by less than one per cent of the population. In “The Making of the English Working Class,” the historian E. P. Thompson writes that enclosure was “a plain enough case of class robbery, played according to fair rules of property and law laid down by a parliament of property-owners and lawyers.”

The pilgrims who sailed on the Mayflower were not property owners but economic migrants financed by property owners. They were also communists, in that they agreed to work communally and share the profits of their labor for the first seven years of their settlement, though that agreement did not last beyond the first year. They settled on land held by the Wampanoag people, who did not practice the absolute ownership of land. Among the Wampanoag, rights to use the same plot of land could overlap, so that one family might hold the right to fish in a stream and another might hold the right to farm the banks of that stream. Usage rights could be passed down from mothers to daughters, but the land itself could not be possessed.

I once saw some old suitcases lying open on a museum floor, each full of living sod, the work of the South African artist Kemang Wa Lehulere. His art, the museum catalogue explained, was an effort to reimagine deleted scenes from history. Enclosure would seem to be one such scene. Deleted, perhaps, because it unfolded so slowly, in the course of about five hundred years. It began in the Middle Ages and was completed by acts of Parliament in the eighteenth and nineteenth centuries. This land revolution set the stage for the Industrial Revolution. Enclosure, Marx argued, is what produced the landless wage workers who became the proletariat. Historians disagree on that, so it is safer to say that enclosure produced Romantic poetry, a literature marked by nostalgia for a lost world. “All sighed,” the landless poet John Clare wrote, “when lawless law’s enclosure came.”

Clare was a homesick poet, always trying to write himself back to the open fields of his childhood. “Unbounded freedom ruled the wandering scene,” he wrote, “Nor fence of ownership crept in between.” After writing four books of poetry, he was certified insane and admitted to an asylum. But he absconded from the asylum and walked eighty miles back to the place he was from. He slept in ditches and ate grass and believed he was going back to his first love, who was no longer alive. Ever since I saw those suitcases on the museum floor, Clare has walked eighty miles through my mind carrying a suitcase of his native sod.

Laxton is the one remaining village in England that was never enclosed, and where tenant farmers still work the land coöperatively, as they have for at least the past seven hundred years. They use the open-field system, cultivating crops on narrow strips of land that follow the curvature of the hills. There are no hedges or fences between these strips, and working them requires collaboration among the farmers.

In the time before enclosure, shared pastures where . . .

Continue reading. (no paywall)

Written by Leisureguy

20 November 2022 at 5:26 pm

The complicated business legacy of GE’s Jack Welch

My view is that Jack Welch not only ate away at the foundation of GE, leading to its subsequent collapse, but also did the same sort of damage to capitalism itself, as a major contributor to the era of hypercapitalism.

In Fast Company, Kaushik Viswanath reviews (no paywall) William D. Cohan’s book Power Failure, a history of General Electric (and thus a close look at Jack Welch). Viswanath writes:

Plenty of ink has been spilled on General Electric, the storied 130-year-old conglomerate that had its origins with Thomas Edison. With his new book Power Failure, William D. Cohan, a journalist and author of several books on American business, adds nearly 800 pages to this corpus, with the aim of delivering the definitive and final history of the company. And there is an air of finality to it: Although GE survives today, it has fallen far from the status it held in the ’90s as the world’s most valuable company, and awaits being broken up into three separate companies next year.

Cohan tells the story chronologically, beginning with the founding of the company in the 19th century. He shines a light on a figure whose talent for business gets overshadowed in most other accounts by Edison’s genius for invention: Charles Albert Coffin. In 1883, Coffin, the CEO of a Massachusetts shoe company, bankrolled a struggling maker of dynamos in town. With Coffin’s guidance and capital, Thomson-Houston Electric Company achieved success, becoming a formidable competitor to the Edison General Electric Company. Coffin then conspired with Edison’s financial backers, including J.P. Morgan, to merge the two companies against Edison’s wishes, leaving a furious Edison with a mere 10% stake—and Coffin the president of the new General Electric Company, formed in 1892.

The early history of GE is fascinating not only for . . .

Continue reading. (no paywall)

Written by Leisureguy

17 November 2022 at 9:52 am

The complexity of J. Edgar Hoover

People are more complicated than we want them to be, particularly when we are feeling judgmental. Kai Bird has a fascinating review (no paywall) of a new biography of J. Edgar Hoover in the Washington Post, which begins:

On Oct. 7, 1964, President Lyndon Johnson’s longtime aide Walter Jenkins walked into the YMCA near the White House after a party at the Newsweek magazine office and had sex in the bathroom with a homeless Army veteran. The vice squad arrested Jenkins, booked him and released him. A week later, the story made headlines on the eve of the presidential election that pitted Johnson against Republican Barry Goldwater. By then, a near-suicidal Jenkins had checked into George Washington University Hospital and the Republicans were “punching hard,” writes Beverly Gage in “G-Man,” her masterful account of the life and controversial career of FBI Director J. Edgar Hoover. The Goldwater campaign demanded to know if Jenkins’s conduct had compromised national security. Forced to act, Johnson ordered Hoover, his old friend and onetime neighbor, to investigate the scandal. Hoover was annoyed. This was politics, and for decades he had tried to insulate the FBI from partisan politics. But he did what he was told to do by his president.

It turned out that Jenkins, the father of six children, had been arrested in the same bathroom five years earlier. Johnson was astonished that Jenkins could have hidden his proclivities. Hoover was not. He thought such temptations were commonplace. Four days into the investigation he told Johnson that Jenkins had been under enormous stress and required medical attention. The FBI chief had already sent a bouquet of flowers to Jenkins’s hospital room. Attached was a sympathy card wishing him a speedy recovery. “With less than two weeks to go before the election,” Gage writes, “Hoover issued a report absolving Jenkins of any national security violations,” and on Election Day, Johnson rolled to victory in one of the nation’s biggest presidential landslides.

In Gage’s biography, Hoover emerges as a strangely tortured man who wielded power within the Justice Department for an astonishing 48 years. His response to Jenkins revealed a softer side and, Gage explains, raised an “innuendo that Hoover might have more in common with Jenkins than he wished to acknowledge.” In a memo, Hoover wrote that he liked Jenkins and felt sorry for him. “It is a pitiful case,” he observed, “and I think it is time for people to follow the admonition of the Bible about persons throwing the first stone and that none are without sin.”

Hoover’s story illustrates the unique power of biography to enter the life of another human being. The genre can provoke a rare response: It can persuade one to change one’s mind. This magical leap can happen when a good biographer is able to seduce the reader into understanding another soul. “G-Man” is Gage’s first biography, and she turns out to be a marvelous biographer.

After reading Gage, I have changed my mind about Hoover. He is not the caricature villain I thought I knew when I came of age in the turbulent 1960s. Hoover was a man of profound contradictions. While he had enough empathy to send flowers to Jenkins, he also orchestrated . . .

Continue reading. (no paywall)

Written by Leisureguy

13 November 2022 at 5:59 pm

“How an 18th-Century Philosopher Helped Solve My Midlife Crisis”

Alison Gopnik has a very interesting essay (no paywall) in the Atlantic:

In 2006, I was 50—and I was falling apart.

Until then, I had always known exactly who I was: an exceptionally fortunate and happy woman, full of irrational exuberance and everyday joy.

I knew who I was professionally. When I was 16, I’d discovered cognitive science and analytic philosophy, and knew at once that I wanted the tough-minded, rigorous, intellectual life they could offer me. I’d gotten my doctorate at 25 and had gone on to become a professor of psychology and philosophy at UC Berkeley.

I knew who I was personally, too. For one thing, I liked men. I was never pretty, but the heterosexual dance of attraction and flirtation had always been an important part of my life, a background thrum that brightened and sharpened all the rest. My closest friends and colleagues had all been men.

More than anything, though, I was a mother. I’d had a son at 23, and then two more in the years that followed. For me, raising children had been the most intellectually interesting and morally profound of experiences, and the happiest. I’d had a long marriage, with a good man who was as involved with our children as I was. Our youngest son was on his way to college.

I’d been able to combine these different roles, another piece of good fortune. My life’s work had been to demonstrate the scientific and philosophical importance of children, and I kept a playpen in my office long after my children had outgrown it. Children had been the center of my life and my work—the foundation of my identity.

And then, suddenly, I had no idea who I was at all.

My children had grown up, my marriage had unraveled, and I decided to leave. I moved out of the big, professorial home where I had raised my children, and rented a room in a crumbling old house. I was living alone for the first time, full of guilt and anxiety, hope and excitement.

I fell in love—with a woman, much to my surprise—and we talked about starting a new life together. And then my lover ended it.

Joy vanished. Grief took its place. I’d chosen my new room for its faded grandeur: black-oak beams and paneling, a sooty brick fireplace in lieu of central heating. But I hadn’t realized just how dark and cold the room would be during the rainy Northern California winter. I forced myself to eat the way I had once coaxed my children (“just three more bites”), but I still lost 20 pounds in two months. I measured each day by how many hours had gone by since the last crying jag (“There now, no meltdowns since 11 this morning”).

I couldn’t work. The dissolution of my own family made the very thought of children unbearable. I had won a multimillion-dollar grant to investigate computational models of children’s learning and had signed a contract to write a book on the philosophy of childhood, but I couldn’t pass a playground without tears, let alone design an experiment for 3-year-olds or write about the moral significance of parental love.

Everything that had defined me was gone. I was no longer a scientist or a philosopher or a wife or a mother or a lover.

My doctors prescribed Prozac, yoga, and meditation. I hated Prozac. I was terrible at yoga. But meditation seemed to help, and it was interesting, at least. In fact, researching meditation seemed to help as much as actually doing it. Where did it come from? Why did it work?

I had always been curious about Buddhism, although, as a committed atheist, I was suspicious of anything religious. And turning 50 and becoming bisexual and Buddhist did seem far too predictable—a sort of Berkeley bat mitzvah, a standard rite of passage for aging Jewish academic women in Northern California. But still, I began to read Buddhist philosophy.

In 1734, in Scotland, a 23-year-old was falling apart.

As a teenager, he’d thought . . .

Continue reading. (no paywall)

Written by Leisureguy

11 November 2022 at 9:23 pm

Using the Astronomicum Caesareum Book

leave a comment »

Written by Leisureguy

3 November 2022 at 3:43 pm

Where Will This Political Violence Lead? Look to the 1850s.

leave a comment »

In Politico Joshua Zeitz looks to US history and notes a recurring refrain of political violence from conservative minorities:

Early Friday morning, an intruder broke into the San Francisco home of House Speaker Nancy Pelosi and bludgeoned her husband, Paul Pelosi, 82, on the head with a hammer.

Details are still scant, but early indications suggest that the suspect, David Depape, is an avid purveyor of anti-Semitic, QAnon and MAGA conspiracy theories. Before the attack, the assailant reportedly shouted, “Where is Nancy? Where is Nancy?”

This is the United States of America in 2022. A country where political violence — including the threat of political violence — has become a feature, not a bug.

Armed men wearing tactical gear and face coverings outside ballot drop boxes in Arizona. Members of Congress threatening to bring guns onto the House floor — or actually trying to do it. Prominent Republican members of Congress, and their supporters on Fox News, stoking violence against their political opponents by accusing them of being pedophiles, terrorists, and groomers — of conspiring with “globalists” (read: Jews) to “replace” white people with immigrants.

And of course, January 6, and subsequent efforts by Republicans and conservative media personalities to whitewash or even celebrate it.

Pundits like to take refuge in the saccharine refrain, “this is not who we are,” but historically, this is exactly who we are. Political violence is an endemic feature of American political history. It was foundational to the overthrow of Reconstruction in the 1870s and the maintenance of Jim Crow for decades after.

But today’s events bear an uncanny resemblance to an earlier decade — the 1850s, when Southern Democrats, the conservatives of their day, unleashed a torrent of violence against their opponents. It was a decade when an angry and entrenched minority used force to thwart the will of a growing majority, often with the knowing support and even participation of prominent elected officials.

That’s the familiar part of the story. The less appreciated angle is how that growing majority eventually came to accept the proposition that force was a necessary part of politics.

The 1850s were a singularly violent era in American politics. Though politicians both North and South, Whig and Democrat, tried to contain sectional differences over slavery, Southern Democrats and their Northern sympathizers increasingly pushed the envelope, employing coercion and violence to protect and spread the institution of slavery.

It began with the Fugitive Slave Act of 1850, which stripped accused runaways of their right to trial by jury and allowed individual cases to be bumped up from state courts to special federal courts. As an extra incentive to federal commissioners adjudicating such cases, it provided a $10 fee when a defendant was remanded to slavery but only $5 for a finding rendered against the slave owner. Most obnoxious to many Northerners, the law stipulated harsh fines and prison sentences for any citizen who refused to cooperate with or aid federal authorities in the capture of accused fugitives. Southern Democrats enforced the law with brute force, to the horror of Northerners, including many who did not identify as anti-slavery.

The next provocation was the Kansas Nebraska Act of 1854, which effectively abrogated the Missouri Compromise and opened the western territories to slavery. It wasn’t enough that Democrats rammed through legislation allowing the citizens of the Kansas and Nebraska territories to institutionalize slavery if they voted to do so in what had long been considered free territory. They then employed coercion and violence to rig the territorial elections that followed.

Though anti-slavery residents far outnumbered pro-slavery residents in Kansas, heavily armed “Border ruffians,” led by Missouri’s Democratic senator David Atchison, stormed the Kansas territory by force, stuffing ballot boxes, assaulting and even killing Free State settlers, in a naked attempt to tilt the scales in favor of slavery. “You know how to protect your own interests,” Atchison cried. “Your rifles will free you from such neighbors. … You will go there, if necessary, with the bayonet and with blood.” He promised, “If we win, we can carry slavery to the Pacific Ocean.”

The violence made it into Congress. When backlash against the Kansas Nebraska Act upended the political balance, driving anti-slavery Democrats and Whigs into the new, anti-slavery Republican party, pro-slavery Democrats responded with rage. In 1856,  . . .

Continue reading.

Written by Leisureguy

30 October 2022 at 11:29 am

Republicans seem ignorant of historical facts

leave a comment »

I do understand that the particular Republican discussed is more likely deceitful instead of (or as well as) ignorant. Heather Cox Richardson:

This week, news broke that as a guest on the right-wing Real America’s Voice media network in 2020, Republican candidate for Michigan governor Tudor Dixon said that the Democrats have planned for decades to topple the United States because they have not gotten over losing the Civil War. According to Dixon, Democrats don’t want anyone to know that white Republicans freed the slaves, and are deliberately strangling “true history.”

Dixon’s was a pure white power rant, but she was amplifying a theme we hear a lot these days: that Democrats were the party of enslavement, Republicans pushed emancipation, and thus the whole idea that Republican policies today are bad for Black Americans is disinformation.

In reality, the parties have switched sides since the 1850s. The shift happened in the 1960s, and it happened over the issue of race. Rather than focusing on party names, it makes more sense to follow two opposed strands of thought, equality and hierarchy, as the constants.

By the 1850s it was indeed primarily Democrats who backed slavery. Elite southern enslavers gradually took over first the Democratic Party, then the southern states, and finally the U.S. government. When it looked in 1854 as if they would take over the entire nation by spreading slavery to the West—thus overwhelming the free states with new slave states—northerners organized to stand against what they called the “Slave Power.”

In the mid-1850s, northerners gradually came together as a new political party. They called themselves “Republicans,” in part to recall Jefferson’s political party, which was also called the Republican party, even though Jefferson by then was claimed by the Democrats.

The meaning of political names changes.

The new Republican Party first stood only for opposing the Slave Power, but by 1859, Lincoln had given it a new ideology: it would stand behind ordinary Americans, rather than the wealthy enslavers, using the government to provide access to resources, rather than simply protecting the wealthy. And that would mean keeping slavery limited to the American South.

Prevented from imposing their will on the U.S. majority, southern Democrats split from their northern Democratic compatriots and tried to start a new nation based on racial slavery. They launched the Civil War.

At first, most Republicans didn’t care much about enslaved Americans, but by 1863 the war had made them come around to the idea that the freedom of Black Americans was crucial to the success of the United States. At Gettysburg in 1863, Lincoln reinforced the principles of the Declaration of Independence and dedicated the nation to a “new birth of freedom.” In 1865 the Republican Congress passed and sent off to the states for ratification the Thirteenth Amendment to the Constitution, ending enslavement except as punishment for crime (we really need to fix that, by the way).

After the war, as southern Democrats organized to reinstate white supremacy in their states, Republicans in 1868 added the Fourteenth Amendment, giving the federal government power to guarantee that states could not deny equal rights to American citizens, and then in 1870 the Fifteenth Amendment, guaranteeing Black men the right to vote. They also established the Department of Justice to defend those rights. But by 1871, white Republicans were backing away from federal protection of Black Americans.

Democrats continued to push white supremacy until 1879, when  . . .

Continue reading.

Written by Leisureguy

29 October 2022 at 8:11 pm

The Wide Angle, by David Troy

leave a comment »

David Troy now has a regular column, “The Wide Angle,” in the Washington Spectator. His initial column begins:

In June, The Washington Spectator published my long-form investigation into the complicated history behind the January 6th insurrection, Paranoia on Parade. Covering nearly a century, the piece was the result of several years’ worth of collaborative research, looking into root causes and obscure movements that busy reporters at our daily papers understandably have little time to consider.

At the invitation of Hamilton Fish, editor of the Spectator, I’m now also looking forward to publishing a monthly column here, where I can share insights and analysis in something closer to real time, while also pursuing long-form, sense-making investigative work. And I’m indebted both to Ham and to my network of research partners, who make any of this work possible.

I did not arrive in the world of investigative journalism intentionally. My educational background includes a focus on both history and computer science, and I have been a professional technology entrepreneur since I was a teenager — now several decades ago. I became involved with online culture and the internet in the 1980s. I started and successfully exited several technology businesses and have been fortunate to be able to pursue a variety of projects I find challenging and worthwhile.

Since 2007, I have been heavily involved in analyzing data from social media platforms, such as Twitter and LinkedIn. For the last five years, I’ve focused on research and journalism with the intent of countering threats to our democracy. I’ve worked with many journalists and researchers to help document and counter ongoing threats in the information environment.

I’ll be focusing on keeping you up to date on emerging stories and trends that other outlets may not have the patience or capacity to cover, with a particular focus on irregular warfare, networked insurgency, and the alternate belief systems that animate these phenomena. Many of the stories my team and I are following derive from empirical network analysis. Our practice has been to let research, data, and evidence take us to the story — rather than the other way around.

As we head into the midterms and the eighth month of Vladimir Putin’s disastrous and cruel war in Ukraine, Americans are distracted by the team-sports style of coverage that most journalism delivers around election time. While that’s understandable, my colleagues and I mostly have our eyes elsewhere.

The anti-democracy forces we saw on display on January 6th, which included individuals connected to Putin’s regime, Falun Gong, the Moonies, and a variety of domestic anti-government (and historically anti-communist) networks have not faded away, but rather are aligning globally.

As Russia’s military continues to falter in Ukraine, Putin’s tactics have become increasingly desperate with forced mobilization, referenda held effectively at gunpoint, and illegal annexations that the international community has mostly refused to recognize.

Elon Musk has become part of Russia’s propaganda thrust, serving as a proxy voice for the Kremlin, suggesting that Ukraine simply “compromise” — or risk the eruption of nuclear war. Tensions are heightening with North Korea, and propaganda channels are also suggesting that conflict between China and Taiwan is imminent.

Putin is also focused on establishing a new global economic bloc. Their intention is to pull together . . .

Continue reading.

I thought the characterization of Elon Musk as a useful idiot was spot on. Musk’s commentary on international politics shows the Dunning-Kruger effect in action.

Written by Leisureguy

19 October 2022 at 4:57 pm

How NATO Solves Its Abandonment Problem

leave a comment »

This video nicely explains some of the mechanisms that keep international agreements in place.

Written by Leisureguy

19 October 2022 at 10:02 am

List of common misconceptions

leave a comment »

Written by Leisureguy

16 October 2022 at 5:55 am

Mutual entrapment: Heather and humans

leave a comment »

Heather in bloom. Photo supplied by article author.

Mette Løvschal, professor of archaeology in the School of Culture and Society at Aarhus University in Denmark, has a fascinating article in Aeon, which begins:

Steady blows from flint axes echo through the forest. Each strike cuts deeper into the layers of a tree trunk, rhythmically interrupting the chattering of birds and the hum of swarming insects. Someone is calling, but their voice is drowned out by cracking in the underwood – another tree is falling. We are watching the making of a forest clearing, 5,000 years ago, somewhere near western Jutland in modern-day Denmark.

Beneath a dense weave of oak, hazel, elm and ash is a landscape of gentle hills. We can barely make out their contours through the trees. When we move, we follow passageways made by trunks and branches: here, impassable; there, pliable and yielding. As more trees fall, light pours onto wildlife and plants in the underwood. Down here, among sun-starved grasses and shrubs, is a little plant that will one day be called hæddre or heather. The smell of its burning leaves fills the air.

Deeper in the woods, nomadic Neolithic pastoralists are setting fires. Grey smoke hangs beneath the crowns of standing trees. The fire crackles. Oil in the heather’s shiny leaves – lustrous with fire-loving resin – bursts when ignited, and its dormant seeds begin to awaken in the flames. It is as if the plant is quietly whistling: ‘Burn me!’ Next year, once the last of the fallen trees have been hauled away and the fires in the clearing have been extinguished, new heather shoots will grow in these expanding patches of burned sun-lit pasture. What those shoots promise is a new horizon of pastoral possibility: when winter comes, and other resources dwindle, green and nutritious heather will be thriving here. This clearing will allow the survival of the herd.

Reconstructing moments like these – via pollen records and charred plant materials in archaeological excavations – allows us to tell a unique story about human-changed landscapes. Inside the forest, we witness a small, flammable shrub becoming a key resource that will one day cover millions of hectares across northern Europe, forming a colossal belt of heathlands stretching from Portugal to Ireland, and all the way up to the Lofoten archipelago in Norway. But this is not just another tale of how our species radically transforms its environment. Among the first forest clearings, we see early humans engaging in a new form of worldmaking, unaware that in some distant future this changed landscape would lock its domesticators into trajectories of care and maintenance from which it will become almost impossible to escape. Among the fallen oak, hazel, elm and ash, a trap had been sprung.

We often conceive of domestication as a process involving humans taming, penning or manipulating animals and plants. Domestication turned wild sheep species into livestock, wolves into pets, and weeds into cereal crops. It also transformed whole landscapes, as people learned to domesticate forests, grasslands, jungles and coastlines. But this is not a process that belongs to the distant past. Newer forms of domestication are still emerging as rural landscapes are turned into fields of solar panels, coastlines into concrete seawalls, and former deserts into forests. Each transformation is designed to serve human needs: to increase biomass, reduce food insecurity or sequester carbon. And, in each, domination appears to flow in one direction. Humans domesticate. But can domestication flow the other way?

To consider how a landscape might domesticate humans, we must journey into deep time to tell two stories. First, a story of how . . .

Continue reading.

Written by Leisureguy

14 October 2022 at 4:59 pm

1,600 Years of Medical Hubris

leave a comment »

Robert F. Graboyes, an economist, journalist, and musician who has worked at Chase Manhattan Bank, the Federal Reserve, NFIB, Mercatus Center, and five universities and has a PhD from Columbia University, writes at Bastiat’s Journal:

Over its long history, the field of medicine has been at its best when it was rife with questions and at its worst when it was brimming with answers. The physicians whose ignorance killed George Washington and James A. Garfield would probably have nodded approvingly at the “We Believe Science Is Real” signs planted today in the front yards of America’s tonier precincts. Medical science has a long history of slavish devotion to orthodoxy and stasis and virulent opposition to heterodoxy and change. Physicist Max Planck famously said, “Science progresses one funeral at a time.” In medical science, I would argue, science progresses millions of funerals at a time. Unwarranted stasis can kill in large numbers.

Since 1962, people have debated the meaning and value of philosopher Thomas Kuhn’s The Structure of Scientific Revolutions. Kuhn challenged the perception that the accumulation of scientific data leads us closer and closer to “truth.” Rather, in his paradigm, science is more of a metaphor for reality—an imperfect lens with which we examine a universe whose complexities are and will always be well beyond our grasp. In Kuhn’s formulation, any scientific theory, no matter how useful and imposing, will ultimately be overturned by successor theories when new evidence batters the old metaphor sufficiently. Newton’s formulation of gravity, for example, served mankind exceedingly well for centuries (and still does in certain settings). But by the early 20th century, the shortcomings of that model were becoming obvious, and Einstein devised a new metaphor—relativity. In some ways, medicine has always been especially resistant to the process that Kuhn enunciated.


In 1799, doctors with answers knew that it was wise to drain sick patients’ blood into a bowl. The practice, they knew, had a proven record of success that dated back over many centuries. By 1799, however, doctors with questions challenged this logic, but they were met with powerful disapproval by the doctors with answers.

Lewis Thomas was an essayist and poet—a 20th century bard. Aside from those roles, he also served as dean of the New York University School of Medicine (1966-69) and Yale School of Medicine (1972-73), and also as President of Memorial Sloan-Kettering Institute (1973-83). In his elegiac The Fragile Species, Thomas recounted the 1,600 years in which Western medicine was mired in the answers inherited from the Greek-Roman physician Galen (129 – 216 CE). Galen was an undoubted genius who helped define the fields of anatomy, neurology, pathology, physiology, and pharmacology. However, as Thomas noted:

Galen … had guessed wildly, and wrongly, in no less than five hundred treatises on medicine and philosophy, that everything about human disease could be explained by the misdistribution of “humors” in the body. Congestion of the various organs was the trouble to be treated, according to Galen, and by the eighteenth century, the notion had been elevated to a routine cure-all, or anyway treat-all: remove the excess fluid, one way or another. The ways were direct and forthright: open a vein and take away a pint or more of blood at a sitting, enough to produce faintness and a bluish pallor, place suction cups on the skin to draw out lymph, administer huge doses of mercury or various plant extracts to cause purging, and if all else failed, induce vomiting.

Galen’s unfounded belief in the four bodily humors—known as “humorism”— became the central credo of Western medicine till the early 1800s.

Thomas noted that George Washington, a “hale and hearty” 66-year-old, likely died from bloodletting after contracting a fever and sore throat. Researchers still debate whether doctors drained enough of Washington’s blood to have killed him. But we do know that in the years leading up to 1799, the practice itself was already coming under intense criticism. In particular, Benjamin Rush, signer of the Declaration of Independence and the most celebrated American doctor of his time, had been enduring years of withering criticism of his management of a 1793 yellow fever epidemic in Philadelphia.

In a paper on that episode, “Benjamin Rush, MD: Assassin or Beloved Healer?” Robert L. North, MD, describes Rush as, “unshakable in his convictions, as well as self-righteous, caustic, satirical, humorless, and polemical.” He had decided in 1789 that:

there was only 1 fever in the world. He held that all fevers were a single entity, just as fire is a single entity: “Thus fire is a unit whether it be produced by friction, percussion, electricity, fermentation, or by a piece of wood or coal in a state of inflammation.”

And so, just as the streets of Paris were running red with blood unleashed by the guillotine, the streets of Philadelphia were running red with blood unleashed by Benjamin Rush’s medical instruments. From North’s paper:

Rush entered a frenzied state, personally seeing as many as 100 patients a day. His home became a clinic and a sort of pharmaceutical factory staffed by 5 of his students and apprentices, 3 of whom died of yellow fever. So much blood was spilled in the front yard that the site became malodorous and buzzed with flies. He prescribed repeated doses of pills and powders consisting of 10 grains of calomel and 10 grains (later 15) of jalap, at least 10 times the customary dose. These produced copious black stools and often provoked gastrointestinal bleeding before finally yielding only a few shreds of mucus. Rush estimated that the average person contained 25 pounds of blood and recommended that up to 80% be removed.

As such practices finally fell from favor, Lewis Thomas described the era of “therapeutic nihilism” that settled in during the 1830s:

Groups of doctors in Boston, Paris, and Edinburgh raised new questions, regarded as heretical by most of their colleagues, concerning the real efficacy of the standard treatments of the day. … The great illumination from this, the first revolution in medical practice in centuries, was the news that there were many diseases that are essentially self-limited. … The new lesson was that treating [patients] made the outcome worse rather than better.

The next century, straight through Thomas’s own medical education at Harvard in the 1930s, was one not of answers, but of questions. The questioners, we might say, drained the medical field of its centuries-old hubris as surely as its cocksure physicians had drained their patients of bodily fluids.


This is not to say that medicine was devoid of destructive certainty after the 1830s. As Galenism faded in the 19th century, miasma theory, which . . .

Continue reading.

Written by Leisureguy

14 October 2022 at 3:28 pm

Southern states don’t like for black people to get government benefits like Medicaid

leave a comment »

The map above is from a post by Kevin Drum that’s worth reading.

Written by Leisureguy

14 October 2022 at 3:07 pm

Republicans becoming more explicit about their outlook and plans

leave a comment »

Heather Cox Richardson:

Last Thursday, October 6, the Republican members of the House Judiciary Committee tweeted: “Kanye. Elon. Trump.”

On Sunday, October 10, after his Instagram account was restricted for antisemitism, rapper Kanye West, now known as “Ye,” returned to Twitter from a hiatus that had lasted since the 2020 elections to tweet that he was “going death con 3 On JEWISH PEOPLE.” This was an apparent reference to the U.S. military’s “DEFCON 3,” an increase in force readiness.

Today, Ian Bremmer of the political consulting firm the Eurasia Group reported that billionaire Elon Musk spoke directly with Russian president Vladimir Putin before Musk last week proposed ending Russia’s attack on Ukraine by essentially starting from a point that gave Putin everything he wanted, including Crimea and Russian annexation of the four regions of Donetsk, Kherson, Luhansk, and Zaporizhzhia, as well as Ukraine’s permanent neutrality. This afternoon, Musk denied the story; Bremmer stood by it.

On Sunday, at a rally in Arizona, Trump claimed that President George H.W. Bush had taken “millions and millions” of documents from his presidency “to a former bowling alley pieced together with what was then an old and broken Chinese restaurant…. There was no security.” (In fact, the National Archives and Records Administration put documents in secure temporary storage at a facility that had been rebuilt, according to NARA, with “strict archival and security standards, and…managed and staffed exclusively by NARA employees.”)

Then Trump went on to accuse NARA of planting documents—his lawyers have refused to make that accusation in court—and, considering his habit of frontloading confessions, made an interesting accusation: “[The Archives] lose documents, they plant documents. ‘Let’s see, is there a book on nuclear destruction or the building of a nuclear weapon cheaply? Let’s put that book in with Trump.’ No, they plant documents.”

Antisemitism, Putin’s demands in Ukraine, and stolen documents seem like an odd collection of things for the Republican members of the House Judiciary Committee, which oversees the administration of justice in the United States, to endorse before November’s midterm elections.

But in these last few weeks before the midterms, the Republican Party is demonstrating that it has fallen under the sway of its extremist wing, exemplified by those like Representative Marjorie Taylor Greene (R-GA), who tweeted last week that “Biden is Hitler.”

Senator Tommy Tuberville (R-AL) this weekend told an audience that Democrats are in favor of “reparation” because they are “pro-crime.” “They want crime,” Tuberville said. “They want crime because they want to take over what you got. They want to control what you have,” Tuberville told the cheering crowd in an echo of the argument of white supremacists during Reconstruction. “They want reparation because they think the people that do the crime are owed that. Bullsh*t. They are not owed that.”

On October 6, New Hampshire Senate nominee Don Bolduc defended the overturning of Roe v. Wade and the subsequent loss of recognition of the constitutional right to abortion. The issue of abortion “belongs to the state,” he said. “It belongs to these gentlemen right here, who are state legislators representing you. That is the best way I think, as a man, that women get the best voice.” Republican super PACs are pouring money behind Bolduc.

Even those party members still trying to govern rather than play to racism, sexism, and antisemitism are pushing their hard-right agenda.

Senate Republicans have introduced a bill to get rid of  . . .

Continue reading.

Written by Leisureguy

11 October 2022 at 8:22 pm
