Later On

A blog written for those whose interests more or less match mine.

A short history of a wrong direction the US embraced

Heather Cox Richardson reviews some of the decisions and directions that brought the US to its current situation:

America today is caught in a plague of gun violence.

It wasn’t always this way. Americans used to own guns without engaging in daily massacres. Indeed, it always jumps out at me that the infamous St. Valentine’s Day Massacre of 1929, when members of one Chicago gang set up and killed seven members of a rival gang, was so shocking it led to legislation that prohibits automatic weapons in the U.S.

Eighty-nine years later, though, in 2018, another Valentine’s Day shooting at Marjory Stoneman Douglas High School in Parkland, Florida, killed 17 children and wounded 17 others. In response, then-President Donald Trump called for arming teachers, and the Republican-dominated Florida legislature rejected a bill that would have limited some high-capacity guns.

Our acceptance of violence today stands in striking contrast to Americans’ horror at the 1929 Valentine’s Day Massacre.

Today’s promotion of a certain kind of gun ownership has roots in the politics of the country since the Supreme Court handed down the 1954 Brown v. Board of Education of Topeka, Kansas, decision, which declared racial segregation in public schools unconstitutional. Since Democratic President Franklin Delano Roosevelt instituted a government that actively shaped the economy, businessmen who hated government regulation tried to rally opposition to get rid of that government. But Americans of the post-World War II years actually liked regulation of the runaway capitalism they blamed for the Great Depression.

The Brown v. Board decision changed the equation. It enabled those who opposed business regulation to reach back to a racist trope from the nation’s Reconstruction years after the Civil War. They argued that the active government after World War II was not simply regulating business. More important, they said, it was using tax dollars levied on hardworking white men to promote civil rights for undeserving Black people. The troops President Dwight Eisenhower sent to Little Rock Central High School in 1957, for example, didn’t come cheap. Civil Rights, then, promoted by the newly active federal government, were virtually socialism.

This argument had sharp teeth in the 1950s, as Americans recoiled from the growing influence of the U.S.S.R., but it came originally from the Reconstruction era. Then, white supremacist southerners who were determined to stop the federal government from enforcing Black rights argued that they were upset about Black participation in society not because of race—although of course they were—but rather because poor Black voters were electing lawmakers who were using white people’s tax dollars to lay roads, for example, or build schools.

In contrast to this apparent socialism, southern Democrats after the Civil War lionized the American cowboy, whom they mythologized as a white man (in fact, a third of the cowboys were men of color) who wanted nothing of the government but to be left alone (in reality, the cattle industry depended on the government). Out there on the western plains, the mythological cowboy worked hard for a day’s pay for moving cattle to a railhead, all the while fighting off Indigenous Americans, Mexicans, and rustlers who were trying to stop him.

That same mythological cowboy appeared in the 1950s to stand against what those opposed to business regulation and civil rights saw as the creeping socialism of their era. By 1959, there were 26 Westerns on TV, and in March 1959, eight of one week’s top shows were Westerns. They showed hardworking cowboys protecting their land from evildoers. The cowboys didn’t need help from their government; they made their own law with a gun.

In 1958, Republican Senator Barry Goldwater of Arizona rocketed to prominence after he accused the president from his own party, Dwight Eisenhower, of embracing “the siren song of socialism.” Goldwater had come from a wealthy background after his family cashed in on the boom of federal money flowing to Arizona dam construction, but he presented himself to the media as a cowboy, telling stories of how his family had come to Arizona when “[t]here was no federal welfare system, no federally mandated employment insurance, no federal agency to monitor the purity of the air, the food we ate, or the water we drank,” and that “[e]verything that was done, we did it ourselves.” Goldwater opposed the Brown v. Board decision and Eisenhower’s decision to use troops to desegregate Little Rock Central High School.

Increasingly, those determined to destroy the postwar government emphasized the hardworking individual under siege by a large, grasping government that redistributed wealth to the undeserving, usually people of color. A big fan of Goldwater, Ronald Reagan famously developed a cowboy image even as he repeatedly warned of the “welfare queen” who lived large on government benefits she stole.

As late as 1968, the National Rifle Association supported some forms of gun control, but that changed in the 1980s as the organization affiliated itself with Reagan’s Republican Party. In 1981, an assassin attempted to kill the president and succeeded in badly wounding him, as well as injuring the president’s press secretary, James Brady, and two others. Despite pressure to limit gun ownership, in 1986, under pressure from the NRA, the Republican Congress did the opposite: it passed the Firearms Owners’ Protection Act, which erased many of the earlier controls on gun ownership, making it easier to buy, sell, and transport guns across state lines.

In 1987, Congress began to consider the Brady Handgun Violence Prevention Act, otherwise known as the Brady Bill, to require background checks before gun purchases and to prevent certain transfers of guns across state lines. As soon as the measure was proposed, the NRA shifted into high gear to prevent its passage. The bill did not pass until 1993, under President Bill Clinton’s administration. The NRA set out to challenge the law in the courts.

While the challenges wound their way upward, the idea of individuals standing against a dangerous government became central to the Republican Party. . .

Continue reading. And do read the whole thing. It’s good to be reminded that choices have long-lasting impact.

Written by LeisureGuy

20 April 2021 at 10:23 am

The Reorientations of Edward Said

In the New Yorker, Pankaj Mishra has a very interesting profile of Edward Said in the context of a new biography. The entire piece is worth reading. It begins:

“Professor of Terror” was the headline on the cover of the August, 1989, issue of Commentary. Inside, an article described Edward Said, then a professor of English and comparative literature at Columbia University, as a mouthpiece for Palestinian terrorists and a confidant of Yasir Arafat. “Eduardo Said” was how he was referred to in the F.B.I.’s two-hundred-and-thirty-eight-page file on him—perhaps on the assumption that a terrorist was likely to have a Latin name. V. S. Naipaul willfully mispronounced “Said” to rhyme with “head,” and asserted that he was “an Egyptian who got lost in the world.” Said, an Arab Christian who was frequently taken to be Muslim, recognized the great risks of being misidentified and misunderstood. In “Orientalism” (1978), the book that made him famous, he set out to answer the question of, as he wrote in the introduction, “what one really is.” The question was pressing for a man who was, simultaneously, a literary theorist, a classical pianist, a music critic, arguably New York’s most famous public intellectual after Hannah Arendt and Susan Sontag, and America’s most prominent advocate for Palestinian rights.

Multiple and clashing selves were Said’s inheritance from the moment of his birth, in 1935, in West Jerusalem, where a midwife chanted over him in both Arabic and Hebrew. The family was Episcopalian and wealthy, and his father, who had spent years in America and prided himself on having light skin, named him after the Prince of Wales. Said always loathed his name, especially when shortened to Ed. Sent as a teen-ager to an American boarding school, Said found the experience “shattering and disorienting.” Trained at Princeton and Harvard as a literary scholar in a Euro-American humanist tradition, he became an enthusiast of French theory, a partisan of Michel Foucault. In “Orientalism,” published two decades into a conventional academic career, Said unexpectedly described himself as an “Oriental subject” and implicated almost the entire Western canon, from Dante to Marx, in the systematic degradation of the Orient.

“Orientalism” proved to be perhaps the most influential scholarly book of the late twentieth century; its arguments helped expand the fields of anti-colonial and post-colonial studies. Said, however, evidently came to feel that “theory” was “dangerous” to students, and derided the “jaw-shattering jargonistic postmodernisms” of scholars like Jacques Derrida, whom he considered “a dandy fooling around.” Toward the end of his life, the alleged professor of terror collaborated with the conductor Daniel Barenboim to set up an orchestra of Arab and Israeli musicians, angering many Palestinians, including members of Said’s family, who supported a campaign of boycott and sanctions against Israel. While his handsome face appeared on the T-shirts and posters of left-wing street protesters worldwide, Said maintained a taste for Rolex watches, Burberry suits, and Jermyn Street shoes right up to his death, from leukemia, in 2003.

“To be a Levantine is to live in two or more worlds at once without belonging to either,” Said once wrote, quoting the historian Albert Hourani. “It reveals itself in lostness, pretentiousness, cynicism and despair.” His melancholy memoir of loss and deracination, “Out of Place” (1999), invited future biographers to probe the connection between their subject’s cerebral and emotional lives. Timothy Brennan, a friend and graduate student of Said’s, now warily picks up the gauntlet, in an authorized biography, “Places of Mind” (Farrar, Straus & Giroux). Scanting Said’s private life, including his marriages and other romantic liaisons, Brennan concerns himself with tracing an intellectual and political trajectory. One of the half-concealed revelations in the book is how close Said came, with his Levantine wealth and Ivy League education, to being a somewhat refined playboy, chasing women around the Eastern Seaboard in his Alfa Romeo. In Jerusalem, Said went to St. George’s, a boys’ school for the region’s ruling castes. In Cairo—where his family moved in 1947, shortly before Jewish militias occupied West Jerusalem—he attended the British-run Victoria College. There he was chiefly known for his mediocre marks and insubordinate ways; his classmates included the future King Hussein of Jordan and the actor Omar Sharif.

Cairo was then the principal metropolis of a rapidly decolonizing and politically assertive Arab world. The creation of the state of Israel—following a U.N. resolution, on Palestinian land—and the refugee crisis and wars that ensued were on everyone’s mind. Yet Said inhabited a bubble of affluent cosmopolitans, speaking English and French better than Arabic, and attending the local opera. When he was six years old, he started playing the family piano, a Blüthner baby grand from Leipzig, and he later received private lessons from Ignace Tiegerman, a Polish Jew famous for his interpretations of Brahms and Chopin. Said’s father, who ran a successful office-supply business, was socially ambitious, and his time in America had given him a lasting admiration for the West. At one point, he considered moving his entire family to the United States. Instead, in 1951, he contented himself with dispatching his son to Northfield Mount Hermon School, in rural Massachusetts.

Brennan shows how much Said initially was, as he once confessed, a “creature of an American and even a kind of upper-class wasp education,” distanced from the “uniquely punishing destiny” of an Arab Palestinian in the West. Glenn Gould recitals in Boston appear to have registered more with him than the earthquakes of the post-colonial world, such as the Great Leap Forward or the anti-French insurgency in Algeria. The Egyptian Revolution erupted soon after Said left for the U.S., and a mob of protesters burned down his father’s stationery shop. Within a decade, the family had moved to Lebanon. Yet these events seem to have had less influence on Said than the political currents of his new country did. Brennan writes, “Entering the United States at the height of the Cold War would color Said’s feelings about the country for the rest of his life.” Alfred Kazin, writing in his journals in 1955, already worried that intellectuals had found in America a new “orthodoxy”—the idea of the country as “world-spirit and world hope.” This consensus was bolstered by a professionalization of intellectual life. Jobs in universities, media, publishing, and think tanks offered former bohemians and penurious toilers money and social status. Said began his career at precisely this moment, when many upwardly mobile American intellectuals became, in his later, unforgiving analysis, “champions of the strong.”

Nonetheless, his own early impulse, born of an immigrant’s insecurity, was, as he later put it, to make himself over “into something the system required.” His earliest intellectual mentors were such iconic figures of American literary culture as R. P. Blackmur and Lionel Trilling. He wrote a prize-winning dissertation on Conrad; he read Sartre and Lukács. In his early writings, he faithfully absorbed all the trends then dominant in English departments, from existentialism to structuralism. Devoted to Chopin and Schumann, he seems to have been as indifferent to blues and jazz as he was to Arabic music. He adored Hollywood movies, but there is no evidence that, in this period, he engaged with the work of James Baldwin or Ralph Ellison, or had much interest in the civil-rights movement. When students protesting the war in Vietnam disrupted a class of his, he called campus security.

Brennan detects a hint of what was to come in a remark of Said’s about the dual selves of Conrad: one “the waiting and willing polite transcriber who wished to please, the other an uncooperative demon.” Much impotent anger seems to have long simmered in Said as he witnessed “the web of racism, cultural stereotypes, political imperialism, dehumanizing ideology holding in the Arab or the Muslim.” In a conversation filmed for Britain’s Channel 4, Said claimed that many of his cultural heroes, such as Isaiah Berlin and Reinhold Niebuhr, were prejudiced against Arabs. “All I could do,” he said, “was note it.” He watched aghast, too, the critical acclaim for “The Arab Mind,” a 1973 book by the Hungarian Jewish academic Raphael Patai, which described Arabs as a fundamentally unstable people.

It’s not hard to see how Said, upholding the “great books” courses at Columbia, would have come to feel intensely the frustrations that writers and intellectuals from countries subjugated by Europe and America had long experienced: so many of the canonical figures of Western liberalism and democracy, from John Stuart Mill to Winston Churchill, were contemptuous of nonwhite peoples. Among aspiring intellectuals who came to the U.S. and Europe from Asia, Africa, and Latin America, a sense of bitterness ran especially deep. Having struggled to emulate the cultural élite of the West by acquiring a knowledge of its literature and philosophy, they realized that their role models remained largely ignorant of the worlds they had come from. Moreover, the steep price of that ignorance was paid, often in blood, by the people back home.

It was the Six-Day War, in 1967, and the exultant American media coverage of Israel’s crushing victory over Arab countries, that killed Said’s desire to please his white mentors. He began reaching out to other Arabs and methodically studying Western writings about the Middle East. . .

Continue reading.

Written by LeisureGuy

20 April 2021 at 10:03 am

Henson Shaving AL13 Medium with Grooming Dept Chypre Peach.

This soap follows Grooming Dept’s Kairos formula:

Water, Stearic Acid, Beef Tallow, Sodium Lauroyl Lactylate, Kokum Butter, Castor Oil, Tucuma Butter, Avocado Oil, Glycerin, Coconut Milk, Goat Milk, Cupuaçu Butter, Shea Butter, Safflower Oil, Collagen Peptides, Whey Protein, Betaine, Fragrance, Lauryl Laurate, Jojoba Oil, Lanolin, Colloidal Oatmeal, Rice Bran Wax, Meadowfoam Oil, Linoleic Acid, Ethylhexyl Olivate, Hydrogenated Olive Oil, Isostearic Acid, Allantoin, Sodium Lactate, Caprylyl Glycol, Ethylhexylglycerin, Sodium Gluconate, Tetrasodium Glutamate Diacetate, Tocopherols, Silk Peptides.

Loading the brush did require a small amount of added water, and the resulting lather was excellent. West Coast Shaving describes the soap’s fragrance:

This Chypre Peach aroma is a complex combination of citrus, florals, spices, on an earthy base. This is a classic chypre with citrus top notes, middle of labdanum, and a base of oakmoss, but it departs from the traditional with a note of sweet peach.

My Henson Shaving AL13 Medium arrived. On the top of the baseplate, in the upper-left quadrant, the regular AL13 has a “+” stamped, and in the same location on the Medium baseplate is stamped “++”. I think there may be a small increase in blade feel from the Medium vs. the regular Henson, but certainly even the Medium is one of the least-threatening razors I have ever used. Withal, it is extremely efficient, and I had no problem getting a perfectly smooth result. The Medium would be totally suitable for a novice DE shaver.

A splash of La Toja Hombre, the job is done, and the day awaits.

Written by LeisureGuy

20 April 2021 at 8:44 am

Posted in Shaving

The conscious self constructed of memes one adopts: what happens when one’s basic meme set is not consistent?

Pankaj Mishra in the New Yorker reviews a biography of Edward Said. From that review:

. . . Multiple and clashing selves were Said’s inheritance from the moment of his birth, in 1935, in West Jerusalem, where a midwife chanted over him in both Arabic and Hebrew. The family was Episcopalian and wealthy, and his father, who had spent years in America and prided himself on having light skin, named him after the Prince of Wales. Said always loathed his name, especially when shortened to Ed. Sent as a teen-ager to an American boarding school, Said found the experience “shattering and disorienting.” Trained at Princeton and Harvard as a literary scholar in a Euro-American humanist tradition, he became an enthusiast of French theory, a partisan of Michel Foucault. In “Orientalism,” published two decades into a conventional academic career, Said unexpectedly described himself as an “Oriental subject” and implicated almost the entire Western canon, from Dante to Marx, in the systematic degradation of the Orient.

“Orientalism” proved to be perhaps the most influential scholarly book of the late twentieth century; its arguments helped expand the fields of anti-colonial and post-colonial studies. Said, however, evidently came to feel that “theory” was “dangerous” to students, and derided the “jaw-shattering jargonistic postmodernisms” of scholars like Jacques Derrida, whom he considered “a dandy fooling around.” Toward the end of his life, the alleged professor of terror collaborated with the conductor Daniel Barenboim to set up an orchestra of Arab and Israeli musicians, angering many Palestinians, including members of Said’s family, who supported a campaign of boycott and sanctions against Israel. While his handsome face appeared on the T-shirts and posters of left-wing street protesters worldwide, Said maintained a taste for Rolex watches, Burberry suits, and Jermyn Street shoes right up to his death, from leukemia, in 2003.

“To be a Levantine is to live in two or more worlds at once without belonging to either,” Said once wrote, quoting the historian Albert Hourani. “It reveals itself in lostness, pretentiousness, cynicism and despair.” His melancholy memoir of loss and deracination, “Out of Place” (1999), invited future biographers to probe the connection between their subject’s cerebral and emotional lives. Timothy Brennan, a friend and graduate student of Said’s, now warily picks up the gauntlet, in an authorized biography, “Places of Mind” (Farrar, Straus & Giroux). Scanting Said’s private life, including his marriages and other romantic liaisons, Brennan concerns himself with tracing an intellectual and political trajectory. One of the half-concealed revelations in the book is how close Said came, with his Levantine wealth and Ivy League education, to being a somewhat refined playboy, chasing women around the Eastern Seaboard in his Alfa Romeo. In Jerusalem, Said went to St. George’s, a boys’ school for the region’s ruling castes. In Cairo—where his family moved in 1947, shortly before Jewish militias occupied West Jerusalem—he attended the British-run Victoria College. There he was chiefly known for his mediocre marks and insubordinate ways; his classmates included the future King Hussein of Jordan and the actor Omar Sharif.

Cairo was then the principal metropolis of a rapidly decolonizing and politically assertive Arab world. The creation of the state of Israel—following a U.N. resolution, on Palestinian land—and the refugee crisis and wars that ensued were on everyone’s mind. Yet Said inhabited a bubble of affluent cosmopolitans, speaking English and French better than Arabic, and attending the local opera. When he was six years old, he started playing the family piano, a Blüthner baby grand from Leipzig, and he later received private lessons from Ignace Tiegerman, a Polish Jew famous for his interpretations of Brahms and Chopin. Said’s father, who ran a successful office-supply business, was socially ambitious, and his time in America had given him a lasting admiration for the West. At one point, he considered moving his entire family to the United States. Instead, in 1951, he contented himself with dispatching his son to Northfield Mount Hermon School, in rural Massachusetts.

Brennan shows how much Said initially was, as he once confessed, a “creature of an American and even a kind of upper-class wasp education,” distanced from the “uniquely punishing destiny” of an Arab Palestinian in the West. Glenn Gould recitals in Boston appear to have registered more with him than the earthquakes of the post-colonial world, such as the Great Leap Forward or the anti-French insurgency in Algeria. The Egyptian Revolution erupted soon after Said left for the U.S., and a mob of protesters burned down his father’s stationery shop. Within a decade, the family had moved to Lebanon. Yet these events seem to have had less influence on Said than the political currents of his new country did. Brennan writes, “Entering the United States at the height of the Cold War would color Said’s feelings about the country for the rest of his life.” Alfred Kazin, writing in his journals in 1955, already worried that intellectuals had found in America a new “orthodoxy”—the idea of the country as “world-spirit and world hope.” This consensus was bolstered by a professionalization of intellectual life. Jobs in universities, media, publishing, and think tanks offered former bohemians and penurious toilers money and social status. Said began his career at precisely this moment, when many upwardly mobile American intellectuals became, in his later, unforgiving analysis, “champions of the strong.”

Nonetheless, his own early impulse, born of an immigrant’s insecurity, was, as he later put it, to make himself over “into something the system required.” His earliest intellectual mentors were such iconic figures of American literary culture as R. P. Blackmur and Lionel Trilling. He wrote a prize-winning dissertation on Conrad; he read Sartre and Lukács. In his early writings, he faithfully absorbed all the trends then dominant in English departments, from existentialism to structuralism. Devoted to Chopin and Schumann, he seems to have been as indifferent to blues and jazz as he was to Arabic music. He adored Hollywood movies, but there is no evidence that, in this period, he engaged with the work of James Baldwin or Ralph Ellison, or had much interest in the civil-rights movement. When students protesting the war in Vietnam disrupted a class of his, he called campus security.

Brennan detects a hint of what was to come in a remark of Said’s about the dual selves of Conrad: one “the waiting and willing polite transcriber who wished to please, the other an uncooperative demon.” Much impotent anger seems to have long simmered in Said as he witnessed “the web of racism, cultural stereotypes, political imperialism, dehumanizing ideology holding in the Arab or the Muslim.” In a conversation filmed for Britain’s Channel 4, Said claimed that many of his cultural heroes, such as Isaiah Berlin and Reinhold Niebuhr, were prejudiced against Arabs. “All I could do,” he said, “was note it.” He watched aghast, too, the critical acclaim for “The Arab Mind,” a 1973 book by the Hungarian Jewish academic Raphael Patai, which described Arabs as a fundamentally unstable people.

It’s not hard to see how Said, upholding the “great books” courses at Columbia, would have come to feel intensely the frustrations that writers and intellectuals from countries subjugated by Europe and America had long experienced: so many of the canonical figures of Western liberalism and democracy, from John Stuart Mill to Winston Churchill, were contemptuous of nonwhite peoples. Among aspiring intellectuals who came to the U.S. and Europe from Asia, Africa, and Latin America, a sense of bitterness ran especially deep. Having struggled to emulate the cultural élite of the West by acquiring a knowledge of its literature and philosophy, they realized that their role models remained largely ignorant of the worlds they had come from. Moreover, the steep price of that ignorance was paid, often in blood, by the people back home.

It was the Six-Day War, in 1967, and the exultant American media coverage of Israel’s crushing victory over Arab countries, that killed Said’s desire to please his white mentors. He began reaching out to other Arabs and methodically studying Western writings about the Middle East. In 1970, he met Arafat, initiating a long and troubled relationship in which Said undertook two equally futile tasks: advising the stubbly, pistol-toting radical on how to make friends and influence people in the West, and dispelling Arafat’s impression that he, Said, was a representative of the United States. . .

Read the whole thing.

Written by LeisureGuy

19 April 2021 at 6:09 pm

Consciousness in the electric brain: Currents? or Field?

I came across “Brain wifi,” with the subtitle:

Instead of a code encrypted in the wiring of our neurons, could consciousness reside in the brain’s electromagnetic field?

The article, by Johnjoe McFadden, professor of molecular genetics at the University of Surrey, begins:

Some 2,700 years ago in the ancient city of Sam’al, in what is now modern Turkey, an elderly servant of the king sits in a corner of his house and contemplates the nature of his soul. His name is Katumuwa. He stares at a basalt stele made for him, featuring his own graven portrait together with an inscription in ancient Aramaic. It instructs his family, when he dies, to celebrate ‘a feast at this chamber: a bull for Hadad harpatalli and a ram for Nik-arawas of the hunters and a ram for Shamash, and a ram for Hadad of the vineyards, and a ram for Kubaba, and a ram for my soul that is in this stele.’ Katumuwa believed that he had built a durable stone receptacle for his soul after death. This stele might be one of the earliest written records of dualism: the belief that our conscious mind is located in an immaterial soul or spirit, distinct from the matter of the body.

The Katumuwa Stele cast, digitally rendered by Travis Saul. Courtesy of the Oriental Institute of the University of Chicago.

More than 2 millennia later, I was also contemplating the nature of the soul, as my son lay propped up on a hospital gurney. He was undergoing an electroencephalogram (EEG), a test that detects electrical activity in the brain, for a condition that fortunately turned out to be benign. As I watched the irregular wavy lines march across the screen, with spikes provoked by his perceptions of events such as the banging of a door, I wondered at the nature of the consciousness that generated those signals.

Just how do the atoms and molecules that make up the neurons in our brain – not so different to the bits of matter in Katumuwa’s inert stele or the steel barriers on my son’s hospital bed – manage to generate human awareness and the power of thought? In answering that longstanding question, most neurobiologists today would point to the information-processing performed by brain neurons. For both Katumuwa and my son, this would begin as soon as light and sound reached their eyes and ears, stimulating their neurons to fire in response to different aspects of their environment. For Katumuwa, perhaps, this might have been the pinecone or comb that his likeness was holding on the stele; for my son, the beeps from the machine or the movement of the clock on the wall.

Each ‘firing’ event involves the movement of electrically charged atoms called ions in and out of the neurons. That movement triggers a kind of chain reaction that travels from one nerve cell to another via logical rules, roughly analogous to the AND, OR and NOT Boolean operations performed by today’s computer gates, in order to generate outputs such as speech. So, within milliseconds of him glancing at his stele, the firing rate of millions of neurons in Katumuwa’s brain correlated with thousands of visual features of the stele and its context in the room. In this sense of correlating with, those brain neurons would supposedly know at least some aspects of Katumuwa’s stele.

Yet information-processing clearly isn’t sufficient for conscious knowing. Computers process lots of information yet have not exhibited the slightest spark of consciousness. Several decades ago, in an essay exploring the phenomenology of consciousness, the philosopher Thomas Nagel asked us to imagine what it’s like to be a bat. This feature of being-like-something, of having a perspective on the world, captures something about what it means to be a truly conscious ‘knower’. In that hospital room watching my son’s EEG, I wondered what it was like to be one of his neurons, processing the information registering the slamming of a door. As far as we can tell, an individual neuron knows just one thing – its firing rate. It fires or doesn’t fire based on its inputs, so the information it carries is pretty much equivalent to the zero or one of binary computer language. It thereby encodes just a single bit of information. The value of that bit, whether a zero or a one, might correlate with the slamming of a door, but it says nothing about the door’s shape, its colour, its use as a portal between rooms or the noise of its slamming – all features that I’m sure were part of my son’s conscious experience. I concluded that being a single neuron in my son’s brain would not feel like anything.

Of course, you could argue, as neurobiologists usually do, that . . .

Continue reading. There’s much more.
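
McFadden’s logic-gate analogy is easy to make concrete. The toy model below is my own illustration, not anything from his article: a neuron reduced to a threshold unit whose one-bit output, given suitable (invented) weights, behaves like an AND, OR, or NOT gate.

```python
# Toy model of the neuron-as-logic-gate analogy (my illustration, not
# McFadden's): a neuron "fires" (outputs 1) when the weighted sum of its
# inputs reaches a threshold. Its entire "knowledge" is that single bit.

def fires(inputs, weights, threshold):
    """Return 1 if the weighted input sum reaches the threshold, else 0."""
    return 1 if sum(i * w for i, w in zip(inputs, weights)) >= threshold else 0

# AND: both inputs must be active.
assert fires([1, 1], weights=[1, 1], threshold=2) == 1
assert fires([1, 0], weights=[1, 1], threshold=2) == 0

# OR: either input suffices.
assert fires([0, 1], weights=[1, 1], threshold=1) == 1

# NOT: an inhibitory (negative) weight silences the cell.
assert fires([1], weights=[-1], threshold=0) == 0
assert fires([0], weights=[-1], threshold=0) == 1
```

Which is part of the point McFadden goes on to make: a unit this simple carries a single bit, and a bit that merely correlates with a slamming door says nothing about the door.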

Then in the New Yorker I was reading “Do Brain Implants Change Your Identity?” by Christine Kenneally. It’s an interesting article, but what caught my eye was a description of the conscious experience of an epileptic seizure, which is (as the article explains) an electric storm in the brain, one that would of course disrupt the electromagnetic field. If that indeed is where consciousness resides, that would explain this woman’s description:

. . . The human brain is a small electrical device of super-galactic complexity. It contains an estimated hundred billion neurons, with many more links between them than there are stars in the Milky Way. Each neuron works by passing an electrical charge along its length, causing neurotransmitters to leap to the next neuron, which ignites in turn, usually in concert with many thousands of others. Somehow, human intelligence emerges from this constant, thrilling choreography. How it happens remains an almost total mystery, but it has become clear that neural technologies will be able to synch with the brain only if they learn the steps of this dance. . .

. . . I asked Leggett to describe what it was like to have a seizure. She didn’t know. When one took hold, she was ripped out of her consciousness; she wasn’t there. Afterward, there was a terrible sense of having been absent. She would feel mortified in front of anyone who had witnessed the seizure and alarmed as she took stock of the injuries that she often suffered. Even worse, she said, was that epilepsy stole her memories. Every time she had a seizure and then returned, she seemed to have left some of her memories behind her. . .

Written by LeisureGuy

19 April 2021 at 5:42 pm

Is Facebook Buying Off The New York Times?

Dan Froomkin writes in the Washington Monthly:

Over the past two decades, as Big Tech has boomed, news organizations have been going bust. Between 2004 and 2019, one in every four U.S. newspapers shut down, and almost all the rest cut staff, for a total of 36,000 jobs lost between 2008 and 2019 alone. Local newspapers have been particularly devastated, making it ever more difficult for people to know what is happening in their communities.

Many factors contributed to this economic collapse, but none more so than the cornering of the digital advertising market by the duopoly of Facebook and Google. Facebook’s threat to a free press—and, by extension, to democracy—is especially pernicious. The social media company is financially asphyxiating the news industry even as it gives oxygen to conspiracy theories and lies. As a result of its many roles in degrading our democracy, it faces mounting scrutiny by politicians and regulators.

Facebook has responded to the negative attention by creating a highly sophisticated public relations effort, which includes becoming the number one corporate spender on federal lobbying and engaging in a massive advertising blitz aimed at the D.C. policy audience. Less well known, and potentially far more dangerous, is a secretive, multimillion-dollar-a-year payout scheme aimed at the most influential news outlets in America. Under the cover of launching a feature called Facebook News, Facebook has been funneling money to The New York Times, The Washington Post, The Wall Street Journal, ABC News, Bloomberg, and other select paid partners since late 2019.

Participating in Facebook News doesn’t appear to deliver many new readers to outlets; the feature is very difficult to find, and it is not integrated into individuals’ newsfeeds. What Facebook News does deliver—though to only a handful of high-profile news organizations of its choosing—is serious amounts of cash. The exact terms of these deals remain secret, because Facebook insisted on nondisclosure and the news organizations agreed. The Wall Street Journal reported that the agreements were worth as much as $3 million a year, and a Facebook spokesperson told me that number is “not too far off at all.” But in at least one instance, the numbers are evidently much larger. In an interview last month, former New York Times CEO Mark Thompson said the Times is getting “far, far more” than $3 million a year—“very much so.”

For The New York Times, whose net income was $100 million in 2020, getting “far, far more” than $3 million a year with essentially no associated cost is significant. And once news outlets take any amount of money from Facebook, it becomes difficult for them to let it go, notes Mathew Ingram, chief digital writer for the Columbia Journalism Review. “It creates a hole in your balance sheet. You’re kind of beholden to them.” It’s not exactly payola, Ingram told me, searching for the right metaphor. Nor is it a protection racket. “It’s like you’re a kept person,” he said. “You’re Facebook’s mistress.”

There’s no evidence that the deal directly affects coverage in either the news or editorial departments. Before the Facebook News deal, the Times famously published an op-ed titled “It’s Time to Break Up Facebook,” by Chris Hughes, a cofounder of Facebook turned critic. And since the deal, columns from Tim Wu and Kara Swisher, among others, have been similarly critical. In December, the editorial board welcomed a lawsuit calling for Facebook to be broken up.

And Facebook and Google money is, admittedly, all over journalism already. Virtually every major media nonprofit receives direct or indirect funding from Silicon Valley, including this one. When the Monthly gets grants from do-good organizations like NewsMatch, some of the funds originate with Facebook.

But these three points are beyond dispute.

First, the deals are a serious breach of traditional ethics. In the pre-internet days, independent newspapers wouldn’t have considered accepting gifts or sweetheart deals from entities they covered, under any circumstance. The Washington Post under the editor Leonard Downie Jr., for instance, wouldn’t even accept grants from nonprofits to underwrite reporting projects, for fear of losing the appearance of independence. Facebook, which took in $86 billion in revenue last year, is a hugely controversial behemoth having profound, highly newsworthy, and negative effects on society. Accepting money from them creates a conflict of interest.

Even for trusted news organizations whose audiences believe they can’t be bought outright, “it might come across as hypocrisy to heavily criticize an industry while also collaborating with them,” says Rasmus Kleis Nielsen, the director of the Reuters Institute for the Study of Journalism. Agreeing to keep the terms of the deal confidential is also a mistake, Nielsen told me. “This sort of opacity I don’t think builds trust.”

Second, these deals help Facebook maintain the public appearance of legitimacy. Journalists, critics, and congressional investigators have amply documented how Facebook has become a vector of disinformation and hate speech that routinely invades our privacy and undermines our democracy. For The New York Times and other pillars of American journalism to effectively partner with Facebook creates the impression that Facebook is a normal, legitimate business rather than a monopolistic rogue corporation.

Finally, these agreements undermine . . .

Continue reading. There’s much more, and it’s worth reading.

Written by LeisureGuy

19 April 2021 at 11:00 am

Lemon Bay, a Mallard-formula soap from Grooming Dept, with the iKon Shavecraft X3

I naturally did use Grooming Dept pre-shave. I specifically mention it because the result this morning is a perfect shave, and thus I want all contributing factors specified.

This silvertip brush combines a relatively long loft with a relatively low knot density, the result being an extraordinarily gentle brush. It loaded easily and quickly — “gentle” does not mean “ineffective” — and I quickly worked up a very nice lather, nice in both fragrance and feel. The snakewood handle is light in weight and so the brush overall has a light feeling.

The X3 is an extremely good slant, here mounted on the RazoRock barberpole handle. It is also extremely comfortable, so there was no nick danger (“Nick Danger, Third Eye”), and the result was smooth perfection.

A splash of lemon Myrsol, and the week begins, with clear skies and a temperature of 57°F.

Written by LeisureGuy

19 April 2021 at 10:23 am

Posted in Shaving

Interview: Julia Galef

Noah Smith interviews Julia Galef:

If the Rationalist movement can be said to have a leader, I would argue that it is Julia Galef. She hosts the podcast Rationally Speaking, and is the founder of the Center for Applied Rationality, which tries to train people to eliminate cognitive bias. And now she has written a book! It’s called The Scout Mindset: Why Some People See Things Clearly and Others Don’t, and you can buy it on Amazon here.

In the interview that follows, I talk to Julia about different concepts of rationality, about the purpose of the “scout mindset”, about whether rationality will win in the marketplace of ideas, and more!

N.S.: So I hear you have a new book! It’s called “The Scout Mindset: Why Some People See Things Clearly and Others Don’t”. I’m going to read it, but why don’t you give me a preview of what it’s about!

J.G.: I do! It’s about, unsurprisingly, the scout mindset — which is my term for the motivation to see things as they are, not as you wish they were. In other words, trying to be intellectually honest, objective, and curious about what’s actually true.

The central metaphor in the book is that we are often in soldier mindset, my term for the motivation to defend your own beliefs against arguments and evidence that might threaten them. Scout mindset is an alternative way of thinking. A scout’s goal is not to attack or defend, but to go out and form an accurate map of what’s really there.

So in the book, I discuss why soldier mindset is so often our default and make the case for why we’d be better off shifting towards the scout instead. And I share some tips for how to do that, which I illustrate with lots of real examples of people demonstrating scout mindset, in science, politics, sports, entrepreneurship, activism, and lots of everyday contexts as well.

N.S.: So are we always better off being scouts instead of soldiers? Just to indulge in a bit of devil’s advocacy, don’t many entrepreneurs succeed by being overconfident about their idea’s chance of success? And doesn’t irrational optimism often sustain us through trying times? Isn’t excessive realism considered a hallmark of clinical depression?

I guess this is a specific way of asking about the more general question of analytic rationality versus instrumental rationality. Are there times when, if we were a planner trying to maximize our own utility, we would choose to endow ourselves with logical fallacies and incorrect beliefs? 

J.G.: Yeah, my claim isn’t that soldier mindset has no benefits. My claim is that:

1. We overestimate those benefits, and

2. There are usually ways to get those benefits without resorting to soldier mindset

I’ll briefly sum up my case for those claims. To the first point, one reason we overestimate soldier mindset’s benefits is that they’re so immediate. When you convince yourself “I didn’t screw up” or “My company is definitely going to succeed,” you feel good right away. The harms don’t come until later, in the form of making you less likely to notice yourself making a similar mistake in the future, or a flaw in your business plan. And just in general, humans tend to over-weight immediate consequences and under-weight delayed consequences.

(As an aside, it’s worth noting that the research claiming things like “People who self-deceive are happier” is really not very good. I’m willing to believe that self-deception can make you happy, at least temporarily, but I wouldn’t believe it as a result of the academic research.)

Then to the second point… even though people often claim that you “need” soldier mindset to be happy, or confident, or motivated, there are lots of counterexamples disproving that.

For example, you brought up the claim that entrepreneurs need to be overconfident in their odds of success, in order to motivate themselves. That is a common claim, but in fact, many successful entrepreneurs originally gave themselves rather low odds of success. Jeff Bezos figured he had a 30% shot at success with Amazon, and Elon Musk gave his companies (Tesla and SpaceX) each a 10% chance of success.

Yet obviously both Bezos and Musk are highly motivated despite recognizing the tough odds facing them. That’s because they were motivated not by the promise of a guaranteed win, but by the high expected value of the risk they were taking: The upside of success was huge, and the downside of failure was tolerable. (“If something is important enough, you should try,” Musk has said. “Even if the probable outcome is failure.”)

That kind of thinking about risk is a better source of motivation, I would argue — because it doesn’t require you to believe false things.

N.S.: Got it! The idea of scout mindset reminds me of my favorite Richard Feynman term: “a satisfactory philosophy of ignorance”. Was Feynman’s thinking influential to you at all?

Anyway, I have another question, about the relationship between the soldiers and the scouts. In real armies, scouts and soldiers are on the same side, doing different jobs but fighting a common enemy. What is the common enemy in the case of people who take the two mindsets? Or if not an enemy, what is the common purpose that unites them, or ought to unite them? 

J.G.: . . .

Continue reading.
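
Galef’s reading of Bezos and Musk is, at bottom, an expected-value calculation. Here is a quick sketch; only the 10% success probability echoes the interview, and the payoff figures are invented purely for illustration.

```python
# Expected value: the probability-weighted average of the outcomes.
# Only the 10% success probability echoes the interview; the payoff
# figures are invented purely for illustration.

def expected_value(p_success, upside, downside):
    return p_success * upside + (1 - p_success) * downside

# A 10% shot at a huge upside with a tolerable downside...
risky = expected_value(p_success=0.10, upside=10_000, downside=-100)  # 910.0
# ...can dominate a sure thing with a modest payoff.
safe = expected_value(p_success=1.00, upside=50, downside=0)          # 50.0
print(risky > safe)  # True: motivation without believing false things
```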

Written by LeisureGuy

18 April 2021 at 10:53 am

Example of a bad (micro)cultural meme: Scott-Rudinesque behavior

Peter Marks has an interesting report in the Washington Post, and mentions in passing how a cultural meme is created and then reinforced generation by generation, because “this is how we do it.” The underlying problem is that it’s difficult to do A-B tests of memes, though some natural experiments do arise.

At any rate, from his report (and read it with an eye out for memes):

. . . The story, in which several people described allegations that have circulated in the entertainment industry for years about Rudin’s bullying and rages, rocked the theater world. In one anecdote, he allegedly smashed a computer monitor on an assistant’s hand over an unsuccessful flight booking, sending the employee to the emergency room. He’s also accused of throwing objects at workers, including a stapler and a baked potato.

Rudin declined to elaborate on the statement, or on what exactly retreating from “active participation” entails. He has spoken to confidants about beginning a program of anger management or some manner of coaching. Whether his actions will in some way quell the calls for punitive action to be taken against him is unclear. Producers who spoke on the condition of anonymity due to the sensitive nature of the allegations have spoken of some sanction by the Broadway League, whose members are Broadway producers and theater owners. But the league exists primarily as a trade organization and overseer of the Tony Awards with the American Theatre Wing. Every commercial Broadway production is, in essence, its own private enterprise.

“All change is theoretical,” said Olivo in response to Rudin’s statement. “Action and time are needed before we can name it transformation. . . . Rudin is but one dragon to slay. There are more.”

Some members of the Broadway community say Rudin is just one of many abusive people — directors, choreographers, actors, business executives — whose behavior has been tolerated. His stepping back from “active participation” will probably not change the environment, they say.

“It’s a first step. Is it enough? No,” said one Broadway producer, who spoke on the condition of anonymity out of fear of negative consequences. “There are people at every point in the business that have been taught that this is how you get the results you need. So the behavior gets replicated.”

“We have been taught that we have to sacrifice for our art,” this producer said about why bad behavior remains prevalent. “But you can do great work without creating a toxic environment.”

Actors’ Equity, the national labor union, called for Rudin to release his staff from any nondisclosure agreements that they may have had to sign, saying it would be “an important first step in creating truly safe and harassment-free theatrical workplaces on Broadway and beyond.”

“Since news reports emerged about Scott Rudin, we have had many private conversations with our sibling unions and the Broadway League. We have heard from hundreds of members that these allegations are inexcusable, and everyone deserves a safe workplace whether they are a union member or not,” president Kate Shindle and executive director Mary McColl said in a joint statement.

An exit by Rudin has potentially immense consequences for an industry that is short on visionary leaders. The Internet Broadway Database lists 77 plays and musicals produced by Rudin since the early 1990s. They run the gamut from . . .

Written by LeisureGuy

17 April 2021 at 5:29 pm

A compendium of bridges, explained by an engineer

Written by LeisureGuy

17 April 2021 at 12:26 pm

Effective Altruism Is Not Effective

Thomas R. Wells writes in The Philosopher’s Beard:

Effective altruism is based on a very simple idea: we should do the most good we can. Obeying the usual rules about not stealing, cheating, hurting, and killing is not enough, or at least not enough for those of us who have the good fortune to live in material comfort, who can feed, house, and clothe ourselves and our families and still have money or time to spare. Living a minimally acceptable ethical life involves using a substantial part of our spare resources to make the world a better place. Living a fully ethical life involves doing the most good we can. (Peter Singer)

It is almost universally agreed that the persistence of extreme poverty in many parts of the world is a bad thing. It is less well-agreed, even among philosophers, what should be done about it and by who. An influential movement founded by the philosopher Peter Singer argues that we should each try to do the best we can by donating our surplus income to charities that help those in greatest need. This ‘effective altruism’ movement has two components: i) encouraging individuals in the rich world to donate more; and ii) encouraging us to donate more rationally, to the organisations most efficient at translating those donations into gains in human well-being.

Unfortunately both components of effective altruism focus on what makes giving good rather than on achieving valuable goals. Effective altruism therefore does not actually aim at the elimination of global poverty as is often supposed. Indeed, its distinctive commitment to the logic of individualist consumerism makes it constitutionally incapable of achieving such a large scale project. Effective altruism is designed to fail.

I. The No-Sacrifice Principle of Giving

In his best-selling defense of effective altruism The Life You Can Save: Acting Now to End World Poverty (2009, p.15) Singer provides this outline of his argument.

First premise: Suffering and death from lack of food, shelter, and medical care are bad.

Second premise: If it is in your power to prevent something bad from happening, without sacrificing anything nearly as important, it is wrong not to do so.

Third premise: By donating to aid agencies, you can prevent suffering and death from lack of food, shelter, and medical care, without sacrificing anything nearly as important.

Conclusion: Therefore, if you do not donate to aid agencies, you are doing something wrong.

Singer famously supports his second premise by reference to his ‘shallow pond’ thought experiment, in which nearly everyone agrees that we would have an obligation to rescue a drowning child even at some personal inconvenience. He argues that since we already seem to accept that principle, the moral challenge is to integrate it better into how we live by donating some of our ‘surplus’ income to charities. Effective altruism is thereby identified as a way of living better in accordance with reason and right, the correct answer to Socrates’ challenge ‘How ought we to live?’

What I want to bring out here is that Singer’s main concern is the question of how good to be in terms of how much we should be giving, i.e. the internal moral economy of the subject. The ‘bads’ of suffering and death identified in premise 1 are peripheral to this analysis. They may motivate our interest in altruism but their remediation is not the measure of our altruistic success.

On the face of it, premise 2 is a very demanding principle because it links our subjective moral economy to the prevention of significant objective harms. However, the way that Singer uses the principle severs that relation. Singer is concerned to help us calculate our personal budget for good works: how much we each can spare from our other interests and commitments. As Singer makes clear, altruism on this conception should not feel like a sacrifice because it is merely the harmonious integration of our moral with our other preferences. This generates a rather generic analysis of how much it is reasonable to expect people of different levels of affluence to contribute to good causes without having to make any real sacrifices, i.e. calculations of how much money we could easily do without. (Singer suggests a progressive rate of voluntary self-taxation starting at 5% of income for those earning more than $100,000.)

Such calculations are generic because they are fundamentally concerned with how to be an altruist, not with how to fix the world’s problems, and so they are unrelated to the significance of the specific problems our donations are supposed to address, nor with what would be needed to successfully solve them. For consider, even if global poverty were eliminated entirely, there will still always be causes you could contribute to that would be more valuable than pursuing your own interests (such as generating benefits to future generations). This is the paradoxical overdemandingness of utilitarianism identified by various philosophers (Bernard Williams; Susan Wolf; etc): that a world of utilitarians would be a world incapable of happiness. What I think Singer’s ‘no sacrifice’ principle actually offers is a (not especially convincing) way to reconcile our moral duty to doing good with our right to live a life of our own.  We are effectively asked to calculate our own voluntary moral tax-rate that delineates when we have done enough for others and can turn away, morally free to pursue our private projects and commitments. How much good this amount of giving will achieve in the world is irrelevant to what that tax rate should be.

II. Efficiency is Not the Same Thing as Effectiveness

Effective altruists … know that saving a life is better than making a wish come true and that saving three lives is better than saving one. So they don’t give to whatever cause tugs strongest at their heartstrings. They give to the cause that will do the most good, given the abilities, time, and money they have. (Peter Singer)

The problem with the first component of effective altruism was that it focuses on the internal moral economy of the giver rather than on the real world problems our giving is supposed to address. The second component of effective altruism might not seem to have that problem because it is explicitly concerned with maximising the amount of good that each unit of resources achieves. (This is also the component that has received more emphasis in the last 10 years as the movement gained traction among a younger generation of philosophers such as Toby Ord and William MacAskill.) However, this concern is better understood as efficiency than as effectiveness (the general idea of getting things done). This might seem an innocuous distinction since efficiency is about how we ought to get things done, i.e. a way of being effective. However, there are significant consequences for practical reasoning in the kind of cases effective altruism is concerned with.

If one takes the efficiency view promoted by the effective altruism movement then one assumes a fixed set of resources and the choice of which goal to aim for follows from a calculation of how to maximise the expected value those resources can generate; i.e. the means justifies the end. For example, in the context of global poverty, you would use evidence and careful reasoning to decide in which cause or organisation to invest your chosen amount on the basis of which generates the most QALYs per dollar. This should ensure that your donation will achieve the most good, which is to say that you have done the best possible job of giving. However, despite doing so well at the task effective altruism has set you, if you step back you will notice that very little has actually been achieved. The total amount of good we can achieve with our donations is limited to the partial alleviation of some of the symptoms of extreme poverty, symptoms that will recur so long as poverty persists. But effective altruism supplies no plan for the elimination of poverty itself, and there is no way for a feasible plan for that goal to be developed and implemented by this method of reasoning at the margin.

The underlying problem is that . . .

Continue reading. There’s much more, and in my view he annihilates Singer’s argument and position.
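As an aside, the marginal cost-effectiveness rule the excerpt describes is easy to make concrete. Below is a minimal sketch in Python; the charity names and cost figures are entirely invented for illustration.

```python
# A minimal sketch of the "QALYs per dollar" selection rule described
# in the excerpt above. All names and figures are invented.

donation = 1_000.0  # the fixed budget that the efficiency view starts from

# Hypothetical estimates: dollars required to produce one QALY.
cost_per_qaly = {
    "Bednet Fund": 80.0,
    "Deworming Initiative": 55.0,
    "Clean Water Project": 120.0,
}

# The rule: give the whole budget to whichever option converts dollars
# into QALYs at the best marginal rate.
best = min(cost_per_qaly, key=cost_per_qaly.get)
expected_qalys = donation / cost_per_qaly[best]
print(f"Give ${donation:,.0f} to {best}: ~{expected_qalys:.1f} expected QALYs")
```

Note what the sketch takes as given: the budget and the menu of options are fixed inputs, and nothing in the calculation asks what it would take to eliminate the underlying problem, which is precisely the essayist’s complaint.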

Written by LeisureGuy

17 April 2021 at 9:43 am

Grooming Dept Moisturizing Pre-Shave counteracts drying soaps

with 5 comments

A reader commented yesterday that he had found l’Occitane Cade shaving soap to be quite drying. I did not experience that in yesterday’s shave, and I wondered whether my use of Grooming Dept Moisturizing Pre-Shave might account for the difference.

So today I chose a soap that is definitely drying for me — Martin de Candre’s shaving soap — and used it with the moisturizing pre-shave. I easily got a very nice lather — Martin de Candre is good at that — and I enjoyed the feel (and performance) of the Rooney Victorian brush, though the relatively short loft definitely changes the feel. (My own preference in general is for a longer loft, but I’ve learned it works best for me to accept my brushes as they are, on their own terms.)

Three passes of the 1940s Gillette Aristocrat did a decent job, and a splash of Cavendish aftershave completed the shave and started the weekend.

Sitting here after the shave, I do not detect the dryness that generally has followed a Martin de Candre shave. I think Grooming Dept Moisturizing Pre-Shave did indeed work to combat that drying effect.

Written by LeisureGuy

17 April 2021 at 9:07 am

Posted in Shaving

Vine robots

leave a comment »

Written by LeisureGuy

16 April 2021 at 1:18 pm

Brain control of devices

leave a comment »

This video bears an interesting relation to the previous post.

Written by LeisureGuy

16 April 2021 at 1:17 pm

Ingenious and stimulating science-fiction story

leave a comment »

The story, “Lena,” is by qntm. It begins:

This article is about the standard test brain image. For the original human, see Miguel Acevedo.

MMAcevedo (Mnemonic Map/Acevedo), also known as Miguel, is the earliest executable image of a human brain. It is a snapshot of the living brain of neurology graduate Miguel Álvarez Acevedo (2010–2073), taken by researchers at the Uplift Laboratory at the University of New Mexico on August 1, 2031. Though it was not the first successful snapshot taken of the living state of a human brain, it was the first to be captured with sufficient fidelity that it could be run in simulation on computer hardware without succumbing to cascading errors and rapidly crashing. The original MMAcevedo file was 974.3PiB in size and was encoded in the then-cutting-edge, high-resolution MYBB format. More modern brain compression techniques, many of them developed with direct reference to the MMAcevedo image, have compressed the image to 6.75TiB losslessly. In modern brain emulation circles, streamlined, lossily-compressed versions of MMAcevedo run to less than a tebibyte. These versions typically omit large amounts of state data which are more easily supplied by the virtualisation environment, and most if not all of Acevedo’s memories.

The successful creation of MMAcevedo was hailed as a breakthrough achievement in neuroscience, with the Uplift researchers receiving numerous accolades and Acevedo himself briefly becoming an acclaimed celebrity. Acevedo and MMAcevedo were jointly recognised as Time’s “Persons of the Year” at the end of 2031. The breakthrough was also met with severe opposition from human rights groups.

Between 2031 and 2049, MMAcevedo was duplicated more than 80 times, so that it could be distributed to other research organisations. Each duplicate was made with the express permission of Acevedo himself or, from 2043 onwards, the permission of a legal organisation he founded to manage the rights to his image. Usage of MMAcevedo diminished in the mid-2040s as more standard brain images were produced, these from other subjects who were more lenient with their distribution rights and/or who had been scanned involuntarily. In 2049 it became known that MMAcevedo was being widely shared and experimented upon without Acevedo’s permission. Acevedo’s attempts to curtail this proliferation had the opposite of the intended effect. A series of landmark U.S. court decisions found that Acevedo did not have the right to control how his brain image was used, with the result that MMAcevedo is now by far the most widely distributed, frequently copied, and closely analysed human brain image.

Acevedo died from coronary heart failure in 2073 at the age of 62. . .

Read the whole thing at the link.

Written by LeisureGuy

16 April 2021 at 12:54 pm

Can a prime number be illegal? Yes.

leave a comment »

See this Wikipedia article, which begins:

An illegal prime is a prime number that represents information whose possession or distribution is forbidden in some legal jurisdictions. One of the first illegal primes was found in 2001. When interpreted in a particular way, it describes a computer program that bypasses the digital rights management scheme used on DVDs. Distribution of such a program in the United States is illegal under the Digital Millennium Copyright Act. An illegal prime is a kind of illegal number.
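The trick is less mysterious than it sounds: any file is just a very large integer written in base 256, and one can search near that integer for a prime. Here is a minimal sketch of the general idea in Python (an illustration of the technique, not the actual 2001 construction), using sympy’s primality test:

```python
from sympy import isprime  # primality test that handles big integers

def embed_in_prime(data: bytes, max_tail_bytes: int = 2) -> tuple[int, int]:
    """Search for a prime whose high-order bytes spell out `data`.

    The payload is shifted left to leave room for a few trailing bytes,
    then we count upward until we hit a prime. By the prime number
    theorem, primes are dense enough that a small tail usually suffices.
    """
    base = int.from_bytes(data, "big")
    for width in range(1, max_tail_bytes + 1):
        shifted = base << (8 * width)
        for tail in range(256 ** width):
            if isprime(shifted + tail):
                return shifted + tail, width
    raise ValueError("no prime found; increase max_tail_bytes")

# Example: hide a tiny "program" (here just a string) inside a prime.
payload = b"print('hello')"
p, width = embed_in_prime(payload)
assert (p >> (8 * width)).to_bytes(len(payload), "big") == payload
print(p)  # a prime number that "contains" the payload
```

The 2001 prime was reportedly found in this spirit: a gzip-compressed copy of the DeCSS source code, extended until the resulting number happened to be prime (gzip decompressors ignore the trailing bytes, so the prime itself decompresses to the forbidden program).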

Written by LeisureGuy

16 April 2021 at 9:29 am

Dark Chocolate and the iKon OC

with 7 comments

The Rooney 2 has a comparatively long loft, which provides great lather capacity and also makes the lather-filled knot gentle on the face. Early on, this was a favorite brush, though I didn’t know why. Now I have a better understanding of the factors I favor in a brush.

The brush quickly made a generous and delicious-smelling lather. I had to add water a couple of times during loading, which I think has more to do with the volume of the knot than with anything about the soap. I start loading with a brush that’s barely damp, and to get enough soap to flow into the (large) knot as I load, more water is needed than for a knot with a short loft. That’s my theory, at any rate.

The iKon open-comb is a marvelous razor — it’s a pleasure to use, and it left my face perfectly smooth and undamaged in any way. A splash of the Dark Chocolate aftershave, and the end of the week is upon us.

Written by LeisureGuy

16 April 2021 at 8:49 am

Posted in Shaving

Vikings Blade Chieftain and l’Occitane Cade

with 9 comments

A while back a reader commented that he likes l’Occitane Cade shaving soap. When I had previously tried it with a badger brush, I found I didn’t get all that great a lather and would use a combination of the soap and l’Occitane Cade shaving cream to make a superlather, which worked well. He suggested that a synthetic knot might solve the problem, and I recalled how Barrister & Mann specifically recommend a synthetic knot for their Reserve line of shaving soaps.

So, nothing loath, I acquired a new puck of l’Occitane Cade shaving soap and tried it this morning, using the most synthetic knot I have, this Yaqi Cashmere knot — not actually cashmere, of course, but just their name for the fiber, probably because of its dense, smooth feel.

And lo! the lather was as good as one could want. (There are potential confounding factors, of course: the maker might have improved the formula in the meantime, and/or I might have become better at loading and lathering.) Still, it was a seriously fine lather, always a pleasure. I’ll try it later with a badger brush to see whether things go as well.

This is a new razor, purchased on Mantic59’s recommendation: the Vikings Blade Chieftain. Mantic59 knows what he’s talking about: this US$25 razor is absolutely first-rate. It is well constructed (and the head covers the blade’s end tabs, something I appreciate), and its feel and performance are uncommonly good: extremely comfortable while still being quite efficient. I will add that the presentation is also excellent: it comes in a very nice box that is suitable for gift-giving.

I easily achieved a BBS result with total comfort. The comfort of the razor encouraged somewhat more rapid strokes, which may help cutting action.

I’m adding this to my list of recommended razors, and thanks to Mantic59 for pointing it out in Sharpologist.

A final rinse, a splash of Cade EDT as an aftershave, and another bright shiny new day begins.

Written by LeisureGuy

15 April 2021 at 8:59 am

Posted in Shaving

“I Fought in Afghanistan. I Still Wonder, Was It Worth It?”

leave a comment »

Timothy Kudo, a former USMC captain who served in Iraq and Afghanistan, writes in the NY Times:

When President Biden announced on Wednesday that the United States would withdraw all its troops from Afghanistan by Sept. 11, 2021, he appeared to be finally bringing this “forever war” to an end. Although I have waited for this moment for a decade, it is impossible to feel relief. The Sept. 11 attacks took place during my senior year of college, and the wars in Iraq and Afghanistan that followed consumed the entirety of my adult life. Although history books may mark this as the end of the Afghanistan war, it will never be over for many of my generation who fought.

Sometimes there are moments, no more than the span of a breath, when the smell of it returns and once again I’m stepping off the helicopter ramp into the valley. Covered in the ashen dust of the rotor wash, I take in for the first time the blend of wood fires burning from inside lattice-shaped mud compounds, flooded fields of poppies and corn, the sweat of the unwashed and the wet naps that failed to mask it, chicken and sheep and the occasional cow, the burn pit where trash and plastic smoldered through the day, curries slick with oil eaten by hand on carpeted dirt floors, and fresh bodies buried shallow, like I.E.D.s, in the bitter earth.

It’s sweet and earthy, familiar to the farm boys in the platoon who knew that blend of animal and human musk but alien to those of us used only to the city or the lush Southern woods we patrolled during training. Later, at the big bases far from the action, surrounded by gyms and chow halls and the expeditionary office park where the flag and field grade officers did their work, it was replaced by a cologne of machinery and order. Of common parts installed by low-bid contractors and the ocher windblown sand of the vast deserts where those behemoth bases were always located. Relatively safe after the long months at the frontier but dull and lifeless.

Then it’s replaced by the sweet, artificial scents of home after the long plane ride back. Suddenly I’m on a cold American street littered with leaves. A couple passes by holding hands, a bottle of wine in a tote bag, dressed for a party, unaware of the veneer that preserves their carelessness.

I remain distant from them, trapped between past and present, in the same space you sometimes see in the eyes of the old-timers marching in Veterans Day parades with their folded caps covered in retired unit patches, wearing surplus uniforms they can’t seem to take off. It’s the space between their staring eyes and the cheering crowd where those of us who return from war abide.

My war ended in 2011, when I came home from Afghanistan eager to resume my life. I was in peak physical shape, had a college degree, had a half-year of saved paychecks and would receive an honorable discharge from the Marine Corps in a few months. I was free to do whatever I wanted, but I couldn’t bring myself to do anything.

Initially I attributed it to jet lag, then to a need for well-deserved rest, but eventually there was no excuse. I returned to my friends and family, hoping I would feel differently. I did not.

“Relax. You earned it,” they said. “There’s plenty of time to figure out what’s next.” But figuring out the future felt like abandoning the past. It had been just a month since my last combat patrol, but I know now that years don’t make a difference.

At first, everyone wanted to ask about the war. They knew they were supposed to but approached the topic tentatively, the way you hold out a hand to an injured animal. And as I went into detail, their expressions changed, first to curiosity, then sympathy and finally to horror.

I knew their repulsion was only self-preservation. After all, the war cost nothing to the civilians who stayed home. They just wanted to live the free and peaceful lives they’d grown accustomed to — and wasn’t their peace of mind what we fought for in the first place?

After my discharge, I moved to . . .

Continue reading. There’s more.

Justin Bieber today is stunning

leave a comment »

Just read Zach Baron’s interview with him in GQ:

Justin Bieber and I have just met when I ask him something and he talks and talks—for 10 illuminating and uninterrupted minutes he talks. He talks about God and faith and castles in Ireland, about shame and drugs and marriage. He talks about what it is to feel empty inside, and what it is to feel full. At one point he says, “I’m going to wrap it up here,” but he doesn’t, he just keeps going, and that is what it is like to talk to Justin Bieber now. Like you’re in the confessional booth with him. Like whatever rules about “privacy” or the thick opaque wall of massive celebrity that people like Bieber are supposed to follow don’t apply.

He has lived a well-documented life—maybe among the more well-documented lives in the history of this decaying planet. But to my knowledge, there is not one example of him speaking this way—in a moving but unprompted, unselfconscious torrent of words—in public prior to this moment. I will admit to being disoriented. If I’m being honest, I had been expecting someone else entirely—someone more monosyllabic; someone more distracted, more unhappy; someone more like the guy I’m pretty sure Justin Bieber was not all that long ago—and now I am so thrown that the best I can do is stammer out some tortured version of… How did you become this person? By which I mean: seemingly guileless. Bursting with the desire to connect, to tell his own story, in case it might be of use to anyone else.

It’s a question that’s not even a question, really. But what Bieber gently says in response is: “That’s okay.”

He knows approximately what I’m asking—how he got from wherever he was to here, to becoming the man in front of me, clear-eyed on a computer screen from an undisclosed location in Los Angeles. His hair, under a Vetements hat, is long in the back; he is in no particular hurry. He is married to a woman—Hailey Baldwin Bieber—who cares for him like no one has ever cared for him, he says. He is happy. He is currently renovating the house in which he will live happily with his wife. He’s spent the past several months piecing together a new record, Justice, which is dense with love songs and ’80s-style anthems—interspersed with some well-intentioned, if not totally well-advised, interludes featuring the voice of Martin Luther King Jr.—that are bluntly honest about his bad past and equally optimistic about his future. (“Everybody saw me sick, and it felt like no one gave a shit,” he sings on the cathartic last song on the record, “Lonely.”) He’s still so overflowing with music that he puts out Freedom, a meditative postscript of an EP about faith, just a few weeks after Justice. He is, if anything, the empathetic professional in this interaction too as he goes about trying to help me understand how he’s arrived at where he’s arrived. . .

Continue reading. And do read the whole thing. He does explain well how, and where, he arrived.

Written by LeisureGuy

14 April 2021 at 3:24 pm
