Later On

A blog written for those whose interests more or less match mine.

Archive for November 12th, 2020

Trump’s big election lie pushes America toward autocracy

Timothy Snyder, a professor of history at Yale University and the author of On Tyranny: Twenty Lessons From the Twentieth Century, writes in the Boston Globe:

When you lose, it is good and healthy to know why. In the First World War, the conflict that defined our modern world, the Germans lost because of the overwhelming force assembled by their enemies on the Western Front. After the Americans entered the war, German defeat was a matter of time. Yet German commanders found it convenient instead to speak of a “stab in the back” by leftists and Jews. This big lie was a problem for the new German democracy that was created after the war, since it suggested that the major political party, the Social Democrats, and a national minority, the Jews, were outside the national community. The lie was taken up by the Nazis, and it became a central element of their version of history after they took power. The blame was elsewhere.

It is always tempting to blame defeat on others. Yet for a national leader to do so and to inject a big lie into the system puts democracy at great risk. Excluding others from the national community makes democracy impossible in principle, and refusing to accept defeat makes it impossible in practice. What we face now in the United States is a new, American incarnation of the old falsehood: that Donald Trump’s defeat was not what it seems, that votes were stolen from him by internal enemies — by a left-wing party. “Where it mattered, they stole what they had to steal,” he tweets. He claims that his votes were all “Legal Votes,” as if by definition those for his opponent were not.

Underestimating Donald Trump is a mistake that people should not go on making. Laughing at him will not make him go away. If it did, he would have vanished decades ago. Nor will longstanding norms about how presidents behave make him go away. He is an actor and will stick to his lines: It was all a fraud, and he won “by a lot.” He was never defeated, goes the story; he was a victim of a conspiracy. This stab-in-the-back myth could become a permanent feature of American politics, so long as Trump has a bullhorn, be it on Fox or on RT (formerly Russia Today) — or, though Democrats might find this unthinkable, as an unelected president remaining in power.

After all, a claim that an election was illegitimate is a claim to remaining in power. A coup is under way, and the number of participants is not shrinking but growing. Few leading Republicans have acknowledged that the race is over. Important ones, such as Mitch McConnell and Mike Pompeo, appear to be on the side of the coup. We might like to think that this is all some strategy to find the president an exit ramp. But perhaps that is wishful thinking. The transition office refuses to begin its work. The secretary of defense, who did not want the army attacking civilians, was fired. The Department of Justice, exceeding its traditional mandate, has authorized investigations of the vote count. The talk shows on Fox this week contradict the news released by Fox last week. Republican lawmakers find ever new verbal formulations that directly or indirectly support Trump’s claims. The longer this goes on, the greater the danger to the Republic.

What Trump is saying is false, and Republican politicians know it. If the votes against the president were fraudulent, then Republican wins in the House and Senate were also fraudulent: The votes were on the same ballots. Yet conspiracy theories, such as the stab in the back, have a force that goes beyond logic. They push away from a world of evidence and toward a world of fears. Psychological research suggests that citizens are especially vulnerable to conspiracy theories at the time of elections. Trump understands this, which is why his delivery of conspiracy theory is full of capital letters and bereft of facts. He knows better than to try to prove anything. His ally Newt Gingrich reaches for the worst when he blames a wealthy Jew for something that did not happen in the first place.

History shows where this can go. If people believe an election has been stolen, that makes the new president a usurper. In Poland in 1922, a close election brought a centrist candidate to the presidency. Decried by the right in the press as an agent of the Jews, he was assassinated after two weeks in office. Even if the effect is not so immediate, the lingering effect of a myth of victimhood, of the idea of a stab in the back, can be profound. The German myth of a stab in the back did not doom German democracy immediately. But the conspiracy theory did help Nazis make their case that some Germans were not truly members of the nation and that a truly national government could not be democratic.

Democracy can be buried in a big lie. Of course, the end of democracy in America would take an American form. In 2020 Trump acknowledged openly what has been increasingly clear for decades: The Republican Party aims not so much to win elections as to game them. This strategy has its temptations: The more you care about suppressing votes, the less you care about what voters want. And the less you care about what voters want, the closer you move to authoritarianism. Trump has taken the next logical step: Try to disenfranchise voters not only before but after elections.

The results of the 2020 elections could be read to mean that Republicans can fight and win on the issues. Reading the results as fraudulent instead will take Republicans, and the country, on a very different journey, through a cloud of magical thinking toward violence.

If you have been stabbed in the back, then everything is permitted. Claiming that a fair election was foul is preparation for an election that is foul. If you convince your voters that the other side has cheated, you are promising them that you yourself will cheat next time. Having bent the rules, you then have to break them. History shows the danger in . . .

Continue reading.

Written by Leisureguy

12 November 2020 at 1:23 pm

Ron Klain, back in January: Coronavirus Is Coming—And Trump Isn’t Ready

President-Elect Joe Biden has picked Ron Klain as his chief of staff. Klain wrote this in the Atlantic in January:

We all knew the moment would come. It could have been over Iran or North Korea, a hurricane or an earthquake. But it may be the new coronavirus out of China that tests whether President Donald Trump can govern in a crisis—and there is ample reason to be uneasily skeptical.

The U.S. government has the tools, talent, and team to help fight the coronavirus abroad and minimize its impact at home. But the combination of Trump’s paranoia toward experienced government officials (who lack “loyalty” to him), inattention to detail, opinionated rejection of science and evidence, and isolationist instincts may prove toxic when it comes to managing a global-health security challenge. To succeed, Trump will have to trust the kind of government experts he has disdained to date, set aside his own terrible instincts, lead from the White House, and work closely with foreign leaders and global institutions—all things he has failed to do in his first 1,200 days in office.

We do not know yet how grave a threat the new coronavirus will turn out to be. On the one hand, scientists have quickly sequenced the virus and are working on a vaccine. China has imposed draconian quarantines to slow the virus’s spread, and is rapidly building massive new hospitals to treat its victims. To date, the U.S. has seen only a handful of cases, all of them the product of travel to China, not transmission here. These are causes for concern, but not overwrought fear.

But on the other hand, there are some worrisome developments. Models suggest that the cases in China may number in the hundreds of thousands—many times what the government has reported. Perhaps a million or more people left Wuhan before the quarantines, and could be spreading the virus widely. Other countries are reporting cases of the virus among people who were not in China; there are even reports that individuals may be infectious before the onset of symptoms (a substantial complication to traditional public-health screening). And the economic impact of a massive epidemic in China on the global economy is difficult to predict.

What will Trump do about it? His track record offers us two data points, one horrible and one merely disappointing.

Trump briefly withdrew from politics after his “birther” campaign against President Barack Obama was discredited, but his next big public splash was a virulent, xenophobic, fearmongering outburst over the West African Ebola epidemic of 2014. Trump’s numerous tweets—calling Obama a “dope” and “incompetent” for his handling of the epidemic—were both wrongheaded and consequential: One study found that Trump’s tweets were the single largest factor in panicking the American people in the fall of 2014. How paranoid and cruel was Trump? He blasted Obama for evacuating an American missionary back to the United States when that doctor contracted Ebola while fighting the disease in Africa. Fortunately, Obama ignored Trump’s protests, and Kent Brantly was successfully treated in the U.S.; he continues doing good works today.

Obama’s strategy for combatting Ebola in West Africa—interventionist, aggressive, science-based—made a huge contribution to a global response that saved hundreds of thousands of lives, and protected the U.S. from an outbreak. Tom Friedman wrote that it was perhaps Obama’s “most significant foreign policy achievement, for which he got little credit precisely because it worked … [showing] that without America as quarterback, important things that save lives … often don’t happen.”

As president, Trump’s own handling of the second-worst Ebola outbreak in history—ongoing in Congo—has been more mixed. Regrettably, after four U.S. soldiers were killed in Niger in 2017, Trump imposed an isolationist edict that no U.S. personnel are allowed to be in harm’s way in fighting the disease. Top American experts who were in and near the disease “hot zone” in Congo have been withdrawn. Moreover, while the U.S. has sent aid, it has provided only a fraction of the assistance offered in past global-health emergencies, a step back from the leadership demonstrated by prior Democratic and Republican administrations. Within these constraints, however, Trump has allowed the experts at the U.S. Agency for International Development and the Centers for Disease Control to provide assistance; most surprisingly, Trump even allowed the medical evacuation of a possible Ebola case to the U.S. for treatment.

The record on the Congo response is uneven: As long as Ebola in Congo is not in the news, the White House allows the bureaucracy to do its job, albeit within a limited range of action and with less than robust U.S. participation. But escalation of the coronavirus epidemic, and the elevated level of public attention, may lead Trump to depart from his usual indifference to the functioning of government and choose to assume personal leadership of his administration’s response.

Some of the world’s leading infectious-disease experts continue to serve in the administration, led by the incomparable Tony Fauci at the National Institutes of Health, and the level-headed Anne Schuchat at the CDC. These two, along with other leaders at key science agencies (and scores of men and women working for them), have decades of experience serving under presidents of both parties, and are among the world’s best at what they do.

But Trump’s war on government has decimated crucial functions in other key agencies. Smart and effective border screening will be a key tool in the response; there is scarcely a single competent or experienced leader left at the Department of Homeland Security. While USAID is in solid hands under Administrator Mark Green, it is stuck inside Mike Pompeo’s State Department, which has been purged of the many skilled administrators who play a role in facilitating foreign-disaster response. Trump’s poor choices for many ambassadorial posts, and harsh treatment of the Foreign Service, may create holes in our on-the-scene leadership as the disease spreads: During the West African Ebola epidemic, career Foreign Service ambassadors were important players in the response.

The biggest gap, of course, is at the White House itself.

At the end of the West African Ebola epidemic in 2015, President Obama accepted my recommendation to set up a permanent directorate at the National Security Council to coordinate government-wide pandemic preparedness and response. For the first year of his presidency, Trump kept that structure, and put the widely respected Admiral Tim Ziemer, a veteran of the George W. Bush administration, in charge of the unit. But in July 2018, John Bolton took over the NSC, disbanded the unit, and relegated Ziemer to a staff job at the State Department. The administration described that as a move to “streamline” the NSC, while critics charged that Bolton was too focused on hard-power threats.

To date, the Trump administration has resisted reversing this decision—either permanently, or on an ad hoc basis for the coronavirus response. Standing up a unit at NSC would require bringing in career staff to work there, and Trump’s paranoia about having such government veterans in the White House weighs against the move. But perhaps just as important, greater White House involvement in managing the response to pandemics would likely mean greater personal involvement by Trump. And in that regard, senior officials in government agencies may have a view of presidential engagement not unlike Fiddler on the Roof’s prayer for the czar: “May the Lord bless and keep him … far away from us.”

In his press conference on Tuesday, . . .

Continue reading.

Written by Leisureguy

12 November 2020 at 1:12 pm

Where loneliness can lead: Hannah Arendt enjoyed her solitude, but . . .

Samantha Rose Hill, assistant director of the Hannah Arendt Center for Politics and Humanities, visiting assistant professor of politics at Bard College in New York, and associate faculty at the Brooklyn Institute for Social Research in New York City, writes in Aeon:

What prepares men for totalitarian domination in the non-totalitarian world is the fact that loneliness, once a borderline experience usually suffered in certain marginal social conditions like old age, has become an everyday experience …
– From The Origins of Totalitarianism (1951) by Hannah Arendt

‘Please write regularly, or otherwise I am going to die out here.’ Hannah Arendt didn’t usually begin letters to her husband this way, but in the spring of 1955 she found herself alone in a ‘wilderness’. After the publication of The Origins of Totalitarianism, she was invited to be a visiting lecturer at the University of California, Berkeley. She didn’t like the intellectual atmosphere. Her colleagues lacked a sense of humour, and the cloud of McCarthyism hung over social life. She was told there would be 30 students in her undergraduate classes: there were 120, in each. She hated being on stage lecturing every day: ‘I simply can’t be exposed to the public five times a week – in other words, never get out of the public eye. I feel as if I have to go around looking for myself.’ The one oasis she found was in a dockworker-turned-philosopher from San Francisco, Eric Hoffer – but she wasn’t sure about him either: she told her friend Karl Jaspers that Hoffer was ‘the best thing this country has to offer’; she told her husband Heinrich Blücher that Hoffer was ‘very charming, but not bright’.

Arendt was no stranger to bouts of loneliness. From an early age, she had a keen sense that she was different, an outsider, a pariah, and often preferred to be on her own. Her father died of syphilis when she was seven; she faked all manner of illnesses to avoid going to school as a child so she could stay at home; her first husband left her in Berlin after the burning of the Reichstag; she was stateless for nearly 20 years. But, as Arendt knew, loneliness is a part of the human condition. Everybody feels lonely from time to time.

Writing on loneliness often falls into one of two camps: the overindulgent memoir, or the rational medicalisation that treats loneliness as something to be cured. Both approaches leave the reader a bit cold. One wallows in loneliness, while the other tries to do away with it altogether. And this is in part because loneliness is so difficult to communicate. As soon as we begin to talk about loneliness, we transform one of the most deeply felt human experiences into an object of contemplation, and a subject of reason. Language fails to capture loneliness because loneliness is a universal term that applies to a particular experience. Everybody experiences loneliness, but they experience it differently.

As a word, ‘loneliness’ is relatively new to the English language. One of the first uses was in William Shakespeare’s tragedy Hamlet, which was written around 1600. Polonius beseeches Ophelia: ‘Read on this book, that show of such an exercise may colour your loneliness.’ (He is counselling her to read from a prayer book, so no one will be suspicious of her being alone – here the connotation is of not being with others rather than any feeling of wishing that she was.)

Throughout the 16th century, loneliness was often evoked in sermons to frighten churchgoers from sin – people were asked to imagine themselves in lonely places such as hell or the grave. But well into the 17th century, the word was still rarely used. In 1674, the English naturalist John Ray included ‘loneliness’ in a list of infrequently used words, and defined it as a term to describe places and people ‘far from neighbours’. A century later, the word hadn’t changed much. In Samuel Johnson’s A Dictionary of the English Language (1755), he described the adjective ‘lonely’ solely in terms of the state of being alone (the ‘lonely fox’), or a deserted place (‘lonely rocks’) – much as Shakespeare used the term in the example from Hamlet above.

Until the 19th century, loneliness referred to an action – crossing a threshold, or journeying to a place outside a city – and had less to do with feeling. Descriptions of loneliness and abandonment were used to rouse the terror of nonexistence within men, to get them to imagine absolute isolation, cut off from the world and God’s love. And in a certain way, this makes sense. The first negative word spoken by God about his creation in the Bible comes in Genesis after he made Adam: ‘And the Lord God said, “It is not good that man is alone; I shall make him a helpmate opposite him.”’

In the 19th century, amid modernity, loneliness lost its connection with religion and began to be associated with secular feelings of alienation. The use of the term began to increase sharply after 1800 with the arrival of the Industrial Revolution, and continued to climb until the 1990s, when it levelled off, rising again during the first decades of the 21st century. Loneliness took up character and cause in Herman Melville’s ‘Bartleby, the Scrivener: A Story of Wall Street’ (1853), the realist paintings of Edward Hopper, and T S Eliot’s poem The Waste Land (1922). It was engrained in the social and political landscape, romanticised, poeticised, lamented.

But in the middle of the 20th century, Arendt approached loneliness differently. For her, it was both something that could be done and something that was experienced. In the 1950s, as she was trying to write a book about Karl Marx at the height of McCarthyism, she came to think about loneliness in relationship to ideology and terror. Arendt thought the experience of loneliness itself had changed under conditions of totalitarianism:

What prepares men for totalitarian domination in the non-totalitarian world is the fact that loneliness, once a borderline experience usually suffered in certain marginal social conditions like old age, has become an everyday experience of the ever-growing masses of our century.

Totalitarianism in power found a way to crystallise the occasional experience of loneliness into a permanent state of being. Through the use of isolation and terror, totalitarian regimes created the conditions for loneliness, and then appealed to people’s loneliness with ideological propaganda.

Before Arendt left to teach at Berkeley, she’d published an essay on ‘Ideology and Terror’ (1953) dealing with isolation, loneliness and solitude in a Festschrift for Jaspers’s 70th birthday. This essay, alongside her book The Origins of Totalitarianism, became the foundation for her oversubscribed course at Berkeley, ‘Totalitarianism’. The class was divided into four parts: the decay of political institutions, the growth of the masses, imperialism, and the emergence of political parties as interest-group ideologies. In her opening lecture, she framed the course by reflecting on how the relationship between political theory and politics has become doubtful in the modern age. She argued that there was an increasing, general willingness to do away with theory in favour of mere opinions and ideologies. ‘Many,’ she said, ‘think they can dispense with theory altogether, which of course only means that they want their own theory, underlying their own statements, to be accepted as gospel truth.’

Arendt was referring to the way in which ‘ideology’ had been used as a desire to divorce thinking from action – ‘ideology’ comes from the French idéologie, and was first used during the French Revolution, but didn’t become popularised until the publication of Marx and Friedrich Engels’s The German Ideology (written in 1846) and later Karl Mannheim’s Ideology and Utopia (1929), which she reviewed for Die Gesellschaft in 1930.

In 1958, a revised version of ‘Ideology and Terror’ was added as a new conclusion to the second edition of The Origins of Totalitarianism.

Origins is a 600-page work divided into three sections on . . .

Continue reading. There’s much more, and it’s quite relevant today in the US, as more people feel isolated and lonely because of the pandemic lockdowns and as totalitarian pressures emerge from the Right.

Written by Leisureguy

12 November 2020 at 12:37 pm

Why Did America Give Up on Mass Transit? (Don’t Blame Cars.)

Jonathan English had an interesting article in Bloomberg CityLab back in August 2018, and it is still relevant. He wrote:

One hundred years ago, the United States had a public transportation system that was the envy of the world. Today, outside a few major urban centers, it is barely on life support. Even in New York City, subway ridership is well below its 1946 peak. Annual per capita transit trips in the U.S. plummeted from 115.8 in 1950 to 36.1 in 1970, where they have roughly remained since, even as population has grown.

This has not happened in much of the rest of the world. While a decline in transit use in the face of fierce competition from the private automobile throughout the 20th century was inevitable, near-total collapse was not. At the turn of the 20th century, when transit companies’ only competition was the legs of a person or a horse, they worked reasonably well, even if they faced challenges. Once cars arrived, nearly every U.S. transit agency slashed service to cut costs, instead of improving service to stay competitive. This drove even more riders away, producing a vicious cycle that led to the point where today, few Americans with a viable alternative ride buses or trains.

Now, when the federal government steps in to provide funding, it is limited to big capital projects. (Under the Trump administration, even those funds are in question.) Operations—the actual running of buses and trains frequently enough to appeal to people with an alternative—are perpetually starved for cash. Even transit advocates have internalized the idea that transit cannot be successful outside the highest-density urban centers.

And it very rarely is. Below is a set of maps that show the present-day network of rail and bus lines operating at least every 30 minutes, all day to midnight, seven days a week, for five urban areas in the U.S. and one in Canada for comparison. That could be considered the bare-minimum service level required for people to be able to live adequately car-free. In fact, research says that frequencies of 15 minutes or better—good enough for people to turn up and go without consulting a schedule—are where the biggest jumps in ridership happen. But that is so far off from service levels in most American cities that a 30-minute standard is more appropriate.

The maps illustrate the vast swaths of urban areas untouched by full-service bus routes. For those who do live near one, it’s quite likely that the bus wouldn’t get them where they need to go, unless their destination is downtown. A bus that comes once an hour, stops at 7 pm, and doesn’t run on Sundays—a typical service level in many American cities—restricts people’s lives so much that anyone who can drive, will drive. That keeps ridership per capita low.

What happened? Over the past hundred years the clearest cause is this: Transit providers in the U.S. have continually cut basic local service in a vain effort to improve their finances. But they only succeeded in driving riders and revenue away. When the transit service that cities provide is not attractive, the demand from passengers that might “justify” its improvement will never materialize.

Here’s how this has played out, era by era. In a companion article, I look at how differently things unfolded in other parts of the world.

THE AGE OF RAIL
1850s to 1930s

The decades at the turn of the century were a time of massive transit infrastructure growth in the United States, carried out primarily by private companies with some municipal subsidy. Much of New York City and Philadelphia’s subways, Chicago’s ‘L,’ and Boston’s ‘T’ were built in this era. Huge networks of “interurbans”—a kind of streetcar that ran deep into rural areas—spread out from cities across the country. “Streetcar suburbs” grew outward along main streets, allowing middle-class people to buy homes while still easily getting to jobs downtown. . .

Continue reading. There’s much more and it’s quite interesting.

Written by Leisureguy

12 November 2020 at 11:29 am

11 Misconceptions About Ancient Rome, Debunked

Mark Mancini writes in Mental Floss:

Released in 1959, Charlton Heston’s Ben-Hur is considered one of the greatest motion pictures of all time. Unfortunately, the film helped perpetuate a few mistaken beliefs concerning Rome and her citizenry. With the Ben-Hur remake set to hit theaters on August 19, now seems like a good time to bust some myths.

1. ROMANS DIDN’T WEAR TOGAS 24-7.

In Virgil’s epic poem The Aeneid, Jupiter speaks of the future of the Romans as the “masters of the world, the race that wears the toga.” No article of clothing has ever been more synonymous with this ancient culture. Only a Roman citizen could legally wear one, and as years went by, different styles came to be used as a way of displaying the wearer’s socioeconomic status. But for most of Rome’s history, togas were not considered everyday attire.

At first, the toga emphasized function over form. During the Republic’s early days, men, women, and children alike wore these accessories as a kind of durable outerwear. Underneath, they’d don a tunic, which was a sleeved, t-shaped garment that extended from the collar to the knees. Inevitably, though, the region’s fashion standards evolved. By the 2nd century BCE, it became taboo for adult women to put on a toga (prostitutes and adulteresses notwithstanding). Within the next hundred years, the toga turned into a bulky, impractical article of clothing that was mostly reserved for formal occasions like religious services and funerals. In casual environments, the average male Roman citizen would instead wear one of his tunics, sans toga.

Because togas were made with large quantities of costly wool, they were also quite expensive. The Roman poet Juvenal once observed that “there are many parts of Italy, to tell the truth, in which no man puts on the toga until he is dead.” Toward the dawn of the 4th century CE, the toga was more or less replaced by a kind of cloak called the paenula.

2. CONTRARY TO POPULAR BELIEF, IT LOOKS LIKE THE “NAZI SALUTE” WASN’T INVENTED IN ROME.

You’ll often hear it said that the Romans created this now-infamous gesture. Supposedly, it was then copied by Adolf Hitler’s devotees many centuries later. The whole myth is so widespread that the motion is sometimes referred to as the “Roman salute.” And yet there’s no historical evidence to suggest that such a greeting was ever used in ancient Rome.

Instead, the salute can probably be traced back to a 1784 painting called The Oath of the Horatii [see above – LG]. Created by French Neoclassicist Jacques-Louis David, it shows three Roman brothers pledging to defend their homeland. While the men do so, we see that they’ve raised their right arms and extended the fingers. Over the next century, other artists started to portray Romans in this pose and playwrights began writing it into their historical drama scripts.

Mussolini’s Italian Fascist Party later claimed the salutation as its own and celebrated the gesture’s allegedly Roman origins. Inspired by il Duce, Hitler created a German variant for his own fascist organization. “I introduced the salute into the Party at our first meeting in Weimar,” he recalled in 1942. “The S.S. at once gave it a soldierly style.”

3. WE DON’T KNOW WHAT JULIUS CAESAR’S LAST WORDS WERE.

But they probably weren’t “Et tu, Brute?” On March 15 in the year 44 BCE, Julius Caesar was murdered by a group of over 60 co-conspirators, one of whom was Marcus Junius Brutus, the son of the dictator’s longtime mistress. The Roman historian Suetonius later wrote that, according to bystanders, Caesar’s dying utterance was “Kai su, teknon?” which means “You too, child?” in Greek. For the record, however, both Suetonius and another scholar named Plutarch believed that when he was slain, the dictator didn’t say anything at all. The world-famous “Et tu, Brute?” line was made up by William Shakespeare.

4. NOT ALL GLADIATORS WERE SLAVES OR PRISONERS … OR MEN. . . .

Continue reading.

Written by Leisureguy

12 November 2020 at 11:12 am

Posted in Daily life, History

Pro 48 and Cavendish CK-6, with the Rockwell Model T

The handle of the Omega Pro 48 (10048) is a distant cousin of the Simpson Emperor handle: a bulbous base (though slight in the Pro 48) and a ridge that demarcates the base from the ball containing the knot. While the Emperor’s handle is a solid resin, the Pro 48’s handle is a lesser plastic and feels hollow (because it is relatively light). Of course, given that the Emperor 3 is $190 and the Pro 48 is $13.75, some differences are to be expected.

The Pro 48 handle does, however, work quite well as a handle — nothing to write home about, but it gets the job done — and I continue to love the feel and action of the long-lofted knot, now well broken in. The brush loaded easily from the CK-6 formulation of Cavendish, an old favorite from when the company first began, and between brush and soap, lathering was both a tactile and an olfactory pleasure.

The Rockwell Model T does feel large, but I had no trouble at all getting a fine result. Some have said that the head size makes shaving under the nose difficult, but they must use a different technique than I do, for I find it no problem at all. Three passes left my skin smooth and — thanks in part to CK-6 — soft.

A splash of Cavendish aftershave, and the day is well begun.

Written by Leisureguy

12 November 2020 at 8:49 am

Posted in Shaving
