Later On

A blog written for those whose interests more or less match mine.

Archive for the ‘History’ Category

The GOP’s war against poor Americans

leave a comment »

Heather Cox Richardson has a good column on the origin of the GOP’s hostile attitude toward the public good. The column begins:

This morning, as expected, the House Republicans elected Elise Stefanik (R-NY), Trump’s choice for conference chair, to replace Representative Liz Cheney (R-WY). This means that the four top House Republican leaders—Minority Leader Kevin McCarthy (R-CA), Minority Whip Steve Scalise (R-LA), Stefanik, and Policy Committee Chair Gary Palmer (R-AL)—all voted to overturn Biden’s 2020 victory after the January 6 attack on the Capitol.

Stefanik thanked “President Trump for his support,” saying “he is a critical part of our Republican team.” She went on to say that “House Republicans are united in our fight to save our country from the radical Socialist Democrat agenda of President Biden and Nancy Pelosi.”

Today’s vote confirmed that the leaders of the current Republican Party are willing to abandon democracy in order to save the country from what they call “socialism.”

But what Republicans mean when they say “socialism” is not the political system most countries recognize when they use that word: one in which the people, through their government, own the means of production. What Republicans mean comes from America’s peculiar history after the Civil War, when new national taxation coincided with the expansion of voting to include Black men.

In the years just after the firing stopped, white southerners who hated the idea that Black men could use the vote to protect themselves terrorized their Black neighbors. Pretending to be the ghosts of dead Confederate soldiers, they dressed in white robes with hoods to cover their faces and warned formerly enslaved people not to show up at the polls. But in 1870, Congress created the Department of Justice, and President U.S. Grant’s attorney general set out to destroy the Ku Klux Klan.

In 1871, southern leaders changed their tactics. The same men who had vowed that Black people would never be equal to whites began to say that their objection to Black voting was not based on race. No, they said, their objection was that Black people were poor and uneducated and would elect lawmakers who promised to give them things—hospitals, and roads, and schools—that could be paid for only through tax levies on people with property: white men. In this formulation, voting was not a means to ensuring equality; it was a redistribution of wealth from hardworking white men to African Americans who wanted a handout. Black voting meant “socialism,” and it would destroy America.

With this argument, northerners who had fought alongside Black colleagues and insisted they must be equal before the law on racial grounds were willing to see Black men kept from the polls. Black voting, which northerners had recognized as key to African Americans being able to protect their interests—and, for that matter, to defend the national government from the former Confederates who still wanted to destroy it—slowed. And then it stopped.

The South became a one-party state ruled by a small elite class, defined by white supremacy, and mired in poverty. For its part, the North also turned on workers, undermining the labor movement and focusing on protecting the new industrial factories whose owners claimed they were the ones driving the economy.

In the 1930s, the Great Depression changed this equation. When the bottom fell out of the economy, Democrats under Franklin Delano Roosevelt transformed the government to regulate business, provide a basic social safety net, and promote infrastructure. As early as 1937, Republican businessmen and southern Democrats began to talk of coming together to stop what they considered socialism. But most Americans liked this New Deal, and its opponents had little hope of attracting enough voters to stop its expansion.

That equation changed after World War II, when Presidents Harry Truman and Dwight Eisenhower began to use the government to advance racial equality. Truman’s 1948 desegregation of the military prompted southern Democrats to form their own short-lived segregationist party. The Supreme Court’s 1954 Brown v. Board of Education of Topeka, Kansas, decision declaring segregation in public schools unconstitutional enabled opponents of the new government system to tie racism to their cause. They warned that the expanded government meant the expensive protection of Black rights, which cost tax dollars. They argued it was simply a redistribution of wealth, just as their counterparts had done in the Reconstruction South.

With the passage of the 1965 Voting Rights Act, that. . .

Continue reading. The history is interesting and relevant. She concludes the column:

. . . With the election of Democrat Joe Biden and Vice President Kamala Harris, along with a Democratic Congress, the leadership of the Republican Party has taken the next step. They are rejecting the legitimacy of the election, doubling down on Trump’s Big Lie that he won. Claiming to want to combat “voter fraud,” they are backing bills across the country to suppress Democratic voting, making sure that no one but a Republican can win an election.

Just as white southerners argued after the Civil War, Republican leaders claim to be acting in the best interests of the nation. They are standing firm against “the radical Socialist Democrat agenda,” making sure that no wealthy person’s tax dollars go to schools or roads or social programs.

They are “saving” America, just as white supremacists “saved” the Jim Crow South.

Written by Leisureguy

15 May 2021 at 10:08 am

A GOP Civil War? Don’t Bet On It.

leave a comment »

Jeff Greenfield, five-time Emmy-winning network television analyst and author, has a piece in Politico that is somewhat depressing because it seems valid. It begins:

If you’ve been reading the coverage lately, or listened to gloating Democrats, it’s easy to believe the Republican Party is eating itself alive.

The former Republican president literally campaigns against incumbents of his own party. NBC calls it a “GOP power struggle”; The Hill describes “deep rifts”; and the Democratic National Committee exults over “a GOP civil war.” After losing the White House, the House and the Senate, its congressional leadership is now in open conflict; Wednesday, the minority leader is expected to oust his No. 3.

Among Democrats painfully aware of their tiny or non-existent margins in the House and Senate, the prospect of a divided Republican Party offers hope that this “civil war” will redound to Democrats’ advantage in 2022.

They shouldn’t be so sure.

First, beyond a few spats that make headlines, it’s getting harder to detect any serious division among rank-and-file Republicans. In Congress, and at the grassroots, the dominance of Donald Trump over the party is more or less total. The small handful who denounced the former president for his massive lies about the election and his seeding of an insurrectionist riot are now either silent, or have embraced a mealy-mouthed argument for “election integrity.” The same state officials who pushed back against Trump’s attempt to overturn November’s results have embraced a series of restrictive voting measures ostensibly designed to combat non-existent “fraud,” all aimed at hobbling voters inclined to vote for Democrats. Mitch McConnell, who denounced Trump’s behavior in high-minded tones in the aftermath of the riot, also—on the exact same day—voted to exonerate him of wrongdoing.

Second, and more significant, history is littered with times that critics on the left, and in the pundit class, were positive the Republican Party was setting itself up for defeat by embracing its extremes … only to watch the party comfortably surge into power. This time there are structural advantages as well: Given the Republican advantages in the House (through gerrymandering, and the statistically “wasted” votes in landslide Democratic districts), in the Senate, in state legislatures and in the Electoral College, a Trump-dominated Republican Party is a strong contender to take the White House next time around. And, contrarian as it may seem, the lockstep devotion to the former president may actually enhance, rather than lessen, its chances. What we’re seeing isn’t a civil war. It’s a purge, and there’s every reason to believe it will work.

This is not the conclusion you’ll reach if you follow much of the mainstream press. A New York Times story on Saturday about Trump’s hold on the GOP quoted former Rep. Barbara Comstock, former Sen. Jeff Flake, GOP consultant Sarah Longwell and Republican strategist Scott Reed, all warning of the political danger of a Trumpcentric party. These are estimable public figures, none of whom remotely speaks for the Republican base. For the past few weeks, much media attention was focused on Michael Wood, the 34-year-old veteran running for a Texas seat with a message that the Republican Party had to move away from Trump. He wound up finishing ninth, with 3 percent of the vote.

For a broader measure of just how one-sided the “civil war” is, you don’t need to stop at the behavior of House Republicans, who are poised to defenestrate Liz Cheney from her leadership post, and who overwhelmingly voted in January to block the certification of electors. A far better picture emerges when . . .

Continue reading. There’s more.

Written by Leisureguy

14 May 2021 at 12:12 pm

The Wootz Hunter

leave a comment »

Almost three years ago, I blogged an extract from an article in Craftsmanship about a farrier who rediscovered (with the help of a professor of metallurgy) how to make wootz steel, a secret lost around the 13th century. I rewatched the video that accompanied the article and again found it fascinating (as is the article itself), so I thought I’d point it out to new readers who are interested in fine steel and sharp objects. A great read and an interesting video.

UPDATE: I found a longer and more detailed video on Wootz steel:

Written by Leisureguy

14 May 2021 at 10:21 am

The Republican party has demonstrably lost its collective mind (and conscience)

leave a comment »

Heather Cox Richardson points out some salient facts:

As expected, this morning the House Republicans removed Wyoming Representative Liz Cheney from her position as conference chair after she refused to stop speaking out against the former president for instigating the January 6 attack on our Capitol and the counting of electoral votes for President Joe Biden. The Republicans ousted her by voice vote, which meant that no one had to go on the record for or against Cheney, and the Republicans kept the split in the party from being measurable. It also ensured that she would lose; she has survived a secret ballot vote before.

Before the vote, Cheney allegedly told her Republican colleagues: “If you want leaders who will enable and spread his destructive lies, I’m not your person; you have plenty of others to choose from.” After the vote, she went in front of the cameras to say that she would lead the fight to reclaim the party from Trump, and said: “I will do everything I can to ensure that the former president never again goes anywhere near the Oval Office.”

After her ouster, Trump Republican Representative Madison Cawthorn (NC) tweeted “Na na na na, na na na na, hey hey, goodbye Liz Cheney.” The former president echoed Cawthorn: “Liz Cheney is a bitter, horrible human being. I watched her yesterday and realized how bad she is for the Republican Party. She has no personality or anything good having to do with politics or our Country.”

After convincing his caucus to dump Cheney and embrace Trump, House Minority Leader Kevin McCarthy (R-CA) told reporters: “I don’t think anybody is questioning the legitimacy of the presidential election. I think that is all over with.”

This was a breathtaking statement. McCarthy himself challenged the certification of Biden’s win, and just last week, Trump made a big announcement in which he called the election of 2020 “fraudulent.” The Big Lie animating the Republicans today is that Trump, not Biden, really won the 2020 election.

But McCarthy is not alone in his gaslighting. Yesterday, in the Senate Rules Committee markup of S1, the For the People Act protecting the vote, ending gerrymandering, and pushing big money out of our elections, Senate Minority Leader Mitch McConnell (R-KY) said: “I don’t think anyone on our side has been arguing that [voter fraud] has been pervasive all over the country.”

The false claim of widespread voter fraud is, of course, exactly what Trump Republicans have stood on since the 2020 election. It is the justification for their voter suppression measures in Republican states, including Texas, Iowa, Georgia, Florida, and, as of yesterday afternoon, Arizona.

In today’s House Oversight Committee hearing on the January 6 insurrection, Republican lawmakers in general tried to gaslight Americans, as they tried to paint that unprecedented attack on our democracy as nothing terribly important. Although 140 law enforcement officers were injured, five people were killed, more than 400 people have been charged with crimes, and rioters did more than $30 million worth of damage, Republican representatives downplayed the events of the day, insisting that they were not really out of the ordinary. Representative Andrew Clyde (R-GA) said that calling the attack on the Capitol an insurrection is a “bald-faced lie” and that “if you didn’t know the TV footage was a video from January the 6th, you would actually think it was a normal tourist visit….”

CNN later called Clyde’s remarks “absolute nonsense.” Even the definition of insurrection Clyde quoted—“an organized attempt by a group of people to defeat their government and take control of their country usually by violence”—showed the attack of January 6 to be an insurrection. And, as lawyer and CNN analyst Asha Rangappa noted tonight on Twitter, at his second impeachment trial even Trump’s own lawyers did not dispute that the events of January 6 were a violent insurrection. The record is clear.

Republican lawmakers like Clyde did, though, echo the former president’s interview on the Fox News Channel in March when he said that when his supporters went into the Capitol they posed “zero threat” and were “hugging and kissing the police and the guards…. A lot of the people were waved in, and then they walked in and they walked out.”

The former president appears to be continuing to exercise control over his underlings. Former Acting Attorney General Jeffrey Rosen and former Acting Defense Secretary Christopher Miller provided testimony at the House Oversight Committee hearing, and what they would not say was revealing. Rosen refused to answer questions about whether Trump asked him to try to overturn the 2020 election. Miller’s prepared remarks had included a sentence that said “I stand by my prior observation that I personally believe his comments encouraged the protesters that day.” In his testimony, he omitted that line, and later tried to walk it back, trying to draw a line between people who marched on the Capitol and those who broke into it.

But with Cheney and her supporters now in open revolt, and with news about the Capitol attack dropping, and even with more information coming about the ties between the former president and Russia, will Republican Party leaders manage to sweep everything under the rug?

Today, at a hearing on domestic extremism before the Senate Appropriations Committee, Attorney General Merrick Garland and Homeland Security Secretary Alejandro Mayorkas both testified that the most serious domestic national security threat in the U.S. right now is that of white supremacist gangs. “I think it’s fair to say that in my career as a judge, and in law enforcement, I have not seen a more dangerous threat to democracy than the invasion of the Capitol,” Garland said. “There was an attempt to interfere with the fundamental passing of an element of our democracy, the peaceful transfer of power. And if there has to be a hierarchy of things that we prioritize, this would be the one we’d prioritize. It is the most dangerous threat to our democracy. That does not mean that we don’t focus on other threats.”

For his part, President Biden is refusing to get sucked into the Republican drama, instead focusing on the country. Today an advisory panel for the Centers for Disease Control and Prevention endorsed the Pfizer vaccine for children as young as 12, and the CDC signed off on the recommendation, making it easier to reopen schools in the fall.

Today Biden met at the White House with . . .

Continue reading. It’s worth noting that the domestic terrorist threat comes almost totally from the radical right.

Written by Leisureguy

12 May 2021 at 9:52 pm

Why Trump Still Has Millions of Americans in His Grip

leave a comment »

Thomas Edsall’s column in the NY Times today makes good points, some of which relate to the previous post on the coming automation of trucking:

Beginning in the mid-1960s, the priorities of the Democratic Party began to shift away from white working- and middle-class voters — many of them socially conservative, Christian and religiously observant — to a set of emerging constituencies seeking rights and privileges previously reserved for white men: African-Americans; women’s rights activists; proponents of ethnic diversity, sexual freedom and self-expressive individualism.

By the 1970s, many white Americans — who had taken their own centrality for granted — felt that they were being shouldered aside, left to face alone the brunt of the long process of deindustrialization: a cluster of adverse economic trends including the decline in manufacturing employment, the erosion of wages by foreign competition and the implosion of trade unionism.

These voters became the shock troops of the Reagan Revolution; they now dominate Trump’s Republican Party.

Liberal onlookers exploring the rise of right-wing populism accuse their adversaries of racism and sexism. There is plenty of truth to this view, but it’s not the whole story.

In “The Bitter Heartland,” an essay in American Purpose, William Galston, a veteran of the Clinton White House and a senior fellow at Brookings, captures the forces at work in the lives of many of Trump’s most loyal backers:

Resentment is one of the most powerful forces in human life. Unleashing it is like splitting the atom; it creates enormous energy, which can lead to more honest discussions and long-delayed redress of grievances. It can also undermine personal relationships — and political regimes. Because its destructive potential is so great, it must be faced.

Recent decades, Galston continues, “have witnessed the growth of a potent new locus of right-wing resentment at the intersection of race, culture, class, and geography” — difficult for “those outside its orbit to understand.”

They — “social conservatives and white Christians” — have what Galston calls a “bill of particulars” against political and cultural liberalism. I am going to quote from it at length because Galston’s rendering of this bill of particulars is on target.

  • “They have a sense of displacement in a country they once dominated. Immigrants, minorities, non-Christians, even atheists have taken center stage, forcing them to the margins of American life.”

  • “They believe we have a powerful desire for moral coercion. We tell them how to behave — and, worse, how to think. When they complain, we accuse them of racism and xenophobia. How, they ask, did standing up for the traditional family become racism? When did transgender bathrooms become a civil right?”

  • “They believe we hold them in contempt.”

  • “Finally, they think we are hypocrites. We claim to support free speech — until someone says something we don’t like. We claim to oppose violence — unless it serves a cause we approve of. We claim to defend the Constitution — except for the Second Amendment. We support tolerance, inclusion, and social justice — except for people like them.”

Galston has grasped a genuine phenomenon. But white men are not the only victims of deindustrialization. We are now entering upon an era in which vast swaths of the population are potentially vulnerable to the threat — or promise — of a Fourth Industrial Revolution.

This revolution is driven by unprecedented levels of technological innovation as artificial intelligence joins forces with automation and takes aim not only at employment in what remains of the nation’s manufacturing heartland, but also increasingly at the white-collar managerial and professional occupational structure.

Daron Acemoglu, an economist at M.I.T., described in an email the most likely trends as companies increasingly adopt A.I. technologies.

A.I. is in its infancy. It can be used for many things, some of them very complementary to humans. But right now it is going more and more in the direction of displacing humans, like a classic automation technology. Put differently, the current business model of leading tech companies is pushing A.I. in a predominantly automation direction.

As a result, Acemoglu continued, “we are at a tipping point, and we are likely to see much more of the same types of disruptions we have seen over the last decades.”

In an essay published in Boston Review last month, Acemoglu looked at the issue over a longer period. Initially, in the first four decades after World War II, advances in automation complemented labor, expanding the job market and improving productivity.

But, he continued, “a very different technological tableau began in the 1980s — a lot more automation and a lot less of everything else.” In the process, “automation acted as the handmaiden of inequality.”

Automation has pushed the job market in two opposing directions. Trends can be adverse for those (of all races and ethnicities) without higher education, but trends can also be positive for those with more education:

New technologies primarily automated the more routine tasks in clerical occupations and on factory floors. This meant the demand and wages of workers specializing in blue-collar jobs and some clerical functions declined. Meanwhile professionals in managerial, engineering, finance, consulting, and design occupations flourished — both because they were essential to the success of new technologies and because they benefited from the automation of tasks that complemented their own work. As automation gathered pace, wage gaps between the top and the bottom of the income distribution magnified.

Technological advancement has been one of the key factors in the growth of inequality based on levels of educational attainment, as the accompanying graphic shows:

Acemoglu warns:

If artificial intelligence technology continues to develop along its current path, it is likely to create social upheaval for at least two reasons. For one, A.I. will affect the future of jobs. Our current trajectory automates work to an excessive degree while refusing to invest in human productivity; further advances will displace workers and fail to create new opportunities. For another, A.I. may undermine democracy and individual freedoms.

Mark Muro, a senior fellow at Brookings, contends that it is essential to look at the specific types of technological innovation when determining impact on the job market.

“Two things are happening at once, when you look at traditional ‘automation’ on the one hand and ‘artificial intelligence’ on the other,” Muro wrote in an email. “The more widespread, established technologies usually branded ‘automation’ very much do tend to disrupt repetitive, lower-skill jobs, including in factories, especially in regions that have been wrestling with deindustrialization and shifts into low-pay service employment.”

In contrast, Muro continued, “Artificial intelligence really is a very different set of technologies than those we label as ‘automation,’ and it will for a while mostly affect college educated workers.” But, and it’s a big but,

there is a greater chance that such white collar workers, with their B.A.s, will be better equipped to coexist with A.I. or even benefit from it than will non-B.A. workers impacted by other forms of automation. And yet, there’s no doubt A.I. will now be introducing new levels of anxiety into the professional class.

In a November 2019 paper, “What jobs are affected by A.I.? Better-paid, better-educated workers face the most exposure,” Muro and two colleagues found that exposure to A.I. is significantly higher for jobs held by men, by people with college degrees or higher, by people in the middle and upper pay ranks and by whites and Asian-Americans generally.

In contrast, in a March 2019 paper, “Automation perpetuates the red-blue divide,” Muro and his colleagues found that automation, as opposed to A.I., hurts those who hold jobs that do not require college degrees the most, and that exposure to automation correlates with support for Trump:

The strong association of 2016 Electoral College outcomes and state automation exposure very much suggests that the spread of workplace automation and associated worker anxiety about the future may have played some role in the Trump backlash and Republican appeals.

More specifically, Muro and his colleagues found:

Heartland states like Indiana and Kentucky, with heavy manufacturing histories and low educational attainment, contain not only the nation’s highest employment-weighted automation risks, but also registered some of the widest Trump victory margins. By contrast, all but one of the states with the least exposure to automation, and possessing the highest levels of educational attainment, voted for Hillary Clinton.

How do the risks of automation, foreign-trade-induced job loss and other adverse consequences of technological change influence politics? . . .

Continue reading.

Written by Leisureguy

9 May 2021 at 12:32 pm

The U.S. Owes Hawaiians Millions of Dollars Worth of Land. Congress Helped Make Sure the Debt Wasn’t Paid.

leave a comment »

Rob Perez of the Honolulu Star-Advertiser reports in ProPublica:

In the 1990s, Hawaii’s two elder statesmen — U.S. Sens. Daniel Inouye and Daniel Akaka — were at the forefront of efforts to ensure that the U.S. compensated Native Hawaiians for ancestral lands taken from them over the years.

“Dan Inouye believed that a promise made should be a promise kept,” Akaka, a Native Hawaiian, said in 2012 upon the death of his longtime Senate colleague.

But an investigation by the Honolulu Star-Advertiser and ProPublica has found that those same senators voted several times each to support must-pass legislation that included provisions undermining efforts to repay millions of dollars in land debt to Hawaiians. At least six other current and former members of Hawaii’s congressional delegation have supported such legislation one or more times.

Between them, Hawaii’s members of Congress voted for at least six laws authorizing the federal government to sell dozens of excess properties to private parties rather than offering them to a Hawaiian trust established to repatriate the land. In one must-pass military spending bill spanning more than 500 pages, lawmakers slipped in a single sentence that helped a handful of nonprofits to acquire the land. In another, they added language that effectively put the need for military housing ahead of the need for housing Hawaiians.

The circumvention of the landmark 1995 Hawaiian Home Lands Recovery Act, which has not been previously reported, sent the excess lands to a variety of buyers instead: the Catholic Church; the nonprofit operator of a private school; a developer that intends to sell a site to another company with plans to construct hundreds of private-sector homes there.

The transactions mostly involved lands on Oahu, the state’s most populous island, and were executed during a period in which the Department of Hawaiian Home Lands, which manages the trust, faced a severe shortage of developable residential land there. About 11,000 Hawaiians are now seeking residential homesteads on Oahu, nearly double what the figure was when the recovery act passed. As the Star-Advertiser and ProPublica reported in December, the trust has only enough land to accommodate less than a third of those homestead-seekers in single-family homes, although it is moving to develop more multi-family housing. Many waitlisters are homeless, and thousands have died without getting a homestead lease.

Even as the federal government was selling excess properties to private buyers, it offered only two parcels to the trust over the past decade, according to the news organizations’ investigation. And one was for a remote mountainside location that DHHL rejected because it determined that the property — a former solar observatory — wasn’t suitable for residential use or to lease for other purposes.

The findings confirmed the suspicions of Mike Kahikina, who said he had a hunch something was amiss during the eight years he served on the Hawaiian Homes Commission, which decides policy for DHHL.

Kahikina joined the commission in 2011, 16 years after the recovery act was signed. Along with eight other commissioners, his job was to help the department get beneficiaries onto residential, ranching and farming homesteads in a timely way — a task DHHL has struggled with historically. By the time he left in 2019, the federal government’s debt was the same size as when he joined.

Kahikina said he periodically raised questions with DHHL about the land debt, but they were never satisfactorily answered.

The news organizations shared their findings with Kahikina — an Air Force veteran, former state legislator, ordained minister and outreach worker for troubled youth — as he sat outside the West Oahu homestead residence that has been in his family for three generations. With his long salt-and-pepper hair tied back in a bun, Kahikina, who now heads the Association of Hawaiians for Homestead Lands, a statewide nonprofit organization of waitlisters, was stunned as he learned details of the private deals. “You connected the dots for me,” he said, repeating himself to emphasize the point. “It’s like we’re an invisible people.”

The investigation relied on federal, state and county records and revealed nearly 40 deals over the past decade involving about 520 acres, all authorized by special language inserted into at least six bills passed by Congress. Beyond the Catholic Church, the developer and the private school operator, the special legislation also allowed land deals with a veterans association, individual homebuyers, another nonprofit private school operator and several religious organizations.

Had it not been for that legislation, advocates say the recovery act could have allowed some of these same entities to access the land while benefiting DHHL at the same time. That’s because under the recovery act, DHHL is permitted to sell certain properties for fair market value and use the proceeds for homestead development.

The Navy, which had owned the majority of lands involved in the private deals, defended its actions. The special legislation expressed the intent of Congress at the time, and if a new law conflicted with a prior one, the new one applied, according to a spokesperson. “Navy followed the law,” she wrote.

The General Services Administration, which plays a key role in federal land disposal, would not address criticisms about bypassing the recovery act. But in response to a letter from one of Hawaii’s two current U.S. senators, a GSA official acknowledged that congressional actions — a reference to the special legislation — allowed some agencies to bypass the recovery act. . .

Continue reading. There’s much more. The US too often proves to be untrustworthy.

Written by Leisureguy

7 May 2021 at 1:50 pm

When will the revolution in architecture begin?

leave a comment »

Nathan J. Robinson has a fascinating (and well-illustrated) article in Current Affairs that begins:

Something is terribly wrong with architecture. Nearly everything being built is boring, joyless, and/or ugly, even though there is no reason it has to be. The architectural profession rewards work that is pretentious and bland. The cities we build are not wondrous.

I’ve documented it at length before, but the problem can be seen at a glance.

Here is a classic work of Islamic architecture: . . .

Continue reading. And look at those photos!

In a way, one could say that the architecture revolution is here, since the modern buildings pictured in the article are indeed revolting.

Written by Leisureguy

5 May 2021 at 6:07 pm

New documentary shows how “whiteness” is used to define power

leave a comment »

Jon Schwarz reports in The Intercept:

IN THE FINAL episode of Raoul Peck’s HBO documentary, “Exterminate All the Brutes,” Peck says in a voice-over, “The very existence of this film is a miracle.”

That is 100 percent true. Before this moment in history, it would have been impossible to imagine that one of the world’s largest corporations — AT&T, owner of HBO, with a current market cap of $220 billion — would have funded and broadcast a film like this. The fact that it somehow squeezed through the cracks and onto our TVs and laptop screens demonstrates that something profound about the world is changing. Decades, centuries of people fighting and dying were required both to widen the cracks and mold someone like Peck, the right human at the right time, to step through.

“Exterminate All the Brutes” is a sprawling disquisition — four episodes, each an hour long — into the invention and consequences of 500 years of “white” supremacy, presented via a high-gloss pastiche of old footage, newly filmed dramatizations, and clips from Hollywood movies. “White” needs scare quotes because the film makes clear that whiteness is not something that exists in reality — like, say, the moon — that is right there whether we believe in it or not. Instead, it’s something imaginary that we’ve somehow all agreed on, like pieces of paper having value. [That is, “whiteness” is a meme, a culturally defined idea. – LG]

These two made-up concepts meet in the $100 bill via the man on its face, Benjamin Franklin. In 1751, Franklin wrote an essay that makes clear that anyone can be classified as “white” or read out of the white race, depending on the needs of the moment.

Franklin was desperate to keep the British colonies “white,” but by white, he didn’t mean European. For Franklin, only the English and Saxons counted. Germans, Swedes, Russians, and the French were hilariously “swarthy,” and thus “will never adopt our Language or Customs, any more than they can acquire our Complexion.”

At the same time, as the miniseries illustrates, the English were colonizing Ireland and demoting its nearly translucent inhabitants to nonwhite. A famous British clergyman named Charles Kingsley, extremely liberal by the standards of the day, wrote home from a trip to Sligo that the people somehow had skin “as white as ours” but nevertheless were subhuman “chimpanzees.” In the U.S., the Irish were the standard by which nonwhiteness was measured, to the extent that African Americans were sometimes referred to as “smoked Irish.”

Of course, America eventually promoted the Irish to white, on the condition that they would be team players. Across the world in South Africa, the apartheid regime decided that Japanese immigrants were loyal enough to be “honorary whites.” The sorting process can even be seen in real time in a 1949 Atlantic article by a friend of Franklin D. Roosevelt about his trip to the newly born Israel. The country, he explained, could be useful as “the best guarantee” for Western interests in the area. Jewish people, who had previously been “moth-eaten” and “grease-spotted,” now possessed “physical beauty, healthy vitality, politeness, good nature” and were comparable to Thomas Jefferson. Arab people were in the way but “about as dangerous as so many North American Indians,” and therefore nonwhite and “foul, diseased, smelling, rotting, and pullulating with vermin.”

Peck may be the only filmmaker who would want to take on this gigantic subject and then manage to present it as is, simultaneously terrifying and preposterous. Born in Haiti — i.e., the western half of Hispaniola, the island where Columbus landed in the “New” World — Peck has lived all over the planet and has a humanistic sympathy for all people, both at their best and their absolute worst. He’s made several dozen films, many documentaries, and was nominated for an Oscar in 2017 for “I Am Not Your Negro,” about James Baldwin.

At the outset, Peck says, “This is  . . .

Continue reading. There’s more, and it’s worth reading.

Written by Leisureguy

5 May 2021 at 2:10 pm

The Lulz of Medusa: On Laughter as Protest

leave a comment »

Shira Chess is Associate Professor of Entertainment and Media Studies at the University of Georgia and author of Ready Player Two and Play Like a Feminist. The following, an extract from the latter, appeared in the MIT Press Reader:

The physical act of laughter is the ultimate tool of playful protest. It does not require props, screens, or any affiliated costs. Laughter has the ability to disrupt the status quo, extricating stifling hypocrisies. It is always available, regardless of your position of power. It works as an antiseptic and is clarifying. It is personal. Laughter has been used and mobilized by those in the past, and needs to reclaim its role in the protestations of the future. Laughter is a striking tool of resistance. If deployed properly, we can giggle, guffaw, chuckle, and snicker toward resistance and advancement.

But how do we laugh in the face of the terrible things that happen — things that strike us so deeply that we are immobilized with fear? In these moments, our first response to protest is often one of anger and deliberate, obstinate resistance. When this anger turns into overwhelming sadness, how do we locate the presence of mind to laugh without diminishing or undercutting our topic?

When I wrote about protest and laughter in my book, “Play Like a Feminist,” I wrote it in the shadow of a shooting in 2018, when 11 people were shot at a synagogue in Pittsburgh. Like so many other shootings, this story feels unremarkable, with fresh gun violence occurring as frequently as our hearts beat. Last month, a gunman killed eight people in Atlanta, most of them women of Asian descent. Not a week later another killed 10 shoppers in Boulder. In the face of this, how do I find space for laughter? This year alone, there have already been 104 mass shootings recorded in the United States of America. How do any of us find space for laughter?

I laugh because laughter is power. Rebecca Krefting refers to “charged humor”: a form of disruptive laughter meant to reimagine communities and use comedy to “foment social change.” Krefting observes that in this way, laughter is an inroad toward social justice. Similarly, historian Joseph Boskin writes about the ability of comedy to disrupt the momentary zeitgeist and offset power. But he also suggests that political humor is often deployed ineffectively, and that rather than focusing on institutions, it tends to be directed at individuals. In other words, the target of derisive laughter should not be the politician who makes a public misstep but instead the political system that put that politician in power. This lack of institutional focus makes our laughter less effective as a weapon.

Nevertheless, laughter has been and can be weaponized. Charged humor, in addition to being a kind of communal glue, is personal and intimate. We can laugh in a crowded theater, to great satisfaction, but we can also laugh alone and unheard by others. Laughter can be deployed by the disenfranchised to reclaim their sense of the absurd world we live in while remaining a binding substance that can fortify relationships. We need more laughter.

Famously, in her essay “The Laugh of the Medusa,” Hélène Cixous writes about the monstrous feminine as inhabited by the infamous mythological icon. Rather than casting her as a beast, Cixous reinterprets the character, noting, “You only have to look at the Medusa straight on to see her. And she’s not deadly. She’s beautiful and laughing.”

Medusa’s laughter is rebellious; it fights back against the gods and mortals who have left her in her predicament, and it is our own fear that keeps us from hearing that laughter. It is Medusa’s laugh that we need to . . .

Continue reading.

Written by Leisureguy

2 May 2021 at 4:53 pm

The Invention of the Police: Why American policing got so big so fast

leave a comment »

Jill Lepore’s excellent article on the history of policing in the US was published in the New Yorker in July 2020. It begins:

To police is to maintain law and order, but the word derives from polis—the Greek for “city,” or “polity”—by way of politia, the Latin for “citizenship,” and it entered English from the Middle French police, which meant not constables but government. “The police,” as a civil force charged with deterring crime, came to the United States from England and is generally associated with monarchy—“keeping the king’s peace”—which makes it surprising that, in the antimonarchical United States, it got so big, so fast. The reason is, mainly, slavery.

“Abolish the police,” as a rallying cry, dates to 1988 (the year that N.W.A. recorded “Fuck tha Police”), but, long before anyone called for its abolition, someone had to invent the police: the ancient Greek polis had to become the modern police. “To be political, to live in a polis, meant that everything was decided through words and persuasion and not through force and violence,” Hannah Arendt wrote in “The Human Condition.” In the polis, men argued and debated, as equals, under a rule of law. Outside the polis, in households, men dominated women, children, servants, and slaves, under a rule of force. This division of government sailed down the river of time like a raft, getting battered, but also bigger, collecting sticks and mud. Kings asserted a rule of force over their subjects on the idea that their kingdom was their household. In 1769, William Blackstone, in his “Commentaries on the Laws of England,” argued that the king, as “pater-familias of the nation,” directs “the public police,” exercising the means by which “the individuals of the state, like members of a well-governed family, are bound to conform their general behavior to the rules of propriety, good neighbourhood, and good manners; and to be decent, industrious, and inoffensive in their respective stations.” The police are the king’s men.

History begins with etymology, but it doesn’t end there. The polis is not the police. The American Revolution toppled the power of the king over his people—in America, “the law is king,” Thomas Paine wrote—but not the power of a man over his family. The power of the police has its origins in that kind of power. Under the rule of law, people are equals; under the rule of police, as the legal theorist Markus Dubber has written, we are not. We are more like the women, children, servants, and slaves in a household in ancient Greece, the people who were not allowed to be a part of the polis. But for centuries, through struggles for independence, emancipation, enfranchisement, and equal rights, we’ve been fighting to enter the polis. One way to think about “Abolish the police,” then, is as an argument that, now that all of us have finally clawed our way into the polis, the police are obsolete.

But are they? The crisis in policing is the culmination of a thousand other failures—failures of education, social services, public health, gun regulation, criminal justice, and economic development. Police have a lot in common with firefighters, E.M.T.s, and paramedics: they’re there to help, often at great sacrifice, and by placing themselves in harm’s way. To say that this doesn’t always work out, however, does not begin to cover the size of the problem. The killing of George Floyd, in Minneapolis, cannot be wished away as an outlier. In each of the past five years, police in the United States have killed roughly a thousand people. (During each of those same years, about a hundred police officers were killed in the line of duty.) One study suggests that, among American men between the ages of fifteen and thirty-four, the number who were treated in emergency rooms as a result of injuries inflicted by police and security guards was almost as great as the number who, as pedestrians, were injured by motor vehicles. Urban police forces are nearly always whiter than the communities they patrol. The victims of police brutality are disproportionately Black teen-age boys: children. To say that many good and admirable people are police officers, dedicated and brave public servants, which is, of course, true, is to fail to address both the nature and the scale of the crisis and the legacy of centuries of racial injustice. The best people, with the best of intentions, doing their utmost, cannot fix this system from within.

There are nearly seven hundred thousand police officers in the United States, about two for every thousand people, a rate that is lower than the European average. The difference is guns. Police in Finland fired six bullets in all of 2013; in an encounter on a single day in the year 2015, in Pasco, Washington, three policemen fired seventeen bullets when they shot and killed an unarmed thirty-five-year-old orchard worker from Mexico. Five years ago, when the Guardian counted police killings, it reported that, “in the first 24 days of 2015, police in the US fatally shot more people than police did in England and Wales, combined, over the past 24 years.” American police are armed to the teeth, with more than seven billion dollars’ worth of surplus military equipment off-loaded by the Pentagon to eight thousand law-enforcement agencies since 1997. At the same time, they face the most heavily armed civilian population in the world: one in three Americans owns a gun, typically more than one. Gun violence undermines civilian life and debases everyone. A study found that, given the ravages of stress, white male police officers in Buffalo have a life expectancy twenty-two years shorter than that of the average American male. The debate about policing also has to do with all the money that’s spent paying heavily armed agents of the state to do things that they aren’t trained to do and that other institutions would do better. History haunts this debate like a bullet-riddled ghost.

That history begins in England, in the thirteenth century, when maintaining the king’s peace became the duty of an officer of the court called a constable, aided by his watchmen: every male adult could be called on to take a turn walking a ward at night and, if trouble came, to raise a hue and cry. This practice lasted for centuries. (A version endures: George Zimmerman, when he shot and killed Trayvon Martin, in 2012, was serving on his neighborhood watch.) The watch didn’t work especially well in England—“The average constable is an ignoramus who knows little or nothing of the law,” Blackstone wrote—and it didn’t work especially well in England’s colonies. Rich men paid poor men to take their turns on the watch, which meant that most watchmen were either very elderly or very poor, and very exhausted from working all day. Boston established a watch in 1631. New York tried paying watchmen in 1658. In Philadelphia, in 1705, the governor expressed the view that the militia could make the city safer than the watch, but militias weren’t supposed to police the king’s subjects; they were supposed to serve the common defense—waging wars against the French, fighting Native peoples who were trying to hold on to their lands, or suppressing slave rebellions.

The government of slavery was not a rule of law. It was a rule of police. In 1661, the English colony of Barbados passed its first slave law; revised in 1688, it decreed that “Negroes and other Slaves” were “wholly unqualified to be governed by the Laws . . . of our Nations,” and devised, instead, a special set of rules “for the good Regulating and Ordering of them.” Virginia adopted similar measures, known as slave codes, in 1680:

It shall not be lawfull for any negroe or other slave to carry or arme himselfe with any club, staffe, gunn, sword or any other weapon of defence or offence, nor to goe or depart from of his masters ground without a certificate from his master, mistris or overseer, and such permission not to be granted but upon perticuler and necessary occasions; and every negroe or slave soe offending not haveing a certificate as aforesaid shalbe sent to the next constable, who is hereby enjoyned and required to give the said negroe twenty lashes on his bare back well layd on, and soe sent home to his said master, mistris or overseer . . . that if any negroe or other slave shall absent himself from his masters service and lye hid and lurking in obscure places, comitting injuries to the inhabitants, and shall resist any person or persons that shalby any lawfull authority be imployed to apprehend and take the said negroe, that then in case of such resistance, it shalbe lawfull for such person or persons to kill the said negroe or slave soe lying out and resisting.

In eighteenth-century New York, a person held as a slave could not gather in a group of more than three; could not ride a horse; could not hold a funeral at night; could not be out an hour after sunset without a lantern; and could not sell “Indian corn, peaches, or any other fruit” in any street or market in the city. Stop and frisk, stop and whip, shoot to kill.

Then there were the slave patrols. Armed Spanish bands called hermandades had hunted runaways in Cuba beginning in the fifteen-thirties, a practice that was adopted by the English in Barbados a century later. It had a lot in common with England’s posse comitatus, a band of stout men that a county sheriff could summon to chase down an escaped criminal. South Carolina, founded by slaveowners from Barbados, authorized its first slave patrol in 1702; Virginia followed in 1726, North Carolina in 1753. Slave patrols married the watch to the militia: serving on patrol was required of all able-bodied men (often, the patrol was mustered from the militia), and patrollers used the hue and cry to call for anyone within hearing distance to join the chase. Neither the watch nor the militia nor the patrols were “police,” who were French, and considered despotic. In North America, the French city of New Orleans was distinctive in having la police: armed City Guards, who wore military-style uniforms and received wages, an urban slave patrol.

In 1779, Thomas Jefferson created a chair in “law and police” at the College of William & Mary. The meaning of the word began to change. In 1789, Jeremy Bentham, noting that “police” had recently entered the English language, in something like its modern sense, made this distinction: police keep the peace; justice punishes disorder. (“No justice, no peace!” Black Lives Matter protesters cry in the streets.) Then, in 1797, a London magistrate named Patrick Colquhoun published “A Treatise on the Police of the Metropolis.” He, too, distinguished peace kept in the streets from justice administered by the courts: police were responsible for the regulation and correction of behavior and “the prevention and detection of crimes.” . . .

Continue reading.

Written by Leisureguy

2 May 2021 at 1:45 pm

The Free Market is Dead: What Will Replace It?

leave a comment »

Chris Hughes, co-chair of the Economic Security Project and a senior advisor at the Roosevelt Institute, is the author of Fair Shot: Rethinking Inequality and How We Earn and was a co-founder of Facebook. He writes in TIME magazine:

Big meetings in the Oval Office in the time of Covid-19 are rare, but two weeks into his presidency, President Joe Biden decided to make an exception. It was only a few days after the nation’s coronavirus case count peaked in late January, and Biden sat on a stately beige chair, double masked and flanked by Vice President Kamala Harris and newly confirmed Treasury Secretary Janet Yellen.

The leaders of some of the nation’s largest businesses like Wal-Mart and J.P. Morgan Chase had come to the White House that day to talk economic stimulus. But the real surprise attendee was the head of America’s largest business advocacy group, the Chamber of Commerce, Tom Donohue. Under Donohue’s leadership over the past two decades, the Chamber had effectively become an organ of the Republican party, handsomely rewarding conservatives who worked to dismantle public programs and the regulatory state with campaign donations and support.

Donohue said little, but he didn’t have to. His presence was enough to rock the political landscape. “Washington’s most powerful trade group is having a political identity crisis,” wrote Politico. Two weeks later, a group of 150 CEOs, unaffiliated with the Chamber, followed suit, throwing their weight behind Biden’s COVID relief bill, which sailed through Congress. They have been similarly supportive of the additional $2 trillion the administration has now proposed for infrastructure spending – but they unsurprisingly don’t want corporate tax rates to be the means for paying for it.

But corporate America’s newfound support for more public investment is not a temporary phenomenon. We are witnessing the most profound realignment in American political economy in nearly forty years. President Ronald Reagan summed up the conventional wisdom that reigned from the mid-1970s onward in the United States: “Government is not the solution to our problem, government is the problem.” Economists, policymakers, and everyday Americans alike generally accepted that markets, unfettered and free, are the best way to create economic growth.

That ideology began to crack after the Great Recession, and in the wake of the coronavirus pandemic, it has collapsed. The rise of ethno-nationalism on the right and democratic socialism on the left testify to the growing disillusionment with the conventional wisdom of how government and economics are supposed to work.

It’s not just the fringes questioning free market orthodoxy in a time of disease. Cross-partisan supermajorities of Americans want some of the biggest companies in America to be broken up, support significantly higher minimum wages and a wealth tax on billionaires, and believe significantly more public investment is required to create economic growth.

We have had regulations, public investment, and macroeconomic management to varying degrees throughout American history. What makes this moment different is that Americans across parties, class, and educational background are using a new framework to think about how we create prosperity.

The new managed market paradigm is bigger than Bidenomics or any particular economic agenda—it is a story about how the economy works.

We went from living in a country where markets couldn’t be touched to one where Americans believe the state has an important role in managing them to create prosperity. What killed off free market mythology, and what will come next?


A crisis in confidence in government triggered the last paradigm shift, making way for the rise of free market thinking. In the 1970s, . . .

Continue reading. There’s much more, and it’s worth reading.

Later in the article:

In the years after the [2008 financial] crisis, scholars and policymakers came to realize that free markets had failed empirically to live up to their promise.

Reduced taxes on capital and fewer regulations were supposed to create more growth by making it easier for investors to invest and entrepreneurs to hire, the orthodoxy said. Yet the economy grew by 3.9 percent on average between 1950 and 1980, the era before free market orthodoxy took hold, and by only 2.6 percent on average in the 40 years since.

Similarly, aggregate growth, fueled by deregulation and free trade, should have boosted incomes for American workers if free market orthodoxy was to be believed. The rich would do well with lower taxes, they promised, but so too would the middle class and poor because of all the additional economic activity. In reality, wages have not meaningfully increased over the past 40 years after accounting for inflation, while income inequality has soared.

This list goes on. Relaxed antitrust enforcement was supposed to enable monolithic companies to benefit from economies of scale, reducing costs for Americans. But the cost of living in America has skyrocketed, with housing, healthcare, and education eating up a greater proportion of Americans’ budgets than ever before. Expected investments in productivity-enhancing technologies by such large companies have not materialized.

We were told that policies developed to combat inequality like progressive taxation or public investment were supposed to constrict growth. Studies now show the opposite is true. The work of economists like Raj Chetty and Janet Currie has shown that poorer children lack access to good nutrition, stable neighborhoods, and quality schools and are not able to climb a meritocratic ladder. That hurts them individually and starves the economy of skilled workers that boost growth. The lack of public investment in public programs like affordable childcare means parents are more likely to drop out of the labor force, depriving the economy of workers and growth, as Heather Boushey has shown. And because the wealthy save far more than the poor, growing wealth inequality has muted the largest driver of economic growth, consumer spending, as documented by the economist Karen Dynan. . .

Written by Leisureguy

29 April 2021 at 12:11 pm

John Milton’s defense of free speech

leave a comment »

Nicholas McDowell has an interesting essay in Aeon. Who is he?

Nicholas McDowell is professor of early modern literature and thought at the University of Exeter. He is the author and editor of several books on the relationship between literature, history and ideas in the 17th century, including The Oxford Handbook of Milton (2009). His most recent book is Poet of Revolution: The Making of John Milton (2020), the first volume of a two-part intellectual biography of Milton.

So he knows Milton, and here he discusses Areopagitica (full text here). He writes:

Published at the height of the first English Civil War, Areopagitica: A Speech of Mr John Milton for the Liberty of Unlicensed Printing, to the Parliament of England (1644), remains a powerful defence of free expression. Printing might now have almost given way to digital media as the form in which beliefs and ideas are proposed, argued with and attacked, but the questions raised by Areopagitica about liberty of thought and speech, and more specifically writing, are more urgent than ever. John Milton, the poet who, in Paradise Lost (1667), composed an English epic that could compete with and even surpass the Greek and Latin classics, was also a prose writer of distinction. This fact tends to be eclipsed by his reputation as a poet. But in Areopagitica, he gave Western liberalism some of the language through which it still conceives of itself. It’s both illuminating and salutary, at a moment of crisis in the liberal tradition, to return to the principles that shaped that tradition. At a time when the possibility of civil war in the United States is openly entertained by some, literally as well as metaphorically, we can learn much about the tensions inherent in liberalism by returning to the origins of Milton’s arguments amid the actual civil war that raged in Britain and Ireland in the mid-17th century.

There’s little evidence that what has become Milton’s best-known prose work had any wide impact on the thinking of his contemporaries: one German reader in 1647 suggested that it should be translated into other languages to ‘give it good circulation in other lands where such tyranny reigns’, but he also thought it ‘rather too satirical’ and that its arguments needed to be ‘more moderately set forth’. The real impact of Areopagitica – the title alludes to Isocrates’ seventh oration addressed to the Areopagus, the ancient council of Athens – came in later revolutions and in different lands. Thomas Jefferson quoted it, and the comte de Mirabeau’s translation into French went through four editions between 1788 and 1792.

Its eventual influence on British thought is apparent in the echoes of its argument and imagery in John Stuart Mill’s essay On Liberty (1859), in which Mill insists that freedom of expression is a precondition of a flourishing society. The occasion for George Orwell’s powerful essay ‘The Prevention of Literature’ (1946), in which he considers the twin threats posed to ‘intellectual liberty’ by ‘totalitarianism’ and ‘monopoly and bureaucracy’, was Orwell’s dismay after attending a meeting of the PEN Club – a society founded in 1921 to further intellectual cooperation among writers – to commemorate the tercentenary of the publication of Areopagitica. That ‘no speaker quoted from the pamphlet which was ostensibly being commemorated’ was, for Orwell, an indication of the failure of his contemporaries to live up to the ideals that they claimed to promote.

The resonance of Areopagitica for US ideals of the free exchange of ideas is, however, apparent to anyone who has been to the New York Public Library. A plaque on Library Way bears this quotation from the pamphlet beside an image of a printing press: ‘Where there is much desire to learn, there of necessity will be much arguing, much writing, many opinions; for opinion in good persons is but knowledge in the making.’ More prominent is the quotation displayed above the entrance to the main reading room, which preserves the original spelling: ‘A good Booke is the pretious life-blood of a master spirit, imbalm’d and treasur’d up on purpose to a life beyond life.’

These lofty and poetic declarations emerged out of a more prosaic personal context for Milton. The outbreak of civil war between parliament and the Royalist forces of Charles I in 1642 had caused the ecclesiastical and political mechanisms of prepublication licensing in England, according to which every publication had first to be approved by a committee of bishops, to fall into disuse. The early 1640s consequently witnessed an unprecedented spike in the number of publications in England, many of them polemical attacks on the other political side in the civil war. The Westminster Assembly, composed mainly of clerics of various Puritan beliefs, had begun, by 1643, to discuss what forms of worship should replace the structures of the collapsed Church of England. It was in this atmosphere of innovation and revolution that Milton felt that he could publish, in the same year, proposals for a reform of the divorce laws that would enable a husband to separate from a wife on the grounds, not merely of nonconsummation or adultery, but of unhappiness and incompatibility. For Milton, who was 34 at this point, the argument for reform seemingly had a deeply personal impetus: in the early summer of 1642, he had married Mary Powell, but she had left him after little more than a month to return to her family, and hadn’t come back.

The reception of his tract The Doctrine and Discipline of Divorce (1643), which he published anonymously and unlicensed, deeply shocked him: it was condemned by the Puritan clergy as being heretical and intending to foster sexual libertinism, and it was cited in petitions to parliament as evidence of the need to reinstall a system of prepublication licensing. Parliament’s Licensing Order of June 1643 required once again that appointed officers, including clerics, examine books for heterodoxy, sedition and libel before licensing them for printing. It was in response to these attempts to restore prepublication censorship on the grounds of the appearance of his own book on divorce reform, among other books charged with promoting heresy, that Milton issued Areopagitica in November 1644 (again without a licence). Yet, as I show in my intellectual biography of the young Milton, Poet of Revolution: The Making of John Milton (2020), his interest in these matters wasn’t sudden, nor entirely the result of a sense of personal insult.

Milton had been thinking hard about how censorship and state persecution suppressed intellectual and literary achievement for some years prior to Areopagitica. After leaving Christ’s College, Cambridge in 1632, he had embarked on an intensive period of reading about the history of Europe and became particularly interested in episodes of literary censorship in Italy, the country in which he lived and travelled from 1638 to 1639. In 2014, Milton’s copy of the 1544 edition of Giovanni Boccaccio’s book Vita di Dante (c1360), translated as Life of Dante, was discovered in the Bodleian Library in Oxford, complete with annotations, dating from around 1637 to 1638, that mark Boccaccio’s account of how Dante’s political work On Monarchy (1313) was burnt as a heretical text by the papal authorities. Milton even shows his awareness that Boccaccio’s account of this censorship of Dante was itself later censored and that parts of it are missing from some editions of the Life of Dante, marking in his annotations the passages that were excised by the authorities between editions. In other words, Milton identified a process of double censorship that had been imposed in different centuries on two of the Italian writers whom he most admired.

Milton’s fascination with the topic of censorship and the Catholic Church’s Index of Prohibited Books, before he wrote Areopagitica, is evident also in his surviving commonplace book, a manuscript notebook compiled mostly during the late 1630s and early 1640s, in which he gathered a list of  . . .

Continue reading. There’s much more.

Written by Leisureguy

26 April 2021 at 1:06 pm

Maslow’s Hierarchy of Needs: He botched it.

leave a comment »

Teju Ravilochan wrote a piece for GatherFor on Medium. The thrust reflects the priorities and purposes of the group, which is based in New York City and works to develop and network small community groups to build belonging and resilience. The piece begins:

Some months ago, I was catching up with my dear friend and board member, Roberto Rivera. As an entrepreneur and community organizer with a doctorate and Lin-Manuel-Miranda-level freestyle abilities, he is a teacher to me in many ways. I was sharing with him that for a long time, I’ve struggled with Maslow’s Hierarchy of Needs.

The traditional interpretation of Maslow’s Hierarchy of Needs is that humans must fulfill their needs at one level before they can advance to higher levels.

Maslow’s idea emerged from and was informed by his work with the Blackfeet Nation, through conversations with elders and inspiration from the shape and meaning of the Blackfoot tipi. Maslow’s idea has been criticized for misrepresenting the Blackfoot worldview, which instead places self-actualization as a basis for community-actualization, and community-actualization as a basis for cultural perpetuity, the latter of which sits at the top of the tipi in Blackfoot philosophy.

The Blackfoot Tipi

This is a slide from a presentation by Cindy Blackstock, a member of the Gitksan First Nation and University of Alberta Professor, shared in Karen Lincoln Michel’s blog. She describes Maslow’s theory as “a rip off of the Blackfoot nation.”

Maslow’s Failure to Elevate the Blackfoot Model

Continue reading. There’s much more.

It does strike me that in the US today the prevailing view of individuality above all — one’s own individual desires and needs being paramount, with community needs much less in the picture — has resulted in some bad outcomes for all.

Written by Leisureguy

26 April 2021 at 11:01 am

This book looks at ancient Rome in a new light

leave a comment »

I am indebted to The Younger Daughter in two ways regarding the audiobook The Fate of Rome.

  1. She recommended the book in the first place, and it’s fascinating; and
  2. She told me how I could listen to it free, as an audiobook, with one of the two free credits I would get for signing up.

So I signed up, and the two credits were clearly displayed. I searched for the book title, purchased the audiobook for one credit, and I’ve been listening to it since. It’s remarkable how the change in perspective adds to one’s understanding. One example: while the Romans were building all those excellent roads that lead to Rome, they were also, in effect, constructing efficient transportation channels that would allow infectious diseases to spread swiftly and widely.

So go ahead and sign up — even if you don’t get any other books, this one is definitely worth the (free) sign-up.

Written by Leisureguy

25 April 2021 at 12:18 pm

The Ivy League vs Democracy

leave a comment »

Matt Stoller writes in BIG:

Today I’m pleased to welcome a guest writer, Sam Haselby (@samhaselby), who thinks deeply about the history of the Ivy League, its role today, and its religious roots as a set of institutions designed around exclusion. [Haselby writes as follows. – LG]

The Ivy League vs Democracy

One of the great puzzles of American society is the position of the Ivy Leagues. They are a bastion of privilege and power, and yet the campuses are rife with left-leaning professors who one might imagine seek to redistribute wealth. According to the Harvard Crimson, 77.6% of Harvard professors define themselves as left-leaning, and just 2.9% as conservative. What explains this dynamic? Former Harvard College Dean Harry Lewis said that it gets to the basic point of the school, which is to advance radical ideas. “It’s almost by definition anti-preservationist because we place such a high value on the creation of new knowledge,” he said.

A wildly different explanation is apparent from watching Netflix’s Operation Varsity Blues: The College Admissions Scandal, a documentary about the highly publicized fiasco in which wealthy parents used bribery to get their kids into top colleges. What I found most interesting about this episode wasn’t the actual corruption, but a different and more poignant feature of American meritocracy. Even in the midst of acts of bribery, many of the parents were beset with fear that their children might find out about the crooked machinations to win their admission to elite schools. They took desperate steps to shield the kids from facing real questions of “merit” or deservedness. And in fact, while most involved in meritocracy don’t use bribery, a tremendous amount of energy now goes into preserving similar basic fictions about the nature of elite private education and its role in the United States.

We most often hear about inequality in terms of super-rich corporations, individuals, or families. But it is important that the same gulf separating haves and have-nots has opened between U.S. colleges and universities. Since the pandemic began, 650,000 jobs have disappeared in American academic institutions. More than 75% of college faculty in the U.S. are contingent workers or non-tenure-track. Meanwhile, as of 2020, the aggregate value of the endowments of the richest 20 U.S. schools rose to over $311 billion, all of it subsidized by taxpayers through the tax-free treatment we offer nonprofit educational institutions. The common joke, that Harvard is a hedge fund with an educational arm, is not so far off.

According to the IMF, the value of these endowment funds is greater than the GDP of New Zealand, Finland, or Chile. In the last 5 years the U.S. has fallen in the UN’s Human Development Index, but its elite universities have risen in the world rankings and gotten richer. America’s richest colleges and universities, in effect, exist in a country of their own (though paid for in part with the public’s money).

This inequity reflects a restructuring of political power, towards an aristocracy. In historical perspective, we are seeing the collapse of the great post World War II democratization of post-secondary arts and sciences education alongside the appearance of a meritocracy alienated from the public and at odds with democracy. If anyone points out the role of elite education in the reproduction of inequality today, Americans tend to see it as flawed or compromised meritocracy rather than “true” meritocracy. But such responses are signs of a kind of Stockholm Syndrome. The “merit” of meritocracy has little to nothing to do with the abilities, or worth, or value of people as human beings and citizens.

Meritocracy and democracy are not the same thing. The goal of meritocracy is to produce, or reproduce, an elite. There is nothing necessarily democratic about that. The Puritans who founded the Ivy League schools were very good at building stable and exclusive institutions, for many reasons, including that the elite, for them, was the elect: those specially chosen to receive God’s grace, to be one of the sanctified and saved few among the masses of the damned. In the early United States, however, New Englanders quickly discovered, to their dismay, that being the elect did not mean much to many Americans, and that they would be hard pressed to win national elections. The Puritan schools were designed to serve the elect, not to provide democratic education. Thomas Jefferson feared and reviled the Puritan schools, and founded the University of Virginia to counter what he saw as their anti-democratic influence.

The Civil War and Reconstruction, first, and the Civil Rights Movement, second, constitute the greatest achievements in modern American democracy. Both were also high-water marks of public education. In the former era, Radical Republicans who had seized control of the government created America’s great land-grant universities, while the Civil Rights Movement unfolded after a generation of Cold War investment in high-quality public university education. The United States has spent a generation moving away from this kind of democratic education toward a gilded meritocracy. America’s elite private schools are now one of the last strongholds of the drunken post-Cold War triumphalism that hoarded wealth and privilege in private institutions at the expense of public and democratic ones.

There is no way that I know of to have truly democratic elections without . . .

Continue reading. There’s more.

Written by Leisureguy

25 April 2021 at 11:35 am

“Why I support reparations — and all conservatives should”

leave a comment »

Fred Hiatt, editorial page editor of the Washington Post, made this comment in a newsletter:

When Donald Trump emerged on the political scene in 2015, The Post featured what I thought were (and still are) the best conservative columnists in the country.

None of them, however, supported Trump. We realized that, if we were to be true to our commitment to offer a full range of political views, we would have to add a new kind of conservative voice.

We were fortunate to find Gary Abernathy, who at the time was editing one of the few newspapers that endorsed Trump for president, the (Hillsboro, Ohio) Times-Gazette. 

Ever since, he has written a column that I usually disagree with — and almost always learn from. He has helped our readers understand the perspective of voters in southwestern Ohio. He offers a model of civil, good-natured debate, and he is rarely predictable. His column last week on why he supports reparations — and why he thinks all conservatives should — may give you a sense of what I mean.

So here is what a Trump conservative has to say about reparations. Gary Abernathy writes:

Rep. Pramila Jayapal (D-Wash.) is among the progressive lawmakers whose blunt, liberal outspokenness regularly annoys me. Recently, she particularly upset me while discussing the latest congressional study of reparations for descendants of enslaved people, when she said, “If you through your history benefited from that wrong that was done, then you must be willing to commit yourself to righting that wrong.”

Only this time I was bothered because her comments hit home.

Like most conservatives, I’ve scoffed at the idea of reparations or a formal apology for slavery. I did not own slaves, so why would I support my government using my tax dollars for reparations or issuing an apology? Further, no one in the United States has been legally enslaved since 1865, so why are Black people today owed anything more than the same freedoms and opportunities that I enjoy?

I remain unconvinced that an apology would have much real value, but the more substantive notion of reparations is worth discussing. In fact, it could be argued that the idea fits within the conservative philosophy. We’ll come back to that. But it is undeniable that White people have disproportionately benefitted from both the labor and the legacy of slavery, and — crucially — will continue to do so for generations to come.

When slavery was abolished after a bloody civil war, African Americans were dispersed into a world that was overtly hostile to them. Reconstruction efforts were bitterly resisted by most Southern Whites, and attempts to educate and employ former slaves happened only in fits and starts. The government even reneged on its “40 acres and a mule” pledge. After slavery, prejudice and indifference continued to fuel social and economic disparity.

The result is unsurprising. As noted by scholars A. Kirsten Mullen and William A. Darity Jr., co-authors of “From Here to Equality: Reparations for Black Americans in the Twenty-First Century,” data from the 2016 Survey of Consumer Finances showed that median Black household net worth averaged $17,600 — a little more than one-tenth of median White net worth. As Mullen and Darity write, “white parents, on average, can provide their children with wealth-related intergenerational advantages to a far greater degree than black parents. When parents offer gifts to help children buy a home, avoid student debt, or start a business, those children are more able to retain and build on their wealth over their own lifetimes.”

Black author and activist Randall Robinson has argued that even laws such as those on affirmative action “will never close the economic gap. This gap is structural. … blacks, even middle-class blacks, have no paper assets to speak of. They may be salaried, but they’re only a few months away from poverty if they should lose those jobs, because … they’ve had nothing to hand down from generation to generation because of the ravages of discrimination and segregation, which were based in law until recently.”

In addition to the discrepancy in inherited wealth, even conservatives should be able to acknowledge that Whites enjoy generational associations in the business world, where who you know often counts more than what you know — a reality based not so much on overt racism as on employment and promotion patterns within old-school networks that Blacks lack the traditional contacts to consistently intersect. . .

Continue reading. There’s more.

The column concludes:

. . . It is a tenet of conservatism that a level playing field is all we should guarantee. But that’s meaningless if one team starts with an insurmountable lead before play even begins.

It’s not necessary to experience “White guilt” or buy into the notion of “White privilege,” a pejorative that to me suggests Whites possess something they should lose, when in fact such benefits should extend to all. Supporting reparations simply requires a universal agreement to work toward, as Jayapal said, “righting that wrong.”

Written by Leisureguy

25 April 2021 at 7:36 am

Why has nuclear power been a flop?

leave a comment »

Jason Crawford’s blog Roots of Progress has a very interesting post that begins:

To fully understand progress, we must contrast it with non-progress. Of particular interest are the technologies that have failed to live up to the promise they seemed to hold decades ago. And few technologies have fallen further short of their promise than nuclear power.

In the 1950s, nuclear was the energy of the future. Two generations later, it provides only about 10% of world electricity, and reactor design hasn’t fundamentally changed in decades. (Even “advanced reactor designs” are based on concepts first tested in the 1960s.)

So as soon as I came across it, I knew I had to read a book just published last year by Jack Devanney: Why Nuclear Power Has Been a Flop.

What follows is my summary of the book—Devanney’s arguments and conclusions, whether or not I fully agree with them. I’ll give my own thoughts at the end.

The Gordian knot

There is a great conflict between two of the most pressing problems of our time: poverty and climate change. To avoid global warming, the world needs to massively reduce CO2 emissions. But to end poverty, the world needs massive amounts of energy. In developing economies, every kWh of energy consumed is worth roughly $5 of GDP.

How much energy do we need? Just to give everyone in the world the per-capita energy consumption of Europe (which is only half that of the US), we would need to more than triple world energy production, increasing our current 2.3 TW by over 5 additional TW: . . .

Continue reading. There’s much more, and he invites discussion.
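As a back-of-the-envelope aside, the figures quoted above do hang together: adding the “over 5 additional TW” to the current 2.3 TW gives roughly 7.3 TW, which is indeed more than triple current production. A minimal sketch of the arithmetic, using only the numbers from the excerpt:

```python
# Check the arithmetic quoted above: 2.3 TW today, "over 5 additional TW"
# needed, which the excerpt says amounts to "more than triple" production.
current_tw = 2.3       # current world energy production (per the excerpt)
additional_tw = 5.0    # additional capacity called for (per the excerpt)

target_tw = current_tw + additional_tw
ratio = target_tw / current_tw

print(f"target = {target_tw:.1f} TW, i.e. {ratio:.2f}x current")
# → target = 7.3 TW, i.e. 3.17x current
```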

Written by Leisureguy

23 April 2021 at 4:15 pm

The Case of the Sovereign Individual: Unlocking the Mystery of Rey Rivera

leave a comment »

A Small Subset of Agora-World, 1980-Present

Dave Troy has a fascinating article in Medium that seems to expose the tip of an iceberg:

On the evening of May 16th, 2006, Rey Rivera received a phone call, left his north Baltimore house in a hurry, and was never heard from again. Several days later, his body was found after his co-workers observed a hole in a lower roof of the Belvedere Hotel, located near his employer’s offices. With the help of the hotel’s staff, they located Rey’s badly injured and decomposed body in an unused conference room. Rey had been dead several days, and it is widely believed he died the same day he disappeared, probably as a result of a fall through the building’s lower roof. Police ruled the cause of his death as “undetermined.”

I generally research and write about disinformation—not unsolved murder mysteries. So in that sense, this case is completely outside of my normal area of expertise. But fate had other plans. You see, the Belvedere Hotel is across the street from my office in Baltimore. And I know Baltimore intimately—its quirks, characters, and irregularities—and I also am familiar with Rey’s former employer, Agora Financial.

And it was through my “Big History” essay series about the events of January 6th that I was directed back to both Agora and to Rey Rivera. After compiling some recent research on Agora’s history, I thought to re-watch Season 1, Episode 1 (Mystery on a Rooftop) of the 2020 Netflix reboot of “Unsolved Mysteries.” My memories about the Rivera case flooded back—and I realized there was a major overlap with my “Big History” analysis.

The show relied heavily on interviews with two investigative reporters, Stephen Janis and Jayne Miller. As fate would have it, I have known both of them for years, and count them as friends. I spoke with both of them at some length, and was able to flesh out my conclusions.

Agora Publishing was founded in 1978 by Bill Bonner and James Dale Davidson around a single financial publication called “International Living,” with appeal to people who enjoyed travel, lucrative investments, and importantly, minimizing their tax bills. The US tax code at that time (even more than now) was a labyrinth of loopholes and shelters such that anyone with sufficient motivation could find any number of ways to disguise profits as losses, and minimize taxes to near zero.

Over the next few decades, Agora sprouted a plethora of newsletter products directed at different sub-audiences. One of the newsletters they started was called “The Strategic Investor.” Its editors were Bill Bonner, James Dale Davidson, and William Rees-Mogg—the former editor of the London Times and a member of the House of Lords—whom they cheekily sold as being connected to the MI5 intelligence agency.

In 1996, James Dale Davidson and William Rees-Mogg published a book called The Sovereign Individual, predicting the rise of phenomena such as cryptocurrency (they called it Digital Cash) and the eventual collapse of the nation state. The book advised people to start shopping for multiple residences and looking for jurisdictions that would give them the “best deal” in terms of taxes, perks, and other living arrangements. The hardcore libertarian investor Peter Thiel (and now CEO of Palantir) wrote the foreword for more recent revisions of the book.

This is the story of how Rey Rivera fell into the secret world behind Agora—and didn’t make it out. . .

Continue reading. There’s much more. And follow the links in the article.

Written by Leisureguy

22 April 2021 at 12:17 pm

A short history of a wrong direction the US embraced

leave a comment »

Heather Cox Richardson reviews some of the decisions and directions that brought the US to its current situation:

America today is caught in a plague of gun violence.

It wasn’t always this way. Americans used to own guns without engaging in daily massacres. Indeed, it always jumps out at me that the infamous St. Valentine’s Day Massacre of 1929, when members of one Chicago gang set up and killed seven members of a rival gang, was so shocking it led to legislation that prohibits automatic weapons in the U.S.

Eighty-nine years later, though, in 2018, another Valentine’s Day shooting at Marjory Stoneman Douglas High School in Parkland, Florida, killed 17 children and wounded 17 others. In response, then-President Donald Trump called for arming teachers, and the Republican-dominated Florida legislature rejected a bill that would have limited some high-capacity guns.

Our acceptance of violence today stands in striking contrast to Americans’ horror at the 1929 Valentine’s Day Massacre.

Today’s promotion of a certain kind of gun ownership has roots in the politics of the country since the Supreme Court handed down the 1954 Brown v. Board of Education of Topeka, Kansas, decision, which declared racial segregation in public schools unconstitutional. Since Democratic President Franklin Delano Roosevelt instituted a government that actively shaped the economy, businessmen who hated government regulation tried to rally opposition to get rid of that government. But Americans of the post-World War II years actually liked regulation of the runaway capitalism they blamed for the Great Depression.

The Brown v. Board decision changed the equation. It enabled those who opposed business regulation to reach back to a racist trope from the nation’s Reconstruction years after the Civil War. They argued that the active government after World War II was not simply regulating business. More important, they said, it was using tax dollars levied on hardworking white men to promote civil rights for undeserving Black people. The troops President Dwight Eisenhower sent to Little Rock Central High School in 1957, for example, didn’t come cheap. Civil Rights, then, promoted by the newly active federal government, were virtually socialism.

This argument had sharp teeth in the 1950s, as Americans recoiled from the growing influence of the U.S.S.R., but it came originally from the Reconstruction era. Then, white supremacist southerners who were determined to stop the federal government from enforcing Black rights argued that they were upset about Black participation in society not because of race—although of course they were—but rather because poor Black voters were electing lawmakers who were using white people’s tax dollars to lay roads, for example, or build schools.

In contrast to this apparent socialism, southern Democrats after the Civil War lionized the American cowboy, whom they mythologized as a white man (in fact, a third of the cowboys were men of color) who wanted nothing of the government but to be left alone (in reality, the cattle industry depended on the government). Out there on the western plains, the mythological cowboy worked hard for a day’s pay for moving cattle to a railhead, all the while fighting off Indigenous Americans, Mexicans, and rustlers who were trying to stop him.

That same mythological cowboy appeared in the 1950s to stand against what those opposed to business regulation and civil rights saw as the creeping socialism of their era. By 1959, there were 26 Westerns on TV, and in March 1959, eight of one week’s top shows were Westerns. They showed hardworking cowboys protecting their land from evildoers. The cowboys didn’t need help from their government; they made their own law with a gun.

In 1958, Republican Senator Barry Goldwater of Arizona rocketed to prominence after he accused the president from his own party, Dwight Eisenhower, of embracing “the siren song of socialism.” Goldwater had come from a wealthy background after his family cashed in on the boom of federal money flowing to Arizona dam construction, but he presented himself to the media as a cowboy, telling stories of how his family had come to Arizona when “[t]here was no federal welfare system, no federally mandated employment insurance, no federal agency to monitor the purity of the air, the food we ate, or the water we drank,” and that “[e]verything that was done, we did it ourselves.” Goldwater opposed the Brown v. Board decision and Eisenhower’s decision to use troops to desegregate Little Rock Central High School.

Increasingly, those determined to destroy the postwar government emphasized the hardworking individual under siege by a large, grasping government that redistributed wealth to the undeserving, usually people of color. A big fan of Goldwater, Ronald Reagan famously developed a cowboy image even as he repeatedly warned of the “welfare queen” who lived large on government benefits she stole.

As late as 1968, the National Rifle Association supported some forms of gun control, but that changed in the 1980s as the organization affiliated itself with Reagan’s Republican Party. In 1981, an assassin attempted to kill the president and succeeded in badly wounding him, as well as injuring the president’s press secretary, James Brady, and two others. Despite pressure to limit gun ownership, in 1986, under pressure from the NRA, the Republican Congress did the opposite: it passed the Firearms Owners’ Protection Act, which erased many of the earlier controls on gun ownership, making it easier to buy, sell, and transport guns across state lines.

In 1987, Congress began to consider the Brady Handgun Violence Prevention Act, otherwise known as the Brady Bill, to require background checks before gun purchases and to prevent certain transfers of guns across state lines. As soon as the measure was proposed, the NRA shifted into high gear to prevent its passage. The bill did not pass until 1993, under President Bill Clinton’s administration. The NRA set out to challenge the law in the courts.

While the challenges wound their way upward, the idea of individuals standing against a dangerous government became central to the Republican Party. . .

Continue reading. And do read the whole thing. It’s good to be reminded that choices have long-lasting impact.

Written by Leisureguy

20 April 2021 at 10:23 am

The conscious self constructed of memes one adopts: what happens when one’s basic meme set is not consistent?

leave a comment »

Pankaj Mishra, writing in the New Yorker, reviews a biography of Edward Said. From that review:

. . . Multiple and clashing selves were Said’s inheritance from the moment of his birth, in 1935, in West Jerusalem, where a midwife chanted over him in both Arabic and Hebrew. The family was Episcopalian and wealthy, and his father, who had spent years in America and prided himself on having light skin, named him after the Prince of Wales. Said always loathed his name, especially when shortened to Ed. Sent as a teen-ager to an American boarding school, Said found the experience “shattering and disorienting.” Trained at Princeton and Harvard as a literary scholar in a Euro-American humanist tradition, he became an enthusiast of French theory, a partisan of Michel Foucault. In “Orientalism,” published two decades into a conventional academic career, Said unexpectedly described himself as an “Oriental subject” and implicated almost the entire Western canon, from Dante to Marx, in the systematic degradation of the Orient.

“Orientalism” proved to be perhaps the most influential scholarly book of the late twentieth century; its arguments helped expand the fields of anti-colonial and post-colonial studies. Said, however, evidently came to feel that “theory” was “dangerous” to students, and derided the “jaw-shattering jargonistic postmodernisms” of scholars like Jacques Derrida, whom he considered “a dandy fooling around.” Toward the end of his life, the alleged professor of terror collaborated with the conductor Daniel Barenboim to set up an orchestra of Arab and Israeli musicians, angering many Palestinians, including members of Said’s family, who supported a campaign of boycott and sanctions against Israel. While his handsome face appeared on the T-shirts and posters of left-wing street protesters worldwide, Said maintained a taste for Rolex watches, Burberry suits, and Jermyn Street shoes right up to his death, from leukemia, in 2003.

“To be a Levantine is to live in two or more worlds at once without belonging to either,” Said once wrote, quoting the historian Albert Hourani. “It reveals itself in lostness, pretentiousness, cynicism and despair.” His melancholy memoir of loss and deracination, “Out of Place” (1999), invited future biographers to probe the connection between their subject’s cerebral and emotional lives. Timothy Brennan, a friend and graduate student of Said’s, now warily picks up the gauntlet, in an authorized biography, “Places of Mind” (Farrar, Straus & Giroux). Scanting Said’s private life, including his marriages and other romantic liaisons, Brennan concerns himself with tracing an intellectual and political trajectory. One of the half-concealed revelations in the book is how close Said came, with his Levantine wealth and Ivy League education, to being a somewhat refined playboy, chasing women around the Eastern Seaboard in his Alfa Romeo. In Jerusalem, Said went to St. George’s, a boys’ school for the region’s ruling castes. In Cairo—where his family moved in 1947, shortly before Jewish militias occupied West Jerusalem—he attended the British-run Victoria College. There he was chiefly known for his mediocre marks and insubordinate ways; his classmates included the future King Hussein of Jordan and the actor Omar Sharif.

Cairo was then the principal metropolis of a rapidly decolonizing and politically assertive Arab world. The creation of the state of Israel—following a U.N. resolution, on Palestinian land—and the refugee crisis and wars that ensued were on everyone’s mind. Yet Said inhabited a bubble of affluent cosmopolitans, speaking English and French better than Arabic, and attending the local opera. When he was six years old, he started playing the family piano, a Blüthner baby grand from Leipzig, and he later received private lessons from Ignace Tiegerman, a Polish Jew famous for his interpretations of Brahms and Chopin. Said’s father, who ran a successful office-supply business, was socially ambitious, and his time in America had given him a lasting admiration for the West. At one point, he considered moving his entire family to the United States. Instead, in 1951, he contented himself with dispatching his son to Northfield Mount Hermon School, in rural Massachusetts.

Brennan shows how much Said initially was, as he once confessed, a “creature of an American and even a kind of upper-class wasp education,” distanced from the “uniquely punishing destiny” of an Arab Palestinian in the West. Glenn Gould recitals in Boston appear to have registered more with him than the earthquakes of the post-colonial world, such as the Great Leap Forward or the anti-French insurgency in Algeria. The Egyptian Revolution erupted soon after Said left for the U.S., and a mob of protesters burned down his father’s stationery shop. Within a decade, the family had moved to Lebanon. Yet these events seem to have had less influence on Said than the political currents of his new country did. Brennan writes, “Entering the United States at the height of the Cold War would color Said’s feelings about the country for the rest of his life.” Alfred Kazin, writing in his journals in 1955, already worried that intellectuals had found in America a new “orthodoxy”—the idea of the country as “world-spirit and world hope.” This consensus was bolstered by a professionalization of intellectual life. Jobs in universities, media, publishing, and think tanks offered former bohemians and penurious toilers money and social status. Said began his career at precisely this moment, when many upwardly mobile American intellectuals became, in his later, unforgiving analysis, “champions of the strong.”

Nonetheless, his own early impulse, born of an immigrant’s insecurity, was, as he later put it, to make himself over “into something the system required.” His earliest intellectual mentors were such iconic figures of American literary culture as R. P. Blackmur and Lionel Trilling. He wrote a prize-winning dissertation on Conrad; he read Sartre and Lukács. In his early writings, he faithfully absorbed all the trends then dominant in English departments, from existentialism to structuralism. Devoted to Chopin and Schumann, he seems to have been as indifferent to blues and jazz as he was to Arabic music. He adored Hollywood movies, but there is no evidence that, in this period, he engaged with the work of James Baldwin or Ralph Ellison, or had much interest in the civil-rights movement. When students protesting the war in Vietnam disrupted a class of his, he called campus security.

Brennan detects a hint of what was to come in a remark of Said’s about the dual selves of Conrad: one “the waiting and willing polite transcriber who wished to please, the other an uncooperative demon.” Much impotent anger seems to have long simmered in Said as he witnessed “the web of racism, cultural stereotypes, political imperialism, dehumanizing ideology holding in the Arab or the Muslim.” In a conversation filmed for Britain’s Channel 4, Said claimed that many of his cultural heroes, such as Isaiah Berlin and Reinhold Niebuhr, were prejudiced against Arabs. “All I could do,” he said, “was note it.” He watched aghast, too, the critical acclaim for “The Arab Mind,” a 1973 book by the Hungarian Jewish academic Raphael Patai, which described Arabs as a fundamentally unstable people.

It’s not hard to see how Said, upholding the “great books” courses at Columbia, would have come to feel intensely the frustrations that writers and intellectuals from countries subjugated by Europe and America had long experienced: so many of the canonical figures of Western liberalism and democracy, from John Stuart Mill to Winston Churchill, were contemptuous of nonwhite peoples. Among aspiring intellectuals who came to the U.S. and Europe from Asia, Africa, and Latin America, a sense of bitterness ran especially deep. Having struggled to emulate the cultural élite of the West by acquiring a knowledge of its literature and philosophy, they realized that their role models remained largely ignorant of the worlds they had come from. Moreover, the steep price of that ignorance was paid, often in blood, by the people back home.

It was the Six-Day War, in 1967, and the exultant American media coverage of Israel’s crushing victory over Arab countries, that killed Said’s desire to please his white mentors. He began reaching out to other Arabs and methodically studying Western writings about the Middle East. In 1970, he met Arafat, initiating a long and troubled relationship in which Said undertook two equally futile tasks: advising the stubbly, pistol-toting radical on how to make friends and influence people in the West, and dispelling Arafat’s impression that he, Said, was a representative of the United States. . .

Read the whole thing.

Written by Leisureguy

19 April 2021 at 6:09 pm
