Later On

A blog written for those whose interests more or less match mine.

Archive for the ‘Education’ Category

Edward de Bono has passed away


Edward de Bono looms large in my legend. He is among the authors on my list of books I find myself repeatedly recommending. The specific book I mention is Po: Beyond Yes and No, but he wrote many books, and I read a substantial number of them and did my best to apply what I learned from them.

He also established a foundation, the Cognitive Research Trust (CoRT), which publishes an excellent set of materials to teach critical thinking skills to young children, a program I wish would be universally adopted. (I doubt it will be. Children who learn critical thinking skills will start using them, to the dismay of parents who do not welcome questioning, thought, or dialogue.)

Stuart Jeffries writes de Bono’s obituary in the Guardian:

The thinker and writer Edward de Bono, who has died aged 88, once suggested that the Arab-Israeli conflict might be solved with Marmite. During a 1999 lecture to Foreign Office officials, the originator of the term lateral thinking argued that the yeast extract, though proverbially socially divisive, could do what politicians and diplomats had failed for years to achieve. The problem, as he saw it, was that people in the Middle East eat unleavened bread and so lack zinc, which makes them irritable and belligerent. Feeding them Marmite, therefore, would help create peace.

Through his 60-plus books, including The Mechanism of Mind (1969), Six Thinking Hats (1985), How to Have A Beautiful Mind (2004) and Think! Before It’s Too Late (2009) [I’ve added links to inexpensive secondhand copies of the books. – LG], as well as seminars, training courses and a BBC television series, De Bono sought to free us from the tyranny of logic through creative thinking. “What happened was, 2,400 years ago, the Greek Gang of Three, by whom I mean Aristotle, Plato, and Socrates, started to think based on analysis, judgment, and knowledge,” he said. “At the same time, church people, who ran the schools and universities, wanted logic to prove the heretics wrong. As a result, design and perceptual thinking was never developed.”

De Bono’s revolution began in 1967 with his book The Use of Lateral Thinking. Imagine, he said, that a money lender claims a merchant’s daughter in lieu of her father’s debt. The merchant and daughter concoct a compromise. The money lender will put a black stone in one bag and in the other, a white. If the daughter chooses the black stone, she will be doomed to marry the money lender and the debt will be cancelled; if the white, she will stay with her father and the debt will still be cancelled. But as the trio stand on a pebble-strewn path, she notices the money lender putting a black stone in each bag. What should she do to avoid a nightmarish fate?

This is where lateral thinking – ie, employing unorthodox means to solve a problem – comes in. De Bono suggested the daughter should pick either bag, but fumble and drop her stone on to the path. “Since the remaining pebble is of course black, it must be assumed she picked the white pebble, since the money lender dare not admit his dishonesty.”

What De Bono called vertical thinking, typified by logic, would be useless in reaching this elegant solution. It is lateral thinking that creates new ideas – Einstein and Darwin, according to De Bono, were lateral thinkers. “Studies have shown that 90% of error in thinking is due to error in perception. If you can change your perception, you can change your emotion [a point stressed by Stephen Covey in 7 Habits of Highly Effective People — see this brief outline. – LG] and this can lead to new ideas. Logic will never change emotion or perception.”

De Bono believed humour was one of the most significant characteristics of the human mind, precisely for its basis in shifting perceptions. “Let me tell you a joke,” he said. “An old man dies and goes to hell. When he gets there, he sees his friend, a 90-year-old man, with a beautiful woman sitting on his knee. He says to his friend, ‘This can’t be hell, you’re not being punished, you’re having fun!’, to which his friend replies, ‘This is punishment – for her!’”

His most trenchant thinking concerned children’s education. “Schools waste two-thirds of the talent in society. The universities sterilise the rest,” he said. The Maltese thinker was particularly scathing of Britain, where, he claimed, rigid thinking and an obsession with testing led to many children leaving school “believing they are stupid. They are not stupid at all, many are good thinkers who have never had the chance to show it. But that lack of confidence will pervade the rest of their lives.”

Rather than teaching children to absorb information and repeat it, he argued, schools should equip them to think creatively. He once did a study in which he asked children to design a sleep machine, an elephant-weighing machine, a system for constructing a house and a system for building a rocket. His 1972 book Children Solve Problems described the results.

In Six Thinking Hats, De Bono suggested that business meetings might be more efficient if attendees wore imaginary colour-coded hats. The black hat signified negative or realistic thoughts; white, information; red, emotion; blue, management; green, creativity; and yellow, optimism. Everyone in the meeting would figuratively place a coloured hat on their heads. This way, he claimed, “ego would be taken out of the situation”.

The method found its devotees. Motorola, IBM, and Boeing reported cutting meeting times by half by applying it. De Bono reported that one of his clients, Ron Barbaro of Prudential Insurance, said that when executives objected that an idea was too risky, “he would say: ‘Yes, that’s fine black hat thinking. Now let’s try the yellow hat.’”

De Bono was convinced of its importance. “The Six Thinking Hats method may well be the most important change in human thinking for the past 2,300 years,” he wrote in the preface to the book.

Certainly, he was rarely burdened with humility, informing the world that his childhood nickname was “Genius”. By contrast, he did not suffer detractors gladly. Years after a stinking review of Six Thinking Hats appeared in the Independent, written by Adam Mars-Jones, De Bono told the Guardian: “That book, we know, has saved $40m dollars and tens of thousands of man-hours. Now, some silly little idiot, trying to be clever, compared to the actual results, that just makes him look like a fool.”

Mars-Jones retorted that when his review appeared, De Bono “wrote to the editor [saying] … that he was entitled to compensation for the loss of earnings which my comments had inflicted on his lecture tours (which he assessed at £200,000). He seemed less taken with my proposal that he pay a dividend to every journalist who, by taking him seriously, had inflated his earning power.”

Born in Saint Julian’s Bay, Malta, Edward was the son of Joseph de Bono, a physician, and Josephine (nee O’Byrne), an Irish journalist. He went to St Edward’s college in Malta and jumped classes twice. “I was always three or four years younger than anyone else in my class.”

He qualified as a doctor at the Royal University of Malta before going to Christ Church, Oxford, as a Rhodes scholar to study for a master’s in psychology and physiology (1957), and a DPhil in medicine (1961). There, he represented the university in both polo and rowing, and set two canoeing records, one for paddling 112 miles from Oxford to London nonstop.

Following graduation he worked at Oxford as a . . .

Continue reading. An amazing man with many good ideas.

Written by Leisureguy

17 June 2021 at 3:26 pm

To ban teaching about systemic racism is a perfect example of systemic racism


I am indebted to The Eldest for pointing out the nice recursion of the title. Someone then commented about a video of a teacher who totally understands teenagers:

Teacher: I’m not allowed to teach you about critical race theory.

Class: What’s that?

Teacher: I’m not allowed to tell you.

Class: What?? Not fair! (Then they all looked it up in Wikipedia.)

Chris Argyris, in his (excellent) books on management theory, examines what distinguishes a learning organization from one that resists learning. One difference, of course, is long-term success versus failure, but organizations that resist learning also typically have double-layer taboos on some topics within the organization: not only can you not talk about X, you also cannot talk about not talking about X. It will be interesting to see whether the Right is so far gone they will prohibit teachers from explaining why they cannot teach critical race theory. (My guess is that the Right is indeed so far gone — and even farther.)

Written by Leisureguy

17 June 2021 at 2:25 pm

Fixing equipment in the lab teaches life lessons


I imagine we’ve all found that we learn a lot about how to do something by actually doing it. Experience is a fantastic teacher since it provides only feedback, not criticism, allows you to see for yourself the effects of mistakes you make, and offers the chance for active participation in finding solutions. James Crawford writes in Nature of what he learned from hands-on involvement:

The focus of my PhD thesis is examining ways of upgrading biomass to transportation fuels, and I regularly use a variety of analytical equipment and reactor systems in my laboratory at the Colorado School of Mines in Golden.

An expert is rarely on-site to assist with equipment repairs, which can range from simple tasks, such as replacing a used gasket on a vacuum chamber, to cumbersome rebuilds for pumps, furnaces, mass spectrometers and adsorption analysers.

Paying for specialist help is often financially out of the question — and, although reading through a manual for a broken stir plate might not be a bucket-list item, I have found, over the past four years of graduate school, that understanding and repairing equipment has given me more valuable experiences than I’d expected.

What I learnt from fixing a temperature controller

One key role of a chemical engineer working in industry or in the lab is process control — monitoring and controlling an operation to achieve the desired temperature, pressure, concentration or any other important parameter.

During my undergraduate studies at Montana State University in Bozeman, we were taught the fundamental theory, essential rules of thumb and computational methods for process control, but application of the knowledge was limited.

In the second year of my PhD, I had a chance to apply this knowledge when a power surge destroyed a previously functioning heater and temperature controller. This equipment worked in the same way as a household boiler and thermostat: tell the machine what temperature you want, and the system attempts to hit that target. Behind the scenes, control parameters determine how aggressively the system pursues that target.
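[The article doesn’t name the control algorithm, but lab temperature controllers of this kind are typically PID (proportional-integral-derivative) loops, and the stored “control parameters” are the three PID gains. The sketch below is my own illustration, with a made-up thermal model; it is not the author’s actual setup. – LG]

```python
# Minimal PID control-loop sketch (an assumption: the article never
# specifies PID, but it is the standard for lab temperature control).
# kp, ki, kd are the "control parameters" that set how aggressively
# the controller pursues the setpoint.

def pid_step(setpoint, measured, state, kp=2.0, ki=0.1, kd=0.5, dt=1.0):
    """One controller update. state = (integral, previous_error).
    Returns (heater output, new state)."""
    error = setpoint - measured
    # I term accumulates past error; clamp it to limit integrator windup.
    integral = max(-500.0, min(state[0] + error * dt, 500.0))
    derivative = (error - state[1]) / dt          # D term: error slope
    output = kp * error + ki * integral + kd * derivative
    return output, (integral, error)

# Toy simulation: a lossy heater block chasing a 300 C setpoint.
temp, state = 25.0, (0.0, 0.0)
for _ in range(200):
    power, state = pid_step(300.0, temp, state)
    power = max(0.0, min(power, 100.0))           # heater limited to 0-100%
    temp += 0.05 * power - 0.01 * (temp - 25.0)   # crude thermal model
print(f"temperature after 200 steps: {temp:.1f} C")
```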

After the power surge, the temperature controller would not power on. I enlisted the help of a fourth-year PhD student to diagnose and repair the damage. But, in the process of rebuilding, stored parameters in the memory of the temperature controller were lost.

I thought back to my undergraduate courses — and, after watching a few YouTube videos with the fourth-year student and checking Wikipedia, we successfully tuned and tested the rebuilt heating system. The process of diagnosing the problem, gathering relevant information and developing a solution was really empowering, and motivated me to continue fixing problems in the lab.

I was fortunate to take on this repair under the guidance of a more senior student, whose experience and patience was profoundly influential for me. In a graduate programme, it can be hard to find time to help others, so his efforts mentoring me were deeply appreciated and transformative for my future endeavours working with other students.

What I learnt from fixing a chemisorption analyser

A year later, I was trying to work out the structure of some catalyst materials that I had synthesized. Chemisorption, or the adsorption of vapour molecules in a sample, is a valuable analytical technique that provides information about the surface chemistry of a catalyst.

After being trained, I attempted to run my samples on our chemisorption system, but I found that the data were not reproducible. I spoke to some colleagues, and it became clear that the instrument was somehow malfunctioning, and so was being used only for basic qualitative analysis. For my purposes, it was important to fix the instrument so that the data collected from it were reproducible and quantitative. I got permission from the principal investigator in charge of the instrument, and teamed up with a chemistry PhD student to resolve the problem.

We ran a standard sample on the chemisorption instrument. This process would normally be automated, but we needed to catch the error as it occurred. We monitored the progress of the experiment for 12 hours. Taking turns watching the instrument and keeping notes, we discovered that a portion of the tubing was blocked: when gases were sent to . . .

Continue reading. There’s more.

My own experience of learning by doing was finally nailing down how to make good tempeh, which also involved learning how to build a good incubator (it took two versions to learn how best to build the next one, though version 2 is perfectly workable). But the most recent batch is the best yet, and next will be chickpea-and-peanut tempeh.

Written by Leisureguy

17 June 2021 at 12:25 pm

Why People Fall For Conspiracy Theories


In FiveThirtyEight, Kaleigh Rogers and Jasmine Mithani give a clear explanation of why people succumb to conspiracy theories. They write:

Think of a conspiracy theorist. How do they see the world? What stands out to them? What fades into the background? Now think of yourself. How does the way you see things differ? What is it about the way you think that has stopped you from falling down a rabbit hole?

Conspiracy theories have long been part of American life, but they feel more urgent than ever. Innocuous notions like whether the moon landing was a hoax feel like child’s play compared to more impactful beliefs like whether vaccines are safe (they are) or the 2020 election was stolen (it wasn’t). It can be easy to write off our conspiracy theorist friends and relatives as crackpots, but science shows things are far more nuanced than that. There are traits that likely prime people to be more prone to holding these beliefs, and you may find that when you take stock of these traits, you aren’t far removed from your cousin who is convinced the world is run by lizard people.

Flying to conclusions

Let’s begin our tour of cognitive fallacies by going bird-watching. Picture, if you will, avid bird-watcher Fivey Fox at their two favorite spying spots. In one habitat, there are more cardinals than bluebirds — a 60:40 ratio — so Fivey calls that habitat “Big Red.” In the other habitat, “Big Blue,” there is an opposite ratio of bluebirds to cardinals (60 bluebirds, 40 cardinals).

In the interactive below, you initially can’t see which spot Fivey is bird-watching in, but you can see what bird they spot. After each sighting, Fivey notes the bird in their notebook, and then you decide whether you have seen enough to guess the correct habitat or you’d like to see more birds. Have a go! . . .
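[The interactive doesn’t reproduce here, but the arithmetic behind it is plain Bayesian updating: each cardinal multiplies the odds in favor of Big Red by 60/40, and each bluebird divides them by the same factor. A quick sketch of the calculation, mine rather than FiveThirtyEight’s: – LG]

```python
# Bayesian updating for the habitat-guessing game (my sketch, not
# FiveThirtyEight's code). Prior: 50/50 between "Big Red" (60% cardinals)
# and "Big Blue" (60% bluebirds).

def posterior_big_red(sightings, prior=0.5):
    """P(habitat is Big Red) after a sequence of 'cardinal'/'bluebird' sightings."""
    p_red = prior
    for bird in sightings:
        like_red = 0.6 if bird == "cardinal" else 0.4   # P(bird | Big Red)
        like_blue = 0.4 if bird == "cardinal" else 0.6  # P(bird | Big Blue)
        joint_red = like_red * p_red
        p_red = joint_red / (joint_red + like_blue * (1 - p_red))
    return p_red

# Three cardinals against one bluebird already puts Big Red near 69%;
# the game asks when that feels like "enough" evidence.
print(posterior_big_red(["cardinal", "cardinal", "bluebird", "cardinal"]))
```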

Continue reading. There’s much more.

Written by Leisureguy

16 June 2021 at 1:38 pm

The value of imitation in the arts


Interesting quotation from David Perell’s newsletter:

I once met a painting coach who tells students to copy their favorite artists.

At first, students resist.

In response, the coach tells them to listen for friction. “Do you hear that resistance? It’s the whisper of your unique style.”

Through imitation, we discover our voice.

Written by Leisureguy

15 June 2021 at 7:28 pm

The Historian and the Murderer


Dominique K. Reill, an associate professor in modern European history at the University of Miami, and the author most recently of The Fiume Crisis: Life in the Wake of the Habsburg Empire, writes in Zócalo:

On May 14, 2018, I was led into a nondescript courtroom in Kew Gardens, Queens to testify at a murder trial. I am a historian who loves details, and the resources involved in getting me into that humdrum room to be questioned with a jury to my left, a judge to my right, and a murderer sitting in front of me astounded me. An entire system of asking, telling, tracking, and filing for the grand finale of live community listening and judging: no wonder so many historians love to study court cases.

From years of obsessively watching Law & Order, I had assumed my questioning would focus on the titillations mass media devours—which was how my name was associated with the crime in the first place. My involvement with the case did not begin on January 31, 2015, when the 42-year-old Croatian historian William Klinger was shot twice in an Astoria park in broad daylight and left to die. After he was declared dead in a New York City emergency room, no one had informed me because I was irrelevant to his life. Three weeks later, however, I got emails and calls because the murderer claimed I was part of why Klinger had died.

Police determined that Klinger had been in the park alone with a friend, 49-year-old Alexander Bonich. They also discovered that Klinger had wired $85,000 to Bonich in order to purchase an apartment in Astoria. Anyone who knows anything about New York real estate can smell a rat in this story. An apartment in Astoria costs about $700,000, if you’re very lucky. In New York City, real estate fraud is a believable motive for killing. Bonich was arrested posthaste.

To counter the murder charge, Bonich insisted he had shot Klinger in self-defense. As Bonich told police and then a New York Times journalist, on the day of his death Klinger behaved strangely. He seemed unhinged, filled with emotional rage triggered by the fact that he had deserted his family in Europe “to meet a woman named Dominique.” With Klinger coming at him, Bonich insisted, he had shot Klinger to forestall Klinger doing the same to him.

There are very few people connected with Croatian academia who share my first name. Within minutes of the New York Times article giving Bonich’s side of the story going live, a friend wrote me an email to alert me. Within an hour, my inbox was filled with queries from journalists and police. The idea that Klinger’s grieving wife and children would have to suffer the killer’s lies cut me to the quick and I responded by contacting anyone I could to set the record straight.

The New York Times immediately erased my name from the article they had published online. I gave journalists, police, and lawyers full access to all my communications with Klinger. At some point, the murderer had also asserted that Klinger and I had had a rendezvous in New York in the days prior to the shooting. To disprove this, it took just a few minutes to supply travel itineraries and credit card statements showing how I was nowhere near New York City at the time.

At Bonich’s trial three years later, I assumed I was being called to the stand to disprove assertions about Klinger’s relationship to me. Imagine my surprise, then, when two minutes into my deposition the prosecutor asked me, “What are the archives?”

In my professional life as a history professor at an elite research university, “what are the archives?” is a question that gets posed regularly, often by professors encouraging students to think about how history “gets made.” When the prosecutor asked me this question, it was in response to my explanation of how I had first met Klinger. I had said “I met him in the archives in Rijeka [Croatia],” assuming this was straightforward. When asked to elaborate, I still assumed that the question was not about the things I usually talk about when discussing archives, but about the nature of my relationship to the deceased.

Was it possible that the prosecutor feared the jury imagined we had met at some nightclub called “The Archives”? Maybe those Queens residents were picturing us drinking cocktails at a bar pretentiously decorated with old-school card catalogs, green banker’s lamps, and anachronistic maps? So, instead of answering what archives were in a professional sense, I focused on how unsexy—how all work, 8 a.m.-to-2 p.m. no fun—they are.

Here is where it became clear that all my assumptions about why I was in that courtroom were wrong. As I was explaining how archivists regularly introduce scholars to each other in the reading room, the defense attorney called out: “Judge, I’m going to object to the witness being nonresponsive.” Though the judge overruled the objection, the effect of the defense attorney’s intervention was significant.

From then on, my job in the almost 80 questions that followed was not to disabuse the court of ideas of adulterous encounters but instead to explain what this strange profession of “historian” was, and what role it played in bringing Klinger into that Astoria park on the day he died.

I told the jury how Klinger had attended some of the most prestigious institutions in Europe, how he had published widely in several languages, and how he was generally considered the expert in his field, even though he could not find permanent, full-time employment anywhere. A long-time motto repeated ad nauseam in academia is “Publish or perish.” In essence, I was there to explain how this historian perished in our profession even though he had published, and how his professional disappointment set him up for associating with someone who would kill him for real.

When reading over the court transcripts, it is hard to remember that we were all sitting together in that room because a man had died. The questions were not about Klinger or his murderer. Instead, they focused on the intricacies of how difficult it is for a historian to make a living.

I explained how historians can’t get academic jobs through individual merits in the U.S. or Europe. You need networks. I talked about “markets,” the expectations of what CVs (the academic term for resumes) should look like, and how getting noticed by universities is dependent not just on productivity but also on references from people of great esteem. With every explanation I gave, another question came up. What is a postdoc? What is an editor? What is a letter of recommendation? How does anyone get paid?

The questions kept coming because the answers I was giving made no sense to how people imagined someone survived as a professional historian. Weren’t historians like artists or writers? Wasn’t their worth and position dependent on the quality of what they produced? Or maybe they were like journalists, paid per column or through working on producing publications? Or maybe historians were like teachers, their employment opportunities dependent on the degrees they had obtained?

I’m sure it was confusing when I told the lawyers, judge, and jury about how the writing and publishing process works. I said: “People don’t make money working for journals; you do it as a volunteer for the state of the field. There are no paying jobs.” Both the defense attorney and the prosecutor had been under the impression that Klinger’s arrival in the United States would solve his miserable professional status in Europe. My testimony underscored that it was far from the truth—but that Klinger didn’t know it, and that’s what made him vulnerable.

Though he had published much and the solidity of his research was undeniable, Klinger had not proven himself as a man who worked within structures. He had never taught in an American classroom. He had no portfolio of teaching evaluations. He had not participated in a research facility where interdisciplinary collaboration was emphasized. He had almost no links within the wider profession, meaning there were few who could vouch for him to those outside his relatively obscure specialty. This also meant he could not help future students procure positions.

Klinger did history like a starving artist might: he worked alone, he published in the easiest and quickest (rather than the most prestigious) journals, and he struggled to broaden his profile. His lack of networks was partly a result of the fact that no one in Italy or Croatia would give him a permanent position. But it was also partly because he was so passionate about the researching and writing that he didn’t prioritize the other stuff.

I had explained to Klinger “at the archives” and in emails what I had said in court: procuring permanent employment in the United States is a slow, networked, highly professionalized process that proves unsuccessful for most. I had told him explicitly that there is no way to just publish, come, and get a job. But Klinger ignored me and decided instead to believe a man who told him what he wanted to hear.

Apparently, Bonich promised Klinger all: not just an apartment but also a job at Hunter College in New York City based on his qualifications, with no application, interview, or letters of recommendation required. That is as inconceivable as the $85,000 price tag for an Astoria apartment. Nonetheless, Klinger wanted to believe. The murderer also told the court Klinger had deserted his family in part because I had arranged a position for him as a journal editor in Maryland, one which would pay enough for him to build a new life for himself.

This, too, was not just a lie; it was impossible.

It didn’t matter that Klinger and I barely knew each other. It didn’t matter that the journal the killer named did not exist. It also didn’t matter that history journals do not pay book review editors. The killer told those lies because he thought they were believable, because that’s how he thought the historical profession worked. Just like Klinger, Bonich did not realize that there are almost no historians in the world who can survive on their writing, their editorships, or their qualifications. Historians in the United States are paid for how they work within institutions. And getting into the institutions is a herculean feat only the most obstinate should try to undertake.

We’ll never know how  . . .

Continue reading. There’s much more.

Written by Leisureguy

14 June 2021 at 1:46 pm

Exit the Fatherland: Germany’s work to rebuild its common culture


In the decades following WWII, Germany deliberately and slowly reformed its cultural outlook and common values, working at every level to create a society that doesn’t encourage blindly following a leader. The US seems in need of a similar effort to build a culture of common values and understanding, and looking at how Germany did it might help (though the US seems loath to learn from other countries’ experience). – See update below.

Helmut Walser Smith, the Martha Rivers Ingram Chair of History and professor of history at Vanderbilt University in Nashville, and author of The Butcher’s Tale: Murder and Antisemitism in a German Town (2002), The Continuities of German History: Nation, Religion, and Race across the Long 19th Century (2008), and Germany: A Nation in Its Time (2020), writes in Aeon:

After 12 years of fascism, six years of war, and the concentrated genocidal killing of the Holocaust, nationalism should have been thoroughly discredited. Yet it was not. For decades, nationalist frames of mind continued to hold. They prevailed on both sides of the so-called Iron Curtain and predominated in the Global North as well as in the developing world of the Global South. Even in the Federal Republic of Germany, the turn away from ‘the cage called Fatherland’ – as Keetenheuve, the main character in Wolfgang Koeppen’s novel The Hothouse (1953), called his depressingly nationalistic West Germany – didn’t commence immediately.

When the turn did begin, however, Keetenheuve’s country would set out on a remarkable journey – not one racing down the highway to cosmopolitanism, but rather a slow one that required a series of small steps leading to the gradual creation of a more pacific, diverse and historically honest nation – a better Germany.

After the collapse of the Third Reich, Germans widely blamed other countries for the Second World War. ‘Every German knows that we are not guilty of starting the war,’ asserted the Nazi journalist Hildegarde Roselius in 1946. With ‘every German’, this acquaintance of the American photographer Margaret Bourke-White certainly exaggerated. But in 1952, 68 per cent of Germans polled gave an answer other than ‘Germany’ to the question of who started the Second World War, and it was not until the 1960s that this opinion fell into the minority.

In the mid-1950s, nearly half of all Germans polled said ‘yes’ to the proposition that ‘were it not for the war, Hitler would have been one of the greatest statesmen of the 20th century.’ Until the late 1950s, nearly 90 per cent gave an answer other than ‘yes’ when asked if their country should recognise the Oder-Neisse line, the new border with Poland. Perhaps most revealing of all was their stance on Jews. On 12 June 1946, Hannah Arendt hazarded the opinion to Dolf Sternberger, one of occupied Germany’s most prominent publicists, that ‘Germany has never been more antisemitic than it is now.’ As late as 1959, 90 per cent of Germans polled thought of Jews as belonging to a different race – while only 10 per cent thought of the English in these terms.

The sum of these attitudes suggests that Keetenheuve’s cage called Fatherland remained shut for more than two decades after the fall of the Third Reich.

Like most of Europe and indeed the world, Germany lacked a powerful alternative discourse to nationalism. Until the 1970s, the United Nations Declaration of Human Rights possessed little traction in postwar Europe. Regional affiliations, such as those to Europe (or Pan-Africanism or Pan-Arabism), were more viable but as yet confined to a small number of elites. Strident defences of capitalism also did little to deplete the store of nationalist tropes. And on the western side of the Iron Curtain, anti-Communism supported rather than undermined Nazi-inspired nationalism.

The postwar world was, moreover, awash in new nation-states, especially as it shaded into the postcolonial era. In 1945, there were only 51 independent countries represented at the UN: 30 years later, there were 144. Whether in Jawaharlal Nehru’s India or Kwame Nkrumah’s Ghana, nationalism and promises of self-determination fired anti-colonial independence movements in Asia and Africa. In Europe, nationalism also continued to shape claims to group rights and territorial boundaries. In Germany, divided and not fully sovereign until 1990, it informed discussion over eventual unification, the right of the ethnic German expellees to return to their east European homelands, and the validity of Germany’s eastern borders. Indeed, it wasn’t until 1970, a quarter-century after the war, that the Federal Republic of Germany finally recognised as legitimate the German border (established at the Potsdam Conference in 1945) with Poland. And still nearly half the citizens of West Germany opposed the recognition.

The pervasiveness of exclusionary nationalism in the postwar period also reflected a new underlying reality. The Second World War had created a Europe made up of nearly homogeneous nation-states. A series of western European countries, now thought of as diverse, were at that time just the opposite. The population of West Germany who were born in a foreign country stood at a mere 1.1 per cent, and the minuscule percentage proved paradigmatic for the tessellated continent as a whole. The Netherlands had a still smaller foreign-born population, and foreigners made up less than 5 per cent of the population in Belgium, France and Great Britain. In the interwar years, eastern European countries such as Poland and Hungary had significant ethnic minorities and large Jewish populations. In the postwar period, both were all but gone, and Poles and Hungarians were largely on their own.

Nor, in the trough of deglobalisation, did Europeans often get beyond their own borders, and Germans were no exception. In 1950, most Germans had never been abroad, except as soldiers. Some 70 per cent of the adult women had never left Germany at all. Travel, a luxury enjoyed by the few, didn’t begin to pick up until the mid-1950s, while international travel became a truly mass phenomenon only in the 1970s, when most people had cars of their own. In the first decades of tourism, Germans mainly visited German-speaking destinations, such as the castles on the Rhine or the northern slopes of the Alps. In these decades, few Germans, save for the highly educated, knew foreign languages, and most other Europeans, unless migrant workers, were no different.

The cage called Fatherland was thus reinforced. The persistence in a world of nationalism of the habits of thought of a once-Nazified nation-in-arms constituted one set of reinforcements. The relative homogeneity of postwar nations and the lack of peacetime experiences abroad constituted another. There was also a third reinforcement keeping the cage shut. This was that Germans had something to hide.

In the postwar period, Germany was full of war criminals. The European courts condemned roughly 100,000 German (and Austrian) perpetrators. The sum total of convictions by the Second World War allies, including the United States, the Soviet Union and Poland, pushes that number higher still, as does the more than 6,000 offenders that West German courts would send to prison, and the nearly 13,000 that the much harsher judicial regimen of East Germany convicted.

Nevertheless, there was still a great deal left to cover up. Lower down the Nazi chain of command, a dismaying number of perpetrators of various shades of complicity got off without penalty or consequence. Two jarring examples might suffice. Only 10 per cent of Germans who had ever worked in Auschwitz were even tried, and only 41 of some 50,000 members of the murderous German Police Battalions, responsible for killing half a million people, ever saw the inside of a prison.

Trials and sentences reveal only part of the story of complicity. Many Germans not directly involved in crimes had come into inexpensive property and wares. Detailed reports from the north German city of Hamburg suggest that, in that one city alone, some 100,000 people bought confiscated goods at auctions of Jewish wares. Throughout the Federal Republic, houses, synagogues and businesses once belonging to Jewish neighbours were now in German hands. Mutatis mutandis, what was true for the number of people involved in the murder and theft activities of the Third Reich also held true about what people knew. ‘Davon haben wir nichts gewusst’ (‘We knew nothing about that [the murder of the Jews]’), West German citizens never tired of repeating in the first decades after the war. Historians now debate whether a third or even half of the adult population in fact knew of the mass killings, even if most scholars concede that few Germans had detailed knowledge about Auschwitz.

The Germans shared a European fate here as well, even if they had the most to hide. In his trailblazing article ‘The Past Is Another Country: Myth and Memory in Postwar Europe’ (1992), the late Tony Judt pointed out the stakes that almost all of occupied Europe had in covering up collaboration with Nazi overlords. This wasn’t merely a matter of forgetting, as is sometimes assumed. Rather, it involved continuing and conscious concealment. After all, many people (especially in eastern Europe, where the preponderance of Jews had lived) had enriched themselves – waking up in ‘Jewish furs’, as the saying went, and occupying Jewish houses in what was surely one of the greatest forced real-estate transfers of modern history.

For all these reasons, the cage called Fatherland wasn’t easy to leave and, rather than imagine a secret key opening its door, it makes more sense to follow the hard work involved in loosening up its three essential dimensions: a warring nation, a homogeneous nation, and a cover-up nation. It wasn’t until West Germans could take leave of these mental templates that they could even begin to exit the cage. Fortunately, in the postwar era, Germany was blessed with prolonged prosperity, increased immigration, and the passing of time. When brought together with small, often courageous steps of individuals and institutions, these factors allowed West Germans eventually to embrace peace, diversity and the cause of historical truth: in short, to exit the cage.

The vision of ‘a living, not a deathly concept of Fatherland’, as Dolf Sternberger put it in 1947, had already been laid in the early years of occupation. Sternberger, who cut off the ‘A’ from his first name, argued for a different kind of nation, one that commanded openness and engagement but didn’t end in the glorification of killing and dying in war or in the marginalisation and persecution of others. The nation as a source of life, as a caretaker of its citizens, and not as a vehicle for power, expansion, war and death: this was Sternberger’s initial vision.

It was a conception of Germany that West Germans slowly embraced, symbolically replacing the warfare state with the welfare state, swapping barracks and panzers for department stores and high-performance cars. Enabled by . . .

Continue reading. There’s much more. I suspect that this essay is essentially an extract from his latest book.

Update

There’s still work to be done: see the report “German commando unit disbanded over suspected far-right links within ranks,” by Loveday Morris and Luisa Beck, published today (10 June 2021) in the Washington Post. It begins:

German authorities disbanded a Frankfurt police commando unit Thursday over suspected far-right links to a group of active officers, the latest in a string of extremist-related scandals to blight the country’s police and military.

Peter Beuth, interior minister for the Hesse state where Frankfurt is located, said “unacceptable misconduct” prompted the decision to close the unit. He also said superiors had turned a “blind eye.”

Hesse’s prosecutor on Wednesday said the office was investigating 20 officers from the force with the majority suspected of sending messages in far-right chat groups, including Nazi symbols and “inciting content.” Three supervising officers were accused of failing to stop or report the exchanges. All but one of the 20 was on active duty.

The chat groups were uncovered after investigators examined the phone of an officer suspected of possessing and distributing child pornography.

One officer has been officially suspended, and the others have been “banned from conducting official business,” the public prosecutor said.

The move comes in the wake of revelations over far-right links that have embroiled Germany’s security forces, from other far-right chat groups sharing neo-Nazi content to a group of extremist doomsday preppers who hoarded ammunition ahead of “Day X.”

A court in Hesse is trying Franco Albrecht, a former soldier accused of posing as a Syrian refugee in an attempt to carry out a “false flag” attack. Hesse’s police chief was forced to resign last year after police computers were used to search for personal details of prominent figures before they were sent threatening letters and emails.

A year ago, Germany also partially disbanded its military’s elite commando force because of the extremist links of its officers.

Germany’s Federal Interior Minister Horst Seehofer has pushed back against assertions of structural racism or far-right sympathies in the country’s police forces. But last year, as pressure grew amid a slew of such cases, he agreed to commission a study into the issue.

A similar study by Germany’s domestic intelligence agency said there were 370 suspected cases of right-wing extremism in the country’s police and security forces.

I will point out that far-right extremists have been discovered in US law enforcement (police departments and state police) and in the US military. Indeed, some of those active in the insurrection of January 6 were active in military service and some were active police officers. The problem is not unique to Germany.

Written by Leisureguy

9 June 2021 at 11:09 am

Where Gender-Neutral Pronouns Come From


Michael Waters writes in the Atlantic:

On a frigid January day, Ella Flagg Young—the first woman to serve as superintendent of the Chicago public-school system—took the stage in front of a room of school principals and announced that she had come up with a new solution to an old problem. “I have simply solved a need that has been long impending,” she said. “The English language is in need of a personal pronoun of the third person, singular number, that will indicate both sexes and will thus eliminate our present awkwardness of speech.” Instead of he or she, or his or her, Young proposed that schools adopt a version that blended the two: he’er, his’er, and him’er.

It was 1912, and Young’s idea drew gasps from the principals, according to newspaper reports from the time. When Young used his’er in a sentence, one shouted, “Wh-what was that? We don’t quite understand what that was you said.”

Young was actually borrowing the pronouns from an insurance broker named Fred S. Pond, who had invented them the year prior. But in the subsequent weeks, her proposal became a national news story, earning baffled write-ups in the Chicago Tribune and the Associated Press. Some embraced the new pronouns—but many dismissed them as an unnecessary linguistic complication, and others despaired that the introduction of gender-neutral pronouns would precipitate an end to language as they knew it. An editor for Harper’s Weekly, for instance, insisted that “when ‘man’ ceases to include women we shall cease to need a language.”

Today’s gender-neutral English-language pronouns make space not just for two genders, but for many more, serving as a way for people who fall outside the binary of “man” and “woman” to describe themselves. In recent years especially, they’ve become a staple of dating apps, college campuses, and email signatures. In 2020, a Trevor Project survey found that one in four LGBTQ youth uses pronouns other than he/him and she/her, and the American Dialect Society named the singular they its word of the decade.

Meanwhile, commentators have forecast the demise of language once again. A 2018 Wall Street Journal op-ed went so far as to claim that using they/them pronouns amounted to “sacrilege,” and an Australian politician said that an effort to celebrate they/them pronouns was “political correctness gone mad.” Last month, after the singer Demi Lovato came out as nonbinary, a conservative commentator called they/them pronouns “poor grammar” and an example of “low academic achievement.” Bundled into these arguments is the idea that gender-neutral pronouns are a new phenomenon, an outgrowth of the internet that is only now spreading into other spheres—suggesting that the gender fluidity they describe is also a fad.

Until relatively recently, gender-neutral pronouns were something people used to describe others—mixed groups, or individuals whose gender was unknown—not something people used to describe themselves. But even though people did not, in Young’s time, personally identify as nonbinary in the way we understand it today (though some identified as “neuter”), neutral pronouns existed—as did an understanding that the language we had to describe gender was insufficient. For more than three centuries, at least, English speakers have yearned for more sophisticated ways to talk about gender.

Likely the oldest gender-neutral pronoun in the English language is the singular they, which was, for centuries, a common way to identify a person whose gender was indefinite. For a time in the 1600s, medical texts even referred to individuals who did not accord with binary gender standards as they/them. The pronoun’s fortunes were reversed only in the 18th century, when the notion that the singular they was grammatically incorrect came into vogue among linguists.

In place of they, though, came a raft of new pronouns. According to Dennis Baron, a professor emeritus of English at the University of Illinois at Urbana-Champaign who wrote the definitive history of gender-neutral pronouns in his book What’s Your Pronoun?, English speakers have proposed 200 to 250 pronouns since the 1780s. Although most petered out almost immediately after their introduction, a few took on lives of their own.

Thon—short for that one—has resurfaced frequently since an attorney named Charles Converse first introduced it as a more elegant way of writing he or she. Converse claimed to have coined the word as far back as 1858, but it didn’t actually appear publicly in a magazine until 1884. The word made a splash in . . .

Continue reading.

Written by Leisureguy

5 June 2021 at 2:29 pm

(Trying To) Study Textbooks Effectively: A Year of Experimentation


An interesting post at LessWrong:

When I started studying the art of studying, I wanted to understand the role of book learning. How do we best learn from a textbook, scientific article, or nonfiction book? What can a student of average intelligence do to stay on top of their homework? Is it possible to improve your annual knowledge growth rate by one or two percent by learning how to learn? Should a motivated student take a maximizing or satisficing approach to their coursework? How many of the skills of a top scholar are strategic, collaborative, psychological, or involve merely a set of habits and technological proficiencies?

Fortunately, I started with the most esoteric of approaches, exploring visualization. I tried using a memory palace to memorize a textbook. It was vivid, fun, and creative. Exploring visualization helped me understand chemical diagrams, led me to invent a math problem, and made learning a lot more fun. But I simply couldn’t jam that much detailed technical knowledge into my head. The method didn’t help me pass my final exam, and I dropped it.

Posts from this era include Visual Babble and Prune, Using a memory palace to memorize a textbook, The point of a memory palace, and Visualizing the textbook for fun and profit.

After that, I explored speed reading. I read the theory, experimented both with physical technique and with speed-reading apps, and kind of broke my reading habits, developing a difficult-to-correct tendency to skim. This tendency to read too quickly persisted long after I’d dropped deliberate attempts at speed reading. I finally made some intellectual progress, which preceded correcting the reading habit itself, in The Comprehension Curve.

Then I explored the world of Anki and tried to use flashcards to memorize a textbook instead (or at least a few chapters). After simulating the sheer amount of flashcard review I’d have to do to keep a strategy like that up long-term, I dropped that too. I felt that forming memories of narrow facts (like the structure of RNA polymerase or the name of the 7th enzyme in glycolysis) was the costliest way to learn. And I found the achievements of world-class memory champions irrelevant to real-world learning, which just seems like an entirely different task.
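[The kind of simulation the author mentions is easy to reproduce. Here is a back-of-the-envelope version under my own assumptions (50 new cards a day, review intervals that double after each success, no lapses; lapses only make the load worse): – LG]

```python
# Rough review-load estimate for memorizing a textbook with flashcards.
# Assumptions are mine, not the post's: new_per_day cards added daily,
# each card re-reviewed after 1, 2, 4, 8, ... days, no lapses.

def daily_review_load(new_per_day=50, days=365):
    """Return a list with the number of reviews due on each day."""
    due = [0] * (days + 1)
    for start in range(1, days + 1):     # day this batch of cards is added
        interval, day = 1, start + 1
        while day <= days:
            due[day] += new_per_day
            interval *= 2
            day += interval
    return due

load = daily_review_load()
# Under these assumptions the load reaches about 300 reviews/day by
# day 90 and about 400/day by day 365, on top of all new reading.
print("reviews due on day 90: ", load[90])
print("reviews due on day 365:", load[365])
```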

Posts from this area (not all on flashcards specifically) include The Multi-Tower Study Strategy, Define Your Learning Goal: Competence Or Broad Knowledge, Progressive Highlighting: Picking What To Make Into Flashcards, Goldfish Reading, Curious Inquiry and Rigorous Training, and Using Flashcards for Deliberate Practice.

During this time, I also played around with “just reading,” without a conscious technique. Posts from this era include Check OK, babble-read, optimize (how I read textbooks) and Wild Reading.

Notes are cheap. It takes a lot less time to write down a fact than to memorize it. But I went further. I developed an elaborate and carefully specified system of shorthand notation to represent causal, temporal, and physical structures. It used Newick notation for tree structures; variants on arrow signs to articulate causation, sequence, combination, and more; templates to rewrite the stereotyped information presented by textbooks in a uniform format; and hyperlinks in Obsidian to represent the relationships between concepts.
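[For readers who haven’t met it: Newick notation, borrowed from phylogenetics, writes a tree as nested parentheses, with each node’s label following its children. The toy example and parser below are my own illustration; the post’s actual shorthand and templates aren’t shown. – LG]

```python
# Toy Newick-notation example (my illustration, not the author's system).
# A tree is nested parentheses with each node's label after its children,
# e.g. "(glucose,fructose)sucrose" is a sucrose node with two children.

def parse_newick(s):
    """Parse a simplified Newick string (labels only, no branch lengths)
    into nested (name, children) tuples."""
    pos = 0

    def node():
        nonlocal pos
        children = []
        if pos < len(s) and s[pos] == "(":
            pos += 1                     # consume '('
            children.append(node())
            while s[pos] == ",":
                pos += 1
                children.append(node())
            pos += 1                     # consume ')'
        start = pos                      # read this node's own label
        while pos < len(s) and s[pos] not in "(),;":
            pos += 1
        return (s[start:pos], children)

    return node()

print(parse_newick("((ATP,NADH)energy_carriers,(DNA,RNA)nucleic_acids)biomolecules"))
```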

Not only did I take notes on the textbook, I also took notes on each individual homework problem, and I developed notes for other material as well: I wrote Question Notes for The Precipice, meaning that for each paragraph in the book, I wrote down one question to which that paragraph was a valid answer.

I never published any posts on note-taking. Partly, note-taking itself scratched that itch. But more importantly, it was a very fast iterative cycle. My methods developed day by day, over the course of months. I was experimenting with different software apps, tweaking the templates I used, figuring out how to expand my particular method of shorthand to represent complex structures. After all the shifts I’d made in my previous experiments, I thought I would spare LessWrong the tedious minutiae of my developing thoughts on note-taking. I’m confident that crafting the perfect notes in an elaborate and precise shorthand system is not a panacea, so I don’t know if it’s worth bothering.

Exploring note-taking was as useful as visualizing was fun. The rigid structure of my note-taking approach gave me clear guidance on what it means to “read” or “study” a textbook chapter. The notes became a useful reference for looking things up. The idea of bringing together any data, formulas, charts, or techniques I needed to solve a problem, and then making a plan of attack before setting to work, was a big upgrade for my accuracy and sense of ease.

Yet when my note-taking . . .

Continue reading.

Written by Leisureguy

4 June 2021 at 12:26 pm

Interesting finding: Voting is now driven by education, not class


Read this post by Kevin Drum. Right-click image to open in a new tab, then click it to enlarge.

Written by Leisureguy

30 May 2021 at 3:54 pm

Posted in Education, Election

How to Ask Useful Questions


Josh Kaufman has a useful post:

Asking useful questions is a skill, and it requires practice.

Inexperienced or naive questions sound like this:

“Hello! [Insert life story.] What should I do?”

Or this:

“I’m thinking about [action]. What do you think?”

Questions like these make a few critical mistakes:

  • They don’t include the context necessary for the recipient to answer the question.
  • They don’t respect the recipient’s time, energy, attention, or competing demands.
  • They implicitly transfer responsibility for the End Result from the questioner to the recipient.

As a result, questions like these go unanswered due to Friction – answering them would take too much effort, so the recipient doesn’t bother.

If you want useful answers, learn to ask better questions. In most cases, you’ll need to tailor the form of the question to the type of information you’re seeking.

Asking for Information

“I’m interested in more information about A, and I found you via B. Are you the best person to ask about this?”

Keys to information-seeking questions:

  • Be specific about the information you’re looking to obtain.
  • Give context by referencing why you’re contacting them and how you found their contact information.
  • Make it easy for the recipient to refer you to the best resource as quickly as possible, which will save you both time.

Asking for Clarification

“Based on our conversation about A, it sounds like B is the case. Is that correct?”

Keys to clarification questions:

  • Include a short summary of the topic for context.
  • “It sounds like…” leaves room for clarification without being confrontational.
  • “Is that correct?” (or a close variant) is clear, concise, direct, and polite.

Asking for Help

“I’m trying to A, and I’m having trouble. So far, I’ve tried B with result C, and D with result E. Now I’m stuck. Any guidance?”

Keys for asking for assistance:

  • Be clear and precise about what you’re trying to do.
  • Give context by including what you’ve tried so far, which makes it clear that you’re doing your own work and not asking the recipient to solve your problems for you.
  • “Any guidance?” or “What should I try next?” sets up the recipient as the expert and doesn’t transfer responsibility for the problem.

Asking for Agreement . . .

Continue reading.

Written by Leisureguy

30 May 2021 at 11:07 am

‘Centrism’: an insidious bias favoring an unjust status quo


Rebecca Solnit writes in The Guardian:

The idea that all bias is some deviation from an unbiased center is itself a bias that prevents pundits, journalists, politicians and plenty of others from recognizing some of the most ugly and impactful prejudices and assumptions of our times. I think of this bias, which insists the center is not biased, not afflicted with agendas, prejudices and destructive misperceptions, as status-quo bias. Underlying it is the belief that things are pretty OK now, that the people in charge should be trusted because power confers legitimacy, that those who want sweeping change are too loud or demanding or unreasonable, and that we should just all get along without looking at the skeletons in the closet and the stuff swept under the rug. It’s mostly a prejudice of people for whom the system is working, against those for whom it’s not.

I saw a tweet the other day that said the Secret Service and US Capitol police must have been incompetent or complicit to be blindsided by the 6 January insurrection. The writer didn’t seem to grasp the third option: that the Secret Service was unable to see past the assumptions that middle-aged conservative white men don’t pose a threat to democracy and the rule of law, that elected officials in powerful places weren’t whipping up a riot or worse, that danger meant outsiders and others. A decade ago, when I went to northern Japan for the first anniversary of the Great Tohoku Earthquake and tsunami, I was told that the 100ft-high wave of black water was so inconceivable a sight that some people could not recognize it and the danger it posed. Others assumed this tsunami would be no bigger than those in recent memory and did not flee high enough. A lot of people died of not being able to see the unanticipated.

People fail to recognize things that do not fit into their worldview, which is why those in power have not adequately responded to decades of terrorism by white men – anti-reproductive-rights-driven killings, racial violence in churches, mosques, synagogues and elsewhere, homophobia and transphobia, the pandemic-scale misogynist violence behind a lot of mass shootings, attacks on environmentalists, and white supremacy in the ranks of the police and the military. Finally, this year the US attorney general, Merrick Garland, called this terrorism by its true name and identified it as “the most dangerous threat to our democracy”. The constant assumption has been that crime and trouble comes from outsiders, from “them”, not “us”, which is why last summer’s Black Lives Matter protests were constantly portrayed by conservatives and sometimes the mainstream as far more violent and destructive than they were and the right has had such an easy time demonizing immigrants.

What violence and destruction did take place in or adjacent to Black Lives Matter protests was often the work of the right wing. That includes the murder of a guard at a federal court in Oakland, allegedly by an air force sergeant and Boogaloo Boy, while a BLM protest was going on nearby. It also reportedly includes some of the arson in Minneapolis shortly after George Floyd’s murder, as well as attacks on protesters. USA Today reported 104 such attacks by cars driven into crowds, many of them apparently politically motivated.

No one has ever loved the status quo more than the editorial board at the New York Times, which recently composed an editorial declaring it a misstep for “the city’s Pride organizers … to reduce the presence of law enforcement at the celebration, including a ban on uniformed police and corrections officers marching as groups until at least 2025”. They found a lesbian of color who is also a cop and focused on this individual feeling “devastated”, rather than the logic behind the decision. Pride celebrates the uprising against longtime police violence and criminalization of queerness at the Stonewall Bar in 1969.

Police officers are in no way banned from participating out of uniform, if they so desire, but that’s not enough for these “can’t we all get along” editorialists, who also wrote: “But barring LGBTQ officers from marching is a politicized response and is hardly worthy of the important pursuit of justice for those persecuted by the police.” You want to shout that the whole parade is political, because persecution and inequality have made being LGBTQ political, and the decision to include the police would be no less political than to exclude them. And who decides what’s worthy? The idea that there is some magically apolitical state all should aspire to is key to this bias and to why it refuses to recognize itself as a bias. It believes it speaks from neutral ground, which is why it forever describes a landscape of mountains and chasms as a level playing field.

The status-quo bias is something I’ve encountered over and over again as gender violence, particularly as the refusal or inability to recognize that a high-status man or boy, be he film mogul or high-school football player, can also be a vicious criminal. Those who cannot believe the charges, no matter how credible, often dismiss and blame the victim instead (or worse: reporting a rape too often leads to death threats and other forms of harassment and intimidation intended to make an uncomfortable truth go away). Society has a marked failure of imagination when it comes to grasping that such predators treat their low-status victims in secret differently than their high-status peers in public, and that failure of imagination denies the existence of such inequality even as it perpetuates it.

It’s a failure born out of undue respect for the powerful. (Here I think of all the idiots who kept discovering “the moment Trump became presidential” over and over again, unable to comprehend that his incompetence was as indelible as his corruption and malice, perhaps because their respect for the institution inexorably extended to the grifter who barged into it.) Centrist bias is institutional bias, and all our institutions historically perpetrated inequality. To recognize this is to delegitimize them; to deny it is to have it both ways – think yourself on the side of goodness while insisting no sweeping change is overdue. A far-right person might celebrate and perpetrate racism or police brutality or rape culture; a moderate might just play down its impact, past or present.

To recognize the pervasiveness of sexual abuse is to have to listen to children as well as adults, women as well as men, subordinates as well as bosses: it’s to upend the old hierarchies of who should be heard and trusted, to break the silences that protect the legitimacy of the status quo. More than 95,000 people filed claims in the sexual-abuse lawsuit against the Boy Scouts of America, and what it took to keep all those children quiet while all those hundreds of thousands of assaults took place was a lot of unwillingness to listen and to shatter faith in an institution that was itself so much a part of the status quo (and in many ways an indoctrination system for it).

Centrists in the antebellum era were apathetic about or outright resistant to ending slavery in the US, and then, in the decades before 1920, to giving women the vote. . .

Continue reading. There’s much more.

Written by Leisureguy

29 May 2021 at 9:25 am

The Oldest Grandson is graduating from college today.

with 3 comments

A side effect of the pandemic is that much experience has been gained in live-streaming ceremonial events, so I am in (remote) attendance as he receives his degree from Johns Hopkins University.

Postscript: The ceremony was terrific, and not just the technology. It moved along at a good pace, interesting photos and videos were intermixed with speeches, and Michael Bloomberg gave an excellent commencement address. He emphasized the importance of personal relationships sustained and nourished by in-person (not merely online) encounters. I agree with him. He pointed out that being together in person promotes creativity and productivity, as well as one’s own emotional well-being and happiness.

Written by Leisureguy

27 May 2021 at 5:08 pm

Posted in Daily life, Education

How to be a genius

leave a comment »

Craig Wright, professor emeritus of music at Yale University, a member of the American Academy of Arts and Sciences, and the author of The Hidden Habits of Genius: Beyond Talent, IQ, and Grit – Unlocking the Secrets of Greatness (2020), has an interesting article in Aeon that begins:

Don’t get me wrong – yes, I’m a professor at Yale University, but I’m no genius. When I first mentioned to our four grown children that I was going to teach a new course on genius, they thought that was the funniest thing they’d ever heard. ‘You, you’re no genius! You’re a plodder.’ And they were right. So how did it come to pass that now, some dozen years later, I continue to teach a successful course on genius at Yale, and have written an Amazon Book of the Year selection, The Hidden Habits of Genius (2020)? The answer: I must have, as Nikola Tesla urged, ‘the boldness of ignorance’.

I started my professional life trying to be a concert pianist, back in the days of the Cold War. The United States was then trying to beat the Soviet Union at its own games. In 1958, Van Cliburn, a 23-year-old pianist from Texas, won the inaugural International Tchaikovsky Competition, something akin to the Olympics of classical music. And then in 1972, Brooklyn’s Bobby Fischer defeated Boris Spassky in chess. Because I had shown an interest in music, and was also tall with enormous hands, I, too, would become the next Cliburn, at least so my mother declared.

Although our family wasn’t wealthy, my parents managed to provide me with a Baldwin grand piano and find the best teachers in our hometown of Washington, DC. Soon, I was packed off to the prestigious Eastman School of Music, where, once again, every opportunity was placed before me. And I had a strong work ethic: by the age of 21, I had engaged, by my estimation, in 15,000 hours of focused practice. (Mozart had needed only 6,000 to get to the level of master-composer and performer.) Yet, within two years, I could see that I would never earn a dime as a concert pianist. I had everything going for me except one thing: I lacked musical talent. No special memory for music, no exceptional hand-eye coordination, no absolute pitch – all things very necessary to a professional performer.

‘If you can’t compose, you perform; and if you can’t perform, you teach’ – that’s the mantra of conservatoires such as the Eastman School of Music. But who wants to spend each day in the same studio teaching other likely soon-to-fail pianists? My intuition was to find a larger arena in a university. So off I went to Harvard to learn to become a college professor and a researcher of music history – a musicologist, as it’s called. Eventually, I found employment at Yale as a classroom instructor teaching the ‘three Bs’: Bach, Beethoven and Brahms. Yet the most captivating composer I ran into there was an M: Mozart. My interest in him accelerated with the appearance of the Academy Award-winning film Amadeus (1984). For a time, the entire world seemed obsessed with this funny, passionate and naughty character.

Thus it was a movie, of all things, that caused me to shift the focus of my academic research to Mozart. Yet the cardinal principle of scholarship I’d been taught at Harvard remained the same: if you seek the truth, consult the original primary sources; the rest is simply hearsay. Thus, over the course of 20 years, I went in search of Mozart in libraries in Berlin, Salzburg, Vienna, Krakow, Paris, New York and Washington, studying his autograph (or handwritten) music manuscripts. I found that Mozart could effortlessly conceive of great swaths of music entirely in his head, with almost no corrections. What Salieri said of Mozart in Amadeus no longer seems so fanciful: here ‘was the very voice of God’.

To hold in your hands the divine pages of a Mozart autograph – even if wearing the oft-required white gloves – is at the same time an honour and an exhilaration. The fluctuating angles of his pen, changing size of his note heads and varying tints of ink provide insight into how his mind is working. As if invited into Mozart’s study, you watch as this genius, empowered by his huge natural gifts, enters a creative zone, and the music just pours forth.

What other genius, I wondered, worked like Mozart? Here again, it was the autograph manuscripts that drew me in. Who among us has not been attracted to the fascinating designs of Leonardo da Vinci – his sketches of ingenious machines and instruments of war, as well as pacifist paintings? Unlike the original manuscripts of Mozart, the drawings and notes of Leonardo (some 6,000 pages survive) have mostly been published in facsimile editions, and many are now available online.

If Mozart could hear in his head how the music ought to go, Leonardo, judging from his sketches, could simply see in his mind’s eye how the machine should work or the painting should look. Here, too, Leonardo’s natural technical facility is manifest, as seen in the hand-eye coordination that results in correct proportions and the cross-hatching lines that suggest three-dimensional perception. Likewise evident is Leonardo’s relentless curiosity. We watch his mind range across an endless horizon of interconnected interests; on one page, for example, a heart becomes the branches of a tree, which then become the tentacles of a mechanical pulley. How do all these seemingly disparate things of the world hang together? Leonardo wanted to know. With good reason, the cultural historian Kenneth Clark called him ‘the most relentlessly curious man in history’.

Mozart in music, Leonardo in art; what about the everyday world of politics? Here the perfect subject of a study of genius was close at hand: Elizabeth I, queen of England. The Beinecke Rare Book and Manuscript Library at Yale owns copies of every history of her reign written by her contemporaries. The secret to her success? Elizabeth read not only books voraciously (three hours a day was her wont) but also people. She read, she studied, she observed, and she kept her mouth shut (Video et taceo was her motto). By knowing all and saying little, Elizabeth ruled for nearly 45 years, laid the foundations of the British empire and fledgling capitalist corporations, and gave her name to an entire epoch, the Elizabethan era.

Fascinating! I was learning so much. Why not have students learn along with me – after all, that’s why we have these young people cluttering up the place! And that’s how my genius course – or ‘Exploring the Nature of Genius’ – came to be.

Perhaps it takes a non-genius to analyse how exceptional human accomplishment happens. During my years at Harvard and at Yale, I met a lot of smart people, including a half-dozen Nobel Prize winners. If you’re a prodigy with a great gift for something, you can simply do it – yet might not be aware of why and how. And you don’t ask questions. Indeed, the geniuses I met seemed too preoccupied with committing acts of genius to consider the cause of their creative output. Maybe an outsider looking in has a clearer overview of how the magic gets done.

Year after year, increasing numbers of Yale students enrolled in my course to find the answer but, from the very first, something unexpected happened, and I should have seen it coming: the appreciation of genius turns out to be gender-biased.

Although the ratio of Yale undergraduates is now 50/50 male-female, and although the genius course is a general humanities class open to all, annually the enrolment in that class skews about 60/40 male-female. Students at Yale and other liberal arts colleges vote with their feet and, despite favourable course evaluations, women at Yale don’t seem to be as interested in exploring the nature of genius as their male counterparts are.

Why, I wondered. Are women less excited by competitive comparisons that rank some people as ‘more exceptional’ than others? Are they less likely to value the traditional markers of genius in a winner-take-all world – things such as the world’s greatest painting or most revolutionary invention? Does the absence of female mentors and role models play a part? Why take a course in which the readings, once again, will be mostly about the triumphant accomplishments of ‘great [mostly white] men’? Was the very way I’d framed this course perpetuating, once again, an unconscious bias against women and the assumption of a white cultural supremacy?

Happily, I ultimately ‘capped’ the course at 120 students and, thus, could do a bit of social engineering. I was at liberty to admit whom I wished and thereby assure a representative proportion of women and minority students. The aim was not to fill quotas, but to increase diversity of opinion and inspire robust argumentation, things especially useful in a course in which there’s no answer.

‘There is no answer! There is no answer! There is no answer!’ chanted 120 eager undergraduates in the first session of the ‘genius course’, as I urged them on. Students typically want an answer to put into their pocket as they leave class, one they can later deploy on a test – but I felt that it was important to make this point immediately. To the simple question ‘What is genius?’ there’s no answer, only opinions. As to what drives it – nature or nurture – again, no one knows.

The question ‘Nature or nurture?’ always provoked debate. The quant types (mathematics and science majors) thought genius was due to natural gifts; parents and teachers had told them that they’d been born with a special talent for quantitative reasoning. The jocks (varsity athletes) thought exceptional accomplishment was all hard work: no pain, no gain. Coaches had taught them that their achievement was the result of endless hours of practice. Among novice political scientists, conservatives thought genius a God-given gift; liberals thought it was caused by a supportive environment. No answer? Call in the experts: readings from Plato, William Shakespeare and Charles Darwin to Simone de Beauvoir followed, but each had his or her own take.

The students hoped for something more concrete. Some wanted to know if they were already geniuses and what their futures might hold. Most wanted to know how they, too, might become a genius. They had heard that I’d studied geniuses from Louisa May Alcott to Émile Zola, and thought that I might have found the key to genius. So I asked: ‘How many of you think you already are or have the capacity to be a genius?’ Some timidly raised their hands; the class clowns did so emphatically. Next: ‘If you’re not one already, how many of you want to be a genius?’ In some years, as many as three-quarters of the students raised their hands. Then I asked: ‘OK, but what exactly is a genius?’ Excitement turned to puzzlement, which was followed by a two-week quest to formulate a definition of genius, one that usually ended with the following sort of hypothesis:

A genius is a person of extraordinary mental powers whose original works or insights change society in some significant way for good or for ill across cultures and across time.

Only gradually, and not until I’d written my book The Hidden Habits of Genius, did I come to see that this complex verbiage might be simplified into something akin to a ‘genius equation’.

Here was a formula that students and the populace at large could immediately grasp: . . .

Continue reading. There’s more.

Written by Leisureguy

19 May 2021 at 4:25 pm

The attack on American foundational principles

leave a comment »

Heather Cox Richardson writes:

I wanted to note that on this day in 1954, the Supreme Court handed down the Brown v. Board of Education of Topeka, Kansas, decision, declaring racial segregation in public schools unconstitutional. A unanimous court decided that segregation denied Black children the equal protection of the laws guaranteed by the Fourteenth Amendment, which was ratified in 1868 in the wake of the Civil War. Brown v. Board was a turning point in establishing the principle of racial equality in modern America.

Since the 1860s, we have recognized that equality depends upon ensuring that all Americans have a right to protect their own interests by having a say in their government.

Today, that principle is under attack.

In 1965, President Lyndon B. Johnson urged Congress to pass the Voting Rights Act to “help rid the Nation of racial discrimination in every aspect of the electoral process and thereby insure the right of all to vote.” And yet, in 2013, the Supreme Court gutted that law, and in the wake of the 2020 election in which voters gave Democrats control of the government, Republican-dominated states across the country are passing voter suppression laws.

Today, Senators Joe Manchin (D-WV) and Lisa Murkowski (R-AK) begged their colleagues to reinstate the Voting Rights Act. In 2006 a routine reauthorization of the law got through the Senate with a vote of 98-0; now it is not clear it can get even the ten Republican votes it will need to get through the Senate, so long as the filibuster remains intact.

But here’s the thing: Once you give up the principle of equality before the law, you have given up the whole game. You have admitted the principle that people are unequal, and that some people are better than others. Once you have replaced the principle of equality with the idea that humans are unequal, you have granted your approval to the idea of rulers and servants. At that point, all you can do is to hope that no one in power decides that you belong in one of the lesser groups.

In 1858, Abraham Lincoln, then a candidate for the Senate, warned that arguments limiting American equality to white men and excluding black Americans were the same arguments “that kings have made for enslaving the people in all ages of the world…. Turn in whatever way you will—whether it come from the mouth of a King, an excuse for enslaving the people of his country, or from the mouth of men of one race as a reason for enslaving the men of another race, it is all the same old serpent.” Either people—men, in his day—were equal, or they were not.

Lincoln went on, “I should like to know if . . .

Continue reading.

Written by Leisureguy

18 May 2021 at 2:28 pm

How to be excellent

leave a comment »

Benjamin Studebaker, a graduate teaching assistant in politics and international studies at the University of Cambridge and a teaching associate at Gonville and Caius College, Cambridge, writes in Psyche:

So you want to be excellent at something? You don’t just want to be OK at it, to be able to get by or make a living. It’s not even enough to be rich and famous. Nickelback is a big Canadian band and they’ve made a ton of money, but most people don’t think their music is excellent. They are undeniably successful – but excellent? Excellence is a whole different thing.

Most of the advice out there is either about how to survive, or how to be successful. It’s also pretty two-dimensional. On one side, there are the people who tell you to work hard and be productive. Then there’s the other side, the people who tell you to ‘practise self-care’ to avoid burnout. Many self-help writers have made a lot of money from taking one of these sides and trashing the other.

Those writers are successful, but the advice they’re giving people isn’t excellent. It’s obvious that if we spend all our time just trying to get through the day, we won’t grow. But it’s also obvious that if we become obsessed with perfect ideals, we’ll burn out. You need a sustainable balance, a workable distribution of your time and energy. But distributing your time effectively is just the first step. The second step is to use your time in a way that leads to excellence rather than mere success.

Plato and Aristotle can help you with this. The Greek philosophers were wealthy aristocrats who didn’t have regular jobs. Because they had plenty of time and plenty of money, they could spend their whole lives thinking about what excellence really means. They didn’t have to worry about survival, because they were born with an income. They weren’t interested in success because, when you’re born rich, it’s not hard to be successful. They wanted to pursue the highest good, and they wanted that pursuit to be the object of everything they did. Even though you’re likely not a wealthy Greek aristocrat, you still have much to learn from them about excellence.

The first thing they noticed about being human is that even rich people are not gods. Everyone has a body, and our bodies have needs. Plato tells a story about this in one of his dialogues, the Phaedrus. He imagines the human being as a flying chariot, pulled by winged horses. The chariot has three parts. There is the rider, interested in truth, goodness and beauty. He wants to fly the chariot high into the sky, above the clouds, where these ideals can be discovered. But the rider has no wings. To get to the heavens, he relies on two horses – one light, and one dark. The light horse wants to be well regarded, prizing honour and status above all things. It responds to blame and praise. The dark horse wants to enjoy the pleasures of the world. It wants food, sex, sleep and every kind of luxury. The dark horse has no shame, but it fears the rider’s whip. For just as the dark horse values pleasure, it fears pain.

The rider can come to know excellence only if he can get these horses to fly the chariot up above the clouds, but the horses have no deep interest in what’s up there. The rider must motivate them by giving the horses enough of what they want to get them to cooperate, but not so much as to allow them to become too strong and drag the chariot wherever they wish. Ignore the horses outright, and they grow weak and disobedient. Cater to the horses too much, and they run the show. To achieve a type of excellence that gets at genuine value, we have to go beyond pleasure and status, but we can’t leave pleasure and status behind entirely. This type of excellence incorporates our physical and social needs, but goes beyond them, approaching value itself as an abstract ideal. To get there, a balance is needed, but what does that balance look like?

Think it through

Find a good social environment

Bringing balance to the chariot is a big challenge for a person. But it’s not a challenge we face alone. For Plato, the community we live in helps us take care of our horses. We don’t all grow our own food, make our own shelter, and provide our own entertainment. Other people help us meet the needs of the dark horse. And how can the light horse be satisfied without other people to make us feel valued and worthy? Plato argues that some social roles help us fly the chariot better than others. He even tries to make a list and put them all in order. Some roles barely give us enough to survive, much less thrive. Others give us comfort but aren’t respected. Some are respected but give us little comfort. A few yield comforts and respect but leave us without enough time to properly strive for excellence. When you’re choosing your work, your friends and your relationships, you have to keep all three things in mind. Miss comfort, and you’ll find yourself controlled by the need to be comfortable. Miss respect, and you’ll be controlled by the need to be respected. If you don’t leave time to strive, all you’ll do is survive.

Distribute your time well

How do we manage to obtain all three things in just one life? In the Politics, Aristotle distinguishes between ‘leisure’ and ‘play’. For him, leisure is time we spend learning and contemplating, trying to achieve excellence. Play is about rest and recovery. It might help you to think of Aristotle’s leisure as ‘growth’ and Aristotle’s play as ‘recovery’. So, for Aristotle, we spend our days doing three things – work, growth and recovery. The difficult thing is that both work and growth cost time and energy. Growing is at least as energy-intensive as working. We need time to recover from both activities.

When the eight-hour workday was first achieved, there was a slogan that went along with it: . . .

Continue reading.

Written by Leisureguy

5 May 2021 at 4:40 pm

Vaccine skepticism stems not from ignorance but from beliefs and values

leave a comment »

Sabrina Tavernise reports in the NY Times:

For years, scientists and doctors have treated vaccine skepticism as a knowledge problem. If patients were hesitant to get vaccinated, the thinking went, they simply needed more information.

But as public health officials now work to convince Americans to get Covid-19 vaccines as quickly as possible, new social science research suggests that a set of deeply held beliefs is at the heart of many people’s resistance, complicating efforts to bring the coronavirus pandemic under control.

“The instinct from the medical community was, ‘If only we could educate them,’” said Dr. Saad Omer, director of the Yale Institute for Global Health, who studies vaccine skepticism. “It was patronizing and, as it turns out, not true.”

About a third of American adults are still resisting vaccines. Polling shows that Republicans make up a substantial part of that group. Given how deeply the country is divided by politics, it is perhaps not surprising that they have dug in, particularly with a Democrat in the White House. But political polarization is only part of the story.

In recent years, epidemiologists have teamed up with social psychologists to look more deeply into the “why” behind vaccine hesitancy. They wanted to find out whether there was anything that vaccine skeptics had in common, in order to better understand how to persuade them.

They borrowed a concept from social psychology — the idea that a small set of moral intuitions forms the foundations upon which complex moral worldviews are constructed — and applied it to their study of vaccine skepticism.

What they discovered was a clear set of psychological traits offering a new lens through which to understand skepticism — and potentially new tools for public health officials scrambling to try to persuade people to get vaccinated.

Dr. Omer and a team of scientists found that skeptics were much more likely than nonskeptics to have a highly developed sensitivity for liberty — the rights of individuals — and to have less deference to those in positions of power.

Skeptics were also twice as likely to care a lot about the “purity” of their bodies and their minds. They disapprove of things they consider disgusting, and the mind-set defies neat categorization: It could be religious — halal or kosher — or entirely secular, like people who care deeply about toxins in foods or in the environment.

Scientists have found similar patterns among skeptics in Australia and Israel, and in a broad sample of vaccine-hesitant people in 24 countries in 2018.

“At the root are these moral intuitions — these gut feelings — and they are very strong,” said Jeff Huntsinger, a social psychologist at Loyola University Chicago who studies emotion and decision-making and collaborated with Dr. Omer’s team. “It’s very hard to override them with facts and information. You can’t reason with them in that way.”

These qualities tend to predominate among conservatives, but they are present among liberals too. They are also present among people with no politics at all.

Kasheem Delesbore, a warehouse worker in northeastern Pennsylvania, is neither conservative nor liberal. He does not consider himself political and has never voted. But he is skeptical of the vaccines — along with many institutions of American power.

Mr. Delesbore, 26, has seen information online that a vaccine might harm his body. He is not sure what to make of it. But his faith in God gives him confidence: Whatever happens is God’s will. There is little he can do to influence it. . .

Continue reading.

Written by Leisureguy

29 April 2021 at 12:04 pm

Defund the police? Instead, end toxic masculinity and ‘warrior cops’

leave a comment »

Angela Workman-Stark, Associate Professor, Organizational Behaviour, Athabasca University, writes in The Conversation:

The police officer charged with murder in the death of George Floyd has been convicted in Minneapolis amid continued calls for defunding or abolishing police forces — not just in the United States, but in Canada and other places that have also grappled with police brutality.

The problem with these proposals is that they’re presented as solutions to police abuse without an appreciation that some element of coercive authority will still be required in society. Consequently, these efforts are unlikely to be successful.

Many of the calls for drastic change highlight the failure of police reform efforts. While many attempts at change have met with limited success, I suggest the reason for these outcomes is not that change is impossible; it has more to do with an unwillingness to confront systemic issues within police forces.

For example, the former commissioner of the RCMP indicated that workplace misconduct and other forms of abusive behaviour were simply the actions of a few “rotten apples.”

As a former chief superintendent with the RCMP, where I held leadership roles implementing cultural change within the organization, I believe this statement ignores the potential potency of the police socialization process and what happens when new recruits come in the door.

An emphasis on danger and risk

From the early days of training, police recruits are socialized by war stories that glamorize the dangerous aspects of police work and place an exaggerated focus on the mission of police to deal with danger as the supposed gatekeepers of society.

Ultimately, these narratives shape expectations of what it means to be a “real” police officer. For some individuals, becoming a real police officer means doing the dirty work that no one else wants to do, including whatever it takes to put “bad guys” in jail.

But rather than promoting an image of police working with communities to solve problems, this emphasis on physicality and fighting crime has helped craft the image of the “warrior cop” who is ready to do battle and is isolated from the public.

The continued preoccupation with danger and crime control means that aggression, competitiveness and physical action are often associated with the image of the ideal police officer.

To determine who fits in and who doesn’t, clear distinctions are frequently made between the tasks of “real policing” and those discounted as feminine, such as the prevention aspects of the job.

Building on prior studies, my research shows that the pressure to conform and fit in can be so intense that officers engage in masculinity contests (the competitive pursuit of workplace status that is defined by traditionally “masculine” rules) by adopting these supposedly desirable forms of masculinity and avoiding any actions that might be deemed weak or unmanly.

Toxic masculinity

As noted in a report on sexual harassment within the RCMP, when masculinity contest norms are endorsed by police organizations, they can have grave consequences for the women (and even men) who are viewed as a weak fit.

In addition to officers hiding poor health or taking excessive risks, I also illustrate in my research how . . .

Continue reading. There’s much more.

And to illustrate how police training currently instills destructive ideas and dangerous values, watch this clip from a police training session.

Written by Leisureguy

26 April 2021 at 11:44 am

Maslow’s Hierarchy of Needs: He botched it.

leave a comment »

Teju Ravilochan wrote a piece for GatherFor on Medium. The thrust reflects the priorities and purposes of the group, which is based in New York City and works to develop and network small community groups to build community belonging and resilience. The piece begins:

Some months ago, I was catching up with my dear friend and board member, Roberto Rivera. As an entrepreneur and community organizer with a doctorate and Lin-Manuel-Miranda-level freestyle abilities, he is a teacher to me in many ways. I was sharing with him that for a long time, I’ve struggled with Maslow’s Hierarchy of Needs.

The traditional interpretation of Maslow’s Hierarchy of Needs is that humans must fulfill their needs at one level before they can advance to higher levels.

Maslow’s idea emerged from his work with the Blackfeet Nation, informed by conversations with elders and by the shape and meaning of the Blackfoot tipi. His idea has been criticized for misrepresenting the Blackfoot worldview, which instead places self-actualization as a basis for community-actualization, and community-actualization as a basis for cultural perpetuity, the latter of which exists at the top of the tipi in Blackfoot philosophy.

The Blackfoot Tipi

This is a slide from a presentation by Cindy Blackstock, a member of the Gitksan First Nation and a professor at the University of Alberta, shared on Karen Lincoln Michel’s blog. She describes Maslow’s theory as “a rip off of the Blackfoot nation.”

Maslow’s Failure to Elevate the Blackfoot Model

Continue reading. There’s much more.

It does strike me that the prevailing view in the US today of individuality above all – one’s own desires and needs paramount, with community needs much less in the picture – has resulted in some bad outcomes for all.

Written by Leisureguy

26 April 2021 at 11:01 am

The Secret Life of Components: A YouTube series

leave a comment »

I stumbled across one of the videos in the Components series (it was Springs) and found it informative and pleasant to watch, with lots of knowledge and experience conveyed. Go to Tim Hunkin’s page for links and an explanation.

This is one not to miss.

Written by Leisureguy

25 April 2021 at 12:58 pm
