Later On

A blog written for those whose interests more or less match mine.

Archive for the ‘Psychology’ Category

A randomized controlled trial investigates diet and psychological well-being.

Written by Leisureguy

4 August 2021 at 12:06 pm

The Abilene Paradox: The Management of Agreement

The Abilene Paradox is well-known in management, and it’s an interesting social phenomenon in any case, particularly if you have an interest in small-group sociology. The article that introduced the idea is available as a PDF, which also includes spinoffs and exercises. The article itself begins:

The July afternoon in Coleman, Texas (population 5,607) was particularly hot—104 degrees as measured by the Walgreen’s Rexall Ex-Lax temperature gauge. In addition, the wind was blowing fine-grained West Texas topsoil through the house. But the afternoon was still tolerable — even potentially enjoyable. There was a fan going on the back porch; there was cold lemonade; and finally, there was entertainment. Dominoes. Perfect for the conditions. The game required little more physical exertion than an occasional mumbled comment, “Shuffle ’em,” and an unhurried movement of the arm to place the spots in the appropriate perspective on the table. All in all, it had the makings of an agreeable Sunday afternoon in Coleman — that is, it was until my father-in-law suddenly said, “Let’s get in the car and go to Abilene and have dinner at the cafeteria.”

I thought, “What, go to Abilene? Fifty-three miles? In this dust storm and heat? And in an unairconditioned 1958 Buick?”

But my wife chimed in with, “Sounds like a great idea. I’d like to go. How about you, Jerry?” Since my own preferences were obviously out of step with the rest I replied, “Sounds good to me,” and added, “I just hope your mother wants to go.”

“Of course I want to go,” said my mother-in-law. “I haven’t been to Abilene in a long time.”

So into the car and off to Abilene we went. My predictions were fulfilled. The heat was brutal. We were coated with a fine layer of dust that was cemented with perspiration by the time we arrived. The food at the cafeteria provided first-rate testimonial material for antacid commercials.

Some four hours and 106 miles later we returned to Coleman, hot and exhausted. We sat in front of the fan for a long time in silence. Then, both to be sociable and to break the silence, I said, “It was a great trip, wasn’t it?”

No one spoke. Finally my mother-in-law said, with some irritation, “Well, to tell the truth, I really didn’t enjoy it much and would rather have stayed here. I just went along because the three of you were so enthusiastic about going. I wouldn’t have gone if you all hadn’t pressured me into it.”

I couldn’t believe it. “What do you mean ‘you all’?” I said. “Don’t put me in the ‘you all’ group. I was delighted to be doing what we were doing. I didn’t want to go. I only went to satisfy the rest of you. You’re the culprits.”

My wife looked shocked. “Don’t call me a culprit. You and Daddy and Mama were the ones who wanted to go. I just went along to be sociable and to keep you happy. I would have had to be crazy to want to go out in heat like that.”

Her father entered the conversation abruptly. “Hell!” he said.

He proceeded to expand on what was already absolutely clear. “Listen, I never wanted to go to Abilene. I just thought you might be bored. You visit so seldom I wanted to be sure you enjoyed it. I would have preferred to play another game of dominoes and eat the leftovers in the icebox.”

After the outburst of recrimination we all sat back in silence. Here we were, four reasonably sensible people who, of our own volition, had just taken a 106-mile trip across a godforsaken desert in a furnace-like temperature through a cloud-like dust storm to eat unpalatable food at a hole-in-the-wall cafeteria in Abilene, when none of us had really wanted to go. In fact, to be more accurate, we’d done just the opposite of what we wanted to do. The whole situation simply didn’t make sense.

At least it didn’t make sense at the time. But since that day in Coleman, I have observed, consulted with, and been a part of more than one organization that has been caught in the same situation. As a result, they have either taken a side-trip, or, occasionally, a terminal journey to Abilene, when Dallas or Houston or Tokyo was where they really wanted to go. And for most of those organizations, the negative consequences of such trips, measured in terms of both human misery and economic loss, have been much greater than for our little Abilene group.

This article is concerned with that paradox – the Abilene Paradox. Stated simply, it is as follows: Organizations frequently take actions in contradiction to what they really want to do and therefore defeat the very purposes they are trying to achieve. It also deals with a major corollary of the paradox, which is that the inability to manage agreement is a major source of organization dysfunction. Last, the article is designed to help members of organizations cope more effectively with the paradox’s pernicious influence.

As a means of accomplishing the above, I shall: (1) describe the symptoms exhibited by organizations caught in the paradox; (2) describe, in summarized case-study examples, how they occur in a variety of organizations; (3) discuss the underlying causal dynamics; (4) indicate some of the implications of accepting this model for describing organizational behavior; (5) make recommendations for coping with the paradox; and, in conclusion, (6) relate the paradox to a broader existential issue.

SYMPTOMS OF THE PARADOX . . .

Continue reading. There’s much more.

And see also this brief explanation, which points out:

Although the Abilene Paradox is similar to groupthink, it differs in a major way. Groupthink involves people coming to a consensus through discussion despite the fact that the premise is greatly flawed. Each person in the group is influenced and becomes convinced that the proposed idea is a good one. In the Abilene Paradox, no one in the group believes the idea is a good one, yet they all go along with it.

Written by Leisureguy

3 August 2021 at 12:45 pm

What Philadelphia Reveals About America’s Homicide Surge

In ProPublica, Alec MacGillis takes a close look at the homicide surge in Philadelphia to see what drove the homicide rate down in earlier years and what caused the recent surge:

Nakisha Billa’s son was still a baby when she decided to make their first flight to safety. It was early in 2000 and she and Domonic were living in the North Philadelphia neighborhood of Kensington, which had long suffered some of the highest crime rates in the city. Billa was 22, proud to be living in her own place after having been raised in West Philadelphia mostly by her grandparents, and flush with the novelty of motherhood. “When I found out I was carrying Dom, it was the best thing that had ever happened to me,” she said. She liked to kiss his feet, and he liked it, too, so much so that he would stick them out invitingly with a big smile on his face.

One unseasonably warm spring day, she walked with Dom to the store to get something to eat. As they were leaving, a fight broke out on the street, with gunfire all around. “I just froze,” she said. “There was nothing I could do but just stand there and hold my baby.”

By the time they got safely home, she had resolved to leave the neighborhood. She knew her grandparents would take them in, but she preferred to seek her own path, and so she packed up their belongings and sought assistance at a shelter. “I knew that if I didn’t do something for Domonic and me, we would always be dependent on someone,” she said.

While bouncing around a succession of subsidized apartments, Billa started working in medical billing. But she wasn’t happy with the child care center that Dom was in. So she jumped at the chance of a job where, she was told, she would be allowed to bring him with her: driving a school bus. She studied hard for her commercial driver’s license and passed. Every weekday, she would set off with him strapped in the front row of the bus.

In 2005, Billa made yet another move, to Northeast Philadelphia. That swath of the city, an arm extending far up Interstate 95, had a much lower rate of crime and violence than where she was living in North Philadelphia. She wanted a safe environment for her son.

As it happened, even her former neighborhoods in North Philadelphia would grow safer in the years following her move. The decline in violent crime in Philadelphia was not nearly as attention-getting as the declines in New York City and Los Angeles, but it was impressive in its own right. Between 1990 and 2007, Philadelphia averaged 382 homicides per year. Beginning in 2008 the numbers dropped steadily, and in 2013 and 2014, the city registered fewer than 250 killings each year. The decline coincided with a notable upgrade of the city’s prospects: the rejuvenation of Center City, the resumption of population growth. “I believe that there are some people probably still alive today because of many of the things we did back in those days,” said Michael Nutter, who served as mayor from 2008 to 2016.

As elsewhere, there was no clear consensus about what was behind the drop in violent crime. Criminologists offered up a string of possible explanations, among them the passing of the crack epidemic, the expansion of police forces in the 1990s, and the reduction of childhood lead exposure in house paint and gasoline.

The debate was largely academic, a friendly argument over a happy story. In recent years, however, the trend started to reverse in Philadelphia and much of the country — first gradually, and then last year sharply. The nationwide homicide rate jumped 25% in 2020, taking it back to where it was in the late 1990s, wiping out two decades’ worth of progress. The nationwide rate is still below its highs in the early 1990s, but many cities, including Philadelphia, are near or past their all-time highs. And in many cities, including Philadelphia, this year is on track to be even worse than last year.

This soaring toll, which is heavily concentrated in Black neighborhoods, has brought new urgency to understanding the problem. But the terrible experience of the past year and a half has also offered an opportunity to make sense of what drives gun violence, and how to deter it. The coronavirus pandemic, and the decisions that officials made in response to it, had the effect of undoing or freezing countless public and social services that are believed to have a preventative effect on violence. Removing them, almost simultaneously, created a sort of unintended stress test, revealing how essential they are to preserving social order.

The effect of this withdrawal was layered atop other contributing factors, such as criminal justice reforms in Philadelphia and other cities, and further deterioration of police-community relations in the wake of more high-profile deaths at police hands. Criminologists and city leaders across the country are now scrambling to disentangle these layers of causation as the spike carries on, turning a city such as Philadelphia into a sort of high-stakes laboratory.

Of course, for Philadelphians like Nakisha Billa, the city is not a laboratory. It is her home. The path the city has taken on public safety over the past two decades has been something palpable in her life, shaping it at every turn — and shattering it during the pandemic.

Caterina Roman is wary of easy sound bites. But if she had to explain why violent crime declined for years in Philadelphia, it would boil down to this: . . .

Continue reading.

Written by Leisureguy

3 August 2021 at 11:00 am

Why targets of deliberate deception often hesitate to admit they’ve been deceived.

Brooke Harrington, a sociology professor at Dartmouth College and author of Pop Finance and Capital Without Borders: Wealth Management and the One Percent (see brookeharrington.com), writes in the Atlantic:

Something very strange has been happening in Missouri: A hospital in the state, Ozarks Healthcare, had to create a “private setting” for patients afraid of being seen getting vaccinated against COVID-19. In a video produced by the hospital, the physician Priscilla Frase says, “Several people come in to get vaccinated who have tried to sort of disguise their appearance and even went so far as to say, ‘Please, please, please don’t let anybody know that I got this vaccine.’” Although they want to protect themselves from the coronavirus and its variants, these patients are desperate to ensure that their vaccine-skeptical friends and family never find out what they have done.

Missouri is suffering one of the worst COVID-19 surges in the country. Some hospitals are rapidly running out of ICU beds. To Americans who rushed to get vaccinated at the earliest opportunity, some Missourians’ desire for secrecy is difficult to understand. It’s also difficult to square with the common narrative that vaccine refusal, at least in conservative areas of the country, is driven by a lack of respect or empathy from liberals along the coasts. “Proponents of the vaccine are unwilling or unable to understand the thinking of vaccine skeptics—or even admit that skeptics may be thinking at all,” lamented a recent article in the conservative National Review. Writers across the political spectrum have urged deference and sympathy toward holdouts’ concerns about vaccine side effects and the botched CDC messaging about masking and airborne transmission early in the pandemic. But these takes can’t explain why holdouts who receive respect, empathy, and information directly from reliable sources remain unmoved—or why some people are afraid to tell their loved ones about being vaccinated.

What is going on here? Sociology suggests that pundits and policy makers have been looking at vaccine refusal all wrong: It’s not an individual problem, but a social one. That’s why individual information outreach and individual incentives—such as Ohio’s Vax-a-Million program, intended to increase vaccine uptake with cash prizes and college scholarships—haven’t worked. Pandemics, by definition, are collective problems. They propagate and kill because people live in communities. As a result, addressing pandemics requires understanding interpersonal dynamics—not just what promotes trust among people, but which behaviors convey status or lead to ostracism.

Shifting from an individual to a relational perspective helps us understand why people are seeking vaccination in disguise. They want to save face within the very specific set of social ties that sociologists call “reference groups”—the neighborhoods, churches, workplaces, and friendship networks that help people obtain the income, information, companionship, mutual aid, and other resources they need to live. The price of access to those resources is conformity to group norms. That’s why nobody strives for the good opinion of everyone; most people primarily seek the approval of people in their own reference groups.

In Missouri and other red states, vaccine refusal on partisan grounds has become a defining marker of community affiliation. Acceptance within some circles is contingent on refusal to cooperate with the Biden administration’s public-health campaign. Getting vaccinated is a betrayal of that group norm, and those who get the shot can legitimately fear losing their job or incurring the wrath of their families and other reference groups.

Sociology solves mysteries like these by zeroing in on problematic relationships, not the decisions that individuals make in isolation. Many of the people refusing safe, effective vaccination amid a deadly pandemic are enmeshed in a very distinctive type of relationship that sociologists have been studying for more than 70 years: the con job. Con artists gain social or financial advantage by convincing their marks to believe highly dubious claims—and to block out all information to the contrary.

COVID-19-related cons have become big business, not just for right-wing media outlets that have gained viewers while purveying vaccine disinformation but also for small-time social-media grifters and enterprising professionals. The New York Times recently profiled Joseph Mercola, a Florida osteopath whom the paper described as “The Most Influential Spreader of Coronavirus Misinformation.” Four years ago, the Federal Trade Commission forced Mercola to pay nearly $3 million in settlements for false advertising claims about indoor tanning beds that he had sold. In February of this year, Mercola told his millions of followers on Facebook that the vaccine would “alter your genetic coding,” and promoted his line of vitamin supplements as an alternative to ward off COVID-19.

To outsiders, the social dynamics of the con appear peculiar and irrational. Those caught up in it can seem self-destructive and, frankly, clueless. But to sociologists, including me, who study fraud, such behaviors obey a predictable logic.

The seminal text in the field—Erving Goffman’s 1952 essay “On Cooling the Mark Out”—observes that all targets of con artists eventually come to understand that they have been defrauded, yet they almost never complain or report the crime to authorities. Why? Because, Goffman argues, . . .

Continue reading. There’s more, and it’s illuminating.

Written by Leisureguy

2 August 2021 at 12:51 pm

What’s Behind the U.S. War on Science?

Vincent Ialenti, formerly a MacArthur postdoctoral fellow at the University of British Columbia and now MacArthur Assistant Research Professor in the Elliott School of International Affairs at George Washington University, and author of Deep Time Reckoning: How Future Thinking Can Help Earth Now, has an interesting article in Sapiens. It begins:

In U.S. President-elect Joe Biden’s victory speech last November, he vowed his administration would “marshal the forces of science” to take bold action against climate change and the pandemic. Describing his election as a “great day” for American educators, he drafted a national coronavirus strategy with a clear mandate: “Listen to science.”

Biden, now halfway through his first year as president, has mostly followed through. He appointed a leading geneticist as his top science adviser and elevated his role to the Cabinet rank. He established a new position—deputy director for science and society at the Office of Science and Technology Policy—and filled it with a renowned sociologist. He reengaged the World Health Organization and issued a detailed pandemic plan focused on health equity and higher vaccination rates. He rejoined the Paris Climate Agreement and set an ambitious target for reducing greenhouse gas emissions by around 50 percent by 2030.

This all broke with his predecessor. Under former President Donald Trump, more than two-thirds of scientists across 16 federal agencies reported that hiring freezes and departures interfered with their work. Federal funding was cut for expertise on matters ranging from invasive insect risk to the effects of chemicals on pregnant women. The White House attempted to undermine the National Climate Assessment and even sent a “cease and desist” order to a top National Park Service scientist for testifying to Congress about climate change. The Trump administration moved coronavirus data collection away from the Centers for Disease Control and Prevention, and downplayed the seriousness of the pandemic.

Biden has a historic opportunity to reverse Trump’s regressive science policies. Yet to achieve a more fundamental change in how American political culture approaches science, Biden has to go further to confront an unsettling reality: Current suspicions of science did not begin with the election of one man in 2016. They have often been symptomatic of frustrations and critiques that gained relevance decades prior to Trump’s inauguration, leading many critics to write off scientists as just another untrustworthy, out-of-touch group of elites.

In my new book, Deep Time Reckoning, I refer to this as a “deflation of expertise.” To understand its origins, I first had to leave my home country and experience everyday life in a society that approaches scientific and other forms of expertise differently: Finland. Reflecting on these contrasts can reveal some of the societal disillusionments that fueled Trump’s war on science—and help the U.S. move beyond them.

From 2012 to 2014, I lived in Helsinki. I was conducting anthropological fieldwork among experts developing what will likely become the world’s first deep geological repository for high-level nuclear energy waste. I often asked these experts how Finland was able to keep so closely to the disposal schedules it set back in the early 1980s. The United States’ now-defunct nuclear repository project at Yucca Mountain had, in contrast, been stymied by decades of fierce litigation, political stagnation, and scientific uncertainty.

The Finnish experts attributed their project’s comparatively smooth rollout to Finland’s broad public trust in the competence of their domestic engineers, technocrats, and scientists.

Finns from many walks of life told me of their country’s fondness for large, centralized, hierarchical organizations like public transport systems, government ministries, and the welfare state. They pointed me toward polls casting Finland as unique in its high levels of trust in its domestic civil servants, police officers, educators, journalists, and scientists. For sure, I met Finns who did not fit neatly with these generalizations. But on the whole, my findings lined up with the conclusions of Finnish social scientists: Finns generally “count on expertise, technology, and authorities.”

When I returned home in August 2014, my mild reverse culture shock revealed Finland’s approach to expertise to be a world apart from the United States’.

Without realizing it at first, I found myself continuing my field research—but now it was the U.S. that looked unfamiliar. I asked my compatriots about trust in science while living in Upstate New York, then in Washington D.C., and when visiting my hometown in Central Massachusetts. I encountered a deep suspicion of experts that, in Finland, would have seemed almost paranoid.

American distrust of science is not new. Yet in recent decades, moral and religious critiques of science have fueled the growth of an anti-elite fervor against scientists and other experts, especially among conservatives. Why?

Some Americans I met told me how their trust in high-ranking military leaders had been shaken when the U.S. invaded Iraq in 2003 based on false pretenses about weapons of mass destruction. Others told me that their trust in economists had been damaged after the 2007–2008 global financial crisis. Still others expressed that their trust in Silicon Valley had given way to concerns about digital privacy losses, big data cybersecurity hacks, and U.S. National Security Agency surveillance.

After multiple breaches of public trust by powerful state institutions and trained experts, some people felt that suspicion of any kind of person in an elite position seemed reasonable. A trust gap was widening between the general public and elites.

Come the 2016 election, all sorts of claims to expert authority were written off as mere pompous elitism. Right-wing populists clamored loudly against technocrats, globalists, and the deep state. The Trump administration openly questioned established science on topics ranging from climate change to human evolution.

Meanwhile, deluges of online misinformation left millions of Americans—on both the political left and right—siloed in algorithmically generated, increasingly extreme social media echo chambers.

But the reasons behind this intensification of anti-science political fervor, especially among conservatives, the majority of whom are White people, are complex and multifaceted.

I will focus on a few that I see as particularly relevant to the intensification of anti-elitism in the U.S. For one thing, some White Americans have had to reckon with a crumbling American Dream. Over the past two decades, working-class White people have seen decreased life expectancies and increased rates of suicide and opioid overdoses. Meanwhile, middle-class White males’ wages have stagnated or, in some cases, declined.

These economic changes, alongside other factors, have led some conservatives to feel that “establishment” institutions—not just in media and government, but also in science and technology—have abandoned them. Some also say they’ve lost faith in the country’s higher education system. According to a 2017 Gallup poll, Republican or Republican-leaning voters tend to be concerned that these institutions are “too liberal/political” and don’t allow students to “think for themselves.” Some critics fear conservative students are marginalized by far-left faculty and administrators, a critique that hasn’t been borne out by the research.

Today only 37 percent of conservative Republicans believe in global warming—down from 49 percent in 2008. Many reject peer-reviewed findings on COVID-19 or view public health guidance as a threat to their sense of self-determination. Only 27 percent of Republicans—compared to 43 percent of respondents who are or lean Democrat—report “a great deal” of confidence in the scientific community as a whole.

Science advocates can find hope in Biden’s political appointments and policy initiatives. However, Biden faces a grander challenge: regaining trust in science among those who have lost faith in expertise itself.

Sociologist Bruno Latour has observed that . . .

Continue reading. There’s more, and it offers insights into the breakdown of the US social compact.

Lying eyes: Using disproven methods to make legal judgments

Gayan Samarasinghe writes in New Humanist:

The pandemic has meant most court hearings are now conducted online. For some of the judiciary, these virtual hearings are a great modernising development. Others are less enthusiastic. That is perhaps not surprising given that our courts are among the most conservative of our institutions. Their priests appear in wigs and robes, their natural language is Latin, and cameras and the press are largely shut out. In this context, trial by Skype is a polarising reform. But some judges complain that there is an insoluble problem when trials are held remotely: reading faces.

In a precedent-setting case at the start of the pandemic – known only as the matter of “P (A Child: Remote Hearing)” – a senior judge overturned a decision to hold a virtual trial. A local authority had accused a mother of child abuse by fabricating and inducing symptoms of illness in her daughter: a case, social services argued, of Munchausen Syndrome by Proxy.

The judge ruled the virtual trial could not proceed because the postage stamp version of the mother on Skype was too small. He wanted to see her properly, not only while she was giving her evidence, but while she was sitting in the well of the court and reacting to the evidence against her. This assessment of outward appearance, the judge said, was crucial to judging.

The judgment in P stands for a long and largely unquestioned tradition that asserts judges can and should try to read people’s facial expressions and body language when reaching decisions. But increasingly, both the ethics and science behind this practice are being called into question.

Biasing the court

In my first year of training as a barrister in the UK, I arrived at court to represent one of two parents involved in a bitter custody dispute over their young son. The judge – sensing the animosity in the room – warned both of them to be respectful when the other was speaking. He then expressed a sentiment that I have heard many judges repeat, something along these lines: “Judges often learn far more watching you while you are listening, than when you are speaking on oath.”

Not long after, in a domestic violence case in a different court, I learned what could happen when the warning wasn’t heeded. While listening to her ex-husband’s answers to questions, my client sighed, made faces and – worst of all – committed the crime of kissing her teeth: an acceptable way of expressing disapproval in her West African culture but a noise that is strangely hated in much of Europe (“le tchip” as it is known in French, is now banned in many schools both in France and in the UK).

When the court’s judgment was given, my client was described by the judge as a woman whose behaviour in court had not been consistent with that of a domestic abuse victim. By the narrowest of margins, however, and due to other evidence, the judge (seemingly grudgingly) found in her favour. I was relieved my client won her case, but I was troubled by the idea that even people who have been ruled to be victims may not have a courtroom demeanour that lives up to the imagined standard of what a victim should look like.

I couldn’t help wondering how juries in criminal trials approached similar questions. One of my former lecturers at law school, Professor Louise Ellison, has investigated how juries make decisions in rape cases. Her research suggests that many jurors may have little understanding of the factors that could influence a rape complainant’s demeanour in court. Although victims of trauma may present as calm for a number of reasons, Ellison’s jurors were struck and perplexed by what they perceived as “stoic” testimony by a complainant. Some wondered whether the witness’s measured pace and somewhat “flat” intonation meant that her answers had been rehearsed.

A complication for Ellison’s research is access to jurors. Her studies have been undertaken with actors in the roles of complainants. While criminal trials are held in public, jurors are forbidden, at risk of imprisonment, from revealing how they reached their verdicts. There is an anxiety as to how many mistrials might take place were jurors to reveal their deliberations. In one study, conducted with Ellison’s colleague Professor Vanessa Munro and published in the British Journal of Criminology in 2009, their mock jurors often said troubling things. One discussion focused on what the witness had chosen to wear to court – with a juror suggesting that a “dowdy” dress was a deliberate attempt to manipulate the jury: “she’s got no make-up on, her hair’s tied back, she looks like a frightened little woman. Who knows, in the office she could have black stockings on, four and a half inch heels, wearing loads of make-up.” Do these mirror the discussions of juries in real trials?

Some think the answers to these types of questions could do immense damage to the system: the right to be judged by a jury of one’s peers, ever since it was enshrined in the Magna Carta, has long been seen as a safeguard against tyranny and a fundamental part of British justice.

There may, however, be ways of improving the system. Much like in other social situations, jurors can and do correct each other. And judges can also help. Ellison has argued for . . .

Continue reading. There’s more.

Written by Leisureguy

29 July 2021 at 1:24 pm

Are We All Getting More Depressed?: A New Study Analyzing 14 Million Books, Written Over 160 Years, Finds the Language of Depression Steadily Rising

Interesting column at Open Culture today, written by Josh Jones — and note at the bottom of the column the links to related content. He writes:

The relations among thought, language, and mood have become subjects of study for several scientific fields of late. Some of the conclusions seem to echo religious notions from millennia ago. “As a man thinketh, so he is,” for example, proclaims a famous verse in Proverbs (one that helped spawn a self-help movement in 1903). Positive psychology might agree. “All that we are is the result of what we have thought,” says one translation of the Buddhist Dhammapada, a sentiment that cognitive behavioral therapy might endorse.

But the insights of these traditions — and of social psychology — also show that we’re embedded in webs of connection: we don’t only think alone; we think — and talk and write and read — with others. External circumstances influence mood as well as internal states of mind. Approaching these questions differently, researchers at the Luddy School of Informatics, Computing, and Engineering at Indiana University asked, “Can entire societies become more or less depressed over time?,” and is it possible to read collective changes in mood in the written languages of the past century or so?

The team of scientists, led by Johan Bollen, Indiana University professor of informatics and computing, took a novel approach that brings together tools from at least two fields: large-scale data analysis and cognitive-behavioral therapy (CBT). Since diagnostic criteria for measuring depression have only been around for the past 40 years, the question seemed to resist longitudinal study. But CBT provided a means of analyzing language for markers of “cognitive distortions” — thinking that skews in overly negative ways. “Language is closely intertwined with this dynamic” of thought and mood, the researchers write in their study, “Historical language records reveal a surge of cognitive distortions in recent decades,” published just last month in PNAS.

Choosing three languages, English (US), German, and Spanish, the team looked for “short sequences of one to five words (n-grams), labeled cognitive distortion schemata (CDS).” These words and phrases express negative thought processes like “catastrophizing,” “dichotomous reasoning,” “disqualifying the positive,” etc. Then, the researchers identified the prevalence of such language in a collection of over 14 million books published between 1855 and 2019 and uploaded to Google Books. The study controlled for language and syntax changes during that time and accounted for the increase in technical and non-fiction books published (though it did not distinguish between literary genres).
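To make the counting step concrete, here is a minimal sketch in Python of a prevalence measure along the lines the study describes — an illustration only, not the authors’ actual pipeline. The CDS phrases and the toy corpus below are hypothetical placeholders; the study itself used curated schemata and the Google Books corpus.

CDS = ["will never", "everyone knows", "completely ruined"]  # hypothetical 1-to-5-word schemata

def cds_prevalence(texts_by_year):
    """For each year, return CDS matches per word of text."""
    scores = {}
    for year, texts in texts_by_year.items():
        hits = total_words = 0
        for text in texts:
            t = text.lower()
            total_words += len(t.split())
            hits += sum(t.count(phrase) for phrase in CDS)
        scores[year] = hits / max(total_words, 1)
    return scores

corpus = {  # toy stand-in for books grouped by publication year
    1950: ["the committee met and adjourned without incident"],
    2010: ["this will never work and everyone knows it is completely ruined"],
}
print(cds_prevalence(corpus))  # the 2010 "year" scores higher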

What the scientists found in all three languages was a distinctive “‘hockey stick’ pattern” — a sharp uptick in the language of depression after 1980 and into the present time. The only spikes that come close on the timeline occur in English language books during the Gilded Age and books published in German during and immediately after World War II. (Highly interesting, if unsurprising, findings.) Why the sudden, steep climb in language signifying depressive thinking? Does it actually mark a collective shift in mood, or show how historically oppressed groups have had more access to publishing in the past forty years, and have expressed less satisfaction with the status quo?

While they are careful to emphasize that they “make no causal claims” in the study, the researchers have some ideas about what’s happened, observing for example: . . .

Continue reading. There’s more, including some likely causes — and note this earlier post by Kevin Drum, which focuses on the US.

Written by Leisureguy

29 July 2021 at 11:35 am

The Real Source of America’s Rising Rage

Kevin Drum has a good article in Mother Jones that begins:

Americans sure are angry these days. Everyone says so, so it must be true.

But who or what are we angry at? Pandemic stresses aside, I’d bet you’re not especially angry at your family. Or your friends. Or your priest or your plumber or your postal carrier. Or even your boss.

Unless, of course, the conversation turns to politics. That’s when we start shouting at each other. We are way, way angrier about politics than we used to be, something confirmed by both common experience and formal research.

When did this all start? Here are a few data points to consider. From 1994 to 2000, according to the Pew Research Center, only 16 percent of Democrats held a “very unfavorable” view of Republicans, but then these feelings started to climb. Between 2000 and 2014 the share rose to 38 percent, and by 2021 it was about 52 percent. And the same is true in reverse for Republicans: The share who intensely dislike Democrats went from 17 percent to 43 percent to about 52 percent.

Likewise, in 1958 Gallup asked people if they’d prefer their daughter marry a Democrat or a Republican. Only 28 percent cared one way or the other. But when Lynn Vavreck, a political science professor at UCLA, asked a similar question a few years ago, 55 percent were opposed to the idea of their children marrying outside their party.

Or consider the right track/wrong track poll, every pundit’s favorite. Normally this hovers around 40–50 percent of the country who think we’re on the right track, with variations depending on how the economy is doing. But shortly after recovering from the 2000 recession, this changed, plunging to 20–30 percent over the next decade and then staying there.

Finally, academic research confirms what these polls tell us. Last year a team of researchers published an international study that estimated what’s called “affective polarization,” or the way we feel about the opposite political party. In 1978, we rated people who belonged to our party 27 points higher than people who belonged to the other party. That stayed roughly the same for the next two decades, but then began to spike in the year 2000. By 2016 it had gone up to 46 points—by far the highest of any of the countries surveyed—and that’s before everything that has enraged us for the last four years.

What’s the reason for this? There’s no shortage of speculation. Political scientists talk about the fragility of presidential systems. Sociologists explicate the culture wars. Historians note the widening divide between the parties after white Southerners abandoned the Democratic Party following the civil rights era. Reporters will regale you with stories about the impact of Rush Limbaugh and Newt Gingrich.

There’s truth in all of these, but even taken together they are unlikely to explain the underlying problem. Some aren’t new (presidential systems, culture wars) while others are symptoms more than causes (the Southern Strategy).

I’ve been spending considerable time digging into the source of our collective rage, and the answer to this question is trickier than most people think. For starters, any good answer has to fit the timeline of when our national temper tantrum began—roughly around the year 2000. The answer also has to be true: That is, it needs to be a genuine change from past behavior—maybe an inflection point or a sudden acceleration. Once you put those two things together, the number of candidates plummets.

But I believe there is an answer. I’ll get to that, but first we need to investigate a few of the most popular—but ultimately unsatisfying—theories currently in circulation.

Theory #1: Americans Have Gone Crazy With Conspiracy Theories

It’s probably illegal to talk about the American taste for conspiracy theorizing without quoting from Richard Hofstadter’s famous essay, “The Paranoid Style in American Politics.” It was written in 1964, but this passage (from the book version) about the typical conspiracy monger should ring a bell for the modern reader:

He does not see social conflict as something to be mediated and compromised, in the manner of the working politician. Since what is at stake is always a conflict between absolute good and absolute evil, the quality needed is not a willingness to compromise but the will to fight things out to a finish. Nothing but complete victory will do.

Or how about this passage from Daniel Bell’s “The Dispossessed”? It was written in 1962:

The politics of the radical right is the politics of frustration—the sour impotence of those who find themselves unable to understand, let alone command, the complex mass society that is the polity today…Insofar as there is no real left to counterpoise to the right, the liberal has become the psychological target of that frustration.

In other words, the extreme right lives to own the libs. And it’s no coincidence that both Hofstadter and Bell wrote about this in the early ’60s: That was about the time that the John Birch Society was gaining notoriety and the Republican Party nominated Barry Goldwater for president. But as Hofstadter in particular makes clear, a fondness for conspiracy theories has pervaded American culture from the very beginning. Historian Bernard Bailyn upended revolutionary-era history and won a Pulitzer Prize in 1968 for his argument that belief in a worldwide British conspiracy against liberty “lay at the heart of the Revolutionary movement”—an argument given almost Trumpian form by Sam Adams, who proclaimed that the British empire literally wanted to enslave white Americans. Conspiracy theories that followed targeted the Bavarian Illuminati, the Masons, Catholics, East Coast bankers, a global Jewish cabal, and so on.

But because it helps illuminate what we face now, let’s unpack the very first big conspiracy theory of the modern right, which began within weeks of the end of World War II.

In 1945 FDR met with Joseph Stalin and Winston Churchill at Yalta with the aim of gaining agreement about the formation of the United Nations and free elections in Europe. In this he succeeded: Stalin agreed to everything FDR proposed. When FDR returned home he gave a speech to Congress about the meeting, and it was generally well received. A month later he died.

Needless to say, Stalin failed to observe most of the agreements he had signed. He never had any intention of allowing “free and fair” elections in Eastern Europe, which he wanted as a buffer zone against any future military incursion from Western Europe. The United States did nothing about this, to the disgust of many conservatives. However, this was not due to any special gutlessness on the part of Harry Truman or anyone in the Army. It was because the Soviet army occupied Eastern Europe when hostilities ended and there was no way to dislodge it short of total war, something the American public had no appetite for.

And there things might have stood. Scholars could have argued for years about whether FDR was naive about Stalin, or whether there was more the US and its allies could have done to push Soviet troops out of Europe. Books would have been written and dissertations defended, but not much more. So far we have no conspiracy theory, just some normal partisan disagreement.

But then came 1948. Thomas Dewey lost the presidency to Harry Truman and Republicans lost control of the House. Soon thereafter the Soviet Union demonstrated an atomic bomb and communists overran China. It was at this point that a normal disagreement turned into a conspiracy theory. The extreme right began suggesting that FDR had deliberately turned over Eastern Europe to Stalin and that the US delegation at Yalta had been rife with Soviet spies. Almost immediately Joe McCarthy was warning that the entire US government was infiltrated by communists at the highest levels. J. Robert Oppenheimer, the architect of the Manhattan Project, was surely a communist. George Marshall, the hero of World War II, was part of “a conspiracy on a scale so immense as to dwarf any previous such venture in the history of man.”

Like most good conspiracy theories, there was a kernel of truth here. Stalin really did take over Eastern Europe. Alger Hiss, part of the Yalta delegation, really did turn out to be a Soviet mole. Klaus Fuchs and others really did pass along atomic secrets to the Soviets. Never mind that Stalin couldn’t have been stopped; never mind that Hiss was a junior diplomat who played no role in the Yalta agreements; never mind that Fuchs may have passed along secrets the Soviets already knew. It was enough to power a widespread belief in McCarthy’s claim of the biggest conspiracy in all of human history.

There’s no polling data from back then, but belief in this conspiracy became a right-wing mainstay for years—arguably the wellspring of conservative conspiracy theories for decades. Notably, it caught on during a time of conservative loss and liberal ascendancy. This is a pattern we’ve seen over and over since World War II. The John Birch Society and the JFK assassination conspiracies gained ground after enormous Democratic congressional victories in 1958 and again in 1964. The full panoply of Clinton conspiracies blossomed after Democrats won united control of government in the 1992 election. Benghazi was a reaction to Barack Obama—not just a Democratic win, but the first Black man to be elected president. And today’s conspiracy theories about stealing the presidential election are a response to Joe Biden’s victory in 2020.

How widespread are these kinds of beliefs? And has their popularity changed over time? The evidence is sketchy but there’s polling data that provides clues. McCarthy’s conspiracy theories were practically a pandemic, consuming American attention for an entire decade. Belief in a cover-up of the JFK assassination has always hovered around 50 percent or higher. In the mid-aughts, a third of poll respondents strongly or somewhat believed that 9/11 was an inside job, very similar to the one-third of Americans who believe today that there was significant fraud in the 2020 election even though there’s no evidence to support this. And that famous one-third of Americans who are skeptical of the COVID-19 vaccine? In 1954 an identical third of Americans were skeptical of the polio vaccine that had just become available.

So how does QAnon, the great liberal hobgoblin of the past year, measure up? It may seem historically widespread for such an unhinged conspiracy theory, but it’s not: Polls suggest that actual QAnon followers are rare and that belief in QAnon hovers at less than 10 percent of the American public. It’s no more popular than other fringe fever swamp theories of the past.

It’s natural to believe that things happening today—to you—are worse than similar things lost in the haze of history, especially when social media keeps modern outrages so relentlessly in our faces. But often it just isn’t true. A mountain of evidence suggests that the American predilection for conspiracy theories is neither new nor growing. Joseph Uscinski and Joseph Parent, preeminent scholars of conspiracy theories, confirmed this with some original research based on letters to the editors of the New York Times and the Chicago Tribune between 1890 and 2010. Their conclusion: Belief in conspiracy theories has been stable since about 1960. Along with more recent polling, this suggests that the aggregate belief in conspiracy theories hasn’t changed a lot and therefore isn’t likely to provide us with much insight into why American political culture has corroded so badly during the 21st century.

Theory #2: It’s All About Social Media

How about social media? Has it had an effect? Of . . .

Continue reading. There’s much more — along with what he views as the main cause.

And note these:

RELATED STORIES

Today It’s Critical Race Theory. 200 Years Ago It Was Abolitionist Literature.

The Moral Panic Over Critical Race Theory Is Coming for a North Carolina Teacher of the Year

Post-Trump, the GOP Continues to Be the Party of (White) Grievance

Written by Leisureguy

28 July 2021 at 12:00 pm

Coaches who care more about winning than about the athletes they coach

Byron Heath has an interesting post on Facebook:

This realization I had about Simone Biles is gonna make some people mad, but oh well.

Yesterday I was excited to show my daughters Kerri Strug’s famous one-leg vault. It was a defining Olympic moment that I watched live as a kid, and my girls watched raptly as Strug fell, and then limped back to leap again.

But for some reason I wasn’t as inspired watching it this time. In fact, I felt a little sick. Maybe being a father and teacher has made me soft, but all I could see was how Kerri Strug looked at her coach, Bela Karolyi, with pleading, terrified eyes, while he shouted back “You can do it!” over and over again.

My daughters didn’t cheer when Strug landed her second vault. Instead they frowned in concern as she collapsed in agony and frantic tears.

“Why did she jump again if she was hurt?” one of my girls asked. I made some inane reply about the heart of a champion or Olympic spirit, but in the back of my mind a thought was festering:

*She shouldn’t have jumped again*

The more the thought echoed, the stronger my realization became. Coach Karolyi should have gotten his visibly injured athlete medical help immediately! Now that I have two young daughters in gymnastics, I expect their safety to be the coach’s number one priority. Instead, Bela Karolyi told Strug to vault again. And he got what he wanted: a gold medal that was more important to him than his athlete’s health.

I’m sure people will say “Kerri Strug was a competitor–she WANTED to push through the injury.” That’s probably true. But since the last Olympics we’ve also learned these athletes were put into positions where they could be systematically abused both emotionally and physically, all while being inundated with “win at all costs” messaging. A teenager under those conditions should have been protected, and told “No medal is worth the risk of permanent injury.” In fact, we now know that Strug’s vault wasn’t even necessary to clinch the gold; the U.S. already had an insurmountable lead. Nevertheless, Bela Karolyi told her to vault again according to his own recounting of their conversation:

“I can’t feel my leg,” Strug told Karolyi.

“We got to go one more time,” Karolyi said. “Shake it out.”

“Do I have to do this again?” Strug asked.

“Can you, can you?” Karolyi wanted to know.

“I don’t know yet,” said Strug. “I will do it. I will, I will.”

The injury forced Strug’s retirement at 18 years old. Dominique Moceanu, a generational talent, also retired from injuries shortly after. They were top gymnasts literally pushed to the breaking point, and then put out to pasture. Coach Karolyi and Larry Nassar (the serial sexual abuser) continued their long careers, while the athletes were treated as a disposable resource.

Today Simone Biles–the greatest gymnast of all time–chose to step back from the competition, citing concerns for mental and physical health. I’ve already seen comments and posts about how Biles “failed her country”, “quit on us”, or “can’t be the greatest if she can’t handle the pressure.” Those statements are no different than Coach Karolyi telling an injured teen with wide, frightened eyes: “We got to go one more time. Shake it out.”

The subtext here is: “Our gold medal is more important than your well-being.”

Our athletes shouldn’t have to destroy themselves to meet our standards. If giving empathetic, authentic support to our Olympians means we’ll earn fewer gold medals, I’m happy to make that trade.

Here’s the message I hope we can send to Simone Biles: You are an outstanding athlete, a true role model, and a powerful woman. Nothing will change that. Please don’t sacrifice your emotional or physical well-being for our entertainment or national pride. We are proud of you for being brave enough to compete, and proud of you for having the wisdom to know when to step back. Your choice makes you an even better example to our daughters than you were before. WE’RE STILL ROOTING FOR YOU!

I’ve read a fair amount about the psychology of athletes and performers. It would be interesting to read more about the psychology of coaches — not how they employ psychology on their charges, but the psychology that drives them: why some coaches are so willing to sacrifice an athlete to secure a win.

Written by Leisureguy

28 July 2021 at 9:47 am

How Bad is American Life? Americans Don’t Even Have Friends Anymore

Umair Haque has a somewhat gloomy piece in Medium. He writes:

Continue reading.

Written by Leisureguy

27 July 2021 at 11:58 am

Paris Sportif: The Contagious Attraction of Parkour

I first encountered parkour in a Luc Besson movie, District 13 (from 2004, original title Banlieue 13), but it has a longer history, discussed by Macs Smith in an extract from his book Paris and the Parasite: Noise, Health, and Politics in the Media City, published in The MIT Press Reader:

In a city fixated on public health and order, a viral extreme sport offers a challenge to the status quo.

In 1955, Letterist International, a Paris-based group of avant-garde authors, artists, and urban theorists, published “Proposals for Rationally Improving the City of Paris.” The group, which would become better known as Situationist International, or SI, and play an important role in the May 1968 demonstrations, put forward wild suggestions for breaking the monotony of urban life. Some of these, like the call to abolish museums and distribute their masterpieces to nightclubs, were iconoclastic and anti-institutional, reflecting the group’s anarchic political leanings.

Others were less overtly political and testified to a thirst for excitement. To appeal to “spelunkers” and thrill-seekers, they called for Paris’s rooftops and metro tunnels to be opened up to exploration. The group believed that the mundaneness of urban life in the 1950s was integral to bourgeois capitalism. Boredom was part of how the government maintained order, and so a more equal city would necessarily have to be more frightening, more surprising, more fun.

SI disbanded in 1972, but its ideas about the links between emotion and urban politics have been influential. Among the best examples are the subcultures centered around urban thrill-seeking that exist today, like urban exploration (Urbex), rooftopping, and skywalking, all of which involve breaking into dangerous or forbidden zones of the city. The most famous inheritor to SI’s call to experience urban space differently is parkour, which was invented in the Paris suburb of Lisses in the 1980s. It was inspired by Hébertisme, a method of obstacle course training first introduced to the French Navy in 1910 by Georges Hébert. David Belle learned the principles of Hébertisme from his father, Raymond, who had been exposed to it at a military school in Vietnam. David, along with a friend, Sébastien Foucan, then adapted those principles, originally conceived for natural environments, to the suburban architecture of their surroundings.

Over time, parkour has incorporated techniques from tumbling, gymnastics, and capoeira, resulting in a striking blend of military power and balletic artistry. Parkour involves confronting an urban map with an embodied experience of urban space. It is often defined as moving from points A to B in the most efficient way possible, and parkour practitioners, called traceurs, often depict themselves as trailblazers identifying routes through the city that cartography does not capture. Traceurs sometimes evoke the fantasy of tracing a straight line on the map and finding a way to turn it into a path, although in practice, they more often work at a single point on the map — a park, a rooftop, an esplanade — and end a session back where they started.

Traceurs’ desire to rewrite the map is another thing they share with the Situationists, who liked to cut up maps and glue them back together to show the psychological distance between neighborhoods. But parkour distinguishes itself from SI through its use of video, which continues to be a point of debate within the practice. In the early 2000s, Sébastien Foucan reignited this debate when he broke away from Belle to pioneer his own version of the training system.

Foucan’s appearance in the 2003 documentary “Jump London” cemented “freerunning” as the name for this alternate practice, which put a greater emphasis on stylized movements. Foucan would go on to play a terrorist bomb-maker in Martin Campbell’s “Casino Royale,” leaping from cranes with Daniel Craig’s James Bond in pursuit. Some parkour purists see this as a degradation of the utilitarian roots of their training, and insist instead on a physio-spiritual discourse of communion with the environment, mastery of fear, and humility. They reject freerunning as a brash corruption of Hébert’s principles. The sociologist Jeffrey Kidder notes in his interviews with traceurs in Chicago that they dismiss participants who lack interest in serious rituals like safety, humility, and personal growth. They react negatively to media coverage that highlights parkour’s danger or assimilates it into adolescent rebellions like skateboarding, drug use, or loitering.

In my own email interview with the leaders of Parkour Paris, the official parkour organization of Paris, the same will to blame media is evident: “Parkour has been mediatized in ‘connotated’ films. The traceurs depicted in those fictions were friendly delinquents a bit like Robin Hood. Friendly, yes, but for the immense majority of people they were still delinquents from the banlieue,” they gripe. “It’s been very hard to shake that image.” . . .

Continue reading. There’s much more. And it includes this 50-minute video, Jump London:

Written by Leisureguy

27 July 2021 at 10:17 am

Whatever Is True, Is My Own: Seneca’s Open-minded Enquiry

leave a comment »

Barnaby Taylor teaches Classics at Exeter College, Oxford and writes in Antigone:

Say that you subscribe to a particular set of values, which you believe are the key to being truly good and happy. You haven’t mastered them yet, but you pursue them with increasing devotion, and feel yourself making progress. Say now that your friend, about whom you care very much, feels some attraction to these values, to this way of life, but is yet to cultivate a deep and lasting interest. He has other intellectual temptations, and, what’s more, he is weighed down by the cares and troubles of the world. How can you help him to develop his nascent interest in your philosophy of the good life? And what attitude should you encourage him to hold towards those with whom you disagree? These questions are explored in Seneca’s Moral Epistles, written over the last few years of his life to his friend and philosophical fellow-traveller, Lucilius.

Seneca (c. 4 BC – AD 65) was a Stoic, and so thought that virtue is the only thing that matters for a truly good life. Nothing else – including health, wealth, possessions, and family – makes any contribution to happiness. This may sound austere, and indeed there was a certain unrelenting quality to Stoic ethics, but the Epistles are not an austere work by any measure. Across 124 letters, in which the narrative exploration of life is generally preferred to abstract theorising, Seneca engages in a deep and intimate evaluation of what it means to be good, discussing at length, and with much wit and uncompromising self-scrutiny, his own faltering moral progress.

In the first 29 letters – those on which I’ll concentrate here – we find discussions of reading (what should one read, and how should one read it?), friendship, moral and social imagination, candidness, perfectionism, self-awareness, vulnerability, solitude, sociability, emotion, mental discipline, old age, and death. Above all, Seneca focuses on the question of how to pursue a life of introspection in the midst of worldly responsibilities and concerns – a focus which may be especially attractive to those who, like me, have often felt the tension between the obligations of the world and the possibility of an inner life.

These early Senecan letters appeal to me in several ways. Partly it’s the elegance, wit and economy of his Latin style; partly it’s the thoughtful depiction and exploration of the didactic process, which interests me as someone who, being a teacher, spends a lot of time helping others to develop and cultivate their intellectual interests and values; partly it’s simply the richness and depth of the discussion; and partly it’s the sense of Seneca’s own flawedness and failure – these are not the writings of a moral saint.

I’d like here to focus on one surprising feature of these early letters, namely their treatment of a certain philosopher with whose doctrines Seneca elsewhere expresses fundamental disagreement. While acknowledging that the strictness of Stoic doctrine may need to be relaxed for those who are just getting started, Seneca is clear that what he is engaged in, and what he is encouraging Lucilius towards, is the cultivation of a Stoic life. Now, one might think that a good way of getting ahead with Stoicism would be to focus on Stoic texts, and indeed Seneca does give a few select quotations from the Stoic philosopher Hecaton (noster, “one of ours,” 5.7).

‘Don’t read too widely,’ Seneca advises Lucilius in the second letter, where the focus on books and reading is programmatic for the whole work. It is better, he says, to focus your attention on a few particularly valuable authors, and to really learn something from them, than to try to read everything and absorb nothing from it. This second letter ends, though, with a quotation not from a fellow Stoic, but from Epicurus (341–271 BC), the founder of Epicureanism, a doctrine whose central ethical tenets were held to be quite incompatible with Stoic thought. Epicurus goes on to become by far the most regularly quoted philosopher in the first books of the Epistles. . .

Continue reading.

Written by Leisureguy

25 July 2021 at 11:05 am

We’re all teenagers now

leave a comment »

Paul Howe, professor of political science at the University of New Brunswick in Fredericton, Canada, and author of Teen Spirit: How Adolescence Transformed the Adult World (2020), has an extract from his book in Aeon:

Most of us are familiar with the law of unintended consequences. In the 1920s, Prohibition put a halt to the legal production and sale of alcohol in the United States only to generate a new set of social ills connected to bootlegging and wider criminal activity. More recently, mainstream news media outlets, in pursuit of ratings and advertising dollars, lavished attention on an outlandish, orange-hued candidate when he first announced his run for president in 2015, and inadvertently helped to pave his way to the White House – oops. Aiding and abetting his campaign was a communications tool – social media – originally designed to bring people together and create community, but which now seems to serve more as a vehicle of division and discord.

A different development has been seen as an unqualified boon: the mass expansion, over the past century, of public education. In place of a narrowly educated elite and the minimally schooled masses, we now have a society where the vast majority possess knowledge and skills necessary for success in various dimensions of their lives, including work, community engagement, democratic participation and more. Some might fall short of their potential, but the general impact is clear: extending greater educational opportunity to one and all has provided untold benefits for both individuals and society at large over the long haul.

The latest work from Robert Putnam, the pre-eminent scholar of social change in the modern US, illustrates the common wisdom on the matter. His book The Upswing (co-authored with the social entrepreneur Shaylyn Romney Garrett) sets the stage by describing the social strife of the Gilded Age, the final decades of the 19th century when rapid industrialisation and technological change generated social dislocation, inequality, civic discord and political corruption. In response to this troubled state of affairs, the Progressive movement sprang into being, bringing a new community spirit to society’s problems, along with a series of pragmatic solutions. One signal achievement was the establishment of the modern public high school, an innovation that began in the US West and Midwest and spread quickly throughout the country. Enrolment at the secondary level among those aged 14 to 17 leapt from about 15 per cent in 1910 to 70 per cent by 1940.

In Putnam’s account, the clearest benefit of educating Americans to a higher level was unparalleled economic growth and upward social mobility for the newly educated lower classes – positive effects that unfolded over the first half of the 20th century and made the US a more prosperous and egalitarian society. These benefits were part and parcel of a more general upswing that encompassed rising levels of social trust, community engagement, political cooperation, and a stronger societal emphasis on ‘we’ than ‘I’.

But it did not last. For reasons not entirely clear, the 1960s saw individualism resurfacing as the dominant mindset of Americans and the ethos of US society, turning the upswing into a downswing that has continued to the present day and lies at the heart of many contemporary social and political problems.

Hidden in this puzzling arc of social change is another unintended consequence. Universal secondary education not only elevated Americans by spreading relevant knowledge and skills to the masses. It also gave rise to a more complex social and cultural transformation, as the adolescent period became pivotal in shaping who we are. The fact is that high school is, and always has been, about more than just education. In the late 1950s, the sociologist James Coleman investigated student life in 10 US high schools, seeking to learn more about adolescents and their orientation towards schooling. In The Adolescent Society: The Social Life of the Teenager and Its Impact on Education (1961), he reported that it was the social, not the educational, dimension of the high-school experience that was paramount to teens. Cloistered together in the high-school setting, teenagers occupied a separate and distinct social space largely immune from adult influence. Coleman warned that:

The child of high-school age is ‘cut off’ from the rest of society, forced inward toward his own age group, made to carry out his whole social life with others his own age. With his fellows, he comes to constitute a small society, one that has most of its important interactions within itself, and maintains only a few threads of connection with the outside adult society.

The emergence of a segregated teenage realm occurred well before Coleman put his finger on the problem. In their classic study of the mid-1920s, the sociologists Robert and Helen Lynd described the high school in ‘Middletown’ (later revealed to be Muncie, Indiana) as ‘a fairly complete social cosmos in itself … [a] city within a city [where] the social life of the intermediate generation centres … taking over more and more of [their] waking life.’

Life beyond the classroom reinforced the pattern: a national survey from around the same time found that the average urban teenager spent four to six nights a week socialising with peers rather than enjoying quiet nights at home with the family. With the advent of modern high school, the day-to-day life of teenagers was transformed, their coming-of-age experiences fundamentally altered. Adolescence became a kind of social crucible where teens were afforded the time and space to interact intensively with one another and develop by their own lights.

So while there was clear educational benefit gained from the reading, writing and arithmetic taking place in high-school classrooms across the land, a wider set of changes started to emanate from this new social configuration. The most visible was the emergence of a more sharply defined youth culture rooted in shared interests and passions that flourished more freely within adolescent society. Young people flocked to the movies like no other demographic, their enthusiasm for the silver screen and its celebrity icons helping to propel Hollywood to the forefront of popular culture. They latched on to new musical styles – jazz in the 1920s, swing in the 1930s – and embraced them as their own; devoured the new literary sensation of the times, comic books; and adopted common ways of dressing and personal styling as emblems of youth fashion. Embodied in these trends was a heightened emphasis on the fun and the frivolous side of life that would slowly reset societal standards as time went on.

Other changes were more subtle but equally portentous. Sociological studies conducted between the two world wars reveal a rapid liberalisation of attitudes towards practices such as betting, smoking and divorce, with rates of disapproval among youth declining by 20 to 35 percentage points in the space of just a single decade. In this same period, young people grew increasingly tolerant of social misdemeanours such as habitually failing to keep promises, using profane language, and keeping extra change mistakenly given by a store clerk – minor incivilities by today’s standards, but harbingers of a changing social landscape where the transgression of established norms was starting to become more common and accepted.

This rapid evolution in everyday behaviour reflected a deeper transformation: the character of rising generations, their values, temperament and traits, were being reshaped by the powerful influence of peers during the formative years of adolescence. Hedonistic desires were more openly expressed, pleasurable activities more freely pursued. Conscientiousness was downplayed, social norms treated with greater scepticism and disdain. Impulsiveness and emotionality were more commonly displayed, an open, adventurous spirit widely embraced.

What these diverse adolescent qualities amounted to were the building blocks of a nascent individualism that would reshape society profoundly as they came to full fruition over the course of the next few decades. Traits conducive to self-focused and self-directed thought and action were more deeply etched in teenagers and slowly altered the character of society at large as whole groups socialised in this manner moved forward to adulthood.

The effects of peer influence, this argument implies, run deeper than is commonly imagined, affecting not just superficial features of the self during the teenage years, but the kind of person we become. Important research from the personality psychologist Judith Rich Harris, synthesised in her seminal book, The Nurture Assumption (1998), backs up this idea. Harris reviewed the body of research on the nature versus nurture debate, finding it consistently showed environmental effects outside the home loomed larger than had previously been realised. And she presented evidence that . . .

Continue reading.

I commented on the article:

Fascinating article, and the hypothesis of adolescents “setting” their cultural outlook through being grouped with coevals during the transition to early adulthood (a) makes sense and (b) explains a lot. I am now elderly, but in middle age (in the 1980s), a common topic of conversation among people of my age was how much older our parents seemed to have been when they were the age we were. We (in our view) still had a youthful outlook, but our parents had always had an older (more adult?) outlook and attitude. And of course our parents had spent their adolescent years not among coevals but embedded in an adult workforce, where they picked up the culture and expectations of those adults, whereas we had picked up in our adolescent years the culture and outlook of other adolescents.

Another thought: I recall reading about things that happened in Iraq after George W. Bush had the US invade (and pretty much destroy) that country, and among those things was the US practice of imprisoning anyone whom they suspected of being a “terrorist” (sometimes just being anti-US). That amounted, various writers pointed out, to an intensive education in terrorism, by putting together practiced and knowledgeable insurgents and terrorists with many who had been merely discontented, but in the prisons, they learned a lot — skills, attitudes, and outlooks — and made connections so that they left as members of a network. (Another unforeseen side-effect.)

By penning up adolescents together for the years of their transition from childhood to early adulthood, we pretty much ensured that a new culture would evolve and they would leave that environment with that cultural outlook embedded in them.

Both those are examples of the rapidity with which memes evolve. (“Memes” in the sense Richard Dawkins meant when he defined the term in Chapter 11 of The Selfish Gene, as units of human culture.) Memetic evolution is the result of the same logic that explains the evolution of lifeforms: reproduction with variation, occasional mutation, and a natural selection that results in some changes being successful (reproducing more) and others not so successful — cf. The Meme Machine, by Susan Blackmore.

Cultures evolve very quickly, but even lifeforms can evolve fairly quickly in circumstances in which selection is intense — cf. the rapid evolution when a species becomes island-bound. The schools (and prisons) made a cultural island, and cultural evolution was swift.
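To make that selection logic concrete, here is a toy sketch in Python (my own illustration, with assumed numbers, not anything from Dawkins or Blackmore): a meme that gets copied even slightly more often than its rivals comes to dominate a small, isolated population within a few generations.

```python
import random

# Toy model of memetic evolution on a "cultural island": memes are
# copied with a small, assumed fitness difference, and the favored
# variant takes over the population within a few generations.

random.seed(1)

POPULATION = 1_000
GENERATIONS = 30
FITNESS = {"old_meme": 1.0, "new_meme": 1.2}  # assumed copying rates

pool = ["old_meme"] * 990 + ["new_meme"] * 10

for _ in range(GENERATIONS):
    # Each member of the next generation copies a meme from the current
    # pool, with more "attractive" memes copied proportionally more often.
    weights = [FITNESS[m] for m in pool]
    pool = random.choices(pool, weights=weights, k=POPULATION)

share = pool.count("new_meme") / POPULATION
print(f"new_meme share after {GENERATIONS} generations: {share:.0%}")
```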

Written by Leisureguy

22 July 2021 at 8:13 pm

Watching the Watchmen: The Michigan Group Who Planned to Kidnap the Governor

leave a comment »

Ken Bensinger and Jessica Garrison report in Buzzfeed:

The Michigan kidnapping case is a major test for the Biden administration’s commitment to fighting domestic terrorism — and a crucible for the fierce ideological divisions pulling the country apart.

In the inky darkness of a late summer night last September, three cars filled with armed men began circling Birch Lake in northern Michigan, looking for ways to approach Gov. Gretchen Whitmer’s three-bedroom vacation cottage, subdue her — using a stun gun if necessary — and drag her away.

One vehicle stopped to check out a boat launch while a second searched in vain for the right house in the thick woods ringing the lake. The third car ran countersurveillance, using night vision goggles to look out for cops and handheld radios to communicate with the others.

Earlier, they had scoped out a bridge over the Elk River, just a few miles away, scrambling down under the span to figure out where plastic explosives would need to be placed to blow it sky-high. That would slow police response, giving the men time to escape with the governor — who had infuriated them by imposing COVID lockdowns, among other outrages — and either take her to Lake Michigan, where they could abandon her on a boat, or whisk her to Wisconsin, where she would be tried as a “tyrant.”

“Everybody down with what’s going on?” an Iraq War veteran in the group demanded to know when they ended their recon mission, well past midnight, at a campsite where they were all staying.

“If you’re not down with the thought of kidnapping,” someone else replied, “don’t sit here.”

The men planned for all kinds of obstacles, but there was one they didn’t anticipate: The FBI had been listening in all along.

For six months, the Iraq War vet had been wearing a wire, gathering hundreds of hours of recordings. He wasn’t the only one. A biker who had traveled from Wisconsin to join the group was another informant. The man who’d advised them on where to put the explosives — and offered to get them as much as the task would require — was an undercover FBI agent. So was a man in one of the other cars who said little and went by the name Mark.

Just over three weeks later, federal and state agents swooped in and arrested more than a dozen men accused of participating in what a federal prosecutor called a “deeply disturbing” criminal conspiracy hatched over months in secret meetings, on encrypted chats, and in paramilitary-style training exercises. Seven of the men who had driven to Birch Lake that night would end up in jail.

The case made international headlines, with the Justice Department touting it as an example of law enforcement agencies “working together to make sure violent extremists never succeed with their plans.” Prosecutors alleged that kidnapping the governor was just the first step in what some on the right call “the Big Boog,” a long-awaited civil war that would overthrow the government and return the United States to some supposed Revolutionary War–era ideal.

The defendants, for their part, see it very differently. They say they were set up.


The audacious plot to kidnap a sitting governor — seen by many as a precursor to the Jan. 6 assault on the US Capitol by hundreds of Trump-supporting protesters — has become one of the most important domestic terrorism investigations in a generation.

The prosecution has already emerged as a critical test for how the Biden administration approaches the growing threat of homegrown anti-government groups. More than that, though, the case epitomizes the ideological divisions that have riven the country over the past several years. To some, the FBI’s infiltration of the innermost circle of armed anti-government groups is a model for how to successfully forestall dangerous acts of domestic terrorism. But for others, it’s an example of precisely the kind of outrageous government overreach that radicalizes people in the first place, and, increasingly, a flashpoint for deep state conspiracy theories.

The government has documented at least 12 confidential informants who assisted the sprawling investigation. The trove of evidence they helped gather provides an unprecedented view into American extremism, laying out in often stunning detail the ways that anti-government groups network with each other and, in some cases, discuss violent actions.

An examination of the case by BuzzFeed News also reveals that some of those informants, acting under the direction of the FBI, played a far larger role than has previously been reported. Working in secret, they did more than just passively observe and report on the actions of the suspects. Instead, they had a hand in nearly every aspect of the alleged plot, starting with its inception. The extent of their involvement raises questions as to whether there would have even been a conspiracy without them.

A longtime government informant from Wisconsin, for example, helped organize a series of meetings around the country where many of the alleged plotters first met one another and the earliest notions of a plan took root, some of those people say. The Wisconsin informant even paid for some hotel rooms and food as an incentive to get people to come.

The Iraq War vet, for his part, became so deeply enmeshed in a Michigan militant group that he rose to become its second-in-command, encouraging members to collaborate with other potential suspects and paying for their transportation to meetings. He prodded the alleged mastermind of the kidnapping plot to advance his plan, then baited the trap that led to the arrest.

This account is based on an analysis of court filings, transcripts, exhibits, audio recordings, and other documents, as well as interviews with more than two dozen people with direct knowledge of the case, including several who were present at meetings and training sessions where prosecutors say the plot was hatched. All but one of the 14 original defendants have pleaded not guilty, and they vigorously deny that they were involved in a conspiracy to kidnap anyone. . .

Continue reading. There’s much more.

Written by Leisureguy

21 July 2021 at 12:58 pm

“My Deep, Burning Class Rage”

leave a comment »

Charlotte Cowles recounts in The Cut how a woman, who understandably wishes to remain anonymous, encounters financial inequity among those she knows and how it affects her. It begins:

Get That Money is an exploration of the many ways we think about our finances — what we earn, what we have, and what we want. In Living With Money, we talk to people about the stories behind their bank balances. Here’s how a 40-year-old woman in New York copes with “class rage” — the feeling that all her friends and colleagues are wealthier, and she’ll never be able to catch up.

I define class rage pretty specifically. It’s how I feel when I think that someone is in a similar financial situation to me, and then I discover that they actually have this extra source of money. When I was younger, it was like, “Oh wait, you come from a rich family.” But now it’s like, there’s a secret trust fund. Or a wealthy spouse. At my core, I believe that if you have money, your life is easier. If a person grew up rich, or with relative financial security, then I just can’t relate to them at all.

I work in book publishing in New York, which definitely compounds this problem. The publishing world is full of wealthy people — like a lot of creative industries, it has some glamour but it doesn’t pay well. So if you want to live comfortably, it helps if you have another income source. And these aren’t the types of wealthy people who flaunt their money. They tend to be more embarrassed about it. So they downplay it, like, “Oh, I’m just a poor book editor. I just do this job because I love literature.” And I’m like, no! You do this job because you can! That’s what really gets to me.

I don’t feel this way toward rich people in general, like celebrities or bankers on Wall Street. It’s not about rich people who make a lot of money at their jobs. Instead, I feel it toward people who have always had money — who’ve had this sense of backup that allows them to experiment in life and do what they want. I’m so jealous of that built-in freedom.

I know that these are unfair assumptions, and I might sound like a terrible person. I have plenty of rich colleagues who still work hard and are nice, good people. I hate that I feel this way. And I’m sure that lots of people might feel the same way about me — money and resources are all relative. But I have quite a bit of debt and my whole life feels so tenuous sometimes. I’m 40 and I’m single and I spend almost all of my money on rent and I’m constantly stressed about finances. I blame a lot of my problems on money, even though I know that’s irrational — they’re not really money problems. I just can’t shake the fact that if I had more financial security, my life would be much better. I don’t get jealous about material things — it’s lifestyle stuff, like having the freedom to go out for dinner without having to go consign my clothes to pay for it, which I have definitely done.

I was always jealous of people with money. When I was growing up, my dad was a high school teacher and my mom mostly did temp work, sort of picking up jobs where she could. We weren’t dirt poor, but it was very hand-to-mouth. Money was always an issue. At one point when I was a kid, my dad got cancer and the medical bills put my parents into a lot of debt. They tried not to make a big deal out of it, but there was always just this level of concern. There was no cushion. We had one car and it was always breaking down. I always knew that if I wanted anything, I’d have to work very hard for it, probably harder than most people I knew. Asking my parents for money was and is definitely never an option.

When I went to college, that was the first time I noticed a real divide between people who had money and people who didn’t, because some of us needed jobs. It was also the first time I became aware of how it impacted how you could perform. Like, I had to work three nights a week, so I literally didn’t have as much time to spend on my assignments as I wanted to. When I moved to New York in my mid-20s for grad school, I saved up for a year beforehand, working seven days a week, often double shifts. I got a full scholarship, but I still had to pay rent and support myself. And I’ve just been in survival mode ever since. When I finished my MFA, I was earning $25,000 a year and my rent was $1,200 a month. You do the math.

In grad school, I saw a whole new level of privilege. I was working three jobs and my friends and I would talk about struggling with money and then I’d realize that their parents were paying their rent. Or they could charge things to their parents’ credit cards in an “emergency.” Or that some of them had never had a job before in their lives. I became aware of the sheer amount of money that had gone into some of these people. Like, between their private schooling and Ivy League college and grad school, that’s more money than I’ll ever make in my lifetime. To be this walking investment, with this price tag on your life — I can’t understand what that would feel like. I’m sure there’s some pressure, and that must suck. But at the same time, the road has been smoothed for you.

One of my close friends from my MFA program, we had pretty similar career struggles and worked in very similar jobs, and it seemed like we were on a similar path in life. And I did occasionally notice that she’d say something like, “Oh, my family has this little ramshackle cabin in the woods somewhere, it’s covered in cobwebs,” when she really should have said, “I have a ski house in Colorado.” But I didn’t really know the extent of it until she had a baby. And that’s when a line was drawn. Suddenly, she was looking at real estate, buying an apartment, hiring a full-time nanny. And I’m not proud of this, but it changed how I felt about our entire relationship. I felt deceived. I know that people shouldn’t have to declare how much money they have in their family as a prerequisite for friendship. But it was more that what had felt to me like a shared struggle wasn’t real for her. When we had talked about our worries, about our careers and our futures, all those conversations suddenly felt tainted. It’s possible that she was doing it just to fit in and be friendly. But I felt like I’d been fooled.

We drifted apart after that, which is what usually happens when I find out about somebody’s money. I’ve never gotten in a fight over it. I just sort of stew, and then there’s this psychological distance that emerges.

I can’t do a lot of things because of money. Everyone says that — “I’m too poor, I can’t go out.” And that enrages me because I really mean it. It’s isolating, because I can’t talk about it. I can’t say, “I have $7 in my checking account,” because it scares people. And no one wants to be around someone who complains about money. I definitely have had to cut out a lot of acquaintances and networking opportunities because I cannot afford to just meet for a drink. I’m sure that people think that I’m depressed or I’ve just drifted off or something, but it’s really just the money.

I completely understand why people downplay their wealth. I would probably do the same thing if I were around someone with a lot less money than me. But what annoys me is the hypocrisy of it, acting like you haven’t had a leg up. I would just prefer people to be honest. Just accept that you’re privileged. Accept that you’re lucky. Accept that certain things are easier for you because of money. But people never do. Sometimes I wonder if they’re even aware.

What really haunts me is when I feel like I’ve been . . .

Continue reading.

Written by Leisureguy

21 July 2021 at 12:48 pm

Our Workplaces Think We’re Computers. We’re Not.

leave a comment »


Somewhat related is a quotation I just encountered:

“The question of whether a computer can think is no more interesting than the question of whether a submarine can swim.” — Edsger W. Dijkstra

In the NY Times there is an interesting podcast, with this introduction:

For decades, our society’s dominant metaphor for the mind has been a computer. A machine that operates the exact same way whether it’s in a dark room or next to a sunny window, whether it’s been working for 30 seconds or three hours, whether it’s near other computers or completely alone.

But that’s wrong. Annie Murphy Paul’s The Extended Mind argues, convincingly, that the human mind is contextual. It works differently in different environments, with different tools, amid different bodily states, among other minds.

Here’s the problem: Our schools, our workplaces, our society are built atop that bad metaphor. Activities and habits that we’ve been taught to associate with creativity and efficiency often stunt our thinking, and so much that we’ve been taught to dismiss — activities that look like leisure, play or rest — are crucial to thinking (and living!) well.

Paul’s book, read correctly, is a radical critique of not just how we think about thinking, but how we’ve constructed much of our society. In this conversation, we discuss how the body can pick up on patterns before the conscious mind knows what it’s seen, why forcing kids (and adults) to “sit still” makes it harder for them to think clearly, the connection between physical movement and creativity, why efficiency is often the enemy of productivity, the restorative power of exposure to the natural world, the dystopian implications of massive cognitive inequality, why open-plan offices were a terrible idea and much more.

You can listen to our whole conversation by following “The Ezra Klein Show” on Apple, Spotify, Google, or wherever you get your podcasts.

(A full transcript of the episode is available here.)

Written by Leisureguy

21 July 2021 at 12:38 pm

How a solitary prisoner decoded Chinese for the QWERTY keyboard

leave a comment »

Would it have been easier and faster if he had used the Dvorak keyboard? 🙂 In Psyche, Thomas S. Mullaney, professor of Chinese history at Stanford University, gives a fascinating account that shows the amazing way the brain works. He writes:

In China, suburban garages do not factor in the lore of computing history the way they do in the United States. But prisons do – at least, one particular prison in which a brilliant Chinese engineer was sentenced to solitary confinement for thought crimes against Mao Zedong during China’s Cultural Revolution. His name was Zhi Bingyi and, during long and anxiety-ridden days, months and years of solitude, he made a breakthrough that helped launch China’s personal computing revolution: he helped make it possible to type Chinese with a run-of-the-mill Western-style QWERTY keyboard.

Zhi was born in 1911 on China’s eastern coast, in Jiangsu province. His generation shouldered an almost unbearable burden: a mandate to dedicate their lives to the modernisation of their country. Zhi completed his undergraduate education in 1935, receiving a degree in electrical engineering from Zhejiang University. He moved to Germany in 1936, receiving his doctorate in 1944 from the University of Leipzig. He spent nearly 11 years in Germany, becoming fluent in the language, and marrying a German woman.

Upon the couple’s return to China in 1946, Zhi held a variety of distinguished posts, yet his long-time experience overseas made him suspect in the eyes of the still-nascent Chinese Communist Party regime following the 1949 revolution. When the Cultural Revolution erupted in 1966, Zhi became a marked man. Named a ‘reactionary academic authority’ (fandong xueshu quanwei) – one of the era’s many monikers for those condemned as enemies of the revolution – he was confined in one of the period’s infamous ‘ox pens’. The cell measured a claustrophobic six square metres. Outside its four walls, China descended into the political turmoil of the Cultural Revolution. In his hometown of Shanghai, fanatics and paramilitary groups pledged undying loyalty to the person of Chairman Mao. In the early months of the crisis, bands of radical youth set out upon ‘seek and destroy’ raids intent on purging the country of all pre-revolutionary vestiges of ‘Old China’.

Unsure if he would ever see his wife again, with no other voices besides his guards’, and with no work to occupy his mind, Zhi filled the long hours staring at the wall of his cell – specifically, at an eight-character poster that made a chilling assurance to him and anyone unfortunate enough to set their eyes upon it:

坦白从宽,抗拒从严
(tanbai congkuan, kangju congyan)
‘Leniency For Those Who Confess, Severity For Those Who Resist’

The message was clear: We have the authority to destroy your life (if you resist). Or to make your imprisonment somewhat more tolerable (if you confess).

Zhi read this terrifying couplet over and over again, for days, weeks and months on end. And then something began to happen – something that reminds us of the inherent strangeness of language.

No matter one’s mother tongue, the process of becoming fluent in a language is a process of forgetting that language is a form of arbitrary code. There is nothing inherently ‘candid, frank, or open’ about the character 坦 (tan), nor ‘white, blank, or clear’ about the character 白 (bai). As with any young child, Zhi in his earliest years of life would have looked upon these symbols as random assemblages of pen strokes on the page, born of a complex web of conventions whose origins we will never be able to reconstruct in full. But steadily, over the course of innumerable repetitions, something happens to us: the sounds and sights of language begin to approach, and then to achieve, a kind of natural, god-givenness. The character 白 (bai) no longer ‘stands in’ for whiteness by dint of painstaking study and memorisation, but merges with it effortlessly. This merger is the fruition of every child’s struggle to speak, read and write: the struggle to make inroads into their family and community’s semiotic universe, transforming it from an indecipherable code to a medium of expression.

While most of us experience this transformation as a one-way process, it can be reversed. A sound or symbol made second-nature can be denatured – defamiliarised and queered, in which one is somehow able to tap into the original meaninglessness of one’s mother tongue, even as one continues to be able to hear, see and speak it fluently.

This is what happened to Zhi. As he whiled away his time in prison, mulling over these eight characters (seven, if we account for one character that is repeated), this act of repetition restored to them their inherent arbitrariness. By the 100th reading – perhaps the 1,000th, we cannot know – Zhi began to explode these characters in his mind, into a variety of elements and constellations. The first character (坦), for example, could be readily divided into two distinct parts: 土 and 旦, and then further still into + and − (making up the component 土) and 日 and 一 (making up 旦). The second character 白 could be subdivided, as well, perhaps into 日, with a small stroke on top. Then the rest. Even in this short, eight-character passage, the possibilities of decomposition were abundant.

Zhi managed to get hold of a pen – the one he was given to write political self-confessions – but paper was impossible to find. Instead, he used the lid of a teacup, which his captors provided him to drink hot water. When turned over, Zhi discovered, the lid was large enough to fit a few dozen Latin letters. Then he could erase them and start again, like a student in ancient Greece with an infinitely reusable wax tablet. And so he mulled over each character one by one, decomposing them into elements, and then converting those elements into letters of the Latin alphabet.

He was creating a ‘spelling’ for Chinese – although not in the conventional sense of the word.

In Zhi’s system, the letters of the Latin alphabet would not be used to spell out the sound of Chinese words. Nor would they be used to ‘compose’ them per se. Instead, he envisioned using Latin letters to retrieve one’s desired Chinese character from memory. For him, Latin letters would be the instructions or criteria one fed to a machine, telling the device to, in effect, ‘retrieve the Chinese characters that match these requirements’.

Take the example of fu (幅), a Chinese character meaning ‘width’. Ultimately, Zhi settled upon an unexpected ‘spelling’ for this character, which bore no resemblance to its sound: J-I-T-K. The first letter in this sequence (J) corresponded not to the phonetic value of the character (which should begin with ‘F’) but to a structural element located on the left-most side of the character: the component 巾 that, when seen in isolation, is pronounced jin. The code symbol ‘J’ was derived from the first letter of the pronunciation of the component.

The rest of the spelling – I, T and K – followed the same logic. ‘I’ was ‘equal to’ the component/character yi (一); ‘K’ referred to the component kou (口); and ‘T’ to tian (田). Other letters in Zhi’s code performed the same role:

D = the structure 刀 (with ‘D’ being derived from dao, the pronunciation of this character when seen in isolation)
L = 力 (same logic as above, based on the Pinyin pronunciation li)
R = 人 (same logic as above, based on the Pinyin pronunciation ren)
X = 夕 (same logic as above, based on the Pinyin pronunciation xi)

Zhi eventually gave his code a name: ‘See the Character, Know the Code’ (Jianzi shima), ‘On-Site Coding’ (OSCO), or simply ‘Zhi Code’ (Zhima).
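To make the retrieval idea concrete, here is a minimal sketch in Python. This is my own illustration, not Zhi’s actual system: the letter-to-component table follows the article’s examples, and the one-entry lexicon is an assumption for demonstration only.

```python
# Toy sketch of Zhi-style "See the Character, Know the Code" retrieval.
# Letters stand for structural components (J = jin, I = yi, etc.), and
# a character is retrieved by the code letters of its components.

COMPONENT_CODES = {
    "J": "巾",  # jin
    "I": "一",  # yi
    "T": "田",  # tian
    "K": "口",  # kou
    "D": "刀",  # dao
    "L": "力",  # li
    "R": "人",  # ren
    "X": "夕",  # xi
}

# Code sequence -> full character (hypothetical lexicon fragment,
# seeded only with the article's example).
LEXICON = {
    "JITK": "幅",  # fu, 'width' -- the example from the article
}

def retrieve(code: str) -> str:
    """Look up the character whose component code matches `code`."""
    try:
        return LEXICON[code.upper()]
    except KeyError:
        raise KeyError(f"no character stored for code {code!r}")

if __name__ == "__main__":
    code = "JITK"
    components = [COMPONENT_CODES[c] for c in code]
    print(code, "->", retrieve(code), "via components", " ".join(components))
    # prints: JITK -> 幅 via components 巾 一 田 口
```

Note that, as the article says, the letters do not spell the character’s sound; they are criteria fed to a lookup, which is why the scheme suits a machine with a QWERTY keyboard.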

In September 1969, Zhi was released from prison, rejoining his wife and family at their apartment on South Urumqi Road, in Shanghai – albeit in a form of prolonged house arrest.

Other changes were afoot, as well. In 1971, the United Nations recognised Beijing as the official representative of China, granting the country a seat on the Security Council. In 1972, Richard Nixon shocked the world with the first US presidential delegation to the People’s Republic of China (PRC). In 1976, Mao died of cancer, setting in motion a profound sweep of political, economic and social transformations. Then, in 1979, the gates opened even wider, with the normalisation of relations with the US.

One of the many changes that Sino-US normalisation brought was an influx – first a drip, then a flood – of US-built computers . . .

Continue reading. There’s more.

Written by Leisureguy

21 July 2021 at 10:56 am

“Powerful” detectors: Gender influence?

with one comment

Years ago my friend Robert Spaeth showed me an ad for a “powerful radio, able to pull in distant stations.” As he pointed out, radio receivers do not go out and pull in stations, and rather than “powerful,” good receivers are sensitive, able to detect very faint signals. He thought the ad was silly, and I think he was right.

The Big Bang article I blogged yesterday included this:  “There should be gravitational wave backgrounds as well, from a variety of eras in the early universe. Once again these haven’t been detected yet, but more powerful gravitational wave telescopes may yet see them.”

“Powerful” telescopes? Detecting faint signals does not require “power,” it requires being highly sensitive. Would one say that a scale that could detect and measure extremely light weights — the weight of a fly’s footprint, for example — was “powerful”? or “sensitive”? I would say an instrument capable of detecting very faint signals or very small events is “sensitive.” A machine that crushes cars or hoists tons of steel is “powerful,” but not a device that detects faint signals.

It strikes me that this misuse of “powerful” stems from a culture that embodies a gender-specific view: astrophysicists (and electrical engineers) are almost all male, and many males are fixated on strength and power and command, and are impatient with things like sensitivity and receptivity because those strike them as “feminine” (and thus weak).

Their male ideal is being active, driving what is done, being in control, having power, and getting their demands met. The idea of something very sensitive, waiting to receive faint signals, is (on an unconscious level) incongruent with what they feel is their proper persona and role. Thus, they recast a sensitive detector as a “powerful” detector, a detector that can go out there and use its strength and power to get the job done.

Using “powerful” in that context is somewhat akin to Abraham Maslow’s “If all you have is a hammer, everything looks like a nail” — those who (psychologically) depend on strength and activity will assume that a device which works well at accomplishing a task must be one that has lots of strength and thus is “powerful.”

In fact, I believe that many men want very much not to be sensitive, for they view being sensitive as being weak (rather than as being perceptive). Gender-enforced attitudes can be limiting.

Written by Leisureguy

16 July 2021 at 1:34 pm

“Leaving burnout behind: the pain and pleasure of starting a new career in my 50s”

leave a comment »

Lucy Kellaway has in the Guardian an extract from her book Re-educated: How I Changed My Job, My Home, My Husband and My Hair:

I spent 30 years as a journalist before deciding to become a secondary school teacher. While a complete career change is rare, it is one of the best moves I ever made

I had my first midlife crisis in 2006. It started at 7am on a cold January morning when my mother got out of bed, made herself a cup of tea, had an aneurysm and died.

I was a 46-year-old married newspaper columnist with four children, who appeared to be living a more than satisfactory life. But as the sudden axe of grief fell, I looked at my career, which was going better than I’d ever thought possible, and thought: I don’t want this any more.

Mum had been a brilliant teacher at Camden School for Girls (where I also went in the 1970s), and even though the idea of teaching had always seemed horrible to me (too much work, too little money, no glamour, no recognition, really nothing to recommend it at all) I started to research postgraduate certificate in education (PGCE) courses. I was greeted by the smiling faces of 22-year-old trainees and thought: damn, I’ve left it too late. So I banished teaching from my mind and went back to doing what I had already been doing for 20 years: writing sarky columns and interviews for the Financial Times.

My next crisis, the one that brought the whole thing crashing down, happened 10 years later. This time it was my father’s death that started it.

In the raw days after Dad’s funeral I once again found myself Googling PGCE courses and was again greeted by pictures of twentysomething teachers. This time, instead of thinking I was too old, I thought: I don’t care, I’m doing it anyway. A couple of months later, I marched into the FT editor’s office and told him I was leaving to be a maths teacher – and setting up a charity to encourage other people my age to do the same thing.

“You sure about that?” he asked.

“Yes,” I said.

When I told people what I was planning to do, they all either said I was mad or (which amounted to the same thing) brave. But jacking in journalism to become a teacher so late in life wasn’t brave – it was desperate. Though I didn’t admit it at the time, I was entirely burnt out – I had been at the same place for an interminably unimaginative 32 years – and was showing the classic symptoms. I was cynical about the value of what I did and of journalism as a whole – what was all this crazy chasing of ephemera really for? I also felt the columns I was writing were rubbish. The very thought of writing another one was making me feel so sick I had to find a way out and do something else entirely.

Secondly, there was little financial sacrifice in quitting. Even though my new salary as a trainee would be barely a fifth of my old one, I owned my house and had savings as I had never spent anything like the money I earned. I also had a pension that would start in five years’ time.

It would have been much braver (and much madder) for me to quit at 47 when my children were all at school and required a certain amount of policing, feeding, homework assistance and financial support. Back then, I was still in thrall to the status of what I did (though at the time I would have denied that). The Financial Times was part of my identity – it was the impressive part. I feared that without it people wouldn’t want to know me any more. I wouldn’t be asked to things. There would be no more invitations to the champagne opening evening of the Chelsea flower show. Ten years on, the appeal of status had worn very thin – I knew my close friends would still like me if I was a teacher, and if I wanted to go to the Chelsea flower show that badly I could always buy my own ticket.

In the end, leaving wasn’t hard. There was almost no jeopardy. The only risk was one I had manufactured myself: having set up a . . .

Continue reading.

Written by Leisureguy

15 July 2021 at 2:08 pm

Keeping People Out of Jail [for minor crimes] Keeps People Out of Jail [by curtailing major crimes]

leave a comment »

David Byrne (of Talking Heads fame, and also founder of the site) writes in Reasons to Be Cheerful:

Both sides of the political fence in the U.S. agree that mass incarceration isn’t working. It is expensive, discriminatory and has serious societal consequences. Crime has, in general, been trending down for decades (even in 2020, despite public perception) while prisons just keep filling up.


 The partisans may disagree on the best way to lower the prison population, but the good news is they agree it has to happen. The present system is unsustainable.

One way of reducing mass incarceration is to simply start ignoring certain laws. Some 80 percent of cases filed nationally are for misdemeanors. These are the types of crimes that are often victimless, but that can mess up the life of the person prosecuted for them. A few places have addressed this in the most straightforward way possible: by not automatically prosecuting these crimes. What has happened as a result? Studies have shown that these places reduced their prison populations without putting the public at risk. Crime did not go up. In fact, in many cases, it went down. And, surprisingly, often not just for misdemeanors. 


A seemingly radical idea

The consequences not just for the individual but for society and the economy begin well before someone is actually incarcerated. Simply being prosecuted, having a record, becomes a disadvantage for life. It can make it harder to get a job, to vote, to get a loan for an education or a mortgage for a home. Minor nonviolent infractions can leave one disadvantaged forever. They can effectively ruin a life.

Rachael Rollins, the district attorney of Suffolk County, which includes Boston, was well aware of this when she did something that seemed radical upon being elected in 2018. The county stopped automatically prosecuting people for small crimes: minor drug possession, shoplifting, disorderly conduct and other nonviolent offenses. A study released a year later showed that this change prevented a large number of folks who were charged with these offenses from being funneled into the criminal justice system. But it also had a broader effect: violent offenses in Suffolk County went down by 64 percent, and even traffic offenses decreased by 63 percent. 

Why would declining to prosecute people for low-level crimes also reduce other types of crimes? The study, by the National Bureau of Economic Research, found that the key is keeping folks out of the criminal justice system. Doing so reduced the odds by 58 percent that these folks would engage with that system in the future. So, to be clear, this doesn’t suddenly empty out the prisons — it’s not retroactive — but it dramatically slows the flow of folks being incarcerated, which, in turn, reduces the chances that those people will commit future crimes. As the presently incarcerated end their sentences and leave, there won’t be the same flow of new prisoners coming in to replace them. It seems to me this is incredibly good news — both sides of the political divide should be happy.

I decided to call up the three authors of this study to see what they felt were the implications of their research on this policy. It turns out they were as pleasantly surprised by the results as I was. 

The authors of the study are Amanda Agan (Rutgers University), Anna Harvey (New York University) and Jennifer Doleac (Texas A&M University).

DB: Can you summarize your results for our readers?

AA: Our study found that, at least for certain defendants [mostly first time offenders], non-prosecution — not moving forward with charging an individual defendant — actually reduces the probability that that individual ends up back in the criminal justice system, and so reduces recidivism and reduces future criminal justice contact. 

We wanted to study this because there’s a potential tension: if we’re going to choose to not prosecute somebody, is this going to embolden them to go on to commit more crimes? Or is it going to kind of put them on a better path and allow them that second chance to potentially reduce their criminal justice involvement?  And we’ve found it’s the latter, that this is reducing recidivism in the future.

DB: What presently happens when these minor offenses are prosecuted?

AH: This is really important for your readers to understand: in most jurisdictions, when you’re arrested for a crime, that goes on to what is potentially a lifetime permanent criminal record maintained by the state criminal record agency, that potentially, depending on the state statutes, can show up to employers when they conduct a background check, and it can show up to law enforcement, police officers and prosecutors in the future. 

There’s really good evidence to suggest that [being prosecuted] changes people, the way you’re treated down the road, potentially, for a lifetime. What we’re studying are nonviolent misdemeanor arrests, which are, in most of the cases we study, later dismissed. What we’re finding is that defendants whose cases the prosecutors decline to pursue don’t receive a criminal record. And that seems to have a really beneficial effect.

DB: How did you, as the authors of the study, know for sure that it was leniency in prosecuting that had this effect? 

AA: What we were doing was a little different. We were taking advantage of the fact that in Suffolk County, the person who decides whether to charge you with a crime or not is basically randomly assigned to your case. You have no control over whether me, Jen or Anna is going to be the one that’s going to end up making a decision about whether to charge you or not. 

[For example], it turns out Jen is really lay-down-the-law. She’s really going to want to prosecute people. And Anna, Anna is super lenient, she really likes to give second chances, just kind of by nature. 

And so we’re using that luck of the draw, that some offenders happen to get Anna and others happen to get Jen. We try to understand what happens when you get Anna and you don’t get that criminal record. What is the effect on your future recidivism versus if you got that harsher prosecutor? 
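To make the “luck of the draw” design concrete, here is a toy simulation in Python (invented numbers, not the study’s data or code): because defendants draw their prosecutor at random, any gap in later recidivism between the two groups reflects prosecution itself rather than the defendants’ own characteristics.

```python
import random

# Toy simulation of the natural experiment described above. All the
# rates below are assumptions for illustration, not the NBER study's
# estimates: random assignment is what makes the comparison causal.

random.seed(0)

BASE_RISK = 0.15        # assumed recidivism risk with no record
RECORD_PENALTY = 0.20   # assumed extra risk from carrying a record

def simulate_defendant(lenient: bool) -> bool:
    """Return True if this (simulated) defendant recidivates."""
    # Lenient prosecutors charge far fewer defendants (assumed rates).
    prosecuted = random.random() < (0.2 if lenient else 0.8)
    risk = BASE_RISK + (RECORD_PENALTY if prosecuted else 0.0)
    return random.random() < risk

n = 100_000
lenient_rate = sum(simulate_defendant(True) for _ in range(n)) / n
strict_rate = sum(simulate_defendant(False) for _ in range(n)) / n

print(f"recidivism with lenient prosecutor: {lenient_rate:.3f}")
print(f"recidivism with strict prosecutor:  {strict_rate:.3f}")
# Since assignment is random, the gap between the two rates estimates
# the effect of prosecution, scaled by the difference in charging rates.
```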

DB: What kinds of offenses are we talking about?

AA: They’re all nonviolent misdemeanors like disturbing the peace, trespassing, some low-level kinds of theft or shoplifting, minor drug possession. There are also some more serious kinds of traffic or moving violations that move into the realm of criminality, rather than just citations or traffic tickets, that are going to kind of make up a majority of these nonviolent crimes that we’re talking about.

AH: Misdemeanors are often things like driving with an expired registration, or driving with an expired license, driving with expired insurance — basically, driving without the right paperwork. Who doesn’t forget to renew stuff? 

AA: In a sense [this is] criminalizing poverty. [Low-income folks] don’t have the time or resources to go and handle some of these problems.

As these researchers point out, once you’ve engaged with the criminal justice system it can be a slippery slope, so keeping folks out of it in the first place can have a huge knock-on effect. It prevents future crime.

AH: One of the things that we’re finding is that even though the cases that we’re studying are only these nonviolent misdemeanor offenses, when you prosecute a first-time nonviolent misdemeanor, offender, person, individual, they’re more likely to come back — not just on another non-violent misdemeanor offense, they’re more likely to come back on a violent offense and a felony offense. 


Charm City lives up to its name

In March 2020 in Baltimore, State’s Attorney Marilyn Mosby tried a similar experiment initially prompted by the increased risk of Covid spreading in prisons. Her office would no longer prosecute a host of minor nonviolent charges: limited drug possession, prostitution, minor traffic infractions, misdemeanors and trespassing. This doesn’t mean all these things became legal, but it does mean that if you get arrested for them, you probably won’t be locked up. 

What happened? Well, no surprise, crime rates dropped suddenly. Which doesn’t mean folks stopped doing these things — only that they weren’t being prosecuted for them. But what’s interesting is that it wasn’t just those nonviolent crime rates that dropped. Violent crime dropped 20 percent too, and property crime dropped 36 percent. 39 percent fewer people overall got caught up in the criminal justice system, which is what you’d expect if some charges are not prosecuted. But, as in Suffolk County, it seems the reduction extended well beyond those nonviolent crimes. This also helps reduce discrimination, as it is mostly people of color who get caught up in the system. 

This past March, after the experiment proved to be successful, it was made permanent. When police realized these offenses were not being prosecuted they stopped arresting folks for them — for instance, there were 80 percent fewer arrests for drug possession. That allowed prosecutors to focus on violent crimes instead of these misdemeanors, which, according to some research, results in an increase in public safety. 

The cops were skeptical at first. The police commissioner expected crime to rise, but it continued to go down — even when it rose in many other big cities during the pandemic. Johns Hopkins University in Baltimore did a follow-up study and found that of 1,431 folks who had charges dropped in this experiment, only five ended up being arrested again, which is considered pretty incredible.

Baltimore is now also following the example of the CAHOOTS program in Eugene, Oregon, previously written about here, by directing some calls about nonviolent incidents to the Baltimore Crisis Response, Inc., a behavioral health organization, rather than to the police. People in crisis get help from trained social workers instead of dealing with the police and risking the possibility of getting locked up. The police commissioner there has since come out in support of police not being expected to be social workers.


It’s catching on

From NBC News: . . .

Continue reading. There’s more.

Written by Leisureguy

14 July 2021 at 12:40 pm
