Later On

A blog written for those whose interests more or less match mine.

Archive for the ‘Psychology’ Category

A history of FLICC: the 5 techniques of science denial

The above illustration is from a really excellent post about the techniques deniers use, often unwittingly (that is, some deniers simply cannot think very well — that’s not a good thing, but it’s better than being cynically deceptive). The post includes some interesting videos, so clicking the link is a good idea. The post begins:

In 2007, Mark Hoofnagle suggested on his Science Blog Denialism that denialists across a range of topics such as climate change, evolution, & HIV/AIDS all employed the same rhetorical tactics to sow confusion. The five general tactics were conspiracy, selectivity (cherry-picking), fake experts, impossible expectations (also known as moving goalposts), and general fallacies of logic.

Two years later, Pascal Diethelm and Martin McKee published an article in the scientific journal European Journal of Public Health titled Denialism: what is it and how should scientists respond? They further fleshed out Hoofnagle’s five denialist tactics and argued that we should expose to public scrutiny the tactics of denial, identifying them for what they are. I took this advice to heart and began including the five denialist tactics in my own talks about climate misinformation.

In 2013, the Australian Youth Climate Coalition invited me to give a workshop about climate misinformation at their annual summit. As I prepared my presentation, I mused on whether the five denial techniques could be adapted into a sticky, easy-to-remember acronym. I vividly remember my first attempt: beginning with Fake Experts, Unrealistic Expectations, Cherry Picking… realizing I was going in a problematic direction for a workshop for young participants. I started over and settled on FLICC: Fake experts, Logical fallacies, Impossible expectations, Cherry picking, and Conspiracy theories. . .

Continue reading.

Written by Leisureguy

20 October 2021 at 12:00 pm

Steven Pinker and the Apocalypse

Robert Wright in the Nonzero Newsletter:

Steven Pinker, in his new book Rationality, says he sees a paradox within the world view of the woke—at least, those of the woke who subscribe to postmodernism.

On the one hand, postmodernists “hold that reason, truth, and objectivity are social constructions that justify the privilege of dominant groups.” On the other hand, their moral convictions “depend on a commitment to objective truth. Was slavery a myth? Was the Holocaust just one of many possible narratives? Is climate change a social construction? Or are the suffering and danger that define these events really real—claims that we know are true because of logic and evidence and objective scholarship?”

I guess he has a point (though, honestly, I’m not conversant enough in postmodern thought to say how many postmodernists are indeed hoist with this petard). But there’s also a kind of paradox within Pinker’s world view—not a logical contradiction, but an interesting tension.

Pinker is sympathetic to evolutionary psychology. (As am I; in 1994 I published an ev-psych manifesto called The Moral Animal that was favorably reviewed in the New York Times Book Review by… Steven Pinker.) And evolutionary psychology suggests that the human brain was designed by natural selection to, among other things, advance self-serving narratives that may stray from the truth.

Indeed, some of the “cognitive biases” that get so much attention these days—including in this newsletter and in Pinker’s new book—may exist for that very purpose. Confirmation bias, for example, leads us to uncritically embrace evidence that seems to support our world view, thus helping us mount rhetorically powerful arguments on behalf of our interests and the interests of groups we belong to.

Evolutionary psychology also suggests that people are naturally inclined to use whatever power they have to amplify these dubious narratives. So, really, Pinker should be willing to entertain the possibility that the world described by woke postmodernists—a world in which the powerful construct a version of reality that works to their advantage and to the disadvantage of the less powerful—is the real world.

I just recorded a conversation with Pinker for the Wright Show. (It will go public Tuesday evening, but paid newsletter subscribers can watch it—see below—now.) I somehow failed to ask him about this seeming harmony between evolutionary psychology and postmodernism—the fact that the two world views support similarly cynical views of human discourse. But that’s OK, because I’m pretty sure I know what he’d say in response:

Yes, he agrees that human nature inclines people to sometimes embrace self-serving falsehoods, and that this tendency can work to the advantage of the powerful and the disadvantage of the powerless. But he still diverges from the postmodernist perspective (as he defines it) by insisting that there is such a thing as objective truth, even if none of us has reliable access to it. He writes in Rationality, “Perfect rationality and objective truth are aspirations that no mortal can ever claim to have attained. But the conviction that they are out there licenses us to develop rules we can all abide by that allow us to approach the truth collectively in ways that are impossible for any of us individually.”

I share Pinker’s belief that . . .

Continue reading.

Written by Leisureguy

19 October 2021 at 5:07 pm

Spotting liars is relatively easy with the right sort of questions

What seems like “several” years ago (though I see now it will be 15 years ago in 10 days), I blogged a technique for conducting a job interview: Hiring a STAR. As I noted in the post (in an update I wrote after thinking about it), the technique works because reality is very rich in detail, and thus so are our memories of it. That richness of detail makes detecting liars — or, in more tactful phrasing, bluffers — relatively easy: ask for details.

Cody Porter, Senior Teaching Fellow in Psychology and Offending Behaviour, University of Portsmouth, writes in The Conversation:

Most people lie occasionally. The lies are often trivial and essentially inconsequential – such as pretending to like a tasteless gift. But in other contexts, deception is more serious and can have harmful effects on criminal justice. From a societal perspective, such lying is better detected than ignored and tolerated.

Unfortunately, it is difficult to detect lies accurately. Lie detectors, such as polygraphs, which work by measuring the level of anxiety in a subject while they answer questions, are considered “theoretically weak” and of dubious reliability. This is because, as any traveller who has been questioned by customs officials knows, it’s possible to be anxious without being guilty.

We have developed a new approach to spot liars based on interviewing technique and psychological manipulation, with results just published in the Journal of Applied Research in Memory and Cognition.

Our technique is part of a new generation of cognitive-based lie-detection methods that are being increasingly researched and developed. These approaches postulate that the mental and strategic processes adopted by truth-tellers during interviews differ significantly from those of liars. By using specific techniques, these differences can be amplified and detected.

One such approach is the Asymmetric Information Management (AIM) technique. At its core, it is designed to provide suspects with a clear means to demonstrate their innocence or guilt to investigators by providing detailed information. Small details are the lifeblood of forensic investigations and can provide investigators with facts to check and witnesses to question. Importantly, longer, more detailed statements typically contain more clues to a deception than short statements.

Essentially, the AIM method involves informing suspects of these facts. Specifically, interviewers make it clear to interviewees that if they provide longer, more detailed statements about the event of interest, then the investigator will be better able to detect if they are telling the truth or lying. For truth-tellers, this is good news. For liars, this is less good news.

Indeed, research shows that when suspects are provided with these instructions, they behave differently depending on whether they are telling the truth or not. Truth-tellers typically seek to demonstrate their innocence and commonly provide more detailed information in response to such instructions.

In contrast,  . . .

Continue reading. There’s more.

Written by Leisureguy

18 October 2021 at 11:21 am

How to think like a detective

Ivar Fahsing, a detective chief superintendent and associate professor at the Norwegian Police University College in Oslo with 15 years’ experience as a senior detective in the Oslo Police department and at the National Criminal Investigation Service of Norway, has an interesting article in Psyche:

A criminal investigation is a complex, multifaceted problem-solving challenge. Detectives must make critical decisions rapidly – sometimes involving life and death – based on limited information in a dynamic environment of active and still-evolving events. Detectives are responsible and empowered under the law to make judgment calls that will dramatically affect the lives of those involved. The stakes are high, the settings are ugly, and there’s no room for error.

Detectives are often portrayed as misanthropic masterminds. They seem to possess almost mythical personal gifts that the average person can only dream of. I’m sorry to disappoint you, but this isn’t entirely true. Not all detectives are masterminds, and you actually don’t need to be a detective to think like one. A few tools and methods can improve your inner detective, help you find facts, and learn to better understand the relationship between them.

Most of us, whether we’re highly educated or not, have never actually learnt how to think and make safe judgments under pressure. Yet good thinking is important for every aspect of life. Learning how to think like an expert detective can boost your incisiveness and creativity. It can make you less judgmental and a better listener. Honing your detective-thinking skills could help you solve everyday issues, such as planning the perfect vacation or choosing the best job candidate.

I am a university academic, but I’m also a real-life detective myself – more specifically, I’m a detective chief superintendent at the Norwegian Police University College. I’ve worked on some of the worst crimes in Norway for 30 years. These days, I spend much of my time teaching police detectives and other investigators how to make safer decisions in serious and complex matters – and I’m going to share some of the basics with you in this Guide.

When I first started as a police officer, none of my fellow detectives, police academy teachers or criminal investigation department bosses were seemingly able, nor interested, in telling me in practical terms how to think like a detective. Instead, they talked about attitude, talent and experience. Most of all, they liked talking about old cases they’d solved. They never spoke about the cases they failed to solve or the next challenge. The most crucial tool of any successful investigator – namely, sharp reasoning skills – was also never mentioned. We were all very keen on formulating mental profiles of offenders. Yet, strangely, the idea of profiling the effective detective was almost taboo. It’s as if the ability to think like an expert detective was taken for granted.

In fact, what might at first seem akin to a supernatural gift is mostly a metacognitive skill, which means the ability to think about thinking. Anyone can learn to improve their metacognitive skill, but it doesn’t come easily. For most of us, it goes against our instincts. Consider the common cognitive bias known as WYSIATI or ‘what you see is all there is’, described by the Nobel Prize-winning psychologist Daniel Kahneman in his book Thinking, Fast and Slow (2011). WYSIATI refers to the fact that we typically make our judgments according to the information we have readily available – no matter how incomplete it is. We find it difficult to appreciate that there are still many things we don’t know. Another bias known as ‘confirmation bias’ compounds WYSIATI, and describes our tendency to seek out more evidence to support our existing beliefs or judgments. . .

Continue reading. There’s much more, including quite a bit of practical advice and a sample investigation.

Also, the references at the end of the article are useful:

Links & books

To develop your thinking skills, you need regular training and feedback. Can you solve the three switches puzzle hosted by Guardian News on YouTube? Clue: it helps to start thinking like a detective.

When it comes to examining your existing beliefs, perspective is everything. Are you prone to defending your viewpoint at all costs, like a soldier, or are you spurred on by curiosity, like a scout? In her TED talk ‘Why You Think You’re Right – Even If You’re Wrong’ (2016), the rationalist Julia Galef examines the motivations behind these two different mindsets and how they shape our interpretations of information. When your steadfast opinions are tested, Galef asks: ‘What do you most yearn for? Do you yearn to defend your own beliefs, or do you yearn to see the world as clearly as you possibly can?’

In this blog post for the UK’s Foreign, Commonwealth and Development Office, Gisle Kvanvig and I told the story of how, building on the work of British experts, we used the idea of a detective mindset to inform a new, more ethical approach to interviewing and investigation techniques in, for example, law enforcement. Following this approach, officers are trained to handle the interview room much like a crime scene where accurate, reliable and actionable information can be collected for the purpose of investigating the case.

The book Blackstone’s Senior Investigating Officers’ Handbook (5th ed, 2019) by Tony Cook is a unique one-stop guide to all the processes and actions involved in conducting major investigations, presented in a clear and understandable fashion.

For my PhD thesis The Making of an Expert Detective: Thinking and Deciding in Criminal Investigations (2016), I drew on theoretical frameworks developed in social and cognitive psychology to examine the degree to which individual and systemic factors can compensate for inherent biases in criminal detectives’ judgments and decision-making.

The book The Routledge International Handbook of Legal and Investigative Psychology (2019), edited by the psychologists Ray Bull and Iris Blandón-Gitlin, explores contemporary topics in psychological science, applying them to investigative and legal procedures. Featuring contributions from recognised scholars from around the globe (including myself), it brings together current research, emerging trends, and cutting-edge debates in a single comprehensive and authoritative volume.

The book Superforecasting: The Art and Science of Prediction (2015) by the political scientist Philip E Tetlock and the author Dan Gardner offers a deeper insight into prediction, drawing on decades of research and the results of a massive, US government-funded forecasting tournament. The Good Judgment Project involves tens of thousands of ordinary people – including a Brooklyn filmmaker, a retired pipe-installer, and a former ballroom dancer – who set out to forecast global events. Some of the volunteers have turned out to be astonishingly good. These ‘superforecasters’ have beaten other benchmarks, competitors and prediction markets. They’ve even beaten the collective judgment of intelligence analysts with access to classified information.

‘Correlation does not imply causation’: for decades, this mantra was invoked by scientists in order to avoid taking positions as to whether one thing caused another, such as smoking and cancer, or carbon dioxide and climate change. But today, that taboo is dead. The causal revolution has (seemingly) cut through a century of confusion, and placed cause and effect on a firm scientific basis. The Book of Why (2018) by the computer scientist Judea Pearl and the science writer Dana Mackenzie explains causal thinking to general readers, showing how it allows us to explore both the world that is and the worlds that could have been. It is the essence of human and artificial intelligence. And just as these scientific discoveries have enabled machines to think better, The Book of Why explains how we too can think better.

Written by Leisureguy

16 October 2021 at 12:03 pm

How — and Why — Trump Will Win Again

The future is notoriously difficult to predict accurately (though inaccurate predictions abound), but I fear this post by Umair Haque might well prove accurate:

Continue reading. There’s more, and it’s worth reading the whole thing.

Written by Leisureguy

14 October 2021 at 5:09 pm

Counties with more Confederate monuments also had more lynchings, study finds

Let me beat you to it: Correlation is not causation. However, the common cause in this case is evident. Gillian Brockell reports in the Washington Post:

It was 1898, and John Henry James was on a train headed toward certain death. The Black ice cream vendor had been falsely accused of raping a White woman, arrested and taken to a neighboring town to avoid a lynch mob. But the next morning, authorities put him on a train back to Charlottesville, where he was to be indicted at the Albemarle County Courthouse. He never made it; an angry crowd pulled him from the train outside of town and lynched him.

Within a few years, a Confederate monument nicknamed “Johnny Reb” went up at that same courthouse, along with some old Confederate cannons. Then came a statue of Stonewall Jackson next door, and two blocks away, a monument to Robert E. Lee.

The fact that James’s lynching and the erection of the memorials took place in the same era and the same area is not a coincidence, according to a report from the University of Virginia published Monday in the Proceedings of the National Academy of Sciences. It found that in formerly Confederate states, counties with more Confederate memorials also had more lynchings.

This “provides compelling evidence that these symbols are associated with hate” and racism, and not more innocuous things like “heritage” or “Southern pride,” the study’s authors concluded.

The study was led by social psychology researcher Kyshia Henderson, along with data scientist Samuel Powers and professors Sophie Trawalter, Michele Claibourn and Jazmin Brown-Iannuzzi at U-Va.’s Batten School of Leadership and Public Policy.

As recently as 2015, 57 percent of Americans saw the Confederate flag as representing “Southern pride” more than racism, according to a CNN poll. Seventy-five percent of White Southerners felt it represented pride, versus only 11 percent of Black Southerners. Those numbers had hardly moved from a similar poll taken 15 years earlier in 2000.

But the debate around Confederate symbols and memorials is not just a matter of opinion; it contains “testable questions,” Henderson said. “Specifically, we can test whether Confederate memorials are associated with hate.”

The team compared county-level data on lynchings between 1832 and 1950 with data on Confederate memorials. They found that in any given area, and even controlling for population and other demographic variables, the number of lynchings was a “significant predictor” of the number of memorials. . .

Continue reading. The study is not claiming that lynchings caused Confederate monuments, nor that Confederate monuments caused lynchings. Both spring from the same worldview.

Written by Leisureguy

14 October 2021 at 12:17 pm

Lessons Learned from Two Gun-Violence Epidemics

Daniel Webster writes at the Johns Hopkins School of Public Health:

I came to Johns Hopkins in 1987 to get my doctorate and focus on public policies that enhance public health and safety. Initially, I focused on reducing motor vehicle deaths, but gun violence was engulfing U.S. cities in the late 1980s, including Baltimore. In neighborhoods near our campus, young lives were being lost to gunfire. The sharp upward trajectory looked like that of an uncontrolled infectious disease. Encouraged by my adviser, Stephen Teret, I decided to focus my career on preventing gun violence.

The current surge in homicides reminds me of when I entered the field. Just as in the 1986–1994 epidemic of gun violence, Black Americans living in neighborhoods of concentrated poverty and disinvestment have been the hardest hit. Social factors, including structural racism, contributed to both epidemics, but so have the proliferation of firearms, weaknesses in gun laws, and problems enforcing gun laws.

There are some positive differences in our response to the current epidemic of gun violence compared to the prior one. Previously, the response was dominated by more arrests, more incarceration, and increased investment in law enforcement. This time, policymakers are funding public health approaches. Community violence prevention programs have become integral to local strategies. Reducing violence and abuses by police and policies to promote racial equity have become high priorities.

But as new solutions have emerged, so have new challenges. Firearms constructed with kits purchased online and DIY videos are a new pipeline for firearms. Social media amplifies conflicts that are often settled with gunfire. Egregious acts of police violence have curtailed effective partnerships between law enforcement and community groups.

I’ve studied community violence interruption programs in Baltimore since 2007 and have had the privilege of being friends with some of the programs’ brave workers like Dante Barksdale. He was an effective violence interrupter for Safe Streets Baltimore in the McElderry Park neighborhood and later worked for Baltimore City recruiting and mentoring violence interrupters.

Dante was murdered earlier this year. Like many others in Baltimore, I was devastated by his loss. I learned much from Dante and others in Safe Streets. They taught me about the deep deprivation and trauma that is common among those involved in gun violence. They also acknowledged that their ability to prevent shootings depends somewhat on law enforcement being a credible deterrent to gun violence. Sadly, that’s been lacking in Baltimore and many other cities.

While some in public health have called for abolition or dramatic defunding of police, I think public health professionals should partner with police to develop new models for community safety that minimize harms from the criminal justice system. Our goal should be to push law enforcement to focus more on eliminating racial disparities, reducing serious violence, and being accountable to community members. Our police should be trauma-informed and able to engage collaboratively with multiple sectors.

To stem the current surge in gun violence, we need smarter policies to reduce gun availability in risky contexts. The Center I lead has helped strengthen laws to keep firearms from individuals subject to domestic violence restraining orders and advance the adoption by 17 states and the District of Columbia of extreme risk protection order laws. Our research provides evidence that strong background check requirements that include handgun purchaser licensing can reduce the diversion of guns for criminal use, firearm-related homicides and suicides, mass shootings, and law enforcement officers being shot. We know that states with these policies have rates of civilians being fatally shot by law enforcement that are about a quarter as high as those of states that lack them. Our national surveys show that three-quarters of Americans support these policies and a similar proportion of gun owners in states with purchaser licensing support the laws.

We also are rigorously studying laws governing . . .

Continue reading. There’s more.

Many civilian deaths by firearms could be prevented by sensible modifications to gun policies.

Written by Leisureguy

13 October 2021 at 12:19 pm

Britain experiencing cultural blindness about Brexit

I have a post about cultural blindness, the specific examples being the microculture of basketball and informal fights between individuals. However, the phenomenon — when a culture is oblivious to something that is plainly visible to those outside the culture — crops up in other contexts, and Umair Haque points out how Britain has become culturally blind to the Brexit vote, shoving it down an Orwellian memory hole. He writes:

Continue reading. There’s much more.

Written by Leisureguy

12 October 2021 at 2:30 pm

Good Leaders Know You Can’t Fight Reality

Scott Edinger writes in the Harvard Business Review:

Summary: Acceptance is often misunderstood as approval or being against change, but it is neither. Acceptance is about acknowledging the facts and letting go of the time, effort, and energy wasted in the fight against reality. Your reality may be that you are falling behind on revenue, a competitor has outflanked you with a new product, or that the effects of the pandemic are still hurting your business. Whatever it is you’re facing, you can’t employ your best skills to deal with it until you stop the wrangle against reality and accept what you’ve been handed, ready to change things for the better. The author offers three kinds of acceptance that leaders should focus on: 1) accepting results, 2) accepting circumstances, and 3) accepting their own failings and those of others. [Stephen Covey talks about exactly this in terms of one’s Circle of Concern and Circle of Control — see this post. – LG]

The ability to accept reality is one of the most useful, and most misunderstood, skills for a leader. It’s a concept that has been around for centuries in philosophy and more recently in psychology, and properly applied can help drive change. As Carl Jung wrote, “We cannot change anything until we accept it. Condemnation does not liberate, it oppresses.” But I don’t see acceptance applied enough by leaders today as a valuable tool for achieving better results.

Acceptance may not sound like a hugely valuable skill, especially because we hear so much about leaders whose force of will seems to defy reality. The most notable example is Apple’s Steve Jobs, whose reputation for pushing people to do the impossible has become the stuff of legends. Jobs is reported to have distorted his employees’ sense of scale, making them believe that an unachievable task was possible — dubbed the “Reality Distortion Field” by his colleagues. While there is admirable value in this force of will, this characteristic is often exaggerated in leaders who lack the balancing counterweight of also accepting reality. As Jack Welch, the other most written about business leader of our time said, “Face reality as it is, not as it was or as you wish it to be.”

As a result, most of the poor leadership behavior I’ve observed has its roots in the inability to accept and work within the boundaries of what is happening, or the circumstances as they are. Unnecessarily harsh behavior, tantrums, aggressiveness, avoidance, and shutting people out can often be traced to leaders who are doing a lousy job of handling reality in the moment. A few years ago, I watched the CEO of a public company scream, “I will not accept this forecast!” at the president of one of his divisions, laced with some angry profanity. In the days that followed, the CEO and division president went back and forth, revising the revenue projections upward to more “acceptable” results, until the CEO finally OK’ed the next quarter’s forecast. While the numbers looked better on paper, they were not based on any real progress with customers or any reality within the business.

Fast forward to the end of quarter. The CEO was furious because the revenue numbers didn’t match the revised and more acceptable (to him) forecast. Ironically, the results were exactly on track with the initial forecast. As a result, the CEO abruptly initiated a round of layoffs and cut important internal investments to help the business operate more efficiently with customers. The numbers were telling the true story from the beginning, but the CEO wouldn’t accept and act on a reality he didn’t like. This created an avalanche of problems for employees and customers that negatively impacted future value of the business.

This situation could have gone down quite differently. The CEO and the division president could have collaborated on contingency plans for restoring the desired growth as well as expenses. But the CEO’s unwillingness to accept the reality of the situation foreclosed any meaningful discussion or potential for change.

A version of this disconnect happens in companies around the world every day. It is a classic example of a leader being unhappy about a circumstance, result, or even a person, and insisting that reality be different. The amount of time, effort, and energy I see wasted by leaders as they argue and fight about reality is astonishing. It takes courage to accept reality as it is, and only then can you and your team begin to make changes.

Here are three kinds of acceptance that leaders should focus on:

Accepting Results

Perhaps the worst has happened, or an outcome is simply bad. This can include a failed strategy, poor financial performance, loss of a job, or any other setback. Leaders can hem and haw, rant and rave, but until they can properly accept what has happened, they aren’t likely to move forward or lead anyone else forward.

This doesn’t mean you have to be “good” with the results. It’s about not channeling your energy into non-stop wishing that things were different, behaving unprofessionally, or arguing about the outcome. It may even require you to examine and accept your role in the results. Leaders must remember that not accepting or willfully fighting a result won’t change it. More importantly, it doesn’t put you in a strong position to make changes to prevent future failures.

Accepting Circumstances

Maybe timelines have been missed on important projects, or your return-to-office schedule has been thrown off by the Delta variant, or you are over budget and need to make important sacrifices. As leaders, we often face circumstances that are beyond our control. Susan David, author of Emotional Agility, notes the importance of giving up control of what you never had control over to begin with, and making room for your emotional reaction without acting on every thought or negative feeling. She writes, “We see leaders stumble not because they have undesirable thoughts and feelings — that’s inevitable — but because they get hooked by them, like fish caught on a line. … In our complex, fast-changing knowledge economy, [the] ability to manage one’s thoughts and feelings is essential to business success.”

Again, this doesn’t mean you have to be happy about or approve of a situation. Rather, acceptance gives you power to move forward in the most effective way possible instead of waging a futile battle against circumstances you can’t control. Our emotional response, particularly . . .

Continue reading. There’s more.

I would say that by this criterion Donald Trump is not a good leader, based on his reaction to losing the election.

Written by Leisureguy

11 October 2021 at 4:09 pm

A chilling look at what is happening — again — in the US: The breakdown of the Union

I highly recommend reading Heather Cox Richardson’s complete post. Here I quote the conclusion:

. . . Last night, in Iowa, Trump held a “rally.” Mainstream Republican officials, including Senator Chuck Grassley, Governor Kim Reynolds, and Representatives Mariannette Miller-Meeks and Ashley Hinson, attended. Right on cue, a Trump supporter told a reporter: “We’re just sick of it, you know, and we’re not going to take it any more. I see a civil war coming….”

Today’s split in the Republican Party mirrors the split in the Democrats in 1860. The leadership is made up of extremists who consider their opponents illegitimate, maintain they alone understand the Constitution, and are skewing the mechanics of our electoral system to keep themselves in power. In 1860, the Democratic Party split, its moderates joining with the fledgling Republicans to defend the United States of America.

Then, as now, the radicals calling for the destruction of the nation were a shrinking minority desperate to cling to power. Then they took up arms to divide the nation in two and keep power in their part of it; now they are launching a quieter war simply by rigging future elections to conquer the whole nation.

Written by Leisureguy

11 October 2021 at 11:03 am

One woman’s six-word mantra that has helped to calm millions

Judith Hoare, an Australian journalist, wrote The Woman Who Cracked the Anxiety Code: The Extraordinary Life of Dr Claire Weekes (2019). Psyche has what I assume is an adapted extract from that book. It’s worth reading, and it begins:

Imagine being in a pandemic, isolated and inert. Your life feels out of control, and you are stressed, not sleeping well. Then a raft of bewildering new symptoms arrive – perhaps your heart races unexpectedly, or you feel lightheaded. Maybe your stomach churns and parts of your body seem to have an alarming life of their own, all insisting something is badly wrong. You are less afraid of the pandemic than of the person you have now become.

Most terrifying of all are the invasive flashes of fear in the absence of any specific threat.

Back in 1927, this was 24-year-old Claire Weekes. A brilliant young scholar on her way to becoming the first woman to attain a doctorate of science at the University of Sydney, Weekes had developed an infection of the tonsils, lost weight and started having heart palpitations. Her local doctor, with scant evidence, concluded that she had the dreaded disease of the day, tuberculosis, and she was shunted off to a sanatorium outside the city.

‘I thought I was dying,’ she recalled in a letter to a friend.

Enforced idleness and isolation left her ruminating on the still unexplained palpitations, amplifying her general distress. Upon discharge after six months, she felt worse than when she went in. What had become of the normal, happy young woman she was not so long ago?

Flash forward to 1962 and the 59-year-old Dr Claire Weekes was working as a general practitioner, having retrained in medicine after an earlier stellar career in science during which she earned an international reputation in evolutionary biology. That year she also wrote her first book, the global bestseller Self-Help for Your Nerves.

The book was born from the furnace of the two years of high anxiety Weekes had endured in her 20s. Back then, her saviour came in the form of a soldier friend who had fought in the First World War. He explained how shellshocked soldiers had been programmed by fear and suffered similar physical symptoms to her own. Her heart continued to race, he told her, because she was frightened of it. Don’t fight the fear, he advised her, but try to ‘float’ past it. For Weekes, this was a revelation and a huge relief. She took his advice and recovered quite quickly.

Critical of Freudian psychoanalysis with its emphasis on sex and tracking down the original cause of distress through talk therapy, Weekes boasted of getting patients off ‘the old Viennese couch … [and out] into the world’. She believed that fear was the driver of much nervous suffering, and that many had simply been ‘tricked by their nerves’. An original cause certainly needed attention if it was still fuelling distress, but Weekes discerned that often it took second place to people’s fear of ‘the state they were in’.

Weekes exposed fear’s vast menu of bewildering and distressing symptoms, and became famous for explaining the mind-body connection. People recognised themselves in the words she used, borrowed from her patients: ‘All tied up.’ ‘Headaches.’ ‘Tired and weary.’ ‘Palpitations.’ ‘Dreadful.’ ‘Nervous.’ ‘Sharp pain under the heart.’ ‘No interest.’ ‘Restless.’ ‘My heart beats like lead.’ ‘I have a heavy lump of dough in my stomach.’ ‘Heart-shakes.’ The nervous system seemed infinitely inventive. Then, bewilderment and fear of ‘what happens next’ took over.

Yet far from being possessed or crazy, Weekes explained to her readers that they were ordinary people who could cure themselves once they understood how their nerves had been ‘sensitised’ and then, by following some simple steps, learn to control the savage flame of fear. ‘It is very much an illness of your attitude to fear,’ she counselled in Peace from Nervous Suffering (1972).

Weekes was effectively treating the panic attack before it even had a name. She also believed that fear is the common thread that runs through many different psychological ‘disorders’, such as obsessive-compulsive disorder (OCD), phobias, general anxiety disorder, depression and post-traumatic stress disorder (PTSD), to use the formal diagnostic terms that had yet to be invented in her time. In this sense, Weekes anticipated contemporary ‘transdiagnostic’ approaches to mental health that acknowledge the commonalities across supposedly separate disorders. Weekes credited her scientific training with allowing her to see what she called ‘the trunk of the tree’ rather than being distracted by the branches.

In the 1930s, the US president Franklin Roosevelt memorably observed that ‘the only thing we have to fear is fear itself’. This concept is at the heart of Weekes’s unique work. ‘[Th]e nervous person must understand that when he panics, he feels not one fear, as he supposes, but two separate fears. I call these the first and second fear,’ she wrote in 1972.

Five years later, in her address to the Association for the Advancement of Psychotherapy, Weekes explained that the first fear is easy to identify. It is survival mode, that automatic instinct that means you duck a falling brick or a punch without thought. You don’t have to think. The body thinks for you. Today it would be called by its shorthand ‘fight, flight or freeze’. In what she calls a ‘sensitised’ person – someone who has been ill, burdened by worry or, say, fought in the trenches – it can come out of the blue, and be electric in its swiftness and ‘so out of proportion to the danger causing it’ that ‘it cannot be readily dismissed’.

So, shaken badly by this random jolt of first fear, the sufferer inevitably adds ‘second fear’ by worrying about this inexplicably alarming moment. Weekes said that the second fear could be recognised by its prefix ‘Oh my goodness!’ or ‘What if?’ to which any imaginings can be added. This kickstarts the fear-adrenaline-fear cycle, in which heart palpitations, among a medley of other symptoms, play such a powerful part. A circuit-breaker was required. The one she picked was the one she learned from her friend, the soldier: don’t fight fear.

Weekes distilled her understanding of ‘nervous illness’ into a six-word mantra for overcoming anxiety:  . . .

Continue reading. There’s more, of course, including the six words and why they work.

Written by Leisureguy

11 October 2021 at 10:39 am

How to Be Productive Infographic

“Decide the outcome before even starting” is Stephen Covey’s second habit, “Begin with the end in mind.” And “Eliminate trivial decisions” is roughly equivalent to “Spend more time in Quadrants I and II and less time in III and IV.”

Written by Leisureguy

10 October 2021 at 7:22 am

“Love Letter to America,” a memoir by Tomas Schuman, the pseudonym of Yuri Bezmenov

David Perell today wrote in his newsletter Friday Finds:

Ideological Subversion: Yuri Bezmenov was a Soviet journalist and a former KGB informant who defected and moved to Canada. He issued a warning to America in a 1985 interview, where he explained how US public opinion could be manipulated. Most frighteningly, he says that under a state of ideological subversion where a person becomes demoralized, new evidence won’t change people’s minds. Under the pen name Tomas D. Schuman, he also wrote a book called Love Letter to America, which is strangely hard to find online.

The link is to this video of a 1985 interview with Bezmenov. 

You can download Love Letter to America (as PDF or text) from Scribd. It looks like an interesting memoir, published in 1984. I downloaded it as PDF, which I can read in Acrobat Reader (changing the default setting to allow smooth scrolling, and upping the magnification a fair bit). I also added it to my Calibre library, converting it from PDF to AZW3, Amazon’s Kindle format. (Calibre offers a broad range of choices for file conversion.) 
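
For anyone who prefers to script the conversion rather than use Calibre’s GUI, Calibre also ships a command-line tool, ebook-convert, that does the same PDF-to-AZW3 conversion. Below is a minimal Python sketch of that workflow; the file name is hypothetical, and it assumes Calibre is installed with ebook-convert available on the PATH.

    import subprocess
    from pathlib import Path

    def pdf_to_azw3(pdf_path: str) -> Path:
        """Convert a PDF to AZW3 using Calibre's ebook-convert CLI."""
        src = Path(pdf_path)
        dst = src.with_suffix(".azw3")
        # ebook-convert infers the output format from the destination extension.
        subprocess.run(["ebook-convert", str(src), str(dst)], check=True)
        return dst

    if __name__ == "__main__":
        # Hypothetical file name; substitute the PDF you actually downloaded.
        print(pdf_to_azw3("love-letter-to-america.pdf"))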

Love Letter to America begins:

Dear Americans,

My name is Tomas David Schuman. I am what you may call a “defector” from the USSR, and I have a message for you: I love you very much. I love all of you: liberals and conservatives, “decadent capitalists” and “oppressed masses”, blacks and whites and browns and yellows, rednecks and intellectuals. For me you are the people who created a unique nation, country and society in the history of mankind, by no means a perfect one, but, let’s face it, the most free, affluent and just in today’s world. I am not alone in this love. People all over the Earth, whether they praise America or bitterly criticize her, look upon you as the only hope for mankind’s survival and the last stronghold of freedom. Some may not think in these idealistic terms, but they certainly enjoy the fruits of your civilization, often forgetting to be grateful for them. Millions of people in the so-called “socialist camp” or in the “Third World” literally owe their lives to America.

As a war-time child, I survived partly thanks to such “decadent capitalist” (as the Soviets say) things as “Spam” meat, condensed milk and egg powder that were supplied to my country by the USA through the lend-lease program of World War II. In the Soviet Union we secretly but proudly called ourselves “the Spam generation”. Too prosaic? Who cares about “Spam” in today’s USA, apart from “underprivileged welfare recipients”? Well, for me these foods are not merely the nostalgic delight of my troubled childhood, but rather, a symbol of love from a friend when I was in need. No amount of communist propaganda against America has ever been able to convince me that the United States is out to colonize and exploit. I will tell you, many people have been more than willing to be “exploited” the American way. For what other reason have thousands risked their lives, gone to unimaginable troubles, left behind their families, their motherland and traditional ways of life to come to America? Have you ever heard of “illegal aliens” risking their lives crossing the border at midnight into the Socialist USSR? Or the “boat people” swimming oceans and drowning by the thousands just to reach the shores of Communist China? Or defectors like me, leaving behind relative affluence and risking bullets in the back in order to join the “progressive workers’ paradise” in Russia? No, we all come here to America, obviously willing to be “exploited by capitalists” and enjoy “oppression” together with you. Because we believe and KNOW — America IS A BETTER place.

I am writing this not to please you with words you want to hear. The rest of my message may be more unpleasant to you than even Communist propaganda, or more offensive than the speeches of “leaders” in Kremlin. But as a true friend of America, I want to help.

My dear friends, I think you are in big trouble. Whether you believe it or not, YOU ARE AT WAR. And you may lose this war very soon, together with all your affluence and freedoms, unless you start defending yourselves. I hope you have noticed on your color televisions that there is in fact a war going on right now all over this planet. This war has many faces, but it’s all the same: it’s war. Some call it “national liberation”, some title it “class struggle” or “political terrorism”. Others call it “anticolonialism” or “struggle for majority rule”. Some even come up with such fancy names as “war of patriotic forces” or “peace movement”. I call it World Communist Aggression.

I know what I am talking about, because I was on the side of the aggressor before I decided to take YOUR side. I do not believe, I KNOW, that in this war no one is being “liberated, decolonised or made equal”, as Soviet doctrine proclaims. You may notice, if you give yourselves the trouble to observe, that the only “equality” and . . .

Written by Leisureguy

8 October 2021 at 5:40 pm

C.S. Peirce, American Aristotle

Charles Sanders Peirce (last name pronounced “purse”), a highly original philosopher and thinker, is the subject of an essay in Aeon by Daniel Everett, which begins:

[I intend] to make a philosophy like that of Aristotle, that is to say, to outline a theory so comprehensive that, for a long time to come, the entire work of human reason, in philosophy of every school and kind, in mathematics, in psychology, in physical science, in history, in sociology and in whatever other department there may be, shall appear as the filling up of its details.
C S Peirce, Collected Papers (1931-58)

The roll of scientists born in the 19th century is as impressive as any century in history. Names such as Albert Einstein, Nikola Tesla, George Washington Carver, Alfred North Whitehead, Louis Agassiz, Benjamin Peirce, Leo Szilard, Edwin Hubble, Katharine Blodgett, Thomas Edison, Gerty Cori, Maria Mitchell, Annie Jump Cannon and Norbert Wiener created a legacy of knowledge and scientific method that fuels our modern lives. Which of these, though, was ‘the best’?

Remarkably, in the brilliant light of these names, there was in fact a scientist who surpassed all others in sheer intellectual virtuosity. Charles Sanders Peirce (1839-1914), pronounced ‘purse’, was a solitary eccentric working in the town of Milford, Pennsylvania, isolated from any intellectual centre. Although many of his contemporaries shared the view that Peirce was a genius of historic proportions, he is little-known today. His current obscurity belies the prediction of the German mathematician Ernst Schröder, who said that Peirce’s ‘fame [will] shine like that of Leibniz or Aristotle into all the thousands of years to come’.

Some might doubt this lofty view of Peirce. Others might admire him for this or that contribution yet, overall, hold an opinion of his oeuvre similar to that expressed by the psychologist William James on one of his lectures, that it was like ‘flashes of brilliant light relieved against Cimmerian darkness’. Peirce might have good things to say, so this reasoning goes, but they are too abstruse for the nonspecialist to understand. I think that a great deal of Peirce’s reputation for obscurity is due, not to Peirce per se, but to the poor organisation and editing of his papers during their early storage at and control by Harvard University (for more on this, see André de Tienne’s insightful history of those papers).

Such skepticism, however incorrect, becomes self-reinforcing. Because relatively few people have heard of Peirce, at least relative to the names above, and because he has therefore had a negligible influence in popular culture, some assume that he merits nothing more than minor fame. But there are excellent reasons why it is worth getting to know more about him. The leading Peirce scholar ever, Max Fisch, described Peirce’s intellectual significance in this fecund paragraph from 1981:

Who is the most original and the most versatile intellect that the Americas have so far produced? The answer ‘Charles S Peirce’ is uncontested, because any second would be so far behind as not to be worth nominating. Mathematician, astronomer, chemist, geodesist, surveyor, cartographer, metrologist, spectroscopist, engineer, inventor; psychologist, philologist, lexicographer, historian of science, mathematical economist, lifelong student of medicine; book reviewer, dramatist, actor, short-story writer; phenomenologist, semiotician, logician, rhetorician [and] metaphysician … He was, for a few examples, … the first metrologist to use a wave-length of light as a unit of measure, the inventor of the quincuncial projection of the sphere, the first known conceiver of the design and theory of an electric switching-circuit computer, and the founder of ‘the economy of research’. He is the only system-building philosopher in the Americas who has been both competent and productive in logic, in mathematics, and in a wide range of sciences. If he has had any equals in that respect in the entire history of philosophy, they do not number more than two.

Peirce came from a well-to-do, prominent family of senators, businessmen and mathematicians. His father, Benjamin Peirce, was considered the greatest US mathematician of his generation, teaching mathematics and astronomy at Harvard for some 50 years. Charles’s brother, James, also taught mathematics at Harvard, eventually becoming a dean there. C S Peirce was, on the other hand, despised by the presidents of Harvard (Charles Eliot; where Peirce studied) and Johns Hopkins University (Daniel Gilman; where Peirce initially taught). Eliot and Gilman, among others, actively opposed Peirce’s employment at any US institution of higher education and thus kept him in penury for the latter years of his life. They falsely accused him of immorality and underestimated his brilliance due to input from jealous rivals, such as Simon Newcomb.

Though the story of Peirce’s life and thinking processes is inspiring and informative, this story is not told here. (I recommend Joseph Brent’s 1998 biography of Peirce as an excellent beginning. My own planned intellectual biography of Peirce intends to trace his life from his Pers family roots in Belgium in the 17th century to the history of the influence of his work on modern philosophy and science.) The objective here is rather to highlight some portions of Peirce’s thought to explain why his theories are so important and relevant to contemporary thinking across a wide range of topics.

The importance and range of Peirce’s contributions to science, mathematics and philosophy can be appreciated partially by recognising that many of the most important advances in philosophy and science over the past 150 years originated with Peirce: the development of mathematical logic (before and arguably better eventually than Gottlob Frege); the development of semiotics (before and arguably better than Ferdinand de Saussure); the philosophical school of pragmatism (before and arguably better than William James); the modern development of phenomenology (independently of and arguably superior to Edmund Husserl); and the invention of universal grammar with the property of recursion (before and arguably better than Noam Chomsky; though, for Peirce, universal grammar – a term he first used in 1865 – was the set of constraints on signs, with syntax playing a lesser role).

Beyond these philosophical contributions, Peirce also made fundamental discoveries in science and mathematics. A few of these are:  . . .

Continue reading. There’s much more.

Written by Leisureguy

8 October 2021 at 1:08 pm

How Other Nations Pay for Child Care. The U.S. Is an Outlier.

The US continues to push its own decline, a sort of internal version of what Britain did with Brexit. (Interesting that nowadays it is referred to as “Britain” rather than “Great Britain.” I suppose “Great” is gone, and that’s generally recognized. I wonder when the United States will be commonly known as “the States,” since it is no longer very united at all.)

Claire Cain Miller reports in the NY Times (and no paywall for this report):

Typical 2-year-olds in Denmark attend child care during the day, where they are guaranteed a spot, and their parents pay no more than 25 percent of the cost. That guaranteed spot will remain until the children are in after-school care at age 10. If their parents choose to stay home or hire a nanny, the government helps pay for that, too.

Two-year-olds in the United States are less likely to attend formal child care. If they do, their parents pay full price — an average $1,100 a month — and compete to find a spot. If their parents stay home or find another arrangement, they are also on their own to finance it, as they will be until kindergarten.

In the developed world, the United States is an outlier in its low levels of financial support for young children’s care — something Democrats, with their safety net spending bill, are trying to change. The U.S. spends 0.2 percent of its G.D.P. on child care for children 2 and under — which amounts to about $200 a year for most families, in the form of a once-a-year tax credit for parents who pay for care.

The other wealthy countries in the Organization for Economic Cooperation and Development spend an average of 0.7 percent of G.D.P. on toddlers, mainly through heavily subsidized child care. Denmark, for example, spends $23,140 annually per child on care for children 2 and under.

“We as a society, with public funding, spend so much less on children before kindergarten than once they reach kindergarten,” said Elizabeth Davis, an economist studying child care at the University of Minnesota. “And yet the science of child development shows how very important investment in the youngest ages are, and we get societal benefits from those investments.”

Congress is negotiating the details of the spending bill, and many elements are likely to be cut to decrease the cost. The current draft of the child care plan would make attendance at licensed child care centers free for the lowest-earning families, and it would cost no more than 7 percent of family income for those earning up to double the state’s median income. It would provide universal public preschool for children ages 3 and 4. And it would increase the pay of child care workers and preschool teachers to be equivalent to elementary teachers (currently, the median hourly wage for a preschool teacher of 4-year-olds is $14.67, and for a kindergarten teacher of 5-year-olds $32.80).

Among the 38 nations in the Organization for Economic Cooperation and Development, the United States is second only to Luxembourg on education spending for elementary school through college. But Americans have long had mixed feelings about whether young children should stay home with family or go to child care. Some Republicans say direct payments to parents would give them the choice to enroll in child care or stay home. Though many conservative-leaning states have public preschool, some Republicans have said they do not want the federal government involved. Some business groups oppose how the Biden spending bill would be paid for: increased taxes on businesses and wealthy Americans.

The pandemic, though, has forced the issue.

“I’ve been writing these reports saying this is a crisis for more than 30 years — it’s not new,” said Gina Adams, a senior fellow at the Urban Institute. “But the pandemic reminded people that child care is a linchpin of our economy. Parents can’t work without it. It’s gotten to a point where the costs of not investing are much, much more clear.”

Overall, federal, state and local governments spend about $1,000 a year on care for low-income children ages 2 and under, and $200 on other toddlers, according to a paper for the Hamilton Project at Brookings, by Professor Davis and Aaron Sojourner, also an economist at the University of Minnesota.

Some states and cities offer . . .

Continue reading. Again: no paywall.

Written by Leisureguy

7 October 2021 at 1:16 pm

Making a Living: The history of what we call “work”

leave a comment »

Aaron Benanav reviews an interesting book in The Nation:

We have named the era of runaway climate change the “Anthropocene,” which tells you everything you need to know about how we understand our tragic nature. Human beings are apparently insatiable consuming machines; we are eating our way right through the biosphere. The term seems to suggest that the relentless expansion of the world economy, which the extraction and burning of fossil fuels has made possible, is hard-wired into our DNA. Seen from this perspective, attempting to reverse course on global warming is likely to be a fool’s errand. But is unending economic growth really a defining feature of what it means to be human?

For the longest part of our history, humans lived as hunter-gatherers who neither experienced economic growth nor worried about its absence. Instead of working many hours each day in order to acquire as much as possible, our nature—insofar as we have one—has been to do the minimum amount of work necessary to underwrite a good life.

This is the central claim of the South African anthropologist James Suzman’s new book, Work: A Deep History, From the Stone Age to the Age of Robots, in which he asks whether we might learn to live like our ancestors did—that is, to value free time over money. Answering that question takes him on a 300-millennium journey through humanity’s existence.

Along the way, Suzman draws amply on what he has learned since the 1990s living and dissertating among the Ju/’hoansi Bushmen of eastern Namibia, whose ancestral home is in southern Africa’s Kalahari Desert. The Ju/’hoansi are some of the world’s last remaining hunter-gatherers, although few engage in traditional forms of foraging anymore.

Suzman has less to say in Work about his years as the director of corporate citizenship and, later, the global director of public affairs at De Beers, the diamond-mining corporation. He took that job in 2007. Around the same time, in response to a public outcry after the Botswanan government evicted Bushmen from the Kalahari so that De Beers could conduct its mining operations there, the company sold its claim to a deposit to a rival firm, Gem Diamonds, which opened a mine in the Bushmen’s former hunting grounds in 2014. It later shuttered the mine and then sold it in 2019, after reportedly losing $170 million on the venture.

Suzman’s employment with De Beers—a company that has spent vast sums on advertising to convince the world’s middle classes that diamonds, one of the most common gems, are actually among the scarcest—may have left its mark on Work nonetheless. “The principal purpose” of his undertaking, Suzman explains, is “to loosen the claw-like grasp that scarcity economics has held” over our lives and thereby “diminish our corresponding and unsustainable preoccupation with economic growth.” It is an arresting intervention, although one that reveals the limits of both contemporary economics and anthropology as guides to thinking about our era of climate emergency.

For 95 percent of our 300,000-year history, human beings have lived as hunter-gatherers on diets consisting of fruits, vegetables, nuts, insects, fish, and game. Ever since Adam Smith published The Wealth of Nations in 1776, it has largely been taken for granted that staying alive was an all-consuming activity for our ancestors, as well as for the remaining hunter-gatherers who still lived as they did. Latter-day foragers appeared to have been “permanently on the edge of starvation,” Suzman explains, and “plagued by constant hunger.”

This disparaging perspective on the life of the hunter-gatherer found ample support in Western travel narratives and then in ethnographic studies. Explorers treated contemporary foraging peoples as if they were living fossils, artifacts of an earlier era. In reality, these foragers were living in time, not out of it, and trying to survive as best they could under adverse historical conditions. Expanding communities of agriculturalists, like both colonial empires and post-colonial states, had violently pushed most foragers out of their ancestral homelands and into more marginal areas. Western reportage has made it seem as if these dispossessed refugees were living as their ancestors had since time immemorial, when in fact their lives were typically much more difficult.

A countercurrent of thinkers has provided a consistent alternative to this largely contemptuous mainstream perspective. The 18th-century French philosopher Jean-Jacques Rousseau, for example, took the forager to be an unrealizable ideal for modern humans rather than our embarrassing origin story. In the 20th century, anthropologists Franz Boas and Claude Lévi-Strauss continued this tradition: They countered racist, stage-based theories of human evolution by showing that foraging peoples possessed complex and intelligent cultures. These thinkers are important precursors to Suzman’s perspective, but, in Work, he sets them aside.

Instead, Suzman focuses on the comparatively recent “Man the Hunter” conference, co-organized by the American anthropologist Richard Lee. That 1966 gathering marked a decisive shift in how anthropologists thought about foragers as economic actors, and this is the point that Suzman wants to emphasize. Lee had been conducting research among the !Kung Bushmen of southern Africa, a people related to the Ju/’hoansi. Lee showed that the !Kung acquired their food through only “a modest effort,” leaving them with more “free time” than people in the advanced industrial societies of the West. The same was likely true, he suggested, of human beings over the largest part of their history.

One implication of this finding is that economists since Adam Smith have been consistently wrong about what Lee’s colleague Marshall Sahlins called “stone age economics.” Using modern research methods, social scientists have confirmed that Lee and Sahlins were largely right (although they may have underestimated foragers’ average work hours). The chemical analysis of bones has demonstrated conclusively that early humans were not constantly teetering on the brink of starvation. On the contrary, they ate well despite having at their disposal only a few stone and wooden implements. What afforded these early humans existences of relative ease and comfort? According to Suzman, the turning point in the history of early hominids came with their capacity to control fire, which gave them access to a “near-limitless supply of energy” and thereby lightened their toils.

Fire predigests food. When you roast the flesh of a woolly mammoth—or, for that matter, a bunch of carrots—the process yields significantly more calories than if the food was left uncooked. The capacity to access those additional calories gave humans an evolutionary advantage over other primates. Whereas chimpanzees spend almost all of their waking hours foraging, early humans got the calories they needed with just a few hours of foraging per day.

Mastering fire thus made for a radical increase in humanity’s free time. Suzman contends that it was this free time that subsequently shaped our species’s cultural evolution. Leisure afforded long periods of hanging around with others, which led to the development of language, storytelling, and the arts. Human beings also gained the capacity to care for those who were “too old to feed themselves,” a trait we share with few other species.

The use of fire helped us become . . .

Continue reading.

Written by Leisureguy

5 October 2021 at 12:15 pm

Is America Still Capable of Democracy?

leave a comment »

I think at this point that’s a reasonable question, and a clear affirmative is not to be had. Umair Haque has another of his jeremiads, which, alas, stays close to what we observe. He writes:

Continue reading. There’s more.

Written by Leisureguy

1 October 2021 at 1:35 pm

The Gut Microbiome and the Brain

leave a comment »

Jackie Power writes for the Johns Hopkins University School of Public Health:

On Calliope Holingue’s Twitter profile, one of her descriptors is “obsessed with the gut-brain connection.”

It’s a statement that she can back up: Her research centers on the gut-brain link in autism. She teaches the Summer Institute course Mental Health and the Gut. And as a teen, she began to explore the gut-brain connection on her own, years before “microbiome” became a household word.

Holingue, PhD ’19, MPH, has lived with obsessive compulsive disorder since early childhood, and started developing increasingly disabling gastrointestinal problems in high school.

Her medical doctors weren’t much help, so she sought solutions on her own.

“They viewed my GI issues as purely a psychiatric issue, so they would say, ‘This is anxiety, this is stress. You just need to relax and you’ll feel better,’” she recalls. “But I didn’t think that the stress and anxiety explained everything. I’d have crippling pain or severe reactions to food even when I was not stressed at all.”

Eventually diagnosed with irritable bowel syndrome, she embarked on a years-long process to improve her health. After lots of trial and error with probiotics, different diets, and mindfulness, she’s now in a much better place emotionally and physically, though the journey is an ongoing one.

Along the way, she channeled her knowledge of the field into graduate study and research, and joined the faculty at the Kennedy Krieger Institute, where she’s exploring the gut-brain connection.

A TWO-WAY STREET

Trillions of bacteria, viruses, yeasts, protozoa, and fungi—at least as many as the number of human cells in our bodies and weighing approximately four pounds—inhabit the intestinal tract.

Collectively known as the gut microbiome, these microbes help us metabolize nutrients and protect us from harmful bacteria and toxins. They have also drawn intense study by scientists like Holingue eager to understand the microbiome’s connection to mental health. In addition to microbes, the gut-brain axis involves the vagus nerve, hormones, immune cells, neurotransmitters, and metabolites, all of which work together to allow the bidirectional communication between the gut and brain.

“The take-home message in everything we study is the arrow goes both ways,” says Glenn J. Treisman, MD, PhD, the Eugene Meyer III Professor of Psychiatry and Medicine at the Johns Hopkins School of Medicine. “The brain affects your gut. The gut affects your brain. The microbiome affects your gut, which affects your brain. The brain affects your gut, which affects your microbiome.”

Disruptions to the gut microbiome, say by infection or a change in diet, can trigger reactions in the body that may affect psychological, behavioral, and neurological health. For example, reactions such as the overproduction of inflammatory cytokines or slowed production of neuroactive metabolites have been implicated in depression.

A 2020 review of research on depression and the gut microbiome noted that generally, people with depression have a less diverse gut microbiome, with higher levels of bacteria associated with inflammation, like Bacteroidetes, and decreased levels of bacteria associated with anti-inflammation, like Firmicutes.

“There’s been hundreds of studies at this point looking at various psychiatric and brain disorders and linking them with the gut microbiome,” says Holingue. “I feel like we’re in the place where the human genetics field was maybe 10 years ago. We’re drowning in associations and trying to figure out what does this mean, what is causing what, and what do we do with this information?”

Untangling the complex associations offers the tantalizing prospect of novel therapies for conditions from depression and anxiety to autism spectrum disorder.

A HAPPY MICROBIOME IS A DIVERSE MICROBIOME

The growing interest in the gut-brain axis emerged partly out of the recognition that people with psychiatric and neurodevelopmental conditions tend to have more GI problems, such as diarrhea, constipation, and abdominal pain, than the general population. It’s estimated that 50%–90% of patients with irritable bowel syndrome have a psychiatric comorbidity.

Key questions followed: Are the gut issues caused by brain disorders or is it the other way around?

An influential 2011 . . .

Continue reading.

Written by Leisureguy

28 September 2021 at 9:36 am

The American way of life: Debt

leave a comment »

Umair Haque writes in Medium:

Continue reading. There’s more. 

Something is wrong, very wrong.

Written by Leisureguy

27 September 2021 at 4:01 pm

The bias that blinds: Why doctors give some people dangerously different medical care

leave a comment »

Jessica Nordell writes in the Guardian:

I met Chris in my first month at a small, hard-partying Catholic high school in north-eastern Wisconsin, where kids jammed cigarettes between the fingers of the school’s lifesize Jesus statue and skipped mass to eat fries at the fast-food joint across the street. Chris and her circle perched somewhere adjacent to the school’s social hierarchy, and she surveyed the adolescent drama and absurdity with cool, heavy-lidded understanding. I admired her from afar and shuffled around the edges of her orbit, gleeful whenever she motioned for me to join her gang for lunch.

After high school, we lost touch. I went east; Chris stayed in the midwest. To pay for school at the University of Minnesota, she hawked costume jewellery at Dayton’s department store. She got married to a tall classmate named Adam and merged with the mainstream – became a lawyer, had a couple of daughters. She would go running at the YWCA and cook oatmeal for breakfast. Then in 2010, at the age of 35, she went to the ER with stomach pains. She struggled to describe the pain – it wasn’t like anything she’d felt before. The doctor told her it was indigestion and sent her home. But the symptoms kept coming back. She was strangely tired and constipated. She returned to the doctor. She didn’t feel right, she said. Of course you’re tired, he told her, you’re raising kids. You’re stressed. You should be tired. Frustrated, she saw other doctors. You’re a working mom, they said. You need to relax. Add fibre to your diet. The problems ratcheted up in frequency. She was anaemic, and always so tired. She’d feel sleepy when having coffee with a friend. Get some rest, she was told. Try sleeping pills.

By 2012, the fatigue was so overwhelming, Chris couldn’t walk around the block. She’d fall asleep at three in the afternoon. Her skin was turning pale. She felt pain when she ate. Adam suggested she see his childhood physician, who practised 40 minutes away. That doctor tested her blood. Her iron was so low, he thought she was bleeding internally. He scheduled a CT scan and a colonoscopy. When they revealed a golf ball-sized tumour, Chris felt, for a moment, relieved. She was sick. She’d been telling them all along. Now there was a specific problem to solve. But the relief was short-lived. Surgery six days later showed that the tumour had spread into her abdomen. At the age of 37, Chris had stage four colon cancer.

Historically, research about the roots of health disparities – differences in health and disease among different social groups – has sought answers in the patients: their behaviour, their status, their circumstances. Perhaps, the thinking went, some patients wait longer to seek help in the first place, or they don’t comply with doctors’ orders.

Maybe patients receive fewer interventions because that’s what they prefer. For Black Americans, health disparities have long been seen as originating in the bodies of the patients, a notion promoted by the racism of the 19th-century medical field. Medical journals published countless articles detailing invented physiological flaws of Black Americans; statistics pointing to increased mortality rates in the late 19th century were seen as evidence not of social and economic oppression and exclusion, but of physical inferiority.

In this century, research has increasingly focused on the social and environmental determinants of health, including the way differences in access to insurance and care also change health outcomes. The devastating disparate impact of Covid-19 on communities of colour vividly illuminates these factors: the disproportionate burden can be traced to a web of social inequities, including more dangerous working conditions, lack of access to essential resources, and chronic health conditions stemming from ongoing exposure to inequality, racism, exclusion and pollution. For trans people, particularly trans women of colour, the burden of disease is enormous. Trans individuals, whose marginalisation results in high rates of poverty, workplace discrimination, unemployment, and serious psychological distress, face much higher rates of chronic conditions such as asthma, chronic pulmonary obstructive disorder, depression and HIV than the cisgender population. A 2015 survey of nearly 28,000 trans individuals in the US found that one-third had not sought necessary healthcare because they could not afford it.

More recently, researchers have also begun looking at differences that originate in the providers – differences in how doctors and other healthcare professionals treat patients. And study after study shows that they treat some groups differently from others.

Black patients, for instance, are less likely than white patients to receive pain medication for the same symptoms, a pattern of disparate treatment that holds even for children. Researchers attribute this finding to false stereotypes that Black people don’t feel pain to the same degree as white people – stereotypes that date back to chattel slavery and were used to justify inhumane treatment. The problem pervades medical education, where “race” is presented as a risk factor for myriad diseases, rather than the accumulation of stressors linked to racism. Black immigrants from the Caribbean, for instance, have lower rates of hypertension and cardiovascular disease than US-born Black people, but after a couple of decades, their rates of illness increase toward those of the US-born Black population, a result generally attributed to the particular racism they encounter in the US.

Black patients are also given fewer therapeutic procedures, even when studies control for insurance, illness severity and type of hospital. For heart attacks, Black people are less likely to receive guideline-based care; in intensive care units for heart failure, they are less likely to see a cardiologist, which is linked to survival.

These biases affect the quality of many other interactions in clinics. Doctors spend less time and build less emotional rapport with obese patients. Transgender people face overt prejudice and discrimination. The 2015 survey also found that in the preceding year, a third of respondents had had a negative encounter with a healthcare provider, including being refused treatment. Almost a quarter were so concerned about mistreatment that they avoided necessary healthcare. Transgender individuals can therefore face a dangerous choice: disclose their status as trans and risk discrimination, or conceal it and risk inappropriate treatment.

Even though medical providers are not generally intending to provide better treatment to some people at the expense of others, unexamined bias can create devastating harm.


Chris was told that her symptoms, increasingly unmanageable, were not serious. Women as a group receive fewer and less timely interventions, less pain treatment, and fewer referrals to specialists. One 2008 study of nearly 80,000 patients in more than 400 hospitals found that women having heart attacks experience dangerous treatment delays, and that once in the hospital they more often die. After a heart attack, women are less likely to be referred to cardiac rehabilitation or to be prescribed the right medication. Critically ill women older than 50 are less likely to receive life-saving interventions than men of the same age; women who have knee pain are 22 times less likely to be referred for a knee replacement than men are. A 2007 Canadian study of nearly 500,000 patients showed that after adjusting for the severity of illness, women spent a shorter time in the ICU and were less likely to receive life support; after age 50, they were also significantly more likely to die after a critical illness.

Women of colour are at particular risk for poor treatment. A 2019 analysis of their childbirth experiences found that they frequently encountered condescending, ineffective communication and disrespect from providers; some women felt bullied into having C-sections. Serena Williams’s childbirth story is by now well known: the tennis star has a history of blood clots, but when she recognised the symptoms and asked for immediate scans and treatment, the nurse and the doctor doubted her. Williams finally got what she needed, but ignoring women’s symptoms and distress contributes to higher maternal mortality rates among Black, Alaska Native and Native American women. Indeed, Black women alone in the US are three to four times more likely to die of complications from childbirth than white women.

There’s also a structural reason for inferior care: women have historically been excluded from much of medical research. The reasons are varied, ranging from a desire to protect childbearing women from drugs that could impair foetal development, via notions that women’s hormones could complicate research, to an implicit judgment that men’s lives were simply more worth saving. Many landmark studies on ageing and heart disease never included women; the all-men study of cardiovascular disease named MRFIT emerged from a mindset that male breadwinners having heart attacks was a national emergency, even though cardiovascular disease is also the leading cause of death for women. In one particularly egregious example, a 1980s study examining the effect of obesity on breast cancer and uterine cancer excluded women because men’s hormones were “simpler” and “cheaper” to study.

Basic to these practices was an operating assumption that men were the default humans, of which women were a subcategory that could safely be left out of studies. Of course, there’s a logical problem here: the assertion is that women are so complicated and different that they can’t be included in research, and yet also so similar that any findings should seamlessly extend to them. In the 90s, the US Congress insisted that medical studies funded by the National Institutes of Health should include women; earlier, many drug studies also left out women, an exclusion that may help explain why women are 50%-75% more likely to experience adverse side-effects from drugs.

As the sociologist Steven Epstein points out, medicine often starts with categories that are socially and politically relevant – but these are not always medically relevant. Relying on categories such as race risks erasing the social causes of health disparities and may entrench the false and damaging ideas that are inscribed in medical practice. At the same time, ignoring differences such as sex is perilous: as a result of their exclusion, women’s symptoms have not been medically well understood. Doctors were told, for example, that women present with “atypical symptoms” of heart attacks. In fact, these “atypical” symptoms are typical – for women. They were only “atypical” because they hadn’t been studied. Women and men also vary in their susceptibility to different diseases, and in the course and symptoms of those diseases. They respond to some drugs differently. Women’s kidneys filter waste more slowly, so some medications take longer to clear from the body.

This dearth of knowledge about women’s bodies has led doctors to see differences where none exist, and fail to see differences where they do. As the journalist Maya Dusenbery argues in her book Doing Harm, this ignorance also interacts perniciously with historical stereotypes.

When women’s understudied symptoms don’t match the textbooks, doctors label them “medically unexplained”. These symptoms may then be classified as psychological rather than physical in origin. The fact that so many of women’s symptoms are “medically unexplained” reinforces the stereotype that women’s symptoms are overreactions without a medical basis, and casts doubt over all women’s narratives of their own experiences. One study found that while men who have irritable bowel syndrome are more likely to receive scans, women tend to be offered tranquilisers and lifestyle advice. In response to her pain and fatigue, my friend Chris was told she should get some sleep.


The doctor who finally ordered the right tests for Chris told her that he’d seen many young women in his practice whose diagnoses had been . . .

Continue reading. There’s much more, and it’s infuriating. It seems to have its roots in the fact that a disproportionately large number of medical doctors and researchers are white men, an unknown number of whom are misogynistic and/or racist. I think an interesting study would be to take a large random sample of medical practitioners and researchers and administer a psychological test to determine the degree to which each is misogynistic or racist. I’m also wondering whether medical schools reinforce or combat those attitudes, or simply ignore the problem. (I suspect they ignore it.)
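
For what such a study would take in practice, here is a minimal sample-size sketch. It assumes a simple two-group comparison of bias scores, conventional thresholds (5% significance, 80% power), and hypothetical effect sizes; none of these numbers come from the article:

```python
# Minimal sample-size sketch for a two-group comparison of bias scores.
# All numbers here are illustrative assumptions, not findings from the article.
from math import ceil
from statistics import NormalDist

def n_per_group(effect_size: float, alpha: float = 0.05, power: float = 0.8) -> int:
    """Approximate participants per group needed to detect a standardized
    mean difference (Cohen's d = effect_size) with a two-sided test."""
    z = NormalDist().inv_cdf
    return ceil(2 * ((z(1 - alpha / 2) + z(power)) / effect_size) ** 2)

# A small difference in scores (d = 0.2) needs about 393 per group;
# a medium difference (d = 0.5) needs about 63 per group.
print(n_per_group(0.2), n_per_group(0.5))
```

In other words, detecting even a small average difference would be feasible with a sample of a few hundred practitioners per group.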

UPDATE: After sleeping on it, I see this as an institutional and systemic failure, not just a failure of individual doctors (though they are, of course, part of the system). The systemic and institutional aspects make the problem deep-rooted, and correcting it will require a systemic and institutional effort.

Written by Leisureguy

25 September 2021 at 6:27 pm
