Archive for the ‘Mental Health’ Category
Becky Bach reports in Pacific Standard:
Thousands of veterans suffering from post-traumatic stress disorder rely on the Department of Veterans Affairs for relief. They might be better served, however, if they tapped the hard-won wisdom of incarcerated Vietnam veteran Michael “Doc” Piper.
Piper knows, though the VA has yet to acknowledge, that community service could be the most effective treatment available for PTSD, a debilitating condition marked by nightmares, anxiety, flashbacks, pain, anger, self-blame, alienation, and depression.
Despite his confinement in Soledad Correctional Training Facility, a California state prison, 67-year-old Piper is a professional volunteer. From a 106-square-foot former broom closet with no Internet access, Piper helps fellow incarcerated veterans access VA benefits. By helping others, Piper says he’s been able to cope with his anger, nightmares, and flashbacks. But he’s not the only one who understands the power of community service.
Mission Continues, a St. Louis-based organization, is generating national attention, including a June 2013 Time cover story, for its success helping veterans who served in Iraq and Afghanistan integrate into society. It’s a dire need: The VA has treated nearly 300,000 veterans from those conflicts for PTSD symptoms, according to a November report. Mission Continues, which was founded in 2007 by a group of veterans, places veterans in six-month service fellowships in community organizations across the country.
Fellows paint hospital walls, collect food donations, and plant gardens, developing career and life skills in the process. And although in-depth studies are lacking, an investigation (PDF) by Washington University in St. Louis social scientist Monica Matthieu found that Mission Continues helps.
Matthieu and her team surveyed 27 Mission Continues fellows, many of whom have been diagnosed with PTSD. Following their fellowship, 71 percent continued their education and 86 percent were able to find employment.
The VA currently assaults PTSD with a grab bag of treatments. It recommends (PDF) a combination of drugs, most commonly anti-depressants, and therapies including individual and group psychotherapy, hypnosis, and meditation. The department’s 2010 guidelines also recommend social and family skills training, job training, education, and spiritual support. VA therapists even teach stress-tolerance techniques.
For example, . . .
From his column today:
Then there are those who look to politics for identity. They treat their partisan affiliation as a form of ethnicity. These people drive a lot of talk radio and television. Not long ago, most intelligent television talk was not about politics. Shows would put interesting people together, like Woody Allen with Billy Graham (check it out on YouTube), and they’d discuss anything under the sun.
Now most TV and radio talk is minute political analysis, while talk of culture has shriveled. This change is driven by people who, absent other attachments, have fallen upon partisanship to give them a sense of righteousness and belonging.
This emotional addiction can lead to auto-hysteria.
Hmmm. Am I guilty? I certainly find politics enthralling, as some are reported to find baseball. And, like baseball, you follow not only the games (the legislation and court decisions), but also the players, the prospects, the trade-offs, and so on. I think the difference is that politics, through the laws and regulations it enacts and enforces, directly shapes our lives.
His point on TV talk shows is interesting. We do not have cultural talk shows—all our talk shows seem to be some variant of the news: politics, sports, business, actual news. Nothing of the cultural moment, and little to provide intellectual context to what we witness.
UPDATE: It strikes me that this NY Times Op-Ed by Gary Gutting is directly relevant—and in fact also relevant to the minimum-wage discussion:
“Crisis” and “decline” are the words of the day in discussions of the humanities. A primary stimulus for the concern is a startling factoid: only 8 percent of undergraduates major in humanities. But this figure is misleading. It does not include majors in closely related fields such as history, journalism and some of the social sciences. Nor does it take account of the many required and elective humanities courses students take outside their majors. Most important, the 8 percent includes only those with a serious academic interest in literature, music and art, not those devoted to producing the artistic works that humanists study.
Once we recognize that deeply caring about the humanities (including the arts) does not require majoring in philosophy, English or foreign languages, it’s not at all obvious that there is a crisis of interest in the humanities, at least in our universities.

Is the crisis rather one of harsh economic reality? Humanities majors on average start earning $31,000 per year and move to an average of $50,000 in their middle years. (The figures for writers and performing artists are much lower.) By contrast, business majors start with salaries 26 percent higher than humanities majors and move to salaries 51 percent higher.
But this data does not show that business majors earn more because they majored in business. Business majors may well be more interested in earning money and so accept jobs that pay well even if they are not otherwise fulfilling, whereas people interested in the humanities and the arts may be willing to take more fulfilling but lower-paying jobs. College professors, for example, often know that they could have made far more if they had gone to law school or gotten an M.B.A., but are willing to accept significantly lower pay to teach a subject they love.
This talk of “a subject they love” brings us to the real crisis, which is both economic and cultural (or even moral). The point of work should not be just to provide the material goods we need to survive. Since work typically takes the largest part of our time, it should also be an important part of what gives our life meaning. Our economic system works well for those who find meaning in economic competition and the material rewards it brings. To a lesser but still significant extent, our system provides meaningful work in service professions (like health and social work) for those fulfilled by helping people in great need. But for those with humanistic and artistic life interests, our economic system has almost nothing to offer.
Or rather, it has a great deal to offer but only for a privileged elite (the cultural parallel to our economic upper class) who have had the ability and luck to reach the highest levels of humanistic achievement. If you have (in Pierre Bourdieu’s useful term) the “cultural capital” to gain a tenured professorship at a university, play regularly in a major symphony orchestra or write mega best sellers, you can earn an excellent living doing what you love. Short of that, you must pursue your passion on the side.

Teaching should be an obvious solution for many humanities majors. But . . .
Rob Stein of NPR has a podcast and an article that seemed of interest:
Could the microbes that inhabit our guts help explain that old idea of “gut feelings”? There’s growing evidence that gut bacteria really might influence our minds.
“I’m always by profession a skeptic,” says Dr. Emeran Mayer, a professor of medicine and psychiatry at the University of California, Los Angeles. “But I do believe that our gut microbes affect what goes on in our brains.”
Mayer thinks the bacteria in our digestive systems may help mold brain structure as we’re growing up, and possibly influence our moods, behavior and feelings when we’re adults. “It opens up a completely new way of looking at brain function and health and disease,” he says.
So Mayer is working on just that, doing MRI scans to look at the brains of thousands of volunteers and then comparing brain structure to the types of bacteria in their guts. He thinks he already has the first clues of a connection, from an analysis of about 60 volunteers.
Mayer found that the connections between brain regions differed depending on which species of bacteria dominated a person’s gut. That suggests that the specific mix of microbes in our guts might help determine what kinds of brains we have — how our brain circuits develop and how they’re wired. . . .
Continue reading. There’s a lot more, including a video, and much is surprising. For example:
But other researchers have been trying to figure out a possible connection by looking at gut microbes in mice. There they’ve found changes in both brain chemistry and behavior. One experiment involved replacing the gut bacteria of anxious mice with bacteria from fearless mice.
It worked the other way around, too — bold mice became timid when they got the microbes of anxious ones. And aggressive mice calmed down when the scientists altered their microbes by changing their diet, feeding them probiotics or dosing them with antibiotics.
Bilingual brains are less likely to decline into dementia, reports Barbara King at NPR:
The largest study so far to ask whether speaking two languages might delay the onset of dementia symptoms in bilingual patients as compared to monolingual patients has reported a robust result. Bilingual patients suffer dementia onset an average of 4.5 years later than those who speak only a single language.
While knowledge of a protective effect of bilingualism isn’t entirely new, the present study significantly advances scientists’ knowledge. Media reports emphasize the size of its cohort: 648 patients from a university hospital’s memory clinic, including 391 who were bilingual. It’s also touted as the first study to reveal that bilingual people who are illiterate derive the same benefit from speaking two languages as do people who read and write. It also claims to show that the benefit applies not only to Alzheimer’s sufferers but also to people with frontotemporal and vascular dementia.
Only when I read the research report itself, though, published in the journal Neurology and written by Suvarna Alladi and seven co-authors, did I realize fully the brilliance of conducting this study in Hyderabad, India.
That choice of location, I believe, lends extra credibility to the study’s results.
Here’s why. India, as the researchers note, is a nation of linguistic diversity. In the Hyderabad region, a language called Telugu is spoken by the majority Hindu group, and another called Dakkhini by the minority Muslim population. Hindi and English are also commonly spoken in formal contexts, including at school. Most people who grow up in the region, then, are bilingual, and routinely exposed to at least three languages.
The patients who contributed data to the study, then, are surrounded by multiple languages in everyday life, not primarily as a result of moving from one location to another. This turns out to be an important factor, as the authors explain:
In contrast to previous studies, the bilingual group was drawn from the same environment as the monolingual one and the results were therefore free from the confounding effects of immigration. The bilingual effect on age at dementia onset was shown independently of other potential confounding factors, such as education, sex, occupation, cardiovascular risk factors, and urban vs rural dwelling, of subjects with dementia.
In other words, thanks in large part to the study’s cultural context, these researchers made great progress zeroing in on bilingualism as the specific reason for the delay in dementia symptoms.
What exactly is it about the ability to speak in two languages that seems to provide this protective effect? Alladi and co-authors explain: . . .
Continue reading. And note that being a polyglot offers no noticeable brain-health advantages over being bilingual.
Unfortunately, languages are learned most easily early in life, and yet in the US foreign-language programs in elementary schools are almost unheard of. It’s easy to understand why, given the current US mania to cut taxes and reduce government spending, which beggars our public schools at the same time that corporations are working hard to offer charter schools for profit. Hiring specialized staff qualified to teach a foreign language is, today in the US, far beyond the capabilities of public elementary schools—these days, they are looking at getting rid of art and music classes and the school library, not taking on new educational missions.
However, Esperanto is easy to learn, and as a second language it would offer not only the benefit of bilingualism but also an excellent foundation (as shown by research studies) for learning a third language, presumably an evolved language (French, Spanish, Italian, Mandarin, or the like). And parents could feasibly try speaking Esperanto to each other and let the child learn the natural way. In practice, I’ve read, if one parent always speaks to the child in one language (be it Spanish or Esperanto or French) and the other always speaks to the child in English, then the child from first babbling will automatically choose the language appropriate to the parent being addressed. It’s not a conscious decision in young children: they simply speak to Parent A in “Parent A talk” and to Parent B in “Parent B talk.”
So, at least in concept, a parent could learn Esperanto pretty easily (it’s made to be easily learned) and try for a bilingual toddler at home. Plus it’s often good to have a “secret language” when out in public with the kids.
Just a thought. It occurs to me because I’m revisiting (with enjoyment) my own Esperanto stash of books.
If you decide to try it, here are some resources:
Lernu!, a site that teaches Esperanto. (Lerni is the infinitive “to learn”; lernu is the imperative, so the site name is, in English, “Learn!”.)
Anki, a free Web-based flash-card system. Esperanto vocabulary is easy to acquire (because of the system of affixes: one root generates a panoply of words), but it is important to review words routinely to ensure that they are readily available when you want to speak or write. Anki is free, works on multiple platforms, and is highly capable. It has an add-on that takes care of Esperanto diacritics: the characters ĉ, ĝ, ĥ, ĵ, ŝ, and ŭ are typed cx, gx, hx, jx, sx, and ux and are automatically corrected. (With OS X and Windows you can easily trigger the same automatic substitutions in the operating system itself. See, for example, the comment to this post.) The nice thing about Anki is that many “flash-card decks” (in many languages and disciplines) are available as downloadable files, though in most cases it’s best to create your own deck as you go.
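The x-system substitution that add-on performs is simple enough to sketch. Here is a minimal, hypothetical version in Python (the add-on’s actual implementation will differ; the function name and mapping are mine):

```python
# Convert Esperanto "x-system" ASCII input (cx, gx, hx, jx, sx, ux)
# into the proper diacritic letters. Illustrative sketch only.

X_SYSTEM = {
    "cx": "ĉ", "gx": "ĝ", "hx": "ĥ", "jx": "ĵ", "sx": "ŝ", "ux": "ŭ",
    "Cx": "Ĉ", "Gx": "Ĝ", "Hx": "Ĥ", "Jx": "Ĵ", "Sx": "Ŝ", "Ux": "Ŭ",
}

def from_x_system(text: str) -> str:
    """Replace each two-letter x-system digraph with its diacritic form."""
    for digraph, letter in X_SYSTEM.items():
        text = text.replace(digraph, letter)
    return text

print(from_x_system("Cxu vi parolas Esperanton? Ankoraux ne tre bone."))
# → Ĉu vi parolas Esperanton? Ankoraŭ ne tre bone.
```

Since x is not a letter of the Esperanto alphabet, the digraphs are unambiguous, which is why this blind find-and-replace is safe for Esperanto text.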
Here are the Anki Esperanto decks available for download. Note that a few include images or audio.
BTW, if you’re interested in other languages or disciplines, Anki offers a host of add-ons, all free. Note that the program is valuable for working with any set of facts or data you must learn, not simply language vocabulary. And it has a great number of decks contributed by users.
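What makes Anki effective for any set of facts is spaced repetition: intervals between reviews grow as you keep answering correctly. Its scheduling descends from the SM-2 algorithm; here is a simplified sketch of the core interval logic (Anki’s real scheduler adds learning steps, lapses, and interval fuzzing, so treat this as an illustration, not Anki’s code):

```python
# Simplified SM-2-style spaced-repetition scheduling (sketch only).

def next_interval(interval_days: float, ease: float, quality: int):
    """Return (new_interval_days, new_ease) after one review.

    quality: 0-5 self-rating of recall; below 3 counts as a lapse.
    """
    if quality < 3:              # failed recall: restart the card
        return 1.0, ease
    # Ease factor drifts with performance, floored at 1.3 as in SM-2.
    ease = max(1.3, ease + 0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02))
    if interval_days < 1:        # first successful review
        return 1.0, ease
    return interval_days * ease, ease

# Successive good reviews push the card further into the future:
ivl, ease = 0.0, 2.5
for q in (5, 4, 5):
    ivl, ease = next_interval(ivl, ease, q)
    # intervals grow roughly 1 -> 2.6 -> 7 days
```

The design point is simply that easy cards get out of your way while hard cards keep coming back, which is why routine short sessions suffice.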
As I write this, I see that this will work best for those with a certain sort of obsessive interest. Still, I wanted to mention the possibility, partly because I enjoy Esperanto. I do recognize that it’s not everyone’s cup of tea, though until you sip a bit you won’t know whether it’s for you or not.
UPDATE: Now I’m sidetracked in trying to get my Mac keyboard Esperanto-ized, and I came across this useful note on Ukelele.
I recall Timothy Leary taking LSD trips with prisoners. He thought that LSD would enable them to remold some aspects of their personality—break old patterns and get new insights—in an effort to reduce recidivism. I don’t recall how the research came out, but the recent findings about marijuana’s effect in “rebooting” repetitive patterns and its observed effects on memory (see this post) might offer a therapeutic approach to interrupting the patterns of thought of PTSD.
Vanessa Walz writes at Ladybud.com:
After Staff Sergeant Mike Whiter returned home from serving his country, he tried to kill himself three times.
Whiter served in the US Marines for 11 years, including combat tours in Kosovo and Iraq. After his medical discharge, Whiter sought help from the Veterans Administration for his Post Traumatic Stress Disorder (PTSD) as well as his physical injuries.

“They put me on 36 different medications in 6 years,” recalls Whiter. “I was on methadone and morphine, benzos, Klonopin, Xanax, SSRIs… You name a drug, I’ve been on it. I couldn’t sleep, I was having nightmares, I couldn’t leave my house – I was afraid to leave my house.”
Whiter believes that prescription drugs, particularly SSRIs, are contributing to veteran suicides. “SSRIs have suicide listed as a side effect,” Whiter explains. “And they’re giving these pills to people who are already suicidal.”
According to a study released by the Department of Veterans Affairs in February of 2013, 22 veterans take their own lives every day – one suicide every 65 minutes. A 2013 survey by the Iraq and Afghanistan Veterans of America showed that 30% of service members have considered or attempted suicide, and 45% reported that they know an Iraq or Afghanistan veteran who has attempted suicide. In recent years, there have been significantly more US veteran and military deaths by suicide than in combat.
HOPE FOR VETERANS
Mike Whiter’s life today is very different than it was a few years ago. After learning about medical marijuana and PTSD on the Discovery Channel, Whiter believed it might help him.
Since he started using marijuana, Whiter has been able to stop taking all of his prescription medications. No more sleepless nights, no more flashbacks, no more isolation in his home – in fact, Whiter is now the co-director of Philadelphia NORML and the founder of Pennsylvania Veterans for Medical Marijuana. He has been a featured speaker at numerous public events and rallies, unimaginable during the time he suffered from crippling social anxiety due to his PTSD.
“Marijuana saved my life,” Whiter says. “And when I say that, I’m not exaggerating at all. Those medications would have killed me if I hadn’t taken my own life first. The medications that the VA prescribes are killing veterans.”
In addition to relieving his PTSD symptoms, Whiter credits marijuana for . . .
Interesting article by Belle Beth Cooper:
Happiness is so interesting, because we all have different ideas about what it is and how to get it. It’s also no surprise that it’s the No. 1 value for Buffer’s culture, as you can see in our slide deck about it. So naturally we are obsessed with it.
I would love to be happier, as I’m sure most people would, so I thought it would be interesting to find some ways to become a happier person that are actually backed up by science. Here are ten of the best ones I found.
1. Exercise more – 7 minutes might be enough
You might have seen some talk recently about the scientific 7 minute workout mentioned in The New York Times. So if you thought exercise was something you didn’t have time for, maybe you can fit it in after all.
Exercise has such a profound effect on our happiness and well-being that it’s actually been proven to be an effective strategy for overcoming depression. In a study cited in Shawn Achor’s book, The Happiness Advantage, three groups of patients treated their depression with either medication, exercise, or a combination of the two. The results of this study really surprised me. Although all three groups experienced similar improvements in their happiness levels to begin with, the follow up assessments proved to be radically different:
The groups were then tested six months later to assess their relapse rate. Of those who had taken the medication alone, 38 percent had slipped back into depression. Those in the combination group were doing only slightly better, with a 31 percent relapse rate. The biggest shock, though, came from the exercise group: Their relapse rate was only 9 percent!
You don’t have to be depressed to gain benefit from exercise, though. It can help you to relax, increase your brain power and even improve your body image, even if you don’t lose any weight.
A study in the Journal of Health Psychology found that people who exercised felt better about their bodies, even when they saw no physical changes:
Body weight, shape and body image were assessed in 16 males and 18 females before and after both 6 × 40 mins exercise and 6 × 40 mins reading. Over both conditions, body weight and shape did not change. Various aspects of body image, however, improved after exercise compared to before.
We’ve explored exercise in depth before, and looked at what it does to our brains, such as releasing proteins and endorphins that make us feel happier, as you can see in the image below.
One thing I consistently see is that states are cutting back on support for education at all levels, from pre-K through universities: slashing departments, cutting out “frills” like libraries, art and music education, foreign languages, and so on. States also cut back on funding for public health: mental-health centers and treatments, public hospitals, immunization campaigns. And they cut back on infrastructure maintenance.
I say that there’s more to government than not spending money. The idea that taxes should never be raised, regardless of population growth, seems strange to me. The US government is seriously underfunded when you compare tax rates in the US with those in other advanced nations. We are starving our government, which leads to poor services and support. Why would we want that?
Sy Mukherjee reports in ThinkProgress what happened in Kansas:
The suicide rate in Kansas rose by a staggering 30 percent between 2011 and 2012, according to new government data. Although experts can’t pinpoint a single reason for the spike, many believe that a combination of cuts to mental health funding and the socioeconomic stresses brought on by the global recession are to blame.
There is abundant evidence that economic despair propagates mental health problems — particularly among men. A recent, first-of-its-kind study measuring the impact of the recession on global mental health found that suicide rates increased significantly in countries whose unemployment rates also rose. In fact, a 37 percent higher unemployment rate was linked to a 3.3 percent increase in men’s global suicide rate.
In America, the recession exacerbated a suicide rate that had already been rising for over a decade. Unfortunately, that trend corresponded with massive cuts to mental health care funding as cash-strapped states tried to balance their budgets. States collectively cut $1.8 billion from mental health servicesbetween 2009 and 2011, and by some other estimates, that figure is actually closer to $4.35 billion between 2009 and 2012. Kansas instituted the ninth largest cut to mental health care of any state in the nation between 2009 and 2012.
As The Nation points out, another round of cuts imposed by sequestration has forced the federal government to pull back funding for substance abuse and mental health programs. That hits local communities hard — for instance, one community health center in Kansas lost over half of its funding thanks to a combination of state budget cuts and sequestration.
“Treatment dollars have gone down and more and more people are coming to us, a growing number without any other payment for services,” said Marilyn Cook, executive director of the Sedgwick County community health center, in an interview with the Wichita Eagle. “[W]ithout adequate funding, it’s difficult for us to get to everybody who needs care and help.”
Craig Silverman has an interesting, if discouraging, article in the Columbia Journalism Review:
Which of these headlines strikes you as the most persuasive:
“I am not a Muslim, Obama says.”
“I am a Christian, Obama says.”
The first headline is a direct and unequivocal denial of a piece of misinformation that’s had a frustratingly long life. It’s Obama directly addressing the falsehood.
The second option takes a different approach by affirming Obama’s true religion, rather than denying the incorrect one. He’s asserting, not correcting.
Which one is better at convincing people of Obama’s religion? According to recent research into political misinformation, it’s likely the latter.
The study was led by Brendan Nyhan and Jason Reifler, two leading researchers examining political misinformation and the ways in which it can and can’t be refuted, among other topics. Their 2009 paper, “The Effects of Semantics and Social Desirability in Correcting the Obama Muslim Myth,” found that affirming statements appeared to be more effective at convincing people to abandon or question their incorrect views regarding President Obama’s religion.
McRaney spends several thousand words explaining the “backfire effect,” which he nicely summarized in one sentence: “When your deepest convictions are challenged by contradictory evidence, your beliefs get stronger.”
As I detailed in a recent column, the backfire effect makes it difficult for the press to effectively debunk misinformation. We present facts and evidence, and it often does nothing to change people’s minds. In fact, it can make people dig in even more. Humans also engage in motivated reasoning, a tendency to let emotions “set us on a course of thinking that’s highly biased, especially on topics we care a great deal about”.
These two important cognitive effects can have a significant impact on society and debates in the public sphere. They also end up negating some of the debunking and reporting work done by the press. My recent attempts to understand the backfire effect and motivated reasoning have transformed into a search for ways to combat these entrenched human phenomena.
I sought out Reifler, an assistant professor of political science at Georgia State University, to learn more about his and his colleagues’ findings regarding affirmative statements and their effect on the Obama Muslim myth. I asked him if there are other ways of presenting information that can debunk lies.
“I’m sure that there are but I don’t know what they are,” he told me, ever the cautious researcher.
Nevertheless, he did offer some encouragement.
“I think we’re moving in that direction,” he says.
Part of the process of discovering what works is to rule out what doesn’t. I listed some of them in my previous column, and Nyhan and Reifler provide more evidence in a 2010 paper, “When Corrections Fail: The Persistence of Political Misperceptions,” published in Political Behavior. (Note that their definition of a correction is different from the ones used in the press.) Their study saw respondents read a mock news article “containing a statement from a political figure that reinforces a widespread misperception.” Some of the articles also included a paragraph of text that refuted (or “corrected”) the misperception and statement.
One article, for example, led with President George W. Bush talking about Iraq and the possibility it “would pass weapons or materials or information to terrorist networks.” It then transitioned to a paragraph that cited information from a CIA report that Iraq did not in fact possess illicit weapons at the time of the U.S.-led invasion. Would these corrective paragraphs influence respondents who believed Iraq had WMDs?
As the researchers write, the corrective sections “frequently fail to reduce misperceptions among the targeted ideological group.”
Then there’s that familiar term: “We also document several instances of a ‘backfire effect’ in which corrections actually increase misperceptions among the group in question.”
So perhaps a single, credible refutation within a news article isn’t likely to convince people to change their views. But other research suggests that a constant flow of these kinds of corrections could help combat misinformation. The theory is that the more frequently someone is exposed to information that goes against their incorrect beliefs, the more likely it is that they will change their views.
“It’s possible there is something to be said for persistence,” Reifler said. “At some point the cost of always being wrong or always getting information that runs counter to what you believe is likely to outweigh the cost of having to change your mind about something. We need to figure out what is the magic breaking or tipping point, or what leads people to get to that tipping point. I think we’re just scratching the surface.”
He pointed to a 2010 paper in Political Psychology by David P. Redlawsk and others, “The Affective Tipping Point: Do Motivated Reasoners Ever ‘Get It’?”
The researchers sought to determine if a tipping point exists that could cause voters to abandon motivated reasoning and view facts in a more rational way.
“We show experimental evidence that such an affective tipping point . . .
Turning a blind eye. Giving someone the cold shoulder. Looking down on people. Seeing right through them.
These metaphors for condescending or dismissive behavior are more than just descriptive. They suggest, to a surprisingly accurate extent, the social distance between those with greater power and those with less — a distance that goes beyond the realm of interpersonal interactions and may exacerbate the soaring inequality in the United States.
A growing body of recent research shows that people with the most social power pay scant attention to those with little such power. This tuning out has been observed, for instance, with strangers in a mere five-minute get-acquainted session, where the more powerful person shows fewer signals of paying attention, like nodding or laughing. Higher-status people are also more likely to express disregard, through facial expressions, and are more likely to take over the conversation and interrupt or look past the other speaker.
Bringing the micropolitics of interpersonal attention to the understanding of social power, researchers are suggesting, has implications for public policy.
Of course, in any society, social power is relative; any of us may be higher or lower in a given interaction, and the research shows the effect still prevails. Though the more powerful pay less attention to us than we do to them, in other situations we are relatively higher on the totem pole of status — and we, too, tend to pay less attention to those a rung or two down.
A prerequisite to empathy is simply paying attention to the person in pain. In 2008, social psychologists from the University of Amsterdam and the University of California, Berkeley, studied pairs of strangers telling one another about difficulties they had been through, like a divorce or death of a loved one. The researchers found that the differential expressed itself in the playing down of suffering. The more powerful were less compassionate toward the hardships described by the less powerful.
Dacher Keltner, a professor of psychology at Berkeley, and Michael W. Kraus, an assistant professor of psychology at the University of Illinois, Urbana-Champaign, have done much of the research on social power and the attention deficit.
Mr. Keltner suggests that, in general, we focus the most on those we value most. While the wealthy can hire help, those with few material assets are more likely to value their social assets: like the neighbor who will keep an eye on your child from the time she gets home from school until the time you get home from work. The financial difference ends up creating a behavioral difference. Poor people are better attuned to interpersonal relations — with those of the same strata, and the more powerful — than the rich are, because they have to be.
While Mr. Keltner’s research finds that the poor, compared with the wealthy, have keenly attuned interpersonal attention in all directions, in general those with the most power in society seem to pay particularly little attention to those with the least power. To be sure, high-status people do attend to those of equal rank — but not as closely as those of low status do.
This has profound implications for societal behavior and government policy. Tuning in to the needs and feelings of another person is a prerequisite to empathy, which in turn can lead to understanding, concern and, if the circumstances are right, compassionate action.
In politics, readily dismissing inconvenient people can easily extend to dismissing inconvenient truths about them. The insistence by some House Republicans in Congress on cutting financing for food stamps and impeding the implementation of Obamacare, which would allow patients, including those with pre-existing health conditions, to obtain and pay for insurance coverage, may stem in part from the empathy gap. As political scientists have noted, redistricting and gerrymandering have led to the creation of more and more safe districts, in which elected officials don’t even have to encounter many voters from the rival party, much less empathize with them.
Social distance makes it all the easier to focus on small differences between groups and to put a negative spin on the ways of others and a positive spin on our own.
Freud called this “the narcissism of minor differences,” a theme repeated by . . .
Ezra Klein interviews Christopher Parker in Wonkblog:
The story of the shutdown is, in large part, the story of mainstream Republicans realizing they can’t control tea party Republicans — and deciding that it’s better to go along than to try and fight. Christopher Parker, a political scientist at the University of Washington, is co-author of the book “Change They Can’t Believe In: The Tea Party and Reactionary Politics in America,” which employs large surveys and content analyses to better understand how the politics of the tea party differ from the politics of the Republican Party. We spoke on Wednesday, and a transcript of our conversation, edited for length and clarity, follows.
Ezra Klein: Tell me a bit about the scope of your research on the tea party.

Christopher Parker: So I run a survey research lab at the University of Washington. In 2010, I began to see these opposing views on the tea party. You had Peggy Noonan and Juan Williams basically saying, the tea partiers are just angry Republicans, no big deal. Then I read Frank Rich, and he says no, these people are completely different. He says they’re more in line with Richard Hofstadter’s “The Paranoid Style in American Politics.” And I thought, I can get real data on this! And when I looked at it empirically, I found that people who supported the tea party tended to be more racist, sexist, homophobic, xenophobic, and anti-Obama.
EK: So I’m not exactly a tea partier myself. But when I hear you say that I bristle. The description members of the tea party would give of themselves is that they’re really concerned about the growth of government and the rise in taxes and the management of the economy. Labeling them things like racist, sexist and homophobic sounds like an attempt to just write them out of civilized discourse. So persuade me that this isn’t just an attack.
CP: What I do in these surveys and models is I account for desire for limited government. I account for ideology. I account for all these other things where people could say they’re just more conservative. There’s just this empirical connection between support for the tea party and antagonistic views toward quote-unquote marginalized groups, or, if you prefer, toward quote-unquote not real Americans. If you look at the historical and social scientific literature on American national identity, the portrait that emerges is mainly white, male, middle class, straight, at least a bit educated, and a bit older.
Look at who rose during this period. It’s not all about Obama. Nancy Pelosi was the first female speaker of the House. Barney Frank wielded real power. Two women, one of whom was a Latina, went to the Supreme Court. Undocumented workers have gotten a ton of attention. There’s been the rise of same-sex rights.
That’s the crux of the book. The title is ‘Change They Can’t Believe In’. This isn’t new. Whenever there’s rapid social change it triggers this kind of reactionary conservatism. People see their social prestige threatened, their way of life threatened. And they react.
EK: Tell me about the surveys. Who are you talking to? . . .
I used to believe that MPD was a real phenomenon—the mind can indeed do strange things—but it increasingly seems like a hoax. I had a tab open with a great summary of the situation—an article by a psychiatrist in the US who was alarmed to see that in the UK diagnoses of MPD are still common. The article was extremely good, but Chrome crashed and “Restore” didn’t work, and no matter how I skim “history” I cannot find it. So it goes.
And Google was of little help, though it did uncover this article in the NY Times Magazine by Debbie Nathan a couple of years ago:
“What about Mama?” the psychiatrist asks her patient. “What’s Mama been doing to you, dear? . . . I know she gave you the enemas. And I know she filled your bladder up with cold water, and I know she used the flashlight on you, and I know she stuck the washcloth in your mouth, cotton in your nose so you couldn’t breathe. . . . What else did she do to you? It’s all right to talk about it now. . . . ”
“My mommy,” the patient says.
“My mommy said that I was a bad little girl, and . . . she slapped me . . . with her knuckles. . . .”
“Mommy isn’t going to ever hurt you again,” the psychiatrist says at the close of the session. “Do you want to know something, Sweetie? I’m stronger than Mother.”
The transcript of this conversation is stored at John Jay College of Criminal Justice, in New York City, among the papers of Flora Schreiber, author of “Sybil,” the blockbuster book about a woman with 16 personalities. “Sybil” was published in 1973; within four years it had sold more than six million copies in the United States and hundreds of thousands abroad. A television adaptation broadcast in 1976 was seen by a fifth of all Americans. But Sybil’s story was not just gripping reading; it was instrumental in creating a new psychiatric diagnosis: multiple-personality disorder, or M.P.D., known today as dissociative-identity disorder.
Schreiber collaborated on the book with Dr. Cornelia Wilbur, the psychiatrist who asks, “What about Mama?” — and with Wilbur’s patient, whose name Schreiber changed to Sybil Dorsett. Schreiber worked from records of Sybil’s therapy, including thousands of pages of patient diaries and transcripts of tape-recorded therapy sessions. Before she died in the late 1980s, Schreiber stipulated that the material be archived at a library. For a decade after Schreiber’s death, Sybil’s identity remained unknown. To protect her privacy, librarians sealed her records. In 1998, two researchers discovered that her real name was Shirley Mason. In trying to track her down, they learned that she was dead, and the librarians at John Jay decided to unseal the Schreiber papers.
The same year that her identity was revealed, Robert Rieber, a psychologist at John Jay, presented a paper at the American Psychological Association in which he accused Mason’s doctor of a “fraudulent construction of a multiple personality,” based on tape-recordings that Schreiber had given him. “It is clear from Wilbur’s own words that she was not exploring the truth but rather planting the truth as she wanted it to be,” Rieber wrote.
It wasn’t the first indication that there might be problems with Mason’s diagnosis. As far back as 1994, Herbert Spiegel, an acclaimed psychiatrist and hypnotherapist, began telling reporters that he occasionally treated Shirley Mason when her regular psychiatrist went out of town. During those sessions, Spiegel recalled, Mason asked him if he wanted her to switch to other personalities. When he questioned her about where she got that idea, she told him that her regular doctor wanted her to exhibit alter selves.
And yet, in the popular imagination, Sybil and her fractured self remained powerfully tied to the idea of M.P.D. and the childhood traumas it was said to stem from. “Mamma was a bad mamma,” Wilbur declares in the transcripts. “I can help you remember.” But countless other records suggest that the outrages Sybil recalled never happened. If Sybil wasn’t really remembering, then what exactly was Wilbur helping her to do?
When Mason made her first visit to Dr. Wilbur’s Park Avenue office, in late 1954, it had been nine years since she’d first gone to her for help. At that time, Mason was an art student in the Midwest seeking treatment from Wilbur, a young psychiatrist in Omaha, for blackouts and disturbing behaviors that included disappearing for hours when her parents took her around town on errands. Mason talked to Wilbur about her lifelong ailments — anorexia, nervousness, anemia and feelings of worthlessness — and about growing up as the only child of Seventh-Day Adventists in the small town of Dodge Center, Minn. Oddly, Wilbur also talked about her own life, including her talent for treating hysterics and her interest in people suffering from the strangest type of hysteria of all: multiple personalities. Mason developed a crush on her psychiatrist, who seemed to understand her like no one else. She completed only a handful of psychotherapy sessions, but they gave her the strength she needed to finish college and eventually move to New York. She’d had relapses since, but now her former doctor was in the city! A half-dozen sessions, she thought, might keep her nerves from ever acting up again.
During this second round of treatment, . . .
Read this story. My hope is that some Nevada officials will do some long stretches of hard time. When a government turns on its most vulnerable and needy citizens, then it has become a travesty of government.
On the bright side, Nevada did not simply line them up against a wall and machine-gun them to death, though undoubtedly it would have if it thought it could get away with it.
This is an interesting finding, though not entirely unexpected: if doing a bad act caused guilt, shame, and pain, fewer people would do such things, since people generally avoid pain. But if the acts cause pleasure, well, then, that’s a totally different story. In that case we’d expect such behavior to be fairly common: pleasure conflicting with learned moral principles is always a struggle, as we all know. Bruce Bower reports in Science News:
People who act unethically without harming an obvious victim — think plagiarizing on a term paper or stealing office supplies at work — get a buzz immediately after their transgressions, a new study suggests.
The existence of this “cheater’s high” challenges influential theories holding that any wrongdoing triggers guilt, shame or remorse, say psychologist Nicole Ruedy of the University of Washington in Seattle and her colleagues. Although people expect to feel guilty after breaching ethics, cheaters temporarily bask in the glow of having gotten away with forbidden acts, the researchers propose September 2 in the Journal of Personality and Social Psychology.
That won’t surprise shoplifters, joy riders and con artists who make no secret of savoring their swindles. But scientists have largely ignored the emotional upside of unethical behavior. Immediate emotional payoffs may reinforce certain types of offenses, Ruedy says.
Her team found that college students and volunteers recruited online predicted that they or others would feel bad if they were to cheat on a laboratory task or pad a time sheet at work to earn a bonus.
Yet those forecasts didn’t pan out in a new experiment where 179 participants unscrambled as many of 15 words as possible in four minutes. Every correct word was worth one dollar. Work sheets were stapled on top of two sheets of carbonless copy paper inside a folder. Upon finishing, volunteers tore off their work sheets and handed the folder to a researcher, not knowing that they had forked over a record of their responses.
Participants then used an answer sheet to check their own work in private before turning it in. Comparisons to copy paper responses showed that 71 volunteers cheated by inserting additional words from the answer sheet.
Afterward, cheaters on average reported a larger boost in excitement and other positive feelings than their honest peers did, with no change in negative emotions.
Further experiments indicated that . . .
Continue reading. Given the feeling, it’s easy to understand how people can get hooked on it.
The problem is that the current GOP is an extreme case of a general conservative tendency to value loyalty. Findings (whether by scientists or economists) that go against accepted conservative doctrine are viewed as expressions of disloyalty, not as discoveries about reality, and those responsible for the findings, shown by them to be disloyal, are rejected and ignored. It’s a self-sealing system because of the supreme value placed on loyalty, which clearly trumps honesty among the virtues. For example, a member of the GOP will not hesitate to lie if the lie is consistent with (and required by) loyalty—as you can see from many, many examples.
The result, as Paul Krugman points out this morning, is that experts/wonks/scientists, who by nature have their primary loyalty to their discipline, don’t fall into line when party ideology diverges from observable reality, so they become marginalized if not outright excluded. (Of course, there will always be pseudo-experts (aka scam artists, con men/women) who will say and do anything to set up the marks and fleece them of their money.) The GOP is now knee-deep in such experts, think tanks, and the like, grinding out (for good money) what the GOP wants to hear.
The problem, of course, is that in the end reality always wins. So it’s useful to know what it is, even if you currently have a philosophical disagreement with it.
I mentioned in earlier posts this difference between conservative and liberal outlooks, based on the difference in the virtues each side holds as preeminent. From that link:
Haidt identifies “five distinct moral realms: harm/care, fairness, in-group loyalty, deference to authority, and purity/sanctity. The first two promote individual freedom and self-expression, and are beloved by liberals; the final three bind societies together, and are close to the hearts of social conservatives.” (I’m quoting from this post.)
I’m reading Stephen Grosz’s The Examined Life, and I thought the section I quote below would be of particular interest to many. It’s based on research by the Stanford psychologist Carol Dweck, which she published in the book Mindset, which I frequently recommend. Here’s the passage, which I think will be of interest to parents, teachers, and others who interact with the young.
Rounding the corner into the nursery school classroom to collect my daughter, I overheard the nursery assistant tell her, ‘You’ve drawn the most beautiful tree. Well done.’ A few days later, she pointed to another of my daughter’s drawings and remarked, ‘Wow, you really are an artist!’
On both occasions, I found myself at a loss. How could I explain to the nursery assistant that I would prefer it if she didn’t praise my daughter?
Nowadays, we lavish praise on our children. Praise, self-confidence and academic performance, it is commonly believed, rise and fall together. But current research suggests otherwise – over the past decade, a number of studies on self-esteem have come to the conclusion that praising a child as ‘clever’ may not help her at school. In fact, it might cause her to underperform. Often a child will react to praise by quitting – why make a new drawing if you have already made ‘the best’? Or a child may simply repeat the same work – why draw something new, or in a new way, if the old way always gets applause?
In a now famous 1998 study of children aged ten and eleven, psychologists Carol Dweck and Claudia Mueller asked 128 children to solve a series of mathematical problems. After completing the first set of simple exercises, the researchers gave each child just one sentence of praise. Some were praised for their intellect – ‘You did really well, you’re so clever’; others for their hard work – ‘You did really well, you must have tried really hard.’ Then the researchers had the children try a more challenging set of problems. The results were dramatic. The students who were praised for their effort showed a greater willingness to work out new approaches. They also showed more resilience and tended to attribute their failures to insufficient effort, not to a lack of intelligence. The children who had been praised for their cleverness worried more about failure, tended to choose tasks that confirmed what they already knew, and displayed less tenacity when the problems got harder. Ultimately, the thrill created by being told ‘You’re so clever’ gave way to an increase in anxiety and a drop in self-esteem, motivation and performance. When asked by the researchers to write to children in another school, recounting their experience, some of the ‘clever’ children lied, inflating their scores. In short, all it took to knock these youngsters’ confidence, to make them so unhappy that they lied, was one sentence of praise.
Why are we so committed to praising our children?
In part, we do it to demonstrate that we’re different from our parents. In Making Babies, a memoir about becoming a mother, Anne Enright observes, ‘In the old days – as we call the 1970s, in Ireland – a mother would dispraise her child automatically . . . “She’s a monkey,” a mother might say, or “Street angel, home devil,” or even my favourite, “She’ll have me in an early grave.” It was all part of growing up in a country where praise of any sort was taboo.’ Of course, this wasn’t the case in Ireland alone. Recently, a middle-aged Londoner told me, ‘My mum called me things I’d never call my kids – too clever by half, cheeky, precocious and show-off. Forty years on, I want to shout at my mum, “What’s so terrible about showing off?”’
Now, wherever there are small children – at the local playground, at Starbucks and at nursery school – you will hear the background music of praise: ‘Good boy,’ ‘Good girl,’ ‘You’re the best.’ Admiring our children may temporarily lift our self-esteem by signalling to those around us what fantastic parents we are and what terrific kids we have – but it isn’t doing much for a child’s sense of self. In trying so hard to be different from our parents, we’re actually doing much the same thing – doling out empty praise the way an earlier generation doled out thoughtless criticism. If we do it to avoid thinking about our child and her world, and about what our child feels, then praise, just like criticism, is ultimately expressing our indifference.
Which brings me back to the original problem – if praise doesn’t build a child’s confidence, what does?
Shortly after qualifying as a psychoanalyst, I discussed all this with an eighty-year-old woman named Charlotte Stiglitz. Charlotte – the mother of the Nobel Prize-winning economist Joseph Stiglitz – taught remedial reading in northwestern Indiana for many years. ‘I don’t praise a small child for doing what they ought to be able to do,’ she told me. ‘I praise them when they do something really difficult – like sharing a toy or showing patience. I also think it is important to say “thank you”. When I’m slow in getting a snack for a child, or slow to help them and they have been patient, I thank them. But I wouldn’t praise a child who is playing or reading.’ No great rewards, no terrible punishments – Charlotte’s focus was on what a child did and how that child did it.
I once watched Charlotte with a four-year-old boy, who was drawing. When he stopped and looked up at her – perhaps expecting praise – she smiled and said, ‘There is a lot of blue in your picture.’ He replied, ‘It’s the pond near my grandmother’s house – there is a bridge.’ He picked up a brown crayon, and said, ‘I’ll show you.’ Unhurried, she talked to the child, but more importantly she observed, she listened. She was present.
Being present builds a child’s confidence because it lets the child know that she is worth thinking about. Without this, a child might come to believe that her activity is just a means to gain praise, rather than an end in itself. How can we expect a child to be attentive, if we’ve not been attentive to her?
Being present, whether with children, with friends, or even with oneself, is always hard work. But isn’t this attentiveness – the feeling that someone is trying to think about us – something we want more than praise?
The book is good, but that particular passage seems important to share since the findings are so counter to modern assumptions.
Behind every coalition promise to “get tough on single mothers”, behind every Daily Mail story about Britain’s “handout culture”, or Mitt Romney’s notorious comments about “the 47%”, there lies an assumption: that being poor is a failure of character. Awkwardly, for those who find this obnoxious, the research sometimes makes it seem true. People who are less well-off really do appear to give in more readily to temptation, making the very purchases they can’t afford; to make unwise financial decisions; to use less effective parenting techniques; or to fail to take life-saving drugs, even when they’re free. Is this a deep-seated weakness of will, made worse by a “culture of dependency”? The Harvard economist Sendhil Mullainathan and the Princeton psychologist Eldar Shafir reject that idea, and some of the most familiar leftwing responses, too. Poverty, they argue, is indeed a matter of willpower and bad decisions, but the Mail has it back-to-front. It’s not that foolish choices make you poor; it’s that poverty’s effects on the mind lead to bad choices. Living with too little imposes huge psychic costs, reducing our mental bandwidth and distorting our decisionmaking in ways that dig us deeper into a bad situation.
Of course, it’s hardly news that poverty creates a vicious cycle. Not having money is expensive, thanks to credit card late fees, high interest rates on payday loans, the extra cost of buying in instalments, and so on. But the alarming conclusion of this book is how completely scarcity colonises the mind. Merely asking poorer people to contemplate a hypothetical £1,000 car repair, one study by the authors shows, impairs their performance on intelligence tests as much as missing a night’s sleep – about 13 or 14 IQ points. In another study, Indian sugar cane farmers performed worse pre-harvest, when money was tight, compared to post-harvest. “Scarcity captures the mind,” explain Mullainathan and Shafir. It promotes tunnel vision, helping us focus on the crisis at hand but making us “less insightful, less forward-thinking, less controlled”. Wise long-term decisions and willpower require cognitive resources. Poverty leaves far less of those resources at our disposal.
Their most arresting claim is that the same effects kick in – albeit not always with such grave implications – in any conditions of scarcity, not just lack of money. Chronically busy people, suffering from a scarcity of time, also demonstrate impaired abilities and make self-defeating choices, such as unproductive multi-tasking or neglecting family for work. Lonely people, suffering from a scarcity of social contact, become hyper-focused on their loneliness, prompting behaviours that render it worse. In one sense, Mullainathan and Shafir concede, scarcity is so ubiquitous as to be almost meaningless. But the feeling of scarcity – of not having as much of something as you believe you need – is something more specific and agonising. To use the authors’ favourite metaphor, life under such conditions is like packing a tiny suitcase for a trip. It entails a ceaseless focus on difficult trade-offs: the umbrella or the extra sweater? The greatest freedom that money can buy is the freedom from thinking about money – or, to quote Henry David Thoreau, “a man is rich in proportion to the number of things he can afford to let alone”.
There’s a risk here of lapsing into the obvious: rich and relaxed is better than poor and time-starved. Mullainathan and Shafir do sometimes succumb; financial abundance, we are gravely informed, “allows us to buy more things”. Yet the strongest chapters demonstrate that the psychological effects of scarcity aren’t obvious at all. In certain limited ways, for example, poverty actually confers cognitive benefits. Some of the classic findings about how irrational we are when it comes to money – such as our willingness to travel across town to save £5 on a cheap toaster, but not on a flatscreen TV – apply much less to the poor. Dieters, experiencing a scarcity of food, are significantly better than others at identifying words briefly flashed on a screen, provided that they’re about food. Lonely people read facial expressions more accurately. And time-scarcity brings motivational benefits, as any journalist on a deadline could tell you.
But these positive effects of tunnel vision are outweighed by what the authors call “the bandwidth tax”, the ways scarcity limits or distorts our skills. This tax, they argue persuasively, explains a number of otherwise confounding kinds of self-defeating behaviour among those suffering scarcity – from the failure of poorer farmers in Africa to weed their fields, even though they have the time to do so and would make more money that way, to the failure of low-income Americans to take diabetes drugs and other medications, or to eat more healthily even when it’s financially viable. “The failures of the poor are part and parcel of the misfortune of being poor in the first place,” they write. It’s not that poor people have less bandwidth. It’s that “all people, if they were poor, would have less effective bandwidth”.
The bandwidth argument threatens to undermine much received political wisdom on poverty. Get-tough policies, like cutting off access to benefits after a fixed number of years, won’t motivate people to find jobs: a deadline of several years is too distant to feature in the calculations of people only concerned with paying the next bill. On the other hand, well-intended interventions like providing financial education or job-readiness training could backfire, too. Another class to attend, another item to tick off the to-do list – all use up more bandwidth, potentially impairing people’s capacities more than improving them.
How can we stop falling into these traps? Mullainathan and Shafir offer a few “nudge”-style suggestions. . .
Interesting finding reported in Science News by Bruce Bower:
Poverty drains brains while it empties pocketbooks, a new study concludes.
Money worries consume poor people’s attention, dramatically undermining their performance on IQ-related tests of reasoning and mental control, say economist Anandi Mani of the University of Warwick in Coventry, England, and her colleagues. Among the poor, but not the rich, evoking financial concerns damages reasoning abilities about as much as going a night without sleep or losing 13 IQ points, Mani’s team reports in the Aug. 30 Science.
Shortly after reaping a financial windfall, poor individuals perform far better on the same mental tests. That improvement may be thanks partly to temporary freedom from money concerns, the scientists propose.
Their findings follow evidence that scarcity of money (or anything else important) promotes short-term thinking, helping to explain why poor people generally save too little and borrow too much (SN: 12/1/12, p. 17).
The new study raises a valid concern, although people barely scraping by frequently deal with money in sophisticated ways, says Harvard University sociologist Kathryn Edin, who studies U.S. families subsisting on welfare. “Poverty can lead to better, not just worse, mental functioning.”
Many mothers on welfare, for instance, work out complicated family budgets and keep careful spending records, Edin finds.
In one experiment, Mani’s group classified nearly 400 shoppers at a New Jersey mall as affluent or poor based on self-reported incomes and family size. Participants made easy or hard hypothetical financial decisions before taking nonverbal tests of logical thinking and the ability to control rapid responses to computer images.
Poor people who contemplated tough money problems scored lower on both mental tests than their wealthy counterparts. On easy problems, rich and poor groups scored similarly.
In a second experiment, the researchers administered the same tests to 364 sugarcane farmers in India. Farmers eked out a living until harvests yielded big pay days.
The researchers gave tests before and after harvests; test scores rose substantially after harvests. Stress reduction, indicated by lower blood pressure and heart rate, partially explained farmers’ mental turnaround, Mani says.
Policymakers should consider . . .
UPDATE: See also this article.
Maybe we can start to recognize the costs of war and in reality support our troops by not sending them off to suffer and die in unnecessary wars. Max Herman at Pacific Standard writes:
The video below, from Stars and Stripes, shows Staff Sgt. Ty Carter’s speech at a ceremony awarding him the Congressional Medal of Honor yesterday. Carter, one of only five living recipients of the medal who fought in post 9/11 conflicts, was cited for his actions during a battle in Afghanistan in which he attempted to rescue a fellow soldier, Spc Stephan Mace. Carter pulled Mace to safety and treated him amid a 12-hour-long battle. Mace, wounded grievously, eventually died.*
Carter’s speech is notably different from the popular image of a war hero receiving a medal. In lieu of crisp salutes and talk of duty, Carter spends much of his time speaking about loss, and frankly discussing his own mental health after the ordeal. To be clear, Staff Sgt. Carter appears to be a person of extraordinary mettle. In his presentation, however, he does not mind projecting another image, of a young man who has seen too many terrible things. Speaking in what can only be called a tone of vulnerability, he tells the White House audience, including President Obama:
Only those closest to me can see the scars that come from seeing good men take their last breath. During the battle, I lost some of the hearing in my left ear. But I will always hear the voice of Specialist Stephan Mace. I will hear his plea for help for the rest of my life.
He goes on to talk about how he recovered from the experience.
However, thanks to the professionalism of my platoon Sgt, Sgt Hill, and my behavioral health provider, Capt. Cobb, and my friends and family, I will heal.
“Behavioral health” is a synonym for mental health. A “behavioral health provider” is a therapist. In a ceremony traditionally designed to showcase bravery in battle, Carter is taking the extraordinary step of focusing on how he, the classic American war hero, came home from Afghanistan with his head in a bad place. He goes on to speak of anguished families of the soldiers lost in the same violent battle for which he received the medal. President Obama also remarks on the mental health issue.
Certainly what the video below displays is a cultural shift, from the 1940s image of the hard-bitten GI, to the modern, human hero like Carter. It’s also tempting to read the focus of this week’s ceremony as a tacit pushback against an emerging skepticism about war’s role in a wave of military suicides over the past half-decade-plus. Coincidentally, two weeks ago a study published in the Journal of the American Medical Association claimed that military deployments were not to blame for the widely-reported rise in suicides among service members since 2005. . .
Continue reading. Video at the link.