Archive for the ‘Education’ Category
A fascinating and useful article by Alec MacGillis and ProPublica, published in the Atlantic:
Sometime during the past few years, the country started talking differently about white Americans of modest means. Early in the Obama era, the ennobling language of campaign pundits prevailed. There was much discussion of “white working-class voters,” with whom the Democrats, and especially Barack Obama, were having such trouble connecting. Never mind that this overbroad category of Americans—the exit pollsters’ definition was anyone without a four-year college degree, or more than a third of the electorate—obliterated major differences in geography, ethnicity, and culture. The label served to conjure a vast swath of salt-of-the-earth citizens living and working in the wide-open spaces between the coasts—Sarah Palin’s “real America”—who were dubious of the effete, hifalutin types increasingly dominating the party that had once purported to represent the common man. The “white working class” connoted virtue and integrity. A party losing touch with it was a party unmoored.
That flattering glow has faded away. Today, less privileged white Americans are considered to be in crisis, and the language of sociologists and pathologists predominates. Charles Murray’s Coming Apart: The State of White America, 1960–2010 was published in 2012, and Robert D. Putnam’s Our Kids: The American Dream in Crisis came out last year. From opposite ends of the ideological spectrum, they made the case that social breakdown among low-income whites was starting to mimic trends that had begun decades earlier among African Americans: Rates of out-of-wedlock births and male joblessness were rising sharply. Then came the stories about a surge in opiate addiction among white Americans, alongside shocking reports of rising mortality rates (including by suicide) among middle-aged whites. And then, of course, came the 2016 presidential campaign. The question was suddenly no longer why Democrats struggled to appeal to regular Americans. It was why so many regular Americans were drawn to a man like Donald Trump.
Equally jarring has been the shift in tone. A barely suppressed contempt has characterized much of the commentary about white woe, on both the left and the right. Writing for National Review in March, the conservative provocateur Kevin Williamson shoveled scorn on the low-income white Republican voters who, as he saw it, were most responsible for the rise of Trump:
Nothing happened to them. There wasn’t some awful disaster. There wasn’t a war or a famine or a plague or a foreign occupation. Even the economic changes of the past few decades do very little to explain the dysfunction and negligence—and the incomprehensible malice—of poor white America. So the gypsum business in Garbutt ain’t what it used to be. There is more to life in the 21st century than wallboard and cheap sentimentality about how the Man closed the factories down.
The truth about these dysfunctional, downscale communities is that they deserve to die. Economically, they are negative assets. Morally, they are indefensible. Forget all your cheap theatrical Bruce Springsteen crap. Forget your sanctimony about struggling Rust Belt factory towns and your conspiracy theories about the wily Orientals stealing our jobs … The white American underclass is in thrall to a vicious, selfish culture whose main products are misery and used heroin needles. Donald Trump’s speeches make them feel good. So does OxyContin.
Analysis on the left has been less gratuitously nasty but similarly harsh in its insinuation. Several prominent liberals have theorized that what’s driving rising mortality and drug and alcohol abuse among white Americans is, quite simply, despair over the loss of their perch in the country’s pecking order. “So what is happening?” asked Josh Marshall on his “Talking Points Memo” blog in December. “Let’s put this clearly,” he said in wrapping up his analysis of the dismal health data. “The stressor at work here is the perceived and real loss of the social and economic advantages of being white.”
The barely veiled implication, whichever version you consider, is that the people undergoing these travails deserve relatively little sympathy—that they maybe, kinda had this reckoning coming. Either they are layabouts drenched in self-pity or they are sad cases consumed with racial status anxiety and animus toward the nonwhites passing them on the ladder. Both interpretations are, in their own ways, strikingly ungenerous toward a huge number of fellow Americans.
They are also unsatisfying as explanations for what is happening out there. Williamson, for one, mischaracterizes the typical Trump voter. As exit polls show, the candidate’s base is not the truly bereft white underclass Williamson derides. Those Americans are, by and large, not voting at all, as I’m often reminded when reporting in places like Appalachia, where turnout rates are the lowest in the country. People voting for Trump are mostly a notch higher on the economic ladder—in a position to feel exactly the resentment that Williamson himself feels toward the shiftless needy. As for liberals’ diagnosis that a major public-health crisis is rooted in racial envy, it fails to square with, among other things, the fact that blacks and Hispanics have hardly been flourishing themselves. Yes, there’s an African American president, but by many metrics the Great Recession was even worse for minorities than for whites.
Two new books—one a provocative, deeply researched history and the other an affecting memoir—are well timed to help make better sense of the plight of struggling whites in the United States. Both accounts converge on an important insight: The gloomy state of affairs in the lower reaches of white America should not have caught the rest of the country as off guard as it has—and mobilizing solutions for the crisis will depend partly on closing the gaps that allowed for such obliviousness.
“Welcome to America as it was,” Nancy Isenberg, a historian at Louisiana State University, writes near the outset of White Trash: The 400-Year Untold History of Class in America. Her title might seem sensational were it not so well earned. As she makes plain, a white lower class not only figured more prominently in the development of the colonies and the young country than national lore suggests, but was spoken of from the start explicitly in terms of waste and refuse.
For England, the New World beckoned as more than a vast store of natural resources, Isenberg argues. It was also a place to dispose of the dregs of its own society. In the late 16th century, the geographer Richard Hakluyt argued that America could serve as a giant workhouse where the “fry [young children] of wandering beggars that grow up idly and hurtfully and burdenous to the Realm, might be unladen and better bred up.” The exportable poor, he wrote, were the “offals of our people.” In 1619, King James I was so fed up with vagrant boys milling around his Newmarket palace that he asked the Virginia Company to ship them overseas. Three years later, John Donne—yes, that John Donne—wrote about the colony of Virginia as if it were England’s spleen and liver, Isenberg writes, draining the “ill humours of the body … to breed good bloud.” Thus it was, she goes on, that the early settlers included so many “roguish highwaymen, mean vagrants, Irish rebels, known whores, and an assortment of convicts,” including one Elizabeth “Little Bess” Armstrong, sent to Virginia for stealing two spoons.
One of America’s founding myths, of course, is that the simple act of leaving England and boldly starting new lives in the colonies had an equalizing effect on the colonists, swiftly narrowing the distance between indentured servant and merchant, landowner and clerk—all except the African slave. Nonsense, Isenberg says: “Independence did not magically erase the British class system.” A “ruthless class order” was enforced at Jamestown, where one woman returned from 10 months of Indian captivity to be told that she owed 150 pounds of tobacco to her dead husband’s former master and would have to work off the debt. The Puritans were likewise “obsessed with class rank”—membership in the Church and its core elect were elite privileges—not least because the early Massachusetts settlers included far more nonreligious riffraff than is generally realized. A version of the North Carolina constitution probably co-authored by John Locke was designed to “avoid erecting a numerous democracy.” It envisioned a nobility of landgraves and caciques (German for “princes” and Spanish for “chieftains”), along with a “court of heraldry” to oversee marriages and make sure they preserved pedigree.
Class distinctions were maintained above all in the apportionment of land. In Virginia in 1700, indentured servants had virtually no chance to own any, and by 1770, less than 10 percent of white Virginians had claim to more than half the land. In 1729 in North Carolina, a colony with 36,000 people, there were only 3,281 listed grants, and 309 grantees owned nearly half the land. “Land was the principal source of wealth, and those without any had little chance to escape servitude,” Isenberg writes. “It was the stigma of landlessness that would leave its mark on white trash from this day forward.” This was not just a Southern dynamic. The American usage of squatter traces to New England, where many of the nonelect—later called “swamp Yankees”—carved out homes on others’ land only to be chased off and have their houses burned.
The Founding Fathers were, as Isenberg sees it, complicit in perpetuating these stark class divides. George Washington believed that only the “lower class of people” should serve as foot soldiers in the Continental Army. Thomas Jefferson envisioned his public schools educating talented students “raked from the rubbish” of the lower class, and argued that ranking humans like animal breeds was perfectly natural. “The circumstance of superior beauty is thought worthy of attention in the propagation of our horses, dogs and other domestic animals,” he wrote. “Why not that of man?” John Adams believed the “passion for distinction” was a powerful human force: “There must be one, indeed, who is the last and lowest of the human species.”
By the time the nation gained independence, the white underclass—its future dependents—was fully entrenched. This underclass could be found just about everywhere in the new country, but it was perhaps most conspicuous in North Carolina, where many whites who had been denied land in Virginia trickled into the area south of the Great Dismal Swamp, establishing what Isenberg calls “the first white trash colony.” William Byrd II, the Virginia planter, described these swamp denizens as suffering from “distempers of laziness” and “slothful in everything but getting children.” North Carolina’s governor described his people as “the meanest, most rustic and squalid part of the species.”
Accounts of this underclass as “an anomalous new breed of human,” as Isenberg puts it, proliferated as poor whites without property spread west and south across the country. These “crackers” and “squatters” were “no better than savages,” with “children brought up in the Woods like brutes,” wrote a Swiss-born colonel in the colonial army in 1759. In 1810, the ornithologist Alexander Wilson described the “grotesque log cabins” where the lowly patriarch typically stood wearing a shirt “defiled and torn,” his “face inlaid with dirt and soot.” Thomas Jefferson’s granddaughter came back from an 1817 excursion with her grandfather telling of that “half civiliz’d race who lived beyond the ridge.” In 1830, the country even got its first “Cracker Dictionary” to document the slang of poor whites.
At various junctures, politicians (think Davy Crockett and Andrew Jackson) turned humble roots into a mark of “backwoodsman” authenticity, but the pendulum always swung back. The term white trash made its first appearance in print as early as 1821. It gained currency three decades later, by which point observers were expressing horror over these people’s “tallow” skin and their habit of eating clay. As George Weston warned in his widely circulated 1856 pamphlet “The Poor Whites of the South,” they were “sinking deeper and more hopelessly into barbarism with every succeeding generation.” Speaking of this class as a separate breed—a species unto itself—was a way to skirt the challenge it presented to the nation’s vision of equality and inclusivity. Isenberg points up the tension: “If whiteness was not an automatic badge of superiority, a guarantee of the homogeneous population of independent, educable freemen … then the ideals of life, liberty, and the pursuit of happiness were unobtainable.”
With so much talk of breeds, it is no surprise that, in the early 20th century, the U.S. was gripped by a eugenics craze, which Isenberg sees as motivated by revulsion over the supposed degeneracy of poor whites, especially those in the South. State fairs held “fitter family” contests, Teddy Roosevelt fretted about Americans’ “germ protoplasm,” and Supreme Court Justice Oliver Wendell Holmes Jr. issued a ruling upholding the forced sterilization of a poor Virginian named Carrie Buck, deemed a “moron.”
Isenberg, for all her efforts to clarify the role of class in the national culture, succumbs to a different kind of distortion herself. She is frustratingly hazy about regional distinctions within the white lower class, a blurriness that also skews some of the contemporary liberal theorizing about white despondency. As her account progresses, she focuses increasingly on the South, without squarely addressing that choice and its implications. To zero in on the white underclass in or near slaveholding areas is, understandably, to dwell on the fraught dynamic between poor whites and enslaved African Americans and its role in the national debate leading up to the Civil War. On the one hand, opponents of slavery argued that the association of labor with servitude dulled the work ethic of poor whites. On the other, defenders of slavery claimed that being spared the lowliest toil kept poor Southern whites a step above their Northern counterparts.
But there were whole other swaths of the country where many poor whites lived without any blacks nearby to speak of—not least the broad expanse of Appalachia. Isenberg makes plain in a brief aside that she does not buy the idea, enshrined in so many books in recent years, of a separate cohort of “Scots-Irish”—hard-drinking, hard-scrapping brawlers from the “borderlands” of Scotland, northern Ireland, and northern England who, cherishing their freedom and wanting nothing to do with the coastal elites, settled up in the Appalachian hills in the mid-18th century. One such account, Grady McWhiney’s Cracker Culture (1988), earns Isenberg’s brisk dismissal as a “flawed historical study that turned poor whites into Celtic ethnics (Scots-Irish).”
Regardless of the merits in that dispute, Isenberg ought to have reckoned more fully with the distinctions between poor whites in the Deep South and those elsewhere. . .
About time: Former Football Players Sue Stanford University, NCAA, PAC-12 Over Mishandled Concussions
A complaint filed on behalf of thousands of former Stanford University football players alleges the university, the National Collegiate Athletic Association and the Pac-12 Conference knew that football players were in danger of permanent brain injuries but did not protect the players so as to “protect the very profitable business of ‘amateur’ college football.”
Chris Dore, a partner at the law firm Edelson PC, said the lawsuit filed Thursday against Stanford, the NCAA and Pac-12 is only one of 15 lawsuits that have been filed in recent weeks by his firm against colleges and athletic conferences on behalf of college football players.
The wave of lawsuits comes on the heels of two other related lawsuits: One is a lawsuit against the NCAA, which did not include monetary compensation for players but made strides in medical monitoring and tests for concussions. The second is a $1 billion concussion settlement against the National Football League alleging that the league failed to warn players and hid the damages of brain injury.
Dore told CBS San Francisco Friday that the defendants didn’t want to discourage play or participation in the sport out of concern that there would be “loss of significant profits.”
He said that this putative class action lawsuit is led by plaintiff David Burns, who played at Stanford University in the 1970s, but is filed on behalf of thousands of football players who played for the university’s team between 1959 and 2010.
Dore said the defendants knew about scientific studies, some even conducted at the very same universities, that described the dangers of concussions, but did nothing to protect players.
The complaint states that “… Defendants Stanford, Pac-12, and the NCAA have kept their players and the public in the dark about an epidemic that was slowly killing their athletes.”
The plaintiff is demanding a jury trial and monetary relief for players. Dore said dozens more lawsuits are expected in coming weeks. . .
When money is involved, damage is ignored: cf. the cigarette industry, the oil and coal industry, and now the enormous amounts of money made in college sports (not by the players, of course, who get none of the revenue).
Later in the article:
“Unfortunately, for decades, Defendants Stanford, Pac-12, and the NCAA knew about the debilitating long-term dangers of concussions, concussion-related injuries, and sub-concussive injuries (referred to as “traumatic brain injuries” or “TBIs”) that resulted from playing college football, but actively concealed this information to protect the very profitable business of ‘amateur’ college football,” the complaint alleges.
Stanford, of course, denies everything. They had no idea that playing football caused any brain injury at all. Big surprise for them. (So will they continue to field teams?)
Denise Ryan writes in the Vancouver Sun:
The 15-year-old girl who bumps around in the police wagon is being unceremoniously returned to the Willingdon Industrial School for Girls, a juvenile correctional institute on Vancouver’s east side.
It is 1969. Paulette Steeves, a ward of the provincial government and incorrigible runaway, has been incarcerated here since the age of 13.
“I’m not going back,” Steeves says defiantly. “I’m going to get away.”
The other young women in the police wagon respond with disbelief. “You can’t do that. How are you going to do that?”
“Just watch me,” the girl says.
The wagon passes through the front gate, pulls up the drive, and slows to a stop. A female police officer opens the rear door to let the prisoners out.
Suddenly, the girl bolts, long hair whipping behind her. She leaps onto the fence, scrambles to the top, seizes the barbed wire with bare hands, hurls her body forward. Points of metal shred her skin as she sails over the top.
She hears the other girls erupt in cheers. Steeves lands hard, then she’s away. Escaping is her specialty.
Forty-five years later, Steeves’s hands and legs are mapped with scars from that fence, clues to her origin story.
Now ensconced in a fortress that is equally imposing, though far more genteel than Willingdon, Steeves is telling another origin story. She has just been named director of the Native American Studies program at the University of Massachusetts Amherst, and her work as an archeologist seeks to upend long-held notions about indigenous culture in the Americas.
Steeves, who is Cree-Metis, was the first PhD candidate in her field to successfully defend her dissertation using indigenous method and theory. She has spent years building a database of Pleistocene archeological sites that show her ancestors have been in the Americas far longer than previously acknowledged. (The Pleistocene is the geological epoch that lasted from 2.6 million to approximately 12,000 years ago.)
Her work, which challenges the “colonial” legacy of archeology, is considered revolutionary by some, controversial by others. Steeves believes objections to the inclusion of “indigenous ways and methods” in archeology come from “a really strong, and deep-rooted racism in North American anthropology against Native Americans.”
Now 60, Steeves is tall and broad, with a mass of long hair, a figure that is both imposing and soft.
The history of indigenous people in the Americas was manufactured, says Steeves, to make it easier to overlook the atrocities that colonization brought. “When people started coming here to the Americas, they were finding signs of great civilizations, and stories were created to say these sites and this civilization was not built by the indigenous people — they called them the savages, they created the people here as ‘nature’, not as culture. If it’s culture, you can’t massacre them, or kill them, or put a head price on them. But if they are nature, it’s okay to do that.”
When she began her research, Steeves hoped to compile a list of 10 or 20 archeological sites in the western hemisphere older than 11,000 years. She was stunned to find over 400 sites. “Counter to the western stories that we’ve been here 12,000 years, we’ve been here over 60,000 years, likely over 100,000 years, and there is a great deal of evidence to support that.”
She refutes the common narrative of indigenous people as a group that has been culturally erased, wiped out by bad luck, disease and a lack of resistance, both metaphorical and physical.
“I see a different story. A story of persistence.”
She should know. The Cree-Metis girl who threw herself over the fence of the Willingdon school time and time again until she won her freedom, says simply: “I am a survivor of forced cultural assimilation.”
“We were extremely poor,” says Steeves. Born in Whitehorse, she had a childhood cut from the cloth of aboriginal marginalization. “My mom was an alcoholic. My parents split when I was five. My stepdad used to beat the shit out of her.”
By the age of 12, Steeves was running away regularly. She dropped out of school, picked apples, panhandled, and made her way to Vancouver, where she survived as a street kid before landing in Willingdon at age 13.
“My mother, who was 80 per cent native, warned us never to tell anyone we were Indians,” she says. The reason was heartbreaking: Long before Paulette and her siblings were born, her mother had two children who were taken from her by authorities and put up for adoption.
“She never saw them again, and she never, ever got over it,” says Steeves. “Because of that, it was really important to her to hide our Indian-ness.”
Part of racism is who is included and who is excluded, socially, economically and historically. Steeves grew up on the outside, excluded first from her own culture, and also outside of mainstream white culture.
By 21, Steeves had moved to Lillooet, where she worked in a sawmill and gave birth to her first son, Jessie. She had two more children, but found herself trapped in an abusive relationship. Her eldest son was diagnosed with a serious environmental illness. Doctors told her he wouldn’t live beyond the age of six.
Steeves, who had begun to reclaim her heritage — her ancestors are Cree, Sioux and Dutch — sought the counsel of elders, who told her: “You are going to do something really good for Indian people. Not just us Indians here, Indians everywhere. It’s going to be a lot harder than this, so learn from this.”
Steeves was mystified. “Here I was with three children, one of whom was terminally ill. I had a Grade 8 education, a truck and 26 cents. What was I going to do?” . . .
New Jersey’s Student Loan Program is ‘State-Sanctioned Loan-Sharking’
In ProPublica Annie Waldman reports on a government agency working against the good of the public:
Amid a haze of grief after her son’s murder last year, Marcia DeOliveira-Longinetti faced an endless list of tasks — helping the police access Kevin’s phone and email, canceling his subscriptions, credit cards and bank accounts, and arranging his burial in New Jersey.
And then there were his college loans.
When DeOliveira-Longinetti called about his federal loans, an administrator offered condolences and assured her the remaining balance would be written off.
But she got a far different response from a New Jersey state agency that had also lent her son money.
“Please accept our condolences on your loss,” said a letter from the Higher Education Student Assistance Authority to DeOliveira-Longinetti, who had co-signed the loans. “After careful consideration of the information you provided, the Authority has determined that your request does not meet the threshold for loan forgiveness. Monthly bill statements will continue to be sent to you.”
DeOliveira-Longinetti was shocked and confused. After all, the agency features a photo of Governor Chris Christie on its website, and boasts in its brochures that its “singular focus has always been to benefit the students we serve.”
But her experience with the authority, which runs by far the largest state-based student loan program in the country, is hardly an isolated one, an investigation by ProPublica, in collaboration with the New York Times, found.
New Jersey’s loans, which currently total $1.9 billion, are unlike those of any other government lending program for students in the country. They come with extraordinarily stringent rules that can easily lead to financial ruin. Repayments cannot be adjusted based on income, and borrowers who are unemployed or facing other financial hardships are given few breaks.
New Jersey’s loans also carry higher interest rates than similar federal programs. Most significantly, the loans come with a cudgel that even the most predatory for-profit players cannot wield: the power of the state.
New Jersey can garnish wages, rescind state income tax refunds, revoke professional licenses, even take away lottery winnings — all without having to get court approval.
“It’s state-sanctioned loan sharking,” said Daniel Frischberg, a bankruptcy lawyer. “The New Jersey program is set up so that you fail.”
The authority has become even more aggressive in recent years. Interviews with dozens of borrowers, who were among the tens of thousands who have turned to the program, show how the loans have unraveled lives.
The program’s regulations have destroyed families’ credit and forced them to forfeit their salaries. One college graduate declared bankruptcy at age 26 after struggling to repay his debt. The agency filed four simultaneous lawsuits against a 31-year-old paralegal after she fell behind on her payments.
Another borrower, Chris Gonzalez, couldn’t keep up with his loans after he got non-Hodgkin’s lymphoma and was laid off by Goldman Sachs. While the federal government allowed him to suspend his payments because of hardship, New Jersey sued him, seeking nearly $266,000 in payments, and seized a state tax refund he was owed.
One reason for the aggressive tactics is . . .
‘I hated this man more than my rapists’: Woman confronts football coach 18 years after alleged gang rape
Oddly enough, a very heartening story. And I think Brenda Tracy is right: this is a good example of accountability and transparency. The coach faced his faults and acknowledged them.
Published in the New Yorker, the following was delivered as the commencement address at the California Institute of Technology, on Friday, June 10th:
If this place has done its job—and I suspect it has—you’re all scientists now. Sorry, English and history graduates, even you are, too. Science is not a major or a career. It is a commitment to a systematic way of thinking, an allegiance to a way of building knowledge and explaining the universe through testing and factual observation. The thing is, that isn’t a normal way of thinking. It is unnatural and counterintuitive. It has to be learned. Scientific explanation stands in contrast to the wisdom of divinity and experience and common sense. Common sense once told us that the sun moves across the sky and that being out in the cold produced colds. But a scientific mind recognized that these intuitions were only hypotheses. They had to be tested.
When I came to college from my Ohio home town, the most intellectually unnerving thing I discovered was how wrong many of my assumptions were about how the world works—whether the natural or the human-made world. I looked to my professors and fellow-students to supply my replacement ideas. Then I returned home with some of those ideas and told my parents everything they’d got wrong (which they just loved). But, even then, I was just substituting one set of received beliefs for another. It took me a long time to recognize the particular mind-set that scientists have. The great physicist Edwin Hubble, speaking at Caltech’s commencement in 1938, said a scientist has “a healthy skepticism, suspended judgement, and disciplined imagination”—not only about other people’s ideas but also about his or her own. The scientist has an experimental mind, not a litigious one.
As a student, I found this to be more than a way of thinking. It was a way of being—a weird way of being. You are supposed to have skepticism and imagination, but not too much. You are supposed to suspend judgment, yet exercise it. Ultimately, you hope to observe the world with an open mind, gathering facts and testing your predictions and expectations against them. Then you make up your mind and either affirm or reject the ideas at hand. But you also hope to accept that nothing is ever completely settled, that all knowledge is just probable knowledge. A contradictory piece of evidence can always emerge. Hubble said it best when he said, “The scientist explains the world by successive approximations.”
The scientific orientation has proved immensely powerful. It has allowed us to nearly double our lifespan during the past century, to increase our global abundance, and to deepen our understanding of the nature of the universe. Yet scientific knowledge is not necessarily trusted. Partly, that’s because it is incomplete. But even where the knowledge provided by science is overwhelming, people often resist it—sometimes outright deny it. Many people continue to believe, for instance, despite massive evidence to the contrary, that childhood vaccines cause autism (they do not); that people are safer owning a gun (they are not); that genetically modified crops are harmful (on balance, they have been beneficial); that climate change is not happening (it is).
Vaccine fears, for example, have persisted despite decades of research showing them to be unfounded. Some twenty-five years ago, a statistical analysis suggested a possible association between autism and thimerosal, a preservative used in vaccines to prevent bacterial contamination. The analysis turned out to be flawed, but fears took hold. Scientists then carried out hundreds of studies, and found no link. Still, fears persisted. Countries removed the preservative but experienced no reduction in autism—yet fears grew. A British study claimed a connection between the onset of autism in eight children and the timing of their vaccinations for measles, mumps, and rubella. That paper was retracted due to findings of fraud: the lead author had falsified and misrepresented the data on the children. Repeated efforts to confirm the findings were unsuccessful. Nonetheless, vaccine rates plunged, leading to outbreaks of measles and mumps that, last year, sickened tens of thousands of children across the U.S., Canada, and Europe, and resulted in deaths.
People are prone to resist scientific claims when they clash with intuitive beliefs. They don’t see measles or mumps around anymore. They do see children with autism. And they see a mom who says, “My child was perfectly fine until he got a vaccine and became autistic.”
Now, you can tell them that correlation is not causation. You can say that children get a vaccine every two to three months for the first couple years of their life, so the onset of any illness is bound to follow vaccination for many kids. You can say that the science shows no connection. But once an idea has got embedded and become widespread, it becomes very difficult to dig it out of people’s brains—especially when they do not trust scientific authorities. And we are experiencing a significant decline in trust in scientific authorities.
The sociologist Gordon Gauchat studied U.S. survey data from 1974 to 2010 and found some deeply alarming trends. Despite increasing education levels, the public’s trust in the scientific community has been decreasing. This is particularly true among conservatives, even educated conservatives. In 1974, conservatives with college degrees had the highest level of trust in science and the scientific community. Today, they have the lowest.
Today, we have multiple factions putting themselves forward as what Gauchat describes as their own cultural domains, “generating their own knowledge base that is often in conflict with the cultural authority of the scientific community.” Some are religious groups (challenging evolution, for instance). Some are industry groups (as with climate skepticism). Others tilt more to the left (such as those that reject the medical establishment). As varied as these groups are, they are all alike in one way. They all harbor sacred beliefs that they do not consider open to question.
To defend those beliefs, . . .