Later On

A blog written for those whose interests more or less match mine.

Archive for the ‘Politics’ Category

Calvin, Man of Action


Written by Leisureguy

21 September 2021 at 11:36 am

At Rikers Island, Inmates Locked in Showers Without Food and Defecating in Bags


The US is really amazing. New York City is supposed to be a city of wealth and culture and represents to much of the world what the US is. In The Intercept, Nick Pinto reports on how New York City treats those entrusted to its care:

JAIL OFFICIALS KNEW that state legislators were going to be touring Rikers Island on September 13. But if they made any effort to disguise the degree of degradation and danger that pervades New York City’s jail complex, it didn’t show. Lawmakers and the people who accompanied them returned from their visit visibly shaken.

“There’s a segregated intake unit that we walked through where they have people held in showers,” said Alice Fontier, managing director for Neighborhood Defender Services, who toured one Rikers building, the Otis Bantum Correctional Center, with lawmakers. “It’s about 2 feet wide by 6 feet. There is no toilet. They’ve given them plastic bags to use for feces and urine. And they’re sitting in the cells with their own bodily waste locked into these conditions. This is the most horrific thing I’ve seen in my life. I’ve been coming to this jail since 2008. This is unlike anything that has ever happened here.”

Rikers has been a festering wound in New York City for about as long as it has existed as a jail complex. Cut off from the rest of the city by water on all sides and accessible only by a long causeway, New York’s island gulag has always been out of sight and out of mind. Periodically, a snapshot of conditions inside will escape the island’s event horizon, as in 2014 when then-U.S. Attorney Preet Bharara issued a scathing report describing Rikers as a place “more inspired by ‘Lord of the Flies’ than any legitimate philosophy of humane detention.”

Bharara’s report helped buttress the movement to close Rikers once and for all, a movement to which Mayor Bill de Blasio was a late joiner in 2017, during his reelection campaign.

Since that time, de Blasio has responded to alarms about conditions on Rikers Island by falling back on his commitment to close the complex — but only closing it sometime years in the future, long after he has left office. The mayor has not visited the island jails at all since winning his second term.

Recent events, though, forced de Blasio to pay closer attention. In the last eight months, 10 people have died in custody on the island, five of them taking their own lives. Covid-19 is once again on the rise on Rikers. On September 10, the chief medical officer on Rikers wrote a letter to New York City Council, warning that “in 2021 we have witnessed a collapse in basic jail operations, such that today I do not believe the City is capable of safely managing the custody of those it is charged with incarcerating in its jails.”

As de Blasio belatedly rolls out a plan for addressing the crisis on Rikers, he is casting responsibility for the condition in his jails variously on the Covid-19 pandemic, prison guards, state government, prosecutors, and the judiciary. But while the unfolding human catastrophe is indeed a tragedy with deep origins and many authors, it is also the predictable conclusion of de Blasio’s own policies and politics.

Even as he has taken credit for the long-term plan to eventually close Rikers, the mayor has embraced a pressure campaign by his police commissioner that seeks to roll back carceral system reforms and re-entrench bail and gratuitous pretrial detention in New York’s criminal system.

In the conscience-shocking crisis on Rikers Island, de Blasio is reaping the whirlwind for his acquiescence to an agenda of mass incarceration.

MUCH OF THE coverage of the crisis on Rikers has focused on a cascading staffing crisis. In recent weeks, accounts circulated of housing units going whole days without any guards at all. By the city government’s estimates, on any given day, fully 35 percent of staff are unavailable to work. On September 15, according to New York City officials, 789 jail employees called in sick, 68 were out for a “personal emergency,” and 93 were simply absent without leave.

As guards sick out, their colleagues find their own working conditions declining even further. Corrections officers increasingly work double, triple, and even quadruple shifts. On many housing units, there are no officers on the floor. The number of assaults — against incarcerated people and staff alike — is going up. . .

Continue reading. There’s much more and no paywall.

Written by Leisureguy

19 September 2021 at 4:51 pm

New Evidence of Corruption at EPA Chemicals Division


Sharon Lerner reports in The Intercept:

Scientists at the Environmental Protection Agency have provided The Intercept with new information showing that senior staff have made chemicals appear safer — sometimes dodging restrictions on their use — by minimizing the estimates of how much is released into the environment.

The EPA gauges the potential risk posed by a chemical using two measures: how toxic the agency considers it and how much of the substance the public will likely be exposed to. Whistleblowers from the EPA’s New Chemicals Division have already provided The Intercept with evidence that managers and other officials were pressuring them to assess chemicals to be less toxic than they actually are — and sometimes removing references to their harms from chemical assessments.

Now new documents, including meeting summaries, internal emails, and screenshots from the EPA’s computer system, along with interviews with whistleblowers and other EPA scientists, show that the agency’s New Chemicals Division has avoided calculating the exposure to — and thus the risk posed by — hundreds of chemicals and have repeatedly resisted calls to change that policy even after scientists have shown that it puts the public at risk.

Call It “Negligible”

Since 1995, the EPA has operated under the assumption that chemicals emitted below certain cutoff levels are safe. Whether a toxic chemical is emitted through the smokestacks of an industrial plant, via leaks in its machinery, or from a leaky landfill into groundwater, the agency requires scientists to quantify the precise risk posed by the chemical only if the release (and thus likely human exposure) reaches certain thresholds. If the releases from both smokestacks and leaks are below the thresholds, the chemical is given a pass. In recent years, however, scientists have shown that some of the chemicals allowed onto the market using this loophole do in fact present a danger, particularly to the people living in “fence-line communities” near industrial plants.

In 2018, several EPA scientists became worried that the use of these exposure thresholds could leave the public vulnerable to health risks. Their concern was heightened by an email that a manager in the Office of Pollution Prevention and Toxics sent in October of that year, instructing the scientists to change the language they used to classify chemicals that were exempted from risk calculation because they were deemed to have low exposure levels. Up to that point, they had described them in reports as “below modeling thresholds.” From then on, the manager explained, the scientists were to include the words “expects to be negligible” — a phrase that implies there’s no reason for concern.

Several scientists who worked on calculating chemical risks believed that there was in fact reason for concern and that the use of the thresholds leaves the public vulnerable to health effects, including cancer. And after being instructed to refer to exposures they hadn’t actually measured or modeled as “negligible,” the scientists proposed dropping or lowering the cutoffs and running the calculations for each individual chemical — a task that would add only minutes to the assessment process. But the managers refused to heed their request, which would have not only changed how chemicals were assessed moving forward but would have also had implications for hundreds of assessments in the past.

“They told us that the use of the thresholds was a policy decision and, as such, we could not simply stop applying them,” one of the scientists who worked in the office explained to The Intercept.

The issue resurfaced in May 2020 when a scientist presented the case of a single chemical the agency was then considering allowing onto the market. Although it fell into the “negligible” category using the cutoffs that had been set decades previously, when the scientists calculated the exposure levels using an alternate EPA model, which is designed to gauge the risk of airborne chemicals, it became clear that the chemical did pose a risk of damaging the human nervous system. The chemical is still going through the approval process.

In February, a small group of scientists reviewed the safety thresholds set by the EPA for all of the 368 new chemicals submitted to the agency in 2020. They found that more than half could pose risks even in cases in which the agency had already described exposure as “negligible” and thus had not calculated specific risk. Again, the scientists brought the exposure threshold issue to the attention of managers in the New Chemicals Division, briefing them on their analysis and requesting that the use of the outdated cutoffs be stopped. But they received no response to their proposal. Seven months later, the thresholds remain in use and the risk posed by chemicals deemed to have low exposure levels is still not being calculated and included in chemical assessments, according to EPA scientists who spoke with The Intercept.

The internal struggles over exposure present yet another example of managers and senior staff working to undermine the agency’s mission, according to the EPA scientists. “Our work on new chemicals often felt like an exercise in finding ways to approve new chemicals rather than reviewing them for approval,” said one of two scientists who filed new disclosures to the EPA inspector general on August 31 about the issue. The detailed account of corruption within the New Chemicals Division that four whistleblowers previously submitted to members of Congress, the EPA inspector general, and The Intercept also included information on the ongoing problems caused by the use of the exposure thresholds.

“It all comes down to money,” said Kyla Bennett, director of science policy for Public Employees for Environmental Responsibility, or PEER, the organization representing the whistleblowers, who pointed out that risk values above the agency’s accepted cutoffs require the EPA to impose limits that may make a chemical harder to use — and sell. “Companies don’t want warning labels, they don’t want restrictions.”

It’s unclear why some senior staff and managers within the EPA’s New Chemicals Division seem to feel an obligation not to burden the companies they regulate with restrictions. “That’s the $64,000 question,” said Bennett, who pointed out that EPA staffers may enhance their post-agency job prospects within the industry if they stay in the good graces of chemical companies. She also noted that managers’ performance within the division is assessed partly based on how many chemicals they approve. “The bean counting is driving their actions,” said Bennett. “The performance metrics should be, how many chemicals did you prevent from going onto the market, rather than how many did you get onto the market.”

In response to questions about this story, the EPA  . . .

Continue reading. There’s more, and no paywall.

Written by Leisureguy

19 September 2021 at 4:42 pm

When Wall Street came to coal country: how a big-money gamble scarred Appalachia


Mountaintop-removal coal mining in West Virginia

Evan Osnos reports in the Guardian:

Once or twice a generation, Americans rediscover Appalachia. Sometimes, they come to it through caricature – the cartoon strip Li’l Abner or the child beauty pageant star Honey Boo Boo or, more recently, Buckwild, a reality show about West Virginia teenagers, which MTV broadcast with subtitles. Occasionally, the encounter is more compassionate. In 1962, the social critic Michael Harrington published The Other America, which called attention to what he described as a “vicious circle of poverty” that “twists and deforms the spirit”.

Around the turn of this century, hedge funds in New York and its environs took a growing interest in coalmines. Coal never had huge appeal to Wall Street investors – mines were dirty, old-fashioned and bound up by union contracts that made them difficult to buy and sell. But in the late 1990s, the growing economies of Asia began to consume more and more energy, which investors predicted would drive up demand halfway around the world, in Appalachia. In 1997, the Hobet mine, a 25-year-old operation in rural West Virginia, was acquired for the first time by a public company, Arch Coal. It embarked on a major expansion, dynamiting mountaintops and dumping the debris into rivers and streams. As the Hobet mine grew, it consumed the ridges and communities around it. Seen from the air, the mine came to resemble a giant grey amoeba – 22 miles from end to end – eating its way across the mountains.

Up close, the effects were far more intimate. When Wall Street came to coal country, it triggered a cascade of repercussions that were largely invisible to the outside world but of existential importance to people nearby.

Down a hillside from the Hobet mine, the Caudill family had lived and hunted and farmed for a century. Their homeplace, as they called it, was 30 hectares (75 acres) of woods and water. The Caudills were hardly critics of mining; many were miners themselves. John Caudill was an explosives expert until one day, in the 30s, a blast went off early and left him blind. His mining days were over, but his land was abundant, and John and his wife went on to have 10 children. They grew potatoes, corn, lettuce, tomatoes, beets and beans; they hunted game in the forests and foraged for berries and ginseng. Behind the house, a hill was dense with hemlocks, ferns and peach trees.

One by one, the Caudill kids grew up and left for school and work. They settled into the surrounding towns, but stayed close enough to return to the homeplace on weekends. John’s grandson, Jerry Thompson, grew up a half-hour down a dirt road. “I could probably count on one hand the number of Sundays I missed,” he said. His grandmother’s menu never changed: fried chicken, mashed potatoes, green beans, corn and cake. “You’d just wander the property for hours. I would have a lot of cousins there, and we would ramble through the barns and climb up the mountains and wade in the creek and hunt for crawdads.”

Before long, the Hobet mine surrounded the land on three sides, and Arch Coal wanted to buy the Caudills out. Some were eager to sell. “We’re not wealthy people, and some of us are better off than others,” Thompson said. One cousin told him, “I’ve got two boys I got to put through college. I can’t pass this up because I’ll never see $50,000 again.” He thought, “He’s right; it was a good decision for him.”

In the end, nine family members agreed to sell, but six refused, and Jerry was one of them. Arch sued all of them, arguing that storing coalmine debris constituted, in legal terms, “the highest and best use of the property”. The case reached the West Virginia supreme court, where a justice asked, sceptically, “The highest and best use of the land is dumping?”

Phil Melick, a lawyer for the company, replied: “It has become that.” He added: “The use of land changes over time. The value of land changes over time.”

Surely, the justice said, the family’s value of the property was not simply economic? It was, Melick maintained. “It has to be measured economically,” he said, “or it can’t be measured at all.”


To their surprise, the Caudills won their case, after a fashion. They could keep 10 hectares – but the victory was fleeting. Beneath their feet, the land was becoming unrecognisable. Chemicals produced by the mountaintop mine were redrawing the landscape in a bizarre tableau. In streams, the leaves and sticks developed a thick copper crust from the buildup of carbonate, and rocks turned an inky black from deposits of manganese. In the Mud River, which ran beside the Caudills’ property, a US Forest Service biologist collected fish larvae with two eyes on one side of the head. He traced the disfigurements to selenium, a byproduct of mining, and warned, in a report, of an ecosystem “on the brink of a major toxic event”. (In 2010, the journal Science published a study of 78 West Virginia streams near mountaintop-removal mines, which found that nearly all of them had elevated levels of selenium.)

This was more than just the usual tradeoff between profit and pollution, another turn in the cycle of industry and cleanup. Mountaintop removal was, fundamentally, a more destructive realm of technology. It had barely existed until the 90s, and it took some time before scientists could measure the effects on the land and the people. For ecologists, the southern Appalachians was a singular domain – one of the most productive, diverse temperate hardwood forests on the planet. For aeons, the hills had contained more species of salamander than anywhere else, and a lush canopy that attracts neotropical migratory birds across thousands of miles to hatch their next generation. But a mountaintop mine altered the land from top to bottom: after blasting off the peaks – which miners call the “overburden” – bulldozers pushed the debris down the hillsides, where it blanketed the streams and rivers. Rainwater filtered down through a strange human-made stew of metal, pyrite, sulphur, silica, salts and coal, exposed to the air for the first time. The rain mingled with the chemicals and percolated down the hills, funnelling into the brooks and streams and, finally, into the rivers on the valley floor, which sustained the people of southern West Virginia. 

Emily Bernhardt, a Duke University biologist, who spent years tracking the effects of the Hobet mine, told me: “The aquatic insects coming out of these streams are loaded with selenium, and then the spiders that are eating them become loaded with selenium, and it causes deformities in fish and birds.” The effects distorted the food chain. Normally, tiny insects hatched in the water would fly into the woods, sustaining toads, turtles and birds. But downstream, scientists discovered that some species had been replaced by flies usually found in wastewater treatment plants. By 2009, the damage was impossible to ignore. In a typical study, biologists tracking a migratory bird called the cerulean warbler found that its population had fallen by 82% in 40 years. The 2010 report in Science concluded that the impacts of mountaintop-removal mining on water, biodiversity and forest productivity were “pervasive and irreversible”. Mountaintop mines had buried more than 1,000 miles of streams across Appalachia, and, according to the EPA, altered 2,200 sq miles of land – an area bigger than Delaware.

Before long, scientists discovered impacts on the people, too. Each explosion at the top of a mountain released elements usually kept underground – lead, arsenic, selenium, manganese. The dust floated down on to the drinking water, the back-yard furniture, and through the open windows. Researchers led by Michael Hendryx, a professor of public health at West Virginia University, published startling links between mountaintop mines and health problems of those in proximity to it, including cancer, cardiovascular disease and birth defects. Between 1979 and 2005, the 70 Appalachian counties that relied most on mining had recorded, on average, more than 2,000 excess deaths each year. Viewed one way, those deaths were the cost of progress, the price of prosperity that coal could bring. But Hendryx also debunked that argument: the deaths cost $41bn a year in expenses and lost income, which was $18bn more than coal had earned the counties in salaries, tax revenue and other economic benefits. Even in the pure economic terms that the companies used, Hendryx observed, mountaintop mining had been a terrible deal for the people who lived there.


One afternoon, I hiked up through the woods behind the Caudills’ house to see the changes in the land. By law, mines are required to “remediate” their terrain, returning it to an approximation of its former condition. But, far from the public eye, the standards can be comically lax. After climbing through the trees for a while, I emerged into a sun-drenched bowl of . . .

Continue reading. There’s much more.

Written by Leisureguy

18 September 2021 at 11:26 am

The Battle of Antietam and the endurance of the Confederate ideal


Heather Cox Richardson writes:

One hundred and fifty nine years ago this week, in 1862, 75,000 United States troops and about 38,000 Confederate troops massed along Antietam Creek near Sharpsburg, Maryland.

After a successful summer of fighting, Confederate general Robert E. Lee had crossed the Potomac River into Maryland to bring the Civil War to the North. He hoped to swing the slave state of Maryland into rebellion and to weaken Lincoln’s war policies in the upcoming 1862 elections. For his part, Union general George McClellan hoped to finish off the southern Army of Northern Virginia that had snaked away from him all summer.

The armies clashed as the sun rose about 5:30 on the clear fall morning of September 17, 159 years ago today. For twelve hours the men slashed at each other. Amid the smoke and fire, soldiers fell. Twelve hours later, more than 2000 U.S. soldiers lay dead and more than 10,000 of their comrades were wounded or missing. Fifteen hundred Confederates had fallen in the battle, and another 9000 or so were wounded or captured. The United States had lost 25% of its fighting force; the Confederates, 31%. The First Texas Infantry lost 82% of its men.

That slaughter was brought home to northern families in a novel way after the battle. Photographer Alexander Gardner, working for the great photographer Mathew Brady, brought his camera to Antietam two days after the guns fell silent. Until Gardner’s field experiment, photography had been limited almost entirely to studios. People sent formal photos home and recorded family images for posterity, as if photographs were portraits.

Taking his camera outside, Gardner recorded seventy images of Antietam for people back home. His stark images showed bridges and famous generals, but they also showed rows of bodies, twisted and bloating in the sun as they awaited burial. By any standards these war photos were horrific, but to a people who had never seen anything like it before, they were earth-shattering.

White southern men had marched off to war in 1861 expecting that they would fight and win a heroic battle or two and that their easy victories over the northerners they dismissed as emasculated shopkeepers would enable them to create a new nation based in white supremacy. In the 1850s, pro-slavery lawmakers had taken over the United States government, but white southerners were a minority and they knew it. When the election of 1860 put into power lawmakers and a president who rejected their worldview, they decided to destroy the nation.

Eager to gain power in the rebellion, pro-secession politicians raced to extremes, assuring their constituencies that they were defending the true nature of a strong new country and that those defending the old version of the United States would never fight effectively.

On March 21, 1861, the future vice president of the Confederacy, Alexander Stephens, laid out the world he thought white southerners should fight for. He explained that the Founders were wrong to base the government on the principle that humans were inherently equal, and that northerners were behind the times with their adherence to the outdated idea that “the negro is equal, and…entitled to equal privileges and rights with the white man.” Confederate leaders had corrected the Founders’ error. They had rested the Confederacy on the “great truth” that “the negro is not equal to the white man; that slavery, subordination to the superior race, is his natural and normal condition.”

White southern leaders talked easily about a coming war, assuring prospective soldiers that defeating the United States Army would be a matter of a fight or, perhaps, two. South Carolina Senator James Chesnut Jr. assured his neighbors that there would be so few casualties he would be happy to drink all the blood shed in a fight between the South and the North. And so, poorer white southerners marched to war.

The July 1861 Battle of Bull Run put the conceit of an easy victory to rest. Although the Confederates ultimately routed the U.S. soldiers, the southern men were shocked at what they experienced. “Never have I conceived of such a continuous, rushing hailstorm of shot, shell, and musketry as fell around and among us for hours together,” one wrote home. “We who escaped are constantly wondering how we could possibly have come out of the action alive.”

Northerners, too, had initially thought the war against the blustering southerners would be quick and easy, so quick and easy that some congressmen brought picnics to Bull Run to watch the fighting, only to get caught in the rout as soldiers ditched their rucksacks and guns and ran back toward the capital. Those at home, though, could continue to imagine the war as a heroic contest.

They could elevate the carnage, that is, until Mathew Brady exhibited Gardner’s images of Antietam at his studio in New York City. People who saw the placard announcing “The Dead of Antietam” and climbed the stairs up to Brady’s rooms to see the images found that their ideas about war were changed forever.

“The dead of the battle-field come up to us very rarely, even in dreams,” one reporter mused. “We see the list in the morning paper at breakfast, but dismiss its recollection with the coffee. There is a confused mass of names, but they are all strangers; we forget the horrible significance that dwells amid the jumble of type.” But Gardner’s photographs erased the distance between the battlefield and the home front. They brought home the fact that every name on a casualty list “represents a bleeding, mangled corpse.” “If [Gardner] has not brought bodies and laid them in our dooryards and along the streets, he has done something very like it,” the shocked reporter commented.

The horrific images of Antietam showed to those on the home front the real cost of war they had entered with bluster and flippant assurances that it would be bloodless and easy. Southern politicians had promised that white rebels fighting to create a nation whose legal system enshrined white supremacy would easily overcome a mongrel army defending the principle of human equality.

The dead at Antietam’s Bloody Lane and Dunker Church proved they were wrong. The Battle of Antietam was enough of a Union victory to allow President Abraham Lincoln to issue the preliminary emancipation proclamation, warning southern states that on January 1, 1863, “all persons held as slaves within any State, or designated part of a State,” where people still fought against the United States, “shall be then, thenceforward, and forever free; and the…government of the United States…will recognize and maintain the freedom of such persons….”

Lincoln’s proclamation meant that anti-slavery England would not formally enter the war on the side of the Confederates, dashing their hopes of foreign intervention, and in November 1863, Lincoln redefined the war as one not simply to restore the Union, but to protect a nation “conceived in liberty, and dedicated to the proposition that all men are created equal.”

To that principle, northerners and Black southerners rallied, despite the grinding horror of the battlefields, and in 1865, they defeated the Confederates.

But they did not defeat the idea the Confederates fought, killed, and died for: a nation in which the law distinguishes among people according to the color of their skin. Today, once again, . . .

Continue reading.

Written by Leisureguy

18 September 2021 at 11:09 am

A look inside The War for Gaul: A New Translation


The following is an excerpt from The War for Gaul: A New Translation, by Julius Caesar, translated by James J. O’Donnell, professor of history, philosophy, and religious studies and University Librarian at Arizona State University, whose books include Pagans, The Ruin of the Roman Empire, and Augustine: A New Biography.

Caesar deserves to be compared with Alexander the Great. No one before or since comes close. Command, conquest, and a lasting legacy set them apart from the likes of mere strivers like Napoleon or Hitler. And the war in Gaul was the making of Caesar.

Isn’t that what you would expect a translator of Caesar to say? It’s all entirely true and many have said as much before. But admiring him without understanding him makes us complicit in his ill-doing as well. This translation of his account of the war in Gaul will try to restore your objectivity and freedom of judgment. Make of him what you will.

***

Cormac McCarthy should be the one to write the story of Caesar in Gaul. As insensitive and brutal as McCarthy’s Americans afoot in a land of native and Spanish peoples they wrongly took for uncivilized, Caesar’s armies had little excuse for what they did and they preferred not to remember it once done. But Caesar told their story coolly. Though people die in droves, horribly, on these pages, the Latin word for “blood” appears only twice, near the end.

The facts of the story must be made clear. A general with something to prove, a career to make, and plunder to be harvested for financial gain was handed an army and a province and a guarantee he would have both for long enough to make serious mischief. He spent nine years battering his way through his province and the rich and promising lands beyond, bullying allies and brutalizing the resistant. By the time he was through, the lands and peoples that obeyed his commands—and those of his successors for another half millennium—had been vastly increased, and he was poised to make himself master of the world, or at least the world that stretched from the English Channel to Damascus.

He had no business doing any of this. His colleagues admired his chutzpah, knowing that he went far beyond every reasonable moral or legal boundary. His excesses were possible because he was in competition with two other monsters, one of whom fell in battle at the opposite end of the world while Caesar was in Gaul, the other of whom let Caesar go too long, then fought him, then fled, and ended up hacked to death by the minions of a king who thought it prudent to curry favor with Caesar.

But the book Caesar wrote is magnificent: amoral, certainly, but clear, vivid, and dramatic, a thing to be remembered and read for the ages. Books about war often make us sympathize with the wretchedness of the victims. This one forces us to be Romans of the kind its author wanted to be. We read it nervously, cheering for a bullfight we didn’t want to attend and don’t approve of, admiring the grace of the awesome minuet that floods the sand with blood. There is no denying that this is a great work of literature, one of the greatest, and at the same time, there should be no denying that it is a bad man’s book about his own bad deeds. I think it is the best bad man’s book ever written.

But many will resist my saying the plain fact. Because his carven prose depends on a deliberately restrained vocabulary and a terse, correct style, the book has been thought suitable for schoolboys for many generations, until about the time Latin schoolmasters discovered finally that women can read too. Now the book is in disfavor, for the wrong reasons: because it is about war, and because it is too easy. But we all need to read books about war if we are to avoid dying in one, and this book is anything but easy.

The best reasons for not teaching this book to the young are . . .

Continue reading.

Written by Leisureguy

16 September 2021 at 12:49 pm

A seemingly simple problem that has persisted unsolved


The cartoon is from the late 1800s. It seems odd that a problem with an apparently simple solution would prove so persistent. It’s as though there are forces working against a solution.

Written by Leisureguy

15 September 2021 at 10:48 am

The enormous costs and elusive benefits of the war on terror: 20 years, $6 trillion, 900,000 lives


Dylan Matthews reports in Vox:

On the evening of September 11, 2001, hours after two hijacked airliners had destroyed the World Trade Center towers and a third had hit the Pentagon building, President George W. Bush announced that the country was embarking on a new kind of war.

“America and our friends and allies join with all those who want peace and security in the world, and we stand together to win the war against terrorism,” Bush announced in a televised address to the nation.

It was Bush’s first use of the term that would come to define his presidency and deeply shape those of his three successors. The global war on terror, as the effort came to be known, was one of the most expansive and far-reaching policy initiatives in modern American history, and certainly the biggest of the 2000s.

It saw the US invade and depose the governments of two nations and engage in years- or decades-long occupations of each; the initiation of a new form of warfare via drones spanning thousands of miles of territory from Pakistan to Somalia to the Philippines; the formalization of a system of detention without charge and pervasive torture of accused militants; numerous smaller raids by special forces teams around the world; and major changes to air travel and border security in the US proper.

The “war on terror” is a purposely vague term. President Barack Obama famously rejected it in a 2013 speech — favoring instead “a series of persistent, targeted efforts to dismantle specific networks of violent extremists.”

But 9/11 signaled the beginning of a distinct policy regime from the one that preceded it, and a regime that exists in many forms to the present day, even with the US exit from Afghanistan.

Over the past 20 years, the costs of this new policy regime — costs in terms of lives lost, money spent, people and whole communities displaced, bodies tortured — have become clear. It behooves us, then, to try to answer a simple yet vast question: Was it worth it?

A good-faith effort to answer this question — to tally the costs and benefits on the ledger and not just resort to one’s ideological priors — is more challenging than you’d think. That’s largely because it involves quantifying the inherently unquantifiable. If, as proponents argue, the war on terror kept America safe, how do you quantify the psychological value of not being in a state of constant fear of the next attack? What about the damage of increased Islamophobia and violent targeting of Muslims (and those erroneously believed to be Muslims) stoked by the war on terror? There are dozens more unquantifiable purported costs and benefits like these.

But some things can be measured. There have been no 9/11-scale terrorist attacks in the United States in the past 20 years. Meanwhile, according to the most recent estimates from Brown University’s Costs of War Project, at least 897,000 people around the world have died in violence that can be classified as part of the war on terror; at least 38 million people have been displaced due to these wars; and the effort has cost the US at least $5.8 trillion, not including about $2 trillion more needed in health care and disability coverage for veterans in decades to come.

When you lay it all out on paper, an honest accounting of the war on terror yields a dismal conclusion: Even with an incredibly generous view of the war on terror’s benefits, the costs have vastly exceeded them. The past 20 years of war represent a colossal failure by the US government, one it has not begun to reckon with or atone for.

We are now used to the fact that the US government routinely bombs foreign countries with which it is not formally or even informally at war, in the name of killing terrorists. We are used to the fact that the National Security Agency works with companies like Facebook and Google to collect our private information en masse. We are used to the fact that 39 men are sitting in Guantanamo Bay, almost all detained indefinitely without trial.

These realities were not inevitable. They were chosen as part of a policy regime that has done vastly more harm than good.

What America and the world might have gained from the war on terror

Before going further, it’s important to define our terms. . . 

Continue reading. There’s much more, and it’s a harsh indictment.

Why Americans Die So Much


Derek Thompson writes in the Atlantic:

America has a death problem.

No, I’m not just talking about the past year and a half, during which COVID-19 deaths per capita in the United States outpaced those in similarly rich countries, such as Canada, Japan, and France. And I’m not just talking about the past decade, during which drug overdoses skyrocketed in the U.S., creating a social epidemic of what are often called “deaths of despair.”

I’m talking about the past 30 years. Before the 1990s, average life expectancy in the U.S. was not much different than it was in Germany, the United Kingdom, or France. But since the 1990s, American life spans started falling significantly behind those in similarly wealthy European countries.

According to a new working paper released by the National Bureau of Economic Research, Americans now die earlier than their European counterparts, no matter what age you’re looking at. Compared with Europeans, American babies are more likely to die before they turn 5, American teens are more likely to die before they turn 20, and American adults are more likely to die before they turn 65. At every age, living in the United States carries a higher risk of mortality. This is America’s unsung death penalty, and it adds up. Average life expectancy surged above 80 years old in just about every Western European country in the 2010s, including Portugal, Spain, France, Italy, Germany, the U.K., Denmark, and Switzerland. In the U.S., by contrast, the average life span has never exceeded 79—and now it’s just taken a historic tumble.

Why is the U.S. so much worse than other developed countries at performing the most basic function of civilization: keeping people alive?

“Europe has better life outcomes than the United States across the board, for white and Black people, in high-poverty areas and low-poverty areas,” Hannes Schwandt, a Northwestern University professor who co-wrote the paper, told me. “It’s important that we collect this data, so that people can ask the right questions, but the data alone does not tell us what the cause of this longevity gap is.”

Finding a straightforward explanation is hard, because there are so many differences between life in the U.S. and Europe. Americans are more likely to kill one another with guns, in large part because Americans have more guns than residents of other countries do. Americans die more from car accidents, not because our fatality rate per mile driven is unusually high but because we simply drive so much more than people in other countries. Americans also have higher rates of death from infectious disease and pregnancy complications. But what has that got to do with guns, or commuting?

By collecting data on American life spans by ethnicity and by income at the county level—and by comparing them with those of European countries, locality by locality—Schwandt and the other researchers made three important findings.

First, Europe’s mortality rates are shockingly similar between rich and poor communities. Residents of the poorest parts of France live about as long as people in the rich areas around Paris do. “Health improvements among infants, children, and youth have been disseminated within European countries in a way that includes even the poorest areas,” the paper’s authors write.

But in the U.S., which has the highest poverty and inequality of just about any country in the Organization for Economic Cooperation and Development, where you live is much more likely to determine when you’ll die. Infants in the U.S. are considerably more likely to die in the poorest counties than in the richest counties, and this is true for both Black and white babies. Black teenagers in the poorest U.S. areas are roughly twice as likely to die before they turn 20, compared with those in the richest U.S. counties. In Europe, by contrast, the mortality rate for teenagers in the richest and poorest areas is exactly the same—12 deaths per 100,000. In America, the problem is not just that poverty is higher; it’s that the effect of poverty on longevity is greater too.

Second, even rich Europeans are outliving rich Americans. “There is an American view that egalitarian societies have more equality, but it’s all one big mediocre middle, whereas the best outcomes in the U.S. are the best outcomes in the world,” Schwandt said. But this just doesn’t seem to be the case for longevity. White Americans living in the richest 5 percent of counties still die earlier than Europeans in similarly low-poverty areas; life spans for Black Americans were shorter still. (The study did not examine other American racial groups.) “It says something negative about the overall health system of the United States that even after we grouped counties by poverty and looked at the richest 10th percentile, and even the richest fifth percentile, we still saw this longevity gap between Americans and Europeans,” he added. In fact, Europeans in extremely impoverished areas seem to live longer than Black or white Americans in the richest 10 percent of counties.

Third,  . . .

Continue reading. There’s more, including this interesting factoid:

Air pollution has declined more than 70 percent since the 1970s, according to the EPA, and most of that decline happened during the 30-year period of this mortality research.

Related, via a post this morning by Kevin Drum, who notes:

The US death rate from COVID-19 is no longer skyrocketing, but it’s still going up. Our mortality rate is 150% above Britain and more than 1000% higher than Germany.

I imagine the primary causes are widespread refusal (especially in Red states) to wear masks, to avoid crowds, and to be vaccinated, all obvious steps that significantly reduce the likelihood of infection and thus reduce the likelihood of death.

Note this headline in the NY Times this morning: “The U.S. is falling to the lowest vaccination rates of the world’s wealthiest democracies.” From that article:

. . . Canada leads the G7 countries in vaccination rates, with almost three-quarters of its population at least partially vaccinated as of Thursday, according to Our World in Data. France, Italy and Britain follow, with percentages between 70 and 73. Germany’s rate is just ahead of Japan’s, at around 65 percent.

The U.S. vaccination curve has leveled dramatically since an initial surge in the first half of this year, when the vaccine first became widely available. In a push to vaccinate the roughly 80 million Americans who are eligible for shots but have not gotten them, President Biden on Thursday mandated that two-thirds of American workers, including health care workers and the vast majority of federal employees, be vaccinated against the coronavirus.

Written by Leisureguy

13 September 2021 at 1:11 pm

After 9/11, a rush of national unity. Then, quickly, more and new divisions.


Dan Balz had an interesting column in the Washington Post yesterday. (The gift link I used bypasses the paywall.) The column begins:

On Monday, the leaders of Congress are to gather with colleagues at noon for a bipartisan ceremony marking the terrorist attacks of Sept. 11, 2001. It will be reminiscent of the gathering on the night of the attacks, when members of Congress, many holding small American flags, stood on the Capitol steps and spontaneously sang “God Bless America.” But so much has changed.

Twenty years ago, members of Congress were joined in a determined and resilient expression of national unity at an unprecedented moment in the nation’s history, a day that brought deaths and heroism but also shock, fear and confusion. Monday’s ceremony will no doubt be somber in its remembrance of what was lost that day, but it will come not as an expression of a united America but simply as a momentary cessation in political wars that rage and have deepened in the years since those attacks.

In a video message to Americans released Friday, President Biden spoke of how 9/11 had united the country and said that moment represented “America at its best.” He called such unity “our greatest strength” while noting it is “all too rare.” The unity that followed the attacks didn’t last long. Americans reverted more quickly than some analysts expected to older patterns of partisanship. With time, new divisions over new issues have emerged, and they make the prospect of a united nation ever more distant.

On a day for somber tribute, the man who was president on 9/11, George W. Bush, spoke most directly of those new divisions — and threats — in a speech in Shanksville, Pa., where Flight 93 went down on the day of the attacks. Bush warned that dangers to the country now come not only across borders “but from violence that gathers from within.” It was an apparent but obvious reference to the attack on the Capitol on Jan. 6.

“There is little cultural overlap between violent extremists abroad and violent extremists at home,” he said. “But in their disdain for pluralism, in their disregard for human life, in their determination to defile national symbols, they are children of the same foul spirit. And it is our continuing duty to confront them.”

The question is often asked: As the United States has plunged deeper into division and discord, is there anything that could spark a change, anything big enough to become a catalyst for greater national unity? But if 9/11 doesn’t fit that model, what does? And look what happened in the aftermath of that trauma.

For a time, the shock of the attacks did bring the country together. Bush’s approval ratings spiked to 90 percent in a rally-round-the-flag reaction that was typical when the country is faced with external threats or crises.

One notable expression of the unity at the time came from Al Gore, the former vice president who had lost the bitter 2000 election to Bush after a disputed recount in Florida and a controversial Supreme Court decision.

Speaking at a Democratic Party dinner in Iowa less than a month after the attacks, Gore called Bush “my commander in chief,” adding, “We are united behind our president, George W. Bush, behind the effort to seek justice, not revenge, to make sure this will never, ever happen again. And to make sure we have the strongest unity in America that we have ever had.” The Democratic audience rose, applauding and cheering.

Trust in government rose in those days after the attacks. Shortly after 9/11, trust in government jumped to 64 percent, up from 30 percent before the attacks, according to Public Opinion Strategies, a Republican polling firm that was closely tracking public attitudes to the attacks. By the summer of 2002, the firm found that trust had fallen back, to 39 percent.


Five years after the attacks, then-Sen. John McCain (R-Ariz.), now deceased, was quoted as saying that America was “more divided and more partisan than I’ve ever seen us.” Today, after many contentious elections, political warfare over economic, cultural and social issues and a domestic attack on the U.S. Capitol on Jan. 6, many Americans would say things have become worse.

As he prepared the U.S. response to the attacks by al-Qaeda in the fall of 2001, Bush made clear the United States would go it alone if necessary, assembling what was called a “coalition of the willing.” He put other nations on notice, saying the United States would hold them accountable in the campaign against the terrorists. “You’re either with us or against us in the fight,” he said.

Bush described the world in Manichaean terms: good vs. evil.

Today’s politics at home is often practiced that way. That phrase — “with us or against us” — could stand as a black-and-white expression of the way in which many Americans approach the political battles: all in with the team, red or blue, or not in at all. If you win, I lose. No middle ground.

Lack of imagination on the part of Americans had helped 9/11 to happen. No one in the upper reaches of government  . . .

Continue reading. No paywall on this one.

Written by Leisureguy

12 September 2021 at 10:35 am

How the energy industry tricked Americans into loving a dangerous appliance.


Rebecca Leber has a long article in Mother Jones that’s well worth reading. It begins:

Early last year in the Fox Hills neighborhood of Culver City, California, a man named Wilson Truong posted an item on the Nextdoor social media platform—where users can interact with their neighbors—warning that city leaders were considering stronger building codes that would discourage the use of natural gas in new homes and businesses. In a message titled “Culver City banning gas stoves?” he wrote, “First time I heard about it I thought it was bogus, but I received a newsletter from the city about public hearings to discuss it…Will it pass???!!! I used an electric stove but it never cooked as well as a gas stove so I ended up switching back.”

Truong’s post ignited a debate. One neighbor, Chris, defended electric induction stoves. “Easy to clean,” he wrote of these glass stovetops, which use a magnetic field to heat pans. [Induction is definitely best of all. – LG] Another neighbor, Laura, expressed skepticism. “No way,” she wrote. “I am staying with gas. I hope you can too.”

Unbeknownst to both, Truong wasn’t their neighbor at all, but an account manager for Imprenta Communications Group. Among the public relations firm’s clients was Californians for Balanced Energy Solutions, a front for the nation’s largest gas utility, SoCalGas, which aims to thwart state and local initiatives restricting the use of fossil fuels in new buildings. C4BES had tasked Imprenta with exploring how platforms such as Nextdoor could be used to engineer community support for natural gas. Imprenta assured me that Truong’s post was an isolated affair, but C4BES displays it alongside two other anonymous Nextdoor comments on its website as evidence of its advocacy in action.

Microtargeting Nextdoor groups is part of the newest front in the gas industry’s war to bolster public support for its product. For decades the American public was largely sold on the notion that “natural” gas was relatively clean, and when used in the kitchen, even classy. But that was before climate change moved from distant worry to proximate danger. Burning natural gas in commercial and residential buildings accounts for more than 10 percent of US emissions, so moving toward homes and apartments powered by wind and solar electricity instead could make a real dent. Gas stoves and ovens also produce far worse indoor air pollution than most people realize; running a gas stove and oven for just an hour can produce unsafe pollutant levels throughout your house all day. These concerns have prompted moves by 42 municipalities to phase out gas in new buildings. Washington state lawmakers intend to end all use of natural gas by 2050. California has passed aggressive standards, including a plan to reduce commercial and residential emissions to 60 percent of 1990 levels by 2030. During his campaign, President Biden called for stricter standards for appliances and new construction. Were more stringent federal rules to come to pass, it could motivate builders to ditch gas hookups for good.

Gas utilities have responded to this existential threat to their livelihood by launching local anti-electrification campaigns. To ward off a municipal vote in San Luis Obispo, California, a union representing gas utility workers threatened to bus in “hundreds” of protesters during the pandemic with “no social distancing in place.” In Santa Barbara, residents have received robotexts warning that a gas ban would dramatically increase their bills. The Pacific Northwest group Partnership for Energy Progress, funded in part by Washington state’s largest gas utility, Puget Sound Energy, has spent at least $1 million opposing electrification mandates in Bellingham and Seattle, including $91,000 on bus ads showing a happy family cooking with gas next to the slogan “Reliable. Affordable. Natural Gas. Here for You.”

The industry group American Gas Association has a website dedicated to promoting cooking with gas.

The gas industry also has worked aggressively with legislatures in seven states to enact laws—at least 14 more have bills—that would prevent cities from passing cleaner building codes. This past spring, according to a HuffPost investigation, gas and construction interests managed to block cities from pushing for the stricter energy efficiency codes favored by local officials. In a potential blow to the Biden administration’s climate ambitions, two big trade groups convinced the International Code Council—the notoriously industry-friendly gatekeeper of default construction codes—to cut local officials out of the decision-making process entirely.

Beyond applying political pressure, the gas industry has identified a clever way to capture the public imagination. Surveys showed that most people had no preference for gas water heaters and furnaces over electric ones. So the gas companies found a different appliance to focus on. For decades, sleek industry campaigns have portrayed gas stoves—like granite countertops, farm sinks, and stainless-steel refrigerators—as a coveted symbol of class and sophistication, not to mention a selling point for builders and real estate agents.

The strategy has been remarkably successful in boosting sales of natural gas, but as the tides turn against fossil fuels, defending gas stoves has become a rear guard action. While stoves were once crucial to expanding the industry’s empire, now they are a last-ditch attempt to defend its shrinking borders.

Over the last hundred years, gas companies have engaged an all-out campaign to convince Americans that cooking with a gas flame is superior to using electric heat. At the same time, they’ve urged us not to think too hard—if at all—about what it means to combust a fossil fuel in our homes.

In the 1930s, the industry embraced the term “natural gas,” which gave the impression that its product was cleaner than any other fossil fuel: “The discovery of Natural Gas brought to man the greater and most efficient heating fuel which the world has ever known,” bragged one 1934 ad. “Justly is it called—nature’s perfect fuel.”

It was also during the 1930s that the industry first adopted the slogan “cooking with gas”; a gas executive saw to it that the phrase worked its way into Bob Hope bits and Disney cartoons. By the 1950s the industry was targeting housewives with star-studded commercials that featured matinee idols scheming about how to get their husbands to renovate their kitchens. In one 1964 newspaper advertisement from the Pennsylvania People’s Natural Gas Company, the star Marlene Dietrich professed, “Every recipe I give is closely related to cooking with gas. If forced, I can cook on an electric stove but it is not a happy union.” (Around the same time, General Electric waged an advertising campaign starring Ronald Reagan that depicted an all-electric house as a Jetsons-like future.) During the 1980s, the gas industry debuted a cringeworthy rap: “I cook with gas cause the cost is much less / Than ’lectricity. Do you want to take a guess?” and “I cook with gas cause broiling’s so clean / The flame consumes the smoke and grease.” . . .

Continue reading. There’s much more, including serious and fact-based arguments against using gas ranges. No paywall.

Later in the article:

Beginning in the 1990s, the industry faced a new challenge: mounting evidence that burning gas indoors can contribute to serious health problems. Gas stoves emit a host of dangerous pollutants, including particulate matter, formaldehyde, carbon monoxide, and nitrogen oxides. One 2014 simulation by the Lawrence Berkeley National Laboratory found that cooking with gas for one hour without ventilation adds up to 3,000 parts per billion of carbon monoxide to the air—raising indoor concentrations by up to 30 percent in the average home. Carbon monoxide can kill; it binds tightly to the hemoglobin molecules in your blood so they can no longer carry oxygen. What’s more, new research shows that the typical home carbon monoxide alarms often fail to detect potentially dangerous levels of the gas. Nitrogen oxides, which are not regulated indoors, have been linked to an increased risk of heart attack, along with asthma and other respiratory diseases. Homes with gas stoves have anywhere between 50 and 400 percent higher concentrations of nitrogen dioxide than homes without, according to EPA research. Children are at especially high risk from nitrogen oxides, according to a study by UCLA Fielding School of Public Health commissioned by the Sierra Club. The paper included a meta-analysis of existing epidemiological studies, one of which estimated that kids in homes with gas stoves are 42 percent more likely to have asthma than children whose families use electric.

From my own direct experience I know that cooking on an induction burner is by far the best — I’ve cooked with gas and with electric coil burners, and induction beats them hands down.

Written by Leisureguy

12 September 2021 at 9:26 am

The Problem With “Doing Your Own Research”

leave a comment »

Tim Wise writes on Medium:

The internet is a wonderful thing, and also the absolute worst thing ever.

On the one hand, it allows people to access information at the push of a button and then connect with others worldwide, even sharing that information if they’d like to do so.

On the other hand, it allows people to access information at the push of a button and then connect with others worldwide, even sharing that information if they’d like to do so.

Yes, the relative democratization of communication — compared to the days when gatekeepers more tightly limited the voices to which we might be exposed — is a welcome step in the direction of a more open society.

But at the same time, with more information also comes more noise. And with the ability to spread noise like never in human history, cacophony becomes the default position.

It seems wistful to remember the days of antiquity (also known as the 1990s), when getting your opinion heard required writing a letter to the editor of this thing called a newspaper and then waiting several days to see if it would be published. Or perhaps, if you were really ambitious, sending an entire essay or article to a magazine and then waiting for several weeks to discover the same.

As much as we complained about the difficulty of breaking through these mainstream media filters, I’m not sure if what replaced them is better.

Perhaps it would be fine had we even the most rudimentary skills at discerning truth from falsehood. But humans are not much on critical thinking, Americans least of all. We are a nation of image-crazed consumers and wanna-be “influencers,” actively hostile to critical thought and allergic to teaching such skills in school, lest we usurp the authority of parents to brainwash our children the way we see fit.

And so instead of developing the media literacy necessary to separate the factual wheat from the fictional chaff, millions just “do their own research,” by which they mean to tell you they:

1. Own a Google machine;
2. Have a lot of extra time on their hands; and,
3. Don’t actually know what research is.

Pro tip: research is not just a matter of looking stuff up.

It is not what you’re doing when conversing with anonymous people on Reddit, soaking in whatever StarShine77 has decided to offer up that morning.

It is not what you’re doing when scrolling through YouTube videos fed to you by an algorithm that is intentionally programmed to show you more of the same shit you were already watching and absolutely nothing that might contradict it.

It’s not what you’re doing when you pass around memes, with citations at the bottom like “U.S. Research Center,” which is not a real thing, and even if it were, that’s not a fucking citation, Grandpa.

But sadly, this is part of what it means to be American in the 21st century: to confuse having a right to an opinion with having a right to be taken seriously for whatever ass-backward opinion you have.

You’ll hear it all the time: “Well, I have a right to believe whatever I want, and you do too, and I guess we’ll just agree to disagree.”

No, cousin Judy, that’s not the end of it.

You can believe whatever codswallop floats your inner-tube, to be sure, but when it’s utter and complete horseshit, we won’t simply agree to disagree.

Agreeing to disagree is what we do when we debate who was the greatest Major League pitcher of all time, and you say Bob Gibson and I say Sandy Koufax — and we both could be right.

What we’re doing now, Mr. “The COVID vaccine will change your DNA and allow the government to track you,” is not that. It’s me, buying a calming shade of yellow interior wall paint with which to coat your bedroom and Googling “doctors near you that specialize in helping people with delusions.”

The idea that your opinion on a subject is equal to someone else’s, when that someone else has spent years studying and researching it (using more complex methods than refreshing their Facebook feed), is ridiculous.

Expertise is, in fact, a thing.

And yes, I know, sometimes experts disagree. Even physicians sometimes have different takes on the proper course of treatment for a given condition.

That’s why, when faced with such decisions, it’s good to get a second opinion.

But guess what? When you get that second opinion, from whom do you get it?

Another gotdamn doctor who went to a gotdamn medical school.

You do not get that second opinion about whether you need open-heart surgery to address your arterial blockage from KaleMomma420. Or rather, if you do, you deserve whatever happens to you.

Best of all is when . . .

Continue reading.

Written by Leisureguy

11 September 2021 at 6:43 pm

After 9/11, the U.S. Got Almost Everything Wrong

leave a comment »

In the Atlantic, Garrett M. Graff, a journalist, historian, and the author of The Only Plane in the Sky: An Oral History of 9/11, lays out the bad decisions made after 9/11 — many of which were strongly opposed at the time (for example, many people, including yours truly, vociferously opposed the stupid invasion of Iraq):

On the Friday after 9/11, President George W. Bush visited the New York City site that the world would come to know as Ground Zero. After rescue workers shouted that they couldn’t hear him as he spoke to them through a bullhorn, he turned toward them and ad-libbed. “I can hear you,” he shouted. “The whole world hears you, and when we find these people who knocked these buildings down, they’ll hear all of us soon.” Everybody roared. At a prayer service later that day, he outlined the clear objective of the task ahead: “Our responsibility to history is already clear: to answer these attacks and rid the world of evil.”

Appearing on NBC’s Meet the Press two days later, Vice President Dick Cheney offered his own vengeful promise. “We also have to work, though, sort of the dark side, if you will,” he told the host, Tim Russert. “We’ve got to spend time in the shadows in the intelligence world. A lot of what needs to be done here will have to be done quietly, without any discussion, using sources and methods that are available to our intelligence agencies, if we’re going to be successful.” He added, “That’s the world these folks operate in, and so it’s going to be vital for us to use any means at our disposal.”

In retrospect, Cheney’s comment that morning came to define the U.S. response to the 2001 terrorist attacks over the next two decades, as the United States embraced the “dark side” to fight what was soon dubbed the “Global War on Terror” (the “GWOT” in gov-speak)—an all-encompassing, no-stone-unturned, whole-of-society, and whole-of-government fight against one of history’s great evils.

It was a colossal miscalculation.

The events of September 11, 2001, became the hinge on which all of recent American history would turn, rewriting global alliances, reorganizing the U.S. government, and even changing the feel of daily life, as security checkpoints and magnetometers proliferated inside buildings and protective bollards sprouted like kudzu along America’s streets.

I am the author of an oral history of 9/11. Two of my other books chronicle how that day changed the FBI’s counterterrorism efforts and the government’s doomsday plans. I’ve spent much of this year working on a podcast series about the lingering questions from the attacks. Along the way, I’ve interviewed the Cassandra-like FBI agents who chased Osama bin Laden and al-Qaeda before the attacks; first responders and attack survivors in New York, Washington, and Pennsylvania; government officials who hid away in bunkers under the White House and in the Virginia countryside as the day unfolded; the passengers aboard Air Force One with the president on 9/11; and the Navy SEALs who killed bin Laden a decade later. I’ve interviewed directors of the CIA, FBI, and national intelligence; the interrogators in CIA black sites; and the men who found Saddam Hussein in that spider hole in Iraq.

As we approach the 20th anniversary of 9/11 on Saturday, I cannot escape this sad conclusion: The United States—as both a government and a nation—got nearly everything about our response wrong, on the big issues and the little ones. The GWOT yielded two crucial triumphs: The core al-Qaeda group never again attacked the American homeland, and bin Laden, its leader, was hunted down and killed in a stunningly successful secret mission a decade after the attacks. But the U.S. defined its goals far more expansively, and by almost any other measure, the War on Terror has weakened the nation—leaving Americans more afraid, less free, more morally compromised, and more alone in the world. A day that initially created an unparalleled sense of unity among Americans has become the backdrop for ever-widening political polarization.

The nation’s failures began in the first hours of the attacks and continue to the present day. Seeing how and when we went wrong is easy in hindsight. What’s much harder to understand is how—if at all—we can make things right.

As a society, we succumbed to fear.

The most telling part of September 11, 2001, was the interval between the first plane crash at the World Trade Center, at 8:46 a.m., and the second, at 9:03. In those 17 minutes, the nation’s sheer innocence was on display.

The aftermath of the first crash was live on the nation’s televisions by 8:49 a.m. Though horrified, many Americans who saw those images still went on about their morning. In New York, the commuter-ferry captain Peter Johansen recalled how, afterward, he docked at the Wall Street Terminal and every single one of his passengers got off and walked into Lower Manhattan, even as papers and debris rained down from the damaged North Tower.

At the White House, National Security Adviser Condoleezza Rice called Bush, who was in Florida. They discussed the crash and agreed it was strange. But Rice proceeded with her 9 a.m. staff meeting, as previously scheduled, and Bush went into a classroom at the Emma E. Booker Elementary School to promote his No Child Left Behind education agenda. At the FBI, the newly arrived director, Robert Mueller, was actually sitting in a briefing on al-Qaeda and the 2000 bombing of the USS Cole when an aide interrupted with news of the first crash; he looked out the window at the bright blue sky and wondered how a plane could have hit the World Trade Center on such a clear day.

Those muted reactions seem inconceivable today but were totally appropriate to the nation that existed that September morning. The conclusion of the Cold War a decade earlier had supposedly ended history. To walk through Bill Clinton’s presidential library in Little Rock today is to marvel at how low-stakes everything in the 1990s seemed.

But after that second crash, and then the subsequent ones at the Pentagon and in the fields outside Shanksville, Pennsylvania, our government panicked. There’s really no other way to say it. Fear spread up the chain of command. Cheney, who had been hustled to safety in the minutes after the second crash, reflected later, “In the years since, I’ve heard speculation that I’m a different man after 9/11. I wouldn’t say that. But I’ll freely admit that watching a coordinated, devastating attack on our country from an underground bunker at the White House can affect how you view your responsibilities.”

The initial fear seemed well grounded. Experts warned of a potential second wave of attacks and of al-Qaeda sleeper cells across the country. Within weeks, mysterious envelopes of anthrax powder began sickening and killing people in Florida, New York, and Washington. Entire congressional office buildings were sealed off by government officials in hazmat suits.

The world suddenly looked scary to ordinary citizens—and even worse behind the closed doors of intelligence briefings. The careful sifting of intelligence that our nation’s leaders rely on to make decisions fell apart. After the critique that federal law enforcement and spy agencies had “failed to connect the dots” took hold, everyone shared everything—every tip seemed to be treated as fact. James Comey, who served as deputy attorney general during some of the frantic post-9/11 era, told me in 2009 that he had been horrified by the unverified intelligence landing each day on the president’s desk. “When I started, I believed that a giant fire hose of information came in the ground floor of the U.S. government and then, as it went up, floor by floor, was whittled down until at the very top the president could drink from the cool, small stream of a water fountain,” Comey said. “I was shocked to find that after 9/11 the fire hose was just being passed up floor by floor. The fire hose every morning hit the FBI director, the attorney general, and then the president.”

According to one report soon after 9/11, a nuclear bomb that terrorists had managed to smuggle into the country was hidden on a train somewhere between Pittsburgh and Philadelphia. This tip turned out to have come from an informant who had misheard a conversation between two men in a bathroom in Ukraine—in other words, from a terrible global game of telephone. For weeks after, Bush would ask in briefings, “Is this another Ukrainian urinal incident?”

Even disproved plots added to the impression that the U.S. was under constant attack by a shadowy, relentless, and widespread enemy. Rather than recognizing that an extremist group with an identifiable membership and distinctive ideology had exploited fixable flaws in the American security system to carry out the 9/11 attacks, the Bush administration launched the nation on a vague and ultimately catastrophic quest to rid the world of “terror” and “evil.”

At the time, some commentators politely noted the danger of tilting at such nebulous concepts, but a stunned American public appeared to crave a bold response imbued with a higher purpose. As the journalist Robert Draper writes in To Start a War, his new history of the Bush administration’s lies, obfuscations, and self-delusions that led from Afghanistan into Iraq, “In the after-shocks of 9/11, a reeling America found itself steadied by blunt-talking alpha males whose unflappable, crinkly-eyed certitude seemed the only antidote to nationwide panic.”

The crash of that second plane at 9:03, live on millions of television sets across the country, had revealed a gap in Americans’ understanding of our world, a gap into which anything and everything—caution and paranoia, liberal internationalism and vengeful militarism, a mission to democratize the Middle East and an ever more pointless campaign amid a military stalemate—might be poured in the name of shared national purpose. The depth of our leaders’ panic and the amorphousness of our enemy led to a long succession of tragic choices.

We chose the wrong way to seek justice.

Before 9/11, the United States had a considered, constitutional, and proven playbook for targeting terrorists: They were arrested anywhere in the world they could be caught, tried in regular federal courts, and, if convicted, sent to federal prison. The mastermind of the 1993 World Trade Center bombing? Arrested in Pakistan. The 1998 embassy bombers? Caught in Kenya, South Africa, and elsewhere. In Sweden on the very morning of 9/11, FBI agents had arrested an al-Qaeda plotter connected to the attack on the USS Cole. The hunt for the plotters of and accomplices to the new attacks could have been similarly handled in civilian courts, whose civil-liberties protections would have shown the world how even the worst evils met with reasoned justice under the law.

Instead, on November 13, 2001, President Bush announced in an executive order that those rounded up in the War on Terror would be treated not as criminals, or even as prisoners of war, but as part of a murky category that came to be known as “enemy combatants.”

While civil libertarians warned of a dark path ahead, Americans seemed not . . .

Continue reading. There’s much more.

Later in the article:

Meanwhile, for all the original talk of banishing evil from the world, the GWOT’s seemingly exclusive focus on Islamic extremism has led to the neglect of other threats actively killing Americans. In the 20 years since 9/11, thousands of Americans have succumbed to mass killers—just not the ones we went to war against in 2001. The victims have included worshippers in churches, synagogues, and temples; people at shopping malls, movie theaters, and a Walmart; students and faculty at universities and community colleges; professors at a nursing school; children in elementary, middle, and high schools; kids at an Amish school and on a Minnesota Native American reservation; nearly 60 concertgoers who were machine-gunned to death from hotel windows in Las Vegas. But none of those massacres were by the Islamic extremists we’d been spending so much time and money to combat. Since 9/11, more Americans have been killed by domestic terrorists than by foreign ones. Political pressure kept national-security officials from refocusing attention and resources on the growing threat from white nationalists, armed militias, and other groups energized by the anti-immigrant, anti-Muslim strains of the War on Terror.

FDR was right: the thing to fear is fear itself — fear leads to panic, and panic leads to bad and ill-considered decisions.

Update: But see also David Corn’s article in Mother Jones: “It’s Not Too Late to Learn the Lessons We Didn’t Learn From 9/11.”

Written by Leisureguy

10 September 2021 at 3:57 pm

How Educational Differences Are Widening America’s Political Rift

leave a comment »

Nate Cohn has an interesting article on America’s division along educational lines. It’s in the NY Times, but the link here is a gift link that skirts the paywall. The article begins:

The front lines of America’s cultural clashes have shifted in recent years. A vigorous wave of progressive activism has helped push the country’s culture to the left, inspiring a conservative backlash against everything from “critical race theory” to the supposed cancellation of Dr. Seuss.

These skirmishes may be different in substance from those that preceded them, but in the broadest sense they are only the latest manifestation of a half-century trend: the realignment of American politics along cultural and educational lines, and away from the class and income divisions that defined the two parties for much of the 20th century.

As they’ve grown in numbers, college graduates have instilled increasingly liberal cultural norms while gaining the power to nudge the Democratic Party to the left. Partly as a result, large portions of the party’s traditional working-class base have defected to the Republicans.

Over the longer run, some Republicans even fantasize that the rise of educational polarization might begin to erode the Democratic advantage among voters of color without a college degree. Perhaps a similar phenomenon may help explain how Donald J. Trump, who mobilized racial animus for political gain, nonetheless fared better among voters of color than previous Republicans did, and fared worse among white voters.

President Biden won about 60 percent of college-educated voters in 2020, including an outright majority of white college graduates, helping him run up the score in affluent suburbs and putting him over the top in pivotal states.

This was a significant voting bloc: Overall, 41 percent of people who cast ballots last year were four-year college graduates, according to census estimates. By contrast, just 5 percent of voters in 1952 were college graduates, according to that year’s American National Elections Study.

Yet even as college graduates have surged in numbers and grown increasingly liberal, Democrats are no stronger than they were 10, 30 or even 50 years ago. Instead, rising Democratic strength among college graduates and voters of color has been counteracted by a nearly equal and opposite reaction among white voters without a degree.

When the Harvard-educated John F. Kennedy narrowly won the presidency in 1960, he won white voters without a degree but lost white college graduates by a two-to-one margin. The numbers were almost exactly reversed for Mr. Biden, who lost white voters without a degree by a two-to-one margin while winning white college graduates.

About 27 percent . . .

Continue reading — and no paywall for this article.

I’ve observed that Republicans increasingly seem ignorant, and it seems that observation is accurate.

Written by Leisureguy

10 September 2021 at 3:30 pm

Eco-Fashion’s Animal Rights Delusion

leave a comment »

Alden Wicker has an interesting article in Craftsmanship magazine:

1. The Silkworm vs. the Orangutan
2. Vegan Fast Fashion
3. If Not Leather, then What?
4. Dyed-in-the-Wool Environmentalists
5. Are Indigenous People Politically Incorrect?
6. PETA’s Explanation

For most women like me, when a fine silk blouse catches our eye in a clothing store, we don’t think much about the worms that made the silk. If you do, here’s the story you will typically find: A few days after silkworms disappear inside their cocoons, right about the time they finish spinning, the little pods are collected and submerged in boiling water. To make a pound of raw silk, up to 5,000 worms must die.

To People for the Ethical Treatment of Animals (PETA), the nation’s leading animal-rights group, that’s a pretty destructive process for the cause of glamour. This is why PETA encourages consumers to buy “cruelty-free” silk alternatives like polyester and viscose (popularly known as rayon). Consumers have hardly needed PETA’s prodding. In a single decade, consumption of rayon doubled, rising to 5.2 million tons in 2015; meanwhile, the silk industry had declined to 202,000 metric tonnes by 2015, constituting less than 0.2 percent of the global textile market. Another victory for animal rights and the fight for more socially conscious consumerism, right?

Maybe—or maybe not. As with so many eco-conscious consumer choices, the issues involved in silk production are both elusive and multilayered. If we’re going to call ourselves conscious consumers, therefore, we have to calculate all aspects of the production process, and its consequences.

In the case of silk, let’s first look at the other way to make silk, which doesn’t kill the worms. For this kind of silk, called Peace or Ahimsa Silk, the pupa is allowed to grow into a moth, tear a hole in the cocoon, and crawl out into the light. But there’s a catch. Because that hole cuts what used to be a continuous strand of thread, the process yields a fabric with a nubbier, less shimmering texture, much like raw silk. It’s beautiful in its own way, but also double the cost. That can drive the retail price of a wedding dress, for example, up by more than $1,000.

To a bride who is committed to having a wedding dress that allowed moths to be “free and happy,” that price may feel worthwhile—as long as she can afford it. But she might want to look again at the Peace worm’s glorious beginnings. It turns out that if silkworms are allowed to emerge as moths, they live short and very difficult lives. Having been domesticated for thousands of years, Bombyx mori are unable to fly, and cannot even eat. The males spend their one glorious day of moth-dom crawling across the ground to find and couple with a nearby female before dying. The females lay eggs over the next few days and then die as well. In any case, PETA opposes the use of Peace Silk simply because there is no certification process to ensure the worms weren’t mistreated.

Now, let’s look back at those worms that were put to death in boiling water.

Traditional southern Chinese silks are handmade in a closed-loop ecosystem, in which the silkworms that spin the superfine threads eat the leaves of mulberry trees planted by ponds, the fish in the ponds eat the worm poop, and in turn fertilize the mulberry trees. In Asia, which produces the lion’s share of silk, the boiled pupae are fried up and eaten as a low-carbon protein source—not a bad byproduct for a rapidly growing country badly in need of food. And certain types of silk (Jiāo-chou and Xiang-yun-sha—see photos) are still dyed using nontoxic vegetable and mud dyes.

Stella McCartney offered a potential solution to the silkworm conundrum when she . . .

Continue reading. There’s more.

A sidebar to the above text notes:

To make rayon—a supposedly animal-friendly fabric—you have to harvest a large number of trees or bamboo, shred and dissolve the wood in a soup of carbon disulfide, dry the resulting glop, then spin it into semi-synthetic fibers. Workers exposed to the fumes from this process can suffer insanity, nerve damage, and increased risk of heart disease and stroke. Factories in China, Indonesia, and India expel this effluent straight into waterways, rendering formerly vibrant ecosystems completely dead.

Written by Leisureguy

10 September 2021 at 9:43 am

A somewhat comforting thought: A large proportion of Americans have always experienced difficulty in thinking clearly

leave a comment »

“Vaccinating the Poor,” an 1870s illustration by Solomon Eytinge Jr., showing a group of people observing a doctor as he vaccinates a man. Via the National Library of Medicine.

Maggie Astor writes in the NY Times:

As disease and death reigned around them, some Americans declared that they would never get vaccinated and raged at government efforts to compel them. Anti-vaccination groups spread propaganda about terrible side effects and corrupt doctors. State officials tried to ban mandates, and people made fake vaccination certificates to evade inoculation rules already in place.

The years were 1898 to 1903, and the disease was smallpox. News articles and health board reports describe crowds of parents marching to schoolhouses to demand that their unvaccinated children be allowed in, said Michael Willrich, a professor of history at Brandeis University, with some even burning their own arms with nitric acid to mimic the characteristic scar left by the smallpox vaccine.

“People went to some pretty extraordinary lengths not to comply,” said Professor Willrich, who wrote “Pox: An American History,” a book about the civil liberties battles prompted by the epidemic.

If it all sounds familiar, well, there is nothing new under the sun: not years that feel like centuries, not the wailing and gnashing of teeth over masks, and not vaccine mandates either.

As the coronavirus overwhelms hospitals across the South and more than 650,000 Americans — an increasing number of them children — lie dead, the same pattern is emerging. On Thursday, President Biden announced that he would move to require most federal workers and contractors to be vaccinated and, more sweepingly, that all employers with 100 or more employees would have to mandate vaccines or weekly testing. Colleges, businesses and local governments have enacted mandates at a steady pace, and conservative anger has built accordingly.

On Monday, Representative Jim Jordan, Republican of Ohio, tweeted that vaccine mandates were “un-American.” In reality, they are a time-honored American tradition.

But to be fair, so is public fury over them.

“We’re really seeing a lot of echoes of the smallpox era,” said Elena Conis, an associate professor and historian of medicine at the University of California, Berkeley. “Mandates elicit resistance. They always have.”

The roots of U.S. vaccine mandates predate both the U.S. and vaccines. The colonies sought to prevent disease outbreaks by quarantining ships from Europe and sometimes, in the case of smallpox, requiring inoculations: a crude and much riskier predecessor to vaccinations in which doctors rubbed live smallpox virus into broken skin to induce a relatively mild infection that would guard against severe infection later. They were a source of enormous fear and anger.

In January 1777, George Washington mandated inoculations for the soldiers under his command in the Continental Army, writing that if smallpox were to break out, “we should have more to dread from it, than from the Sword of the Enemy.” Notably, it was in large part the soldiers’ desires that overcame his resistance to a mandate.

“They were the ones calling for it,” said Andrew Wehrman, an associate professor of history at Central Michigan University who studies the politics of medicine in the colonial and revolutionary eras. “There’s no record that I have seen — and I’ve looked — of any soldier turning it down, protesting it.”

Buoyed by the success of the mandate, Washington wrote to his brother in June 1777 that he was upset by a Virginia law restricting inoculations. “I would rather move for a Law to compell the Masters of Families to inoculate every Child born within a certain limitted time under severe Penalties,” he wrote.

Over the next century, many local governments did exactly that. Professor Wehrman this week tweeted an example of what, in an interview, he said was a “ubiquitous” phenomenon: The health board in Urbana, Ohio, Jordan’s hometown, enacted a requirement in 1867 that in any future epidemic, “the heads of families must see that all the members of their families have been vaccinated.”

But by the end of the 1800s, opposition was louder and more widespread. Some states, particularly in the West, introduced laws prohibiting vaccine mandates. Others narrowly passed mandates after intense debate.

The reasons for resistance were myriad: Some Americans opposed mandates on the grounds of personal liberty; some because they believed lawmakers were in cahoots with vaccine makers; and some because of safety concerns that were, to be fair, more grounded in reality than the modern equivalent. Vaccines then were not regulated the way they are now, and there were documented cases of doses contaminated with tetanus.

The government’s response resembled what, today, are wild conspiracy theories. Contrary to the assertions of some on the far right, the Biden administration has never suggested going door to door to force people to take coronavirus vaccines. But in the 1890s and 1900s, that actually happened: Squads of men would enter people’s homes in the middle of the night, breaking down doors if necessary, to inject people with smallpox vaccines. 

Legally speaking, the Supreme Court . . .

Continue reading. There’s more.

I’ll point out that the deadly scourge of smallpox, which killed millions upon millions, was ended by vaccines. Smallpox has now been eradicated — no thanks to anti-vaxxers.

Written by Leisureguy

9 September 2021 at 4:49 pm

The road from Nixon

leave a comment »

Heather Cox Richardson lays out recent history, showing how the dominoes toppled after Nixon’s criminal presidency:

On this day in 1974, President Gerald Ford granted “a full, free, and absolute pardon unto Richard Nixon for all offenses against the United States which he, Richard Nixon, has committed or may have committed or taken part in during the period from January 20, 1969 through August 9, 1974.” Ford said he was issuing the pardon to keep from roiling the “tranquility” the nation had begun to enjoy since Nixon stepped down. If Nixon were indicted and brought to trial, the trial would “cause prolonged and divisive debate over the propriety of exposing to further punishment and degradation a man who has already paid the unprecedented penalty of relinquishing the highest elective office of the United States.”

Ford later said that he issued the pardon with the understanding that accepting a pardon was an admission of guilt. But Nixon refused to accept responsibility for the events surrounding the break-in at the headquarters of the Democratic National Committee in Washington, D.C.’s fashionable Watergate office building. He continued to maintain that he had done nothing wrong but was hounded from office by a “liberal” media.

Rather than being chastised by Watergate and the political fallout from it, a faction of Republicans continued to support the idea that Nixon had done nothing wrong when he covered up an attack on the Democrats before the 1972 election. Those Republicans followed Nixon’s strategy of dividing Americans. Part of that polarization was an increasing conviction that Republicans were justified in undercutting Democrats, who were somehow anti-American, even if it meant breaking laws.

In the 1980s, members of the Reagan administration did just that. They were so determined to provide funds for the Nicaraguan Contras, who were fighting the leftist Sandinista government, that they ignored a law passed by a Democratic Congress against such aid. In a terribly complicated plan, administration officials, led by National Security Adviser John Poindexter and his deputy Oliver North, secretly sold arms to Iran, which was on the U.S. terror list and thus ineligible for such a purchase, to try to put pressure on Iranian-backed Lebanese terrorists who were holding U.S. hostages. The other side of the deal was that they illegally funneled the money from the sales to the Contras.

Although Poindexter, North, and North’s secretary, Fawn Hall, destroyed crucial documents, enough evidence remained to indict more than a dozen participants, including Poindexter, North, Defense Secretary Caspar Weinberger, National Security Adviser Robert McFarlane, Assistant Secretary of State Elliott Abrams, and four CIA officials. But when he became president himself, Reagan’s vice president George H.W. Bush, himself a former CIA director and implicated in the scandal, pardoned those convicted or likely to be. He was advised to do so by his attorney general, William Barr (who later became attorney general for President Donald Trump).

With his attempt to use foreign policy to get himself reelected, Trump took attacks on democracy to a new level. In July 2019, he withheld congressionally appropriated money from Ukraine in order to force the country’s new president, Volodymyr Zelensky, to announce he was opening an investigation into the son of then–Democratic presidential hopeful Joe Biden. That is, Trump used the weight of the U.S. government and its enormous power in foreign affairs to try to hamstring his Democratic opponent. When the story broke, Democrats in the House of Representatives called this attack on our democracy for what it was and impeached him, but Republicans voted to acquit.

It was a straight line from 2019’s attack to that of the weeks after the 2020 election, when the former president did all he could to stop the certification of the vote for Democrat Joe Biden. By January 6, though, Trump’s disdain for the law had . . .

Continue reading. There’s more.

Written by Leisureguy

9 September 2021 at 9:34 am

System Error: An interesting discussion about tradeoffs in technology and society

leave a comment »

You can watch this discussion (which appeared in Browser) or read it below.

Uri: Hello. I’m delighted to be here today with three Stanford professors – philosopher Rob Reich, political scientist Jeremy Weinstein and computer scientist Mehran Sahami – who are authors of the new book System Error: Where Big Tech Went Wrong and How We Can Reboot. Thank you all so much for being here today.

We’re going to play a very simple game we call The Last Word, where we ask you to answer difficult questions in a very specific number of words. Rob, we’ll start with you. Could you please tell us what this book is all about in exactly ten words?

Rob: [smiles] Alright: [counts on fingers] Reenergizing democratic institutions through the sensible regulation of Big Tech.

Uri: That was fantastic

Jeremy: Wow

Uri: Obviously the relationship between Big Tech and the democratic process, and our values as a society, is a very prominent topic on everyone’s minds these days, though often with more sound than light. I was wondering if you can tell us about the three perspectives you’re bringing to it, and what you hope to achieve with the book.

Jeremy: So let me start by building on Rob’s ten-word answer: in this moment, many people around the United States and around the world feel that the effects of technology are washing over them. That it’s a wave that you have no agency in shaping or influencing. And our view is that we need to pivot that discussion and recognise that there’s profound agency that people have – as technologists who design technology, as users of technology, as citizens in a democratic society – and that ultimately the effects of technology are something that we can impact, impact by ensuring that our values are reflected in technology as it’s designed, and impact by shaping the way that government mitigates the harms of technology that is all around us.

Mehran: I think part of the message of the book as well is thinking not only in the big picture but also understanding what are the details of the technology and how they’re impacting people’s lives. So things like automated decision-making that are now using AI techniques to make consequential decisions in people’s lives; what happens with the future of work as AI scales; issues around privacy, as information about us is being gathered online and aggregated; and ultimately something many people are familiar with, the misinformation and disinformation that flows through social networks. So being able to disaggregate those technologies and understand the forces that are at play creates a greater urgency about why we need to do something about them.

Rob: The spirit of the book is that, after four years of teaching a class together at Stanford – in the belly of the beast of Silicon Valley, as it were – we wanted to try to expand the conversation, trying to reach really talented undergraduates using a technological lens, a policy lens, and a philosophy lens to broaden the conversation.

And as Jeremy described, the book has answers of a certain kind to the dilemmas or problems of Big Tech, but they’re not a policy blueprint – “if only Congress would take our answers, things would miraculously get much better” – rather, it’s a way of shaping a new conversation and a new framework for thinking about the trade-offs that are encoded in the various products that Silicon Valley and Big Tech has brought to the world, and ensuring that the decisions that get made in the corporate boardrooms and product development lifecycles of the big tech companies are not the ones that are imposed upon the rest of us, because we haven’t exercised our own agency in trying to shape a technological future worth having.

Uri: I have to say that the book was very uncomfortable for me, as a young person who went through a similar university and had that feeling that these questions of values didn’t come up as much, and that we did all feel a little powerless, like we were a part of a bigger system that shaped us and which was out of our control. Which I think a lot of people feel, and I think that’s something really great about the way you’ve approached this and made us aware of how we’ve been shaped so far, but also an empowering story about what we can do, which I really appreciated.

Rob: Let me just add to that, if I can, Uri – I’m a longtime Browser reader, subscriber; I have some sense, maybe, of the community of people who are likely to be listening. And there’s a sense in which of course it’s important that technological and scientific progress have delivered extraordinary benefits to societies and to individuals. And the question is not about, as it were, a values conversation in which the philosopher or the policy maker shows up and says, stop, we need to slow it all down and make sure that we have a broader conversation that effectively brings a halt to technological progress.

To the contrary, the idea is that the interesting aspects of an ethics conversation and a policy conversation are really not about right and wrong, or true and false choices about technology or science, but rather about better and worse social outcomes. The ways in which so many of the technological advances of the past hundred or 200 years, when they are brought to market typically by private companies, and then the market consolidates, they exercise an extraordinary effect on society. And it’s the task of all of us to harness the enormous benefits and then to try to mitigate some of the harms. And that’s a task that goes far beyond the decision-making of people in companies alone.

This is why at the end of the day, I think ethics is an energising way of thinking about technology, not “the moral police have shown up to the technologists and told them when to stop.”

Uri: Absolutely. And well, on that note, Jeremy, you are, I believe, a philosopher who has spent time in government. I don’t know if that’s a rare beast.

Jeremy: Not a philosopher. I’m a political scientist who spent time in government, which is also a relatively rare beast.

Uri: So I was wondering if you could tell us in exactly five words, what you think are the main challenges in the ways that social values get stymied, or challenged, or fail to be implemented through the process of government?

Jeremy: [thinks] Building consensus around shared goals.

Uri: You are all so good at this, I’m absolutely gobsmacked.

Jeremy: Now can I add two sentences beyond that?

Uri: Please do, please do.

Jeremy: So in the book we write about democracy as a technology. Democracy is the technology that our society and many other societies have chosen to help us navigate really difficult value trade-offs, that as a collective of human beings living together where we can’t have everything we want, not everyone can get the outcomes they want, we have to make some choices.

And you can think about lots of different ways of making those choices. You could think about those choices being made by a single individual, like a king or the Pope, which was one way that societies used to organise themselves. You could think about leaving those decisions to companies, and that’s been a bit of the mode that we’ve been in with Big Tech. And this book is an argument about the role of our democratic institutions in making those choices. And the reason it’s hard to make those choices, and why I chose the words that I did, is that people want different things and they want them very enthusiastically, and they’re very unhappy when they don’t get the things that they want.

So this process of deliberation, and negotiation, and contestation, that’s what politics is all about. And right now we’re at a moment of a tremendous lack of faith in our democratic institutions and an inability to bridge the partisan divides in the United States. But it doesn’t mean that there’s some alternative way to accomplish that underlying task, that is the task of our democracy.

Rob: There’s a mistake that I think I perceive technologists make sometimes – and we discussed this in the book some – the important part for any reader to understand if they’re trying to figure out what’s going on in Big Tech: you don’t need to . . .

Continue reading. There’s more.

Written by Leisureguy

8 September 2021 at 2:56 pm

The sluggish response when a government is no longer focused on public service

leave a comment »

Stephen Engelberg, Editor-in-Chief of ProPublica, sends out a regular newsletter. This is from the one I received this morning:

More than a decade ago, a young reporter named Sasha Chavkin filed a story for ProPublica about the sort of bureaucratic indifference that makes people hate their government. Across the country, thousands of people who had suffered grievous injuries that prevented them from working were being hounded for student loans they had no chance of repaying. Many had been classified as disabled by the Social Security Administration and were already receiving government support. But the Department of Education, which handles loan forgiveness, insisted that borrowers jump through a separate set of hoops to prove they were unable to work. In some cases, the department was garnishing Social Security payments sent to people with disabilities who were in arrears on their loans.

We published Sasha’s story on Feb. 13, 2011. It introduced readers to Tina Brooks, a former police officer who fractured a vertebra in her back and damaged three others in her neck when she plunged 15 feet down a steep quarry while training for bicycle patrol. Although five doctors and a judge from Social Security all agreed that she was fully disabled, Education Department officials continued to insist she pay off $43,000 in loans.

It was one of those stories where each paragraph makes you madder.

“I’m a cop, and I know how to fill out paperwork,” Brooks told Sasha. “But when you’re trying to comply with people and they’re not telling you the rules, I might as well beat my head on the wall.”

ProPublica is unusual among news organizations in that we measure our success by the tangible impact our stories achieve. As editors and reporters, we are trained to try to make every story well-written, fair, solidly documented and maybe even prizeworthy. But Herb and Marion Sandler, the founders of ProPublica, said from the very beginning that they had a higher goal for ProPublica: that our stories should make a difference.

It’s a tough target to hit. Journalists, myself included, are notoriously poor at forecasting which stories will spur change. Sometimes, we reveal utterly outrageous abuses and the reaction is muted. Other times, people explode with anger and change comes overnight. New reporters hired from other organizations frequently ask: What’s a ProPublica story? My answer is that readers should finish one of our investigative articles with a clear understanding of what’s gone wrong and to whom they should send a blistering letter (or email) demanding immediate action.

I expected our 2011 story on disabilities and student loans to prompt swift action. Congress had already demanded that the Department of Education improve its handling of disability cases. An internal audit, which we obtained, had found that the department was failing to follow its own rules. It seemed like a political no-brainer to intervene, both for members of Congress and for the Obama administration. They stood to earn kudos for adopting an approach that is both required by law and a gesture of human decency.

For reasons that are not entirely clear, little of that happened. The Education Department made some modest improvements but continued to insist that people fill out applications for relief. The process remained cumbersome, and the burden remained on the disabled person to prove they were entitled to relief. Few loans were forgiven.

It was only last month that the department announced that it was enacting a new policy in which people deemed severely disabled by the SSA would automatically have their loans forgiven. The technique? A simple computer search that would match the names of people receiving disability payments with names of student loan borrowers. Officials said they would be writing off a staggering $5.8 billion in loans. Clearly, the existing procedures had not worked for the vast majority of disabled borrowers.

I asked Sasha what finally made the difference. His answer, not surprisingly, was politics. The left wing of the Democratic Party, notably Sens. Bernie Sanders and Elizabeth Warren, has been pressuring the Biden administration to launch a broad program of relief for 43 million Americans who owe nearly $1.6 trillion in student loans. President Joe Biden has never endorsed that idea. But as Sasha points out, “this fix for disabled borrowers was something no one could reasonably oppose.” The no-brainer solution, he said, was always out there, but it “took a long time and a lot of unnecessary hardship” before it was politically beneficial to the people with the power to impose change.

It’s worth noting that this story is not yet over. The Department of Education continues to withhold debt relief from a substantial number of student loan borrowers who receive federal disability payments — people whose disabilities the SSA views as serious but that it believes have some chance of easing in the future.

Remarkably, one of the people we interviewed back in 2011, a carpenter and draftsman who suffers from chronic obstructive pulmonary disease, is among those who remain on the hook for his student loans. He has tried to return to work several times since 2011, but his medical problems made that impossible. SSA officials argue that his lung disease might someday improve enough to allow him to work.

“There’s no improving COPD,” the carpenter, Scott Creighton, said in our recent story. “Since I spoke to you last time I’ve had one pulmonary embolism and I’ve had one heart attack.”

Some have argued in recent years that we live in a post-shame era, that spotlighting outrageous wrongdoing no longer brings results. For those who feel that is true, I suggest you visit the page on which we list stories that have had an impact. I hope you’ll find it inspiring. I do.

Steve
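
The “simple computer search” Engelberg mentions is, at bottom, just a match between two lists of records. Purely as an illustration (the Education Department has not published its actual system, and the file names and field names below are hypothetical), here is a minimal sketch in Python of that kind of match:

```python
# Illustrative sketch only: the department's real matching system is not public,
# and these file names and column names are hypothetical.
import csv

def load_ids(path: str, id_field: str) -> set[str]:
    """Return the set of non-empty identifiers found in one column of a CSV file."""
    with open(path, newline="") as f:
        return {row[id_field].strip() for row in csv.DictReader(f) if row[id_field].strip()}

# Hypothetical inputs: SSA's roll of people classified as permanently and totally
# disabled, and the Education Department's roll of outstanding student-loan borrowers.
disabled = load_ids("ssa_disability_rolls.csv", "ssn")
borrowers = load_ids("student_loan_borrowers.csv", "ssn")

# Anyone appearing on both lists would be flagged for automatic loan discharge.
eligible = disabled & borrowers
print(f"{len(eligible)} borrowers flagged for automatic loan discharge")
```

In practice, matching on names alone (as the quoted description loosely puts it) would produce both false matches and misses, so a real match would presumably key on Social Security numbers or another unique identifier, which is what the sketch assumes.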

Written by Leisureguy

8 September 2021 at 12:00 pm

Disinformation’s death toll

leave a comment »

Written by Leisureguy

7 September 2021 at 11:21 am
