Later On

A blog written for those whose interests more or less match mine.

Archive for the ‘Iraq War’ Category

At some level, George W. Bush recognizes what he did in Iraq


Written by Leisureguy

19 May 2022 at 1:10 pm

War sent America off the rails 19 years ago. Could another one bring it back?


Mission accomplished? Not quite. In this May 2003 photo, George W. Bush declares the end of major combat in Iraq as he speaks aboard an aircraft carrier off the California coast. The war dragged on for many years after that. (AP Photo/J. Scott Applewhite, File)

The US invasion of Iraq was an act of hubris that killed hundreds of thousands of people, cost hundreds of billions of dollars, and left a stain on the US that persists to this day. Jason Opal, Associate Professor of History and Chair, History and Classical Studies, McGill University, writes in The Conversation:

At the start of 2022, the right to vote, the rule of law and even the existence of facts seemed to be in grave peril in the United States.

Explanations for this crisis ranged from the decades-long decline of the American middle class to the more recent rise of social media and its unique capacity to spread lies.

In truth, many factors were at play, but the most direct cause of America’s harrowing descent — the one event that arguably set the others in motion — began 19 years ago.

War by choice

On March 19, 2003, George W. Bush and his neoconservative brain trust launched the Iraq war because of the alleged threat of Saddam Hussein’s mothballed weapons [and many pointed out that this threat was fictitious – LG]. Bush and his advisers believed in using military force to spread American political and economic might around the globe.

It was an ideology both foolish and fanatical, the pet project of a tiny circle of well-connected warmongers. Bush himself had lost the popular vote in 2000 and was slumping in the polls before Sept. 11, 2001.

But no one wanted to look weak after the terrorist attacks, and so, in one of the last bipartisan gestures of the past two decades, U.S. senators from Hillary Clinton to Mitch McConnell voted for war in the Middle East.

Having sold the invasion with bad faith and bluster, the neocons planned it with hubris and incompetence. Against the professional advice of the U.S. military, they sought to destroy Saddam Hussein’s regime with minimal ground forces, whereupon they would dismantle the Iraqi state and invite private contractors to somehow rebuild the place.

At first, their fantasies swept to victory. But by 2004, the country they had shattered began to lash out at both the invaders and itself, and by 2006 the singular disaster of our times began to spread.

Butterfly effects

Some two million Iraqis decamped to Syria and Jordan and even more fled to places within Iraq, where the ghoulish seeds of ISIS began to grow.

When ISIS spread following the U.S. withdrawal from Iraq in 2011, a second wave of refugees sought shelter in Europe. This stoked nationalism and helped propel Brexit to a stunning win in the United Kingdom. . .

Continue reading. There’s more. 

The US started the sequence, and the dominoes continued to topple in turn. Karl Rove famously said that the Bush administration created its own reality, but he failed to recognize what a slipshod job it was doing.

Written by Leisureguy

20 March 2022 at 2:15 pm

Why Humans Wage War


WAR HISTORIAN: Margaret MacMillan has a personal interest in her subject. Her father and both her grandfathers served in wars, and her great grandfather was David Lloyd George, Britain’s prime minister during World War I. (Ander McIntyre)

Nautilus in January of last year ran an interesting interview with Margaret MacMillan by Steve Paulson, which began:

In 1991 two hikers in the Italian Alps stumbled on a mummified body buried in the ice. The Iceman, it turned out, died more than 5,000 years ago. At first, archeologists assumed he’d fallen in a snowstorm and frozen to death. Then they discovered various cuts and bruises on his body and an arrowhead embedded in his shoulder. They also found traces of blood on the stone knife he was carrying. Most likely, he died fighting.

Canadian historian Margaret MacMillan regards the Iceman story as emblematic of our violent tendencies. Humans are a quarrelsome lot with a special talent for waging war. In her book War: How Conflict Shaped Us, she argues that warfare is so deeply embedded in human history that we barely recognize its ripple effects. Some are obvious, like the rise and fall of nations, but others can be surprising. For all that we cherish peace, war has also galvanized social and political change, sometimes for the better. It’s also sparked scientific advances. 

MacMillan is the author of several highly regarded histories of war and peace. She also has a personal interest in this subject. Her father and both her grandfathers served in wars, and her great grandfather was David Lloyd George, Britain’s prime minister during World War I. But she says her family history isn’t that unusual. “I’m in my 70s and most of us have had family members who were in the First World War or the Second World War or knew someone who was in either war,” she told me. 

MacMillan synthesizes a vast body of literature about war, from battlefield accounts to theories of war, and she shows how new technologies and weaponry have repeatedly changed the course of history. As I discovered during our conversation, she’s especially interested in the question she poses at the beginning of her book: “Does war bring out the bestial side of human nature or the best?”

Do you think human beings are inherently violent?

I come down on the side that we’re not inherently violent but we may have violent tendencies that evolution has left us. When we’re afraid, we have a tendency to lash out, but I don’t think that means we are necessarily violent. We often see examples of altruism and people living together. What is more important is why people fight—and I’m thinking of war, not just random one-on-one fighting. People fight wars because of organization, ideas, and cultural values. The more organized we are, unfortunately, the better we seem to get at fighting. War is very organized. It’s not the brawl you get outside a bar or the random violence you might get when someone feels frightened.

Steven Pinker says human beings are getting less violent, especially since the Enlightenment. What do you think of his argument?

It’s a very interesting argument, which he makes with great evidence and subtlety. We no longer have prizefights where people batter each other to death. We no longer have public executions. And in most developed societies and many less developed societies, the homicide rates are way down. Your own country, the United States, is something of an outlier there. I think his argument that we are becoming more peaceful in domestic societies is right. But I don’t think that’s war. War is something different.

There’s a very interesting counterargument by Richard Wrangham called “the goodness paradox.” He argues that we have, in fact, become nicer and less violent as individuals. We may have domesticated ourselves by our choice of mates and by breeding out those who are most violent, or killing those who are most violent among us, like the way wolves have been domesticated into friendly dogs who sit on your lap. We may have become nicer as individuals, but we’ve also become better at organizing and using purposive violence. That’s the paradox. We’ve gotten better at making war even as we’ve become nicer people.

Isn’t waging war actually uncommon in the animal kingdom?

Well, our nearest cousins, the chimpanzees, do seem to wage war. Chimpanzees will stake out their own territory and male chimpanzees will go out in bands to patrol that territory. If an unfortunate chimpanzee from another band stumbles into that territory, the chimpanzees will gang up and kill the intruder. But our other close cousins in the animal kingdom, the bonobo, do live in harmony and peace and don’t react with violence to outside bonobos coming in. It may be because chimpanzees have natural predators and bonobos, for geographical reasons, don’t.

It’s worth pointing out that bonobos are matriarchal, whereas chimpanzees are dominated by the big males.

And that leads to a very interesting speculation. Are men more likely to . . .

Continue reading.

Written by Leisureguy

27 February 2022 at 2:12 pm

Great movie: “The Ghost Writer”


Directed by Roman Polanski, with Ewan McGregor, Pierce Brosnan, Olivia Williams, Kim Cattrall, Timothy Hutton, Tom Wilkinson, et al. From 2010; on Netflix. Worth watching closely.

Written by Leisureguy

4 January 2022 at 12:00 am

The Human Toll of America’s Air Wars


Azmat Khan reports in the NY Times on the effects of US Air Force attacks. (Link is a gift link that bypasses the paywall.) The report begins:

For Ali Fathi Zeidan and his extended family, West Mosul was in 2016 still the best of many bad options. Their longtime home in a nearby village, Wana, had been taken by ISIS, then retaken by Kurdish pesh merga forces, and — as if that were not enough — it stood just seven miles below the crumbling Mosul Dam, which engineers had long warned might soon collapse, creating a deluge that would kill everyone in its path. The family had avoided the camps for internally displaced people, where they would have faced a constant risk of separation, and found their way instead to the city, to a grimy industrial neighborhood called Yabisat. They moved into a storage facility, divided it up into separate rooms, brought in a water tank, built a kitchen and a bathroom. Though ISIS had taken Mosul, parts of the city were still relatively safe. Now it was home.

Family was everywhere. Zeidan’s daughter Ghazala was married to a man named Muhammad Ahmed Araj, who grew up in the neighborhood. Araj’s brother, Abdul Aziz Ahmed Araj, lived nearby in a small, crowded apartment. Zeidan’s other daughter moved into an apartment on the other side of Mosul with her husband and their six children, but one of them, 11-year-old Sawsan, preferred to spend her time across town in Yabisat: She was attached to her grandparents and loved playing with her cousins.

Sawsan had been staying with her grandparents for a week when the whole family sat down to dinner on March 5, 2016. All told, there were 21 people around the table. None of them knew that their Iraqi neighborhood was at that moment in the cross hairs of the American military.

Weeks before, Delta Force commandos had captured a high-ranking operative in ISIS’ burgeoning chemical-weapons program, and the information he provided interrogators led military officials to a chemical-weapons production plant in Yabisat; observers had been studying the site for weeks, by way of surveillance flights.

On March 2, military officials presented their findings for validation, as part of the Pentagon’s “deliberate targeting” process, which — as opposed to the rapid process of targeting in the heat of battle — required vetting at multiple levels and stages across the U.S.-led coalition. It had all the makings of a good strike. Unlike with so many other targets, military officials had human intelligence directly from the enemy and video surveillance that showed clear target sites.

They had also concluded that there was no civilian presence within the target compound. Though the surveillance video had captured 10 children playing near the target structure, the military officials who reviewed this footage determined the children would not be harmed by a nighttime strike because they did not live there: They were classified as “transient,” merely passing through during daylight hours.

But as investigators later documented, during the target-validation process one U.S. official disputed this conclusion: A “representative” with the United States Agency for International Development said that the children and their families most likely lived at or around the target compound. In the current environment, she argued, parents would be unlikely to let their children stray far from home. In her view, the determination that there was “no civilian presence” at the target was wrong, and authorizing the strike could lead to the deaths of these children and their parents and families. Military officials dismissed her concerns and authorized the strike.

Three days later, on the evening of March 5, Abdul Aziz heard the explosions, maybe a dozen in all. They came from the direction of his brother’s house. He wanted to see what happened, but because bombings were often accompanied by a second round of missiles, he waited. Later, when he approached the block, he saw the flames and fire consuming what was once his brother’s home. “The place was flattened,” he told me when I first met him, nearly four years later. “It was just rocks and destruction. There was fire everywhere.” They returned at dawn, with blankets to carry the dead. “We searched for our relatives,” he told me, “picking them up piece by piece and wrapping them.”

Across town, Ali Younes Muhammad Sultan, Sawsan’s father, heard the news from his brother. Everyone at the dinner had been killed: Zeidan and his wife, Nofa; Araj, Ghazala and their four children; Zeidan’s adult son Hussein, Hussein’s wife and their six children; Zeidan’s adult son Hassan, Hassan’s wife and their two children; and Sawsan, their own beloved daughter. Sultan and his wife went to the hospital where Sawsan’s remains were taken.

“If it weren’t for her clothes, I wouldn’t have even known it was her,” he later told me. “She was just pieces of meat. I recognized her only because she was wearing the purple dress that I bought for her a few days before. It’s indescribable. I can’t put it into words. My wife — she didn’t even know whether to go to her daughter, or the rest of the family first. It is just too hard to describe. We’re still in denial and disbelief. To this day, we cannot believe what happened. That day changed everything for us.”

In the immediate aftermath of the strike, Defense Department officials lauded it as an intelligence coup. But doubts quickly began to surface. A series of ISIS videos taken at the hospital and the strike site was posted online, showing the burned and bloody corpses of children. The coalition opened a civilian casualty review.

The Pentagon’s review process is one of the few, if indeed not the only, means by which the U.S. military holds itself to account with regard to civilian casualties as it executes its air wars. The coalition has conducted at least 2,866 such assessments since the air war against ISIS in Iraq and Syria began in August 2014, but little more than a dozen of the resulting reports have ever been made public until now. Instead, each month,

Continue reading. There’s much more, including photos and an audio recording of the article read aloud.

Again, this gift link should bypass the paywall — and the report has much more worth reading.

Written by Leisureguy

2 January 2022 at 3:13 pm

The Cost of Sentimentalizing War


Carlos Lozada reviews a new book by Elizabeth Samet in the New Yorker:

The terrorist strikes of September 11, 2001, supposedly launched a new kind of American war, with unfamiliar foes, unlikely alliances, and unthinkable tactics. But the language deployed to interpret this conflict was decidedly old-school, the comfort food of martial rhetoric. With the Axis of Evil, the menace of Fascism (remixed as “Islamofascism”), and the Pearl Harbor references, the Second World War hovered over what would become known as the global war on terror, infusing it with righteousness. This latest war, President George W. Bush said, would have a scope and a stature evoking the American response to that other attack on the U.S. “one Sunday in 1941.” It wouldn’t be like Desert Storm, a conflict tightly bounded in time and space; instead, it was a call to global engagement and even to national greatness. “This generation will lift the dark threat of violence from our people and our future,” Bush avowed.

Elizabeth D. Samet finds such familiarity endlessly familiar. “Every American exercise of military force since World War II, at least in the eyes of its architects, has inherited that war’s moral justification and been understood as its offspring: motivated by its memory, prosecuted in its shadow, inevitably measured against it,” she writes in “Looking for the Good War: American Amnesia and the Violent Pursuit of Happiness” (Farrar, Straus & Giroux). A professor of English at West Point and the author of works on literature, leadership, and the military, Samet offers a cultural and literary counterpoint to the Ambrose-Brokaw-Spielberg industrial complex of Second World War remembrance, and something of a meditation on memory itself. It’s not simply that subsequent fights didn’t resemble the Second World War, she contends; it’s that the war itself does not resemble our manufactured memories of it, particularly the gushing accounts that enveloped its fiftieth anniversary. “The so-called greatness of the Greatest Generation is a fiction,” she argues, “suffused with nostalgia and with a need to return to some finest hour.” Those who forget the past may be condemned to repeat it, but those who sentimentalize the past are rewarded with best-seller status.

The mythology of the Second World War features six main elements, by Samet’s tally: that the United States joined the war in order to rid the world of tyranny and Fascism; that “all Americans were absolutely united” in their commitment to the fight; that “everyone” in the country sacrificed; that Americans got into the war reluctantly and then waged it decently; that the war was tragic but ended on a happy note; and, finally, that “everyone has always agreed” on the first five points.

The word choices here—“all,” “absolutely,” “everyone,” and “always”—do stretch the myths to the point of easy refutability, but some of the best-known popular chronicles clearly display the tendencies Samet decries. “Citizen Soldiers,” Stephen Ambrose’s 1997 book about Allied troops in Europe, presents the reticence of American G.I.s in describing their motivations as a kind of self-conscious idealism and aw-shucks humility. “They knew they were fighting for decency and democracy and they were proud of it,” Ambrose writes. “They just didn’t talk or write about it.” But, without such oral or written records, can one really divine such noble impulses? Samet dismisses Ambrose’s œuvre, including the nineteen-nineties best-sellers, “Band of Brothers” and “D-Day,” as “less historical analysis than comic-book thought bubble.” Obsessed with notions of masculinity and chivalry, Ambrose indulges in “a fantasy that American soldiers somehow preserved a boyish innocence amid the slaughter,” she writes. If anything, the boyish innocence may belong to Ambrose himself, who admits that he grew up venerating veterans of the Second World War, a youthful hero worship that, Samet notes, “tends to overwhelm the historian’s mandate.”

For a more accurate account, Samet highlights a multivolume study, “The American Soldier,” by the sociologist Samuel Stouffer and a team of collaborators. During the war, they studied the ideological motives of American troops, and concluded that, “beyond acceptance of the war as a necessity forced upon the United States by an aggressor, there was little support of attempts to give the war meaning in terms of principles and causes.” Samet finds this real-time depiction of a nonideological American soldier to be credible. In the words of the military sociologist Charles C. Moskos, who studied the motivations of soldiers in the Second World War and in Vietnam, each man fights a “very private war . . . for his own survival.” Or, as John Hersey put it in a later foreword to “Into the Valley,” his narrative of U.S. marines battling on Guadalcanal, the soldiers fought “to get the damn thing over and go home.”

Samet argues that Steven Spielberg’s blockbuster movie “Saving Private Ryan,” from 1998, is “wholly unrepresentative” of Second World War attitudes toward the individual soldier. She contrasts the 1949 film “Twelve O’Clock High,” in which a brigadier general (played by Gregory Peck) insists that his men place collective loyalties above personal ones. After one pilot breaks formation, during a sortie over Nazi Europe, in order to assist a fellow-aviator at risk of being shot down, Peck lashes out, “You violated group integrity. . . . The one thing which is never expendable is your obligation to this group. . . . That has to be your loyalty—your only reason for being.” By focussing on the fate of a single survivor, Samet writes, Spielberg’s film “effectively transforms the conflict from one characterized by mass mobilization and modern industrial warfare to something more old-fashioned, recalling the heroism of ancient epics,” in which individual glories and tragedies take narrative precedence over the wider war.

Samet is particularly harsh on Tom Brokaw’s “The Greatest Generation,” also from 1998, with its “explicitly messianic agenda” of showing us a cohort so packed with honor and honesty and self-sacrifice that it was, as the newsman writes, “birthmarked for greatness.” In a section titled “Shame,” Brokaw acknowledges the racism that was so “pervasive in practice and in policy” in this greatest of eras, but he responds with uplifting sketches of members of racial minorities who manage to overcome it. (“It is my country, right or wrong,” one of them concludes. “None of us can ever contribute enough.”) Samet dissents, stressing, for instance, that the conflict in the Pacific, “begun in revenge and complicated by bitter racism” against the Japanese, has been overshadowed by the less morally troubling sagas of European liberation.

“Unity must always prevail,” Samet writes of the war myths. “Public opinion must turn overnight after Pearl Harbor, while the various regional, racial, and political divisions that roiled the country must be immediately put aside as Americans rally toward a shared cause.” A more complicated reality emerges in Studs Terkel’s 1984 “ ‘The Good War’ ” (the title includes quotation marks because the notion of a good war seemed “so incongruous,” Terkel explained), an oral history that amasses the recollections of wartime merchant marines, admirals, U.S.O. entertainers, G.I.s, and nurses. Their views on the war span “the sentimental and the disillusioned, the jingoistic and the thoughtfully patriotic, the nostalgic and the dismissive,” Samet writes.

To investigate cultural attitudes toward G.I.s in the aftermath of the war, she considers such novels as John Horne Burns’s “The Gallery” (1947), in which American soldiers in Italy engage in black-market transactions with locals; and such movies as “Suddenly” (1954), in which Frank Sinatra portrays a veteran turned contract killer who hopes that his war record will win him sympathy. (“I’m no traitor, Sheriff. I won a Silver Star.”) In other noir films of the era, returning G.I.s are loners disillusioned not just with the war and the years taken from them but also with what their country seemed to have become in their absence: hard, greedy, indifferent. Samet even scours military handbooks, including a 1945 one, memorably titled “112 Gripes About the French,” which admonished American G.I.s that they “didn’t come to Europe to save the French,” or “to do anyone any favors,” so they should stop stomping through the Continent as though expecting everyone’s gratitude. Not exactly “Band of Brothers,” is it?

There is a before-and-after quality to the Second World War in American political writing. The adjective “postwar” still clings to this one conflict, as if no American soldiers had wielded weapons in battle since. But if memories of one conflict shape attitudes toward the next, Samet writes, then the Good War legend has served “as prologue to three-quarters of a century of misbegotten ones.” There’s plenty of support for this quandary. In “A Bright Shining Lie: John Paul Vann and America in Vietnam” (1988), Neil Sheehan identified the “disease of victory,” wherein U.S. leaders, particularly in the military ranks, succumbed to postwar complacency and overconfidence. Samet recalls the reflections of Rear Admiral Gene La Rocque, a Second World War veteran who retired during Vietnam, and who told Terkel that “the twisted memory” of the Good War “encourages the men of my generation to be willing, almost eager, to use military force anywhere in the world.”

Memories of the Good War also helped shape the views of military life held by the men who fought in Vietnam. Samet takes up Philip Caputo’s Vietnam memoir . . .

Continue reading. There’s more — or just read Samet’s book.

I think what the book discusses is why the US is the most war-inclined nation on earth, constantly involved in formal wars and covert military operations.

An Air Force sergeant killed himself on the steps of the Lincoln Memorial. The note he left is heartbreaking.


The US is not doing right by its veterans, nor by its armed forces.

Petula Dvorak writes in the Washington Post (and that’s a gift link: no paywall):

Kenneth Omar Santiago’s perfect smile dazzles on social media as he poses in his Air Force uniforms — flight suits to mess dress.

He accepts military awards, travels to far-off places, salsa dances and swims with sharks to oohs and aahs from friends in Lowell, Mass., his hometown.

“He’s got it all,” more than one commented.

Before Veterans Day, he posted a 1,116-word message, his longest yet.

Then, in a green T-shirt with an American flag emblazoned across his chest, the 31-year-old walked to the steps of the Lincoln Memorial and shot himself.

Statistics tell us at least 16 other members of the military community also took their lives that Monday night and every night — the average daily toll — leading up to Veterans Day, when the nation thanks veterans for their service with a free 10-piece order of boneless chicken wings or a free doughnut.

At 7:09 p.m., minutes after he posted the note, his friends began responding:

“Kenny, you are loved. Do not do this!!”

“Hey, you are not alone!! Rob is trying to call you now.”

“Santi for the love of god don’t do this.”

“Call his unit.”

“Call the cops!”

“Command post is tracking.”

But by then, two nurses visiting the memorial at night were trying to give him CPR. A medevac helicopter flew in minutes later, landing next to the Reflecting Pool to take Santiago to the hospital. He was pronounced dead hours later, 1 a.m. on Tuesday Nov. 9, police said.

Naveed Shah reposted a video of that helicopter landing when he saw it on social media.

It made Shah, an Army veteran and political director of the veteran’s group Common Defense, furious.

“In the past decade that I have spent in veterans advocacy, much has been done about the veterans suicide epidemic with few results,” Shah said. “Santiago’s death in this hallowed place, at this time of reverence for veterans, perhaps should provide pause for government officials and elected leaders in Washington to consider the impact 20 years of wars have had on our armed forces.”

Veterans know it’s bad and it’s going to get worse, with the 20-year anniversary of the Sept. 11 terrorist attacks, the withdrawal of troops from Afghanistan and the covid-19 death rate in the military doubling these past few months.

And when we tell them to go get help, help is hard to find. There’s a “severe occupational staffing shortage” in more than half of the psychiatric facilities veterans are sent to, according to the September Inspector General’s report on the Department of Veterans Affairs.

The struggle to get treatment has always been there for veterans. Take an equally public suicide eight years ago across the National Mall, at the other end of the cross that makes America’s most iconic space. Vietnam War veteran John Constantino saluted the white dome of the Capitol and immolated himself. At the time, his family attorney said it was the result of “a long battle with mental illness.”

Constantino’s death was public, laden with symbolism, just like Santiago’s.

“Nobody ever knows who is struggling or [waging] wars the eye cannot see. What does chronic depression even look like?” Santiago wrote in his note, which he double-posted on Instagram and Facebook, along with a slide show of him as a baby, with family, in Bali, at games, at work. “At times I think my close friends just tolerate me. Moreover, I feel truly alone. I always have. For a long time (years) I’ve known I would take my own life.”

His friends told me they wish he could’ve shared this when he was alive.

“In the military, he had to always have this front, he had to always appear strong,” said Sarah Kanellas, one of his childhood friends from Lowell, Mass. Her partner is in the military, and she knows that no matter what military officials say, there’s a stigma.

“You know how in basic training they break them down so they can build them back up? I get it, I know why they have to do that,” Kanellas said. “But they need to make mental health part of the building back up.”

Military bigwigs say they’re doing this. Defense Secretary Lloyd Austin often says “mental health is health.”

And this week in his Veterans Day statement, Austin said: “We are working so hard to provide the best medical and mental health care possible for those whose military service has concluded. We must prove capable of treating the wounds we see, as well as the ones we cannot see.”

But that message hasn’t trickled down to the troops. . .

Continue reading. Gift link = no paywall.

Written by Leisureguy

11 November 2021 at 10:16 pm

After 9/11, a rush of national unity. Then, quickly, more and new divisions.


Dan Balz had an interesting column in the Washington Post yesterday. (The gift link I used bypasses the paywall.) The column begins:

On Monday, the leaders of Congress are to gather with colleagues at noon for a bipartisan ceremony marking the terrorist attacks of Sept. 11, 2001. It will be reminiscent of the gathering on the night of the attacks, when members of Congress, many holding small American flags, stood on the Capitol steps and spontaneously sang “God Bless America.” But so much has changed.

Twenty years ago, members of Congress were joined in a determined and resilient expression of national unity at an unprecedented moment in the nation’s history, a day that brought deaths and heroism but also shock, fear and confusion. Monday’s ceremony will no doubt be somber in its remembrance of what was lost that day, but it will come not as expression of a united America but simply as a momentary cessation in political wars that rage and have deepened in the years since those attacks.

In a video message to Americans released Friday, President Biden spoke of how 9/11 had united the country and said that moment represented “America at its best.” He called such unity “our greatest strength” while noting it is “all too rare.” The unity that followed the attacks didn’t last long. Americans reverted more quickly than some analysts expected to older patterns of partisanship. With time, new divisions over new issues have emerged, and they make the prospect of a united nation ever more distant.

On a day for somber tribute, the man who was president on 9/11, George W. Bush, spoke most directly of those new divisions — and threats — in a speech in Shanksville, Pa., where Flight 93 went down on the day of the attacks. Bush warned that dangers to the country now come not only across borders “but from violence that gathers from within.” It was an indirect but obvious reference to the attack on the Capitol on Jan. 6.

“There is little cultural overlap between violent extremists abroad and violent extremists at home,” he said. “But in their disdain for pluralism, in their disregard for human life, in their determination to defile national symbols, they are children of the same foul spirit. And it is our continuing duty to confront them.”

The question is often asked: As the United States has plunged deeper into division and discord, is there anything that could spark a change, anything big enough to become a catalyst for greater national unity? But if 9/11 doesn’t fit that model, what does? And look what happened in the aftermath of that trauma.

For a time, the shock of the attacks did bring the country together. Bush’s approval ratings spiked to 90 percent in a rally-round-the-flag reaction that was typical when the country is faced with external threats or crises.

One notable expression of the unity at the time came from Al Gore, the former vice president who had lost the bitter 2000 election to Bush after a disputed recount in Florida and a controversial Supreme Court decision.

Speaking at a Democratic Party dinner in Iowa less than a month after the attacks, Gore called Bush “my commander in chief,” adding, “We are united behind our president, George W. Bush, behind the effort to seek justice, not revenge, to make sure this will never, ever happen again. And to make sure we have the strongest unity in America that we have ever had.” The Democratic audience rose, applauding and cheering.

Trust in government rose in those days after the attacks. Shortly after 9/11, trust in government jumped to 64 percent, up from 30 percent before the attacks, according to Public Opinion Strategies, a Republican polling firm that was closely tracking public attitudes to the attacks. By the summer of 2002, the firm found that trust had fallen back, to 39 percent.


Five years after the attacks, then-Sen. John McCain (R-Ariz.), now deceased, was quoted as saying that America was “more divided and more partisan than I’ve ever seen us.” Today, after many contentious elections, political warfare over economic, cultural and social issues and a domestic attack on the U.S. Capitol on Jan. 6, many Americans would say things have become worse.

As he prepared the U.S. response to the attacks by al-Qaeda in the fall of 2001, Bush made clear the United States would go it alone if necessary, assembling what was called a “coalition of the willing.” He put other nations on notice, saying the United States would hold them accountable in the campaign against the terrorists. “You’re either with us or against us in the fight,” he said.

Bush described the world in Manichaean terms: good vs. evil.

Today’s politics at home is often practiced that way. That phrase — “with us or against us” — could stand as a black-and-white expression of the way in which many Americans approach the political battles: all in with the team, red or blue, or not in at all. If you win, I lose. No middle ground.

Lack of imagination on the part of Americans had helped 9/11 to happen. No one in the upper reaches of government  . . .

Continue reading. No paywall on this one.

Written by Leisureguy

12 September 2021 at 10:35 am

After 9/11, the U.S. Got Almost Everything Wrong


In the Atlantic, Garrett M. Graff, a journalist, historian, and the author of The Only Plane in the Sky: An Oral History of 9/11, lays out the bad decisions made after 9/11 — many of which were strongly opposed at the time; for example, many people (including yours truly) vociferously opposed the stupid invasion of Iraq:

On the Friday after 9/11, President George W. Bush visited the New York City site that the world would come to know as Ground Zero. After rescue workers shouted that they couldn’t hear him as he spoke to them through a bullhorn, he turned toward them and ad-libbed. “I can hear you,” he shouted. “The whole world hears you, and when we find these people who knocked these buildings down, they’ll hear all of us soon.” Everybody roared. At a prayer service later that day, he outlined the clear objective of the task ahead: “Our responsibility to history is already clear: to answer these attacks and rid the world of evil.”

Appearing on NBC’s Meet the Press two days later, Vice President Dick Cheney offered his own vengeful promise. “We also have to work, though, sort of the dark side, if you will,” he told the host, Tim Russert. “We’ve got to spend time in the shadows in the intelligence world. A lot of what needs to be done here will have to be done quietly, without any discussion, using sources and methods that are available to our intelligence agencies, if we’re going to be successful.” He added, “That’s the world these folks operate in, and so it’s going to be vital for us to use any means at our disposal.”

In retrospect, Cheney’s comment that morning came to define the U.S. response to the 2001 terrorist attacks over the next two decades, as the United States embraced the “dark side” to fight what was soon dubbed the “Global War on Terror” (the “GWOT” in gov-speak)—an all-encompassing, no-stone-unturned, whole-of-society, and whole-of-government fight against one of history’s great evils.

It was a colossal miscalculation.

The events of September 11, 2001, became the hinge on which all of recent American history would turn, rewriting global alliances, reorganizing the U.S. government, and even changing the feel of daily life, as security checkpoints and magnetometers proliferated inside buildings and protective bollards sprouted like kudzu along America’s streets.

I am the author of an oral history of 9/11. Two of my other books chronicle how that day changed the FBI’s counterterrorism efforts and the government’s doomsday plans. I’ve spent much of this year working on a podcast series about the lingering questions from the attacks. Along the way, I’ve interviewed the Cassandra-like FBI agents who chased Osama bin Laden and al-Qaeda before the attacks; first responders and attack survivors in New York, Washington, and Pennsylvania; government officials who hid away in bunkers under the White House and in the Virginia countryside as the day unfolded; the passengers aboard Air Force One with the president on 9/11; and the Navy SEALs who killed bin Laden a decade later. I’ve interviewed directors of the CIA, FBI, and national intelligence; the interrogators in CIA black sites; and the men who found Saddam Hussein in that spider hole in Iraq.

As we approach the 20th anniversary of 9/11 on Saturday, I cannot escape this sad conclusion: The United States—as both a government and a nation—got nearly everything about our response wrong, on the big issues and the little ones. The GWOT yielded two crucial triumphs: The core al-Qaeda group never again attacked the American homeland, and bin Laden, its leader, was hunted down and killed in a stunningly successful secret mission a decade after the attacks. But the U.S. defined its goals far more expansively, and by almost any other measure, the War on Terror has weakened the nation—leaving Americans more afraid, less free, more morally compromised, and more alone in the world. A day that initially created an unparalleled sense of unity among Americans has become the backdrop for ever-widening political polarization.

The nation’s failures began in the first hours of the attacks and continue to the present day. Seeing how and when we went wrong is easy in hindsight. What’s much harder to understand is how—if at all—we can make things right.

As a society, we succumbed to fear.

The most telling part of September 11, 2001, was the interval between the first plane crash at the World Trade Center, at 8:46 a.m., and the second, at 9:03. In those 17 minutes, the nation’s sheer innocence was on display.

The aftermath of the first crash was live on the nation’s televisions by 8:49 a.m. Though horrified, many Americans who saw those images still went on about their morning. In New York, the commuter-ferry captain Peter Johansen recalled how, afterward, he docked at the Wall Street Terminal and every single one of his passengers got off and walked into Lower Manhattan, even as papers and debris rained down from the damaged North Tower.

At the White House, National Security Adviser Condoleezza Rice called Bush, who was in Florida. They discussed the crash and agreed it was strange. But Rice proceeded with her 9 a.m. staff meeting, as previously scheduled, and Bush went into a classroom at the Emma E. Booker Elementary School to promote his No Child Left Behind education agenda. At the FBI, the newly arrived director, Robert Mueller, was actually sitting in a briefing on al-Qaeda and the 2000 bombing of the USS Cole when an aide interrupted with news of the first crash; he looked out the window at the bright blue sky and wondered how a plane could have hit the World Trade Center on such a clear day.

Those muted reactions seem inconceivable today but were totally appropriate to the nation that existed that September morning. The conclusion of the Cold War a decade earlier had supposedly ended history. To walk through Bill Clinton’s presidential library in Little Rock today is to marvel at how low-stakes everything in the 1990s seemed.

But after that second crash, and then the subsequent ones at the Pentagon and in the fields outside Shanksville, Pennsylvania, our government panicked. There’s really no other way to say it. Fear spread up the chain of command. Cheney, who had been hustled to safety in the minutes after the second crash, reflected later, “In the years since, I’ve heard speculation that I’m a different man after 9/11. I wouldn’t say that. But I’ll freely admit that watching a coordinated, devastating attack on our country from an underground bunker at the White House can affect how you view your responsibilities.”

The initial fear seemed well grounded. Experts warned of a potential second wave of attacks and of al-Qaeda sleeper cells across the country. Within weeks, mysterious envelopes of anthrax powder began sickening and killing people in Florida, New York, and Washington. Entire congressional office buildings were sealed off by government officials in hazmat suits.

The world suddenly looked scary to ordinary citizens—and even worse behind the closed doors of intelligence briefings. The careful sifting of intelligence that our nation’s leaders rely on to make decisions fell apart. After the critique that federal law enforcement and spy agencies had “failed to connect the dots” took hold, everyone shared everything—every tip seemed to be treated as fact. James Comey, who served as deputy attorney general during some of the frantic post-9/11 era, told me in 2009 that he had been horrified by the unverified intelligence landing each day on the president’s desk. “When I started, I believed that a giant fire hose of information came in the ground floor of the U.S. government and then, as it went up, floor by floor, was whittled down until at the very top the president could drink from the cool, small stream of a water fountain,” Comey said. “I was shocked to find that after 9/11 the fire hose was just being passed up floor by floor. The fire hose every morning hit the FBI director, the attorney general, and then the president.”

According to one report soon after 9/11, a nuclear bomb that terrorists had managed to smuggle into the country was hidden on a train somewhere between Pittsburgh and Philadelphia. This tip turned out to have come from an informant who had misheard a conversation between two men in a bathroom in Ukraine—in other words, from a terrible global game of telephone. For weeks after, Bush would ask in briefings, “Is this another Ukrainian urinal incident?”

Even disproved plots added to the impression that the U.S. was under constant attack by a shadowy, relentless, and widespread enemy. Rather than recognizing that an extremist group with an identifiable membership and distinctive ideology had exploited fixable flaws in the American security system to carry out the 9/11 attacks, the Bush administration launched the nation on a vague and ultimately catastrophic quest to rid the world of “terror” and “evil.”

At the time, some commentators politely noted the danger of tilting at such nebulous concepts, but a stunned American public appeared to crave a bold response imbued with a higher purpose. As the journalist Robert Draper writes in To Start a War, his new history of the Bush administration’s lies, obfuscations, and self-delusions that led from Afghanistan into Iraq, “In the after-shocks of 9/11, a reeling America found itself steadied by blunt-talking alpha males whose unflappable, crinkly-eyed certitude seemed the only antidote to nationwide panic.”

The crash of that second plane at 9:03, live on millions of television sets across the country, had revealed a gap in Americans’ understanding of our world, a gap into which anything and everything—caution and paranoia, liberal internationalism and vengeful militarism, a mission to democratize the Middle East and an ever more pointless campaign amid a military stalemate—might be poured in the name of shared national purpose. The depth of our leaders’ panic and the amorphousness of our enemy led to a long succession of tragic choices.

We chose the wrong way to seek justice.

Before 9/11, the United States had a considered, constitutional, and proven playbook for targeting terrorists: They were arrested anywhere in the world they could be caught, tried in regular federal courts, and, if convicted, sent to federal prison. The mastermind of the 1993 World Trade Center bombing? Arrested in Pakistan. The 1998 embassy bombers? Caught in Kenya, South Africa, and elsewhere. In Sweden on the very morning of 9/11, FBI agents had arrested an al-Qaeda plotter connected to the attack on the USS Cole. The hunt for the plotters of and accomplices to the new attacks could have been similarly handled in civilian courts, whose civil-liberties protections would have shown the world how even the worst evils met with reasoned justice under the law.

Instead, on November 13, 2001, President Bush announced in an executive order that those rounded up in the War on Terror would be treated not as criminals, or even as prisoners of war, but as part of a murky category that came to be known as “enemy combatants.”

While civil libertarians warned of a dark path ahead, Americans seemed not . . .

Continue reading. There’s much more.

Later in the article:

Meanwhile, for all the original talk of banishing evil from the world, the GWOT’s seemingly exclusive focus on Islamic extremism has led to the neglect of other threats actively killing Americans. In the 20 years since 9/11, thousands of Americans have succumbed to mass killers—just not the ones we went to war against in 2001. The victims have included worshippers in churches, synagogues, and temples; people at shopping malls, movie theaters, and a Walmart; students and faculty at universities and community colleges; professors at a nursing school; children in elementary, middle, and high schools; kids at an Amish school and on a Minnesota Native American reservation; nearly 60 concertgoers who were machine-gunned to death from hotel windows in Las Vegas. But none of those massacres were by the Islamic extremists we’d been spending so much time and money to combat. Since 9/11, more Americans have been killed by domestic terrorists than by foreign ones. Political pressure kept national-security officials from refocusing attention and resources on the growing threat from white nationalists, armed militias, and other groups energized by the anti-immigrant, anti-Muslim strains of the War on Terror.

FDR was right: the thing to fear is fear itself — fear leads to panic, and panic leads to bad and ill-considered decisions.

Update: But see also David Corn’s article in Mother Jones: “It’s Not Too Late to Learn the Lessons We Didn’t Learn From 9/11.”

Written by Leisureguy

10 September 2021 at 3:57 pm

Reading John Gray in war


Andy Owen, author of All Soldiers Run Away: Alano’s War: The Story of a British Deserter (2017) and a former soldier who writes on the ethics and philosophy of war, has an interesting essay in Aeon:

‘All of humanity’s problems stem from man’s inability to sit quietly in a room alone.’
Blaise Pascal (1623-62)

I first read the English philosopher John Gray while sitting in the silence of the still, mid-afternoon heat of Helmand Province in Afghanistan. In Black Mass: Apocalyptic Religion and the Death of Utopia (2007), Gray showed how the United States’ president George W Bush and the United Kingdom’s prime minister Tony Blair framed the ‘war on terror’ (which I was part of) as an apocalyptic struggle that would forge the new American century of liberal democracy, where personal freedom and free markets were the end goals of human progress. Speaking at the Sydney Writers’ Festival in 2008, Gray highlighted an important caveat to the phrase ‘You can’t have an omelette without breaking eggs,’ which is sometimes used, callously, to justify extreme means to high-value ends. Gray’s caveat was: ‘You can break millions of eggs and still not have a single omelette.’ In my two previous tours of Iraq, I had seen first-hand – as sectarian hatred, insurgency, war fighting, targeted killings and the euphemistically named collateral damage tore apart buildings, bodies, communities and the shallow fabric of the state – just how many eggs had been broken and yet still how far away from the omelette we were.

There was no doubt that Iraq’s underexploited oil reserves were part of the US strategic decision-making, and that the initial mission in Afghanistan was in response to the terrorist attacks of 11 September 2001 on the US, but both invasions had ideological motivations too. I had started the process to join the British military before 9/11. The military I thought I was joining was the one that had successfully completed humanitarian interventions in the Balkans and Sierra Leone. I believed we could use force for good, and indeed had a duty to do so. After the failure to prevent genocides in Rwanda and Srebrenica, the concept of the ‘responsibility to protect’ was developing, which included the idea that when a state was ‘unable or unwilling’ to protect its people, responsibility shifted to the international community and, as a last resort, military intervention would be permissible. It would be endorsed by all member states of the United Nations (UN) in 2005 but, under the framework, the authority to employ the last resort rested with the UN Security Council, who hadn’t endorsed the invasion of Iraq.

Despite the lack of a UN resolution, many of us who deployed to Iraq naively thought we were doing the right thing. When Lieutenant Colonel Tim Collins delivered his eve-of-battle speech to the Royal Irish Battle Group in March 2003, he opened by stating: ‘We go to liberate, not to conquer.’ We had convinced ourselves that, as well as making the region safer by seizing the Iraqi president Saddam Hussein’s weapons of mass destruction (WMD), we were there to save the people of Iraq from their own government and replace it with the single best way of organising all societies: liberal democracy. This feeling was so persuasive that it led to many troops feeling that the Iraqis were somehow ungrateful when they started to shoot at us for invading their country.

By my second tour of Iraq in 2005, it was clear that no WMD would be found and the society that was evolving was far from the one envisaged. Morale was at a low ebb as the gap between the mission and what we were achieving widened. We were stuck in a Catch-22. We would hand over to local security forces when the security situation improved enough for us to do so. However, the security situation couldn’t improve while we were still there. It would improve only if we left. The conditions that would allow us to leave were us already having left. Most troops were stuck inside the wire, their only purpose seemingly to be mortared or rocketed for being there. I was asked why we were there, especially when soldiers witnessed their friends being injured or killed, or saw the destruction of the city we’d come to liberate. They needed meaning, it couldn’t all be pointless. Meaning was found in protecting each other. My team of 30 or so men and women found purpose in trying to collect intelligence on those planting deadly improvised explosive devices along the main routes in and out of the city. Members of both the team before and the team after us were blown up trying to do so.

Much of the criticism levelled at the post-invasion failure focused on the mistake of disbanding the Iraqi state, the lack of post-conflict planning and the lack of resources. There was less focus on the utopian aims of the whole project. But it was only through Gray that I saw the similarities between the doctrines of Stalinism, Nazi fascism, Al-Qaeda’s paradoxical medieval, technophile fundamentalism, and Bush’s ‘war on terror’. Gray showed that they are all various forms (however incompatible) of utopian thinking that have at their heart the teleological notion of progress from unenlightened times to a future utopia, and a belief that violence is justified to achieve it (indeed, from the Jacobins onwards, violence has had a pedagogical function in this process). At first, I baulked at the suggested equivalence with the foot soldiers of the other ideologies. There were clearly profound differences! But through Gray’s examples, I went on to reflect on how much violence had been inflicted throughout history by those thinking that they were doing the right thing and doing it for the greater good.

A message repeated throughout Gray’s work is that, despite the irrefutable material gains, this notion is misguided: scientific knowledge and the technologies at our disposal increase over time, but there’s no reason to think that morality or culture will also progress, nor – if it does progress for a period – that this progress is irreversible. To think otherwise is to misunderstand the flawed nature of our equally creative and destructive species and the cyclical nature of history. Those I spoke to in Basra needed no convincing that the advance of rational enlightened thought was reversible, as the Shia militias roamed the streets enforcing their interpretation of medieval law, harassing women, attacking students and assassinating political opponents. By the time bodies of journalists who spoke out against the death squads started turning up at the side of the road, Basra’s secular society was consigned to history. Gray points to the re-introduction of torture by the world’s premier liberal democracy during the war on terror as an example of the reversibility of progress. The irreversibility idea emerged directly from a utopian style of thinking that’s based on the notion that the end justifies the means. Such thinking is often accompanied by one of the defining characteristics of the Iraq and Afghanistan campaigns: hubris.

The myth of progress was a key theme of Gray’s . . .

Continue reading.

Written by Leisureguy

31 July 2021 at 8:46 pm

Facing Years in Prison for Drone Leak, Daniel Hale Makes His Case Against U.S. Assassination Program


This article by Ryan Devereaux in The Intercept is a must-read:

THE MISSILES THAT killed Salim bin Ahmed Ali Jaber and Walid bin Ali Jaber came in the night. Salim was a respected imam in the village of Khashamir, in southeastern Yemen, who had made a name for himself denouncing the rising power of Al Qaeda’s franchise in the Arabian Peninsula. His cousin Walid was a local police officer. It was August 21, 2012, and the pair were standing in a palm grove, confronting a trio of suspected militants, when the Hellfires made impact.

The deaths of the two men sparked protests in the days that followed, symbolizing for many Yemenis the human cost of U.S. counterterrorism operations in their country. Thousands of miles away, at the U.S. military’s base in Bagram, Afghanistan, Daniel Hale, a young intelligence specialist in the U.S. Air Force, watched the missiles land. One year later, Hale found himself sitting on a Washington, D.C., panel, listening as Salim’s brother, Faisal bin Ali Jaber, recalled the day Salim was killed.

As Fazil recounted what happened next, I felt myself transported back in time to where I had been on that day, 2012. Unbeknownst to Fazil and those of his village at the time was that they had not been the only ones watching Salem approach the jihadist in the car. From Afghanistan, I and everyone on duty paused their work to witness the carnage that was about to unfold. At the press of a button, from thousands of miles away, two Hellfire missiles screeched out of the sky, followed by two more. Showing no signs of remorse, I, and those around me, clapped and cheered triumphantly. In front of a speechless auditorium, Fazil wept.

Hale recalled the emotional moment and others stemming from his work on the U.S. government’s top-secret drone program in an 11-page, handwritten letter filed in the U.S. District Court for the Eastern District of Virginia this week.

Secret Evidence

Hale was indicted by a grand jury and arrested in 2019 on a series of counts related to the unauthorized disclosure of national defense and intelligence information and the theft of government property. In March, the 33-year-old pleaded guilty to leaking a trove of unclassified, secret, and top-secret documents to a news organization, which government filings strongly implied was The Intercept. His sentencing is scheduled for next week.

The Intercept “does not comment on matters relating to the identity of anonymous sources,” Intercept Editor-in-Chief Betsy Reed said at the time of Hale’s indictment. “These documents detailed a secret, unaccountable process for targeting and killing people around the world, including U.S. citizens, through drone strikes,” Reed noted. “They are of vital public importance, and activity related to their disclosure is protected by the First Amendment.”

Federal prosecutors are urging Judge Liam O’Grady to issue a maximum sentence, up to 11 years in prison, arguing that Hale has shown insufficient remorse for his actions, that his disclosures were motivated by vanity and not in the public interest, and that they aided the United States’ enemies abroad — namely the Islamic State.

“These documents contained specific details that adversaries could use to hamper and defeat actions of the U.S. military and the U.S. intelligence community,” the government claimed. “Indeed, they were of sufficient interest to ISIS for that terrorist organization to further distribute two of those documents in a guidebook for its followers.”

Prosecutors have acknowledged, however, that Hale’s sentencing was “in an unusual posture” because the probation officer in the case, who makes recommendations to the court, “has not seen some of the key facts of the case,” namely those that the government says support its claim that Hale’s disclosures had the potential to cause “serious” or “exceptionally grave” harm to U.S. national security. The Intercept has not reviewed the documents in question, which remain under seal, shielded from public scrutiny.

Harry P. Cooper, a former senior official in the CIA and noted agency expert on classified materials who did review the documents, provided a declaration in Hale’s case on the potential national security threat posed by the release of the documents.

Cooper, who maintains a top-secret clearance and has trained top-level officials at the agency, including the director of the CIA, said that while some of the documents did constitute so-called national defense information, “the disclosure of these documents, at the time they were disclosed and made public, did not present any substantial risk of harm to the United States or to national security.”

Commenting on the government’s claim that Hale’s disclosures were circulated by ISIS, Cooper said, “such publication further supports my conclusions, because it suggests that the adversaries treated the documents as trophies rather than as something that would give a tactical advantage, given that publication would reduce to zero any tactical advantage that the documents might otherwise have given.”

“In short,” Cooper said, “an adversary who has gained a tactical advantage by receiving secret information would never publicize their possession of it.”

Hale was charged under the Espionage Act, a highly controversial 1917 law that has become a favored tool of federal prosecutors pursuing cases of national security leaks. The law bars the accused from using motivations such as informing the public as a defense against incarceration, and yet, Hale’s alleged personal motivations and character came up repeatedly in a sentencing memo filed this week, with prosecutors arguing that he was “enamored of journalists” and that as a result, “the most vicious terrorists in the world” obtained top-secret U.S. documents.

In their own motion filed this week, Hale’s lawyers argued that the former intelligence analyst’s motivations were self-evident — even if the government refused to recognize them. “The facts regarding Mr. Hale’s motive are clear,” they wrote. “He committed the offense to bring attention to what he believed to be immoral government conduct committed under the cloak of secrecy and contrary to public statements of then-President Obama regarding the alleged precision of the United States military’s drone program.”

Hidden Assassinations

Legal experts focused on the drone program strongly dispute the prosecution’s claim that Hale’s disclosures did not provide a significant public service. Indeed, for many experts, shedding light on a lethal program that the government had tried to keep from public scrutiny for years is vital.

“The disclosures provided important information to the American public about a killing program that has virtually no transparency or accountability, and has taken a devastating toll on civilian lives abroad in the name of national security,” said Priyanka Motaparthy, director of the Counterterrorism, Armed Conflict and Human Rights Project at Columbia Law School. “They helped reveal how some of the most harmful impacts of this program, in particular the civilian toll, were obscured and hidden.”

Thanks in large part to the government’s efforts to keep the drone program under tight secrecy, the task of calculating the human impact of the program has been left to investigative journalists and independent monitoring groups. The numbers that these groups have compiled over the years show a staggering human cost of these operations. The U.K.-based Bureau of Investigative Journalism, or TBIJ, estimates the total number of deaths from drones and other covert killing operations in Pakistan, Afghanistan, Yemen, and Somalia to run between 8,858 and 16,901 since strikes began to be carried out in 2004.

Of those killed, as many as 2,200 are believed to have been civilians, including several hundred children and multiple U.S. citizens, including a 16-year-old boy. The tallies of civilian casualties are undoubtedly an undercount of the true cost of the drone war — as Hale’s letter to the court this week and the documents he allegedly made public show, the people who are killed in American drone strikes are routinely classified as “enemies killed in action” unless proven otherwise.

Following years of pressure — and in the wake of the publication of the materials Hale is accused of leaking — the Obama administration introduced new requirements for reporting civilian casualties from covert counterterrorism operations to the public in 2016, disclosing that year that between 64 and 116 civilians were believed to have been killed in drone strikes and other lethal operations. However, the Trump administration revoked that meager disclosure requirement, leaving the public once again in the dark about who exactly is being killed and why. . .

Continue reading. There's more and it's important because it shows an aspect of the US that one normally associates with the baddies. Some of what the US has done — a drone strike on a wedding party, for example — is functionally equivalent to terrorism.

Written by Leisureguy

25 July 2021 at 4:56 pm

Rachel Maddow speaks on Frederick Douglass

leave a comment »

Rachel Maddow:

In 1845, Frederick Douglass, the great American abolitionist, published the first of what would become three autobiographical accounts of his life. The first one was called Narrative of the Life of Frederick Douglass: An American Slave.

Frederick Douglass is, of course, one of the greatest Americans of all time. His autobiographies about life as a slave and his struggle to become free, in addition to everything else he did in his life, those written works are some of the most influential written American accounts of anything on any subject.

In Narrative of the Life, which is the most widely read of his three autobiographical accounts, but also in the subsequent autobiographies he wrote as well, including the next one, My Bondage and My Freedom, one of the most harrowing things that Frederick Douglass describes about his own life is a yearlong period when the man who owned him as a slave decided that young Frederick Douglass was incorrigible.

Douglass’ owner decided that Frederick Douglass needed in effect to be tamed, to be broken. And so he shipped Frederick Douglass off to a man that is literally known as a slave breaker. The slave breaker was named Edward Covey. C-O-V-E-Y.

This is part of how Frederick Douglass describes him in My Bondage and My Freedom. He says, quote, "I had now lived with him [meaning his slave owner] nearly nine months, and he had given me a number of severe whippings, without any visible improvement in my character or my conduct. Now he was resolved to put me out as he said, quote, to be broken."

There was, in the Bay Side, a man named Edward Covey, who enjoyed the execrated reputation of being a first rate hand at breaking young Negroes. Breaking.

Frederick Douglass then goes on in chapter after chapter after chapter in this autobiography. Look at this. The experiences at Covey's, unusual brutality at Covey, driven back to Covey's. You know, Covey's manner of proceeding to whip, right? Chapter after chapter after chapter, he describes this experience, the way that Edward Covey tortured him and beat him nearly to death and worked him nearly to death, all to try to destroy Frederick Douglass' spirit, to try to destroy his mind, to turn him into a docile slave who would work without question, whereupon he would then be returned to his owner.

And because Douglass is so capable and so brilliant, his own recounting of what happened to him in that period of his life, what happened to him when his slave owner sent him to Edward Covey, what happened to him at Edward Covey's hands, what happened to him when he stayed for a year at Edward Covey's farm and Covey was tasked there with breaking him, because Frederick Douglass is such a luminous, important, brilliant, inspiring, incredible figure, unparalleled figure in American history, because of what we know he is capable of, because of what we know his mind was capable of and what he did for his country in his life, when he recounts what happened to him at the hands of Edward Covey, it is the most dispiriting and desolate and just miserable thing that Douglass writes about.

He wrote:

I shall never be able to narrate the mental experience through which it was my lot to pass during my stay at Edward Covey’s. I was completely wrecked, changed and bewildered, goaded almost to madness at one time and at another reconciling myself to my wretched condition. I suffered bodily as well as mentally.

“The overwork and brutal chastisements of which I was the victim, combined with that ever gnawing and soul devouring thought, I am a slave, a slave for life, a slave with no rational ground to hope for freedom, it rendered me a living embodiment of mental and physical wretchedness.

That was Frederick Douglass’ account of his own life in that lowest period in his own life. And that written account did more than any other to galvanize the American abolitionist movement to bring an end to slavery. Of course, it was not fiction. It really happened and it happened as Frederick Douglass said it did and Edward Covey was a real person who operated a slave breaking operation at his farm to which Frederick Douglass was sent.

Now, if you go back to that initial description, Douglass describes Covey's farm as being on the Bay Side. What he meant by that is that the farm was on the far side of Chesapeake Bay, the far side of Chesapeake Bay from the mainland of Maryland, which is where Douglass was being sent from.

Edward Covey's farm, his slave breaking operation in which he tortured Frederick Douglass and countless others, was this house and its surrounding farmland on the eastern shore of Maryland, in a town that's now called St. Michael's.

The farm and the house at the farm itself had a name, a fitting name. It was called Mount Misery.

About 15 years ago now, a literature professor wrote a very thoughtful piece in the Baltimore Sun newspaper suggesting a new future for Mount Misery, suggesting that the United States of America should consider buying Mount Misery to make it a commemorative site. He argued: would not the most fitting outcome for Mount Misery be as a monument or museum wherein a key moment from the country's past can find a rightful place in the public memory? The old Edward Covey house deserves our understanding and preservation; the fight between slave and slave breaker that took place there is emblematic of two of the elemental themes of American history, the horrors of legally sanctioned racial violence and also the nobility of the struggle against it.

And then here's actually the kicker from that piece. The professor writes, "Preserving Mount Misery as a public site of contemplation where the meanings of democracy and despotism are given a human face also would help keep St. Michael's from being merely a resort for the wealthy."

A resort for the wealthy? Check this out. The occasion for that call, that well-argued piece in the Baltimore Sun arguing that Mount Misery should be purchased and preserved by this country as a monument to the epic violence committed there against slaves in great numbers but specifically against one of the greatest Americans of all time, and the key role that the torture in that house played in turning our American conscience to eventually overthrow slavery, the occasion for that call to preserve Mount Misery as a monument to the hell that happened there, the reason the Baltimore Sun published that just less than 15 years ago now was this revelation that was published in the New York Times exactly 15 years ago today.

On June 30th, 2006, it’s titled “Weekends with the President’s Men.” It is kind of a kicky sidebar piece in the New York Times that was published in the summer of 2006. And that piece revealed that that site on the eastern shore of Maryland, Mount Misery, that house, that farm had actually been recently purchased and was now being lived in as a private home.

Can you imagine, right? First of all, the house is still called Mount Misery today. That's still the name by which it is known. Who would want to live in a place called Mount Misery?

But then you get to the reason that it's called Mount Misery, right? It was the home, the same building standing there since 1804. Frederick Douglass was tortured there in 1833 and 1834. It's the same actual physical place in which the great Frederick Douglass was tortured and beaten and worked nearly to death every day for a year.

Whether or not you think that place should be purchased by this country and made into a memorial for the worst, most violent evils of slavery and their role in turning America's conscience to end slavery, again, that's a substantive and interesting proposal. Whether or not you are into that idea, would you want to live there yourself? Would you like to wake up there in the morning and plan breakfast, have that be your home? Who would do that?

That article published in the New York Times 15 years ago today was actually controversial at the time that it was published because in writing that piece it did reveal the exact home address of a senior government official who in fact had made Mount Misery his private home. His name is Donald Rumsfeld, and he was at the time, the summer of 2006, struggling to the end of his disastrous tenure as Secretary of Defense in the George W. Bush administration.

He lived at the time at Mount Misery. He bought the place in 2003 as he was leading the nation into the invasion of Iraq. That was where he went to get away from Washington while running two disastrous wars. He would like to have the Chinook helicopter drop him off at the slave breaker's home where Douglass was tortured nearly to death. He could relax there.

Written by Leisureguy

1 July 2021 at 10:31 pm

How Rumsfeld Deserves to Be Remembered

leave a comment »

George Packer vocally and enthusiastically backed the US invasion of Iraq during the George W. Bush administration, when Donald Rumsfeld was Secretary of Defense and Dick Cheney was Vice President. Cheney and Rumsfeld seemed to work closely together. Packer now writes in the Atlantic:

In 2006, soon after I returned from my fifth reporting trip to Iraq for The New Yorker, a pair of top aides in the George W. Bush White House invited me to lunch to discuss the war. This was a first; until then, no one close to the president would talk to me, probably because my writing had not been friendly and the administration listened only to what it wanted to hear. But by 2006, even the Bush White House was beginning to grasp that Iraq was closer to all-out civil war than to anything that could be called “freedom.”

The two aides wanted to know what had gone wrong. They were particularly interested in my view of the secretary of defense, Donald Rumsfeld, and his role in the debacle. As I gave an assessment, their faces actually seemed to sag toward their salads, and I wondered whether the White House was so isolated from Iraqi reality that top aides never heard such things directly. Lunch ended with no explanation for why they’d invited me. But a few months later, when the Bush administration announced Rumsfeld’s retirement, I suspected that the aides had been gathering a case against him. They had been trying to push him out before it was too late.

Rumsfeld was the worst secretary of defense in American history. Being newly dead shouldn’t spare him this distinction. He was worse than the closest contender, Robert McNamara, and that is not a competition to judge lightly. McNamara’s folly was that of a whole generation of Cold Warriors who believed that Indochina was a vital front in the struggle against communism. His growing realization that the Vietnam War was an unwinnable waste made him more insightful than some of his peers; his decision to keep this realization from the American public made him an unforgivable coward. But Rumsfeld was the chief advocate of every disaster in the years after September 11. Wherever the United States government contemplated a wrong turn, Rumsfeld was there first with his hard smile—squinting, mocking the cautious, shoving his country deeper into a hole. His fatal judgment was equaled only by his absolute self-assurance. He lacked the courage to doubt himself. He lacked the wisdom to change his mind.

Rumsfeld was working in his office on the morning that a hijacked jet flew into the Pentagon. During the first minutes of terror, he displayed bravery and leadership. But within a few hours, he was already entertaining catastrophic ideas, according to notes taken by an aide: “best info fast. Judge whether good enough [to] hit S.H. [Saddam Hussein] @ same time. Not only UBL [Osama bin Laden].” And later: “Go massive. Sweep it all up. Things related and not.” These fragments convey the whole of Rumsfeld: his decisiveness, his aggression, his faith in hard power, his contempt for procedure. In the end, it didn’t matter what the intelligence said. September 11 was a test of American will and a chance to show it.

Rumsfeld started being wrong within hours of the attacks and never stopped. He argued that the attacks proved the need for the missile-defense shield that he’d long advocated. He thought that the American war in Afghanistan meant the end of the Taliban. He thought that the new Afghan government didn’t need the U.S. to stick around for security and support. He thought that the United States should stiff the United Nations, brush off allies, and go it alone. He insisted that al-Qaeda couldn’t operate without a strongman like Saddam. He thought that all the intelligence on Iraqi weapons of mass destruction was wrong, except the dire reports that he’d ordered up himself. He reserved his greatest confidence for intelligence obtained through torture. He thought that the State Department and the CIA were full of timorous, ignorant bureaucrats. He thought that America could win wars with computerized weaponry and awesome displays of force.

He believed in regime change but not in nation building, and he thought that a few tens of thousands of troops would be enough to win in Iraq. He thought that the quick overthrow of Saddam’s regime meant mission accomplished. He responded to the looting of Baghdad by saying “Freedom’s untidy,” as if the chaos was just a giddy display of democracy—as if it would not devastate Iraq and become America’s problem, too. He believed that Iraq should be led by a corrupt London banker with a history of deceiving the U.S. government. He faxed pages from a biography of Che Guevara to a U.S. Army officer in the region to prove that the growing Iraqi resistance did not meet the definition of an insurgency. He dismissed the insurgents as “dead-enders” and humiliated a top general who dared to call them by their true name. He insisted on keeping the number of U.S. troops in Iraq so low that much of the country soon fell to the insurgency. He focused his best effort on winning bureaucratic wars in Washington.

By the time Rumsfeld was fired, in November 2006, the U.S., instead of securing peace in one country, was losing wars in two, largely because of actions and decisions taken by Rumsfeld himself. As soon as he was gone, the disaster in Iraq began to turn around, at least briefly, with a surge of 30,000 troops, a policy change that Rumsfeld had adamantly opposed. But it was too late. Perhaps it was too late by the early afternoon of September 11.

Rumsfeld had intelligence, wit, dash, and endless faith in himself. Unlike McNamara, he never expressed a quiver of regret. He must have died in the secure knowledge that he had been right all along.

Written by Leisureguy

1 July 2021 at 6:31 pm

Donald Rumsfeld, despicable person, dies at age 88

leave a comment »

Update: See also “The Hell Donald Rumsfeld Built.”

Spencer Ackerman has a good obituary of Rumsfeld in the Daily Beast:

The only thing tragic about the death of Donald Rumsfeld is that it didn’t occur in an Iraqi prison. Yet that was foreordained, considering how throughout his life inside the precincts of American national security, Rumsfeld escaped the consequences of decisions he made that ensured a violent, frightening end for hundreds of thousands of people.

An actuarial table of the deaths for which Donald Rumsfeld is responsible is difficult to assemble. In part, that’s a consequence of his policy, as defense secretary from 2001 to 2006, not to compile or release body counts, a PR strategy learned after disclosing the tolls eroded support for the Vietnam War. As a final obliteration, we cannot know, let alone name, all the dead.

But in 2018, Brown University's Costs of War Project put together something that serves as the basis for an estimate. According to Neta C. Crawford, Brown's political-science department chair, the Afghanistan war to that point claimed about 147,000 lives, to include 38,480 civilians; 58,596 Afghan soldiers and police (about as many American troops as died in Vietnam); and 2,401 U.S. servicemembers.

Rumsfeld was hardly the only person in the Bush administration responsible for the Afghanistan war. But in December 2001, under attack in Kandahar, where it had retreated from the advance of U.S. and Northern Alliance forces, the Taliban sought to broker a surrender—one acceptable to the U.S.-installed Afghan leader Hamid Karzai. At the Pentagon, Donald Rumsfeld refused. “I do not think there will be a negotiated end to the situation, that’s unacceptable to the United States,” he said. That statement reaped a 20-year war, making it fair to say that the subsequent deaths are on his head, even while acknowledging that Rumsfeld was hardly the only architect of the conflict.

Crawford in 2018 also tallied between 267,792 and 295,170 deaths to that point in Iraq. That is almost certainly a severe undercount, and it includes between a very conservatively estimated 182,000 to 204,000 civilians; over 41,000 Iraqi soldiers and police; and 4,550 U.S. servicemembers. As one of the driving forces behind the invasion and the driving force behind the occupation, Rumsfeld is in an elite category of responsibility for these deaths, alongside his protege Dick Cheney and the president they served, George W. Bush.

Rumsfeld's depredations short of the wars of choice he oversaw—and yes, responding to 9/11 with war in Afghanistan was no less a choice than the unprovoked war of aggression in Iraq—were no less severe. His indifference to the suffering of others was hardly unique among American policymakers after 9/11, but his blitheness about it underscored the cruel essence of the enterprise. When passed a sheet of paper that, in bureaucratic language, pitched a torture technique of forcing men held captive at Guantanamo Bay to stand for hours on end, Rumsfeld scribbled a shrug on it: "I stand for 8-10 hours a day." Months earlier, when Rumsfeld was banking on using the U.S. military to invade Iraq, a reporter asked about using U.S. forces to provide security for rebuilding Afghanistan at a moment before Taliban resistance coalesced. "Ah, peacekeeping," he sneered in return, explaining that such tasks were beneath U.S. forces.

But to those forces, for whom he was responsible, he was no less indifferent. In Kuwait in December 2004, National Guardsmen preparing for deployment confronted Rumsfeld in the hope of enlisting his help with a dire circumstance. They were scrounging through scrap heaps for metal to weld onto their insufficiently armored vehicles so the RPGs they were sure to encounter wouldn’t kill them. Rumsfeld let it be known that the war mattered, not the warfighter. “You go to war with the Army you’ve got, not the Army you might want or wish to have at a later time,” he replied.

If Rumsfeld was indignant at the question, it reflected the unreality he inhabited and the lies he told as easily as he breathed. He wrapped himself in a superficial understanding of epistemology (“there are known knowns; there are things we know we know. We also know there are known unknowns…”) that a compliant press treated as sagacity. He wore a mask of assuredness, a con man’s trick, as he said things that bore no resemblance to the truth, such as his September 2002 insistence that he possessed “bulletproof” evidence of a nonexistent alliance between Saddam Hussein and al-Qaeda. As resistance in Iraq coalesced in summer 2003, Rumsfeld said it couldn’t be “anything like a guerrilla war or an organized resistance,” even as a reporter quoted U.S. military doctrine explaining why it was. He insisted, “I don’t do quagmires” when quagmires were all he did.

He had reason to suspect he would get away with it. Manipulating the media was, to Rumsfeld, a known known, since reporters loved Rumsfeld before they hated him. U.S. News & World Report put a grinning Rumsfeld on the cover above the headline "Rum Punch." ("A Secretary of War Unlike Any Other… You Got A Problem With That?") Vanity Fair dispatched Annie Leibovitz to photograph him amongst Bush's war cabinet. People magazine called him the "sexiest cabinet member" in 2002. A typical thumbsucker piece, this one in the Los Angeles Times of August 17, 2003, began with the falsity that "Donald H. Rumsfeld has won two wars and won them his way…" The conservative press reflected the subtext. "The Stud" was what National Review called the septuagenarian Rumsfeld as it depicted him in a come-hither pose. . .

Continue reading. There’s much more. He wasn’t any good at his job, and he never recognized that nor expressed any regret or remorse. He was a man who couldn’t be bothered.

Written by Leisureguy

1 July 2021 at 10:28 am

“I Fought in Afghanistan. I Still Wonder, Was It Worth It?”

leave a comment »

Timothy Kudo, formerly a USMC captain, served in Iraq and Afghanistan and writes in the NY Times:

When President Biden announced on Wednesday that the United States would withdraw all its troops from Afghanistan by Sept. 11, 2021, he appeared to be finally bringing this “forever war” to an end. Although I have waited for this moment for a decade, it is impossible to feel relief. The Sept. 11 attacks took place during my senior year of college, and the wars in Iraq and Afghanistan that followed consumed the entirety of my adult life. Although history books may mark this as the end of the Afghanistan war, it will never be over for many of my generation who fought.

Sometimes there are moments, no more than the span of a breath, when the smell of it returns and once again I’m stepping off the helicopter ramp into the valley. Covered in the ashen dust of the rotor wash, I take in for the first time the blend of wood fires burning from inside lattice-shaped mud compounds, flooded fields of poppies and corn, the sweat of the unwashed and the wet naps that failed to mask it, chicken and sheep and the occasional cow, the burn pit where trash and plastic smoldered through the day, curries slick with oil eaten by hand on carpeted dirt floors, and fresh bodies buried shallow, like I.E.D.s, in the bitter earth.

It’s sweet and earthy, familiar to the farm boys in the platoon who knew that blend of animal and human musk but alien to those of us used only to the city or the lush Southern woods we patrolled during training. Later, at the big bases far from the action, surrounded by gyms and chow halls and the expeditionary office park where the flag and field grade officers did their work, it was replaced by a cologne of machinery and order. Of common parts installed by low-bid contractors and the ocher windblown sand of the vast deserts where those behemoth bases were always located. Relatively safe after the long months at the frontier but dull and lifeless.

Then it’s replaced by the sweet, artificial scents of home after the long plane ride back. Suddenly I’m on a cold American street littered with leaves. A couple passes by holding hands, a bottle of wine in a tote bag, dressed for a party, unaware of the veneer that preserves their carelessness.

I remain distant from them, trapped between past and present, in the same space you sometimes see in the eyes of the old-timers marching in Veterans Day parades with their folded caps covered in retired unit patches, wearing surplus uniforms they can’t seem to take off. It’s the space between their staring eyes and the cheering crowd where those of us who return from war abide.

My war ended in 2011, when I came home from Afghanistan eager to resume my life. I was in peak physical shape, had a college degree, had a half-year of saved paychecks and would receive an honorable discharge from the Marine Corps in a few months. I was free to do whatever I wanted, but I couldn’t bring myself to do anything.

Initially I attributed it to jet lag, then to a need for well-deserved rest, but eventually there was no excuse. I returned to my friends and family, hoping I would feel differently. I did not.

“Relax. You earned it,” they said. “There’s plenty of time to figure out what’s next.” But figuring out the future felt like abandoning the past. It had been just a month since my last combat patrol, but I know now that years don’t make a difference.

At first, everyone wanted to ask about the war. They knew they were supposed to but approached the topic tentatively, the way you hold out a hand to an injured animal. And as I went into detail, their expressions changed, first to curiosity, then sympathy and finally to horror.

I knew their repulsion was only self-preservation. After all, the war cost nothing to the civilians who stayed home. They just wanted to live the free and peaceful lives they’d grown accustomed to — and wasn’t their peace of mind what we fought for in the first place?

After my discharge, I moved to . . .

Continue reading. There’s more.

The Endgame of the Reagan Revolution

leave a comment »

Heather Cox Richardson writes a good summary of modern American political history:

And so, we are at the end of a year that has brought a presidential impeachment trial, a deadly pandemic that has killed more than 338,000 of us, a huge social movement for racial justice, a presidential election, and a president who has refused to accept the results of that election and is now trying to split his own political party.

It’s been quite a year.

But I had a chance to talk with history podcaster Bob Crawford of the Avett Brothers yesterday, and he asked a more interesting question. He pointed out that we are now twenty years into this century, and asked what I thought were the key changes of those twenty years. I chewed on this question for a while and also asked readers what they thought. Pulling everything together, here is where I've come out.

In America, the twenty years since 2000 have seen the end game of the Reagan Revolution, begun in 1980.

In that era, political leaders on the right turned against the principles that had guided the country since the 1930s, when Democratic President Franklin Delano Roosevelt guided the nation out of the Great Depression by using the government to stabilize the economy. During the Depression and World War Two, Americans of all parties had come to believe the government had a role to play in regulating the economy, providing a basic social safety net and promoting infrastructure.

But reactionary businessmen hated regulations and the taxes that leveled the playing field between employers and workers. They called for a return to the pro-business government of the 1920s, but got no traction until the 1954 Brown v. Board of Education decision, when the Supreme Court, under the former Republican governor of California, Earl Warren, unanimously declared racial segregation unconstitutional. That decision, and others that promoted civil rights, enabled opponents of the New Deal government to attract supporters by insisting that the country’s postwar government was simply redistributing tax dollars from hardworking white men to people of color.

That argument echoed the political language of the Reconstruction years, when white southerners insisted that federal efforts to enable formerly enslaved men to participate in the economy on terms equal to white men were simply a redistribution of wealth, because the agents and policies required to achieve equality would cost tax dollars and, after the Civil War, most people with property were white. This, they insisted, was “socialism.”

To oppose the socialism they insisted was taking over the East, opponents of black rights looked to the American West. They called themselves Movement Conservatives, and they celebrated the cowboy who, in their inaccurate vision, was a hardworking white man who wanted nothing of the government but to be left alone to work out his own future. In this myth, the cowboys lived in a male-dominated world, where women were either wives and mothers or sexual playthings, and people of color were savage or subordinate.

With his cowboy hat and western ranch, Reagan deliberately tapped into this mythology, as well as the racism and sexism in it, when he promised to slash taxes and regulations to free individuals from a grasping government. He promised that cutting taxes and regulations would expand the economy. As wealthy people—the "supply side" of the economy—regained control of their capital, they would invest in their businesses and provide more jobs. Everyone would make more money.

From the start, though, his economic system didn’t work. Money moved upward, dramatically, and voters began to think the cutting was going too far. To keep control of the government, Movement Conservatives at the end of the twentieth century ramped up their celebration of the individualist white American man, insisting that America was sliding into socialism even as they cut more and more domestic programs, insisting that the people of color and women who wanted the government to address inequities in the country simply wanted “free stuff.” They courted social conservatives and evangelicals, promising to stop the “secularization” they saw as a partner to communism.

After the end of the Fairness Doctrine in 1987, talk radio spread the message that Black and Brown Americans and “feminazis” were trying to usher in socialism. In 1996, that narrative got a television channel that personified the idea of the strong man with subordinate women. The Fox News Channel told a story that reinforced the Movement Conservative narrative daily until it took over the Republican Party entirely.

The idea that people of color and women were trying to undermine society was enough of a rationale to justify keeping them from the vote, especially after Democrats passed the Motor Voter law in 1993, making it easier for poor people to register to vote. In 1997, Florida began the process of purging voter rolls of Black voters.

And so, 2000 came.

In that year, the presidential election came down to the electoral votes in Florida. Democratic candidate Al Gore won the popular vote by more than 540,000 votes over Republican candidate George W. Bush, but Florida would decide the election. During the required recount, Republican political operatives led by Roger Stone descended on the election canvassers in Miami-Dade County to stop the process. It worked, and the Supreme Court upheld the end of the recount. Bush won Florida by 537 votes and, thanks to its electoral votes, became president. Voter suppression was a success, and Republicans would use it, and after 2010, gerrymandering, to keep control of the government even as they lost popular support.

Bush had promised to unite the country, but his installation in the White House gave new power to the ideology of the Movement Conservative leaders of the Reagan Revolution. He inherited a budget surplus from his predecessor Democrat Bill Clinton, but immediately set out to get rid of it by cutting taxes. A balanced budget meant money for regulation and social programs, so it had to go. From his term onward, Republicans would continue to cut taxes even as budgets operated in the red, the debt climbed, and money moved upward.

The themes of Republican dominance and tax cuts were the backdrop of the terrorist attack of September 11, 2001. That attack gave the country’s leaders a sense of mission after the end of the Cold War and, after launching a war in Afghanistan to stop al-Qaeda, they set out to export democracy to Iraq. This had been a goal for Republican leaders since the Clinton administration, in the belief that the United States needed to spread capitalism and democracy in its role as a world leader. The wars in Afghanistan and Iraq strengthened the president and the federal government, creating the powerful Department of Homeland Security, for example, and leading Bush to assert the power of the presidency to interpret laws through signing statements.

The association of the Republican Party with patriotism enabled Republicans in this era to call for increased spending for the military and continued tax cuts, while attacking Democratic calls for domestic programs as wasteful. Increasingly, Republican media personalities derided those who called for such programs as dangerous, or anti-American.

But while Republicans increasingly looked inward to their party as the only real Americans and asserted power internationally, changes in technology were making the world larger. The Internet put the world at our fingertips and enabled researchers to decode the human genome, revolutionizing medical science. Smartphones made communication easy. Online gaming created communities and empathy. And as many Americans were increasingly embracing rap music and tattoos and LGBTQ rights, as well as recognizing increasing inequality, books were pointing to the dangers of the power concentrating at the top of societies. In 1997, J.K. Rowling began her exploration of the rise of authoritarianism in her wildly popular Harry Potter books, but her series was only the most famous of a number of books in which young people conquered a dystopia created by adults.

In Bush’s second term, his ideology created a perfect storm. His . . .

Continue reading. There’s much more.

Enhanced interrogation in practice: This Soldier’s Witness to the Iraq War Lie

leave a comment »

Torturing people who are merely suspects is something done by totalitarian governments — and also by the US under George W. Bush. We know it happened, even though the CIA deliberately destroyed the video recordings of the torture.

Frederic Wehrey writes in the NY Review of Books:

A few weeks before I deployed to Iraq as a young US military officer, in the spring of 2003, my French-born father implored me to watch The Battle of Algiers, Gillo Pontecorvo’s dramatic reenactment of the 1950s Algerian insurgency against French colonial rule. There are many political and aesthetic reasons to see this masterpiece of cinéma vérité, not least of which is its portrayal of the Algerian capital’s evocative old city, or Casbah. One winter morning in 2014, more than a decade after I first saw the film, I took a stroll down the Casbah’s rain-washed alleys and into the newer French-built city. Scenes from the black-and-white movie—like the landmark Milk Bar café where a female Algerian guerrilla sets off a bomb that kills French civilians—jumped to life. The ensuing French military response, memorably depicted in the film, included arbitrary arrests, torture, and “false flag” bombings that only inflamed the Algerian insurrection.

It was these moral perils of counterinsurgency that my father hinted at. “Keep your eyes open,” he told me. This was a prescient warning, one that served as the backdrop for my deployment, even if the Algerian analogy was imperfect and would become overused. As American soldiers soon faced a guerrilla and civil war in Iraq for which they were woefully ill-equipped, intellectually and militarily, The Battle of Algiers would be screened and discussed at the Pentagon. To this day, it is taught to West Point cadets as a cautionary tale.

Still, the full weight of the film’s lessons was not apparent to me in Iraq until one morning in the summer of 2003, when I received an urgent phone call about a captured Iraqi intelligence officer. My commander wanted me to go interview him at the Baghdad hospital where he was being treated for unspecified wounds.

I donned my Kevlar vest and grabbed my carbine for the trip to the so-called Green Zone in the city center, which was becoming increasingly dangerous because of bomb attacks and ambushes by a growing insurgency.

My own experience with this militancy was mostly of a distant nature—though my encounters were anything but impersonal. As an intelligence officer, I debriefed Iraqi sources and informants on insurgent groups and foreign fighters, which sometimes yielded detailed information that US soldiers would use to conduct raids, looking for weapons, explosives, insurgents, or wanted ex-regime figures. Since I read the after-action reports of these operations, I learned the names and ages of those who were captured. Sometimes, I even saw photographs of their faces. This established a sort of intimacy, a chain of causality between my actions and their fates.

In collecting the intelligence that drove these raids, I tried to vet and verify what I heard. Ninety percent of the information I discarded after rounds of questions. Much of it was outright fabrication by Iraqis seeking financial reward or favors from the US military. Others were trying to lure American soldiers into helping them settle personal scores or eliminate their political, commercial, or sectarian rivals. The remainder of the information sometimes proved valid. And the resulting seizure of militants, weapons, or bomb-making materials did save lives.

On occasion, though, we did not sufficiently corroborate the information before an assault, or we got the location wrong. In the aftermath of such misdirected predawn raids on innocent Iraqi civilians, I remembered Pontecorvo’s film and would ask myself: “How many new insurgents did we just create?”

All of this was a departure from the original focus of my deployment, which was to interview former Iraqi officials about Iraq’s suspected weapons of mass destruction (WMD). But once the insurgency started attacking American soldiers, Iraqis, and international organizations, US military commanders demanded that more intelligence resources be devoted to penetrating the insurgents’ networks—especially since the hunt for Saddam’s nuclear, chemical, and biological weapons was going nowhere.

Even so, I continued to chase down any leads I got on WMD. And that was what I assumed this call about the detained Iraqi spy was about. Instead, when I got to the hospital room in the Green Zone, I found myself seated across from a man who had been at the center of one of the biggest lies behind the US decision to invade Iraq.

When Ahmed Khalil Ibrahim Samir al-Ani was posted to the Iraqi embassy in Prague in the late 1990s under diplomatic cover, he quickly came under surveillance by the Czech security service. One morning in early April of 2001, an Arab informant working for the Czechs reported seeing al-Ani meeting with an Arab student at the Iraqi embassy. This student was identified, according to the report, as an Egyptian named Mohamed Atta—the man who, not long after, became the ringleader of the hijackers who carried out al-Qaeda’s terrorist attacks on the World Trade Center and the Pentagon on September 11, 2001.

The CIA and FBI later punched holes in this story; the Czech president himself subsequently repudiated it. To begin with, the informant had identified Atta as the man from the April 2001 meeting only upon seeing his photo published in the news after September 11. The FBI’s records of Atta put him in Virginia and Florida immediately before and after the supposed Prague meeting, and the agency uncovered no evidence of international travel. But none of this stopped the Iraq war hawks in the Bush administration from seizing on the so-called Prague Connection as proof of Saddam Hussein’s supposed complicity in terrorist attacks on American soil—and using it as a casus belli for the 2003 invasion.

There at the Baghdad hospital, I joined an FBI agent in questioning the bedridden al-Ani about his time in the Czech Republic. A diminutive man with a grizzled face creased by bouts of pain, he epitomized the type of drab regime functionary I’d come to know in Iraq all too well. He answered our questions straightforwardly. In the end, the hours-long session provided no evidence about the Prague meeting to contradict the debunking that had already appeared in the press. Al-Ani had never met Mohamed Atta or even heard of him until he saw news reports after September 11. Nor was he himself even in Prague on the day of the alleged encounter; he was out of town, seventy miles away.

Even more disturbing than this non-revelation, though, was his account of his capture that summer by US special operations forces and the reason for his hospitalization. Snatching him from his Baghdad home at night, US soldiers had bound his wrists, covered his head, and forced him to lie on the floor of a Humvee for the long trip to a detention facility. Within fifteen minutes of his confinement in the vehicle, he felt an unbearable burning sensation. A Humvee’s engine is located in the front and conducts heat to the rear bed, where al-Ani was lying facedown on the bare metal. He twisted and writhed from the pain, but his American guards thought he was resisting. One of the soldiers stepped harder on his back with his boot. “Jesus, Jesus, please,” he’d cried, he told me, hoping that this invocation in English would get them to relent.

In front of us in the hospital, he lifted his gown to show us the results: severe burns, in dark-hued patches, covered his stomach, thighs, feet, and palms. As a consequence, al-Ani would endure three months of hospitalization, which involved multiple skin grafts, as well as the amputation of his thumb and the loss of movement of a finger.

After the meeting, I relayed his account of these injuries to my commanding general, who later reported the matter to a Senate inquiry into detainee abuses. The US Department of Justice also included the FBI’s account of this same interview in the inspector general’s 2008 report on detainee interrogations. And, over several years, the US Army investigated the incident, concluding that al-Ani’s injuries were consistent with his story and that “the offences of Assault and Cruelty and Maltreatment was [sic] substantiated.” Despite that finding, the Army dropped the case.

To my knowledge, nobody was ever disciplined or punished for al-Ani’s mistreatment.

*

It is a cruel irony that this Iraqi man was first used as a prop for an American invasion and then subjected to disfiguring violence by soldiers who had carried out that invasion. But his story weighs on me in other ways. The abuses we’ve seen in US policing have deep, homegrown roots, but I am convinced that they are also partly a result of the militarization of law enforcement born of the Iraq War and America’s other overseas interventions. The Iraq disaster has rippled across virtually every facet of American life, deepening the inequalities that divide us, stirring a popular contempt for “expertise” that has opened the door to demagoguery, and contributing to the hollowing-out of our infrastructure and institutions in ways that have left the country dangerously exposed to future shocks.

The Iraq debacle is what the journalist Robert Draper, in his engrossing recent book on the decision to oust Saddam, To Start a War: How the Bush Administration Took America into Iraq, correctly calls . . .

Continue reading.

Written by Leisureguy

16 September 2020 at 4:59 pm

Why the International Criminal Court will investigate possible U.S. war crimes — even if the Trump administration says it can’t

leave a comment »

Kelebogile Zvobgo writes in the Washington Post:

Judges in the Appeals Chamber of the International Criminal Court on Thursday authorized Chief Prosecutor Fatou Bensouda to open an investigation into alleged U.S. war crimes in Afghanistan. This is a big milestone in international criminal justice — for the first time in history, U.S. leaders, armed forces and intelligence personnel may face a trial in an international court for crimes perpetrated in the context of the nation’s wars abroad.

In April, the Pre-Trial Chamber rejected Bensouda’s first request for an investigation. On Thursday, Secretary of State Mike Pompeo condemned the Appeals Chamber’s overturning of the decision, calling the ICC “an unaccountable political institution masquerading as a legal body.”

What are the alleged abuses? How does the ICC have jurisdiction over the United States? What will ordinary U.S. citizens make of an ICC investigation? My research explains how U.S. citizens are more supportive of the ICC than the Trump administration's rhetoric suggests.

The ICC prosecutor examined evidence of U.S. torture and abuse

In 2006, the ICC’s Office of the Prosecutor (OTP) opened a preliminary examination into allegations of war crimes and crimes against humanity in the Afghan conflict since 2003 — the year Afghanistan became a member of the ICC.

The OTP examined allegations of abuses by both anti-government and pro-government forces, including the Taliban, the Afghan National Security Forces, the United States armed forces, and the CIA. The OTP says the information it gathered indicates, among other allegations, that U.S. interrogation techniques used in Afghanistan — involving "torture, cruel treatment, outrages upon personal dignity, and rape" — amount to war crimes.

Some ICC judges are worried about going after the U.S.

The United States is not a member of the ICC. However, the treaty that created the court, the Rome Statute, allows it to investigate citizens of nonmember states if the alleged crimes occurred on the territory of a member state. Once Afghanistan ratified the Rome Statute and joined the ICC in 2003, U.S. military and intelligence personnel in Afghanistan came under the court’s jurisdiction.

In November 2017 — after more than a decade of gathering evidence — the prosecutor requested authorization to open a full investigation, arguing there was “a reasonable basis to believe” U.S. military and intelligence personnel committed war crimes.

A year and a half later, in April 2019, the Pre-Trial Chamber unanimously rejected the request. The judges agreed the request was in the ICC’s jurisdiction and admissible before the ICC. However, they claimed the investigation would probably not be successful and, therefore, it would not serve the interests of justice to proceed.

The 2019 decision sparked controversy in the human rights community. Amnesty International and Human Rights Watch issued statements criticizing the court’s judges for capitulating to the Trump administration’s threats and, in the process, abandoning the victims of the alleged crimes.

The ICC will move ahead, despite the political risks

Bensouda swiftly appealed the decision. Her office coordinated a multifaceted response, drawing on submissions from victims’ legal representatives and amicus curiae briefs from human rights organizations.

On Thursday, the Appeals Chamber unanimously reversed the Pre-Trial Chamber’s decision, saying it had gone beyond its power by rejecting the prosecutor’s request. The Rome Statute requires only that the Pre-Trial Chamber determine whether “there is a reasonable basis to proceed with an investigation” and whether “the case appears to fall within the jurisdiction of the Court.”

Since these facts were not in dispute, there was no basis to reject the prosecutor’s request. Last week’s decision authorizes . . .

Continue reading.

Written by Leisureguy

11 March 2020 at 4:14 pm

Is Facebook Mark Zuckerberg’s Revenge for the Iraq War?

leave a comment »

Peter Canellos offers an interesting perspective in Politico, and I can agree that when the George W. Bush administration was pushing the US to invade Iraq — a totally discretionary move, since Iraq posed no danger whatsoever to the US (unlike, say, Saudi Arabia, the home of 15 of the 19 9/11 hijackers) — the mainstream media at that time seemed to go right along, downplaying any reports that undermined the push to war. (Not all of the mainstream media: the Atlantic published several lengthy articles that made a cogent argument against the attack and invasion, including a piece by James Fallows titled, as I recall, "Iraq: The 51st State.")

Canellos writes:

Mark Zuckerberg’s recent media blitz included a lot of scripted lines that belie his intentions—such as his assertion during a cozy chat with News Corp CEO Robert Thomson that journalism is crucial for democracy—and one that rings strikingly, resoundingly true: His claim at an October 17 speech at Georgetown University that his views on free expression were shaped by his collegiate frustrations over the failure of the mainstream media to expose the weaknesses of the Bush administration’s case for war in Iraq.

The comment passed with relatively little notice, except among skeptics who saw it as a self-serving, ex-post-facto justification for Facebook’s reluctance to impose constraints on its users’ political assertions. But it was a rare personal admission from one of the least-known and most privacy-obsessed of moguls, and offered an organic, true-to-his-experiences explanation for his decisions at Facebook, many of which have proved to be ruinous for the mainstream media. It turns out it wasn’t just the profit motive that drove Facebook to become the prime source of information around the world; Zuckerberg wished to supplant the mainstream media out of something closer to real animus.

“When I was in college, our country had just gone to war in Iraq,” he explained. “The mood on campus was disbelief. It felt like we were acting without hearing a lot of important perspectives. The toll on our soldiers, families and our national psyche was severe, and most of us felt powerless to stop it. I remember feeling that if more people had a voice to share their experiences, maybe things would have gone differently. Those early years shaped my belief that giving everyone a voice empowers the powerless and pushes society to be better over time.”

This is the closest Zuckerberg has ever come to acknowledging a formative event, an aha moment, that shapes his perceptions of the relative merits of the mainstream media and social media. And it feels authentic to the moment; by late 2003, when the 19-year-old computer whiz was pondering the world from a Cambridge dorm room, it had started to dawn on the country that many of the justifications for the Iraq war were faulty—especially the reports of weapons of mass destruction. Young people rightly extended their anger from the Bush administration to the mainstream media that had failed to alert the country to the flimsiness of the government’s case.

If there was any doubt that those resentments linger, Zuckerberg laced his speech with encomiums to the fresh, clean air of direct democracy and backhanded swipes at the mildewed professional media. “People having the power to express themselves at scale is a new kind of force in the world—a Fifth Estate alongside the other power structures of society,” he declared. “People no longer have to rely on traditional gatekeepers in politics or media to make their voices heard, and that has important consequences.”

He defended political ads on Facebook as a voice for the voiceless, saying he considered banning them but decided against it because “political ads are an important part of the voice—especially for local candidates, up-and-coming challengers, and advocacy groups that may not get much media attention otherwise. Banning political ads favors incumbents and whoever the media covers.”

The specter of a 35-year-old mogul making off-the-cuff decisions about how much speech (or “voice”) is healthy for society engenders a queasy feeling. It suggests that Elizabeth Warren and others may be right that too much monopolistic power exists on one platform — especially one that coyly presents itself as an innocent conduit for information while blithely acknowledging its governing power over constitutional liberties. But pending future action, such power is indeed vested in the character and values of Mark Zuckerberg.

Zuckerberg’s criticism of mainstream media might be honestly earned. Like Vietnam before it, the debate over the Iraq war dominates the political attitudes of a big slice of the generation that grew up around it. But it also represents only one window on the much larger, and more complicated, question of how best to provide a check and balance to the power of government, and to properly inform the populace. Zuckerberg may have come to his views sincerely, through his own impressions. Like other youthful conversions, they may be very hard to shake. But they aren’t remotely the last word on the question.

For while Zuckerberg may be open about his intentions, he can seem almost willfully blind to their consequences. In his speech, he tries to capture the long arc of American history, veering from the civil rights movement to the repression of socialists during World War I to the era of #MeToo and #BlackLivesMatter. He quotes Frederick Douglass and Martin Luther King Jr. But he never mentions the words “conspiracy theory” or “Donald Trump.”

That left a ghost in the lecture hall at Georgetown, shadowing all of Zuckerberg’s pronouncements and justifications: the abject failure of his chosen mode of communication in the 2016 election, a lapse that threatens to recur if not corrected and that carries more enduring consequences for America than the sins of the mainstream media in the early 2000s.

***

Back when a handful of major news outlets held outsized influence over the national political dialogue, it was common to rail against these unelected gatekeepers. By habitually returning to the mean, insisting on reporting whose candidacy seemed most viable and whose views comported with Main Street assumptions, those media arbiters perpetuated a bland centrism, or so the theory went. They chopped the ends off of the political spectrum, left and right. People who challenged the system had to struggle to be taken seriously.

This critique found a persuasive advocate in the late Ross Perot, who happened to be both a fan of conspiracy theories (particularly regarding POWs) and the CEO of a data firm. Almost three decades ago, when the only web on anyone’s mind was Charlotte’s, Perot envisioned a running national plebiscite, in which average citizens voted like senators. They would simply plug their choices into their home computers, thereby diminishing the importance of Congress and the media’s control of the national debate surrounding its actions.

Perot’s vision of a daily Brexit has yet to come to pass, but his desire to . . .

Continue reading.

Written by Leisureguy

8 November 2019 at 6:15 pm

Trump Is Doing the Same Thing on Iran That George W. Bush Did on Iraq

leave a comment »

Understandable in that Trump is devoid of originality and thus naturally must copy. Jonathan Chait writes in New York:

Last week, intelligence officials testified publicly that Iran has not resumed its efforts to acquire a nuclear weapon. The next day, President Trump called these officials “extremely passive and naive when it comes to the dangers of Iran,” and advised, “Perhaps Intelligence should go back to school!”

The first-blush response to this presidential outburst was to dump it in the same category as Trump’s other public eruptions against members of his government who undercut his preferred narratives with inconvenient facts. That response is probably correct: this Trump tantrum is probably like all the other Trump tantrums. But there is another possible meaning to this episode: Trump’s rejection of intelligence assessments of Iran’s weapons of mass destruction capabilities eerily echoes the Bush administration’s handling of intelligence on Iraq’s WMD capabilities a decade and a half earlier.

Shortly after their testimony, the intelligence officials were summoned to the Oval Office for a photographed session in which they publicly smoothed over their breach with the president, and (according to Trump) assured him that their remarks had been misconstrued, despite having been delivered in public and broadcast in their entirety. Yet Trump’s interview broadcast Sunday with Margaret Brennan on CBS made clear how little headway they made in regaining his trust.

Trump told Brennan he plans to maintain troops in Iraq because, “I want to be able to watch Iran … We’re going to keep watching and we’re going to keep seeing and if there’s trouble, if somebody is looking to do nuclear weapons or other things, we’re going to know it before they do.” But would he accept the assessments that he received? No, Trump replied, he wouldn’t.

His reason for rejecting this intelligence was consistent. Trump is unable to separate the question “Do I like Iran’s government and its foreign policy?” from the question “Is Iran building a nuclear weapon?” Tell Trump that Iran is abiding by its nuclear commitment, and what he hears you saying is, “Iran is a lovely state run by wonderful people.”

If that account of Trump’s thinking sounds too simplistic, just look at his answers:

I’m not going to stop [intelligence officials] from testifying. They said they were mischaracterized — maybe they were maybe they weren’t, I don’t really know — but I can tell you this, I want them to have their own opinion and I want them to give me their opinion. But, when I look at Iran, I look at Iran as a nation that has caused tremendous problems …

My intelligence people, if they said in fact that Iran is a wonderful kindergarten, I disagree with them 100 percent. It is a vicious country that kills many people …

So when my intelligence people tell me how wonderful Iran is — if you don’t mind, I’m going to just go by my own counsel.

In fact, intelligence officials did not deny Iran has caused problems. They simply asserted facts about its nuclear program. Trump cannot hear those facts without translating them into Iran being comprehensively “wonderful.”

Even more remarkably, Trump explained that intelligence assessments could not be trusted because they had failed in the run-up to the Iraq war:

MARGARET BRENNAN: I want to move on here but I should say your intel chiefs do say Iran’s abiding by that nuclear deal. I know you think it’s a bad deal, but—

PRESIDENT DONALD TRUMP: I disagree with them. I’m — I’m — by the way—

MARGARET BRENNAN: You disagree with that assessment?

PRESIDENT DONALD TRUMP: —I have intel people, but that doesn’t mean I have to agree. President Bush had intel people that said Saddam Hussein—

MARGARET BRENNAN: Sure.

PRESIDENT DONALD TRUMP: —in Iraq had nuclear weapons — had all sorts of weapons of mass destruction. Guess what? Those intel people didn’t know what the hell they were doing, and they got us tied up in a war that we should have never been in.

Trump’s understanding of this history is almost perfectly backwards. U.S. intelligence officials never said Iraq “had nuclear weapons,” or even anything close to that. They did overstate Iraqi weapons capabilities. But — crucially — the Bush administration also pressured intelligence agencies to inflate their findings, as John Judis and Spencer Ackerman reported in 2003, and administration officials overstated the intelligence that was produced, as the Senate Intelligence Committee found in 2008.

The backdrop to this episode does have some important differences from the current moment. The Bush administration had been plunged into an adrenal panic by the 9/11 attacks. Its rush toward war was largely choreographed by Dick Cheney, a skilled bureaucratic operator, and enjoyed broad public legitimacy created by the national unity bestowed upon Bush by the surprise attack. None of these conditions apply to the easily distracted, childlike, and deeply unpopular sitting president.

And while Cheney has departed the scene, National Security Adviser John Bolton has assumed a somewhat parallel role. An ultrahawk with a long record of punishing subordinates who undermine the factual basis for his preferred policies, Bolton has emerged as Trump’s most influential foreign policy adviser. Bolton in 2015 insisted that Iran was racing toward a nuclear weapon. (“Even absent palpable proof, like a nuclear test, Iran’s steady progress toward nuclear weapons has long been evident.”) He likewise concluded that diplomacy could never work (“The inescapable conclusion is that Iran will not negotiate away its nuclear program”) and that “only military action” could stop it.

As Trump has grown alienated from his national security apparatus, Bolton appears to be the one remaining official who has retained a measure of his trust. And while he may not have a Cheney-like ability to manipulate the president, Bolton does benefit from a near vacuum in rival power sources. . .

Continue reading.

Written by Leisureguy

4 February 2019 at 4:46 pm
