Archive for February 2013
The first thing Petty Officer 2nd Class Rebecca Blumer realized upon waking was that she was freezing cold and naked. The second thing was that her body ached all over. Blumer groggily scanned the unfamiliar room for clues. She saw a concrete floor splotched with vomit, a metal door and a window onto a hallway, where a woman in an orange jumpsuit was sweeping.
“Where am I?” Blumer called hoarsely.
“Richmond County jail,” the inmate told her.
Blumer shivered. “I need to see a doctor,” she whispered.
The woman nodded. “You’ve been screaming that all night.”
Blumer sat back in shock. She was a normally cheerful 23-year-old Navy intelligence analyst stationed at Fort Gordon, a vast Army base of 15,000 military employees in Augusta, Georgia. Blumer, whose job was to sift through top-secret data, was part of a thousand-member naval unit. The night before, February 12th, 2010, she and some friends had gone to a bar not far from base for a couple of beers. Three Army guys – one with light hair, the other two dark-haired – had sent Blumer a shot of Jägermeister, a drink she didn’t care much for but had downed anyway. The light-haired man had rounded the bar to talk to her. The last thing Blumer remembered was being overwhelmed by a dizzy, sluggish feeling, her limbs and head too heavy to lift, the noises in the bar rising up and caving in on her. Only later would Blumer find out the rest: that at 1:40 a.m., police had noticed her driving with her headlights off. That she’d barely been able to stand upright during her field sobriety test, but when placed under arrest she’d gone berserk, trying to break free of the police car and screaming incoherently. In jail, she’d yelled for a doctor and fought with the cops so wildly that she’d been hosed down in an effort to quiet her. Now, crouching in her cell with a swollen jaw; bruises smudging her wrists, ankles and neck; her abdomen sore inside; and her lower back and buttocks afire with what felt like rug burn, it dawned on Blumer. She’d been roofied and raped.
She was desperate to get back to the safety of Fort Gordon. “I need to go to the hospital,” a panicked Blumer told the master-at-arms when he arrived to return her to base. Sitting in the car wearing the previous night’s turtleneck and jeans, Blumer reminded herself that she was in good hands. She came from a long military tradition; while in the cavalry, her great-grandfather had once been stationed at this very base. So Blumer was confused when, arriving at Fort Gordon, the master-at-arms drove her not to the military hospital but to the Judge Advocate General’s offices, where a half-dozen members of her chain of command were solemnly waiting in their black dress uniforms to discipline her for driving under the influence.
Blumer, a standout sailor with an unblemished record, was sure she could clear things up. She wrote a statement in the crowded office that described her suspicions about what had actually occurred, and her urgent need for medical attention. Then she obediently left the room so her superiors could discuss the matter. When she was allowed in a few minutes later, Blumer was told that she would be taken to the hospital – but with orders only for a toxicology report, to see if there really were date-rape drugs in her system. “Whether you get a rape kit is up to you,” the female JAG prosecutor cautiously told Blumer, who struggled to make sense of what was happening: The military she’d trusted to care for her wasn’t interested in caring for her at all. She was even more shaken by the JAG’s jarring question later on: “Did you inflict your injuries yourself?”
The implication floored Blumer. “How could anyone even think that I would do that to myself?” she says now. It was Blumer’s first glimpse of a hidden side of military culture, in which rapes, and the sweeping aside of rapes, happen with disturbing regularity. And it was her first sense of what lay in store after coming forward as a military rape victim: that she would be treated with suspicion by those charged with helping her, penalized by command and ostracized by her unit. “Once my assault happened,” Blumer says, “my whole future disappeared.”
The scandal of rape in the U.S. Armed Forces, across all of its uniformed services, has become inescapable. Last year saw the military’s biggest sex-abuse scandal in a decade, when an investigation at Lackland Air Force Base in San Antonio revealed that 32 basic-training instructors preyed on at least 59 recruits. In Fort Bragg, North Carolina, Army Brig. Gen. Jeffrey Sinclair is currently facing court-martial for sex-crimes charges, including forcible sodomy, for alleged misconduct against five women. In October, an Air Force technical sergeant filed an administrative complaint describing a work environment of comprehensive harassment, edged with menace and punctuated by violent assaults – one in which all women are “bitches,” and in which, during a routine meeting in a commander’s office, she was instructed to take off her blouse and “relax.” In December, a Department of Defense report revealed that rape is rampant at the nation’s military academies, where 12 percent of female cadets experienced “unwanted sexual contact.” And an explosive series of federal lawsuits filed against top DOD brass on behalf of 59 service members (including Rebecca Blumer) alleges that the leadership has done nothing to stop the cycle of rape and impunity – and that by failing to condemn sexual assault, the military has created a predators’ playground. . .
It takes a long time to write and publish a book, so Garry Wills certainly could not have predicted that his newest, “Why Priests?: A Failed Tradition,” would arrive at precisely the moment in history in which many thoughtful Catholics must be asking the same question.
If you’re expecting a polemic, you might get a quiet one, but you won’t get much in the way of bombast or grandstanding. Wills is a scholar, and his opposition is rooted in a position firmly inside the church. The book is dedicated to the memory of a priest, Henri de Lubac, S.J., and it begins with a long appreciation of the priests Wills has known and loved in a professional lifetime of reading and writing about religion which itself began in a Jesuit seminary, where Wills studied for five years in hopes of becoming a priest.
This brief memoiristic opening quickly gives way to a historical account of the rise to prominence and power of the priestly class in the Roman Catholic tradition, which begins with the first generation of a priestless movement that hadn’t yet begun to call itself Christianity, and it is here that the reviewer of the audiobook edition begins to experience a special pleasure. So often the better audiobooks get their traction and build their momentum through their narrative qualities — the urgency of scene-making, the building tension of information that the listener is gaining alongside the speaker, the carefully modulated rising and falling of carefully shaped juxtapositions of events.
But when listening to “Why Priests?” in the pleasantly near-professorial cadences Michael Prichard expertly delivers, the pleasure rises from that calm authority, which so well matches Wills’s method of offering information in a steady, measured way, and then giving words to the ideas about the information that the listener has already begun to formulate for himself because the case has been prepared so intelligently. . .
Indeed, Scalia himself is not being kind to Scalia: he has turned himself into a racist clown, mocking the principles by which he is supposed to judge. He already looks close to mania and delusion—as does Bob Woodward, come to think of it. These guys can really get out of touch. Scalia first, in this Salon piece by Joan Walsh:
Four slow-moving ambulances brought up the rear as student leader John Lewis led 600 peaceful protesters dressed for church on the voting rights march that would become known as Selma’s Bloody Sunday, on March 7, 1965. They stayed peaceful; law enforcement officials didn’t. Trampled by police horses, choked by tear gas, beaten with billy clubs – Lewis had his skull fractured – the marchers would need more medical help than the four cars could provide. The ugly melee made national news that night: ABC broke into its presentation of “Judgment at Nuremberg” with footage of the violence, and viewers couldn’t be entirely sure where Nazi atrocities ended and their own country’s began.
Now, not far from Selma, Shelby County, Ala., is trying to take the teeth out of the Voting Rights Act that Lyndon B. Johnson hustled through Congress after Bloody Sunday. Even though the act was reauthorized by a Republican-dominated Congress in 2006 on a 98-0 vote in the Senate (it was 390-33 in the House), and signed by President Bush, and even though its constitutionality has been upheld by the Supreme Court four times, there is evidence that the current right-wing court majority would like to overturn at least part of it. Court conservatives once represented a reaction against the court’s supposed overreach into realms best left to Congress, and its willingness to ignore earlier court decisions. Now they seem set to say Congress has no business here, and that their Supreme Court predecessors who upheld the act were either mistaken or the blinkered creatures of their idiosyncratic eras.
Unbelievably, Antonin Scalia derided the act as a “racial entitlement,” prompting gasps from the crowd gathered to hear the arguments Wednesday. (As Rachel Maddow noted, Scalia seems to live for those gasps.) And he blamed Congress for pandering for votes by keeping that “racial entitlement” alive. The cynical Scalia sounded like Mitt Romney blaming his loss on President Obama delivering “gifts” to his coalition:
I don’t think there is anything to be gained by any Senator to vote against continuation of this act. And I am fairly confident it will be reenacted in perpetuity unless — unless a court can say it does not comport with the Constitution … They are going to lose votes if they do not reenact the Voting Rights Act. Even the name of it is wonderful: The Voting Rights Act. Who is going to vote against that in the future?
Indeed, the name of it is wonderful. With that remark, Scalia made clear (if he hadn’t already) that he’s more suited for the talk radio dial alongside Rush Limbaugh, Bill O’Reilly and Sean Hannity than he is for the Supreme Court bench.
The right-wing justice’s rant goes to the heart of long-held conservative ambivalence about democracy: the fear that corrupt politicians will be able to buy off the rabble with “spoils” or patronage or jobs, even outright gifts of cash. Only men of wealth, property and education could be trusted to rise above such rank bribery, which is why many states had property requirements and other limits on voting in the early days of our country; universal suffrage didn’t even reach all white men until 1830.
Still, Romney only railed against Obama providing “gifts” like healthcare to Latinos and contraceptives to women. Limbaugh called him “Santa Claus,” one of his nicer names for the president, for those popular new programs. A majority of Americans, O’Reilly opined during his election night self-pity party, “want stuff. And who is going to give them things? President Obama. He knows it, and he ran on it.”
But not even O’Reilly implied that the “stuff” Obama gave his voters included their constitutional right to vote.
As is his trademark, . . .
And Bob Woodward, eviscerated by Alex Pareene:
Bob Woodward rocked Washington this weekend with an editorial that hammered President Obama for inventing “the sequester” and then being rude enough to ask that Congress not make us have the sequester. Woodward went on “Morning Joe” this morning, and he continued his brutal assault:
“Can you imagine Ronald Reagan sitting there and saying ‘Oh, by the way, I can’t do this because of some budget document?’” Woodward said Wednesday on MSNBC’s “Morning Joe.”
“Or George W. Bush saying, ‘You know, I’m not going to invade Iraq because I can’t get the aircraft carriers I need’ or even Bill Clinton saying, ‘You know, I’m not going to attack Saddam Hussein’s intelligence headquarters,’ as he did when Clinton was president because of some budget document?” Woodward added. “Under the Constitution, the president is commander-in-chief and employs the force. And so we now have the president going out because of this piece of paper and this agreement, I can’t do what I need to do to protect the country. That’s a kind of madness that I haven’t seen in a long time.”
Speaking of kinds of madness, Woodward’s actual position here is insane. As Dave Weigel points out, “some budget document” is a law, passed by Congress and signed by the president. Woodward is saying, why won’t the president just ignore the law, because he is the commander in chief, and laws should not apply to him. That is a really interesting perspective, from a man who is famous for his reporting on the extralegal activities of a guy who is considered a very bad president!
Also, that George W. Bush analogy is amazing. It would have been a good thing for him to invade and occupy Iraq without congressional approval? Say what you will about George W. Bush, at least he was really, really devoted to invading Iraq. (And yes the Reagan line, lol.) . . .
That finding surprises me, though I can see how it’s possible—well, given that it’s true, it’s obviously possible. I mean I can just about see how it might work: very slight differences resulting in different expressions, like being threat-tolerant or not sets in motion a chain of outcomes and influences that result in totally different political outlooks. Gina Kolata writes in the NY Times:
The psychiatric illnesses seem very different — schizophrenia, bipolar disorder, autism, major depression and attention deficit hyperactivity disorder. Yet they share several genetic glitches that can nudge the brain along a path to mental illness, researchers report. Which disease, if any, develops is thought to depend on other genetic or environmental factors.
Their study, published online Wednesday in The Lancet, was based on an examination of genetic data from more than 60,000 people worldwide. Its authors say it is the largest genetic study yet of psychiatric disorders. The findings strengthen an emerging view of mental illness that aims to make diagnoses based on the genetic aberrations underlying diseases instead of on the disease symptoms.
Two of the aberrations discovered in the new study were in genes used in a major signaling system in the brain, giving clues to processes that might go awry and suggestions of how to treat the diseases.
“What we identified here is probably just the tip of an iceberg,” said Dr. Jordan Smoller, lead author of the paper and a professor of psychiatry at Harvard Medical School and Massachusetts General Hospital. “As these studies grow we expect to find additional genes that might overlap.”
The new study does not mean that the genetics of psychiatric disorders are simple. Researchers say there seem to be hundreds of genes involved and the gene variations discovered in the new study only confer a small risk of psychiatric disease. . .
I was thinking about a title I saw — “Will technology help humans conquer the universe or kill us all?“—and went down the train of thought of the oncoming catastrophe (climate change) that is totally due to our use of technology, and how Monsanto is using technology and the powers of a modern corporation to gain monopoly control of our food supply, and this photo, and so on. And the next step is how the problem is not technology so much as how the technology is used—the knife that kills can also cure (surgery), and all that: it’s not the technology, it’s human nature that’s the problem.
And boy! did that ring a bell: It’s exactly the same patter you hear from gun zealots – “Guns don’t kill people, people kill people” – and you have to point out yet again that without the guns (as we know from multiple examples in other countries), people do not kill people nearly as much. While correlation does not (always) imply causation, one might consider whether guns contribute to the problem rather than the solution, and perhaps access should be limited….
And I saw that the exact same thing holds for technology—I didn’t recognize it, because I’m a technology zealot, so I fell into the same thought patterns as the gun zealot: “You can’t blame X for flaws in human nature”—no, you can’t. But given that those flaws do indeed exist, you’d damn well better control or fix X so that it doesn’t fail due to “human nature”. That’s like the NATO self-aiming, self-firing tank cannon that aimed itself at a reviewing stand of generals, who proved surprisingly agile, considering their age, at leaping to their feet and clearing the platform in record time. The cannon did not, as it happens, fire, and the primary contractor explained that the dolts had washed the tank, so the electronics got wet. And of course the obvious question: Doesn’t it rain in Eastern Europe?
So just as the cannon must accommodate rain—which it will surely encounter—human use of X (product, technology, whatever) had better turn out to work well with human flaws, or we might be in for a bad time.
Technology apparently doesn’t work well with human flaws: the states of the world—looking at our environment, our economies, our political structures, our communities, our physical health, our educational achievement, our infrastructure, our quality of life—pretty clearly demonstrate the fact. And to say that technology is not responsible for this, that people are, is to repeat a tired argument in a new context.
Our primary failure has been that we did not control technology—and I am loath to say it, because I am indeed a technology zealot, and just as for the gun zealot, my mind instantly fills with all the things that could go wrong from limiting technology. Since it’s a common theme in science-fiction, plenty of examples came to mind: the folly of trying to suppress scientific knowledge and how, even if it succeeds, it horribly distorts society into those who have the technology as distinct from (and superior to) those from whom the technology is kept.
Technology does seem to have ruined much and is close to destroying us. I suppose the problem is that we have no control group: no examples of successful limitation of technology… well, no: the samurai class in Japan pretty much kept firearms at bay for a while. But then the superiority of those who had the technology to those who lacked it pushed a familiar button in human nature, and those who were behind struggled to catch up.
Technology control would be difficult, but it certainly would not be impossible. One immediately thinks of jiggering the transportation system so that everyone ends up doing a lot of walking—cf. New York City. Shape the use of technologies so that people make good life-choices as a by-product of simply doing things they want to do: e.g., going from A to B (which someone wants) will necessarily involve walking and going uphill and down (not necessarily wanted in itself, but satisfying an essential human need for exercise), and so the city layout and transportation options are designed so that such walks are always or commonly required. Similar designs could be used to ensure the ingestion of healthful foods and the avoidance of toxins (in foods, in the environment, etc.). We thus arrive at a technocratic society with a lot of restrictions, and I don’t know how viable it would be given the inevitability of power struggles.
Maybe technology itself is the poison pill: agriculture, most likely, is the thing that put us on the fatal track, and the invention of writing simply nailed shut the lid.
So the military-industrial complex will lose obscene profits. Gary Brecher wrote this for Not Safe For Work Corporation, and Alternet picked it up:
All the talk about drones focuses on their “morality.” But there’s a funny thing about morality talk: most of it seems to come down to money. This time’s no different.
The worst thing about drones is that they’re cheap. That’s interfering with the vacation-home budgets of a lot of very sleazy DoD contractors and their pet Texas congressmen, and that’s why you’re hearing a consensus around how “immoral” drones are.
Remember this: Drones are a threat to the sleaziest acquisition program in the history of defense contracting: the F-35 Joint Strike Fighter. There have been some pretty disgusting lemons in the sorry history of the DoD — you just have to think back to SDI, also known as “Star Wars,” to find a weapons system that not only didn’t work but was never meant to work — but I’d have to say that the F-35 is an even bigger con job than Star Wars.
Don’t just take it from me — serious hawks who actually know what they’re talking about when it comes to military aviation are saying this. John McCain, who crashed a few fighter jets in his time, joined Robert Gates when he was still SecDef to go public with what every Pentagon insider already knew: The F-35 is a godawful piece of boondoggle junk, and nobody wants it.
I can’t sum up the F-35 better than McCain did:
“It has been an incredible waste of the taxpayers’ dollar and it hurts the credibility of our acquisition process, our defense industry … [and] reinforces the view of some of us that the military-industrial-congressional complex that President Eisenhower warned us about is alive and well.”
So there’s the lineup: In the blue corner, everybody with any decency or sense. In the red corner, a bunch of Texas Congressmen who own stock in the companies involved. My money’s on the Texans, I’m sorry to say, because there’s just too much money to be made on the F-35 for these pigs to pass up.
I’m talking about more money than you can possibly imagine. Guess how much each F-35 is supposed to cost. (That’s not what it’s actually going to cost, which is always way more, just what they say it’ll cost.) You may think you know that fighters are expensive toys, but let’s play The Price Is Wrong — write down your guess.
The correct answer is “$200 million for the base model.” Two. Hundred. Million. For just one billion dollars, folks, you can get five of these dogs, which will do almost as well as an F-16 that cost about 8% of what we’re gonna pay for the F-35! That, folks, is what the F-35 backers consider a deal too wasteful to resist.
Now let’s move on to advanced math, with lots of extra zeroes, to figure out how much the whole program will cost. We’ll make it a story problem: “If the American people are stupid enough to pay $200 million for each barks-like-a-dog F-35, and they go through with the planned purchase of 2,443 of these flying cash dispensers, how many billions in treasury bonds will we have to sell to the Chinese just to line the pockets of some sleazy Texas congressmen and their contractor pals?” . . .
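For what it’s worth, the “story problem” in that excerpt is easy to check with back-of-the-envelope arithmetic, using only the figures quoted above (the quoted $200 million sticker price and 2,443-plane buy; actual program cost estimates have of course run far higher once development and sustainment are counted):

```python
# Back-of-the-envelope check of the quoted F-35 "story problem".
# Figures come straight from the excerpt above; nothing here is independent data.
unit_cost = 200_000_000    # quoted base-model price per F-35, in dollars
planned_buy = 2_443        # quoted planned purchase

total = unit_cost * planned_buy
print(f"Total fly-away cost: ${total / 1e9:.1f} billion")

# The excerpt also claims an F-16 costs about 8% of an F-35:
f16_cost = 0.08 * unit_cost
print(f"Implied F-16 cost: ${f16_cost / 1e6:.0f} million")
```

So the quoted numbers alone put the sticker total near half a trillion dollars—before the inevitable overruns the author warns about.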
Sugar is indeed toxic. It may not be the only problem with the Standard American Diet, but it’s fast becoming clear that it’s the major one.
A study published in the Feb. 27 issue of the journal PLoS One links increased consumption of sugar with increased rates of diabetes by examining the data on sugar availability and the rate of diabetes in 175 countries over the past decade. And after accounting for many other factors, the researchers found that increased sugar in a population’s food supply was linked to higher diabetes rates independent of rates of obesity.
In other words, according to this study, obesity doesn’t cause diabetes: sugar does.
The study demonstrates this with the same level of confidence that linked cigarettes and lung cancer in the 1960s. As Rob Lustig, one of the study’s authors and a pediatric endocrinologist at the University of California, San Francisco, said to me, “You could not enact a real-world study that would be more conclusive than this one.”
The study controlled for poverty, urbanization, aging, obesity and physical activity. It controlled for other foods and total calories. In short, it controlled for everything controllable, and it satisfied the longstanding “Bradford Hill” criteria for what’s called medical inference of causation by linking dose (the more sugar that’s available, the more occurrences of diabetes); duration (if sugar is available longer, the prevalence of diabetes increases); directionality (not only does diabetes increase with more sugar, it decreases with less sugar); and precedence (diabetics don’t start consuming more sugar; people who consume more sugar are more likely to become diabetics).
The key point in the article is this: “Each 150 kilocalories/person/day increase in total calorie availability related to a 0.1 percent rise in diabetes prevalence (not significant), whereas a 150 kilocalories/person/day rise in sugar availability (one 12-ounce can of soft drink) was associated with a 1.1 percent rise in diabetes prevalence.” Thus: for every 12 ounces of sugar-sweetened beverage introduced per person per day into a country’s food system, the rate of diabetes goes up 1 percent. (The study found no significant difference in results between those countries that rely more heavily on high-fructose corn syrup and those that rely primarily on cane sugar.) . . .
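To make that key passage concrete, here is the comparison spelled out as arithmetic, using only the figures quoted from the study (this is just the excerpt’s own numbers restated, not a reanalysis):

```python
# The study's two associations, each per 150 kcal/person/day increase
# (figures as quoted above from the PLoS One paper).
rise_any_calories = 0.1   # % rise in diabetes prevalence from any extra 150 kcal (not significant)
rise_sugar = 1.1          # % rise when the same 150 kcal comes from sugar (one 12-oz soda)

# The sugar association is roughly eleven times the size of the
# general-calorie one, which is the basis for the "it's the sugar,
# not the calories" reading of the study.
ratio = round(rise_sugar / rise_any_calories)
print(ratio)  # 11
```

That elevenfold gap between sugar calories and calories generally is the whole argument in miniature.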
In fact, when I was a young boy, the name for the disease (so far as I knew) was “sugar diabetes,” and it was commonly held that eating lots of sugar (very sugary iced tea, pastries and cakes, candy bars, and so on) would “give you diabetes; that’s why they call it sugar diabetes.” I think I probably still held to that at some level.