Archive for April 2014
A destabilized China would be very bad
But it looks as though it’s coming. Read the whole article. This is one of those things that are the result of decisions made long ago without really considering the consequences, and now the only thing to do is ride it out as best they can. It’s like climate change in that way (and climate change is going to exacerbate China’s problems—and China has an awful lot of people).
Chili garlic paste
I found I had a large bag of dried red New Mexican peppers—a pretty mild pepper, like a dried ancho, which I might have used had I not discovered the New Mexican. I recalled this recipe, and I pretty much followed it:
10-12 ancho chiles or red New Mexican dried chiles
6-8 dried chipotles
2 Tbsp olive oil
1 tsp salt
2-3 garlic cloves
soak water as needed (I used about 3 Tbsp, adding 1 Tbsp at a time)
Put in processor (my little 3.5 c KitchenAid was perfect) and process until smooth and then some more.
Extremely tasty—warmth, but no bite and no need for a drink of water.
Obviously you can dress this up—e.g., adding Parmesan cheese and/or pine nuts, using lemon juice (and the zest from the lemon) instead of the soaking water, adding ground black pepper, adding 1/2 tsp sugar, adding fresh figs, and so on.
This Cop Says It’s Time to Legalize All Drugs
Diane Goldstein in Pacific Standard:
I was a police officer for 20 years, enforcing drug laws in California and thinking I was doing my part for society. But what made me think properly about drug use for the first time was my experience with my older brother, Billy. I had watched him struggle with a lifelong problem with drugs. But I still did not understand what it meant to be Billy until my husband convinced me to open up my heart and our home to save him in 2002.
It was in this intimacy of watching Billy try, during the year he lived with us, to live up to the expectations of society and those he loved that I realized that our society’s portrayal of people with chronic drug problems was both damaging and morally flawed.
By society’s standard, my brother was a criminal. His struggles with addiction taught me many things. He had many years of sobriety, interspersed with the setbacks that addiction specialists know so often come with the condition. But because of an emphasis by the court system on abstinence-only drug programs, and an emphasis on punishment over progress, these normal and accepted setbacks in recovery were exacerbated by harsh penalties. Because of Billy’s felony convictions for drugs, he was unemployable. He lacked health care until we stepped in. Without us, my brother would have been on the streets. Yet despite our help, my brother passed away from an accidental overdose of psychotropic medications and alcohol.
After having my eyes opened to the realities of drug use, I realized we could not arrest our way out of this problem. I joined Law Enforcement Against Prohibition (LEAP), a group of law enforcement officials opposed to the war on drugs. Some people are surprised to find police, prosecutors, judges, and others arguing for legalizing drugs, but in many ways we are the best positioned to see the injustices and ineffectiveness of the criminal justice system up close.
We’ve seen how federal grants and civil asset forfeiture laws (whereby police can take your property and use or sell it for their own benefit, even if you’re never charged with a crime) encourage police to go after drug offenders while real criminals roam free. We’ve seen people die of overdose. We’ve seen people go to prison who had no business being there. And we’ve seen that none of this has reduced drug use or addiction. In spite of more than 40 years of the war on drugs—and the trillion dollars we’ve spent—Americans now have access to drugs that are cheaper, more potent, and just as readily available as when the drug war started. Who exactly is prohibition supposed to be helping?
But that doesn’t mean that everything we’ve tried has failed. As we work toward a world in which
Don Quixote notes
First, after comparing some passages in both the Burton Raffel translation and the Edith Grossman translation, I find that Raffel has superseded Grossman for me. His translation is smoother and not so clunky. The title he uses is Don Quijote.
And I want to quote a passage from the very interesting introduction by Diana de Armas Wilson. Don Quijote was first published in 1605 and proved to be very popular, with hundreds of copies crossing the Atlantic in the very year of publication.
In 1607—just as Jamestown, Virginia, was struggling to become the first permanent British settlement in the New World—a small mining town in the highlands of Peru awarded its first prize, during a festive ring joust, to impersonations of Don Quijote and Sancho.
Amazing: that anyone in the town had read the book, much less that it was well enough known for people to dress as characters in the novel—and win!
The other Christie scandal
Paul Krugman blogs in the NY Times:
What is it that makes self-proclaimed centrists such easy marks for right-wing con men? Actually, it’s not that much of a mystery: the centrist creed is that the two parties are symmetrically extremist, and this means that there must, as a matter of principle, be Serious, Honest Republicans out there — so such people must be invented if they don’t actually exist. Hence the elevation of Paul Ryan despite clear evidence of his con-artist nature.
And hence, also, the love affair with Chris Christie.
That affair ended up in a breakup over Bridgegate, but the evidence of Christie’s true nature was obvious all along. I wrote two years ago about his fiscal fakery, and in particular the way he tried to silence independent critics of his budget projections via crude, vicious personal attacks.
Now Vox tells us that the critics were in fact completely right, and that Christie’s budget projections were absolutely as unrealistic as they said.
Can we say that someone who tries to browbeat anyone daring to question rosy scenarios is someone who should never, ever be allowed near higher office? And can we also say that there’s something very wrong with pundits who failed to see the obvious about this guy?
The Rise of Corporate Impunity
Jesse Eisinger has a good article at ProPublica about the only Wall St. executive prosecuted as a result of the financial crisis. It begins:
On the evening of Jan. 27, Kareem Serageldin walked out of his Times Square apartment with his brother and an old Yale roommate and took off on the four-hour drive to Philipsburg, a small town smack in the middle of Pennsylvania. Despite once earning nearly $7 million a year as an executive at Credit Suisse, Serageldin, who is 41, had always lived fairly modestly. A previous apartment, overlooking Victoria Station in London, struck his friends as a grown-up dorm room; Serageldin lived with bachelor-pad furniture and little of it — his central piece was a night stand overflowing with economics books, prospectuses and earnings reports. In the years since, his apartments served as places where he would log five or six hours of sleep before going back to work, creating and trading complex financial instruments. One friend called him an “investment-banking monk.”
Serageldin’s life was about to become more ascetic. Two months earlier, he sat in a Lower Manhattan courtroom adjusting and readjusting his tie as he waited for a judge to deliver his prison sentence. During the worst of the financial crisis, according to prosecutors, Serageldin had approved the concealment of hundreds of millions in losses in Credit Suisse’s mortgage-backed securities portfolio. But on that November morning, the judge seemed almost torn. Serageldin lied about the value of his bank’s securities — that was a crime, of course — but other bankers behaved far worse. Serageldin’s former employer, for one, had revised its past financial statements to account for $2.7 billion that should have been reported. Lehman Brothers, AIG, Citigroup, Countrywide and many others had also admitted that they were in much worse shape than they initially allowed. Merrill Lynch, in particular, announced a loss of nearly $8 billion three weeks after claiming it was $4.5 billion. Serageldin’s conduct was, in the judge’s words, “a small piece of an overall evil climate within the bank and with many other banks.” Nevertheless, after a brief pause, he eased down his gavel and sentenced Serageldin, an Egyptian-born trader who grew up in the barren pinelands of Michigan’s Upper Peninsula, to 30 months in jail. Serageldin would begin serving his time at Moshannon Valley Correctional Center, in Philipsburg, where he would earn the distinction of being the only Wall Street executive sent to jail for his part in the financial crisis.
American financial history has generally unfolded as a series of booms followed by busts followed by crackdowns. After the crash of 1929, the Pecora Hearings seized upon public outrage, and the head of the New York Stock Exchange landed in prison. After the savings-and-loan scandals of the 1980s, 1,100 people were prosecuted, including top executives at many of the largest failed banks. In the ’90s and early aughts, when the bursting of the Nasdaq bubble revealed widespread corporate accounting scandals, top executives from WorldCom, Enron, Qwest and Tyco, among others, went to prison.
The credit crisis of 2008 dwarfed those busts, and it was only to be expected that a similar round of crackdowns would ensue. In 2009, the Obama administration appointed Lanny Breuer to lead the Justice Department’s criminal division. Breuer quickly focused on professionalizing the operation, introducing the rigor of a prestigious firm like Covington & Burling, where he had spent much of his career. He recruited elite lawyers from corporate firms, and the Breu Crew, as they would later be known, were repeatedly urged by Breuer to “take it to the next level.”
But the crackdown never happened. Over the past year, I’ve interviewed Wall Street traders, bank executives, defense lawyers and dozens of current and former prosecutors to understand why the largest man-made economic catastrophe since the Depression resulted in the jailing of a single investment banker — one who happened to be several rungs from the corporate suite at a second-tier financial institution. Many assume that the federal authorities simply lacked the guts to go after powerful Wall Street bankers, but that obscures a far more complicated dynamic. During the past decade, the Justice Department suffered a series of corporate prosecutorial fiascos, which led to critical changes in how it approached white-collar crime. The department began to focus on reaching settlements rather than seeking prison sentences, which over time unintentionally deprived its ranks of the experience needed to win trials against the most formidable law firms. By the time Serageldin committed his crime, Justice Department leadership, as well as prosecutors in integral United States attorney’s offices, were de-emphasizing complicated financial cases — even neglecting clues that suggested that Lehman executives knew more than they were letting on about their bank’s liquidity problem. In the mid-’90s, white-collar prosecutions represented an average of 17.6 percent of all federal cases. In the three years ending in 2012, the share was 9.4 percent. (Read the Department of Justice’s response to ProPublica’s inquiries.)
After the evening drive to Philipsburg, Serageldin checked into a motel. He didn’t need to report to Moshannon Valley until 2 p.m. the next day, but he was advised to show up early to get a head start on his processing. Moshannon is a low-security facility, with controlled prisoner movements, a bit tougher than the one portrayed on “Orange Is the New Black.” Friends of Serageldin’s worried about the violence; he was counseled to keep his head down and never change the channel on the TV no matter who seemed to be watching. Serageldin, who is tall and thin with a regal bearing, was largely preoccupied with how, after a decade of 18-hour trading days, he would pass the time. He was planning on doing math-problem sets and studying economics. He had delayed marrying his longtime girlfriend, a private-equity executive in London, but the plan was for her to visit him frequently.
Other bankers have spoken out about feeling unfairly maligned by the financial crisis, pegged as “banksters” by politicians and commentators. But Serageldin was contrite. “I don’t feel angry,” he told me in early winter. “I made a mistake. I take responsibility. I’m ready to pay my debt to society.” Still, the fact that the only top banker to go to jail for his role in the crisis was neither a mortgage executive (who created toxic products) nor the C.E.O. of a bank (who peddled them) is something of a paradox, but it’s one that reflects the many paradoxes that got us here in the first place.
Part of the Justice Department’s futility can be traced to . . .
The same report also appears in the NY Times Magazine.
RazorRock Stealth v. iKon Slant
First, a good prep. I really like Special 218 shaving soap from QEDusa.com. It’s also available as a shave stick. The Vie-Long horsehair brush worked up a very fine lather, and I set to work with the two razors, both carrying Personna Lab Blue blades.
The two razors are very close in comfort and efficiency. They do have a somewhat different cutting angle—the Stealth’s handle is carried closer to the face than the iKon’s—but today I noticed that I already “know” the angle best for the Stealth. The quotation marks are to indicate that this is not a conscious, cognitive knowledge, but rather the kind one acquires simply by using the razor: one’s adaptive unconscious quickly figures out what to do. (For more on this, I highly recommend Strangers to Ourselves: Discovering the Adaptive Unconscious, by Timothy Wilson.)
Really, the Stealth seems ready for market. While one might play around with handle designs, I see no real reason why this razor would not succeed well “as is.”
A BBS result with no irritation, a good splash of Very V aftershave from Saint Charles Shave, and the day begins.
The Continuing Evolution of Genes
Pretty cool how new genes can arise: a start sequence is inserted into the middle of some junk DNA, and presto! a new gene—which may be toxic, or neutral, or even helpful. Carl Zimmer has a useful explanatory article in the NY Times.
4% of those sentenced to death in the US are innocent, based on new study
Ed Pilkington has the story in The Guardian:
At least 4.1% of all defendants sentenced to death in the US in the modern era are innocent, according to the first major study to attempt to calculate how often states get it wrong in their wielding of the ultimate punishment.
A team of legal experts and statisticians from Michigan and Pennsylvania used the latest statistical techniques to produce a peer-reviewed estimate of the “dark figure” that lies behind the death penalty – how many of the more than 8,000 men and women who have been put on death row since the 1970s were falsely convicted.
The team arrived at a deliberately conservative figure that lays bare the extent of possible miscarriages of justice, suggesting that the innocence of more than 200 prisoners still in the system may never be recognised.
The study concludes that were all innocent people who were given death sentences to be cleared of their offences, the exoneration rate would rise from the actual rate of those released – 1.6% – to at least 4.1%. That is equivalent, in the time frame of the study, 1973 to 2004, to about 340 prisoners – a much larger group than the 138 who were exonerated in the same period.
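To make the arithmetic concrete, here is a quick sanity check of the figures quoted above, using the article’s rounded numbers (the study actually covers “more than 8,000” death sentences, which is why the results land slightly off the article’s 1.6% and roughly 340):

```python
# Back-of-the-envelope check of the Guardian article's figures.
# Inputs are the article's rounded numbers, not the study's raw data.
sentenced = 8000          # death sentences, 1973-2004 (approximate)
exonerated = 138          # actual exonerations in that period
estimated_rate = 0.041    # study's conservative estimate of false convictions

actual_rate = exonerated / sentenced
estimated_innocent = estimated_rate * sentenced

print(f"actual exoneration rate: {actual_rate:.1%}")    # ~1.7%
print(f"estimated innocent: {estimated_innocent:.0f}")  # ~328
```

The gap between the ~328 computed here and the article’s “about 340” reflects the rounded 8,000 denominator; the point stands either way: the study estimates more than twice as many innocent death-row prisoners as were actually exonerated.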
“This is a disturbing finding,” said Samuel Gross, a law professor at the University of Michigan law school who is the lead author of the research. “There are a large number of people who are sentenced to death, and despite our best efforts some of them have undoubtedly been executed.”
The research team deployed statistical devices to put a figure on the proportion of cases of hidden innocence. In particular, they deployed a technique known as . . .
BofA makes $4 billion accounting error
Doesn’t give you much confidence in the bank, does it? Here’s the story.
Interesting perspective on the Donald Sterling decision
Well worth reading. By Jelani Cobb in the New Yorker:
A few months ago, on the Upper West Side of Manhattan, I was rushing along the street one morning when an older white man offered me a handshake and a cordial “Good morning.” His warm greeting was followed by what he meant as a sincere compliment: “You did a wonderful job painting that stoop yesterday.” I blinked in confusion, and he recognized his mistake. “Do I have the wrong one?” he asked. I assured him that I had definitely not painted his stairs the previous day, and continued across town, somewhat bemused by the encounter.
The episode stuck with me because it so perfectly embodied a kind of vintage racism, one in which all members of a particular racial group are entirely interchangeable; to respond with anger would be entirely beside the point. For many years, I was puzzled by African-Americans who collected Sambo and Mammy figurines, plantation grotesques whose distorted features expressed the prevailing arguments for slavery and segregation. But my sidewalk encounter suggested a possible rationale for these purchases: they recalled an era when people were at least candid about their racial views.
This afternoon, Adam Silver, the N.B.A. commissioner, announced that the Los Angeles Clippers owner Donald Sterling, who was recorded telling a girlfriend that he didn’t want her to bring black people to his games, will be banned for life and fined $2.5 million, and that the league will attempt to force Sterling to sell the team. Faced with grumblings about a player strike or diminished attendance during the playoffs, the league had no choice but to act quickly.
In matters of race, Malcolm X argued, American politics were divided not between liberals and conservatives but between “wolves” and “foxes”—between . . .
How Many Have We Killed?
Many people choose ignorance rather than knowledge when the information is painful or conflicts with their treasured beliefs—or, indeed, might conflict with their treasured beliefs: the GOP and NRA, for example, are implacably opposed to any scientific studies of gun violence: they fear the knowledge might not agree with their stated positions, I assume. Fracking companies vigorously oppose publishing what they are pumping into our soil (and groundwater): if people don’t know what it is, they will not object. Some states (e.g., Oklahoma) refuse to state what chemicals they are using in their lethal injections for the death penalty—it could be simply chlorine and lye, for all we know.
And now this, reported in the NY Review of Books by David Cole:
On Monday, The New York Times reported that “the Senate has quietly stripped a provision from an intelligence bill that would have required President Obama to make public each year the number of people killed or injured in targeted killing operations in Pakistan and other countries where the United States uses lethal force.” National security officials in the Obama administration objected strongly to having to notify the public of the results and scope of their dirty work, and the Senate acceded. So much for what President Obama has called “the most transparent administration in history.”
The Senate’s decision is particularly troubling in view of how reticent the administration itself continues to be about the drone program. To date, Obama has publicly admitted to the deaths of only four people in targeted killing operations. That came in May 2013, when, in conjunction with a speech at the National Defense University, and, in his words, “to facilitate transparency and debate on the issue,” President Obama acknowledged for the first time that the United States had killed four Americans in drone strikes. But according to credible accounts, Obama has overseen the killing of several thousand people in drone strikes since taking office. Why only admit to the four Americans’ deaths? Is the issue of targeted killings only appropriate for debate when we kill our own citizens? Don’t all human beings have a right to life?
In the NDU speech, President Obama also announced new limits on the use of drones “beyond the Afghan theater.” He proclaimed that drone strikes would be authorized away from the battlefield only when necessary to respond to “continuing and imminent threats” posed by people who cannot be captured or otherwise countermanded. Most important, he said, “before any strike is taken, there must be near-certainty that no civilians will be killed or injured—the highest standard we can set.” Yet in December, a US drone strike in Yemen reportedly struck a wedding party. The New York Times reported that while some of the victims may have been linked to al-Qaeda, the strike killed “at least a half dozen innocent people, according to a number of tribal leaders and witnesses.”
The decision to drop the requirement to report on the number of people we kill in drone strikes fittingly if depressingly came on the ten-year anniversary of CBS’s airing of the photos of torture and prisoner abuse at Abu Ghraib prison in Iraq. To this day, the United States has not held accountable any senior official for torture inflicted during the “war on terror”—not at Abu Ghraib, not at Guantanamo, not at Bagram Air Force Base, and not in the CIA’s secret prisons, or “black sites.” President Obama has stuck to his commitment to look forward, not backward, and his administration has opposed all efforts to hold the perpetrators of these abuses to account. Indeed, the administration has classified even the memories of the survivors of torture in CIA black sites, now housed at Guantanamo, maintaining that they and their lawyers cannot under any circumstance even talk publicly about their mistreatment.
To be fair, Obama deserves some credit for both banning torture and achieving some transparency on the subject. . .
I’ll say it again: trying to resolve political problems by killing people doesn’t work. I would think that every strike in which clearly innocent people are killed or maimed must convert the victims’ entire extended families to an anti-American view. Some proportion will doubtless become at least terrorists of opportunity. Non-violent methods of resolution should be exhausted before war, which truly is the last resort—not the first resort.
“A hungry man is an angry man”
A friend frequently offered that quotation as she served dinner to her family. And, apparently, it’s quite true, as reported in Pacific Standard by John Upton:
Furious that your spouse forgot to put the sandwich fixings back in the fridge—for, like, the umpteenth time?
Don’t get mad. Get eating.
Diabetics understand the dangers of low blood sugar. If recent insulin doses were too high, glucose in a diabetic’s blood falls too far below the 100 milligrams-per-deciliter goal, and irritability can set in. Innocuous comments can stab like harsh insults; life’s trivialities seem deadly serious. Diabetics use this irrational frustration as a warning sign that soda, fruit juice, or a candy bar is needed—stat.
But diabetics aren’t the only ones affected: anybody can be vulnerable to low blood sugar, especially when hungry. And new research has pinned a share of the blame for domestic quarrels, which can escalate to violent abuse, on hard-to-notice blood sugar imbalances.
Self-control is not some supernatural state of mind. It takes energy to maintain, and that energy comes from burning glucose. If that glucose is in short supply, then self-control becomes limited, and it’s harder to regulate emotions and unwelcome impulses.
To test the relationship between blood sugar levels and domestic quarreling, a team of scientists turned to a combination of voodoo and modern medicine. They handed out voodoo dolls and glucometers, which are used by diabetics, to 107 couples. The couples had been married for an average of 12 years. Every evening for three weeks, each partner pricked a doll symbolizing their spouse with as many as 51 pins, with more pricks meaning they felt more marital frustration. They also measured their blood sugar levels.
Sure enough, the lower a study participant’s blood sugar, the more pins they were likely to stick in the play-sized representation of their lover.
Couples who were generally happy with their relationships used fewer pins. (And wives pricked their husbands more often than vice versa, though the relationship wasn’t statistically significant.) After playing with the data to account for such differences, the scientists found a statistically significant negative correlation between blood sugar and spouse frustration levels. The results were posted online Monday by the Proceedings of the National Academy of Sciences ahead of print publication.
The research suggests that couples could cut back on their quarreling by recognizing irrational frustrations and signs of low blood sugar—and snacking it out instead of duking it out.
Glucose meters are available cheap from . . .
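The study’s headline result is a negative correlation between evening glucose readings and pin counts. Here is a minimal sketch of that kind of analysis, using made-up data and a plain Pearson correlation rather than the paper’s actual repeated-measures model:

```python
import random
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(0)
# Simulated evening readings: lower glucose -> more pins (plus noise),
# with pin counts clipped to the study's 0-51 range.
glucose = [random.uniform(70, 140) for _ in range(100)]  # mg/dL
pins = [max(0, min(51, round((140 - g) / 3 + random.gauss(0, 4))))
        for g in glucose]

r = pearson_r(glucose, pins)
print(f"r = {r:.2f}")  # strongly negative for this simulated data
```

With data built this way the correlation comes out strongly negative, which is the shape of the published finding; the real study, of course, worked from measured glucose and actual pin counts.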
The history of what might have been, Afghanistan division
Anand Gopal writes at TomDispatch.com:
It was a typical Kabul morning. Malik Ashgar Square was already bumper-to-bumper with Corolla taxis, green police jeeps, honking minivans, and angry motorcyclists. There were boys selling phone cards and men waving wads of cash for exchange, all weaving their way around the vehicles amid exhaust fumes. At the gate of the Lycée Esteqial, one of the country’s most prestigious schools, students were kicking around a soccer ball. At the Ministry of Education, a weathered old Soviet-style building opposite the school, a line of employees spilled out onto the street. I was crossing the square, heading for the ministry, when I saw the suicide attacker.
He had Scandinavian features. Dressed in blue jeans and a white t-shirt, and carrying a large backpack, he began firing indiscriminately at the ministry. From my vantage point, about 50 meters away, I couldn’t quite see his expression, but he did not seem hurried or panicked. I took cover behind a parked taxi. It wasn’t long before the traffic police had fled and the square had emptied of vehicles.
Twenty-eight people, mostly civilians, died in attacks at the Ministry of Education, the Ministry of Justice, and elsewhere across the city that day in 2009. Afterward, U.S. authorities implicated the Haqqani Network, a shadowy outfit operating from Pakistan that had pioneered the use of multiple suicide bombers in headline-grabbing urban assaults. Unlike other Taliban groups, the Haqqanis’ approach to mayhem was worldly and sophisticated: they recruited Arabs, Pakistanis, even Europeans, and they were influenced by the latest in radical Islamist thought. Their leader, the septuagenarian warlord Jalaluddin Haqqani, was something like Osama bin Laden and Al Capone rolled into one, as fiercely ideological as he was ruthlessly pragmatic.
And so many years later, his followers are still fighting. Even with the U.S. withdrawing the bulk of its troops this year, up to 10,000 Special Operations forces, CIA paramilitaries, and their proxies will likely stay behind to battle the Haqqanis, the Taliban, and similar outfits in a war that seemingly has no end. With such entrenched enemies, the conflict today has an air of inevitability — but it could all have gone so differently.
Though it’s now difficult to imagine, by mid-2002 there was no insurgency in Afghanistan: al-Qaeda had fled the country and the Taliban had ceased to exist as a military movement. Jalaluddin Haqqani and other top Taliban figures were reaching out to the other side in an attempt to cut a deal and lay down their arms. Tens of thousands of U.S. forces, however, had arrived on Afghan soil, post-9/11, with one objective: to wage a war on terror.
As I report in my new book, No Good Men Among the Living: America, the Taliban, and the War Through Afghan Eyes, the U.S. would prosecute that war even though there was no enemy to fight. To understand how America’s battle in Afghanistan went so wrong for so long, a (hidden) history lesson is in order. In those early years after 2001, driven by the idée fixe that the world was rigidly divided into terrorist and non-terrorist camps, Washington allied with Afghan warlords and strongmen. Their enemies became ours, and through faulty intelligence, their feuds became repackaged as “counterterrorism.” The story of Jalaluddin Haqqani, who turned from America’s potential ally into its greatest foe, is the paradigmatic case of how the war on terror created the very enemies it sought to eradicate.
The Campaign to Take Out Haqqani: 2001
Jalaluddin Haqqani stands at about average height, with bushy eyebrows, an aquiline nose, a wide smile, and an expansive beard, which in its full glory swallows half his face. In his native land, the three southeastern Afghan provinces known collectively as Loya Paktia, he is something of a war hero, an anti-Soviet mujahedeen of storied bravery and near mythical endurance. (Once, after being shot, he refused painkillers because he was fasting.) During the waning years of the Cold War, he was beloved by the Americans — Texas Congressman Charlie Wilson called him “goodness personified” — and by Osama bin Laden, too. In the 1980s, the U.S. supplied him with funds and weapons in the battle against a Soviet-backed regime in Kabul and the Red Army, while radical Arab groups provided a steady stream of recruits to bolster his formidable Afghan force.
American officials had this history in mind when the second Afghan War began in October 2001. Hoping to convince Haqqani (who had backed the Taliban and al-Qaeda in the post-Soviet years) to defect, they spared his territory in Loya Paktia the intense bombing campaign that they had loosed on much of the rest of the country. The Taliban, for their part, placed him in charge of their entire military force, both sides sensing that his could be the swing vote in the war. Haqqani met with top Taliban figures and Osama bin Laden, only to decamp for Pakistan, where he took part in a flurry of meetings with Pakistanis and U.S.-backed Afghans.
His representatives also began meeting American officials in Islamabad, the Pakistani capital, and the United Arab Emirates, and the Americans eventually offered him a deal: surrender to detention, cooperate with the new Afghan military authorities, and after a suitable period, he would be free to go. For Haqqani, one of Loya Paktia’s most respected and popular figures, the prospect of sitting behind bars was unfathomable. Arsala Rahmani, an associate of his, who would go on to serve as a senator in the Afghan government, told me, “He wanted to have an important position in Loya Paktia, but they offered to arrest him. He couldn’t believe it. Can you imagine such an insult?”
Haqqani declined the American offer, but left the door open to future talks. The prevailing ethos in the U.S., though, was that you were either with us or against us. “I personally always believed that Haqqani was someone we could have worked with,” a former U.S. intelligence official told journalist Joby Warrick. “But at the time, no one was looking over the horizon, to where we might be in five years. For the policy folks, it was just ‘screw these little brown people.’”
In early November, the U.S. began bombing Loya Paktia. Two nights later, warplanes attacked Haqqani’s home in the town of Gardez, near the Pakistani border. He was not present, but his brother-in-law and a family servant died in the blast. The next evening, U.S. planes struck a religious school in the village of Mata China, one of many Haqqani had built in Afghanistan and Pakistan, which provided room, board, and education to poor children. Malem Jan, a Haqqani family friend, showed up the next morning. “I had never seen anything like it,” he said. “There were so many bodies. The roof was flattened to the ground. I saw one child who was alive under there, but no one could get him out in time.” Thirty-four people, almost all children, lost their lives. . .
The Ugly American has returned, but now he’s armed with drones firing Hellfire missiles.
The government’s crucial role in innovation
While many on the right condemn the government to the point of denying it any useful role, our government is very much a part of our lives and plays a strong role in fostering new technology. Jeff Madrick in the NY Review of Books reviews two books on the topic:
The Entrepreneurial State: Debunking Public vs. Private Sector Myths
by Mariana Mazzucato
Anthem, 237 pp., $19.95 (paper)
Doing Capitalism in the Innovation Economy: Markets, Speculation and the State
by William H. Janeway
Cambridge University Press, 329 pp., $35.99
“The great advances of civilization,” wrote Milton Friedman in Capitalism and Freedom, his influential best seller published in 1962, “whether in architecture or painting, in science or literature, in industry or agriculture, have never come from centralized government.” He did not say what he made of the state-sponsored art of Athens’s Periclean Age or the Medici family, who, as Europe’s dominant bankers but then as Florentine rulers, commissioned and financed so much Renaissance art. Or the Spanish court that gave us Velázquez. Or the many public universities that produced great scientists in our times. Or, even just before Friedman was writing, what could he have made of the Manhattan Project of the US government, which produced the atomic bomb? Or the National Institutes of Health, whose government-supported grants led to many of the most important pharmaceutical breakthroughs?
We could perhaps forgive Friedman’s ill-informed remarks as a burst of ideological enthusiasm if so many economists and business executives didn’t accept this myth as largely true. We hear time and again from those who should know better that government is a hindrance to the innovation that produces economic growth. Above all, the government should not try to pick “winners” by investing in what may be the next great companies. Many orthodox economists insist that the government should just get out of the way.
Lawrence Summers said something of the sort in a 2001 interview, shortly after the end of his tenure as Bill Clinton’s treasury secretary:
There is something about this epoch in history that really puts a premium on incentives, on decentralization, on allowing small economic energy to bubble up rather than a more top-down, more directed approach.
More recently, the respected Northwestern economist Robert Gordon reiterated the conventional view in a talk at the New School, saying that he was “extremely skeptical of government” as a source of innovation. “This is the role of individual entrepreneurs. Government had nothing to do with Bill Gates, Steve Jobs, Zuckerberg.”
Fortunately, a new book, The Entrepreneurial State, by the Sussex University economist Mariana Mazzucato, forcefully documents just how wrong these assertions are. It is one of the most incisive economic books in years. Mazzucato’s research goes well beyond the oft-told story about how the Internet was originally developed at the US Department of Defense. For example, she shows in detail that, while Steve Jobs brilliantly imagined and designed attractive new commercial products, almost all the scientific research on which the iPod, iPhone, and iPad were based was done by government-backed scientists and engineers in Europe and America. The touch-screen technology, specifically, now so common to Apple products, was based on research done at government-funded labs in Europe and the US in the 1960s and 1970s.
Similarly, Gordon called the National Institutes of Health a useful government “backstop” to the apparently far more important work done by pharmaceutical companies. But Mazzucato cites research to show that the NIH was responsible for some 75 percent of the major original breakthroughs known as new molecular entities between 1993 and 2004.
Further, Marcia Angell, former editor of The New England Journal of Medicine, found that new molecular entities that were given priority as possibly leading to significant advances in medical treatment were often if not mostly created by government. As Angell notes in her book (2004), only three of the seven high-priority drugs in 2002 came from pharmaceutical companies: the drug Zelnorm was developed by Novartis to treat irritable bowel syndrome, Gilead Sciences created Hepsera to treat hepatitis B, and Eloxatin was created by Sanofi-Synthélabo to treat colon cancer. No one can doubt the benefits of these drugs, or the expense incurred to develop them, but this is a far cry from the common claim, such as Gordon’s, that it is the private sector that does almost all the important innovation.
The rise of Silicon Valley, the high-technology center of the US based in and around Palo Alto, California, is supposedly the quintessential example of how entrepreneurial ideas succeeded without government direction. As Summers put it, new economic ideas were “born of the lessons of the experience of the success of decentralization in a place like Silicon Valley.” In fact, military contracts for research gave initial rise to the Silicon Valley firms, and national defense policy strongly influenced their development. Two researchers cited by Mazzucato found that in 2006, the last year sampled, only twenty-seven of the hundred top inventions annually listed by R&D Magazine in the 2000s were created by a single firm as opposed to government alone or a collaboration with government-funded entities. Among those recently developed by government labs were a computer program to speed up data-mining significantly and Babel, a program that translates one computer-programming language into another.
For all the acclaim now given to venture capital, Mazzucato says, private firms often invest after innovations have already come a long way under government’s much more daring basic research and patient investment of capital. The obvious case is the development of the technology for the Internet, but the process is much the same in the pharmaceutical industry. . .
Land speed record: 322 body-lengths per second!
If a human could run that fast, he or she would be traveling about 1,300 mph, or roughly Mach 1.7 at sea level.
We think of the cheetah as fast, but it covers only about 16 body-lengths per second.
Who is this speed demon? Take a look.
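The arithmetic is easy to check. A quick sketch, assuming a human "body length" of about 6 feet and Mach 1 of about 761 mph at sea level (both assumed figures, not from the article):

```python
# Scale 322 body-lengths per second up to a 6-foot human.
body_lengths_per_sec = 322
human_length_ft = 6.0          # assumed human body length

ft_per_sec = body_lengths_per_sec * human_length_ft  # 1932 ft/s
mph = ft_per_sec * 3600 / 5280                       # ft/s -> mph
mach = mph / 761                                     # Mach 1 ~ 761 mph at sea level

print(round(mph), round(mach, 1))  # 1317 1.7
```

So "about 1,300 mph" checks out, and it works out to roughly Mach 1.7.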
Bakelite slant and the Stealth
The Omega R&B brush seems to be well along in its break-in: it made a terrific lather and easily sustained it throughout the shave, with ample lather left for another three passes. The brush continues to be extraordinarily soft and pleasant on the face, presumably from the use of untrimmed bristles.
The lather from Synergy’s Gondolier shaving soap was superb, and I used both the Stealth and (on the first pass only) the Bakelite slant. They do have a similar feel, and on the whole I prefer the angle of the Bakelite slant. However, when I switched to the Stealth for the last two passes, I found its angle easily: a BBS shave was the result. One nick on the upper lip, which I didn’t even feel, but My Nik Is Sealed put paid to that.
A good splash of Paul Sebastian aftershave for the vanilla hit, and the day begins.
Meet the Doctor Who Gave $1 Million of His Own Money to Keep His Gun Research Going
Interesting ProPublica article reprinted in Pacific Standard by Lois Beckett:
Federal funding for research on gun violence has been restricted for nearly two decades. President Obama urged Congress to allocate $10 million for new research after the Newtown school shooting. But House Republicans say they won’t approve it. The Centers for Disease Control and Prevention’s budget still lists zero dollars for research on gun violence prevention.
One of the researchers who lost funding in the political battle over studying firearms was Dr. Garen Wintemute, a professor of emergency medicine who runs the Violence Prevention Research Program at the University of California-Davis. Wintemute is, by his own count, one of only a dozen researchers across the country who have continued to focus full-time on firearms violence.
To keep his research going, Wintemute has donated his own money, as the science journal Nature noted in a profile of him last year. As of the end of 2013, he has donated about $1.1 million, according to Kathryn Keyes, a fundraiser at Davis’ development office. His work has also continued to get funding from some foundations and the state of California.
We contacted Wintemute to talk about his research, the politics of studying firearms, and how much we really know about whether gun control laws work.
At the end of one of our conversations, Wintemute volunteered that he is also a donor to ProPublica, something the editorial staff had not known. (He and his family’s foundation have donated less than $1,500 over four years.)
Here is the condensed version of our conversations, edited for length and clarity.
What research were you doing when the CDC ended your funding? . . .
High-plains moochers
Paul Krugman has a very good column today:
It is, in a way, too bad that Cliven Bundy — the rancher who became a right-wing hero after refusing to pay fees for grazing his animals on federal land, and bringing in armed men to support his defiance — has turned out to be a crude racist. Why? Because his ranting has given conservatives an easy out, a way to dissociate themselves from his actions without facing up to the terrible wrong turn their movement has taken.
For at the heart of the standoff was a perversion of the concept of freedom, which for too much of the right has come to mean the freedom of the wealthy to do whatever they want, without regard to the consequences for others.
Start with the narrow issue of land use. For historical reasons, the federal government owns a lot of land in the West; some of that land is open to ranching, mining and so on. Like any landowner, the Bureau of Land Management charges fees for the use of its property. The only difference from private ownership is that by all accounts the government charges too little — that is, it doesn’t collect as much money as it could, and in many cases doesn’t even charge enough to cover the costs that these private activities impose. In effect, the government is using its ownership of land to subsidize ranchers and mining companies at taxpayers’ expense.
It’s true that some of the people profiting from implicit taxpayer subsidies manage, all the same, to convince themselves and others that they are rugged individualists. But they’re actually welfare queens of the purple sage.
And this in turn means that treating Mr. Bundy as some kind of libertarian hero is, not to put too fine a point on it, crazy. Suppose he had been grazing his cattle on land belonging to one of his neighbors, and had refused to pay for the privilege. That would clearly have been theft — and brandishing guns when someone tried to stop the theft would have turned it into armed robbery. The fact that in this case the public owns the land shouldn’t make any difference.
So what were people like Sean Hannity of Fox News, who went all in on Mr. Bundy’s behalf, thinking? Partly, no doubt, it was . . .
Double-entry bookkeeping as part of the liberal arts curriculum
Jacob Soll has an interesting piece in the NY Times:
SOMETIMES it seems as if our lives are dominated by financial crises and failed reforms. But how much do Americans even understand about finance? Few of us can do basic accounting and fewer still know what a balance sheet is. If we are going to get to the point where we can have a serious debate about financial accountability, we first need to learn some essentials.
The German economic thinker Max Weber believed that for capitalism to work, average people needed to know how to do double-entry bookkeeping. This is not simply because this type of accounting makes it possible to calculate profit and capital by balancing debits and credits in parallel columns; it is also because good books are “balanced” in a moral sense. They are the very source of accountability, a word that in fact derives its origin from the word “accounting.”
In Renaissance Italy, merchants and property owners used accounting not only for their businesses but to make a moral reckoning with God, their cities, their countries and their families. The famous Italian merchant Francesco Datini wrote “In the Name of God and Profit” in his ledger books. Merchants like Datini (and later Benjamin Franklin) kept moral account books, too, tallying their sins and good acts the way they tallied income and expenditure.
One of the less sexy and thus forgotten facts about the Italian Renaissance is that it depended highly on a population fluent in accounting. At any given time in the 1400s, 4,000 to 5,000 of Florence’s 120,000 inhabitants attended accounting schools, and there is ample archival evidence of even lowly workers keeping accounts.
This was the world in which Cosimo de’ Medici and other Italians came to dominate European banking. It was understood that all landowners and professionals would know and practice basic accounting. Cosimo de’ Medici himself did yearly audits of the books of all his bank branches; he also personally kept the accounts for his household. This was typical in a world where everyone from farmers and apothecaries to merchants — even Niccolò Machiavelli — knew double-entry accounting. It was also useful in political office in republican Florence, where government required a certain amount of transparency.
If we want to know how to make our own country and companies more accountable, we would do well to study the Dutch. In 1602, . . .
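The mechanics Soll describes are simple enough to show in a few lines: every transaction is recorded twice, as equal debits and credits in parallel columns, so the sum over all accounts is always zero and any error shows up as an imbalance. A toy sketch in Python (the account names and amounts are made up for illustration; this is not any standard accounting library):

```python
from collections import defaultdict

class Ledger:
    """Minimal double-entry ledger: balances kept in integer cents."""

    def __init__(self):
        self.accounts = defaultdict(int)

    def post(self, debits, credits):
        """Record one transaction; debits and credits must be equal."""
        if sum(debits.values()) != sum(credits.values()):
            raise ValueError("transaction does not balance")
        for acct, amt in debits.items():
            self.accounts[acct] += amt
        for acct, amt in credits.items():
            self.accounts[acct] -= amt

    def trial_balance(self):
        """Sum of all account balances; zero iff the books balance."""
        return sum(self.accounts.values())

ledger = Ledger()
# Sell goods for 500 cash: debit Cash, credit Revenue.
ledger.post(debits={"Cash": 500}, credits={"Revenue": 500})
# Pay 200 in rent: debit Rent expense, credit Cash.
ledger.post(debits={"Rent": 200}, credits={"Cash": 200})
print(ledger.trial_balance())  # 0: the books balance
```

The moral sense Soll points to lives in that invariant: a ledger that refuses to accept an unbalanced entry is one you can be held accountable to.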