Archive for September 2012
At TomDispatch Mattea Kramer has an interesting article:
Five big things will decide what this country looks like next year and in the 20 years to follow, but here’s a guarantee for you: you’re not going to hear about them in the upcoming presidential debates. Yes, there will be questions and answers focused on deficits, taxes, Medicare, the Pentagon, and education, to which you already more or less know the responses each candidate will offer. What you won’t get from either Mitt Romney or Barack Obama is a little genuine tough talk about the actual state of reality in these United States of ours. And yet, on those five subjects, a little reality would go a long way, while too little reality (as in the debates to come) is a surefire recipe for American decline.
So here’s a brief guide to what you won’t hear this Wednesday or in the other presidential and vice-presidential debates later in the month. Think of these as five hard truths that will determine the future of this country.
1. Immediate deficit reduction will wipe out any hope of economic recovery: These days, it’s fashionable for any candidate to talk about how quickly he’ll reduce the federal budget deficit, which will total around $1.2 trillion in fiscal 2012. And you’re going to hear talk about the Simpson-Bowles deficit reduction plan and more like it on Wednesday. But the hard truth of the matter is that deep deficit reduction anytime soon will be a genuine disaster. Think of it this way: If you woke up tomorrow and learned that Washington had solved the deficit crisis and you’d lost your job, would you celebrate? Of course not. And yet, any move to immediately reduce the deficit does increase the likelihood that you will lose your job.
When the government cuts spending, it lays off workers and cancels orders for all sorts of goods and services that would generate income for companies in the private sector. Those companies, in turn, lay off workers, and the negative effects ripple through the economy. This isn’t atomic science. It’s pretty basic stuff, even if it’s evidently not suitable material for a presidential debate. The nonpartisan Congressional Research Service predicted in a September report, for example, that any significant spending cuts in the near-term would contribute to an economic contraction. In other words, slashing deficits right now will send us ever deeper into the Great Recession from which, at best, we’ve scarcely emerged.
Champions of immediate deficit reduction are likely to point out that unsustainable deficits aren’t good for the economy. And that’s true — in the long run. Washington must indeed plan for smaller deficits in the future. That will, however, be a lot easier to accomplish when the economy is healthier, since government spending declines when fewer people qualify for assistance, and tax revenues expand when the jobless go back to work. So it makes sense to fix the economy first. The necessity for near-term recovery spending paired with long-term deficit reduction gets drowned out when candidates pack punchy slogans into flashes of primetime TV.
2. . .
Interesting idea: term-limited (but renewable) marriage contracts. As pointed out in the articles, contracts are already common in marriages (pre-nuptial agreements), so the idea is around. Matt Richtel recounts his findings as he explored the idea with various marriage experts:
IT makes little sense to explore a new era of family values based around Hollywood couplings. Or, worse yet, around mere rumors of the way movie stars conduct their marital affairs.
But might there be seeds of something worth considering in one such rumor, that Tom Cruise and Katie Holmes signed a five-year marriage contract?
It’s a dim data point but not an isolated one, suggesting people are rethinking marriage, at least around the edges. Prenuptial agreements, a different sort of contract, are on the rise, as is vowless cohabitation. The ages at which people marry have hit record highs, 28.7 years for men and 26.5 for women. And gay marriage has provoked widespread conversation about the institution’s meaning and place.
Last year, several lawmakers in Mexico City proposed the creation of short-term, renewable marriage contracts with terms as brief as two years. The idea was to own up to the reality that marriages fail about half the time.
Is marriage headed for an overhaul? A fundamental rethinking? Is it due for one?
When the Mexican legislators proposed their idea, which was not passed, the archdiocese there called it “absurd” and said it was anathema to the nature of marriage. I decided to put the questions to a different group: the people who study marriage and divorce. I was motivated not just by trend lines but, as a child of divorce, by ghosts.
I asked whether society should consider something like a 20-year marriage contract, my own modest proposal that, as in the one from Mexico, acknowledges the harsh truth that nearly half of marriages in the United States end in divorce and many others are miserable. The rough idea: two people, two decades, enough time to have and raise children if that’s your thing; a new status quo, a ceremony with a shelf life, till awhile do us part.
But despite having proposed it, whimsically, as a journalistic expedition, I found myself surprised and even unnerved by the extent to which some experts I spoke with say there is a need to rethink an institution that so often fails.
“We’re remarkably not innovative about marriage even though almost all the environmental conditions, writ large, have changed,” said Pepper Schwartz, a sociology professor at the University of Washington and author of books on love, sex and marriage. “We haven’t scrutinized it. We’ve been picking at it like a scab, and it’s not going to heal that way.”
The kinds of things that are changing: we’re living longer; we live apart from families and are less inclined to religion, both marriage support systems; technology makes it easier than ever to flirt or cheat and fuels instant gratification (“I will absolutely invest in this marriage after I watch this cat video”).
Over all, divorce rates . . .
Continue reading. This should be of particular interest to the Red States in the Bible Belt, which have the highest divorce rates in the country, divorce apparently being included in their righteous defense of “traditional” marriage (which, I have to point out, is totally unlike many Old Testament marriages: polygamy seemed relatively common).
And, man!, can she play! Via 30sJazz.com. The YouTube comment:
From Sensations Of 1945.
Dorothy Donegan (April 6, 1922 — May 19, 1998) was an American classically trained jazz pianist primarily known for performing in the stride piano and boogie-woogie style. She also played bop, swing jazz, and classical music. Obituaries for her argued that her flamboyant personality, tendency to mix unrelated genres in the same concert, and willingness to do lounge music may have caused her to be undervalued in jazz circles.
Donegan was born and grew up in Chicago, Illinois and began studying classical piano at age six. In her early years, she studied at the Chicago Musical College and by age eight her potential was recognized. In the 1940s she became Art Tatum’s protégée and in 1942 she made her recording debut. She appeared in Sensations of 1945 with Cab Calloway, Gene Rodgers and W. C. Fields and was known for her work in Chicago nightclubs. She began a trio in 1945, but then returned to solo work. She expressed some interest in returning to classical music after this.
Her first six albums would prove to be obscure when compared to her success at live performance. It was not until the 1980s that her work gained notice in the recorded jazz world, including an appearance at the 1987 Montreux Jazz Festival, and her live albums from 1991 perhaps gained her the most acclaim. Even at that point, she remained best known for concerts and live performances. At these she would draw crowds with her eclectic mixture of styles and her personality. She died of cancer in 1998 in Los Angeles, California.
Very interesting piece by Elisabeth Rosenthal in the NY Times:
ONE spectacular Sunday in Paris last month, I decided to skip museums and shopping to partake of something even more captivating for an environment reporter: Vélib, arguably the most successful bike-sharing program in the world. In their short lives, Europe’s bike-sharing systems have delivered myriad benefits, notably reducing traffic and its carbon emissions. A number of American cities — including New York, where a bike-sharing program is to open next year — want to replicate that success.
So I bought a day pass online for about $2, entered my login information at one of the hundreds of docking stations that are scattered every few blocks around the city and selected one of Vélib’s nearly 20,000 stodgy gray bikes, with their basic gears, upright handlebars and practical baskets.
Then I did something extraordinary, something I’ve not done in a quarter-century of regular bike riding in the United States: I rode off without a helmet.
I rode all day at a modest clip, on both sides of the Seine, in the Latin Quarter, past the Louvre and along the Champs-Élysées, feeling exhilarated, not fearful. And I had tons of bareheaded bicycling company amid the Parisian traffic. One common denominator of successful bike programs around the world — from Paris to Barcelona to Guangzhou — is that almost no one wears a helmet, and there is no pressure to do so.
In the United States the notion that bike helmets promote health and safety by preventing head injuries is taken as pretty near God’s truth. Un-helmeted cyclists are regarded as irresponsible, like people who smoke. Cities are aggressive in helmet promotion.
But many European health experts have taken a very different view: Yes, there are studies that show that if you fall off a bicycle at a certain speed and hit your head, a helmet can reduce your risk of serious head injury. But such falls off bikes are rare — exceedingly so in mature urban cycling systems.
On the other hand, many researchers say, if you force or pressure people to wear helmets, you discourage them from riding bicycles. That means more obesity, heart disease and diabetes. And — Catch-22 — a result is fewer ordinary cyclists on the road, which makes it harder to develop a safe bicycling network. The safest biking cities are places like Amsterdam and Copenhagen, where middle-aged commuters are mainstay riders and the fraction of adults in helmets is minuscule. . .
Very interesting op-ed by David Treuer in the NY Times:
JUST over a week ago, a handful of Senator Scott P. Brown’s supporters gathered in Boston to protest his opponent, Elizabeth Warren. The crowd — making Indian war whoops and tomahawk chops — was ridiculing what Mr. Brown, Republican of Massachusetts, called the “offense” of Ms. Warren’s claim that she has Cherokee and Delaware ancestry.
To mock real Indians by chanting like Hollywood Indians in order to protest someone you claim is not Indian at all gets very confusing. Even more so because early Americans spent centuries killing Indians, and then decades trying to drive any distinctive Indianness out of the ones who survived. Perhaps we’ve come a long way if Americans are now going around accusing people who don’t look or act Indian enough of appropriating that identity for personal gain. But in fact, the appropriation of Indian virtues is one of the country’s oldest traditions.
Indians — who we are and what we mean — have always been part of how America defined itself. Indians on the East Coast were largely (but never completely) deracinated, and tribes like the Delaware were either killed or relocated farther west. At the same time, their Indianness was extracted as a set of virtues: honor, stoicism, dignity, freedom. Once, in college, an African-American student shook his head when I told him that I was Indian and he said he was jealous. Why? I asked. Because you lived life on your own terms and would rather have died than become a slave. That sentiment — totally at odds with the reality in which many tribes were indeed enslaved and a few owned slaves themselves — seemed a very wistful expression of what being an Indian meant.
In any case, the mythic Indian virtues of dignity and freedom adhere less to real Indians than they do to the very nation that deposed them. Just think of how much the ultimate American, the cowboy, has in common with the Indian: a life lived beyond the law but in accordance with a higher set of laws like self-sufficiency, honor, toughness, a painful past, a fondness for whiskey and always that long, lingering look over his shoulder at a way of life quickly disappearing. Contrary to the view held by a lot of Indian people, America hasn’t forgotten us. It has always been obsessed with us and has appropriated, without recourse to reality or our own input, the qualities with which we are associated.
BEGINNING in the late 19th century, assimilation of the remaining American Indian population was official federal policy. This was around the time that the American frontier was considered closed: the West Coast had been reached and there were no more lands or peoples to conquer. And yet Indians still held on to much of our land and our identity. So at the behest of the federal government, thousands of Indian children were removed from their homes and sent to boarding schools. Indian languages and native religions were suppressed. . .
This is clearly explained in a NY Times column on the Affordable Care Act written by J.D. Kleinke of the American Enterprise Institute (a conservative think tank):
IF Mitt Romney’s pivots on President Obama’s health care reform act have accelerated to a blur — from repealing on Day 1, to preserving this or that piece, to punting the decision to the states — it is for an odd reason buried beneath two and a half years of Republican political condemnations: the architecture of the Affordable Care Act is based on conservative, not liberal, ideas about individual responsibility and the power of market forces.
This fundamental ideological paradox, drowned out by partisan shouting since before the plan’s passage in 2010, explains why Obamacare has only lukewarm support from many liberals, who wanted a real, not imagined, “government takeover of health care.” It explains why Republicans have been unable since its passage to come up with anything better. And it explains why the law is nearly identical in design to the legislation Mr. Romney passed in Massachusetts while governor.
The core drivers of the health care act are market principles formulated by conservative economists, designed to correct structural flaws in our health insurance system — principles originally embraced by Republicans as a market alternative to the Clinton plan in the early 1990s. The president’s program extends the current health care system — mostly employer-based coverage, administered by commercial health insurers, with care delivered by fee-for-service doctors and hospitals — by removing the biggest obstacles to that system’s functioning like a competitive marketplace.
Chief among these obstacles are market limitations imposed by the problematic nature of health insurance, which requires that younger, healthier people subsidize older, sicker ones. Because such participation is often expensive and always voluntary, millions have simply opted out, a risky bet emboldened by the 24/7 presence of the heavily subsidized emergency room down the street. The health care law forcibly repatriates these gamblers, along with those who cannot afford to participate in a market that ultimately cross-subsidizes their medical misfortunes anyway, when they get sick and show up in that E.R. And it outlaws discrimination against those who want to participate but cannot because of their medical histories. Put aside the considerable legislative detritus of the act, and its aim is clear: to rationalize a dysfunctional health insurance marketplace.
This explains why the health insurance industry has been quietly supporting the plan all along. It levels the playing field and expands the potential market by tens of millions of new customers.
The rationalization and extension of the current market is financed by the other linchpin of the law: the mandate that we all carry health insurance, an idea forged not by liberal social engineers at the Brookings Institution but by conservative economists at the Heritage Foundation. The individual mandate recognizes that millions of Americans who could buy health insurance choose not to, because it requires trading away today’s wants for tomorrow’s needs. The mandate is about personal responsibility — a hallmark of conservative thought.
IN the partisan war sparked by the 2008 election, Republicans conveniently forgot that this was something many of them had supported for years. The only thing wrong with the mandate? Mr. Obama also thought it was a good idea.
The same goes for health insurance exchanges, another idea formulated by conservatives and supported by Republican governors and legislators across the country for years. An exchange is as pro-market a mechanism as they come: free up buyers and sellers, standardize the products, add pricing transparency, and watch what happens. Market Economics 101.
In the shouting match over the health care law, most have somehow missed another of its obvious virtues: it enshrines accountability — yes, another conservative idea. Under today’s system, . . .
Continue reading. It’s an interesting piece, for sure. I don’t know that I agree with all of it: for example, I don’t see “accountability” as particularly Republican (or Democratic, for that matter), but rather a logical, moral, and ethical idea vigorously fought by those who would be impacted by accountability: people eager to escape responsibility for the consequences of their decisions and actions. It’s not party-specific. Still, the column is good food for thought, should Republicans read it. Too bad no comments are appended to the piece.
Very interesting essay in the NY Times by Stephanie Coontz:
SCROLL through the titles and subtitles of recent books, and you will read that women have become “The Richer Sex,” that “The Rise of Women Has Turned Men Into Boys,” and that we may even be seeing “The End of Men.” Several of the authors of these books posit that we are on the verge of a “new majority of female breadwinners,” where middle-class wives lord over their husbands while demoralized single men take refuge in perpetual adolescence.
How is it, then, that men still control the most important industries, especially technology, occupy most of the positions on the lists of the richest Americans, and continue to make more money than women who have similar skills and education? And why do women make up only 17 percent of Congress?
These books and the cultural anxiety they represent reflect, but exaggerate, a transformation in the distribution of power over the past half-century. Fifty years ago, every male American was entitled to what the sociologist R. W. Connell called a “patriarchal dividend” — a lifelong affirmative-action program for men.
The size of that dividend varied according to race and class, but all men could count on women’s being excluded from the most desirable jobs and promotions in their line of work, so the average male high school graduate earned more than the average female college graduate working the same hours. At home, the patriarchal dividend gave husbands the right to decide where the family would live and to make unilateral financial decisions. Male privilege even trumped female consent to sex, so marital rape was not a crime.
The curtailment of such male entitlements and the expansion of women’s legal and economic rights have transformed American life, but they have hardly produced a matriarchy. Indeed, in many arenas the progress of women has actually stalled over the past 15 years.
Let’s begin by determining which is “the richer sex.”
Women’s real wages have been rising for decades, while the real wages of most men have stagnated or fallen. But women’s wages started from a much lower base, artificially held down by discrimination. Despite their relative improvement, women’s average earnings are still lower than men’s and women remain more likely to be poor.
Today women make up almost 40 percent of full-time workers in management. But the median wages of female managers are just 73 percent of what male managers earn. And although women have significantly increased their representation among high earners in America over the past half-century, only 4 percent of the C.E.O.’s in Fortune’s top 1,000 companies are female.
What we are seeing is a convergence in economic fortunes, not female ascendance. Between 2010 and 2011, . .
Continue reading. Later in the essay:
ONE thing standing in the way of further progress for many men is the same obstacle that held women back for so long: overinvestment in their gender identity instead of their individual personhood. Men are now experiencing a set of limits — externally enforced as well as self-imposed — strikingly similar to the ones Betty Friedan set out to combat in 1963, when she identified a “feminine mystique” that constrained women’s self-image and options.
Although men don’t face the same discriminatory laws as women did 50 years ago, they do face an equally restrictive gender mystique.
Just as the feminine mystique discouraged women in the 1950s and 1960s from improving their education or job prospects, on the assumption that a man would always provide for them, the masculine mystique encourages men to neglect their own self-improvement on the assumption that sooner or later their “manliness” will be rewarded.
According to a 2011 poll by the Pew Research Center, 77 percent of Americans now believe that a college education is necessary for a woman to get ahead in life today, but only 68 percent think that is true for men. And just as the feminine mystique exposed girls to ridicule and harassment if they excelled at “unladylike” activities like math or sports, the masculine mystique leads to bullying and ostracism of boys who engage in “girlie” activities like studying hard and behaving well in school. One result is that men account for only 2 percent of kindergarten and preschool teachers, 3 percent of dental assistants and 9 percent of registered nurses.
The masculine mystique is institutionalized in . . .
I’ve read that among inner-city African-Americans, a devotion to study, reading, and education is derided as “acting white”: an example of counter-productive racial identity. So perhaps over-identification with any group can work against the fulfillment of one’s human potential because one tries to fit the square peg of oneself into the round hole of the group stereotype, which always is a simplification (and thus distortion) of reality.
David Ropeik explains why a belief contrary to evidence has its own sort of rational explanation:
WE make all sorts of ostensibly conscious and seemingly rational choices when we are aware of a potential risk. We eat organic food, max out on multivitamins and quickly forswear some products (even whole technologies) at the slightest hint of danger. We carry guns and vote for the candidate we think will keep us safe. Yet these choices are far from carefully considered — and, surprisingly often, they contravene reason. What’s more, while our choices about risk invariably feel right when we make them, many of these decisions end up putting us in greater peril.
Researchers in neuroscience, psychology, economics and other disciplines have made a range of discoveries about why human beings sometimes fear more than the evidence warrants, and sometimes less than the evidence warns. That science is worth reviewing at length. But one current issue offers a crash course in the most significant of these findings: the fear of vaccines, particularly vaccines for children.
In a 2011 Thomson Reuters/NPR poll, nearly one parent in three with a child under 18 was worried about vaccines, and roughly one American in four was concerned about the value and safety of vaccines in general. In the same poll, roughly one out of every five college-educated respondents worried that childhood vaccination was connected with autism; 7 percent said they feared a link with Type 1 diabetes.
Based on the evidence, these and most other concerns about vaccines are unfounded. A comprehensive report last year from the Institute of Medicine is just one of many studies to report that vaccines do not cause autism, diabetes, asthma or other major afflictions listed by the anti-vaccination movement.
Yet these fears, fierce and visceral, persist. To frustrated doctors and health officials, vaccine-phobia seems an irrational denial of the facts that puts both the unvaccinated child and the community at greater risk (as herd immunity goes down, disease spread rises). But the more we learn about how risk perception works, the more understandable — if still quite dangerous — the fear of vaccines becomes.
Along with many others, the cognitive psychologists Paul Slovic of the University of Oregon and Baruch Fischhoff of Carnegie Mellon University have identified several reasons something might feel more or less scary than mere reason might suppose. . .
Sort of blunt title, but it does describe the effect. I first read about this in Science News years ago, and indeed some toilets are now made that require more of a squatting posture, but most still seem to use chair height, which causes problems, as explained in the NPR article. This company offers a simple solution. Eliza Barclay at NPR discusses the issue:
We at Shots don’t shy away from talking about poop, as Michaeleen Doucleff demonstrated last month with her post on the Bill & Melinda Gates Foundation’s investment in fake feces.
Poop talk may strike some as juvenile, but many people in the world don’t have a safe way to do their business. And by age 50, about half of American adults have experienced hemorrhoid symptoms, according to the Mayo Clinic.
Which brings us to a discussion that’s been simmering since at least the 1960s. Is the modern toilet at least partly to blame for problems like hemorrhoids and constipation?
Some architects and doctors have posited that squatting may be the more natural position for us to do our business. That’s spawned a sort of squatting counterculture and a budding industry to go with it.
As neuroscientist Daniel Lametti wrote in Slate in 2010, squatting allows the, er, anorectal angle to straighten, so that less effort is required for evacuation. And today there are lots of squatting evangelists on the Internet who marshal scientific evidence, limited as it may be, and ample cultural evidence of the practice enduring in many parts of the world to make their case that squatting is superior.
But not everyone who might want to experiment with squatting can actually squat safely or comfortably, especially elderly folks with bowel movement issues. Enter Squatty Potty, a product launched by Robert Edwards, a 37-year-old contractor and designer in St. George, Utah. The story really starts with his mother, who was suffering from hemorrhoids and constipation and had resorted to lifting her feet onto a step stool while on the john, for some relief.
The bulky step stool took up too much space, so Edwards offered to make her one that would fit snugly around the base of the toilet when not in use.
The family experimented with different prototypes and soon realized there was a market for such a product. In the last year, Edwards says, without any advertising he has sold 10,000 Squatty Pottys (they start at $34.95) — a testament to the revival of squatting in the U.S. And he says he has become more convinced that the modern toilet is the cause of many people’s bowel issues. . .
Continue reading: not everyone agrees.
UPDATE: Since they offer a 30-day guarantee, I thought I’d try it. I was about to send it back—too damn awkward to use—when I happened to read the instructions. Oh.
It works fine. I’m keeping it.
Another very fine shave. Ingredients: Simpson Case, Prairie Creations Barbershop tallow shave stick, 1940’s Super Speed, Astra Keramik blade, Alt Innsbruck aftershave.
Even getting a little variety is refreshing.
I heard the snooze alarm going off periodically upstairs. Snooze buttons should be pressure-sensitive: the harder you hit the button, the longer the time before the next alarm. Just a thought.
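The idea could be sketched in a few lines of code. Everything here is my own invention (the function names, the 0-to-1 pressure scale, and the particular minute values — no real alarm clock exposes any of this); it just shows the harder-hit, longer-delay mapping:

```python
def snooze_minutes(press_force, base=5.0, max_extra=25.0):
    """Map how hard the snooze button was hit to a delay in minutes.

    press_force: reading from a hypothetical pressure sensor, where
    0.0 is the lightest possible tap and 1.0 is a full slam.
    A light tap gives the usual short snooze; a slam buys you
    up to base + max_extra minutes before the next alarm.
    """
    force = min(max(press_force, 0.0), 1.0)  # clamp to the sensor's range
    return base + max_extra * force

# A gentle tap: the standard 5-minute snooze.
# A full slam: a full 30 minutes of peace.
print(snooze_minutes(0.1))  # light tap
print(snooze_minutes(1.0))  # slam
```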
When we read the lawsuits involving Wall Street – firms colluding with each other, document shredding, lawyers hiding evidence, decades of deceiving the American people, strong-arm tactics, deceptive trade associations – it all has a familiar ring. It should. Wall Street is following in the footsteps of Big Tobacco.
And there’s one more common bond that should deeply trouble every American. The law firm that fronted for Big Tobacco for four decades, Covington & Burling, has its former lawyers ensconced in three of the top slots at the U.S. Justice Department. Now Covington & Burling has become Wall Street’s go-to guys for legal counsel in a growing roster of alleged crimes.
The public, and Congress, have a pressing need to question how a law firm that was cited by a U.S. District Court, an Appellate Court and the U.S. Supreme Court as playing a central role in coordinating the illegal activity of Big Tobacco – activity that callously harmed the health and welfare of both children and adults – ended up sending three of its lawyers to the top slots at the Nation’s highest law enforcement office.
Both Eric Holder, the U.S. Attorney General, and Lanny Breuer, the Assistant Attorney General for the Criminal Division, were Covington & Burling partners before they joined the Justice Department. Dan Suleiman, who also worked at Covington & Burling, became the new deputy chief of staff and counselor to Lanny Breuer on July 16 of this year. Since 2008, employees of Covington & Burling have contributed $347,951 to President Obama’s campaigns.
In 1999, the United States took on the depraved tobacco industry, suing the largest firms under the Racketeer Influenced and Corrupt Organizations Act (RICO). The government charged that the tobacco companies engaged in a four-decade conspiracy to mislead the public about the dangers of smoking, distort the dangers of secondhand smoke, lie about the addictiveness of nicotine, deceitfully market cigarettes as light or low tar while fully aware that these products were as hazardous as regular cigarettes, and unconscionably target the youth market as “replacement smokers.”
Following a nine-month bench trial, 14,000 exhibits, live testimony from 84 witnesses and written testimony from 162 witnesses, on August 17, 2006, Judge Gladys Kessler of the U.S. District Court for the District of Columbia issued a 1,683 page opinion. The Court found that “Cigarette smoking causes disease, suffering, and death. Despite internal recognition of this fact, Defendants have publicly denied, distorted, and minimized the hazards of smoking for decades.” The Court also found the following:
“Finally, a word must be said about the role of lawyers in this fifty-year history of deceiving smokers, potential smokers, and the American public about the hazards of smoking and second hand smoke, and the addictiveness of nicotine. At every stage, lawyers played an absolutely central role in the creation and perpetuation of the Enterprise and the implementation of its fraudulent schemes. They devised and coordinated both national and international strategy; they directed scientists as to what research they should and should not undertake; they vetted scientific research papers and reports as well as public relations materials to ensure that the interests of the Enterprise would be protected; they identified ‘friendly’ scientific witnesses, subsidized them with grants from the Center for Tobacco Research and the Center for Indoor Air Research, paid them enormous fees, and often hid the relationship between those witnesses and the industry; and they devised and carried out document destruction policies and took shelter behind baseless assertions of the attorney client privilege.”
The Court stated in a footnote that “Despite the apparent conflict of interest, a few law firms, particularly Covington & Burling and Shook, Hardy & Bacon, represented the shared interests of all the Defendants and coordinated a significant part of the Enterprise’s activities.”
The Court further noted in a footnote that wrongdoing on the part of lawyers for the tobacco industry appeared to be continuing into the present. The Court made the following findings specific to Covington & Burling, the law firm that has three top posts in today’s U.S. Justice Department. . .
It seems highly likely that Tom Davis has been buying wild horses simply to slaughter them. Dave Phillips reports for ProPublica:
The Bureau of Land Management faced a crisis this spring.
The agency protects and manages herds of wild horses that still roam the American West, rounding up thousands of them each year to keep populations stable.
But by March, government pens and pastures were nearly full. Efforts to find new storage space had fallen flat. So had most attempts to persuade members of the public to adopt horses. Without a way to relieve the pressure, the agency faced a gridlock that would invite lawsuits and potentially cause long-term damage to the range.
So the BLM did something it has done increasingly over the last few years. It turned to a little-known Colorado livestock hauler named Tom Davis who was willing to buy hundreds of horses at a time, sight unseen, for $10 a head.
The BLM has sold Davis at least 1,700 wild horses and burros since 2009, agency records show — 70 percent of the animals purchased through its sale program.
Like all buyers, Davis signs contracts promising that animals bought from the program will not be slaughtered and insists he finds them good homes.
But Davis is a longtime advocate of horse slaughter. By his own account, he has ducked Colorado law to move animals across state lines and will not say where they end up. He continues to buy wild horses for slaughter from Indian reservations, which are not protected by the same laws. And since 2010, he has been seeking investors for a slaughterhouse of his own.
“Hell, some of the finest meat you will ever eat is a fat yearling colt,” he said. “What is wrong with taking all those BLM horses they got all fat and shiny and setting up a kill plant?”
Animal welfare advocates fear that horses bought by Davis are being sent to the killing floor.
“The BLM says it protects wild horses,” said Laura Leigh, founder of the Nevada-based advocacy group Wild Horse Education, “but when they are selling to a guy like this you have to wonder.”
BLM officials say they carefully screen buyers and are adamant that no wild horses ever go to slaughter.
“We don’t feel compelled to sell to anybody we don’t feel good about,” agency spokesman Tom Gorey said. “We want the horses to be protected.”
Sally Spencer, who runs the wild horse sales program, said the agency has had no indication of problems with Davis and it would be unfair for the BLM to look more closely at him based on the volume of his purchases.
“It is no good to just stir up rumors,” she said. “We have never heard of him not being able to find homes. So people are innocent until proven guilty in the United States.”
Some BLM employees say privately that wild horse program officials may not want to look too closely at Davis. The agency has more wild horses than it knows what to do with, they say, and Davis has become a relief valve for a federal program plagued by conflict and cost over-runs.
“They are under a lot of pressure in Washington to make numbers,” said a BLM corral manager who did not want his name used because he feared retribution from the agency’s national office. “Maybe that is what this is about. They probably don’t want to look too careful at this guy.” . . .
This post by James Fallows is worth reading: on the rigors of campaigning and the extra demands on women in politics, above and beyond the demands placed on men.
A very nice shave: Spellbound Woods, the Frank Shaving brush, and the ARC Weber. A good breakfast. And I’m just finishing the second book in the Hangman’s Daughter series, The Dark Monk. Good mysteries set in seventeenth-century Bavaria.
Pam Martens has a good post:
U.S. Treasury Secretary Tim Geithner is now three for three in the book world: the quintessential poster boy for regulatory capture who ended up as Citigroup’s bitch. In Ron Suskind’s Confidence Men, Geithner ignores a directive from the President of the United States to wind down Citigroup. In Neil Barofsky’s Bailout, Geithner is the evil genius using the Home Affordable Modification Program (HAMP) to “foam the runways” for the banks, slowing down the foreclosure stream so the banks could stay afloat, with no genuine goal to help struggling families stay in their homes.
Now Sheila Bair, the ultimate insider as former head of the FDIC during the crisis, has completed the microscopic job on Geithner in Bull by the Horns. The image that emerges is a two-headed monster: a regulator functioning as a Citigroup messenger boy and an insanely mismanaged bank that was somehow able to shield from public scrutiny that it had a measly $125 billion in U.S. insured deposits while turning government on its head and raking in over $2.5 trillion in taxpayer capital, guarantees and loans.
When I came to the part about the $125 billion in insured deposits, I thought my Kindle had malfunctioned. What! It was well publicized that Citigroup had over $2 trillion in assets; how could it have only $125 billion in U.S. insured deposits?
Sheila Bair may not have realized it, but she was filling in the missing piece of a puzzle that has captivated much of Wall Street since 2008: why was every regulator jumping through hoops to save Citigroup, a serial predator that constantly promised to change but never did?
As it turns out, the bulk of Citigroup’s deposits were foreign, and many of those deposits were uninsured or insured at low amounts. Had this foreign money decided to run for the exits on fear of a Citigroup collapse, the FDIC might have been looking at just a $125 billion problem, but the rest of the financial system was looking at $2 trillion on the books, $1 trillion off the books, and God knows what kind of counterparty agreements in the closets.
Bair indicates her belief that Citigroup’s two main regulators, John Dugan (a former bank lobbyist) at the Office of the Comptroller of the Currency (OCC) and Tim Geithner, then President of the Federal Reserve Bank of New York, were not being forthright on Citigroup’s real condition. Bair explains Citigroup’s situation in 2008 as follows: . . .
In other bank news, Bank of America is paying $2.43 billion to settle litigation over how it deceived investors in its acquisition of Merrill Lynch. As is usual with businesses, the company simply writes a check and none of the decision-makers, those actually guilty, goes to prison: a fine that affects only a single calendar quarter, and the malefactors walk away scot-free. There’s something very wrong with that picture.
Mickey Edwards, one-time US Representative from Oklahoma, has a good column in the NY Times today:
Frustrated over the inability of political leaders to find common ground on even the most pressing national issues, Americans have developed a long list of people or political practices to blame for the fact that government doesn’t seem to work anymore. But the real problem is something that’s not high on most such lists, something that’s far more crucial.
We’re electing the wrong people, they complain. There are no leaders any more. There’s too much money in politics. Too many corporations, labor unions, special interests and billionaires. Too many right-wingers, or left-wingers, in Congress, on television, on the Internet — and they’re all zealous and nasty. Too many Americans only talk to people who already agree with them. And so forth. Every observer has his or her own pet reason for the failure of the federal government to function.
If any attempt is made to assess the problem as a whole, each side complains about “false equivalence” and doubles down on blaming the opposition. It’s not that the villains they’ve identified don’t share in the blame, because they’ve all played a part in the unraveling of government. The problem, however, is much deeper than any of these individual elements: it’s the political system itself that is at fault. The problems with governance will never be solved until we turn that system upside down and start over.
While the United States is actually a Republic, with the attendant constitutional constraints on the powers of the majority, its political system is also based on a fundamental underlying democratic principle: that the people themselves will choose their leaders and thus indirectly determine the policies of their government. Because the federal government’s most important powers – to declare war, to establish tax policies, to create programs, to decide how much to spend on them, to approve treaties, to make the final decisions about who will head federal agencies or sit on the Supreme Court — are all Congressional powers, it is only by being able to select members of the Senate and the House of Representatives that the people are able to manage the levers of government.
Yet despite the repeated and urgent warnings of the Republic’s founders, we have created a system that seriously undermines that democratic principle and gives us instead a government that is unable to deal with even the most urgent problems because the people have been shoved aside in the pursuit of partisan advantage. In some ways our system has come to resemble those multi-party parliamentary systems in which the tail (relatively small groups of hard-liners) is able to wag the dog. What Washington, Adams, Jefferson and Madison all agreed on was the danger of creating political parties like the ones we have today, permanent factions that are engaged in a constant battle for advantage even if that means skewing election results, keeping candidates off the ballot, denying voters the right to true representation and “fixing” the outcome of legislative deliberations.
Let’s begin with the election process itself. . .