Later On

A blog written for those whose interests more or less match mine.

Archive for January 2019

The Voter Suppression State

Mimi Swartz, editor of the Texas Monthly, writes in the NY Times:

For those of you keeping track of the “As Texas goes, so goes the nation” notion, I have either very good or very bad news.

The state that gave you two recent mediocre-to-crummy Republican presidents (who are starting to look downright Lincolnesque compared to you-know-who), gerrymandering in the guise of redistricting (thanks a lot, Tom DeLay) and a profound if misguided antipathy to government in general is now surging ahead in a new field: voter suppression. As someone who loves Texas with a triple shot of ambivalence, I take no pleasure in spreading this news. But if it is your goal to keep people of color from the polls — some Republican leaders come to mind — it’s time once again to look to Texas for guidance.

Our state officials in their infinite wisdom last week announced that they hoped to excise 95,000 people from voter rolls because they didn’t seem to be citizens. Our secretary of state, David Whitley, insisted that, with the help of the Department of Public Safety, he had been able to compile a list of those supposedly illegally registered. It was even suggested that 58,000 of those folks had actually already voted, a felony in these parts. This finding was heralded in a tweet by our attorney general, Ken Paxton, as an all-caps “Voter Fraud Alert.” Paxton, you may or may not know, is himself under indictment for securities fraud.

The state, which as yet cannot take anyone off the voter rolls, turned to county officials, who can. They are supposed to hunt those miscreants down by sending notices demanding they appear at voter registrars’ offices with proof of citizenship (birth certificate, passport, etc.) within 30 days. Otherwise, they would be stricken from the rolls and, presumably, ICE would be pounding on their doors soon after.

Among many who seized on this appalling narrative was President Trump, who tweeted: “These numbers are just the tip of the iceberg. All over the country, especially in California, voter fraud is rampant. Must be stopped!”

Well, yes, someone had to be stopped here in Texas, and the narrative was appalling, but not for the stated reasons. Within 24 hours, various groups devoted to voting rights had put on their thinking caps — they don’t give them out at the Statehouse — and were noting a few problems with the list.

Like, some of this “research” was 25 years old, during which time a lot of people holding driver’s licenses could have become naturalized citizens who, at least so far, are allowed to vote in Texas. In other words, state leaders were not experts in data compilation, a finding that should surprise no one. As our former governor and the current secretary of energy Rick Perry would say, “Whoops.”

Within a few days, Harris County (which includes Houston) had found that 60 percent of the 30,000 people on the DPS’s list should never have been there in the first place, because they had become citizens in the last quarter-century or so. The League of United Latin American Citizens also filed suit against Mr. Whitley and Mr. Paxton, claiming a violation of the Voting Rights Act, and declared the whole mess a “witch hunt” intended to scare Latinos away from the ballot box.

Ignorance or venality? Hard to say. Stupidity is always a good bet, but Texans are already trying to exercise their civic duty with one of the nation’s strictest voter identification laws in effect — regular people already need to show a government-issued ID to vote here. Then, too, the convoluted rules for running third-party voter registration drives here would send Rube Goldberg to bed with a blinding migraine.

There is one simple fact fomenting all this hysteria, of course: According to census estimates, the state’s Hispanic population grew to 11.2 million in 2017, from 9.7 million in 2010. The population of white Texans grew by only about half a million people, to 11.9 million, during the same period. By 2022, the state is guesstimated to be majority Latino. (By 2050 our booming population — with all our Latinos — is supposed to surpass California’s.) This may or may not mean that Texas will turn blue around the same time, though the anti-immigrant/build the wall bias of state and national leaders who know better might be helping that process along. On the other hand, maybe our leadership plans to just deport them all.

Those numbers could certainly explain the weirdness of the last few days. A weak, diminished Republican leadership is more terrifying to our leaders in Austin and their far-right backers than the return of Barack Obama.

But I’m not sure Texas Democrats are exploiting this opportunity to the fullest . . .

Continue reading.

Written by Leisureguy

31 January 2019 at 6:41 pm

What It’s Like To See 100 Million Colors

From New York magazine:

Concetta Antico is an artist with a superhuman power: she can see up to 100 million colors, a hundred times more than the average human. She is a tetrachromat: a person with four (instead of three) types of cone photoreceptors in their retinas, thus possessing hyper-charged color perception.

Written by Leisureguy

31 January 2019 at 4:23 pm

Posted in Daily life

Is Sugar Toxic?

This article by Gary Taubes appeared in the NY Times on April 13, 2011—but it’s still of interest:

On May 26, 2009, Robert Lustig gave a lecture called “Sugar: The Bitter Truth,” which was posted on YouTube the following July. Since then, it has been viewed well over 800,000 times, gaining new viewers at a rate of about 50,000 per month, fairly remarkable numbers for a 90-minute discussion of the nuances of fructose biochemistry and human physiology.

Lustig is a specialist on pediatric hormone disorders and the leading expert in childhood obesity at the University of California, San Francisco, School of Medicine, which is one of the best medical schools in the country. He published his first paper on childhood obesity a dozen years ago, and he has been treating patients and doing research on the disorder ever since.

The viral success of his lecture, though, has little to do with Lustig’s impressive credentials and far more with the persuasive case he makes that sugar is a “toxin” or a “poison,” terms he uses together 13 times through the course of the lecture, in addition to the five references to sugar as merely “evil.” And by “sugar,” Lustig means not only the white granulated stuff that we put in coffee and sprinkle on cereal — technically known as sucrose — but also high-fructose corn syrup, which has already become without Lustig’s help what he calls “the most demonized additive known to man.”

It doesn’t hurt Lustig’s cause that he is a compelling public speaker. His critics argue that what makes him compelling is his practice of taking suggestive evidence and insisting that it’s incontrovertible. Lustig certainly doesn’t dabble in shades of gray. Sugar is not just an empty calorie, he says; its effect on us is much more insidious. “It’s not about the calories,” he says. “It has nothing to do with the calories. It’s a poison by itself.”

If Lustig is right, then our excessive consumption of sugar is the primary reason that the numbers of obese and diabetic Americans have skyrocketed in the past 30 years. But his argument implies more than that. If Lustig is right, it would mean that sugar is also the likely dietary cause of several other chronic ailments widely considered to be diseases of Western lifestyles — heart disease, hypertension and many common cancers among them.

The number of viewers Lustig has attracted suggests that people are paying attention to his argument. When I set out to interview public health authorities and researchers for this article, they would often initiate the interview with some variation of the comment “surely you’ve spoken to Robert Lustig,” not because Lustig has done any of the key research on sugar himself, which he hasn’t, but because he’s willing to insist publicly and unambiguously, when most researchers are not, that sugar is a toxic substance that people abuse. In Lustig’s view, sugar should be thought of, like cigarettes and alcohol, as something that’s killing us.

This brings us to the salient question: Can sugar possibly be as bad as Lustig says it is?

It’s one thing to suggest, as most nutritionists will, that a healthful diet includes more fruits and vegetables, and maybe less fat, red meat and salt, or less of everything. It’s entirely different to claim that one particularly cherished aspect of our diet might not just be an unhealthful indulgence but actually be toxic, that when you bake your children a birthday cake or give them lemonade on a hot summer day, you may be doing them more harm than good, despite all the love that goes with it. Suggesting that sugar might kill us is what zealots do. But Lustig, who has genuine expertise, has accumulated and synthesized a mass of evidence, which he finds compelling enough to convict sugar. His critics consider that evidence insufficient, but there’s no way to know who might be right, or what must be done to find out, without discussing it.

If I didn’t buy this argument myself, I wouldn’t be writing about it here. And I also have a disclaimer to acknowledge. I’ve spent much of the last decade doing journalistic research on diet and chronic disease — some of the more contrarian findings, on dietary fat, appeared in this magazine — and I have come to conclusions similar to Lustig’s.

The history of the debate over the health effects of sugar has gone on far longer than you might imagine. It is littered with erroneous statements and conclusions because even the supposed authorities had no true understanding of what they were talking about. They didn’t know, quite literally, what they meant by the word “sugar” and therefore what the implications were.

So let’s start by clarifying a few issues, beginning with Lustig’s use of the word “sugar” to mean both sucrose — beet and cane sugar, whether white or brown — and high-fructose corn syrup. This is a critical point, particularly because high-fructose corn syrup has indeed become “the flashpoint for everybody’s distrust of processed foods,” says Marion Nestle, a New York University nutritionist and the author of “Food Politics.”

This development is recent and borders on humorous. In the early 1980s, high-fructose corn syrup replaced sugar in sodas and other products in part because refined sugar then had the reputation as a generally noxious nutrient. (“Villain in Disguise?” asked a headline in this paper in 1977, before answering in the affirmative.) High-fructose corn syrup was portrayed by the food industry as a healthful alternative, and that’s how the public perceived it. It was also cheaper than sugar, which didn’t hurt its commercial prospects. Now the tide is rolling the other way, and refined sugar is making a commercial comeback as the supposedly healthful alternative to this noxious corn-syrup stuff. “Industry after industry is replacing their product with sucrose and advertising it as such — ‘No High-Fructose Corn Syrup,’ ” Nestle notes.

But marketing aside, the two sweeteners are effectively identical in their biological effects. “High-fructose corn syrup, sugar — no difference,” is how Lustig put it in a lecture that I attended in San Francisco last December. “The point is they’re each bad — equally bad, equally poisonous.”

Refined sugar (that is, sucrose) is made up of a molecule of the carbohydrate glucose, bonded to a molecule of the carbohydrate fructose — a 50-50 mixture of the two. The fructose, which is almost twice as sweet as glucose, is what distinguishes sugar from other carbohydrate-rich foods like bread or potatoes that break down upon digestion to glucose alone. The more fructose in a substance, the sweeter it will be. High-fructose corn syrup, as it is most commonly consumed, is 55 percent fructose, and the remaining 45 percent is nearly all glucose. It was first marketed in the late 1970s and was created to be indistinguishable from refined sugar when used in soft drinks. Because each of these sugars ends up as glucose and fructose in our guts, our bodies react the same way to both, and the physiological effects are identical. In a 2010 review of the relevant science, Luc Tappy, a researcher at the University of Lausanne in Switzerland who is considered by biochemists who study fructose to be the world’s foremost authority on the subject, said there was “not the single hint” that H.F.C.S. was more deleterious than other sources of sugar.

The question, then, isn’t whether high-fructose corn syrup is worse than sugar; it’s what do they do to us, and how do they do it? The conventional wisdom has long been that the worst that can be said about sugars of any kind is that they cause tooth decay and represent “empty calories” that we eat in excess because they taste so good.

By this logic, sugar-sweetened beverages (or H.F.C.S.-sweetened beverages, as the Sugar Association prefers they be called) are bad for us not because there’s anything particularly toxic about the sugar they contain but just because people consume too many of them.

Those organizations that now advise us to cut down on our sugar consumption — the Department of Agriculture, for instance, in its recent Dietary Guidelines for Americans, or the American Heart Association in guidelines released in September 2009 (of which Lustig was a co-author) — do so for this reason. Refined sugar and H.F.C.S. don’t come with any protein, vitamins, minerals, antioxidants or fiber, and so they either displace other more nutritious elements of our diet or are eaten over and above what we need to sustain our weight, and this is why we get fatter.

Whether or not the empty-calories argument is true, it’s certainly convenient. It allows everyone to assign blame for obesity and, by extension, diabetes — two conditions so intimately linked that some authorities have taken to calling them “diabesity” — to overeating of all foods, or underexercising, because a calorie is a calorie. “This isn’t about demonizing any industry,” as Michelle Obama said about her Let’s Move program to combat the epidemic of childhood obesity. Instead it’s about getting us — or our children — to move more and eat less, reduce our portion sizes, cut back on snacks.

Lustig’s argument, however, is not about the consumption of empty calories — and biochemists have made the same case previously, though not so publicly. It is that sugar has unique characteristics, specifically in the way the human body metabolizes the fructose in it, that may make it singularly harmful, at least if consumed in sufficient quantities.

The phrase Lustig uses when he describes this concept is “isocaloric but not isometabolic.” This means we can eat 100 calories of glucose (from a potato or bread or other starch) or 100 calories of sugar (half glucose and half fructose), and they will be metabolized differently and have a different effect on the body. The calories are the same, but the metabolic consequences are quite different.

The fructose component of sugar and H.F.C.S. is metabolized primarily by the liver, while the glucose from sugar and starches is metabolized by every cell in the body. Consuming sugar (fructose and glucose) means more work for the liver than if you consumed the same number of calories of starch (glucose). And if you take that sugar in liquid form — soda or fruit juices — the fructose and glucose will hit the liver more quickly than if you consume them, say, in an apple (or several apples, to get what researchers would call the equivalent dose of sugar). The speed with which the liver has to do its work will also affect how it metabolizes the fructose and glucose.

In animals, or at least in laboratory rats and mice, it’s clear that if the fructose hits the liver in sufficient quantity and with sufficient speed, the liver will convert much of it to fat. This apparently induces a condition known as insulin resistance, which is now considered the fundamental problem in obesity, and the underlying defect in heart disease and in the type of diabetes, type 2, that is common to obese and overweight individuals. It might also be the underlying defect in many cancers.

If what happens in laboratory rodents also happens in humans, and if we are eating enough sugar to make it happen, then we are in trouble.

The last time an agency of the federal government looked into the question of sugar and health in any detail was in 2005, in a report by the Institute of Medicine, a branch of the National Academies. The authors of the report acknowledged that plenty of evidence suggested that sugar could increase the risk of heart disease and diabetes — even raising LDL cholesterol, known as the “bad cholesterol” — but did not consider the research to be definitive. There was enough ambiguity, they concluded, that they couldn’t even set an upper limit on how much sugar constitutes too much. Referring back to the 2005 report, an Institute of Medicine report released last fall reiterated, “There is a lack of scientific agreement about the amount of sugars that can be consumed in a healthy diet.” This was the same conclusion that the Food and Drug Administration came to when it last assessed the sugar question, back in 1986. The F.D.A. report was perceived as an exoneration of sugar, and that perception influenced the treatment of sugar in the landmark reports on diet and health that came after.

The Sugar Association and the Corn Refiners Association have also portrayed the 1986 F.D.A. report as clearing sugar of nutritional crimes, but what it concluded was actually something else entirely. To be precise, the F.D.A. reviewers said that other than its contribution to calories, “no conclusive evidence on sugars demonstrates a hazard to the general public when sugars are consumed at the levels that are now current.” This is another way of saying that the evidence by no means refuted the kinds of claims that Lustig is making now and other researchers were making then, just that it wasn’t definitive or unambiguous.

What we have to keep in mind, says Walter Glinsmann, the F.D.A. administrator who was the primary author on the 1986 report and who now is an adviser to the Corn Refiners Association, is that sugar and high-fructose corn syrup might be toxic, as Lustig argues, but so might any substance if it’s consumed in ways or in quantities that are unnatural for humans. The question is always at what dose does a substance go from being harmless to harmful? How much do we have to consume before this happens?

When Glinsmann and his F.D.A. co-authors decided no conclusive evidence demonstrated harm at the levels of sugar then being consumed, they estimated those levels at 40 pounds per person per year beyond what we might get naturally in fruits and vegetables — 40 pounds per person per year of “added sugars” as nutritionists now call them. This is 200 calories per day of sugar, which is less than the amount in a can and a half of Coca-Cola or two cups of apple juice. If that’s indeed all we consume, most nutritionists today would be delighted, including Lustig.
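
[A quick back-of-the-envelope check of that arithmetic, not part of Taubes’s article: the little Python sketch below assumes the standard conversion factors of roughly 454 grams per pound and 4 calories per gram of sugar. – LG]

# Rough check: 40 lb/year of added sugars expressed as grams and calories per day.
# The conversion factors below are standard values, not figures taken from the article.
GRAMS_PER_POUND = 453.6
KCAL_PER_GRAM_SUGAR = 4

pounds_per_year = 40
grams_per_day = pounds_per_year * GRAMS_PER_POUND / 365
calories_per_day = grams_per_day * KCAL_PER_GRAM_SUGAR
print(round(grams_per_day), round(calories_per_day))  # about 50 g/day and 199 kcal/day, i.e. the 200 calories cited above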

But 40 pounds per year happened to be 35 pounds less than what Department of Agriculture analysts said we were consuming at the time — 75 pounds per person per year — and the U.S.D.A. estimates are typically considered to be the most reliable. By the early 2000s, according to the U.S.D.A., we had increased our consumption to more than 90 pounds per person per year.

That this increase happened to coincide with the current epidemics of obesity and diabetes is one reason that it’s tempting to blame sugars — sucrose and high-fructose corn syrup — for the problem. In 1980, roughly one in seven Americans was obese, and almost six million were diabetic, and the obesity rates, at least, hadn’t changed significantly in the 20 years previously. By the early 2000s, when sugar consumption peaked, one in every three Americans was obese, and 14 million were diabetic.

This correlation between sugar consumption and diabetes is what defense attorneys call circumstantial evidence. It’s more compelling than it otherwise might be, though, because the last time sugar consumption jumped markedly in this country, it was also associated with a diabetes epidemic.

In the early 20th century, many of the leading authorities on diabetes in North America and Europe (including Frederick Banting, who shared the 1923 Nobel Prize for the discovery of insulin) suspected that sugar causes diabetes based on the observation that the disease was rare in populations that didn’t consume refined sugar and widespread in those that did. In 1924, Haven Emerson, director of the institute of public health at Columbia University, reported that diabetes deaths in New York City had increased as much as 15-fold since the Civil War years, and that deaths increased as much as fourfold in some U.S. cities between 1900 and 1920 alone. This coincided, he noted, with an equally significant increase in sugar consumption — almost doubling from 1890 to the early 1920s — with the birth and subsequent growth of the candy and soft-drink industries.

Emerson’s argument was countered by Elliott Joslin, a leading authority on diabetes, and Joslin won out. But his argument was fundamentally flawed. Simply put, it went like this: The Japanese eat lots of rice, and Japanese diabetics are few and far between; rice is mostly carbohydrate, which suggests that sugar, also a carbohydrate, does not cause diabetes. But sugar and rice are not identical merely because they’re both carbohydrates. Joslin could not know at the time that the fructose content of sugar affects how we metabolize it.

Joslin was also unaware that the Japanese ate little sugar. . .

Continue reading. There’s much more.

Later in the article:

. . . Until Lustig came along, the last time an academic forcefully put forward the sugar-as-toxin thesis was in the 1970s, when John Yudkin, a leading authority on nutrition in the United Kingdom, published a polemic on sugar called “Sweet and Dangerous.” Through the 1960s Yudkin did a series of experiments feeding sugar and starch to rodents, chickens, rabbits, pigs and college students. He found that the sugar invariably raised blood levels of triglycerides (a technical term for fat), which was then, as now, considered a risk factor for heart disease. Sugar also raised insulin levels in Yudkin’s experiments, which linked sugar directly to type 2 diabetes. Few in the medical community took Yudkin’s ideas seriously, largely because he was also arguing that dietary fat and saturated fat were harmless. This set Yudkin’s sugar hypothesis directly against the growing acceptance of the idea, prominent to this day, that dietary fat was the cause of heart disease, a notion championed by the University of Minnesota nutritionist Ancel Keys.

A common assumption at the time was that if one hypothesis was right, then the other was most likely wrong. Either fat caused heart disease by raising cholesterol, or sugar did by raising triglycerides. “The theory that diets high in sugar are an important cause of atherosclerosis and heart disease does not have wide support among experts in the field, who say that fats and cholesterol are the more likely culprits,” as Jane E. Brody wrote in The Times in 1977.

At the time, many of the key observations cited to argue that dietary fat caused heart disease actually supported the sugar theory as well. During the Korean War, pathologists doing autopsies on American soldiers killed in battle noticed that many had significant plaques in their arteries, even those who were still teenagers, while the Koreans killed in battle did not. The atherosclerotic plaques in the Americans were attributed to the fact that they ate high-fat diets and the Koreans ate low-fat diets. But the Americans were also eating high-sugar diets, while the Koreans, like the Japanese, were not. . .

Written by Leisureguy

31 January 2019 at 4:07 pm

The Trump-Russia Investigation and the Mafia State

Masha Gessen writes in the New Yorker:

It’s been a strange two and a half years. From the first allegations, in July, 2016, of Russian meddling in the U.S. election campaign to the arrest of President Donald Trump’s former adviser Roger Stone last week, many of us who write about Russia professionally, or who are Russian, have struggled to square what we know with the emerging narrative. In this story, Russia waged a sophisticated and audacious operation to subvert American elections and install a President of its choice—it pulled off a coup. Tell that to your average American liberal, and you’ll get a nod of recognition. Tell it to your average Russian liberal (admittedly a much smaller category), and you’ll get uproarious laughter. Russians know that their state lacks the competence to mount a sophisticated sabotage effort, that the Kremlin was even more surprised by Trump’s election than was the candidate himself, and that Russian-American relations are at their most dysfunctional since the height of the Cold War. And yet the indictments keep coming.

Reader, I think I’ve finally figured it out. I don’t mean that I’ve figured out whether Russians influenced the outcome of the American election—I doubt even the Robert Mueller investigation will be able to answer that question. I mean that I’ve figured out how to think about what we know and not go crazy. The answer lies in the concept of the Mafia state. (And, no, I’m not invoking the Mob because Stone encouraged an associate to behave like a character from “The Godfather Part II,” as detailed in his indictment.)

As journalists who usually cover American politics have connected the dots of the story of Russian interference, those of us who normally write about Russia have cringed. Early on, it was common to point out that Paul Manafort, Trump’s former campaign manager, who is now under arrest, worked for Viktor Yanukovych, who is often characterized as the “pro-Russian President of Ukraine.” In fact, there was no love lost between Putin and Yanukovych. After he was run out of town, during the 2014 Ukrainian revolution, Yanukovych did seek refuge in Russia, but during his tenure as President he was an unreliable partner for Putin at best. Perhaps more to the point, he’s a crook and a brute. He served time for robbery and assault before he became a politician, and he is wanted in Ukraine for treason, mass murder, and embezzlement. A visitor to Ukraine can take a tour of Yanukovych’s palace, famous for its marble, crystal, immense scale, and a life-size solid-gold sculpture of a loaf of bread. Manafort made a career of working for the corrupt and the crooked. That in itself tells us little about Russia or its role in the 2016 campaign.

We cringed at headlines that claimed to have established a connection between the Kremlin and Natalia Veselnitskaya, the lawyer who was at the Trump Tower meeting with Donald Trump, Jr., Jared Kushner, and Manafort. Veselnitskaya represents a Russian company, Prevezon Holdings Ltd., which was investigated in New York for money laundering, and Veselnitskaya has been charged with lying to prosecutors about her working relationship with the Russian prosecutor general’s office. Federal prosecutors in the Southern District of New York claim that Veselnitskaya collaborated with a lawyer in the Russian prosecutor’s office to draft exculpatory documents for Prevezon. In media coverage, her e-mailing with a lawyer in the Russian prosecutor’s office was portrayed as evidence of a direct line to Putin, suggesting that she met with Trump’s campaign officials as his emissary. To me, it read as a lot of bluster on the part of a minor operator. From all the available evidence, and contrary to her sales pitch, Veselnitskaya did not have any dirt to offer on Hillary Clinton. To the extent that Veselnitskaya had established connections to high-level Russian officials, they were the kind that are necessary for a lawyer to be at all effective in a corrupt system.

We cringed at the characterization of the Russian online influence campaign as “sophisticated” and “vast”: Russian reporting on the matter—the best available—convincingly portrayed the troll operation as small-time and ridiculous. It was, it seems, fraudulent in every way imaginable: it perpetrated fraud on American social networks, creating fake accounts and events and spreading falsehoods, but it was also fraudulent in its relationship to whoever was funding it, because surely crudely designed pictures depicting Hillary Clinton as Satan could not deliver anyone’s money’s worth.

What we are observing is not most accurately described as the subversion of American democracy by a hostile power. Instead, it is an attempt at state capture by an international crime syndicate. What unites Yanukovych, Veselnitskaya, Manafort, Stone, WikiLeaks’s Julian Assange, the Russian troll factory, the Trump campaign staffer George Papadopoulos and his partners in crime, the “Professor” (whose academic credentials are in doubt), and the “Female Russian National” (who appears to have fraudulently presented herself as Putin’s niece) is that they are all crooks and frauds. This is not a moral assessment, or an attempt to downplay their importance. It is an attempt to stop talking in terms of states and geopolitics and begin looking at Mafias and profits.

The Hungarian sociologist Bálint Magyar, who created the concept of the “post-Communist mafia state,” has just finished editing a new collection of articles called “Stubborn Structures: Reconceptualizing Post-Communist Regimes” (to be published by C.E.U. Press early this year). In one of his own pieces in the collection, using Russia as an example, Magyar describes the Mafia state as one run by a “patron” and his “court”—put another way, the boss and his clan—who appropriate public resources and the institutions of the state for their private use and profit. When I talked to Magyar on the phone on Monday, he told me that Trump is “like a Mafia boss without a Mafia. Trump cannot transform the United States into a Mafia state, of course, but he still acts like a Mafia boss.” Putin, on the other hand, “is a Mafia boss with a real Mafia, which has turned the whole state into a criminal state.” Still, he said, “the behavior at the top is the same.”

The Mafia state is efficient in its own way. It does not take over all state institutions, but absorbs only the ones necessary for extracting profit. Some structures therefore continue to work as though they were part of a normal state. This may explain why . . .

Continue reading.

Written by Leisureguy

31 January 2019 at 12:56 pm

The Wild Experiment That Showed Evolution in Real Time

Ed Yong writes in the Atlantic:

In the fall of 2010, Rowan Barrett was stuck. He needed a piece of land, one with plenty of mice, and after days of futile searching, he found himself at a motel bar in Valentine, Nebraska, doing what people do at bars: telling a total stranger about his problems.

A young evolutionary biologist, Barrett had come to Nebraska’s Sand Hills with a grand plan. He would build large outdoor enclosures in areas with light or dark soil, and fill them with captured mice. Over time, he would see how these rodents adapted to the different landscapes—a deliberate, real-world test of natural selection, on a scale that biologists rarely attempt.

But first, he had to find the right spots: flat terrain with the right color soil, an abundance of mice, and a willing owner. The last of these was proving especially elusive, Barrett bemoaned. Local farmers weren’t keen on giving up valuable agricultural land to some random out-of-towner. After knocking on door after door, he had come up empty. Hence: the bar.

Barrett’s drinking companion—Bill Ward, or Wild Bill to his friends—thought the idea was bizarre, but also fun. “He told me, ‘I’ve got this alfalfa field. You’re welcome to come by tomorrow. I’m okay with you building this thing,’” Barrett said to me. “I just about fell out of my chair.”

When researchers study evolution through natural selection, they typically focus on just one part of it. The essence of the process is this: Some genes confer beneficial traits. Those traits make their owners more likely to survive and reproduce in a given environment. Over time, those genes and traits become more common. So researchers might, for example, find genes behind certain traits (such as striped coats). Or they might link certain traits to success in a given environment (such as longer-legged lizards in hurricane-hit islands). Beyond some experiments with lab-grown microbes, they have rarely connected all the dots together.

That’s what Barrett accomplished. With hundreds of mice and years of research, he and his colleagues were able to show and measure, in the real world, “the full process of evolution by natural selection,” says Hopi Hoekstra of Harvard University, who led the study. “It’s all in one.”

It was also a pain in the ass. “Utter ignorance was a good thing,” said Barrett, who had, until this point, only ever worked with small fish. “Anyone who had worked with mice would have never attempted this.”

Once the team had Bill Ward on board, they ended up buying 30,000 pounds of stainless steel plates from a local hardware store, and carting them over to the farm using flatbeds and forklifts. There, they erected the plates in trenches two feet deep, creating square enclosures that were 164 feet across on each side. They built three such pens on light sand, and three on dark soil.

At first, the steel pens seemed to work. Mice could neither dig beneath the plates nor climb over them. They were, however, exceptionally good at sneaking through gaps where adjacent plates didn’t quite meet, so the team had to dig everything back up and pour concrete around the joints.

Nature itself seemed eager to select against the team. On one trip, high winds almost flipped the truck carrying the steel plates. Once, a team member fainted and cut himself on a piece of steel. During winter, ramps of snow would accumulate along the walls, so the team had to add an extra layer of mesh along the plates. They also had to catch all the rattlesnakes in the enclosures and throw them over the walls; Bill helped. “Everything goes wrong in the field,” Hoekstra says. “And we’re used to dealing with pipettes, not backhoes.”

When everything was finally set, the team evicted every mouse already inside the enclosures, and caught around 500 more from the surrounding hills. They photographed each rodent, took a DNA sample, implanted a tiny radio chip between its shoulders, and released it into one of the enclosures.

As time passed, many of the mice fell prey to owls, but after three months, the team returned and recaptured the ones that were left. Sure enough, . . .

Continue reading.

Written by Leisureguy

31 January 2019 at 12:03 pm

America’s Epidemic of Vaccine Exemptions

Emily Atkin writes in the New Republic:

New York and Washington allow parents to refuse vaccinations for non-medical reasons. Both states are experiencing major measles outbreaks. This is not a coincidence.

To be a parent in the 1950s was to know that your child would at some point contract measles, a highly contagious virus characterized by fever and rash. When it happened, most parents needed only to plan for a few days of care. But about 500 every year planned funerals.

The first measles vaccine in the U.S. was introduced in 1963, and the disease was officially eliminated in 2000. Since 2008, however, it has been creeping back. Nearly 350 measles cases were diagnosed in the country last year, the second-highest number since its elimination. Just one month into 2019, it seems certain that this year will be even worse.

At least 35 people, mostly children, have been diagnosed with measles in Washington state since January 1, prompting the state’s governor, Jay Inslee, to declare a state of emergency. Around 40 more have been diagnosed in New York this month, part of an outbreak there that’s seen at least 186 cases since October. Public health officials expect the outbreaks to spread further, and attribute both of them to the same problem: An increasing number of parents are refusing vaccinations for their kids.

Across the United States, children are required to be immunized from life-threatening diseases before they’re allowed to enter school or daycare. This not only protects the child from disease, but ensures that schools are safe places for immune-compromised kids and adults, as well as kids and adults who are medically unable to get vaccines. Vulnerable groups such as these rely on herd immunity, which is achieved when around 90 to 95 percent of the population is vaccinated.
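
[An aside on where that 90-to-95-percent figure comes from, not part of Atkin’s article: the sketch below uses the textbook herd-immunity threshold, 1 − 1/R0, and the commonly cited estimate that measles has an R0 of roughly 12 to 18. Both are assumptions supplied here, not figures from the piece. – LG]

# Herd-immunity threshold: roughly the fraction 1 - 1/R0 of the population must be immune.
# An R0 of 12-18 for measles is a commonly cited textbook range (an assumption, not from the article).
for r0 in (12, 18):
    threshold = 1 - 1 / r0
    print(f"R0 = {r0}: about {threshold:.0%} must be immune")
# Prints about 92% and 94%, consistent with the 90-to-95-percent range above.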

The majority of parents who reject these requirements today, however, aren’t from vulnerable groups. They’re opting out for their own religious or personal beliefs [or because getting the vaccination seems to them like too much trouble – LG]. Parents aren’t legally allowed to do that in every state, but can in the two states experiencing major measles outbreaks. Religious exemptions are permitted in New York, where the outbreak is primarily affecting the ultra-Orthodox Jewish community. Both personal and religious exemptions are allowed in Washington, which according to one infectious disease researcher has become “a major anti-vaccine hot spot due to non-medical vaccine exemptions that have nothing to do with religion.”

Routine childhood vaccination programs have been shown to prevent approximately 42,000 early deaths and 20 million cases of disease per year, saving $13.5 billion in direct costs. That’s why non-medical exemption laws are opposed by the American Medical Association, the American Academy of Pediatrics, the Infectious Diseases Society of America—basically every reputable medical organization out there. But nearly every state has them in some form. There is also “tremendous variability in the rigor with which such beliefs must be proved or documented,” according to the Pediatric Infectious Diseases Society (PIDS). In some states, “parents simply need to state that ‘their religion’ is against vaccination to be granted an exemption, even though no major religions specifically discourage vaccination.”

These problems are being compounded by the growth of the anti-vaccine movement, which argues that vaccines are more dangerous than the government and medical community claim, and thus no vaccines should be mandatory. Neither their facts nor their logic holds up. “Parents cannot be exempted from placing infants in car seats simply because they do not ‘believe’ in them,” argues PIDS. States also don’t allow belief exemptions from other laws intended to protect other people, such as the requirement to hold a driver’s license. “Whether or not children should be vaccinated before childcare or school entry ought not be a matter of ‘belief,’” the group argues. “Rather, it should be a matter of public policy based on the best available scientific evidence, and in this case the science is definitive: vaccines are safe and they save lives.”

So why doesn’t Congress just pass a vaccination law outlawing non-medical exemptions? “We would love it if they could do something at the federal level,” said Rich Greenaway, the director of operations for the advocacy group Vaccinate Your Family. “We’d be 100 percent behind it.” But it’s not clear that Congress has that legal authority. According to the Congressional Research Service, “the preservation of the public health has been the primary responsibility of state and local governments, and the authority to enact laws relevant to the protection of the public health derives from the state’s general police powers.” Creating a federal vaccination law would turn that historical precedent on its head. . .

Continue reading.

And in the Vancouver [B.C.] Sun, an article by Harrison Mooney:

The B.C. Centre for Disease Control is warning residents about a measles outbreak in the state of Washington.

Washington Gov. Jay Inslee declared a state of emergency Friday after dozens came down with the highly infectious disease in Washington’s southernmost county.

“Clark County is now reporting 30 cases of measles,” Inslee said Friday. “Because measles is contagious before people realize they are sick, those who are not vaccinated may spread the disease without knowing.”

The outbreak appears to have already spread to neighbouring Oregon, with one related case reported.

This year, B.C. has seen just one case of measles, in a traveller returning from the Philippines. No cases related to the Clark County outbreak have been reported in the province, but B.C.’s vaccination rate currently sits below the 95 per cent necessary to achieve herd immunity.

Anti-vaccination propaganda has already led to province-wide measles outbreaks in 2014 and 2010, and the BCCDC is understandably concerned about another as the disease spreads through the neighbouring states.

As a precaution, the BCCDC is advising residents, and border crossers especially, to review and, if necessary, update their vaccines, the best protection against measles. While it is expected most will not be affected by the outbreak, those who have not been vaccinated against the disease, including infants less than a year old, are at risk.

In B.C., children receive two doses of measles-containing vaccine, with the first after 12 months and the second at four years of age. . .

Continue reading.

The US is becoming a disease-ridden country with crumbling infrastructure and a poor educational system (because schools are understaffed and teachers are underpaid), a country that cannot even keep its government in operation. Does anyone notice any hints of decline in that description?

Written by Leisureguy

31 January 2019 at 11:17 am

Watch a single cell become a complete organism in six pulsing minutes of timelapse

Fascinating video on Aeon with this text:

Native to central and southern Europe, the amphibious alpine newt breeds in shallow water, where its larvae are born, hatch and feed on plankton, before sprouting legs and moving to land. This timelapse video from the Dutch director Jan van IJken tracks the development of a single-celled zygote into the hatched larva of an alpine newt. Captured in stunning detail at microscopic scales, Becoming is a remarkable look at the process of cell division and differentiation, whence all animals – from newts to humans – come. For more awe-inspiring biology from van IJken, watch The Art of Flying.

Do take a look at the video. Mesmerizing.

Written by Leisureguy

31 January 2019 at 11:05 am

Posted in Evolution, Science, Video

Mama Bear Hydrogen Fragrance and another fine lather

Another Mama Bear shave stick, another great lather. The character of the lather from glycerin-based soaps really does seem to be different. Pretty soon I’ll do some alternating among glycerin-based soaps and tallow-based and other-fat-based soaps and see if I can put my finger on exactly the difference they make in the shaving experience, but right now I’m still just enjoying the experience that glycerin-based soaps offer.

Hydrogen Fragrance is just a name (hydrogen itself being odorless) for “Top fruity notes of apple, grapefruit, peach and leafy greens; with middle notes of lily, lavender, rose and violet; and finally base notes with amber, sandalwood, and raspberry musk.”

I again loved the lather, and I did add just a little water to the Plisson brush as I worked up the lather. Three passes with the wonderful iKon open-comb produced a perfect result (totally smooth, no damage), and a splash of Lavanda finished the job.

Great way to start the day.

Written by Leisureguy

31 January 2019 at 7:56 am

Posted in Daily life, Shaving

There’s a Bigger Difference Between a 6-Point Scale and a 10-Point Scale Than You Think

Very strange. From a Kevin Drum post at Mother Jones.

Written by Leisureguy

30 January 2019 at 12:51 pm

Glycerin-based shaving soaps make wonderful lather

Moving on from unavailable glycerin-based shaving soaps, this morning I used one of Mama Bear’s shave sticks and got the same wonderful and amazing lather that the two (now unavailable) QED glycerin-based shave sticks produced. I highly recommend you try a glycerin-based soap.

One benefit of using the soap in stick form (other than the pleasure of seeing lather rise up on your face as you brush) is that you automatically do not use too much water: if the brush is very wet, water will spill down your face as you work up the lather there, so one naturally uses a brush that is damp, not wet. (You can easily add water, a very little at a time, as you work up the lather.)

Spellbound Woods is a favorite of mine. I don’t know what it is in a fragrance that makes it appealing, but whatever it is, this soap has quite a bit of it. The specifics are “Amber, Sandalwood, Vanilla, Cedarwood and the barest hint of light floral on the dry down.” I would bet it’s the Vanilla that captures my attention.

The RazoRock Keyhole did a fine job—I like this little brush a lot—and my Baili BR171 is one of the high-comfort high-efficiency razors I dote on. Three passes, perfect result. A splash of Saint Charles Shave Woods aftershave finished the job. (Scroll to the bottom of the list on this page.)

Written by Leisureguy

30 January 2019 at 8:21 am

Posted in Shaving

Meth, Murder, and Pirates: The Coder Who Became a Crime Boss

Wired has an excerpt from the book The Mastermind: Drugs. Empire. Murder. Betrayal, by Evan Ratliff:

How a brilliant self-made software programmer from South Africa single-handedly built an online startup that became one of the largest individual contributors to America’s burgeoning painkiller epidemic. In his world, everything was for sale. Pure methamphetamine manufactured in North Korea. Yachts built to outrun coast guards. Police protection and judges’ favor. Crates of military-grade weapons. Private jets full of gold. Missile-guidance systems. Unbreakable encryption. African militias. Explosives. Kidnapping. Torture. Murder. It’s a world that lurks just outside of our everyday perception, in the dark corners of the internet we never visit, the quiet ports where ships slip in by night, the back room of the clinic down the street.

MONROVIA, LIBERIA. SEPTEMBER 26, 2012

On a gray afternoon, three men enter a drab hotel room for a business meeting, months in the making. Two are white: a portly South African and his muscled European deputy. The other, with dark hair and a paunch of his own, is Latino—Colombian, or so he says. The hotel is in the Liberian capital, abutting the Atlantic Ocean on the coast of West Africa, but it could be any number of places in the world. The men’s business is drugs and weapons, and drugs and weapons are everywhere. They shake hands, nod heads, and begin speaking in the elliptical but familiar way of people who share the vernacular of a trade. They are cautious, but not cautious enough. A video exists to prove it.

“I can see why you picked this place,” says the South African, settling his substantial bulk into a maroon leather couch pressed against the wall. “Because it’s chaotic. It should be easy to move in and out, from what I’ve seen.” His name is Paul, and to a trained ear his cadence carries a tinge of not just South Africa but his childhood home, Zimbabwe, where he lived until his teens. His large white head is shaved close, and what hair remains has gone gray as he approaches forty. He has the look of a beach vacationer cleaned up for a dinner out, in an oversize blue polo shirt and a pair of khaki cargo shorts. His clothes seem out of keeping with both the scope of his international influence and the deal he is about to complete, with a man he believes to be the head of a South American drug cartel.

“Very easy,” replies the Colombian, whom Paul refers to only as Pepe. In the video recording of the meeting, Pepe sits down just offscreen, on a matching couch. His disembodied voice speaks in flawless, if heavily accented, English.

“Very few people, not too many eyes. It looks like the right place.”

“Trust me—what’s your name again?”

“Paul.”

“Paul, trust me, it’s the right place. I’ve been here already for quite a bit of time. And always, me and my organization, we pick places like this. First of all, for corruption. You can buy anything you want here. Anything. You just tell me what you need.”

“Yeah, it’s safe here,” Paul says. “If there’s a problem here, you can fix it. I understand this type of place.”

“Everything is easy here. Just hand to hand, boom boom boom, you can see,” Pepe says, laughing. “Well, thanks to your guy here, now we are meeting.” He gestures at the third man in the room, the European employee of Paul’s who goes by the name Jack. It was Jack who made the initial connection between Paul and Pepe.

The deal Jack brokered was complex enough that, when I meet him years later, I need him to walk me through it several times. The Colombians, who deal primarily in the cocaine produced in their own country, are looking to expand into methamphetamine, which they want to manufacture in Liberia and distribute to the United States and Europe.

Paul, a computer programmer who heads his own kind of cartel based in the Philippines, will provide the materials to build the Colombians’ meth labs: precursor chemicals, formulas for cooking them into meth, and a “clean room” in which to synthesize it all. While the labs are being built, Paul has agreed to also sell Pepe his own stash of meth, in exchange for an equivalent amount of cocaine at market rates.

After months of back-and-forth, Jack has urged Paul to travel to Liberia and meet his new associate “boss to boss” to finalize the deal.

“So where do you want to start?” Pepe says. “First of all is the clean room.”

Paul tells him that the parts needed to build it are already en route by boat. “If you have any problem, I’ll send guys here to assemble it like that.” He snaps his fingers.

“We shouldn’t have any. I got my guys here, my chemist.”

“To compensate you for the delays, we will just, when we do business, we will give you back the money.”

“Paul, you don’t have to compensate me for nothing.”

Paul flicks his hand in the air. “We feel bad it took so long.”

“This is just business,” Pepe says. “We don’t have to compensate, just doing business. This is about money.”

Pepe turns to the second part of the deal: the trade of his Colombian cocaine for Paul’s methamphetamine, a sample of which Paul has shipped to him from his base in the Philippines. “Let me ask you a question,” Pepe says.

“Sure.”

“You are not Filipino, why the Philippines?”

“Same reason you are in Liberia. Basically, as far as Asia goes, it’s the best shithole we can find, which gives us the ability to ship anywhere. It’s the best position in Asia. And it’s also a poor place. Not as bad as here, but we can still solve problems.”

“You are cooking your shit in the Philippines?” Pepe says.

“Actually, right now we manufacture in the Philippines and we also buy from the Chinese. We’re getting it from North Korea. So the quality you saw was very high.”

“That’s not just very high. That is awesome.”

“Yeah.”

“I was going to tell you that later on, but now that you talk about it: That stuff is fucking incredible.”

“That is manufactured by the North Koreans,” Paul says. “We get it from the Chinese, who get it from the North Koreans.”

“So my product is going to be the same, the amount that I’m going to buy from you?”

“The same. Exactly the same.” Paul nods. “I know you want the high quality for your market.”

“Yeah, because the product—you know that one of the best customers, and you probably know that, is the Americans.”

“Number one.”

“It’s the number one. They are fucking—they want everything over there. I don’t know what the word is from Spanish. Consumistas? Consumists?”

“Consumers,” Jack interjects, off-camera.

“Yeah, they buy everything and they never stop,” Paul says.

“So everything that I ship is to America,” Pepe says. “Trust me, when I brought this, fucking everyone was asking me for it. Everyone.”

Paul and Pepe consider different payment possibilities. First they will trade the cocaine for meth. After that, Paul says that he is happy to be paid in gold or diamonds. If they need to conduct bank transfers, he works primarily through China and Hong Kong, although he sounds a note of caution. “We just had, in Hong Kong, twenty million dollars frozen, by bullshit,” he says. “You need to be cautious. It becomes worse, because the American, he likes to control everything. And they are there, making a lot of trouble.”

“I say fuck Americans,” Pepe says. “Americans, like you say, they think that they can control everything, but they cannot. It’s not impossible, but they cannot. We have to be very careful.”

They discuss shipment methods, and how many kilos of each drug the other could move in a month. Paul owns ships already picking up loads in South America and traveling to Asia, but he much prefers to work in Africa, territory he knows well. His customers are in Australia, Thailand, China. “We are not touching the US right now,” he says.

“Why not?” . . .

Continue reading.

Written by Leisureguy

29 January 2019 at 2:50 pm

TAKEN: How police departments make millions by seizing property

Civil asset forfeiture is robbery by the government. Anna Lee, Nathaniel Cary, and Mike Ellis report in the Greenville (SC) News:

When a man barged into Isiah Kinloch’s apartment and broke a bottle over his head, the North Charleston resident called 911. After cops arrived on that day in 2015, they searched the injured man’s home and found an ounce of marijuana.

So they took $1,800 in cash from his apartment and kept it.

______

When Eamon Cools-Lartigue was driving on Interstate 85 in Spartanburg County, deputies stopped him for speeding. The Atlanta businessman wasn’t criminally charged in the April 2016 incident. Deputies discovered $29,000 in his car, though, and decided to take it.

______

When Brandy Cooke dropped her friend off at a Myrtle Beach sports bar as a favor, drug enforcement agents swarmed her in the parking lot and found $4,670 in the car.

Her friend was wanted in a drug distribution case, but Cooke wasn’t involved. She had no drugs and was never charged in the 2014 bust. Agents seized her money anyway.

She worked as a waitress and carried cash because she didn’t have a checking account. She spent more than a year trying to get her money back.

______

The Greenville News and Anderson Independent Mail examined these cases and every other court case involving civil asset forfeiture in South Carolina from 2014-2016.

Our examination was aimed at understanding this little-discussed, potentially life-changing power that state law holds over citizens — the ability of officers to seize property from people, even if they aren’t charged with a crime.

The resulting investigation became TAKEN, a statewide journalism project with an exclusive database and in-depth reporting. It’s the first time a comprehensive forfeiture investigation like this has been done for an entire U.S. state, according to experts.

The TAKEN team scoured more than 3,200 forfeiture cases and spoke to dozens of targeted citizens plus more than 50 experts and officials. Additionally, the team contacted every law enforcement agency in the state.

This yielded a clear picture of what is happening: Police are systematically seizing cash and property — many times from people who aren’t guilty of a crime — netting millions of dollars each year. South Carolina law enforcement profits from this policing tactic: the bulk of the money ends up in its possession.

The intent is to give law enforcement a tool to use against nefarious organizations by grabbing the fruits of their illegal deeds and using the proceeds to fight more crime.

Officers gather in places like Spartanburg County for contests with trophies to see who can make the largest or most seizures during highway blitzes. They earn hats, mementos and free dinners, and agencies that participate take home a cut of the forfeiture proceeds.

That money adds up. Over three years, law enforcement agencies seized more than $17 million, our investigation shows.

“We’ve heard so many awful stories,” said Hilary Shelton, director of the NAACP’s Washington bureau. “Having cash makes you vulnerable to an illicit practice by a police organization.

“It’s a dirty little secret. It’s so consistent with the issue of how law enforcement functions. They say, ‘Oh yeah, we want to make sure that resources used for the trafficking of drugs are stopped’ … but many of the people they are taking money from are not drug traffickers or even users.”

These seizures leave thousands of citizens without their cash and belongings or reliable means to get them back. They target black men most, our investigation found — with crushing consequences when life savings or a small business payroll is taken.

Many people never get their money back. Or they have to fight to have their property returned and incur high attorney fees.

Police officials respond by saying forfeiture allows them to hamstring crime rings and take money from drug dealers, a move they say hurts trafficking more than taking their drugs.

In 2016, when a Myrtle Beach police unit broke up a sophisticated drug ring called the 24/7 Boyz that offered a dispatch system to order drugs and have them delivered on demand, the police used seizure powers. They took cars, firearms, a four-bedroom house and $80,000. They also arrested 12 people.

Fifteenth Circuit Solicitor Jimmy Richardson initially prosecuted the case before turning it over to the federal government. In January, 10 of the 12 defendants pleaded guilty to drug conspiracy charges.

Richardson said taking a drug ring’s operating cash strikes a critical blow against traffickers in a way that criminal charges don’t. “A drug enterprise is an onion, it’s a multitude of layers,” he said. “Some tools hurt the traffickers, some hurt the enterprise itself. I feel this hurts the enterprise.”

Agencies also said funding for their work would be imperiled without the profit from this tool. Clemson Police Chief Jimmy Dixon said losing those profits could shut down his agency’s K-9 unit entirely. Undercover narcotics operations overall would suffer, Dixon said, citing limits on the department’s operating budget.


The TAKEN investigation’s key findings:

•  Black men pay the price for this program. They represent 13 percent of the state’s population. Yet 65 percent of all citizens targeted for civil forfeiture in the state are black males.

“These types of civil asset forfeiture practices are going to put a heavier burden on lower-income people,” said Ngozi Ndulue, recently a national NAACP senior director, now working at the Death Penalty Information Center. “And when you add in racial disparities around policing and traffic stops and arrest and prosecution, we know this is going to have a disproportionate effect on black communities.”

•  If you are white, you are twice as likely to get your money back as you are if you are black.

•  Nearly one-fifth of people who had their assets seized weren’t charged with a related crime. Out of more than 4,000 people hit with civil forfeiture over three years, 19 percent were never arrested. They may have left a police encounter without so much as a traffic ticket. But they also left without their cash.

Roughly the same number — nearly 800 people — were charged with a crime but not convicted.

Greenville attorney Jake Erwin said the overarching idea is that the money being seized is earnings from past drug sales, so it’s fair game. “In theory, that makes a little bit of sense,” he said. “The problem is that they don’t really have to prove that.”

In some states, the suspicions behind a civil forfeiture must be proven beyond a reasonable doubt in court, but there is no requirement of proof in South Carolina. When a forfeiture is contested, prosecutors only have to show a preponderance of evidence to keep seized goods.

Police don’t just seize cash.

Practically anything can be confiscated and sold at auction: jewelry, electronics, firearms, boats, RVs. In South Carolina, 95 percent of forfeiture revenue goes back to law enforcement. The rest is deposited into the state’s general fund.

•  Most of the money isn’t coming from kingpins or money laundering operations. It’s coming from hundreds of encounters where police take smaller amounts of cash, often when they find regular people with drugs for personal use. Customers, not dealers. More than 55 percent of the time when police seized cash, they took less than $1,000.

•  Your cash or property can disappear in minutes but take years to get back. The average time between when property is seized and when a prosecutor files for forfeiture is 304 days, with the items in custody the whole time. Often, it’s far longer, well beyond the two-year period state law allows for a civil case to be filed. But only rarely are prosecutors called out for missing the filing window and forced to return property to owners.

•  The entire burden of recovering property is on the citizens, who must prove the goods belong to them and were obtained legally. Since it’s not a criminal case, an attorney isn’t provided. Citizens are left to figure out a complex court process on their own. Once cases are filed, they have 30 days to respond. Most of the time, they give up.

More: Hospital called police, who seized man’s money

More: She gave her friend a ride and lost her wages

•  The bulk of forfeited money finances law enforcement, but there’s little oversight of what is seized or how it’s spent. Police use it to pay for new guns and gear, for training and meals and for food for their police dogs. In one case, the Spartanburg County sheriff kept a top-of-the-line pick-up truck as his official vehicle and sold countless other items at auctions.

In many other places, changes are being made: 29 states have taken steps to reform their forfeiture process. Fifteen states now require a criminal conviction before property can be forfeited, according to the Institute for Justice, a non-profit libertarian law firm.

South Carolina lawmakers have crafted reform bills in recent General Assembly sessions, but none of the efforts made it out of committee.

To critics, South Carolina is the poster child for the injustice inherent in the for-profit civil forfeiture system, said Louis Rulli, a law expert at the University of Pennsylvania.

Forfeiture doesn’t square with the rest of the justice system, Rulli said. “How could it be possible that my property could be taken when I am not even charged with any criminal offense? It seems un-American,” he said.

Those who pay the biggest price are black men. Men like Kinloch. While he was hospitalized for a head injury from a home intruder, North Charleston police removed money from the tattoo artist’s apartment.

That department earns 12 percent of its annual operating budget from cash and property seized under civil law, our investigation shows.

“The robber didn’t get anything, but the police got everything,” said the 28-year-old Kinloch.

Police charged him with possession with intent to distribute after finding the marijuana in his apartment, but the charge was dismissed.

Kinloch never got his cash back.

Rent was due.

Without his $1,800, he couldn’t pay the landlord and was forced out of his home.

More: He fought off a robber, but police seized his $1,800

Kinloch isn’t alone as a black man facing forfeiture over small, or nonexistent, criminal charges in South Carolina. Our investigation found that black men make up the largest share by far of people targeted for civil forfeiture, much higher than even the drug arrest or incarceration rate for black men. Read about our exclusive findings here:  . . .

Continue reading. There’s much more, and it sounds exactly like what happens in a police state: authoritarians making up the rules to suit themselves.

Written by Leisureguy

29 January 2019 at 2:09 pm

Germs in Your Gut Are Talking to Your Brain. Scientists Want to Know What They’re Saying.

leave a comment »

Carl Zimmer reports in the NY Times:

In 2014 John Cryan, a professor at University College Cork in Ireland, attended a meeting in California about Alzheimer’s disease. He wasn’t an expert on dementia. Instead, he studied the microbiome, the trillions of microbes inside the healthy human body.

Dr. Cryan and other scientists were beginning to find hints that these microbes could influence the brain and behavior. Perhaps, he told the scientific gathering, the microbiome has a role in the development of Alzheimer’s disease.

The idea was not well received. “I’ve never given a talk to so many people who didn’t believe what I was saying,” Dr. Cryan recalled.

A lot has changed since then: Research continues to turn up remarkable links between the microbiome and the brain. Scientists are finding evidence that the microbiome may play a role not just in Alzheimer’s disease, but also in Parkinson’s disease, depression, schizophrenia, autism and other conditions.

For some neuroscientists, new studies have changed the way they think about the brain.

One of the skeptics at that Alzheimer’s meeting was Sangram Sisodia, a neurobiologist at the University of Chicago. He wasn’t swayed by Dr. Cryan’s talk, but later he decided to put the idea to a simple test.

“It was just on a lark,” said Dr. Sisodia. “We had no idea how it would turn out.”

He and his colleagues gave antibiotics to mice prone to develop a version of Alzheimer’s disease, in order to kill off much of the gut bacteria in the mice. Later, when the scientists inspected the animals’ brains, they found far fewer of the protein clumps linked to dementia.

Just a little disruption of the microbiome was enough to produce this effect. Young mice given antibiotics for a week had fewer clumps in their brains when they grew old, too.

“I never imagined it would be such a striking result,” Dr. Sisodia said. “For someone with a background in molecular biology and neuroscience, this is like going into outer space.”

Following a string of similar experiments, he now suspects that just a few species in the gut — perhaps even one — influence the course of Alzheimer’s disease, perhaps by releasing a chemical that alters how immune cells work in the brain.

He hasn’t found those microbes, let alone that chemical. But “there’s something in there,” he said. “And we have to figure out what it is.”

Scientists have long known that microbes live inside us. In 1683, the Dutch scientist Antonie van Leeuwenhoek put plaque from his teeth under a microscope and discovered tiny creatures swimming about.

But the microbiome has stubbornly resisted scientific discovery. For generations, microbiologists only studied the species that they could grow in the lab. Most of our interior occupants can’t survive in petri dishes.

In the early 2000s, however, the science of the microbiome took a sudden leap forward when researchers figured out how to sequence DNA from these microbes. Researchers initially used this new technology to examine how the microbiome influences parts of our bodies rife with bacteria, such as the gut and the skin.

Few of them gave much thought to the brain — there didn’t seem to be much point. The brain is shielded from microbial invasion by the so-called blood-brain barrier. Normally, only small molecules pass through.

“As recently as 2011, it was considered crazy to look for associations between the microbiome and behavior,” said Rob Knight, a microbiologist at the University of California, San Diego.

He and his colleagues discovered some of the earliest hints of these links. Investigators took stool from mice with a genetic mutation that caused them to eat a lot and put on weight. They transferred the stool to mice that had been raised germ-free — that is, entirely without gut microbiomes — since birth.

After receiving this so-called fecal transplant, the germ-free mice got hungry, too, and put on weight.

Altering appetite isn’t the only thing that the microbiome can do to the brain, it turns out. Dr. Cryan and his colleagues, for example, have found that mice without microbiomes become loners, preferring to stay away from fellow rodents.

The scientists eventually discovered changes in the brains of these antisocial mice. One region, called the amygdala, is important for processing social emotions. In germ-free mice, the neurons in the amygdala make unusual sets of proteins, changing the connections they make with other cells.

Studies of humans revealed some surprising patterns, too. Children with autism have unusual patterns of microbial species in their stool. Differences in the gut bacteria of people with a host of other brain-based conditions also have been reported.

But none of these associations proves cause and effect. Finding an unusual microbiome in people with Alzheimer’s doesn’t mean that the bacteria drive the disease. It could be the reverse: People with Alzheimer’s disease often change their eating habits, for example, and that switch might favor different species of gut microbes.

Fecal transplants can help pin down these links. In his research on Alzheimer’s, Dr. Sisodia and his colleagues transferred stool from ordinary mice into the mice they had treated with antibiotics. Once their microbiomes were restored, the antibiotic-treated mice started developing protein clumps again.

“We’re extremely confident that it’s the bacteria that’s driving this,” he said. Other researchers have taken these experiments a step further by using human fecal transplants.

If you hold a mouse by its tail, it normally wriggles in an effort to escape. If you give it a fecal transplant from humans with major depression, you get a completely different result: The mice give up sooner, simply hanging motionless.

As intriguing as this sort of research can be, it has a major limitation. Because researchers are transferring hundreds of bacterial species at once, the experiments can’t reveal which in particular are responsible for changing the brain.

Now researchers are pinpointing individual strains that seem to have an effect.

To study autism, Dr. Mauro Costa-Mattioli and his colleagues at the Baylor College of Medicine in Houston investigated different kinds of mice, each of which displays some symptoms of autism. A mutation in a gene called SHANK3 can cause mice to groom themselves repetitively and avoid contact with other mice, for example.

In another mouse strain, Dr. Costa-Mattioli found that feeding mothers a high-fat diet makes it more likely their pups will behave this way.

When the researchers investigated the microbiomes of these mice, they found the animals lacked a common species called Lactobacillus reuteri. When they added a strain of that bacteria to the diet, the animals became social again.

Dr. Costa-Mattioli found evidence that L. reuteri releases compounds that send a signal to nerve endings in the intestines. The vagus nerve sends these signals from the gut to the brain, where they alter production of a hormone called oxytocin that promotes social bonds.

Other microbial species also send signals along the vagus nerve, it turns out. Still others communicate with the brain via the bloodstream.

It’s likely that this influence begins before birth, as a pregnant mother’s microbiome releases molecules that make their way into the fetal brain.

Mothers seed their babies with microbes during childbirth and breast feeding. During the first few years of life, both the brain and the microbiome rapidly mature.

To understand the microbiome’s influence on the developing brain, Rebecca Knickmeyer, a neuroscientist at Michigan State University, is studying fMRI scans of infants.

In her first study, published in January, she focused on the amygdala, the emotion-processing region of the brain that Dr. Cryan and others have found to be altered in germ-free mice.

Dr. Knickmeyer and her colleagues measured the strength of the connections between the amygdala and other regions of the brain. Babies with a lower diversity of species in their guts have stronger connections, the researchers found.

Does that mean a low-diversity microbiome makes babies more fearful of others? It’s not possible to say yet — but Dr. Knickmeyer hopes to find out by running more studies on babies.

As researchers better understand how the microbiome influences the brain, they hope doctors will be able to use it to treat psychiatric and neurological conditions.

It’s possible they’ve been doing it for a long time — without knowing.

In the early 1900s, neurologists found that putting people with epilepsy on a diet low in carbohydrates and high in protein and fat sometimes reduced their seizures.

Epileptic mice experience the same protection from a so-called ketogenic diet. But no one could say why. Elaine Hsiao, a microbiologist at the University of California, Los Angeles, suspected that the microbiome was the reason.

To test the microbiome’s importance, Dr. Hsiao and her colleagues raised mice free of microbes. When they put the germ-free epileptic mice on a ketogenic diet, they found that the animals got no protection from seizures.

But if they gave the germ-free animals stool from mice on a ketogenic diet, seizures were reduced.

Dr. Hsiao found that two types of gut bacteria in particular thrive in mice on a ketogenic diet. They may provide their hosts with building blocks for neurotransmitters that put a brake on electrical activity in the brain.

It’s conceivable that people with epilepsy wouldn’t need to go on a ketogenic diet to get its benefits — one day, they may just take a pill containing the bacteria that do well on the diet.

Sarkis Mazmanian, a microbiologist at Caltech, and his colleagues have identified a single strain of bacteria that triggers symptoms of Parkinson’s disease in mice. He has started a company that is testing a compound that may block signals that the microbe sends to the vagus nerve.  . .

Continue reading.

Written by Leisureguy

29 January 2019 at 1:19 pm

We Need to Talk about Intestinal Worms

leave a comment »

Ellen Agler, with Mojie Crigler, writes in Scientific American:

In 1909, John D. Rockefeller, Sr., deeded 72,000 shares of the Standard Oil Company to establish a foundation dedicated to the promotion of health and the reduction of disease. Hookworm, then rampant throughout the U.S. South, was the Rockefeller Foundation’s first undertaking. The parasite latches onto the wall of its host’s small intestine, causing iron and protein deficiencies, stunted physical and cognitive growth and profound lethargy. Now, 110 years later, where does hookworm stand? How much progress has been made, and what work remains?

In many ways, the Rockefeller Sanitary Commission for the Eradication of Hookworm Disease (RSC) set the standard for public health programs to map, treat, educate, scale up, work with other sectors (in hookworm’s case, water, sanitation and hygiene) and collaborate with local leaders in government, media, churches and schools. And while technology has advanced, the intent is unchanged.

The RSC’s educational silent films and country fair exhibits have given way to TED talks and text-message alerts, but all are used to raise awareness and encourage preventive behavior. Today, smartphones and satellites are used to map hookworm’s location, replacing the surveys and voluntarily mailed-in stool samples that the RSC relied on. One can only imagine what devices will be available for mapping diseases 110 years from now.

Some aspects of the anti-hookworm effort have undergone paradigm shifts. Regarding treatment, the RSC aimed to make the host environment inhospitable for the parasite, through ingestion of potential poisons (extract of male fern or thymol with Epsom salts). By contrast, modern treatments target the worms instead of the human. The drugs albendazole and mebendazole can disrupt adult hookworms’ metabolism and reproduction, most often with little to no side effects in the host.

In the past, infected people repeated treatments until their stools were worm-free. Moreover, the RSC treated infected individuals one at a time; now entire districts, regions, even nations receive deworming medicine via regularly scheduled mass drug administrations. Then and now, treatment is combined with strong efforts to improve sanitation and hygiene.

By the 1930s (much later than the RSC had anticipated), hookworm had declined in the South in large part because fewer people were exposed to the parasite, thanks to more indoor plumbing, paved roads, farming machinery (which took people out of the fields) and an overall rise of the standard of living. In 2019, people infected with hookworm are likely to be among the 1.5 billion most impoverished people in the world. Countries in Africa and Southeast Asia have the greatest prevalence of hookworm, which is often co-endemic with whipworm and roundworm (the three are collectively called “intestinal worms”). A rising standard of living can’t be counted on to eliminate intestinal worms from endemic regions. Rather, the elimination of intestinal worms as a public health problem likely will contribute to a rise in the standard of living.

In the late 1990s, the American economists Michael Kremer and Edward Miguel conducted research in Kenya demonstrating that, compared to programs that added more textbooks or teachers, deworming school children cost the least and had the greatest effect on education. (“Deworming” refers to intestinal worms and schistosomiasis, a disease in which the parasitic worm lives in the host’s intestine, bladder, or reproductive organs; as these four diseases are often co-endemic, treatment is often integrated.)

Kremer and Miguel found that students who were dewormed attended school 25 percent more often than those who were not dewormed. Their ability to learn improved. Siblings and neighbors, exposed to a smaller pool of parasites, also benefited. Years later, Kremer and Miguel followed up with participants from their original study and found that those who had been dewormed were earning more money. In the bigger picture, according to a study from Erasmus University, African economies could gain $52 billion by 2030 in increased productivity if NTDs (neglected tropical diseases) were ended by 2020.

The RSC offered robust economic arguments for its hookworm program–from the still-relevant point that employers incur lost-labor costs when workers are infected, to an insistence that its work was not charity but an investment. Whereas the RSC dispatched mobile dispensaries, a key goal of modern programs is sustainability: locally funded, locally run. Fortunately, albendazole and mebendazole (for hookworm) and praziquantel (for schistosomiasis) are donated by pharmaceutical companies.

Deworming costs 20–50 cents per person per year, on average; delivery is a program’s biggest expense. Many deworming programs are based at schools, because students are a captive audience and the schools, which frequently double as community centers, provide a ready-made infrastructure for record-keeping and disseminating information on diseases and good sanitation and hygiene practices.

In 2017, 598 million children around the world were treated for intestinal worms. Compare that to . . .

Continue reading.

Written by Leisureguy

29 January 2019 at 12:42 pm

Corporations behaving badly: Ex-IBM Executive Says She Was Told Not to Disclose Names of Employees Over Age 50 Who’d Been Laid Off

leave a comment »

Peter Gosselin reports in ProPublica:

In sworn testimony filed recently as part of a class-action lawsuit against IBM, a former executive says she was ordered not to comply with a federal agency’s request that the company disclose the names of employees over 50 who’d been laid off from her business unit.

Catherine A. Rodgers, a vice president who was then IBM’s senior executive in Nevada, cited the order among several practices she said prompted her to warn IBM superiors the company was leaving itself open to allegations of age discrimination. She claims she was fired in 2017 because of her warnings.

Company spokesman Edward Barbini labeled Rodgers’ claims related to potential age discrimination “false,” adding that the reasons for her firing were “wholly unrelated to her allegations.”

Rodgers’ affidavit was filed Jan. 17 as part of a lawsuit in federal district court in New York. The suit cites a March 2018 ProPublica story reporting that IBM engaged in a strategy designed to, in the words of one internal company document, “correct seniority mix” by flouting or outflanking U.S. anti-age discrimination laws to force out tens of thousands of older workers in the five years through 2017 alone.

Rodgers said in an interview Sunday that IBM “appears to be engaged in a concerted and disproportionate targeting of older workers.” She said that if the company releases the ages of those laid off, something required by federal law and that IBM did until 2014, “the facts will speak for themselves.”

“IBM is a data company. Release the data,” she said.

Rodgers is not a plaintiff in the New York case but intends to become one, said Shannon Liss-Riordan, the attorney for the employees.

IBM has not yet responded to Rodgers’ affidavit in the class-action suit. But in a filing in a separate age-bias lawsuit in federal district court in Austin, Texas, where a laid-off IBM sales executive introduced the document to bolster his case, lawyers for the company termed the order for Rodgers not to disclose the layoffs of older workers from her business unit “unremarkable.”

They said that the U.S. Department of Labor sought the names of the workers so it could determine whether they qualified for federal Trade Adjustment Assistance, or TAA, which provides jobless benefits and re-training to those who lose their jobs because of foreign competition. They said that company executives concluded that only one of about 10 workers whose names Rodgers had sought to provide qualified.

In its reporting, ProPublica found that IBM has gone to considerable lengths to avoid reporting its layoff numbers by, among other things, limiting its involvement in government programs that might require disclosure. Although the company has laid off tens of thousands of U.S. workers in recent years and shipped many jobs overseas, it sought and won TAA aid for just three during the past decade, government records show.

Company lawyers in the Texas case said that Rodgers, 62 at the time of her firing and a 39-year veteran of IBM, was let go in July 2017 because of “gross misconduct.”

Rodgers said that she received “excellent” job performance reviews for decades before questioning IBM’s practices toward older workers. She rejected the misconduct charge as unfounded.

Legal action against IBM over its treatment of older workers appears to be growing. In addition to the suits in New York and Texas, cases are also underway in California, New Jersey and North Carolina.

Liss-Riordan, who has represented workers against a series of tech giants including Amazon, Google and Uber, has added 41 plaintiffs to the original three in the New York case and is asking the judge to require that IBM notify all U.S. workers whom it has laid off since July 2017 of the suit and of their option to challenge the company.

One complicating factor is that IBM requires departing employees who want to receive severance pay to sign a document waiving their right to take the company to court and limiting them to private, individual arbitration. Studies show this process rarely results in decisions that favor workers. To date, neither plaintiffs’ lawyers nor the government has challenged the legality of IBM’s waiver document.

Many ex-employees also don’t act within the 300-day federal statute of limitations for bringing a case. Of about 500 ex-employees who Liss-Riordan said contacted her since she filed the New York case last September, only 100 had timely claims and, of these, only about 40 had not signed the waivers and so were eligible to join the lawsuit. She said she’s filed arbitration cases for the other 60.

At key points, Rodgers’ account of IBM’s practices is similar to those reported by ProPublica. Among the parallels:

  • Rodgers said that all layoffs in her business unit were of older workers and that younger workers were unaffected. (ProPublica estimated that about 60 percent of the company’s U.S. layoffs from 2014 through 2017 were workers age 40 and above.)
  • She said that she and other managers were told to encourage workers flagged for layoff to use IBM’s internal hiring system to find other jobs in the company even as upper management erected insurmountable barriers to their being hired for these jobs.
  • Rodgers said the company reversed a decades-long practice of encouraging employees to work from home and ordered many to begin reporting to a few “hub” offices around the country, a change she said appeared designed to prompt people to quit. She said that in one case an employee agreed to relocate to Connecticut only to be told to relocate again to North Carolina. . . .

Continue reading.

Written by Leisureguy

29 January 2019 at 12:16 pm

How the GOP Prompted the Decay of Political Norms

leave a comment »

E.J. Dionne, Jr., Norm Ornstein, and Thomas Mann last September published the book One Nation After Trump: A Guide for the Perplexed, the Disillusioned, the Desperate, and the Not-Yet Deported. The Atlantic has this extract from the book:

President Trump’s approach to governance is unlike that of his recent predecessors, but it is also not without antecedents. The groundwork for some of this dysfunction was laid in the decades before Trump’s emergence as a political figure. Nowhere is that more true than in the disappearance of the norms of American politics.

Norms are defined as “a standard or pattern, especially of social behavior, that is typical or expected of a group.” They are how a person is supposed to behave in a given social setting. We don’t fully appreciate the power of norms until they are violated on a regular basis. And the breaching of norms often produces a cascading effect: As one person breaks with tradition and expectation, behavior previously considered inappropriate is normalized and taken up by others. Donald Trump is the Normless President, and his ascendancy threatens to inspire a new wave of norm-breaking.

This would be bad enough if he were entirely a one-off, an amoral figure who suddenly burst onto the scene and took advantage of widespread discontent and an electoral system that tilts outcomes in the direction of his politics. But Trumpism has long been in gestation. His own party, sometimes consciously, sometimes not, has been undercutting the norms of American politics for decades. As the traditionalist conservative Rod Dreher has written, “Trump didn’t come from nowhere. George W. Bush, the Republican Party, and movement conservatism bulldozed the field for Trump without even knowing what they were doing.”

The United States has to hope that in the long run, more Republicans will join the ranks of the conservatives who already understand the damage Trump’s indifference to informal ethical benchmarks is inflicting on our political system. But to do so effectively, they will, as Dreher suggests, have to reexamine their own past and the deterioration in the standards of political behavior that took root in their party. And this will only happen if Republican officials come to see altering the course of the modern conservative movement as a political imperative.

Parties, from the beginning of the Republic, have been a central force in American politics, clarifying the policy choices available to American voters. They provide the basis for organizing elections and political power in the institutions of government even as they compete constantly for loyalty and fealty with the institutions themselves. Members of Congress loyal to the president’s party sometimes reflexively follow his lead, denying or papering over his failings and failures. At other moments—driven by personal beliefs or constituency interests, by electoral imperatives, and sometimes, at least, by faithfulness to the public interest and the fundamentals of the Constitution—they keep their distance from him. And members of the party opposed to the president often challenge his positions.

But during some periods of divided government, when one party controls the White House and the other has a majority in the House, the Senate, or both chambers, cross-party coalitions where parties share responsibility for governance have thrived. As political scientist David Mayhew showed, divided government during the decades following World War II produced significant legislative achievements—and arguably did so as or more often than when a single party held all the reins of power.

Strong Democratic majorities in Congress in the 1930s voted for sweeping New Deal legislation—but many Democrats joined with Republicans to block Franklin Roosevelt’s attempt to enlarge the Supreme Court. Republicans in the majority in 1947-48 vigorously opposed most of Harry Truman’s agenda—leading to his famous campaign in 1948 against the “Do-Nothing Republican Congress.” But the same Congress joined with Truman to enact the Marshall Plan, as well as a historic and (to this point, at least) enduring reorganization of the national-security apparatus that created the National Security Council and made it easier to coordinate defense and foreign policies. Most Democrats in the Reagan era opposed his initial plans to slash government and cut taxes, but conservative Democrats provided enough votes for Reagan to enact an early package. Then, in subsequent years, Democrats bargained with him to increase taxes to combat the burgeoning deficit his program produced and to stave off further spending cuts.

So what happened? Parties have certainly become more polarized, shaped by the great ideological and geographical sorting that began in the 1960s. The South, realigned by Lyndon Johnson’s commitment to civil rights, lost its status as nearly uniformly Democratic and gradually became the GOP’s most important power center. New England and the West Coast had once been strongholds of an often-moderate brand of Republicanism. They became bastions of Democratic strength. A repolarized partisanship solidified by the 1990s and became even more pronounced after 2008.

Polarized parties encouraged polarized policymaking, but room still existed for occasional cross-party cooperation. The State Children’s Health Insurance Program (S-CHIP), which covered almost 9 million children in 2016, would never have been enacted without the odd-couple partnership between the loyally liberal Senator Ted Kennedy of Massachusetts and the faithfully conservative Senator Orrin Hatch from Utah. Over four decades in Congress, from 1975 to 2015, Representative Henry Waxman, a staunch liberal, found common ground for compromise with conservative Republicans—including Reagan. The results: groundbreaking policies in health care and the environment. Waxman also conducted bipartisan investigations during the Bush administration in cooperation with Republican Representative Tom Davis of Virginia.

The norms inculcated over many decades led to an elaborate language of respect (“my distinguished colleague”) toward fellow legislators that often seemed out of place during particularly emotional and intense debates. They could also lead to amusing understatement. In the 1960s, House Speaker John McCormack of Massachusetts would express his distress over the behavior of a Republican on the floor by saying: “I hold the distinguished gentleman in minimum high regard.”

Tribalism, which cast members of the opposing party not as worthy adversaries but as dangerous enemies, swept that respect away. The change began with Newt Gingrich, who came to Congress in 1979 determined to nationalize congressional elections and convince voters that Washington was so dreadful and corrupt that anything would be an improvement over the status quo. When he recruited candidates, he offered them a language of partisan militancy. “You’re fighting a war,” Gingrich characteristically told a group of college Republicans in 1978. “It is a war for power. … Don’t try to educate. That is not your job. What is the primary purpose of a political leader? To build a majority.” And he did, winning an extraordinary victory in 1994 that gave Republicans control of the House for the first time in 40 years. That heralded a period of intense competition for control of the House and Senate, which itself fueled the hyper-partisanship that came to characterize national politics more generally.

Gingrich transformed the Republican Party in Congress. His recruits came in believing what Gingrich had taught them. Although he had a deep interest in science, Gingrich launched an attack on the use of science and facts in public policy that would be picked up by other Republican politicians in the years to come. One of the more enduring norms of Congress was that evidence vetted by acknowledged experts would frame debate and deliberation. Lawmakers could differ sharply on policy solutions, but all would share facts curated by the experts. As speaker, Gingrich abolished the Office of Technology Assessment, a blue-ribbon congressional agency that had been established for scientists to offer objective analysis on issues ranging from defense and space to climate and energy. The new majority defended shuttering the office’s doors as a cost-saving measure, and it was part of Gingrich’s broader (and largely successful) effort to centralize power in the speaker’s office. But the move also sent a message that ideological commitments would trump evidence.

Although Gingrich’s tenure as speaker ended in 1998, the atmosphere he helped to create persisted and was amplified by his less colorful successor, Dennis Hastert. Goaded by his lieutenant Tom DeLay (who was, in many ways, the real leader of the House), Hastert and House Republicans culminated their sustained assault on the Clinton presidency by pushing for the impeachment of Bill Clinton, a move that most Americans saw as aggressively partisan.

And when George W. Bush succeeded Clinton, the Hastert-led House transformed itself into an arm of the executive, creating a custom that became known as the “Hastert Rule.” Closing off the option of broad bipartisan coalitions in support of legislation, the “Rule” declared that the House would now rely only on Republican votes to pass bills, and they would reach the floor only if they secured a “majority of the majority.” To promote the Bush agenda, Hastert also bent both the existing rules and customs of the House. There would be few amendments permitted; bills would be written not in committee but by party leaders; and open processes (such as conference committees to reconcile differences between bills passed in the House and the Senate) would be discouraged. In a particularly flagrant episode, Hastert and DeLay held open a roll-call vote to pass George W. Bush’s prescription-drug benefit under Medicare for three hours, rather than the customary 15 minutes, in order to avoid defeat. They secured the final vote for the bill, but DeLay was later admonished by the House Ethics Committee for offering to a retiring GOP House member an endorsement for his son, who was seeking to succeed the father, in an effort to secure his vote and get the bill passed.

The Republican Party’s disregard for political norms intensified further with President Obama’s election. Immediately after Obama’s inauguration in 2009, Senate Republican leader Mitch McConnell and his colleagues embarked on a deliberate strategy of obstruction across a broad range of policies. McConnell made his objective clear in a comment that came to epitomize his approach: “The single most important thing we want to achieve is for President Obama to be a one-term president.” Republicans tried to cast their response as a reaction to purported aloofness and high-handedness on the part of Obama and his congressional allies. In fact, as Republican Senator George Voinovich explained, McConnell from the start had advised his colleagues of Obama, “If he was for it, we had to be against it.”

McConnell bent the norms of the Senate to a degree the body had never seen before in his use—and misuse—of the filibuster. Cloture motions to end a filibuster (an imperfect but helpful measure of how often the filibuster was used to block Senate business) were filed rarely in the 1970s—in some years, they averaged less than one per month. During the Obama era, Democratic Majority Leader Harry Reid took to filing for cloture more than once a week. And in 2013, when a frustrated Reid decided to eliminate the filibuster for all presidential nominations except for the Supreme Court, a Congressional Research Service study showed how dramatic the abuse had become. In all of American history, it found, 168 cloture motions had been filed on presidential nominations—and nearly half of them, 82, happened during Obama’s presidency.

Reid did not take this action lightly. It came only after another threat, when McConnell made clear that no matter whom President Obama nominated to fill the three vacancies on the D.C. Circuit Court of Appeals, Republicans would filibuster the nominees through the entire Obama term to preserve the Court’s conservative majority.

But there was no better example of extreme partisanship than McConnell’s refusal to consider any nominee Obama put forward to replace Supreme Court Justice Antonin Scalia after Scalia’s sudden death in February 2016. McConnell argued that the “American people should have a voice in the selection of their next Supreme Court Justice” and that “this vacancy should not be filled until we have a new president.” This was a radical departure. Supreme Court nominees had been rejected before, but except for those who withdrew, none in recent memory had been denied both a hearing and a vote. That it was justified with a risible claim to being democratic, as if the American people hadn’t reelected Obama for a full four-year second term, showed just how far McConnell was willing to go. And nearly all of McConnell’s colleagues supported this strategy, one by one announcing that they, too, would seek to delay hearings and a vote on a nominee until Obama had left the White House. This even included Senator Orrin Hatch of Utah, who had once praised Merrick Garland, the judge Obama eventually nominated, as a “consensus nominee.”

Upon taking office, Trump quickly nominated Tenth Circuit Court of Appeals judge Neil Gorsuch to fill Scalia’s seat. Democrats moved to filibuster Gorsuch’s nomination, citing their opposition to Republican treatment of Garland and Gorsuch’s staunch conservative record. Immediately, McConnell invoked the “nuclear option,” as Reid had earlier, this time allowing Supreme Court nominations and not simply lower-court, cabinet, and subcabinet confirmations to be pushed through on a simple-majority vote. The Republicans succeeded, and Gorsuch was confirmed. But a line had been crossed. It became increasingly difficult to avoid seeing the Court itself as merely another partisan institution.

McConnell’s disregard for Senate norms was not limited to his use and abuse of filibusters. When the effort to repeal Obamacare came to the Senate in May 2017, McConnell created a process that had virtually no precedent when it came to considering a major policy change. He named a group of 13 Republican senators as a health-policy task force, bypassing the committees that have jurisdiction over the issues at stake. They met in complete secrecy. McConnell made clear that he would not bring Democrats into the process at all. He would not hold a single hearing. His plan was to rush the bill to passage with as little debate as possible, on the accurate assumption that the more the public knew about the details of the plan, the less likely it would be to pass. The gambit failed in its initial objective. A Congressional Budget Office estimate that the bill would throw 22 million Americans off the insurance rolls led to resistance among key Republican senators from states that had benefited from Obamacare. McConnell had to put off a quick vote and rewrite the proposal during the summer.

The contrast with the process by which Obamacare was considered and enacted—a process Republicans had assailed, in McConnell’s words, as “a disservice to the American people”—could not have been more stark. The New York Times’s Robert Pear noted that “in June and July 2009, with Democrats in charge, the Senate Health Committee spent nearly 60 hours over 13 days marking up the bill that became the Affordable Care Act.” The Senate Finance Committee, he wrote, had worked on the legislation for eight days, “its longest markup in two decades.” Before passing the Affordable Care Act on December 24, 2009, the Senate debated it for 25 days, “considered more than 130 amendments and held 79 roll-call votes.” That the Senate in the Trump years would set out to upend the American health-care system largely in secret was a dramatic and genuinely shocking example of how the decay of norms is not an abstract problem. It threatens the most basic commitments of our democracy.

House Republicans started the Obama years in the minority and without the weapons available to their counterparts in the Senate. But they took an equally deliberate approach to blocking, stalling, and discrediting the new president’s program. Led by Eric Cantor, Kevin McCarthy, and Paul Ryan—they called themselves the “Young Guns”—they sought and usually achieved perfect party discipline against every major Democratic initiative. . .

Continue reading.

Written by Leisureguy

29 January 2019 at 12:10 pm

How Toxic Masculinity Threatens Peace in Afghanistan

leave a comment »

Elizabeth Weingarten writes in the New Republic:

What does it mean to be a man? In the United States, that’s a debate recently stoked by a Gillette ad about harmful masculine norms, as well as the American Psychological Association’s new guidelines to help therapists work with men and boys in a culture that tells them to hide their emotions and pain. But though it’s a question some dismiss as philosophical rather than practical, or a badge of “political correctness” culture, research in the past several years has suggested it’s also a question with profound implications for international relations: Put simply, how men define their roles—and whether they’re able to live up to them—can have real consequences for national security. And in some of the theaters in which the United States has tested its military prowess in the past two decades, goals may be foiled not by the mechanics of fourth-generation warfare, but what may seem a much more pedestrian issue: gender.

On January 29, the gender equality NGO Promundo released a new report showing that younger men in Afghanistan are less likely than their fathers to support gender equality, and that both women and men still define men’s roles in traditional terms—as the breadwinners and protectors of their families. The report came a day after the announcement Tuesday that U.S. and Taliban representatives had tentatively agreed to a peace framework.

Two-thirds of the men Promundo surveyed agreed or strongly agreed with the statement, “women in Afghanistan have too many rights.” Younger men “associate the dilution of their culture with the spread of women’s rights and gender equality ideals,” said Sayed Idrees Hashimi, a Promundo report co-author and project manager at the Opinion Research Center of Afghanistan. And these findings, in turn, have troubling implications for security.

In Afghanistan, “real men” can be narrowly defined by their ability to provide for and protect their families. For many men, living up to that socially sanctioned definition amidst inexorable physical and economic insecurity is impossible: They don’t have the money to pay a bride dowry, can’t find a job, or they cannot protect their family from extremist violence or insurgencies. “If you’re a 17, 18, or 20-year-old man in Afghanistan right now, it’s a crippling identity moment for you,” explained Brian Heilman, one of the study authors and a senior research officer at Promundo.  “You feel entitled to certain elements of ‘manhood’ that you can’t actually achieve in your social environment.” Often insecure and humiliated, these men can seek power from another source—the subordination of women, and often, from extremist organizations.  “Gender bias and violent extremism are two sides of the same coin,” one Afghan man who worked as a U.S. government advisor for its Promote project, designed to empower Afghan women through training and by connecting them with educational and economic opportunities, told me.

The Promundo research, which included a nationally representative household survey of 1,000 male and 1,000 female participants, focus group discussions with both men and women, and other interviews with men, complements other findings that Afghan gender norms, which many thought the fall of the Taliban would improve, have resisted change: A 2016 Afghanistan Research and Evaluation Unit (AREU) study showed Afghan men across generations believed men to be superior to women when it came to leadership qualities and levels of education and thought that men held the primary responsibility for the security of their families. More than half of young and more mature men thought wife-beating was acceptable. “Our talks and discussions about women’s rights are all as slogans but nothing in action,” one AREU focus group participant told researchers. “Here, if a stranger bothers my wife or sister as he stares at them on their way home, I cannot tolerate that; I would have to kill him, or else I am not called a man in my community… .”

In recent years, political science research has increasingly suggested a correlation between gender equality and a number of indicators of stability and prosperity: GDP per capita, growth rates, and low corruption. Political scientist Mary Caprioli, to cite just one example, has found that increased political, economic, and social gender equality makes states less likely to resort to military options in international conflicts and crises, and less likely to experience civil conflict. There’s also more specific evidence that regressive gender norms and expectations around masculinity play into terrorist recruitment: Nearly all of the former jihadi fighters interviewed in a 2015 Mercy Corps study cited a common justification for their decision to travel to Jordan and Syria to fight—protecting Sunni women and children. “Those men who went to fight, those are real men,” one young man in Ma’an told researchers.

Some researchers have found that young men have more open and flexible attitudes about gender equality and masculinity until they reach puberty. In Afghanistan around that age, young men “begin to understand that they are never going to be accepted unless they marry and become head of a household,” Texas A&M University Professor Valerie Hudson told me. “That means they will have to come up with a bride price, which may be the equivalent of several years’ income, in addition to the cost of the wedding itself, which may involve up to 1000 guests.” Hudson’s research suggests that bride price “is a catalyst for conflict and instability”; rising prices make it harder for men who are un- or underemployed to come up with the money to pay for a bride, and more likely that they’ll turn to an extremist group that promises them either money or brides in exchange for service. Unraveling “the web of incentives and disincentives that men are given in Afghan culture,” she said, is key to understanding the patterns behind instability and extremist recruitment in the region.

Despite the relevance of gender inequality for U.S. security policy and strategy in Afghanistan, prioritizing gender norms in the military’s strategy to stabilize the area isn’t as simple as it might seem. Masculinity, anywhere, is a difficult subject.  “We’ve floated talking about masculinity in the military,” one female naval commander told me. “It doesn’t go over very well. People get defensive pretty much immediately, and make it personal and visceral. It’s part of their identity.” That makes it difficult, she said, to address strategic blindspots and approach problems like violent extremism or conflict reconstruction holistically: “If we aren’t having those conversations, especially when you’re talking about dealing with male-dominated organizations, like militaries, police sectors and government, we open ourselves up to missing things,” she said. “In the countering violent extremism fight, what it means to be a man is a lot of times directly related to women. When terrorists use women and rape as a weapon of war, there is a reverberation and impact on men in society—the men who weren’t able to protect those women, and who have to resort to violence to feel like real men. That needs to be explored to really understand the problem and begin to address solutions to the instability.”

Some women, too, hesitate to integrate discussions of masculinity into U.S. foreign policy and programming, fearing it could overshadow or detract from the conversation about the needs and experiences of women and girls. “There’s a philosophical tension there,” said Jamille Bigio, a senior fellow at the Council on Foreign Relations who previously worked on the White House National Security Council staff. Even in countries that are progressive when it comes to feminist foreign policy, like Canada and Sweden, the idea has been to talk more about women and girls’ needs, rather than “feminist principles, which are different. Integrating feminist principles would start a different conversation about gender norms and gender roles,” one that would systematically include men, she told me.

And in the end, challenging gender norms, and getting the buy-in necessary to shift them a bit, is not easy. Gender equality and security at the national level starts in the household—with egalitarian partnerships. But men benefit from household inequality—at least in the short-term. Spending less time on household labor frees them up to access more economic, social, and political opportunities, begetting more power and privilege outside of the home. (At the same time, they lose out in the long term on the benefits of sharing equal parenting responsibilities, for instance, and in living in a society that’s more stable, secure and productive.)  And women participate in gender-policing, too. Belquis Ahmadi, a pioneer of masculinity research in Afghanistan who works at the United States Institute of Peace, told me that some Afghan women viciously ridicule men in their household who attempt to help with domestic work or who act more sensitively towards their wives. “In some parts of Afghanistan, a man who helps with the chores is called Zancho—which means a man with female characteristics,” Ahmadi said. “That’s considered the worst thing you can call a man.”

So how to fix the problem? . . .

. . .

Continue reading.

Written by Leisureguy

29 January 2019 at 11:15 am

Climate change apparently not a hoax after all: Continental US 2.8ºF warmer, seas 9″ higher, …

leave a comment »

The Washington Post has a striking multimedia article—print, video, sound—on what global warming is doing to the US. Climate change is, of course, a global problem, not just a US problem, but its effects are clearly visible in the US, and those effects will keep getting worse for the next 60 years even if we stopped burning fossil fuels today. Instead, we are burning even more and increasing the amount of CO2 we put into the atmosphere.

I’m 79. It’s my grandchildren who will suffer the most—and your grandchildren as well.

Written by Leisureguy

29 January 2019 at 10:53 am

U.S. Intelligence Chiefs Contradict Trump on North Korea and Iran

leave a comment »

Julian Barnes and David Sanger report in the NY Times:

A new American intelligence assessment of global threats has concluded that North Korea is “unlikely to give up” all of its nuclear stockpiles, and that Iran is not “currently undertaking the key nuclear weapons-development activity” needed to make a bomb, directly contradicting two top tenets of President Trump’s foreign policy.

Daniel R. Coats, the director of national intelligence, also challenged Mr. Trump’s insistence that the Islamic State had been defeated, a key rationale for his decision to exit from Syria. The terror group, the annual “Worldwide Threat Assessment” report to Congress concluded, “still commands thousands of fighters in Iraq and Syria,” and maintains eight branches and a dozen networks around the world.

Mr. Trump is expected to meet next month with Kim Jong-un, the North Korean leader, in a second round of direct negotiations aimed at ridding Pyongyang of its nuclear weapons.

But Mr. Coats told the Senate Intelligence Committee on Tuesday that “we currently assess North Korea will seek to retain its W.M.D. capability and is unlikely to completely give up its nuclear weapons and production capability.”

“Its leaders ultimately view nuclear weapons as critical to regime survival,” Mr. Coats said.

Mr. Coats said Iran continued to sponsor terrorism in Europe and the Middle East, supporting the Houthis in Yemen and Shiite militants in Iraq. He also said that he believed Iranian hard-liners would continue to challenge centrist rivals.

“We do not believe Iran is currently undertaking the key activities we judge necessary to produce a nuclear device,” Mr. Coats said, but he added that Iranian officials have “publicly threatened to push the boundaries” of the nuclear deal it struck with world powers in 2015 if it did not see the benefits it expected.

Mr. Trump withdrew the United States from that agreement last year. He called it “defective at its core” and said if the deal remained in place, Iran “will be on the cusp of acquiring the world’s most dangerous weapons.” The agreement still stands, largely with support from European capitals.

Perhaps the strongest rebuke of Mr. Trump’s security priorities comes in what is missing from the report: Any rationale for building a wall along the southwest border, which Mr. Trump has advertised as among the most critical security threats facing the United States. The first mention of Mexico and drug cartels comes on page 18 of the 42-page assessment, well after a range of other, more pressing threats are reviewed.

Most pressing, as they have been for the past five years, are cybersecurity threats to the United States. For the first time, the report concluded that China is now positioned to conduct effective cyberattacks against American infrastructure, and specifically cited Beijing’s ability to cut off natural gas pipelines, at least briefly.

The assessment also argues that while Russia’s ability to conduct cyberespionage and influence campaigns is similar to the one it ran in the 2016 American presidential election, the bigger concern is that “Moscow is now staging cyberattack assets to allow it to disrupt or damage U.S. civilian and military infrastructure during a crisis.” . . .

Continue reading.

Written by Leisureguy

29 January 2019 at 9:33 am

Another luxuriant lather

with one comment

Yesterday’s lather from my QED Vetiver shave stick was so thick and creamy and wonderful that I had to reprise the experience, so today I used my QED Vanilla shave stick: different fragrance but same wonderful lather, this time created with my Kent Infinity synthetic, which, like The Grooming Co. synthetic I used yesterday, is fairly resilient. I imagine the resilience contributes but the heavy lifting is done by the very fine glycerin soap QED once offered.

Now, of course, I’m wondering whether it is QED soap in particular, or good glycerin-based soaps in general. Tomorrow I’ll use a different glycerin soap and see.

Three passes with the RazoRock Old Type left a perfectly smooth face—this really is a first-rate razor at a surprisingly low price.

A splash of Anthony Gold’s Red Cedar aftershave, and I’m off to a late start…

Written by Leisureguy

29 January 2019 at 9:20 am

Posted in Shaving
