Later On

A blog written for those whose interests more or less match mine.

Archive for the ‘Science’ Category

FBI admits flaws in hair analysis over decades


Spencer Hsu reports in the Washington Post:

The Justice Department and FBI have formally acknowledged that nearly every examiner in an elite FBI forensic unit gave flawed testimony in almost all trials in which they offered evidence against criminal defendants over more than a two-decade period before 2000.

Of 28 examiners with the FBI Laboratory’s microscopic hair comparison unit, 26 overstated forensic matches in ways that favored prosecutors in more than 95 percent of the 268 trials reviewed so far, according to the National Association of Criminal Defense Lawyers (NACDL) and the Innocence Project, which are assisting the government with the country’s largest post-conviction review of questioned forensic evidence.

The cases include those of 32 defendants sentenced to death. Of those, 14 have been executed or died in prison, the groups said under an agreement with the government to release results after the review of the first 200 convictions.

The FBI errors alone do not mean there was not other evidence of a convict’s guilt. Defendants and federal and state prosecutors in 46 states and the District are being notified to determine whether there are grounds for appeals. Four defendants were previously exonerated.

The admissions mark a watershed in one of the country’s largest forensic scandals, highlighting the failure of the nation’s courts for decades to keep bogus scientific information from juries, legal analysts said. The question now, they said, is how state authorities and the courts will respond to findings that confirm long-suspected problems with subjective, pattern-based forensic techniques — like hair and bite-mark comparisons — that have contributed to wrongful convictions in more than one-quarter of 329 DNA-exoneration cases since 1989.

In a statement, the FBI and Justice Department vowed to continue to devote resources to address all cases and said they “are committed to ensuring that affected defendants are notified of past errors and that justice is done in every instance. The Department and the FBI are also committed to ensuring the accuracy of future hair analysis testimony, as well as the application of all disciplines of forensic science.”

Peter Neufeld, co-founder of the Innocence Project, commended the FBI and department for the collaboration but said, “The FBI’s three-decade use of microscopic hair analysis to incriminate defendants was a complete disaster.”

“We need an exhaustive investigation that looks at how the FBI, state governments that relied on examiners trained by the FBI and the courts allowed this to happen and why it wasn’t stopped much sooner,” Neufeld said.

Norman L. Reimer, the NACDL’s executive director, said, “Hopefully, this project establishes a precedent so that in future situations it will not take years to remediate the injustice.”

While unnamed federal officials previously acknowledged widespread problems, the FBI until now has withheld comment on the grounds that the findings might not be representative.

Sen. Richard Blumenthal (D-Conn.), a former prosecutor, called on the FBI and Justice Department to notify defendants in all 2,500 targeted cases involving an FBI hair match about the problem even if their case has not been completed, and to redouble efforts in the three-year-old review to retrieve information on each case.

“These findings are appalling and chilling in their indictment of our criminal justice system, not only for potentially innocent defendants who have been wrongly imprisoned and even executed, but for prosecutors who have relied on fabricated and false evidence despite their intentions to faithfully enforce the law,” Blumenthal said.

Senate Judiciary Committee Chairman Charles E. Grassley (R-Iowa) and the panel’s ranking Democrat, Patrick J. Leahy (Vt.), urged the bureau to conduct “a root-cause analysis” to prevent future breakdowns.

“It is critical that the Bureau identify and address the systemic factors that allowed this far-reaching problem to occur and continue for more than a decade,” the lawmakers wrote FBI Director James B. Comey on March 27, as findings were being finalized.

The FBI is waiting to complete all reviews to assess causes but has acknowledged that hair examiners until 2012 lacked written standards defining scientifically appropriate and erroneous ways to explain results in court. The bureau expects this year to complete similar standards for testimony and lab reports for 19 forensic disciplines.

Federal authorities launched the investigation in 2012 after The Washington Post reported that flawed forensic hair matches might have led to the convictions of hundreds of potentially innocent people since at least the 1970s, typically for murder, rape and other violent crimes nationwide.

The review confirmed that FBI experts systematically testified to the near-certainty of “matches” of crime-scene hairs to defendants, backing their claims by citing incomplete or misleading statistics drawn from their case work.

In reality, there is no accepted research on how often hair from different people may appear the same. Since 2000, the lab has used visual hair comparison to rule out someone as a possible source of hair or in combination with more accurate DNA testing.

Warnings about the problem have been mounting. In 2002, the FBI reported that its own DNA testing found that examiners reported false hair matches more than 11 percent of the time. In the District, the only jurisdiction where defenders and prosecutors have re-investigated all FBI hair convictions, three of seven defendants whose trials included flawed FBI testimony have been exonerated through DNA testing since 2009, and courts have exonerated two more men. All five served 20 to 30 years in prison for rape or murder.

University of Virginia law professor Brandon L. Garrett said the results reveal a “mass disaster” inside the criminal justice system, one that it has been unable to self-correct because courts rely on outdated precedents admitting scientifically invalid testimony at trial and, under the legal doctrine of finality, make it difficult for convicts to challenge old evidence.

“The tools don’t exist to handle systematic errors in our criminal justice system,” Garrett said. “The FBI deserves every recognition for doing something really remarkable here. The problem is there may be few judges, prosecutors or defense lawyers who are able or willing to do anything about it.”

Federal authorities are offering new DNA testing in cases with errors, if sought by a judge or prosecutor, and agreeing to drop procedural objections to appeals in federal cases.

However, biological evidence in the cases often is lost or unavailable. Among states, only . . .

Continue reading. There are interesting state-by-state graphics at the link.

Emphasis added. Prosecutors will probably not agree. Prosecutors (e.g., Kamala Harris, now a US Senator) have fought tooth and nail to keep innocent people in prison. See this post.

Written by LeisureGuy

20 January 2019 at 4:05 pm

How oat milk can help farmland


Tom Philpott writes in Mother Jones:

Move over, almond and soy milk: An oat milk boom, as I argued in a piece last year, could help the Midwest solve some of its most dire agricultural issues. And now there’s new research out this month to help support the case for covering the region with oats.

In states like Iowa, fertilizer runoff from corn and soybean farms pollutes drinking water and feeds algae blooms, fouling water from local lakes and rivers down to the Gulf of Mexico. These farms also lose soil to erosion at an alarming rate, compromising the region’s future as a crucial hub of the US food system.

Back in 2013, I reported on “one weird trick” that could go a long way toward solving these problems: biodiversity. When farmers add more crops to their dominant corn-soybean rotation, it disrupts weed and pest patterns and means they can use fewer pesticides. It also frees up space for planting legumes, which capture nitrogen from the air and reduce the need for synthetic fertilizer. One great contender for this third crop is oats.

Earlier this month, researchers from Iowa State University and the University of Minnesota came out with a paper that adds more weight to the case for diversification. The paper reports on results from trial plots established in 2002 by Iowa State at a farm outside Ames. In one swath, the ground was planted in a two-year rotation of corn and soybeans, the standard recipe in the Midwest. In another, a three-year rotation held sway: corn, soybeans, and oats inter-planted with red clover, a legume. In the final one, the rotation was extended to four years, adding a round of alfalfa, another legume, and a forage crop for cattle.

The paper found that the longer rotations—the ones with the added crops—bring the following benefits:

Water pollution drops dramatically

Nitrogen fertilizer is a key crop nutrient, and when it’s washed away into the Midwest’s rivers and streams, it also supercharges algae growth, especially in salt water. That’s bad news for the Gulf of Mexico, where these waterways ultimately drain. Since Midwestern agriculture intensified in the 1970s, annual dead zones have been appearing in the Gulf, where decaying algae suck oxygen out of the water and turn huge swaths of it fetid. The annual Gulf dead zone fluctuates in size based on weather patterns. Last year’s turned out to be below average in area covered—but it was still the size of Delaware. In 2017, the dead zone set an all-time record, clocking in at a size four times larger than the federal target for a healthy Gulf ecosystem.

In the Iowa State farm study, the plots managed with three- and four-year rotations lost 39 percent less nitrogen to runoff than the corn-soybean control plots, partially because the presence of more nitrogen-fixing legumes in the mix reduces the need to apply synthetic nitrogen fertilizer.

And on these plots, 30 percent less phosphorus leaked away as runoff. Phosphorus is another key crop nutrient applied to farm fields, and it’s the main driver of blue-green algae blooms in freshwater bodies like lakes. These blooms produce toxins called microcystins, which, when ingested, cause nausea, vomiting, diarrhea, severe headaches, fever, and liver damage. Lakes downstream from farms throughout the Midwest have been increasingly saddled with these “harmful algae blooms” in recent years. Toledo struggles annually to keep microcystins out of its city water, which is drawn from algae-plagued Lake Erie. Freshwater blooms also generate massive amounts of methane, a greenhouse gas with 30 times the heat-trapping power of carbon dioxide.

Soil stays in place

According to Iowa State agronomist Richard Cruse, Iowa farms lose topsoil at an average rate of 5.7 tons per acre annually, versus a natural rate of regeneration of about 0.5 tons per acre per year. As soil washes away, farmland doesn’t sponge up or hold water as well, making it more vulnerable to droughts. Erosion is already reducing crop yields in Iowa, Cruse’s research has found—an effect that will accelerate if the trend continues. On the Iowa State plots planted with oats, clover, and alfalfa, erosion rates decreased by 60 percent.

Crop yields improve—and so could the bottom line

The diverse plots in the study delivered higher yields of corn and soybeans (in the years when those crops are grown), and also required drastically lower amounts of off-farm inputs like fertilizers and herbicides. (A 2012 paper on the same group of test plots found that the diverse fields require 88 percent less herbicides because the addition of another crop disrupts weed patterns.) As a result, the authors found that the more diverse plots were slightly more profitable than the control ones.

Natalie Hunt, a University of Minnesota researcher and a co-author on the study, told me that the economic analysis assumed that the oats and alfalfa generated by the biodiverse plots would find a profitable use by being fed to cattle and hogs “on-farm or on neighboring farms.” That setup works best for diversified operations that include crops as well as livestock. A farm that planted alfalfa during its fourth year of rotation, for example, could “harvest” it by simply turning cattle loose on it for munching; and the resulting beef provides an income stream.

But such farms are increasingly rare in states like Iowa, which are made up mainly of huge corn and soybean farms, and separately, an ever-growing number of massive confined hog farms, highly geared toward consuming that corn and soy.

Another obstacle, Hunt says, is the “heavily taxpayer-subsidized crop insurance programs that keep farmers locked into a corn- and soybean-producing system year after year, even when market prices are poor,” as they have been for the past several years.

She adds, though, that if consumers demanded food from the Midwest that didn’t pollute water and damage soil, the “market would respond pretty quickly”—that is, if farmers could get a premium price for crops, meat, and milk “grown with biodiversity” or some such label, farmers would have incentive to add them to their rotations. And that was precisely the thesis of my oat milk piece. I calculated that turning grain into a beverage doesn’t require nearly enough product to create a demand surge sufficient to bring oats to millions of acres of Midwestern farmland; however, it could be a lever to raise consumer awareness of the ecological damage endemic in the Midwest. . .

Continue reading.

Written by LeisureGuy

20 January 2019 at 2:06 pm

What Life Is Like When Corn Is off the Table


Sarah Zhang writes in the Atlantic:

When Christine Robinson was first diagnosed with a corn allergy 17 years ago, she remembers thinking, “No more popcorn, no more tacos. I can do this.”

Then she tried to put salt on her tomatoes. (Table salt has dextrose, a sugar derived from corn.) She tried drinking bottled iced tea. (It contains citric acid, which often comes from mold grown in corn-derived sugar.) She tried bottled water. (Added minerals in some brands can be processed with a corn derivative.) She ultimately gave up on supermarket meat (sprayed with lactic acid from fermented corn sugars), bagged salads (citric acid, again), fish (dipped in cornstarch or syrup before freezing), grains (cross-contaminated in processing facilities), fruits like apples and citrus (waxed with corn-derived chemicals), tomatoes (ripened with ethylene gas from corn), milk (added vitamins processed with corn derivatives). And that’s not even getting to all the processed foods made with high-fructose corn syrup, modified food starch, xanthan gum, artificial flavorings, corn alcohol, maltodextrin—all of which are or contain derivatives of corn.

“It’s such a useful plant,” Robinson says of corn. “It can be made into so very, very many things that are, from my perspective, trying to kill me.”


Corn allergies are relatively rare, and ones as severe as Robinson’s are rarer still. (Many people unable to eat whole corn can still tolerate more processed corn derivatives.) But to live with a corn allergy is to understand very intimately how corn is everywhere. Most of the 14.6 billion bushels of corn grown in the U.S. are not destined to be eaten on the cob. Rather, as @SwiftOnSecurity observed in a viral corn thread, the plant is a raw source of useful starches that are ubiquitous in the supply chain.

It’s not just food. Robinson told me she is currently hoarding her favorite olive-oil soap, which she has used for 17 years but which recently went out of stock everywhere. (A number of soap ingredients, such as glycerin, can come from corn.) She’s been reading up on DIY soapmaking. A year ago, the brand of dish soap she liked was reformulated to include citric acid, so she had to give that up, too. And navigating the hospital with a corn allergy can be particularly harrowing. Corn can lurk in hand sanitizer (made from corn ethanol), pills (made with cornstarch as filler), and IV solutions (made with dextrose). A couple of years ago, she went to see a specialist for a migraine, and her doctor insisted she get an IV that contained dextrose.

“And while in the midst of a migraine I had to argue with a doctor about the fact [that] I really could not have a dextrose IV,” she said. In the moment, she realized how absurd it was for her to be telling a world-class specialist to change her treatment.


Because corn allergies are rare, many doctors are not familiar with the potential scope. Robinson said she was the first case her original doctor had ever seen in 38 years of practice, and he didn’t know to advise her against corn derivatives. Even official sources of medical information can be confusing, telling corn-allergy patients that they do not need to avoid cornstarch and high-fructose corn syrup. Misinformation abounds in the other direction, too, because corn allergies can be easy to misdiagnose and easy to self-diagnose incorrectly. All this means that corn-allergy sufferers encounter a good deal of skepticism. But Robert Wood, president of the American Academy of Allergy, Asthma & Immunology and a pediatric allergist at Johns Hopkins, told me that derivatives such as corn syrup can indeed cause problems for certain people.

People with corn allergies have naturally been finding one another on the Internet. A Facebook group called Corn Allergy & Intolerance (Maize, Zea Mays) now has nearly 8,500 members. Becca, a tech worker in Washington State, writes a fairly prominent blog called Corn Allergy Girl. (She asked that I not use her last name because she doesn’t want her health status to affect her professional life.) The blog collates years of Becca’s research into corn allergies, as well as resources inherited from other, now-defunct corn-allergy blogs.

Members of the Facebook group have also forged ties with individual farms. Once a year, Robinson said, a farmer in California sends members of the group a big box of avocados that have not been exposed to corn-derived ethylene gas or waxes. “It’s a great month when you’re trying to get through all of them,” she said. For the rest of the time, she gets most of her food from a CSA with a local farm in Pennsylvania.

Becca, who writes Corn Allergy Girl, also gets a lot of her produce from local farms. The rest she grows. She goes to a specific butcher and meat processor who will custom-process whole animals for her without using lactic acid or citric acid. . . .

Continue reading.

Written by LeisureGuy

19 January 2019 at 4:41 pm

It Used to Be Okay for Parents to Play Favorites


Jennifer Traig writes in the Atlantic:

The fight might be over the last fruit strip or the TV or the best chair in front of the TV; it doesn’t really matter. My children’s conflict has many causes but only one true one: They are siblings, and that’s what siblings do. The war between brothers and sisters is eternal, each generation renewing the hostilities that have defined sibling relations since humanity began.

Although it seems as if my children never give it a rest, in fact they fight far less than the average. Statistically, they should be arguing more than three times an hour, a number researchers landed on not by interviewing children or parents but by installing microphones in the subjects’ homes. Younger children fight even more—six times each hour. This means they have a fight—a real fight, not just cross words—every 10 minutes.

It is very disturbing when the people you love most in the world turn savagely on one another, and from the parents’ perspective, it makes no sense. They’re fighting for the affection, attention, and material goods that their parents supply, all of which said parents are in no mood to hand over after a few hours of constant bickering.

From the combatants’ point of view, however, the conflict is unavoidable. Children fight because they’re wired to. Sibling rivalry is an evolutionary imperative, an innate impulse. We’re programmed to turn on the usurpers who compete with us for precious resources like food and parental attention, and we begin early. By six months, infants get upset when their mother pays attention to a baby doll. By 16 months, they know what bothers their siblings and will annoy them on purpose.

For many of us, our relationships with our siblings are the most profound relationships in our lives, more important and influential than the ones we have with our parents. They are in fact the only relationship many of us have for life, with someone who’s around from the beginning until the end. Humans generally maintain lifelong sibling relationships; we’re the only species that does. Which gives us a long, long time to hold a grudge.

Sibling conflict is not unique to humans, and humans are nowhere near as bad as some other animals are. Many animal siblings actually kill each other, often while the parents look on blithely. In certain bird species, sibling murder is so common it’s known as obligate siblicide. Black eagles are particularly vicious. In one of the few observed accounts, the slightly older chick attacked its slightly younger sibling 38 times over the younger’s three-day life span, delivering 1,569 blows with its sharply hooked beak. There was, by the way, more than enough food for both.

Sand tiger sharks commit sibling murder on a far greater scale, beginning before they’re even born. They play an in utero version of the Hunger Games, using their nascent teeth to chomp up all the sibling embryos they can. The shark that’s eventually born is just the last one standing. How did researchers figure this out? A biologist dissecting a pregnant shark was bitten by an embryo, still swimming around in the uterus, still looking for siblings to eat. Pigs are vicious, too, born with teeth that are angled to gash littermates while they nurse.

Sibling rivalry is common to all living things, even plants, which will chemically poison competing offspring to divert resources to themselves. Even bacteria fight with their bacterial siblings, resorting, like sharks, to cannibalism and fratricide.

Human siblings rarely resort to murder, and even more rarely to cannibalism, but they certainly scrap. For most of history, however, sibling conflict was subject to little examination and even less concern. Given how incredibly annoying it is (it is the part of parenting I hate most, which is saying something, given that parenting is a job that also requires cleaning diarrhea out of neck folds), it seems surprising that there’s so little complaining in the historical record. I can only assume that parents either didn’t see it as a problem, or didn’t see it as their problem.

While there’s little recorded evidence of parents trying to stop sibling conflict, there’s plenty of evidence that conflict occurred. Both myth and history are full of examples, with the Bible alone providing a good half-dozen case studies. Sibling conflict shows up in about 20 of the 50 chapters in Genesis. The very first homicide occurs between the very first brothers, Cain and Abel. Esau and Jacob, like sand tiger sharks, begin fighting while still in the womb. Later, Jacob favors his own son Joseph so blatantly that Joseph’s jealous siblings throw him into a well and sell him into slavery.

Sibling rivalry occurs in a lot of religious traditions and ancient mythologies. It informs both the Book of Mormon (the scripture) and The Book of Mormon (the musical). In the Hindu epic the Mahabharata, Arjuna kills his brother Karna, and in the Norse sagas, brothers are forever fighting and killing one another off. Romulus whacks Remus after they bicker over a wall. Zeus gets along with his siblings a bit better, marrying two of them (gross) and teaming up with the rest to fight his father in the War of the Titans. Once the war is over, however, the siblings go back to intra-familial turf wars and squabbling.

In the Bible, many of the fights are over parental affection, which is what psychology traditionally blamed for sibling rivalry, when it considered the topic at all. But recent studies indicate siblings are actually fighting over something simpler than that: toys. Eighty percent of sibling fights are over possessions. Parental affection comes in last as something worth fighting over, at a dismal 9 percent.

My children share little besides genetic material, and they don’t share very much of that. Siblings are not, in fact, that similar. My husband and I produced one cautious and thoughtful girl, and two years later rolled out our second model, a whirling tornado of a boy. I really don’t know what we were thinking putting them in the same bedroom. Although they agreed to the arrangement, it’s a little bit like co-housing Alan Dershowitz and Torquemada. . .

Continue reading. There’s much more.

Written by LeisureGuy

19 January 2019 at 8:58 am

Posted in Daily life, Science

Nitrate-Free Bacon: Myth or Reality?


Jennifer Curtis has a nice write-up at Firsthand Foods:

When the Firsthand Foods team began to get serious about making bacon, we decided, due to popular demand, to offer a “nitrate free” product. We came up with a bacon brine that uses celery juice powder as a natural preservative instead of sodium nitrite. The bacon that resulted is delicious! And it left us wanting to know more about exactly what’s going on with that celery powder. What are nitrites, and is our bacon really nitrate or nitrite free? Here’s what we’ve figured out from our research:

What are nitrates and nitrites and what do they do?

Sodium nitrate and sodium nitrite are salts that are used in curing or preserving meat and fish. Sodium nitrate is a naturally occurring mineral that exists in lots of green vegetables, which we (optimistically!) consume all the time. Sodium nitrite is derived from sodium nitrate and is the compound that actually has the antimicrobial properties desired in the production of bacon, hot dogs, salami, etc. In the case of salami, sodium nitrate is added during preparation and then breaks down during the fermentation process into sodium nitrite, which helps prevent the growth of the bacteria that cause deadly botulism. In the production of products like bacon, ham and hot dogs, which aren’t fermented, straight sodium nitrite is added. Besides preventing botulism, the presence of sodium nitrite provides the characteristic pink color and piquant “cured” flavor of these meat products.

Are nitrates and nitrites bad for you?

Yes and no. It turns out that nitrates exist in fairly large quantities in green vegetables. When we consume dietary nitrate, our body converts it into nitrite, which increases nitric oxide in our bloodstream and helps lower blood pressure. The stuff that’s actually bad for you is nitrosamine, which is created in the body when the nitrite from those green vegetables reacts with our acidic gastric juices. However, those vegetables also contain antioxidants, which keep nitrosamine production in check. Nitrosamines are also created when sodium nitrite is heated to a particularly high temperature. Nitrosamines have been linked in studies to cancer, and so they are considered carcinogenic. Because of these studies, the USDA has imposed limits on the amount of sodium nitrite in cured meats and requires the addition of ascorbic acid (an antioxidant) in order to restrict the amount of nitrosamines that we consume. The other thing worth mentioning about these studies that link nitrosamines to cancer is that a person eating enough bacon to ingest a dangerous amount of nitrite is eating, well, a lot of bacon. This equates to the consumption of a lot of meat, saturated fat, and salt, in addition to sodium nitrite. If you are consuming traditional nitrite-cured bacon, try to avoid burning it and be sure (surprise!) to maintain a balanced diet so that you’re eating antioxidant-containing vegetables and/or fruits along with that bacon.

What’s up with celery powder in “nitrate free” bacon, ham, etc…?

As mentioned above, green vegetables contain nitrates. If you want to cure meat without the pure synthesized form of sodium nitrite, the naturally occurring nitrate in celery can be used. During the curing process, the nitrates in celery powder break down into nitrites and provide all the benefits of botulism prevention, bright pink color and that delicious cured flavor. For full disclosure, the USDA does not consider celery powder or any other “natural” form of nitrate to be a curing or preserving agent but rather a flavoring agent.

So are there nitrates or nitrites in there or what?

Our products can be legally and technically labeled “nitrate-free,” because the brine we use contains no synthesized sodium nitrite. It contains celery powder (and thus “naturally occurring sodium nitrite”), sea salt, cherry juice powder (ascorbic acid), maple sugar and some spices.

But to be completely transparent about it, due to the basic rules of chemistry, products that include celery powder do end up containing naturally-occurring nitrate and its derivative, sodium nitrite. We could choose to make our bacon without celery powder but it would be gray in color and, quite honestly, not as tasty. We’ve opted for striking a balance between flavor, appearance, and ingredients that speak to our customers’ interests in a more natural product.

When it comes to bacon, ham and cured meats, we believe in providing our customers with wholesome, high-quality products made from welfare-approved, pasture-raised animals sourced from local farmers. We encourage you to indulge in these specialty products in small quantities balanced with a good dose of seasonal fruits and vegetables.

Written by LeisureGuy

18 January 2019 at 12:46 pm

Posted in Daily life, Food, Health, Science

Whys of seeing: Experimental psychology and art


Ellen Winner, professor of psychology at Boston College and senior research associate at Project Zero at the Harvard Graduate School of Education, has in Aeon an extract of her most recent book, How Art Works: A Psychological Exploration:

Many philosophical questions about the arts would benefit from some serious empirical research. One thinker who welcomed empirical findings was the art historian E H Gombrich (1909-2001), who was influenced by findings in experimental psychology showing that perception is a matter of inference rather than direct seeing. But all too often philosophers have relied on intuitions and hunches without seeking information about how people actually interact with works of art. If we want to understand the arts, it’s time to take experimental psychology seriously.

Today, experimental philosophers and philosophically inclined psychologists are designing experiments that can help to answer some of the big philosophical questions about the nature of art and how we experience it – questions that have puzzled people for centuries, such as: why do we prefer original works of art to forgeries? How do we decide what is good art? And does engaging with the arts make us better human beings?

Christ and the Disciples at Emmaus, believed to have been painted by Johannes Vermeer in the 17th century, hung in the Museum Boijmans Van Beuningen in Rotterdam for seven years; in 1937, it was admired by the Vermeer expert Abraham Bredius as ‘the masterpiece of Johannes Vermeer of Delft’. But in 1945, Han van Meegeren confessed that he had forged this painting (and many others), and argued that he should thus be deemed as great an artist as Vermeer. But this did not happen. The same work formerly revered was now derided.

There are two kinds of art forgeries: invented forgeries in the style of an established artist, and copy forgeries, which are reproductions of existing works. Most commonly, forgers such as van Meegeren produce invented forgeries. Copy forgeries are less common; these are more difficult to get away with since it is often known where the original resides. Moreover, because it is impossible to make a perfect copy by hand, one can always see (or hope to see) differences between the original and the copy, and use these differences to disparage the copy. The art critic Clive Bell in 1914 suggested that exact copies always lack life: the lines and forms in the original are caused by emotions in the artist’s mind that are not present in the copier’s mind. The philosopher Nelson Goodman in 1976 argued that, even if we can detect no physical difference between the original and the copy, just knowing that one painting is the original and the other is the copy tells us that there could be subtle differences that we cannot see now but that we might learn to see later. This knowledge shapes our aesthetic experience of what we believe to be a direct copy.

The puzzle posed by forgery is this: why does our perception and evaluation of an artwork change simply by learning it is a forgery? After all, the work itself has not changed. Philosophers have taken two broad positions on this question.

According to the formalist position, when the original and the forgery are visually indistinguishable, they are not aesthetically different. For example, Monroe Beardsley in 1959 argued that we should form our aesthetic judgments only by attending to the perceptual properties of the picture before us, and not by considering when or how the work was made or who it was made by. So why did people change their evaluation of the Vermeer painting once van Meegeren confessed to being the artist? According to Alfred Lessing, writing in 1965, this response can be chalked up to social pressures: ‘Considering a work of art aesthetically superior because it is genuine, or inferior because it is forged, has little or nothing to do with aesthetic judgment or criticism. It is rather a piece of snobbery.’ This view assumes that artworks have perceptual properties that are unaffected by our knowledge about the background of the work.

According to the historicist position, what we perceive in a work is influenced by what we know about it. Despite the original and the forgery being visually indistinguishable, they are aesthetically different precisely because of what the formalists deny is relevant – our beliefs about who made the work, when, and how. The German critic Walter Benjamin in the 1930s argued that our aesthetic response takes into account the object’s history, ‘its unique existence in a particular place’. He believed that a forgery has a different history and thus lacks the ‘aura’ of the original. The philosopher and critic Arthur Danto took a similar historicist position in 1964 when he asked us to consider why a Brillo box by Andy Warhol that is visually identical to a Brillo box in a supermarket is a work of art. To determine that the box in the museum is a work of art ‘requires something the eye cannot descry – an atmosphere of artistic theory, a knowledge of the history of art: an artworld’. Denis Dutton claimed in 2009 that we perceive a forgery to be aesthetically inferior to an original because we take into account the kind of achievement the work represents – the artist’s process – and a forgery represents a lesser kind of achievement than an original.

Psychologists have stepped into the fray to determine whether the label ‘forgery’ affects our response to a work of art – and, if so, why. The first question is easier to answer than the second. Studies show that just telling people that a work is a forgery (or even the less-charged term ‘copy’) causes them to rate that work lower on a host of aesthetic dimensions. Artworks labelled forgeries or copies are rated as less good, less beautiful, less awe-inspiring, less interesting, lower in monetary value, and even physically smaller than the same image shown to other respondents as an ‘original’. In addition, brain activation changes: while the visual areas of the brain didn’t respond differently to Rembrandt paintings labelled ‘authentic’ versus ‘copy’, the label ‘authentic’ resulted in greater activation of the orbitofrontal cortex – an area associated with reward and monetary gain.

Clearly, people don’t behave how the formalists thought that they should. What is causing their appreciation to be diminished? One possibility is that our sense of forgery’s moral evil unconsciously influences our aesthetic response. Another is that our knowledge of forgery’s worthlessness on the art market has the same kind of unconscious effect. But if we could strip forgery of its connection with deception and lack of monetary value, would it still be devalued? And, if so, can we demonstrate that the historicist position is correct?

With my research team, I put this to the test by showing people two duplicate images of an unfamiliar art work side by side, telling them that the painting on the left was the first in a planned series of 10 identical works by a painter. Participants were then told one of three different stories about who made the work on the right: that it was by the artist, by the artist’s assistant, or by a forger. For those told it was the artist’s assistant, we specified that the assistant’s copy had the artist’s stamp on it, and that having a team of assistants was typical artistic practice (hence not fraudulent). The auction price of $53,000 was listed below all images (right and left) except for the forgery, which was listed at only $200.

We asked people to rate the copy relative to the original on six dimensions:

Which one is more creative?
Which one do you like more?
Which one is more original?
Which one is more beautiful?
Which one is the better work of art?
Which one is more likely to be influential?

Responses fell into two categories: broadly evaluative (what formalists called aesthetic) – with reference to beauty, goodness and liking; and historical-evaluative (what historicists called historical) – with reference to creativity, originality and influence. We reasoned that forgeries would always be the most devalued of the three kinds of copies because of their immorality and their lack of monetary worth. The artist’s copy, however, is like a forgery without these two marks against it. Thus, our key comparison was between responses to the artist’s versus the assistant’s copy, relative to the original.

We found that, on broadly evaluative dimensions, the artist’s and the assistant’s copies were rated identically – with no distinctions in beauty, liking or goodness. Thus, our participants behaved like formalists. Previous studies reporting lower beauty ratings for images labelled forgeries had presented works one at a time. But here, when the original and the forgery were presented simultaneously, people were forced to concede that there was no beauty difference.

On historical-evaluative ratings, however, the story was different. People rated the assistant’s copy as less creative, original and influential than the artist’s copy – even though both works were copies, both signed by the artist, and both worth the same monetarily. People now behaved as historicists, consistent with Danto’s position that visually identical Brillo boxes are not artistically identical.

These findings tell us that, when moral and monetary considerations are ruled out, there is still something wrong with a forgery. It’s not quite what Dutton thought, because while an original certainly represents a different kind of achievement from a forgery, there is really no difference in achievement between an artist’s copy and an assistant’s copy. Both are copies, after all. So what is it that’s wrong then?

I submit that it’s the aura that Benjamin talked about, which is dependent most critically on who made the work. Benjamin’s idea of ‘aura’ is consistent with what psychologists call essentialism – the view that certain special objects (eg, my wedding ring, or my childhood teddy bear) gain their identity from their history, and have an underlying nature that cannot be directly observed, a view developed extensively by the psychologist Susan Gelman. This is why we reject perfect replicas of such objects: we want the original. We appear to treat works of art this way too – as if they contain the essence of the artist, or the artist’s mind. We prefer the copy by the artist to the copy by the assistant because only the former contains that essence. This leads to the conclusion that just knowing that we are looking at a painting by Vermeer (even if it is a copy of a Vermeer by Vermeer) makes us feel like we are communing with Vermeer. Do we really want to find out that we were actually communing with van Meegeren?

These findings predict that we will not respond well to what the future is bringing us: three-dimensional prints of paintings virtually indistinguishable from the originals, and works of art generated by computers. These works will not allow us to infer the mind of the human artist.

The American art critic Peter Schjeldahl put this well when he wrote in 2008:

The spectre of forgery chills the receptiveness – the will to believe – without which the experience of art cannot occur. Faith in authorship matters. We read the qualities of a work as the forthright decisions of a particular mind, wanting to let it commandeer our own minds, and we are disappointed when it doesn’t.

If we read into a work of art the artist’s decisions, as Schjeldahl writes, then we are inferring a mind behind the work. Can we do this for abstract art? And, if so, can this help us distinguish art by great abstract expressionists from superficially similar art by children and animals?

Tension between those who revere and those who deride abstract art can be seen even among the most highly regarded art historians. In Art and Illusion (2000), Gombrich focused on representational art as a great human achievement, and disparaged abstract art as a display of the artist’s personality rather than skill. Contrast this to the attitude of the late American art historian Kirk Varnedoe, who was chief curator of painting and sculpture at the Museum of Modern Art from 1988 to 2001. In Pictures of Nothing (2006), Varnedoe responds explicitly to Gombrich’s challenge, writing that abstract art is a signal human achievement created in a new language, and filled with symbolic meaning. The ‘mind-boggling range of drips, stains, blobs, blocks, bricks, and blank canvases’ seen in museums of modern art are not random spills, he wrote. Rather, like all works of art, they are ‘vessels of human intention’ and they ‘generate meaning ahead of naming’. They represent a series of deliberate choices by the artist, and involve invention and evoke meanings – for example, energy, space, depth, repetition, serenity, discord.

Chimps, monkeys and elephants have all been given paints, brushes and paper on which to make marks. And their paintings, like those of preschoolers, bear a superficial resemblance to abstract expressionist paintings. Who hasn’t heard someone deride abstract art as requiring no skill at all, with statements such as ‘My kid could have done that!’

We wanted to find out whether people see more than they think they do in abstract art – whether they can see the mind behind the work. We created pairs of images that looked eerily alike at first glance. Each pair consisted of a painting by a famous abstract expressionist whose works were found in at least one major art-history textbook (eg, Mark Rothko, Hans Hofmann, Sam Francis, Cy Twombly, Franz Kline and others) and a painting either by a child or a nonhuman animal (chimp, gorilla, monkey or elephant). The question we asked was whether people would prefer, and judge as better, works by artists over works by children and animals. And, if so, on what basis? . . .

Continue reading.

Written by LeisureGuy

17 January 2019 at 12:34 pm

Posted in Art, Books, Science

How Exercise May Help Keep Our Memory Sharp

leave a comment »

Good to know. Gretchen Reynolds writes in the NY Times:

A hormone that is released during exercise may improve brain health and lessen the damage and memory loss that occur during dementia, a new study finds. The study, which was published this month in Nature Medicine, involved mice, but its findings could help to explain how, at a molecular level, exercise protects our brains and possibly preserves memory and thinking skills, even in people whose pasts are fading.

Considerable scientific evidence already demonstrates that exercise remodels brains and affects thinking. Researchers have shown in rats and mice that running ramps up the creation of new brain cells in the hippocampus, a portion of the brain devoted to memory formation and storage. Exercise also can improve the health and function of the synapses between neurons there, allowing brain cells to better communicate.

In people, epidemiological research indicates that being physically active reduces the risk for Alzheimer’s disease and other dementias and may also slow disease progression.

But many questions remain about just how exercise alters the inner workings of the brain and whether the effects are a result of changes elsewhere in the body that also happen to be good for the brain or whether the changes actually occur within the brain itself.

Those issues attracted the attention of an international consortium of scientists — some of them neuroscientists, others cell biologists — all of whom were focused on preventing, treating and understanding Alzheimer’s disease.

Those concerns had brought a hormone called irisin into their sphere of interest. Irisin, first identified in 2012 and named for Iris, the gods’ messenger in Greek mythology, is produced by muscles during exercise. The hormone jump-starts multiple biochemical reactions throughout the body, most of them related to energy metabolism.
Because Alzheimer’s disease is believed to involve, in part, changes in how brain cells use energy, the scientists reasoned that exercise might be helping to protect brains by increasing levels of irisin there.

But if so, they realized, irisin would have to exist in human brains. To see if it did, they gathered tissues from brain banks and, using sophisticated testing, found irisin there. Gene expression patterns in those tissues also suggested that much of this irisin had been created in the brain itself. Levels of the hormone were especially high in the brains of people who were free of dementia when they died, but were barely detectable in the brains of people who had died with Alzheimer’s.

Those tests, however, though interesting, could not tell scientists what role irisin might be playing in brains. So the researchers now turned to mice, some healthy and others bred to develop a rodent form of Alzheimer’s.

They infused the brains of the animals bred to have dementia with a concentrated dose of irisin. Those mice soon began to perform better on memory tests and show signs of improved synaptic health.

At the same time, they soaked the brains of the healthy animals with a substance that inhibits production of irisin and then pumped in a form of beta amyloid, a protein that clumps together to form plaques in the brains of those with Alzheimer’s. In effect, they gave the mice dementia. And, without any irisin in their brains, the once-healthy mice soon showed signs of worsening memory and poor function in the synapses between neurons in their hippocampus.

The scientists also looked inside individual neurons from healthy mice and found that, when they added irisin to the cells, gene expression changed in ways that would be expected to lessen damage from beta amyloid.

Finally and perhaps most important, the scientists had healthy mice work out, swimming for an hour almost every day for five weeks. Beforehand, some of the animals also were treated with the substance that blocks irisin production.

In the untreated animals, irisin levels in the brain blossomed during the exercise training and later, after the animals’ brains were exposed to beta amyloid, they seemed to fight off its effects, performing significantly better on memory tests than sedentary control mice that likewise had been exposed.

But the animals that had been unable to create irisin did not benefit much from exercise. . .

Continue reading.

Written by LeisureGuy

17 January 2019 at 11:09 am
