Later On

A blog written for those whose interests more or less match mine.

Archive for the ‘Science’ Category

The two sleeps of a natural night

leave a comment »

Interesting article by Clark Strand in the Washington Post:

What if you could meditate like a Tibetan lama with no instruction whatsoever — and without having to subscribe to any religious beliefs?

People hear a question like that and, unless they are particularly gullible, they assume they’re about to be scammed. But in this case there is nothing to buy — no tapes, no app, no religious agenda that gets sprung on you at the last moment when you’re feeling vulnerable and spiritually open. No hidden fees.

But that doesn’t mean there isn’t a catch. You have to be willing to revert to a Paleolithic pattern of sleep — and that means turning off your electric lights at dusk and leaving them off until dawn. Do that, and in about three weeks’ time, beginning around six hours after sunset each evening, you will find yourself experiencing a period of serene wakefulness that was once a nightly meditation retreat for all Homo sapiens on Earth. It’s a guarantee. It’s encoded in your genes.

During the mid-1990s, sleep researcher Thomas Wehr conducted a National Institutes of Health experiment that he later called an exercise in “archaeology, or human paleobiology.” Wehr wanted to find out if modern humans still carried within them the rhythms for a prehistoric mode of sleep. Did prehistoric humans sleep more? Did they sleep differently — or perhaps better?

Wehr’s logic was simple: Aided by the stimulating effects of all kinds of artificial lighting (everything from laptop screens to the bright lights of big cities), modern humans had compressed their sleep nights, like their work days, into convenient eight-hour blocks. And yet, given that light-assisted wakefulness was a relatively new invention, wasn’t it possible that human beings still carried in their DNA the remnants of a more primordial pattern of sleep?

The results were staggering. For one month, beginning at dusk and ending at dawn, Wehr’s subjects were removed from every possible form of artificial light. During the first three weeks, they slept as usual, only for about an hour longer. (After all, he reasoned, like most Americans, they were probably sleep deprived.) But at week four a dramatic change occurred. The participants slept the same number of hours as before, but now their sleep was divided in two. They began each night with about four hours of deep sleep, woke for two hours of quiet rest, then slept for another four.

During the gap between their “first” and “second” sleep, Wehr’s subjects were neither awake nor fully asleep. Rather, they experienced a condition they had never known before — a state of consciousness all its own. Later Wehr would compare it to what advanced practitioners experience in meditation — what you might call “mindfulness” today. But there weren’t any mindfulness practitioners in his study. They were simply ordinary people who, removed for one month from artificial lighting, found their nights broken in two.

While trying to account for the peace and serenity that his subjects reported feeling during their hours of “quiet rest,” Wehr discovered that . . .

Continue reading.

Clark Strand is the author of a book about these ideas: Waking Up to the Dark: Ancient Wisdom for a Sleepless Age.

Written by LeisureGuy

24 May 2015 at 9:41 am

Training Young Doctors: The Current Crisis

leave a comment »

Sometimes it seems that America has lost the capacity to address large social problems—infrastructure being a prime example. But look also at the continuing breakdown of the medical/healthcare system. Lara Goitein reviews a recent book in the NY Review of Books:

Let Me Heal: The Opportunity to Preserve Excellence in American Medicine
by Kenneth M. Ludmerer
Oxford University Press, 431 pp., $34.95

In the 1890s, Sir William Osler, now regarded as something of a demigod in American medicine, created at the Johns Hopkins Hospital a novel system for training physicians after graduation from medical school. It required young physicians to reside in the hospital full-time without pay, sometimes for years, to learn how to care for patients under the close supervision of senior physicians.

This was the first residency program. Despite the monastic existence, the long hours, and the rigid hierarchy, Osler’s residents apparently loved it. They felt exalted to be able to learn the practice of medicine under the tutelage of great physicians who based their teachings on science, inquiry, and argument, not tradition. And far from bridling at being at the bottom of the pyramid, they virtually worshiped their teachers, who in turn generally lavished great attention and affection on their charges. Osler’s innovation spread rapidly, and the residency system is still the essential feature of teaching hospitals throughout the country.

Residents are young doctors who have completed medical school and are learning their chosen specialty by caring for patients under the supervision of senior physicians, called attendings. Residents in their first year are called interns. As in Osler’s time, residents work long hours, although they no longer live in the hospital and are now paid a modest salary. The time this training takes varies—three years, for example, to complete a program in internal medicine. Following that, many go on to a few more years of training in subspecialties (for example cardiology, a subspecialty of internal medicine), and at this point they are called fellows.

Together residents and fellows, who now number about 120,000 across the country, are called house officers, and their training is termed graduate medical education (GME). The teaching hospitals where most of this takes place are often affiliated with medical schools, which in turn are often part of universities, and together they make up sometimes gigantic conglomerates, called academic medical centers.

Despite the fact that Osler’s idea lives on, there have been enormous changes over the years, and this is the subject of Kenneth Ludmerer’s meticulous new book, Let Me Heal. Ludmerer, a senior faculty physician and professor of the history of medicine at Washington University in St. Louis, sounds a warning. The Oslerian ideal of faculty and residents forming close relationships and thinking together about each patient is in trouble. Instead, residents, with little supervision, are struggling to keep up with staggering workloads, and have little time or energy left for learning. Attending physicians, for their part, are often too occupied with their own research and clinical practices—often in labs and offices outside of the hospital—to pay much attention to the house officers.

The implications for the public are profound. Nearly anyone admitted to a teaching hospital—and these are the most prestigious hospitals in the country—can expect to be cared for by residents and fellows. Whether house officers are well trained and, most important, whether they have the time to provide good care are crucial. Yet until Ludmerer’s book, there has been very little critical attention to these questions. It’s simply assumed that when you are admitted to a teaching hospital, you will get the best care possible. It’s odd that something this important would be regarded in such a Panglossian way.

Ludmerer refers to graduate medical education in the period between the world wars, following Osler, as the “educational era,” by which he means that the highest priority of teaching hospitals was education. Heads of departments were omnipresent on the wards, and knew the house officers intimately. A network of intense, often lifelong mentorships formed. Ludmerer gives a fascinating account of the propagation of talent; for example, William Halsted, the first chief of surgery at Johns Hopkins, had seventeen chief residents, eleven of whom subsequently established their own surgical residency programs at other institutions. Of their 166 chief residents, eighty-five became prominent faculty members at university medical schools. The influence of the giants of the era of education still reaches us through three, four, or five generations of disciples, and house officers quote Osler even today.

There was a strong moral dimension to this system. Ludmerer writes that “house officers learned that medicine is a calling, that altruism is central to being a true medical professional, and that the ideal practitioner placed the welfare of his patients above all else.” Commercialism was antithetical to teaching hospitals in the era of education. “Teaching hospitals regularly acknowledged that they served the public,” writes Ludmerer, “and they competed with each other to be the best, not the biggest or most profitable.”

Indeed, teaching hospitals deliberately limited their growth to maintain the ideal setting for teaching and research. Ludmerer offers the example of the prestigious Peter Bent Brigham Hospital in Boston (now named the Brigham and Women’s Hospital), which in its 1925 annual report declared that it had “more patients than it can satisfactorily handle…. The last thing it desires is to augment this by patients who otherwise will secure adequate professional service.” They also kept prices as low as possible, and delivered large amounts of charity care. With few exceptions, members of the faculty did not patent medical discoveries or accept gifts from industry, and regularly waived fees for poor patients.

To be sure, this golden age was not pure gold. These physicians were, on the whole, paternalistic toward patients; by today’s standards, many were elitist, sexist, and racist. But they were utterly devoted to what they were doing, and to one another, and put that commitment ahead of everything, including their own self-interest.

World War II brought great changes. In the postwar prosperity, the United States began to invest heavily in science and medicine, with rapid expansion of the National Institutes of Health (NIH), which in turn poured money into research at academic medical centers. In addition, the growth of health insurance led to more hospital admissions. In 1965, the creation of Medicare and Medicaid accelerated this growth enormously. According to Ludmerer, between 1965 and 1990, the number of full-time faculty in medical schools increased more than fourfold, NIH funding increased elevenfold, and revenues of academic medical centers from clinical treatment increased nearly two hundred–fold.

Initially, in the couple of decades following the war, the influx of money and the rapid growth simply gave momentum to the trajectory begun in the era of education. Reinforced by leaders who had trained during that era, the established traditions endured, and teaching hospitals for the most part defended their commitment to educational excellence and public service. However, the close-knit, personal character of graduate medical education began to unravel. By the late 1970s, academic medical centers began to take on the character of large businesses, both in their size and complexity, and in their focus on growth and maximizing revenue. Even if technically nonprofit, the benefits of expansion accrued to everyone who worked there, most particularly the executives and administrators. In 1980, Arnold Relman wrote a landmark article in The New England Journal of Medicine, warning of the emergence of a “medical-industrial complex.”

The growing commercialization of teaching hospitals was exacerbated by a change in the method of payment for hospital care. Health care costs were rising rapidly and unsustainably, and in the 1980s health insurers responded with what has been termed “the revolt of the payers.” Previously, most insurers had paid hospitals according to “fee-for-service,” in which payment was made for each consultation, test, treatment, or other service provided. But now Medicare and other insurers, in an effort to control costs, began to reimburse hospitals less liberally and by “prospective payment” methods, in which the hospital received a fixed payment for each patient’s admission according to the diagnosis. Whatever part of that payment was not spent was the hospital’s gain; if the hospital spent more, it was a loss. Hospitals now had a strong incentive to get patients in and out as fast as possible.
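
A minimal sketch of the incentive shift described here, with invented figures (none of the numbers come from the review): it contrasts per-service revenue under fee-for-service billing with the margin left over from a single fixed, diagnosis-based payment.

    # Hypothetical illustration only; every figure below is invented for the example.
    # Each service in one admission is (price billed to the insurer, cost to the hospital).
    admission = [(1200, 900), (400, 250), (800, 700)]

    def fee_for_service_revenue(services):
        # Under fee-for-service, each additional service adds revenue.
        return sum(price for price, _cost in services)

    def prospective_payment_margin(fixed_payment, services):
        # Under prospective payment, the hospital keeps whatever part of the fixed,
        # diagnosis-based payment it does not spend, and absorbs any overrun as a loss.
        total_cost = sum(cost for _price, cost in services)
        return fixed_payment - total_cost

    print(fee_for_service_revenue(admission))            # 2400: revenue rises with every service
    print(prospective_payment_margin(2000, admission))   # 150: fixed payment minus total cost

The less the hospital does and the shorter the stay, the larger that second number, which is exactly the incentive described above to get patients in and out as fast as possible.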

Quite suddenly, the torrent of clinical revenue that had so swollen academic medical centers slowed. Many hospitals did not survive in the new environment (the total number of US hospitals decreased by nearly 20 percent between 1980 and 2000). Those that stayed afloat did so by promoting high-revenue subspecialty and procedural care, for example heart catheterization and orthopedic and heart surgery, which were still lucratively rewarded. They also developed more extensive relationships with pharmaceutical and biotech companies and manufacturers of medical devices, which paid them for exclusive marketing rights to drugs or technologies developed by faculty, as well as access to both patients and faculty for research and marketing purposes. . . .

Continue reading.

Written by LeisureGuy

24 May 2015 at 6:37 am

Robert Solow in Conversation with Paul Krugman: “Inequality: What Can Be Done?”

leave a comment »

Written by LeisureGuy

23 May 2015 at 1:29 pm

Posted in Daily life, Science

Continuing fallout from the failure of the War on Drugs

leave a comment »

A bankrupt policy that costs $15 billion per year and results in increasing drug use. Jon Lee Anderson reports in the New Yorker:

In 1971, President Nixon announced the U.S. “war on drugs,” which every President since has carried forward as a battle standard. Until recently, most Latin American governments have coöperated, and in return have received intelligence, equipment, and, perhaps most importantly, financial assistance. The overall investment has been huge—the federal government now spends about fifteen billion dollars on it each year—with the net result that drug use has proliferated in the U.S. and worldwide. In the drug-producing countries, where drug consumption was negligible at the start of the American effort, the criminal narcoculture has attained ghoulishly surreal proportions.

Over the course of the past few years, a growing number of Latin American governments have begun to challenge U.S. policy and to call for a radical rethinking of the war on drugs, including widespread decriminalization. A handful of leftist governments, such as those of Venezuela, Ecuador, and Bolivia, have gone so far as to end their coöperation with the U.S. Drug Enforcement Administration, alleging that U.S. drug policy is a new form of Yankee imperialism. Uruguay, under the former President José Mujica, became the first country to legalize state-sponsored production, sale, and use of marijuana.

The latest opposition to the forty-five-year-old drug war came not from a government that is hostile to the U.S. but from its most steadfast ally in the Americas, Colombia. On May 14th, President Juan Manuel Santos announced that his government was halting its longstanding practice of spraying the country’s illicit coca crop with chemicals to kill the plants. The spraying began in the late nineties under the U.S.-sponsored Plan Colombia, which aimed to wipe out the country’s drug culture and its guerrillas, who largely depend on narcotráfico for their survival. Santos made the announcement after U.N. scientists confirmed what critics of spraying had long alleged: that glyphosate, a key ingredient in the herbicide known as Roundup, is probably carcinogenic to humans.

Colombia was the last country in the world to use chemical spraying to combat illegal drug cultivation. Citing health hazards and damage to impoverished rural economies, both Bolivia and Peru, which also grow coca, have banned aerial spraying. Afghanistan, the world’s chief supplier of opium, overrode American protests to ban spraying in 2007. The Karzai government argued that the program drove poor Afghan farmers into the hands of the Taliban by destroying their livelihoods without offering realistic economic alternatives. Similar arguments have long been made in Colombia, where millions of farmers have been driven from their land to live in urban slums.

The U.S. State and Defense Departments, which jointly oversee Plan Colombia, have always lobbied heavily in favor of spraying, which is outsourced to the giant U.S. security contractor DynCorp. DynCorp has earned hundreds of millions from its Colombian contracts, just as it previously did in Afghanistan, where it also won the government contract to implement counter-narcotics strategy. Notably, after President Santos announced the halt to spraying, the U.S. Ambassador to Colombia, Kevin Whitaker, published an Op-Ed in the leading Colombian newspaper, El Tiempo, arguing in favor of continuing the spraying campaign while saying that the U.S. would continue working closely with Colombia in spite of the recent decision. Whitaker ended his Op-Ed with the English phrase “We have your back.”

So who is to be believed about the war on drugs, and what is the right way forward? After almost twenty years, many deaths, and billions of dollars spent under Plan Colombia, has illicit coca production decreased in Colombia? Overall, yes, according to the plan’s proponents: in his piece, Whitaker asserted that the area under cultivation for illegal coca production was reduced by half between 2007 and 2013. But studies also show that that area increased by thirty-nine per cent last year—so the most recent trends aren’t good. And if one third of the initial cultivation area is still left, that means that a significant amount of cocaine is still coming out of Colombia, and will be for the foreseeable future. . .

Continue reading.

Maybe we’re going about drugs all wrong?

Written by LeisureGuy

23 May 2015 at 12:30 pm

Everything Sounds the Same When You’re Depressed

leave a comment »

Interesting finding: The cocktail-party effect—the ability to focus on one particular conversation in the babble of a cocktail party—is beyond the capability of those suffering from depression.

Written by LeisureGuy

22 May 2015 at 8:31 am

Posted in Mental Health, Science

Does being a jerk pay off, long term?

leave a comment »

Looking at the range of CEOs, legislators, judges, and so on, it would seem that in some cases being a jerk has paid off. Now the question is getting some serious study, though the results depend on how “success” is measured.

Written by LeisureGuy

21 May 2015 at 2:53 pm

Healthcare advice from two health reporters

leave a comment »

Via Kevin Drum, this report in Vox by Julia Belluz and Sarah Kliff. The two offer 8 lessons learned from years of reporting on medicine and healthcare. To take just one example, consider this chart from the second lesson, “2) Ignore most news stories about new health studies”:

[Chart: “Medical studies” infographic from the Vox article]

By all means, read the entire article.

Written by LeisureGuy

20 May 2015 at 10:33 am
