Later On

A blog written for those whose interests more or less match mine.

Archive for the ‘Health’ Category

Training Young Doctors: The Current Crisis


Sometimes it seems that America has lost the capacity to address large social problems—infrastructure being a prime example. But look also at the continuing breakdown of the medical/healthcare system. Lara Goitein reviews a recent book in the NY Review of Books:

Let Me Heal: The Opportunity to Preserve Excellence in American Medicine
by Kenneth M. Ludmerer
Oxford University Press, 431 pp., $34.95

In the 1890s, Sir William Osler, now regarded as something of a demigod in American medicine, created at the Johns Hopkins Hospital a novel system for training physicians after graduation from medical school. It required young physicians to reside in the hospital full-time without pay, sometimes for years, to learn how to care for patients under the close supervision of senior physicians.

This was the first residency program. Despite the monastic existence, the long hours, and the rigid hierarchy, Osler’s residents apparently loved it. They felt exalted to be able to learn the practice of medicine under the tutelage of great physicians who based their teachings on science, inquiry, and argument, not tradition. And far from bridling at being at the bottom of the pyramid, they virtually worshiped their teachers, who in turn generally lavished great attention and affection on their charges. Osler’s innovation spread rapidly, and the residency system is still the essential feature of teaching hospitals throughout the country.

Residents are young doctors who have completed medical school and are learning their chosen specialty by caring for patients under the supervision of senior physicians, called attendings. Residents in their first year are called interns. As in Osler’s time, residents work long hours, although they no longer live in the hospital and are now paid a modest salary. The time this training takes varies—three years, for example, to complete a program in internal medicine. Following that, many go on to a few more years of training in subspecialties (for example cardiology, a subspecialty of internal medicine), and at this point they are called fellows.

Together residents and fellows, who now number about 120,000 across the country, are called house officers, and their training is termed graduate medical education (GME). The teaching hospitals where most of this takes place are often affiliated with medical schools, which in turn are often part of universities, and together they make up sometimes gigantic conglomerates, called academic medical centers.

Despite the fact that Osler’s idea lives on, there have been enormous changes over the years, and this is the subject of Kenneth Ludmerer’s meticulous new book, Let Me Heal. Ludmerer, a senior faculty physician and professor of the history of medicine at Washington University in St. Louis, sounds a warning. The Oslerian ideal of faculty and residents forming close relationships and thinking together about each patient is in trouble. Instead, residents, with little supervision, are struggling to keep up with staggering workloads, and have little time or energy left for learning. Attending physicians, for their part, are often too occupied with their own research and clinical practices—often in labs and offices outside of the hospital—to pay much attention to the house officers.

The implications for the public are profound. Nearly anyone admitted to a teaching hospital—and these are the most prestigious hospitals in the country—can expect to be cared for by residents and fellows. Whether house officers are well trained and, most important, whether they have the time to provide good care are crucial. Yet until Ludmerer’s book, there has been very little critical attention to these questions. It’s simply assumed that when you are admitted to a teaching hospital, you will get the best care possible. It’s odd that something this important would be regarded in such a Panglossian way.

Ludmerer refers to graduate medical education in the period between the world wars, following Osler, as the “educational era,” by which he means that the highest priority of teaching hospitals was education. Heads of departments were omnipresent on the wards, and knew the house officers intimately. A network of intense, often lifelong mentorships formed. Ludmerer gives a fascinating account of the propagation of talent; for example, William Halsted, the first chief of surgery at Johns Hopkins, had seventeen chief residents, eleven of whom subsequently established their own surgical residency programs at other institutions. Of their 166 chief residents, eighty-five became prominent faculty members at university medical schools. The influence of the giants of the era of education still reaches us through three, four, or five generations of disciples, and house officers quote Osler even today.

There was a strong moral dimension to this system. Ludmerer writes that “house officers learned that medicine is a calling, that altruism is central to being a true medical professional, and that the ideal practitioner placed the welfare of his patients above all else.” Commercialism was antithetical to teaching hospitals in the era of education. “Teaching hospitals regularly acknowledged that they served the public,” writes Ludmerer, “and they competed with each other to be the best, not the biggest or most profitable.”

Indeed, teaching hospitals deliberately limited their growth to maintain the ideal setting for teaching and research. Ludmerer offers the example of the prestigious Peter Bent Brigham Hospital in Boston (now named the Brigham and Women’s Hospital), which in its 1925 annual report declared that it had “more patients than it can satisfactorily handle…. The last thing it desires is to augment this by patients who otherwise will secure adequate professional service.” They also kept prices as low as possible, and delivered large amounts of charity care. With few exceptions, members of the faculty did not patent medical discoveries or accept gifts from industry, and regularly waived fees for poor patients.

To be sure, this golden age was not pure gold. These physicians were, on the whole, paternalistic toward patients; by today’s standards, many were elitist, sexist, and racist. But they were utterly devoted to what they were doing, and to one another, and put that commitment ahead of everything, including their own self-interest.

World War II brought great changes. In the postwar prosperity, the United States began to invest heavily in science and medicine, with rapid expansion of the National Institutes of Health (NIH), which in turn poured money into research at academic medical centers. In addition, the growth of health insurance led to more hospital admissions. In 1965, the creation of Medicare and Medicaid accelerated this growth enormously. According to Ludmerer, between 1965 and 1990, the number of full-time faculty in medical schools increased more than fourfold, NIH funding increased elevenfold, and revenues of academic medical centers from clinical treatment increased nearly two hundred–fold.

Initially, in the couple of decades following the war, the influx of money and the rapid growth simply gave momentum to the trajectory begun in the era of education. Reinforced by leaders who had trained during that era, the established traditions endured, and teaching hospitals for the most part defended their commitment to educational excellence and public service. However, the close-knit, personal character of graduate medical education began to unravel. By the late 1970s, academic medical centers began to take on the character of large businesses, both in their size and complexity and in their focus on growth and maximizing revenue. Even if the centers were technically nonprofit, the benefits of expansion accrued to everyone who worked there, most particularly the executives and administrators. In 1980, Arnold Relman wrote a landmark article in The New England Journal of Medicine, warning of the emergence of a “medical-industrial complex.”

The growing commercialization of teaching hospitals was exacerbated by a change in the method of payment for hospital care. Health care costs were rising rapidly and unsustainably, and in the 1980s health insurers responded with what has been termed “the revolt of the payers.” Previously, most insurers had paid hospitals according to “fee-for-service,” in which payment was made for each consultation, test, treatment, or other service provided. But now Medicare and other insurers, in an effort to control costs, began to reimburse hospitals less liberally and by “prospective payment” methods, in which the hospital received a fixed payment for each patient’s admission according to the diagnosis. Whatever part of that payment was not spent was the hospital’s gain; if the hospital spent more, it was a loss. Hospitals now had a strong incentive to get patients in and out as fast as possible.

Quite suddenly, the torrent of clinical revenue that had so swollen academic medical centers slowed. Many hospitals did not survive in the new environment (the total number of US hospitals decreased by nearly 20 percent between 1980 and 2000). Those that stayed afloat did so by promoting high-revenue subspecialty and procedural care, for example heart catheterization and orthopedic and heart surgery, which were still lucratively rewarded. They also developed more extensive relationships with pharmaceutical and biotech companies and manufacturers of medical devices, which paid them for exclusive marketing rights to drugs or technologies developed by faculty, as well as access to both patients and faculty for research and marketing purposes. . . .

Continue reading.
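An aside on the payment mechanics described above: the incentive shift from fee-for-service to prospective payment is easy to see with a toy calculation. Here is a minimal sketch in Python, with invented numbers of my own (nothing here comes from the book or the review):

```python
# Toy model of the two hospital payment schemes described in the excerpt.
# All dollar figures are invented for illustration.

def fee_for_service_revenue(num_services, price_per_service=500):
    """Hospital is paid for each test, treatment, or consultation it provides."""
    return num_services * price_per_service

def prospective_payment_margin(fixed_payment, cost_per_day, length_of_stay):
    """Hospital gets one fixed payment per admission (set by diagnosis);
    it keeps what it doesn't spend, and eats any overrun."""
    return fixed_payment - cost_per_day * length_of_stay

# Fee-for-service: more services, more revenue.
print(fee_for_service_revenue(10))  # 5000
print(fee_for_service_revenue(20))  # 10000

# Prospective payment: every extra day erodes the margin, hence the
# incentive to get patients in and out as fast as possible.
print(prospective_payment_margin(8000, cost_per_day=1500, length_of_stay=3))  # 3500
print(prospective_payment_margin(8000, cost_per_day=1500, length_of_stay=6))  # -1000
```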

Written by LeisureGuy

24 May 2015 at 6:37 am

Continuing fallout from the failure of the War on Drugs


A bankrupt policy that costs $15 billion per year and results in increasing drug use. Jon Lee Anderson reports in the New Yorker:

In 1971, President Nixon announced the U.S. “war on drugs,” which every President since has carried forward as a battle standard. Until recently, most Latin American governments have coöperated, and in return have received intelligence, equipment, and, perhaps most importantly, financial assistance. The overall investment has been huge—the federal government now spends about fifteen billion dollars on it each year—with the net result that drug use has proliferated in the U.S. and worldwide. In the drug-producing countries, where drug consumption was negligible at the start of the American effort, the criminal narcoculture has attained ghoulishly surreal proportions.

Over the course of the past few years, a growing number of Latin American governments have begun to challenge U.S. policy and to call for a radical rethinking of the war on drugs, including widespread decriminalization. A handful of leftist governments, such as those of Venezuela, Ecuador, and Bolivia, have gone so far as to end their coöperation with the U.S. Drug Enforcement Administration, alleging that U.S. drug policy is a new form of Yankee imperialism. Uruguay, under the former President José Mujica, became the first country to legalize state-sponsored production, sale, and use of marijuana.

The latest opposition to the forty-four-year-old drug war came not from a government that is hostile to the U.S. but from its most steadfast ally in the Americas, Colombia. On May 14th, President Juan Manuel Santos announced that his government was halting its longstanding practice of spraying the country’s illicit coca crop with chemicals to kill the plants. The spraying began in the late nineties under the U.S.-sponsored Plan Colombia, which aimed to wipe out the country’s drug culture and its guerrillas, who largely depend on narcotráfico for their survival. Santos made the announcement after U.N. scientists confirmed what critics of spraying had long alleged: that glyphosate, a key ingredient in the herbicide known as Roundup, is probably carcinogenic to humans.

Colombia was the last country in the world to use chemical spraying to combat illegal drug cultivation. Citing health hazards and damage to impoverished rural economies, both Bolivia and Peru, which also grow coca, have banned aerial spraying. Afghanistan, the world’s chief supplier of opium, overrode American protests to ban spraying in 2007. The Karzai government argued that the program drove poor Afghan farmers into the hands of the Taliban by destroying their livelihoods without offering realistic economic alternatives. Similar arguments have long been made in Colombia, where millions of farmers have been driven from their land to live in urban slums.

The U.S. State and Defense Departments, which jointly oversee Plan Colombia, have always lobbied heavily in favor of spraying, which is outsourced to the giant U.S. security contractor DynCorp. DynCorp has earned hundreds of millions from its Colombian contracts, just as it previously did in Afghanistan, where it also won the government contract to implement counter-narcotics strategy. Notably, after President Santos announced the halt to spraying, the U.S. Ambassador to Colombia, Kevin Whitaker, published an Op-Ed in the leading Colombian newspaper, El Tiempo, arguing in favor of continuing the spraying campaign while saying that the U.S. would continue working closely with Colombia in spite of the recent decision. Whitaker ended his Op-Ed with the English phrase “We have your back.”

So who is to be believed about the war on drugs, and what is the right way forward? After almost twenty years, many deaths, and billions of dollars spent under Plan Colombia, has illicit coca production decreased in Colombia? Overall, yes, according to the plan’s proponents: in his piece, Whitaker asserted that the area under cultivation for illegal coca production was reduced by half between 2007 and 2013. But studies also show that that area increased by thirty-nine per cent last year—so the most recent trends aren’t good. And if one third of the initial cultivation area is still left, that means that a significant amount of cocaine is still coming out of Colombia, and will be for the foreseeable future. . .

Continue reading.

Maybe we’re going about drugs all wrong?

Written by LeisureGuy

23 May 2015 at 12:30 pm

Healthcare advice from two health reporters


Via Kevin Drum, this report in Vox by Julia Belluz and Sarah Kliff. The two offer 8 lessons learned from years of reporting on medicine and healthcare. To take just one example, consider this chart from the second lesson, “2) Ignore most news stories about new health studies”:

[Chart omitted; see the Vox article]

By all means, read the entire article.

Written by LeisureGuy

20 May 2015 at 10:33 am

The LCHF diet and the case for drinking whole milk


Deena Shanker reports in Quartz:

Once upon a time, Americans drank a lot of whole milk. But when the anti-fat movement of the 1980s took hold, no-fat or lower-fat milks saw their popularity rise along with the cream that would soon be skimmed off the top. For most people, creamy, nutritious, delicious whole milk, like the milkmen who used to deliver it, became a relic of the past.

[Chart: milk sales in the US by type (whole, 2-percent, 1-percent, skim)]

For all the debate surrounding the latest recommendations from the committee that proposed federal dietary guidelines, the group’s endorsement of low-fat and fat-free milk over whole has garnered little attention. This suggests that while many of us scoff at the misguided anti-fat crusades of recent years (nuts to you, 1980s!) whole milk remains an unpopular outlier. And that’s just ridiculous.

Though it would seem to follow that consuming less fat would lead to being less fat, that’s not quite what the science says, at least when it comes to dairy—even if whole milk is more caloric than skim.

In 2006, a study published in the American Journal of Clinical Nutrition looked at the role of dairy consumption in weight regulation for 19,352 Swedish women between the ages of 40 and 55. Data was initially collected between 1987 and 1990, and then again in 1997. It found that for the women in the study, eating one or more servings a day of whole dairy products was “inversely associated with weight gain,” with the most significant findings for normal-weight women consuming whole milk or sour milk.

In 2013, the Scandinavian Journal of Primary Health Care published findings from a study that tracked the impact of dairy fat intake on 1,782 men. Twelve years after researchers took the initial measurements, they found that consumption of butter, high-fat milk, and cream several times a week were related to lower levels of central obesity, while “a low intake of dairy fat… was associated with a higher risk of developing central obesity.” (Central obesity means a waist-to-hip ratio equal to or greater than one—i.e. big in the middle.)

Still skeptical? Shortly after that study came out, the European Journal of Nutrition published a meta-analysis of 16 studies on the relationship between dairy fat, obesity, and cardiometabolic disease. (A meta-analysis combines findings from multiple, independent studies, and when done correctly, provides better coverage of a question than any single study usually can.) Its findings will be revelatory for anyone who drinks skim for weight control: . . .

Continue reading.
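Since the excerpt leans on a meta-analysis, a quick illustration of what “combining findings” means in practice may help. The standard fixed-effect approach is inverse-variance weighting: more precise studies count for more. A minimal sketch with made-up numbers (the European Journal of Nutrition authors used real study data and more elaborate methods):

```python
# Minimal fixed-effect meta-analysis via inverse-variance weighting.
# The per-study effects and variances below are made up for illustration.

def pooled_effect(effects, variances):
    """Weight each study's effect estimate by 1/variance and average."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_variance = 1.0 / sum(weights)
    return pooled, pooled_variance

# Three hypothetical studies: effect sizes (say, log relative risk of
# obesity for high dairy-fat intake) and their variances.
effects = [-0.10, -0.25, -0.05]
variances = [0.04, 0.09, 0.01]

estimate, variance = pooled_effect(effects, variances)
print(f"pooled effect: {estimate:.3f} (variance {variance:.4f})")
# The third study is the most precise (smallest variance), so it pulls
# the pooled estimate toward its own value.
```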

Written by LeisureGuy

18 May 2015 at 2:02 pm

Posted in Food, Health, Low carb, Science

How the American opiate epidemic was started by one pharmaceutical company


Corporations will do absolutely anything for a profit, regardless of the harm they cause. Mike Mariani looks at an example in this Pacific Standard article:

The state of Kentucky may finally get its deliverance. After more than seven years of battling the evasive legal tactics of Purdue Pharma, 2015 may be the year that Kentucky and its attorney general, Jack Conway, are able to move forward with a civil lawsuit alleging that the drugmaker misled doctors and patients about its blockbuster pain pill OxyContin, leading to a vicious addiction epidemic across large swaths of the state.

A pernicious distinction of the first decade of the 21st century was the rise in painkiller abuse, which ultimately led to a catastrophic increase in addicts, fatal overdoses, and blighted communities. But the story of the painkiller epidemic can really be reduced to the story of one powerful, highly addictive drug and its small but ruthlessly enterprising manufacturer.

On December 12, 1995, the Food and Drug Administration approved the opioid analgesic OxyContin. It hit the market in 1996. In its first year, OxyContin accounted for $45 million in sales for its manufacturer, Stamford, Connecticut-based pharmaceutical company Purdue Pharma. By 2000 that number would balloon to $1.1 billion, an increase of well over 2,000 percent in a span of just four years. Ten years later, the profits would inflate still further, to $3.1 billion. By then the potent opioid accounted for about 30 percent of the painkiller market. What’s more, Purdue Pharma’s patent for the original OxyContin formula didn’t expire until 2013. This meant that a single private, family-owned pharmaceutical company with nondescript headquarters in the Northeast controlled nearly a third of the entire United States market for pain pills.

OxyContin’s ball-of-lightning emergence in the health care marketplace was close to unprecedented for a new painkiller in an age where synthetic opiates like Vicodin, Percocet, and Fentanyl had already been competing for decades in doctors’ offices and pharmacies for their piece of the market share of pain-relieving drugs. In retrospect, it almost didn’t make sense. Why was OxyContin so much more popular? Had it been approved for a wider range of ailments than its opioid cousins? Did doctors prefer prescribing it to their patients?

During its rise in popularity, there was a suspicious undercurrent to the drug’s spectrum of approved uses and Purdue Pharma’s relationship to the physicians that were suddenly privileging OxyContin over other meds to combat everything from back pain to arthritis to post-operative discomfort. It would take years to discover that there was much more to the story than the benign introduction of a new, highly effective painkiller.

In 1952, brothers Arthur, Raymond, and Mortimer Sackler purchased Purdue Pharma, then called Purdue Frederick Co. All three men were psychiatrists by trade, working at a mental facility in Queens in the 1940s.

The eldest brother, Arthur, was a brilliant polymath, not only contributing to psychiatric research but also thriving in the fledgling field of pharmaceutical advertising. It was here that he would leave his greatest mark. As a member of William Douglas McAdams, a small New York-based advertising firm, Sackler expanded the possibilities of medical advertising by promoting products in medical journals and experimenting with television and radio marketing. Perhaps his greatest achievement, detailed in his biography in the Medical Advertising Hall of Fame, was finding enough different uses for Valium to turn it into the first drug to hit $100 million in revenue.

The Medical Advertising Hall of Fame website’s euphemistic argot for this accomplishment states that Sackler’s experience in the fields of psychiatry and experimental medicine “enabled him to position different indications for Roche’s Librium and Valium.”

Sackler was also among the first medical advertisers to foster relationships with doctors in the hopes of earning extra points for his company’s drugs, according to a 2011 exposé in Fortune. Such backscratching in the hopes of reciprocity is now the model for the whole drug marketing industry. Arthur Sackler’s pioneering methods would be cultivated by his younger brothers Raymond and Mortimer in the decades to come, as they grew their small pharmaceutical firm.

Starting in 1996, . . .

Continue reading.

Written by LeisureGuy

16 May 2015 at 2:16 pm

Posted in Business, Drug laws, Health

Marijuana may be even safer than previously thought, researchers say


Christopher Ingraham wrote in the Washington Post back in February:

Compared with other recreational drugs — including alcohol — marijuana may be even safer than previously thought. And researchers may be systematically underestimating risks associated with alcohol use.

Those are the top-line findings of recent research published in the journal Scientific Reports, a subsidiary of Nature. Researchers sought to quantify the risk of death associated with the use of a variety of commonly used substances. They found that at the level of individual use, alcohol was the deadliest substance, followed by heroin and cocaine.

[Chart: drug harm, comparing the mortality risk of common recreational drugs]

And all the way at the bottom of the list? Weed — roughly 114 times less deadly than booze, according to the authors, who ran calculations that compared lethal doses of a given substance with the amount that a typical person uses. Marijuana is also the only drug studied that posed a low mortality risk to its users.

These findings reinforce drug-safety rankings developed 10 years ago under a slightly different methodology. So in that respect, the study is more of a reaffirmation of previous findings than anything else. But given the current national and international debates over the legal status of marijuana and the risks associated with its use, the study arrives at a good time.
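The calculation behind that “114 times” figure is, at heart, a ratio of ratios: for each substance, divide a lethal dose by a typical dose (a “margin of exposure”), then compare substances. A rough sketch of the idea, with placeholder numbers chosen only so the ratio lands near the study’s headline figure (they are not the study’s actual data):

```python
# Margin of exposure (MOE): how many typical doses it takes to reach a
# lethal dose. Higher means safer. All numbers below are placeholders,
# not the figures from the Scientific Reports study.

def margin_of_exposure(lethal_dose, typical_dose):
    return lethal_dose / typical_dose

moe_alcohol   = margin_of_exposure(lethal_dose=330.0, typical_dose=33.0)   # 10.0
moe_marijuana = margin_of_exposure(lethal_dose=3800.0, typical_dose=3.33)  # ~1141

# The headline comparison is the ratio of the two margins:
print(moe_marijuana / moe_alcohol)  # ~114, i.e., "114 times less deadly"
```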

Written by LeisureGuy

16 May 2015 at 1:27 pm

Posted in Drug laws, Health, Science

How the Army took 12 years to solve a puzzle that readers solved in minutes


Perhaps the US military’s lack of interest in the welfare of its troops is part of our overall national decline. C.J. Chivers, the journalist who wrote the story I just blogged, points out that military intelligence took 12 years to solve the mystery of what was in the barrels, a mystery readers solved in minutes.

He blogs:

One old saw among combat-seasoned troops and other students of language is the joke that goes “Military intelligence is not.” I remember sitting in English classes, even in conservative upstate New York, and teachers offering that phrase – “military intelligence” – as an example of an oxymoron. (Another dead-ringer: “friendly fire.”) Veterans of war will nod with understanding.

But whatever you feel about all that, the photo and the document above point to an error by the U.S. Army’s 205th Military Intelligence Brigade that will help keep the one-liner alive.

At top is a photograph of barrels removed from an Iraqi Republican Guard warehouse in May 2003 by soldiers from the 811th Ordnance Company, an Army Reserve unit that subsequently would be neglected by senior officers and Army leadership for more than a decade. Some of these barrels leaked as the soldiers handled them, causing the soldiers to fall ill with several symptoms that partly mirrored those of nerve-agent exposure.

After the afflicted soldiers had been evacuated and admitted to a military hospital, the Army set out to find out what the barrels contained. (Why it did not do this before having the soldiers handle the barrels is another question.) Field-detection tests had indicated that the barrels may have held chemical-warfare agents. But such tests are often unreliable.  Once the troops fell ill, a more thorough check was in order.

This included making photographs of the Arabic and Cyrillic stenciling on the barrels, as seen in the pic.  The photographs were then shared with the 205th Military Intelligence Brigade, which offered a translation to English, memorialized in a memo written by a chemical defense non-commissioned officer, who had been assigned to the case.

Look closely at the photo, then at the document beneath it, focusing on the translation in Block 1.

The first line of the translation, from the Arabic, is accurate. The barrels bore the stenciling of the al Karama Company, an Iraqi firm closely associated with Baathist weapons programs, including rockets and missiles.  Then comes the mistake. The second line, which lists the intelligence brigade’s simple transliteration of an acronym in Cyrillic, reads PR-02.  That is not how the stenciling actually reads. It reads TG-02.  Interestingly, the Arabic on that line, “Fuel,” is rendered accurately.

Put those two items (TG-02 & fuel) into a Google search and see what you get.  In an instant you’ll be referred to links explaining that TG-02 is a toxic binary rocket and missile propellant associated with several Eastern bloc weapons systems, including systems possessed by Saddam Hussein’s military before it was routed in 2003. As you dig you’ll learn about xylidines, and find material safety data sheets that show acute exposure symptoms matching those suffered by members of the 811th.

For 12 years the veterans who were exposed to the contents of these barrels had wondered what it really was that had made them sick, and what long-term health consequences it might carry. The Army, which at first suffered from some confusion about the incident but eventually figured it out, would not tell them. The records that ultimately described the contents accurately were classified, leaving the victims in the dark. FOIA requests were stalled. Queries to doctors and officers went nowhere. One important document, a site survey report by the Iraq Survey Group, was declassified only last week, after repeated prodding by The New York Times; it listed a chemical from a family of organic compounds used in TG-02 – essentially confirming what was visible on the barrel stenciling all along.

In other words, throughout all of these intervening years the answer had been right there on the photographs – TG-02.  And yet no one in the Army, as near as we can tell, spotted the intelligence unit’s mistake, which misdirected the inquiry into the incident while it was still in the open source, and served to keep the victims misinformed.

Last night, ahead of the publication of the story, we decided to run a quick and informal test about how hard this is to solve.  So we tweeted the photograph, posted it on Facebook and put it here.  And we asked readers what the barrels contained. How long, we wondered, would it take before someone solved it?  Readers’ answers quickly recalled that old joke about military intelligence.  The first correct reply arrived within 15 minutes. More thorough and accurate replies rolled in throughout the night.

PR-02 vs. TG-02.  Had the intelligence unit not bungled its two-character transliteration, . . .

Continue reading.

Written by LeisureGuy

14 May 2015 at 10:39 am

Posted in Health, Medical, Military
