Later On

A blog written for those whose interests more or less match mine.

Archive for the ‘Medical’ Category

Politicians who support funding only for issues that affect them personally

with one comment

Politicians sometimes seem to view the government as a private piggy bank, not only for junkets to various pleasant countries and resorts, but also for funding research that benefits them personally. Kevin Drum notes:

Steve Benen mentions one of my pet peeves today: politicians who want to cut spending on everything except for research on one particular disease that happens to affect them personally. A couple of years ago, for example, Sen. Mark Kirk suddenly became interested in Medicaid’s approach to treating strokes after he himself suffered a stroke. The latest example is Jeb Bush, whose mother-in-law has Alzheimer’s. I suppose you can guess what’s coming next. Here’s Jeb in a letter he sent to Maria Shriver:

I have gotten lots of emails based on my comments regarding Alzheimer’s and dementia at a town hall meeting in New Hampshire. It is not the first time I have spoken about this disease. I have done so regularly.

Here is what I believe:

We need to increase funding to find a cure. We need to reform FDA [regulations] to accelerate the approval process for drug and device approval at a much lower cost. We need to find more community based solutions for care.

As Benen points out, Bush vetoed a bunch of bills that would have assisted Alzheimer’s patients when he was governor of Florida. I guess that’s changed now that he actually knows someone with the disease. However, it doesn’t seem to have affected his attitude toward any other kind of medical research spending.

I’m not even sure what to call this syndrome, but it’s mighty common. It’s also wildly inappropriate. . .

Continue reading.

Written by LeisureGuy

27 May 2015 at 10:39 am

The University of Minnesota’s Medical Research Mess

with one comment

I wonder whether this NY Times column by Carl Elliott, a professor at the Center for Bioethics at the University of Minnesota, doesn’t describe one strand of the overall decline of the US: how ethical conduct is crumbling under the erosive effects of greed and ambition. Lying to protect an institution’s reputation (cf. the Catholic sexual-abuse scandal, or the many cover-ups in the military and in police departments) seems to be part and parcel of that. The column begins:

IF you want to see just how long an academic institution can tolerate a string of slow, festering research scandals, let me invite you to the University of Minnesota, where I teach medical ethics.

Over the past 25 years, our department of psychiatry has been party to the following disgraces: a felony conviction and a Food and Drug Administration research disqualification for a psychiatrist guilty of fraud in a drug study; the F.D.A. disqualification of another psychiatrist, for enrolling illiterate Hmong refugees in a drug study without their consent; the suspended license of yet another psychiatrist, who was charged with “reckless, if not willful, disregard” for dozens of patients; and, in 2004, the discovery, in a halfway house bathroom, of the near-decapitated corpse of Dan Markingson, a seriously mentally ill young man under an involuntary commitment order who committed suicide after enrolling, over the objections of his mother, in an industry-funded antipsychotic study run by members of the department.

And those, unfortunately, are just the highlights.

The problem extends well beyond the department of psychiatry and into the university administration. Rather than dealing forthrightly with these ethical breaches, university officials have seemed more interested in covering up wrongdoing with a variety of underhanded tactics. Reporting in The Star Tribune discovered, for example, that in the felony case, university officials hid an internal investigation of the fraud from federal investigators for nearly four years.

I hope that the situation at the University of Minnesota is exceptional. But I know that at least one underlying cause of our problems is not limited to us: namely, the antiquated bureaucratic apparatus of institutional review boards, or I.R.B.s, which are supposed to protect subjects of medical experimentation. Indeed, whether other institutions have seen the kinds of abuses that have emerged at the University of Minnesota is difficult to know, precisely because the current research oversight system is inadequate to detect them.

The current I.R.B. system of research protection arose in the 1970s. At the time, many reformers believed the main threat to research subjects came from overambitious government and university researchers who might be tempted to overlook the welfare of research subjects.

As a result, the scheme put in place for protecting subjects was not a formal regulatory system but essentially an honor code. Under the I.R.B. system, medical research studies are evaluated — on paper — by a panel of academic volunteers. I.R.B.s do not usually monitor research as it is taking place. They rarely see a research subject or even a researcher face to face. Instead, they simply trust researchers to tell the truth, report mishaps honestly and conduct their studies in the way that they claim to be conducting them.

These days, of course, medical research is not just a scholarly affair. It is also a global, multibillion-dollar business enterprise, powered by the pharmaceutical and medical-device industries. The ethical problem today is not merely that these corporations have plenty of money to grease the wheels of university research. It’s also that researchers themselves are often given powerful financial incentives to do unethical things: pressure vulnerable subjects to enroll in studies, fudge diagnoses to recruit otherwise ineligible subjects and keep subjects in studies even when they are doing poorly.

In what other potentially dangerous industry do we rely on an honor code to keep people safe? . . .

Continue reading.

Voluntary guidelines and self-investigation: it doesn’t work for the police, it doesn’t work for the military, it doesn’t work for industry, and it’s not working here. And you probably know that the USDA now allows meat processors to police themselves. On the honor system.

Written by LeisureGuy

26 May 2015 at 4:56 pm

Posted in Education, Law, Medical

Victims of the modern American national witch hunt

leave a comment »

Radley Balko has a column on a phenomenon of the 1980s and 1990s that would be almost unimaginable, except that it actually happened. And even today, American courts refuse to recognize what happened.

Well worth reading. It begins:

From the Austin American-Statesman:

The state’s highest criminal court on Wednesday threw out the 1992 sexual assault convictions against Dan and Fran Keller but declined to find the former Austin day care owners innocent of crimes linked to a now-discredited belief that secret satanic cults were abusing day care children nationwide.

The Kellers spent more than 22 years in prison after three young children accused them of dismembering babies, torturing pets, desecrating corpses, videotaping orgies and serving blood-laced Kool-Aid in satanic rituals at their home-based day care.

No evidence of such activities was ever found.

Freed from prison in late 2013 as the case against them crumbled, the Kellers asked the Court of Criminal Appeals to declare them innocent, arguing that they were the victims of inept therapists, shoddy police work and “satanic panic” that swept the nation in the early 1990s.

A unanimous Court of Criminal Appeals instead overturned their convictions based on false testimony by an emergency room doctor whose hospital examination had provided the only physical evidence of sexual assault during the Kellers’ joint trial.

Dr. Michael Mouw later admitted that inexperience led him to misidentify normally occurring conditions as evidence of sexual abuse in a 3-year-old girl.

The nine judges did not provide an explanation for why they rejected the Kellers’ innocence claim except to say their decision was based on the findings of the trial judge “and this court’s independent review of the record.”

The panic actually began in the 1980s. It was instigated and perpetuated mostly by groups of fundamentalist Christians who saw Satan in every heavy metal album, “Smurfs” episode, and Dungeons & Dragons game, along with a quack cadre of psychotherapists who were convinced they could dig up buried memories through hypnosis. What they did instead was shed some light on just how potent the power of suggestion can be. Remarkably, children were convinced to testify about horrifying — and entirely fictional — violations perpetrated on them by care workers and, in some cases, by their own parents.

But it wasn’t just children. As the Kellers’ conviction shows, the panic was so overwhelming, it could convince trained medical professionals to see abuse where there was none. Some defendants were convicted of gruesome crimes such as the aforementioned dismembering of babies despite the fact that there were no corpses and no babies missing from the immediate area.

Ultimately, the panic and power of suggestion was pervasive enough to dupe our entire criminal justice system, as dozens of innocent people were sent to prison for crimes for which there was no evidence other than the coerced testimony of kids, and for which those same defendants would later be exonerated. Here’s an excerpt from the concurring opinion of Judge Cheryl Johnson, who would have declared the couple innocent: . . .

Continue reading.

Two videos at the link.

Written by LeisureGuy

26 May 2015 at 2:23 pm

The two sleeps of a natural night

leave a comment »

Interesting article by Clark Strand in the Washington Post:

What if you could meditate like a Tibetan lama with no instruction whatsoever — and without having to subscribe to any religious beliefs?

People hear a question like that and, unless they are particularly gullible, they assume they’re about to be scammed. But in this case there is nothing to buy — no tapes, no app, no religious agenda that gets sprung on you at the last moment when you’re feeling vulnerable and spiritually open. No hidden fees.

But that doesn’t mean there isn’t a catch. You have to be willing to revert to a Paleolithic pattern of sleep — and that means turning off your electric lights at dusk and leaving them off until dawn. Do that, and in about three weeks’ time, beginning around six hours after sunset each evening, you will find yourself experiencing a period of serene wakefulness that was once a nightly meditation retreat for all Homo sapiens on Earth. It’s a guarantee. It’s encoded in your genes.

During the mid-1990s, sleep researcher Thomas Wehr conducted a National Institutes of Health experiment that he later called an exercise in “archaeology, or human paleobiology.” Wehr wanted to find out if modern humans still carried within them the rhythms for a prehistoric mode of sleep. Did prehistoric humans sleep more? Did they sleep differently — or perhaps better?

Wehr’s logic was simple: Aided by the stimulating effects of all kinds of artificial lighting (everything from laptop screens to the bright lights of big cities), modern humans had compressed their sleep nights, like their work days, into convenient eight-hour blocks. And yet, given that light-assisted wakefulness was a relatively new invention, wasn’t it possible that human beings still carried in their DNA the remnants of a more primordial pattern of sleep?

The results were staggering. For one month, beginning at dusk and ending at dawn, Wehr’s subjects were removed from every possible form of artificial light. During the first three weeks, they slept as usual, only for about an hour longer. (After all, he reasoned, like most Americans, they were probably sleep deprived.) But at week four a dramatic change occurred. The participants slept the same number of hours as before, but now their sleep was divided in two. They began each night with about four hours of deep sleep, woke for two hours of quiet rest, then slept for another four.

During the gap between their “first” and “second” sleep, Wehr’s subjects were neither awake nor fully asleep. Rather, they experienced a condition they had never known before — a state of consciousness all its own. Later Wehr would compare it to what advanced practitioners experience in meditation — what you might call “mindfulness” today. But there weren’t any mindfulness practitioners in his study. They were simply ordinary people who, removed for one month from artificial lighting, found their nights broken in two.

While trying to account for the peace and serenity that his subjects reported feeling during their hours of “quiet rest,” Wehr discovered that . . .

Continue reading.

Clark Strand is the author of a book about these ideas: Waking Up to the Dark: Ancient Wisdom for a Sleepless Age.

Written by LeisureGuy

24 May 2015 at 9:41 am

Training Young Doctors: The Current Crisis

leave a comment »

Sometimes it seems that America has lost the capacity to address large social problems—infrastructure being a prime example. But look also at the continuing breakdown of the medical/healthcare system. Lara Goitein reviews a recent book in the NY Review of Books:

Let Me Heal: The Opportunity to Preserve Excellence in American Medicine
by Kenneth M. Ludmerer
Oxford University Press, 431 pp., $34.95

In the 1890s, Sir William Osler, now regarded as something of a demigod in American medicine, created at the Johns Hopkins Hospital a novel system for training physicians after graduation from medical school. It required young physicians to reside in the hospital full-time without pay, sometimes for years, to learn how to care for patients under the close supervision of senior physicians.

This was the first residency program. Despite the monastic existence, the long hours, and the rigid hierarchy, Osler’s residents apparently loved it. They felt exalted to be able to learn the practice of medicine under the tutelage of great physicians who based their teachings on science, inquiry, and argument, not tradition. And far from bridling at being at the bottom of the pyramid, they virtually worshiped their teachers, who in turn generally lavished great attention and affection on their charges. Osler’s innovation spread rapidly, and the residency system is still the essential feature of teaching hospitals throughout the country.

Residents are young doctors who have completed medical school and are learning their chosen specialty by caring for patients under the supervision of senior physicians, called attendings. Residents in their first year are called interns. As in Osler’s time, residents work long hours, although they no longer live in the hospital and are now paid a modest salary. The time this training takes varies—three years, for example, to complete a program in internal medicine. Following that, many go on to a few more years of training in subspecialties (for example cardiology, a subspecialty of internal medicine), and at this point they are called fellows.

Together residents and fellows, who now number about 120,000 across the country, are called house officers, and their training is termed graduate medical education (GME). The teaching hospitals where most of this takes place are often affiliated with medical schools, which in turn are often part of universities, and together they make up sometimes gigantic conglomerates, called academic medical centers.

Despite the fact that Osler’s idea lives on, there have been enormous changes over the years, and this is the subject of Kenneth Ludmerer’s meticulous new book, Let Me Heal. Ludmerer, a senior faculty physician and professor of the history of medicine at Washington University in St. Louis, sounds a warning. The Oslerian ideal of faculty and residents forming close relationships and thinking together about each patient is in trouble. Instead, residents, with little supervision, are struggling to keep up with staggering workloads, and have little time or energy left for learning. Attending physicians, for their part, are often too occupied with their own research and clinical practices—often in labs and offices outside of the hospital—to pay much attention to the house officers.

The implications for the public are profound. Nearly anyone admitted to a teaching hospital—and these are the most prestigious hospitals in the country—can expect to be cared for by residents and fellows. Whether house officers are well trained and, most important, whether they have the time to provide good care are crucial. Yet until Ludmerer’s book, there has been very little critical attention to these questions. It’s simply assumed that when you are admitted to a teaching hospital, you will get the best care possible. It’s odd that something this important would be regarded in such a Panglossian way.

Ludmerer refers to graduate medical education in the period between the world wars, following Osler, as the “educational era,” by which he means that the highest priority of teaching hospitals was education. Heads of departments were omnipresent on the wards, and knew the house officers intimately. A network of intense, often lifelong mentorships formed. Ludmerer gives a fascinating account of the propagation of talent; for example, William Halsted, the first chief of surgery at Johns Hopkins, had seventeen chief residents, eleven of whom subsequently established their own surgical residency programs at other institutions. Of their 166 chief residents, eighty-five became prominent faculty members at university medical schools. The influence of the giants of the era of education still reaches us through three, four, or five generations of disciples, and house officers quote Osler even today.

There was a strong moral dimension to this system. Ludmerer writes that “house officers learned that medicine is a calling, that altruism is central to being a true medical professional, and that the ideal practitioner placed the welfare of his patients above all else.” Commercialism was antithetical to teaching hospitals in the era of education. “Teaching hospitals regularly acknowledged that they served the public,” writes Ludmerer, “and they competed with each other to be the best, not the biggest or most profitable.”

Indeed, teaching hospitals deliberately limited their growth to maintain the ideal setting for teaching and research. Ludmerer offers the example of the prestigious Peter Bent Brigham Hospital in Boston (now named the Brigham and Women’s Hospital), which in its 1925 annual report declared that it had “more patients than it can satisfactorily handle…. The last thing it desires is to augment this by patients who otherwise will secure adequate professional service.” They also kept prices as low as possible, and delivered large amounts of charity care. With few exceptions, members of the faculty did not patent medical discoveries or accept gifts from industry, and regularly waived fees for poor patients.

To be sure, this golden age was not pure gold. These physicians were, on the whole, paternalistic toward patients; by today’s standards, many were elitist, sexist, and racist. But they were utterly devoted to what they were doing, and to one another, and put that commitment ahead of everything, including their own self-interest.

World War II brought great changes. In the postwar prosperity, the United States began to invest heavily in science and medicine, with rapid expansion of the National Institutes of Health (NIH), which in turn poured money into research at academic medical centers. In addition, the growth of health insurance led to more hospital admissions. In 1965, the creation of Medicare and Medicaid accelerated this growth enormously. According to Ludmerer, between 1965 and 1990, the number of full-time faculty in medical schools increased more than fourfold, NIH funding increased elevenfold, and revenues of academic medical centers from clinical treatment increased nearly two hundred–fold.

Initially, in the couple of decades following the war, the influx of money and the rapid growth simply gave momentum to the trajectory begun in the era of education. Reinforced by leaders who had trained during that era, the established traditions endured, and teaching hospitals for the most part defended their commitment to educational excellence and public service. However, the close-knit, personal character of graduate medical education began to unravel. By the late 1970s, academic medical centers began to take on the character of large businesses, both in their size and complexity, and in their focus on growth and maximizing revenue. Even if technically nonprofit, the benefits of expansion accrued to everyone who worked there, most particularly the executives and administrators. In 1980, Arnold Relman wrote a landmark article in The New England Journal of Medicine, warning of the emergence of a “medical-industrial complex.”

The growing commercialization of teaching hospitals was exacerbated by a change in the method of payment for hospital care. Health care costs were rising rapidly and unsustainably, and in the 1980s health insurers responded with what has been termed “the revolt of the payers.” Previously, most insurers had paid hospitals according to “fee-for-service,” in which payment was made for each consultation, test, treatment, or other service provided. But now Medicare and other insurers, in an effort to control costs, began to reimburse hospitals less liberally and by “prospective payment” methods, in which the hospital received a fixed payment for each patient’s admission according to the diagnosis. Whatever part of that payment was not spent was the hospital’s gain; if the hospital spent more, it was a loss. Hospitals now had a strong incentive to get patients in and out as fast as possible.

Quite suddenly, the torrent of clinical revenue that had so swollen academic medical centers slowed. Many hospitals did not survive in the new environment (the total number of US hospitals decreased by nearly 20 percent between 1980 and 2000). Those that stayed afloat did so by promoting high-revenue subspecialty and procedural care, for example heart catheterization and orthopedic and heart surgery, which were still lucratively rewarded. They also developed more extensive relationships with pharmaceutical and biotech companies and manufacturers of medical devices, which paid them for exclusive marketing rights to drugs or technologies developed by faculty, as well as access to both patients and faculty for research and marketing purposes. . . .

Continue reading.

Written by LeisureGuy

24 May 2015 at 6:37 am

Everything Sounds the Same When You’re Depressed

leave a comment »

Interesting finding: The cocktail-party effect—the ability to focus on one particular conversation in the babble of a cocktail party—is beyond the capability of those suffering from depression.

Written by LeisureGuy

22 May 2015 at 8:31 am

Posted in Mental Health, Science

Does being a jerk pay off, long term?

leave a comment »

Looking at the range of CEOs, legislators, judges, and so on, it would seem that in some cases being a jerk has paid off. But now the question is getting some serious study. However, the results depend on how “success” is measured.

Written by LeisureGuy

21 May 2015 at 2:53 pm
