Later On

A blog written for those whose interests more or less match mine.

Archive for December 2017

A must-read: “My Vagina Is Terrific. Your Opinion About It Is Not.”

Jen Gunter, an obstetrician and gynecologist practicing in California, has a column in the NY Times that describes the power of memes:

There is a rash of men explaining vaginas to me.

That is what I have decided to name a collective of mansplainers. A murder of crows, a parliament of owls, a rash of mansplainers. In medicine a rash can be a mild annoyance that goes away and never returns. A rash can also portend a serious medical condition, even something malignant.

There have always been a few men here and there explaining vaginas to me. I have suffered fools eager to use pickup lines about being an amateur gynecologist, detailing their imagined superior knowledge of female anatomy and physiology. Men who think sitting beside them at a bar and smiling — because if you don’t smile, you get told to smile — is an invitation to tell you how they will make you scream and moan.

I know that many other women have had their vaginas explained to them, because for the past 25 years my career has been dedicated to treating vaginal and vulvar problems. I have listened to women with completely normal exams weep that they have been told that they do not smell or taste correctly. That they are too wet, or too loose, or too gross.

These women all shared something: They were told these things by men. While I admit this is anecdotal data, my years of listening to secret shame about healthy vaginas and vulvas seems to suggest it is largely, if not entirely, male partners who exploit vaginal and vulvar insecurities as a weapon of emotional abuse and control.

But it was the Vicks VapoRub that put me over the edge. . .

Continue reading.

Written by LeisureGuy

31 December 2017 at 11:46 am

Now Is the Time to Be a Deficit Hawk

Very interesting post by Kevin Drum.

Written by LeisureGuy

31 December 2017 at 11:38 am

How We Know It Was Climate Change

Noah Diffenbaugh, a professor of earth system science at Stanford, writes in the NY Times:

This was a year of devastating weather, including historic hurricanes and wildfires here in the United States. Did climate change play a role? Increasingly, scientists are able to answer that question — and increasingly, the answer is yes.

My lab recently published a new framework for examining connections between global warming and extreme events. Other scientists are doing similar research. How would we go about testing whether global warming has influenced the events that occurred this year?

Consider Hurricane Harvey, which caused enormous destruction along the Gulf Coast; it will cost an estimated $180 billion to recover from the hurricane’s storm surge, high winds and record-setting precipitation and flooding. Did global warming contribute to this disaster?

The word “contribute” is key. This doesn’t mean that without global warming, there wouldn’t have been a hurricane. Rather, the question is whether changes in the climate raised the odds of producing extreme conditions.

Hurricanes are complicated business. While there is evidence that global warming should increase the frequency of very intense storms, their rarity and complexity make it difficult to detect climate change’s fingerprint.

It is therefore critical to examine all of the contributing factors. In the case of Hurricane Harvey, these include the warm ocean that provided energy for the storm; the elevated sea level on top of which the storm surge occurred; the atmospheric pressure pattern that contributed to the storm’s stalling over the coast; and the atmospheric water vapor that provided moisture for the record-setting precipitation.

In examining these factors, scientists are deeply skeptical: We start with the assumption that each condition arose by chance, and then require a very heavy burden of proof to reject that assumption (analogous to the “beyond a reasonable doubt” standard in criminal cases).

The first step is to ask whether historical changes have been observed in any of the factors. For example, ocean temperatures have increased in recent decades. Applying the same statistical techniques used in engineering, medicine and finance, we can analyze whether those increases have changed the odds of achieving this year’s warm temperatures in the tropical Atlantic and Gulf of Mexico.

But identifying a trend doesn’t tell us the cause. For that, we run controlled experiments using computerized climate models that simulate conditions in previous decades, with and without the variable of human-generated greenhouse gases in the atmosphere. By comparing those experiments with historical weather data, we can quantify how likely an event is with and without human-generated warming. Based on previous warm years, we can expect to find that human-generated warming influenced this year’s ocean temperatures.

We also know that global warming is increasing the moisture in the atmosphere, meaning that a given storm can produce more precipitation. Analyses by Kerry Emanuel of M.I.T. and others since the storm show that global warming makes heavy rainfall during storms like Hurricane Harvey more likely.

Further, Hurricane Harvey’s stalling over the coast was critical for the record rainfall. The exact meteorological causes are complex, but the pattern of atmospheric pressure across North America played an important role. We have found that global warming increased the odds of the pressure pattern that contributed to the 2010 Russian heat wave that killed more than 50,000 people. We can likewise look back at pressure patterns during past hurricane seasons and examine whether global warming has altered the odds of patterns similar to Hurricane Harvey’s.

In addition to the heavy rainfall, storm surge contributed to coastal flooding. When hurricanes make landfall, low pressure and strong winds push water onto land. By increasing the mean sea level, global warming has “raised the floor” from which storm surge occurs. As a result, a storm is more likely to cause extensive flooding. Sea-level rise tripled the odds of Hurricane Sandy’s flood level in 2012. A similar analysis can be applied to the Hurricane Harvey storm surge.

So, what role did climate change play in Hurricane Harvey?  . . .

Continue reading.
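
The with-and-without-warming comparison described in the excerpt can be illustrated with a toy risk-ratio calculation. The numbers below are purely hypothetical (not drawn from any real climate ensemble); the sketch only shows the mechanics of comparing the odds of an extreme event in two simulated worlds:

```python
import random

random.seed(0)

def ensemble(mean, sd, n=100_000):
    # Hypothetical model runs of, say, peak storm rainfall (inches)
    return [random.gauss(mean, sd) for _ in range(n)]

# Invented numbers, for illustration only: the "with warming" world
# is slightly wetter on average than the counterfactual world.
with_warming = ensemble(mean=42.0, sd=8.0)
without_warming = ensemble(mean=38.0, sd=8.0)

threshold = 60.0  # an extreme, Harvey-like rainfall total

def exceed_prob(runs, t):
    # Fraction of simulated runs that exceed the extreme threshold
    return sum(r > t for r in runs) / len(runs)

p_with = exceed_prob(with_warming, threshold)
p_without = exceed_prob(without_warming, threshold)

# A risk ratio above 1 means warming raised the odds of the event.
print(f"P(extreme | warming)    = {p_with:.4f}")
print(f"P(extreme | no warming) = {p_without:.4f}")
print(f"risk ratio              = {p_with / p_without:.1f}")
```

Real attribution studies do this with large ensembles of physics-based climate model runs rather than Gaussian toys, but the final step — comparing exceedance probabilities between the two worlds — has this shape.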

Written by LeisureGuy

31 December 2017 at 8:55 am

Netflix Original “Bright” well worth watching

Gritty fantasy-and-police movie with good comic touches. (Will Smith is a star, so naturally.) Not a serious movie, but an enjoyable one, and it explores racism through an extended fantasy metaphor.

Written by LeisureGuy

30 December 2017 at 8:00 pm

Posted in Movies & TV

John Portman, Architect Who Made Skylines Soar, Dies at 93

Robert McFadden writes in the NY Times:

John Portman, the architect and developer who revolutionized hotel designs with soaring futuristic atriums, built commercial towers that revitalized the downtowns of decaying postwar American cities and transformed Asian skylines from Shanghai to Mumbai, died on Friday in Atlanta. He was 93.

Mr. Portman’s family announced his death. No cause was given.

One of the world’s best-known and most influential architects, Mr. Portman, over a half-century, redefined urban landscapes in the United States. He built the Peachtree Center in Atlanta, the Embarcadero Center in San Francisco, the Renaissance Center in Detroit and scores of hotel, office and retail complexes in New York, Los Angeles, Chicago, Houston, Fort Worth, San Diego and other cities.

His buildings often evoked oohs and aahs from the public, but were not always a hit with critics, who called them concrete islands, self-contained cities within cities — serving their patrons yet insular, even forbidding to outsiders. But by combining architectural talents with the savvy of a real estate entrepreneur, Mr. Portman was hugely successful and a rarity among contemporaries: both an artist and a tough businessman.

In the 1960s and ’70s, his signature hotels — skyscrapers with escarpment atriums, cantilevered balconies overlooking interiors big enough to contain the Statue of Liberty, whooshing glass elevators, waterfalls, hanging gardens and revolving rooftop restaurants — offered thrilling antidotes to the standard lot of dreary hotel lobbies, claustrophobic box elevators and shotgun corridors lined with cells for the inmates. . .

Continue reading.

Written by LeisureGuy

30 December 2017 at 1:48 pm

Posted in Art, Business, Daily life, Memes

Good quotation in “The Amber Spyglass”

Beginning of chapter 18:

O that it were possible we might

But hold some two days’ conference with the dead.

— John Webster

Written by LeisureGuy

30 December 2017 at 12:41 pm

Posted in Books

Toasted-sesame-oil mayo

Just made a batch of mayo, using 2 tablespoons toasted sesame oil with enough olive oil to make 1 cup, following this method. I thought about including 1 teaspoon soy sauce or tamari sauce, but decided not to this time. I did use two anchovies, but I skipped the 1 tablespoon of Dijon mustard (though it might be interesting to use 1 tablespoon Chinese mustard).

Very tasty, very easy, very quick. It’s unclear to me why people buy mayonnaise in the store, especially if you read the ingredients.

Written by LeisureGuy

30 December 2017 at 11:59 am

Rod Neep brush and asses’ milk shaving soap, with Rockwell 6S R3

A very pleasant shave this morning. The asses’ milk shaving soap is still excellent. (Steve of Quebec found that if you leave the lid off this soap, it loses its mojo. In contrast, Martin de Candre explicitly instructs that the lid for his soap is only for shipping and should be removed and discarded once the soap is in use.)

The Rod Neep brush shown has a coin struck in my natal year embedded in the base, which is a nice touch.

With the excellent lather, the Rockwell did an easy, pleasant, and effective job, and the splash of Bathhouse aftershave at the end was pleasant. I love the ingredients list of that aftershave, and it is in fact a good aftershave so far as I’m concerned. (If you click the photo twice, it will enlarge enough so that you can read the label.)

Written by LeisureGuy

30 December 2017 at 10:31 am

Posted in Shaving

Where pain lives

Cathryn Jakobson Ramin writes in Aeon:

For patient after patient seeking to cure chronic back pain, the experience is years of frustration. Whether they strive to treat their aching muscles, bones and ligaments through physical therapy, massage or rounds of surgery, relief is often elusive – if the pain has not been made even worse. Now a new working hypothesis explains why: persistent back pain with no obvious mechanical source does not always result from tissue damage. Instead, that pain is generated by the central nervous system (CNS) and lives within the brain itself.

I caught my first whiff of this news about eight years ago, when I was starting the research for a book about the back-pain industry. My interest was both personal and professional: I’d been dealing with a cranky lower back and hip for a couple of decades, and things were only getting worse. Over the years, I had tried most of what is called ‘conservative treatment’ such as physical therapy and injections. To date, it had been a deeply unsatisfying journey.

Like most people, I was convinced that the problem was structural: something had gone wrong with my skeleton, and a surgeon could make it right. When a neuroscientist I was interviewing riffed on the classic lyric from My Fair Lady, intoning: ‘The reign of pain is mostly in the brain,’ I was not amused. I assumed that he meant that my pain was, somehow, not real. It was real, I assured him, pointing to the precise location, which was a full yard south of my cranium.

Like practically everyone I knew with back pain, I wanted to have a spinal MRI, the imaging test that employs a 10-ft-wide donut-shaped magnet and radio waves to look at bones and soft tissues inside the body. When the radiologist’s note identified ‘degenerative disc disease’, a couple of herniated discs, and several bone spurs, I got the idea that my spine was on the verge of disintegrating, and needed the immediate attention of a spine surgeon, whom I hoped could shore up what was left of it.

Months would pass before I understood that multiple studies, dating back to the early 1990s, evaluating the usefulness of spinal imaging, had shown that people who did not have even a hint of lower-back pain exhibited the same nasty artefacts as those who were incapacitated. Imaging could help rule out certain conditions, including spinal tumours, infection, fractures and a condition called cauda equina syndrome, in which case the patient loses control of the bowel or bladder, but those diagnoses were very rare. In general, the correlation between symptoms and imaging was poor, and yet tens of thousands of spinal MRIs were ordered every year in the United States, the United Kingdom and Australia.

Very often, the next stop was surgery.  For certain conditions, such as a recently herniated disc that is pressing on a spinal nerve root, resulting in leg pain or numbness coupled with progressive weakness, or foot drop, a nerve decompression can relieve the pain. The problem is that all surgeries carry risks, and substantial time and effort is required for rehabilitation. After a year, studies show, the outcomes of patients who opt for surgery and those who don’t are approximately the same.

More invasive surgeries carry greater risks. Lumbar spinal fusion – surgery meant to permanently anchor two or more vertebrae together, eliminating any movement between them – is recognised as particularly hazardous. Even when the vertebral bones fuse properly, patients often do not get relief from the pain that sent them to the operating room. Beyond that, fusion surgery often results in ‘adjacent segment deterioration’, requiring a revision procedure.

In the US, about 80,000 spine procedures fail each year, and one in five patients returns for another operation. Typically, second, third and fourth attempts have an even lower chance of success, and patients continue to require painkillers over the long term. Even the procedures that surgeons deem successful, because the bones fuse and look perfect on a scan, are often unhelpful to patients. In one study, two years after spinal fusion, patients’ pain had barely been reduced by half, and most patients continued to use painkillers. Given such unimpressive outcomes, the cost of treating back pain is unacceptably high. Spine surgery costs a fortune, but other approaches, including epidural steroid injections, physical therapy and chiropractic treatment, are also expensive.

Including direct medical expenses and indirect expenses such as lost earnings, spine care costs the US about $100 billion a year. In the UK, that tab is about £10.6 billion (c$13.6 billion). In Australia, it’s A$1.2 billion (c$950 million). Many of these costs derive from the loss of productivity, as people take time off from work. Others result from the devastation wrought by addiction to prescription opioids. In Australia, between 1992 and 2012, prescription opioid dispensing increased 15-fold, and the cost to the Australian government increased more than 32-fold.

Pain falls into four basic categories. There’s nociceptive pain, the normally short-lived kind you feel when you accidentally slam your finger in the car door. There’s inflammatory pain, a response to damage or infection, resulting in a rush of small proteins called inflammatory cytokines to the site of the casualty. That pain has a habit of spreading, to affect everything in the vicinity. Beyond that, there’s neuropathic pain, known as ‘radiculopathy’. It results, usually, from an insult to a nerve, culminating in burning, tingling or shock-like sensations that travel the length of the affected nerve (sciatic pain is a good example).

‘As pain becomes more centralised, it becomes increasingly more difficult and less relevant to identify the initial source’

When any of those three types of pain sticks around long after the inciting injury has healed – or in the absence of any noxious stimulus – the patient can be said to be suffering from ‘central sensitisation’. Central sensitisation is a condition in which even mild injury can lead to a hyperactive and persistent response from the central nervous system.

The CNS includes the dorsal root ganglia, containing the cell bodies of sensory neurons that allow information to travel from peripheral sites to the spinal cord and the brain. The peripheral nervous system (PNS) consists of the nerves beyond the brain and the spinal cord, serving all parts of the body that the CNS does not, comprising roughly 40 miles of nerve fibres if laid out end to end.

‘As pain becomes more centralised,’ wrote Clifford Woolf, a neurologist and neurobiologist at Harvard Medical School, ‘it becomes increasingly more difficult and less relevant to identify the initial source.’

More than three centuries ago, the French philosopher, mathematician and natural scientist René Descartes advanced the heretical idea that pain was not a punishment from God, nor a test or trial to be endured, for which prayer was the only intervention. Instead, he said, pain existed as a mechanical response to physical damage. His work Treatise of Man would not be published until after he died (some say because he feared persecution by Christian authorities, for whom the threat of pain was a useful recruitment tool). But when the volume finally emerged, Descartes posited the existence of ‘hollow tubules’ that allowed messages he described as ‘animal spirits’ to travel on a dedicated somatosensory pathway, from the afflicted site to the brain. The intensity of pain, Descartes believed, rose with the severity of tissue damage. In the absence of such damage – a shattered bone, a wound, a burn – pain ought not to exist.

But of course, it did.

In the mid-1960s, two scientists, the Canadian psychologist Ronald Melzack and the British neurobiologist Patrick Wall, both then working at the Massachusetts Institute of Technology, set out to answer the question of how pain could persist in the absence of an injury. It was mostly guesswork. It would be years before neuroimaging would allow them to view the structure of a living human brain.

In their landmark article ‘Pain Mechanisms: A New Theory’ (1965), published in the journal Science, they considered the pathophysiology of chronic pain, based on post-mortem studies, surgical notes, neurofeedback and patients’ reports of their experiences. Ultimately, the two scientists described the ‘gate control theory of pain’, hypothesising that nerve cells in the spinal cord acted as gates, flipping open to allow pain messages to pass through, or closing to prevent such messages from reaching the brain. At times, the scientists posited, the gates became stuck in the open position, allowing pain messages to flow unabated. It was that last little bit – the notion that messages would travel unceasingly, from the PNS to the CNS – that sparked Clifford Woolf’s interest in how pain was generated, and how it could be silenced.

In 1983, Woolf was a young anaesthesiologist with a PhD in neurobiology. As a post-doc, he had worked in Wall’s laboratory, which by that time had moved to University College London. There he observed post-mortem cellular and molecular changes in brain tissue in subjects who had suffered from chronic pain when they were alive.

Later, he had access to high-powered neuroimaging in the form of functional magnetic resonance imaging, or fMRI. This neuroimaging could measure changes in the brain’s blood flow, volume, oxygen or glucose mechanism, allowing Woolf to see how the brain responded to pain in a living subject. Woolf thus began to explore the many ways in which neurons in different brain regions communicate; how they form a greater number of synapses, linking regions that are not normally hot-wired to work in concert; and how those neural changes lead to the perception of pain. He saw that the regions of the brain that responded to acute, experimental pain were different from the regions that were involved in chronic pain. Over the next three decades, Woolf explored the relationship between specific gene phenotypes and chronic pain, looking for potential targets for drug therapy. It would be slow-going, in part because pharmaceutical companies were profitably selling opioid analgesics. When, in the mid-2000s, the efficacy and safety of opioids began to be questioned, Woolf’s work took on new vigour.

By then, the neuroscientist A Vania Apkarian, a professor of physiology, anaesthesiology and physical medicine at Northwestern University’s Feinberg School of Medicine in Chicago, was well into his own study of what happens to specific regions of the brain under the onslaught of chronic pain. For two decades, in his provocatively named Pain and Passions Lab, where his group works with both rodents and humans, Apkarian’s focus has been on pain’s cognitive consequences.

‘When we started this research in 1999,’ Apkarian said, ‘very few people believed that pain was more than nerves sending a signal into one part of the brain.’ With grants from the National Institute of Neurological Disorders and Stroke (part of the National Institutes of Health), Apkarian demonstrated that instead of simply responding to externally generated discomfort, under siege the brain itself would begin to generate the pain. ‘The official definition of chronic pain,’ Apkarian wrote in the journal Pain Management, ‘is that it persists past the completion of injury-related healing processes.’

Brain activity in subjects with chronic pain was different from the nociception (perception of harm) evident in patients with experimentally induced pain, for instance, a hot poker placed on a sensitive part of the arm. While nociceptive-provoked pain activated primarily sensory regions – the ones that would cause you to yank your arm out of harm’s way – Apkarian’s group observed that chronic pain activated the prefrontal cortex and the limbic regions of the brain. The prefrontal cortex dictates higher-level thinking, including goal-setting and decision-making, while the limbic regions, including the hippocampus and the nucleus accumbens, govern memory, motivation and pleasure.

In a revelation that set the international media abuzz, Apkarian’s group found that the anatomy of the human brain in patients who suffered from chronic pain was abnormal. In those who had suffered for five years, both the hippocampus and the prefrontal cortex were structurally transformed, sacrificing 5 to 11 per cent of their grey matter density. That was important because the prefrontal cortex, in concert with the hippocampus, dictates how optimistic or depressed patients feel about their prospects, how well they can cope and make decisions about treatment. There’s still a great deal of work to do in this area but, wrote Apkarian, ‘the concept is that the continued, unrelenting pain impacts limbic structures in the brain that in turn entrain the cortex to reflect both the suffering and coping strategies that develop in chronic-pain patients.’ . . .

Continue reading. There’s quite a bit more, and an interesting payoff.

The piece seems to be an extract from a book: Crooked: Outwitting the Back Pain Industry and Getting on the Road to Recovery, by Cathryn Jakobson Ramin.

Written by LeisureGuy

29 December 2017 at 11:21 am

Anatomy of a Trump tweet

In the NY Times Michael Grynbaum analyzes a Trump tweet:

President Trump, vacationing in Palm Beach, Fla., was relatively subdued on Twitter this week — at least until Thursday, when he launched a somewhat arcane broadside at the magazine Vanity Fair. The tweet, dense by Trumpian standards, was a bit confounding, given its highly specific references to rumors and tensions within the Manhattan magazine industry, and a minor Twitter kerfuffle involving a 63-second video about Hillary Clinton.

Phew. Let’s break this down.

1. Vanity Fair: Glossy monthly that has long been a thorn in Mr. Trump’s side; its recently retired editor, Graydon Carter, coined the epithet “short-fingered vulgarian” for the president. On at least 20 occasions, Mr. Trump has tweeted that the magazine is struggling, hence “on its last legs.”

2. apologizing for the minor hit: Last Friday, Vanity Fair’s site The Hive posted a video in which staff members suggested New Year’s resolutions for Hillary Clinton, like they did for a number of people — including the president.

The jokes were not kind; Mrs. Clinton was urged, for instance, to take up a new hobby — “volunteer work, knitting, improv comedy, literally anything that’ll keep you from running again.” This did not sit well among Mrs. Clinton’s fans, including the actress Patricia Arquette, who tweeted at Vanity Fair to “STOP TELLING WOMEN” what “THEY SHOULD DO OR CAN DO.” The hashtag #CancelVanityFair trended.

On Wednesday, Vanity Fair said the video “was an attempt at humor and we regret that it missed the mark.” Whether a 14-word statement amounts to “bending over backwards” is up to interpretation.

3. Anna Wintour:  . . .

Continue reading.

Written by LeisureGuy

29 December 2017 at 9:28 am

Encryption Lava Lamps

Algorithms cannot produce truly random numbers. The best an algorithm can do is produce pseudo-random numbers: a deterministic sequence whose pattern eventually repeats in a (very long) cycle. For true randomness, one must look to the physical world: radioactive decay, for example, or… lava lamps.
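
A minimal Python sketch of the contrast (the generator below uses the classic textbook LCG constants; it illustrates determinism only and is not anything Cloudflare actually runs):

```python
import os

def lcg(seed, a=1103515245, c=12345, m=2**31):
    """A linear congruential pseudo-random generator: fully
    deterministic, so the same seed always yields the same sequence."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

gen = lcg(seed=42)
first_run = [next(gen) for _ in range(5)]

gen2 = lcg(seed=42)
second_run = [next(gen2) for _ in range(5)]

# Identical seeds give identical "random" numbers -- the sequence is
# reproducible, which is exactly what an attacker hopes for.
assert first_run == second_run

# For unpredictable bytes, ask the operating system, which mixes in
# physical entropy (interrupt timing, hardware noise, and so on):
unpredictable = os.urandom(16)
print(first_run)
print(unpredictable.hex())
```

Run the script twice: the LCG output is identical both times, while the `os.urandom` bytes differ, which is the whole point of feeding physical chaos like lava lamps into the entropy pool.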

Atlas Obscura reports:

What’s encrypting your web traffic as you surf the internet? An advanced algorithm created by a supercomputer? Actually, if the site you’re visiting is encrypted by the cybersecurity firm Cloudflare, your activity may be protected by nothing other than a wall of lava lamps. There couldn’t possibly be a groovier way to keep the internet secure.

Cloudflare covers about 10 percent of international web traffic, including the websites for Uber, OKCupid, or FitBit, for instance. And the colorful wall of lava lamps in the company’s San Francisco headquarters might be what’s generating the random code. The wall features over 100 lava lamps, spanning a variety of colors, and its random patterns deter hackers from accessing data.

As the lava lamps bubble and swivel, a video camera on the ceiling monitors their unpredictable changes and connects the footage to a computer, which converts the randomness into a virtually unhackable code.

Why use lava lamps for encryption instead of computer-generated code? Since computer codes are created by machines with relatively predictable patterns, it is entirely possible for hackers to guess their algorithms, posing a security risk. Lava lamps, on the other hand, add to the equation the sheer randomness of the physical world, making it nearly impossible for hackers to break through.

While you might think that such an important place would be kept secret and locked off from the public, it’s actually possible for visitors to witness these lava lamps in person. Simply enter the lobby of Cloudflare’s San Francisco headquarters and ask to see the lava lamp display. . .

Continue reading.

Written by LeisureGuy

29 December 2017 at 9:06 am

Posted in Technology

Fine brush, Fine slant, fine shave

I tested the prototype of this razor, and I have to say the production version is noticeably nicer: it’s more highly polished, and the grooving on the handle is crisp and grippy. The baseplate comes off easily but still allows no play, and the threads are extremely nice when screwing the handle on and off: the feel of precision.

But first, as always, prep. I went with Phoenix Artisan’s Honeysuckle shaving soap because it’s a cold, dark, rainy morning and the fragrance of flowers is most welcome—plus I really like how it makes my skin feel.

The razor is excellent. I used a Derby Extra blade for the first shave, and I did get a couple of small nicks, but I put that down to still learning the razor. The overall feel is very comfortable, and it is indeed efficient. The design goal was to match the vintage Merkur white bakelite razor in feel and performance, and I would say the razor is a success. I’ll comment further as I gain more experience, but I’m happy with it.

A good splash of Honeysuckle aftershave and we advance inexorably toward the new year.

Written by LeisureGuy

29 December 2017 at 8:54 am

Posted in Shaving

Looking at a problem from a new angle can have enormous benefits: Example

From Quora:

Written by LeisureGuy

28 December 2017 at 9:29 pm

Hmm. Trump: Even if there was collusion with Russia, ‘it’s not a crime’

I find that “even if” pretty much a dead giveaway, as anyone who has kids knows. I’d say Trump thinks something is coming out, particularly with the White House explicitly poised to attack Flynn as a liar, a man of low moral character, whatever: the same Flynn who was once so highly praised and admired, and whom the president even tried to protect. Calling him a liar after having so obviously trusted him is another dead giveaway.

Written by LeisureGuy

28 December 2017 at 8:48 pm

Life expectancy in US down for second year in a row as opioid crisis deepens

Update: See also Kevin Drum’s post “What’s Really Causing the Decline in US Life Expectancy? It’s Not Opioid Overdoses.”

Jessica Glenza reports in the Guardian:

Life expectancy in the US has declined for the second year in a row as the opioid crisis continues to ravage the nation.

It is the first time in half a century that there have been two consecutive years of declining life expectancy.

Drug overdoses killed 63,600 Americans in 2016, an increase of 21% over the previous year, researchers at the National Center for Health Statistics found.

Americans can now expect to live 78.6 years, a decrease of 0.1 years. The US last experienced two years’ decline in a row in 1963, during the height of the tobacco epidemic and amid a wave of flu.

“We do occasionally see a one-year dip, even that doesn’t happen that often, but two years in a row is quite striking,” said Robert Anderson, chief of the mortality statistics branch with the National Center for Health Statistics. “And the key driver of that is the increase in drug overdose mortality.”

Especially disconcerting, said Anderson, was preliminary data researchers received about overdoses in 2017: “It doesn’t look any better.” Together, the drug overdose epidemic and a plateau in improved mortality rates from cardiovascular disease are “affecting the entire national picture”.

“We haven’t seen more than two years in a row in declining life expectancy since the Spanish flu – 100 years ago,” said Anderson. “We would be entering that sort of territory, which is extremely concerning.”

Widely available prescription painkillers . . .

Continue reading.

Trump has tried to get some PR mileage from doing things that garner publicity but do nothing whatsoever to address the problem. So it’s getting worse. Big surprise.

Not a great time to try to take healthcare insurance away from millions.

Again, an obvious and glaring sign of the decline of the United States: It can’t even take care of its own citizens—or, more accurately, it won’t take care of its own citizens. I would guess that’s because of severe inequality: the ruling class of the very wealthiest Americans now feel totally separate from the rest of the country and really don’t care what happens to it so long as they can drain even more money from the public treasury. Avaricious greed knows no bounds (cf. Donald J. Trump).

Update: Another sign of decline: “FDA lacks “efficient and effective” food recall process, inspector general finds.”

Written by LeisureGuy

28 December 2017 at 1:25 pm

Yuval Harari, author of “Sapiens: A Brief History of Humankind,” has a new book

leave a comment »

Here are the opening paragraphs of Homo Deus: A Brief History of Tomorrow:

At the dawn of the third millennium, humanity wakes up, stretching its limbs and rubbing its eyes. Remnants of some awful nightmare are still drifting across its mind. ‘There was something with barbed wire, and huge mushroom clouds. Oh well, it was just a bad dream.’ Going to the bathroom, humanity washes its face, examines its wrinkles in the mirror, makes a cup of coffee and opens the diary. ‘Let’s see what’s on the agenda today.’

For thousands of years the answer to this question remained unchanged. The same three problems preoccupied the people of twentieth-century China, of medieval India and of ancient Egypt. Famine, plague and war were always at the top of the list. For generation after generation humans have prayed to every god, angel and saint, and have invented countless tools, institutions and social systems – but they continued to die in their millions from starvation, epidemics and violence. Many thinkers and prophets concluded that famine, plague and war must be an integral part of God’s cosmic plan or of our imperfect nature, and nothing short of the end of time would free us from them.

Yet at the dawn of the third millennium, humanity wakes up to an amazing realisation. Most people rarely think about it, but in the last few decades we have managed to rein in famine, plague and war. Of course, these problems have not been completely solved, but they have been transformed from incomprehensible and uncontrollable forces of nature into manageable challenges. We don’t need to pray to any god or saint to rescue us from them. We know quite well what needs to be done in order to prevent famine, plague and war – and we usually succeed in doing it.

True, there are still notable failures; but when faced with such failures we no longer shrug our shoulders and say, ‘Well, that’s the way things work in our imperfect world’ or ‘God’s will be done’. Rather, when famine, plague or war break out of our control, we feel that somebody must have screwed up, we set up a commission of inquiry, and promise ourselves that next time we’ll do better. And it actually works. Such calamities indeed happen less and less often. For the first time in history, more people die today from eating too much than from eating too little; more people die from old age than from infectious diseases; and more people commit suicide than are killed by soldiers, terrorists and criminals combined. In the early twenty-first century, the average human is far more likely to die from bingeing at McDonald’s than from drought, Ebola or an al-Qaeda attack.

Hence even though presidents, CEOs and generals still have their daily schedules full of economic crises and military conflicts, on the cosmic scale of history humankind can lift its eyes up and start looking towards new horizons. If we are indeed bringing famine, plague and war under control, what will replace them at the top of the human agenda? Like firefighters in a world without fire, so humankind in the twenty-first century needs to ask itself an unprecedented question: what are we going to do with ourselves? In a healthy, prosperous and harmonious world, what will demand our attention and ingenuity? This question becomes doubly urgent given the immense new powers that biotechnology and information technology are providing us with. What will we do with all that power?

Before answering this question, we need to say a few more words about famine, plague and war. The claim that we are bringing them under control may strike many as outrageous, extremely naïve, or perhaps callous. What about the billions of people scraping a living on less than $2 a day? What about the ongoing AIDS crisis in Africa, or the wars raging in Syria and Iraq? To address these concerns, let us take a closer look at the world of the early twenty-first century, before exploring the human agenda for the coming decades. . .

I bought it and am reading it now.

Written by LeisureGuy

28 December 2017 at 1:15 pm

Why I, like most cooks, use Diamond Crystal brand kosher salt, not Morton’s

leave a comment »

I stumbled onto the superior quality of Diamond Crystal kosher salt by accident. I had always used Morton’s kosher salt (thinking the point of kosher salt was merely to avoid iodine), but then picked up a box of Diamond Crystal and saw the light. Here’s why.

Written by LeisureGuy

28 December 2017 at 12:37 pm

Autonomous killing machines are already here: We call them “corporations”

leave a comment »

Ted Chiang has an article worth reading in Buzzfeed. Here’s who he is:

Ted Chiang is an award-winning writer of science fiction. Over the course of 25 years and 15 stories, he has won numerous awards including four Nebulas, four Hugos, four Locuses, and the John W. Campbell Award for Best New Writer. The title story from his collection, Stories of Your Life and Others, was adapted into the movie Arrival, starring Amy Adams and directed by Denis Villeneuve. He freelances as a technical writer and currently resides in Bellevue, Washington, and is a graduate of the Clarion Writers Workshop.

The article begins:

This summer, Elon Musk spoke to the National Governors Association and told them that “AI is a fundamental risk to the existence of human civilization.” Doomsayers have been issuing similar warnings for some time, but never before have they commanded so much visibility. Musk isn’t necessarily worried about the rise of a malicious computer like Skynet from The Terminator. Speaking to Maureen Dowd for a Vanity Fair article published in April, Musk gave an example of an artificial intelligence that’s given the task of picking strawberries. It seems harmless enough, but as the AI redesigns itself to be more effective, it might decide that the best way to maximize its output would be to destroy civilization and convert the entire surface of the Earth into strawberry fields. Thus, in its pursuit of a seemingly innocuous goal, an AI could bring about the extinction of humanity purely as an unintended side effect.

This scenario sounds absurd to most people, yet there are a surprising number of technologists who think it illustrates a real danger. Why? Perhaps it’s because they’re already accustomed to entities that operate this way: Silicon Valley tech companies.

Consider: Who pursues their goals with monomaniacal focus, oblivious to the possibility of negative consequences? Who adopts a scorched-earth approach to increasing market share? This hypothetical strawberry-picking AI does what every tech startup wishes it could do — grows at an exponential rate and destroys its competitors until it’s achieved an absolute monopoly. The idea of superintelligence is such a poorly defined notion that one could envision it taking almost any form with equal justification: a benevolent genie that solves all the world’s problems, or a mathematician that spends all its time proving theorems so abstract that humans can’t even understand them. But when Silicon Valley tries to imagine superintelligence, what it comes up with is no-holds-barred capitalism.


In psychology, the term “insight” is used to describe a recognition of one’s own condition, such as when a person with mental illness is aware of their illness. More broadly, it describes the ability to recognize patterns in one’s own behavior. It’s an example of metacognition, or thinking about one’s own thinking, and it’s something most humans are capable of but animals are not. And I believe the best test of whether an AI is really engaging in human-level cognition would be for it to demonstrate insight of this kind.

Insight is precisely what Musk’s strawberry-picking AI lacks, as do all the other AIs that destroy humanity in similar doomsday scenarios. I used to find it odd that these hypothetical AIs were supposed to be smart enough to solve problems that no human could, yet they were incapable of doing something most every adult has done: taking a step back and asking whether their current course of action is really a good idea. Then I realized that we are already surrounded by machines that demonstrate a complete lack of insight, we just call them corporations. Corporations don’t operate autonomously, of course, and the humans in charge of them are presumably capable of insight, but capitalism doesn’t reward them for using it. On the contrary, capitalism actively erodes this capacity in people by demanding that they replace their own judgment of what “good” means with “whatever the market decides.”

Because corporations lack insight, we expect the government to provide oversight in the form of regulation, but the internet is almost entirely unregulated. Back in 1996, John Perry Barlow published a manifesto saying that the government had no jurisdiction over cyberspace, and in the intervening two decades that notion has served as an axiom to people working in technology. Which leads to another similarity between these civilization-destroying AIs and Silicon Valley tech companies: the lack of external controls. If you suggest to an AI prognosticator that humans would never grant an AI so much autonomy, the response will be that you fundamentally misunderstand the situation, that the idea of an ‘off’ button doesn’t even apply. It’s assumed that the AI’s approach will be “the question isn’t who is going to let me, it’s who is going to stop me,” i.e., the mantra of Ayn Randian libertarianism that is so popular in Silicon Valley.

The ethos of startup culture could serve as a blueprint for civilization-destroying AIs. “Move fast and break things” was once Facebook’s motto; they later changed it to “Move fast with stable infrastructure,” but they were talking about preserving what they had built, not what anyone else had. This attitude of treating the rest of the world as eggs to be broken for one’s own omelet could be the prime directive for an AI bringing about the apocalypse. When Uber wanted more drivers with new cars, its solution was to persuade people with bad credit to take out car loans and then deduct payments directly from their earnings. They positioned this as disrupting the auto loan industry, but everyone else recognized it as predatory lending. The whole idea that disruption is something positive instead of negative is a conceit of tech entrepreneurs. If a superintelligent AI were making a funding pitch to an angel investor, converting the surface of the Earth into strawberry fields would be nothing more than a long overdue disruption of global land use policy.

There are industry observers talking about the need for AIs to have a sense of ethics, and some have proposed that we ensure that any superintelligent AIs we create be “friendly,” meaning that their goals are aligned with human goals. I find these suggestions ironic given that we as a society have failed to teach corporations a sense of ethics, that we did nothing to ensure that Facebook’s and Amazon’s goals were aligned with the public good. But I shouldn’t be surprised; the question of how to create friendly AI is simply more fun to think about than the problem of industry regulation, just as imagining what you’d do during the zombie apocalypse is more fun than thinking about how to mitigate global warming.

There have been some impressive advances in AI recently, like  . . .

Continue reading.

Written by LeisureGuy

28 December 2017 at 10:52 am

The near future: Autonomous killerbots from unknown sources

leave a comment »

This was all discussed in Daniel Suarez’s excellent tech-sci-fi novel Kill Decision: swarms of small (and inexpensive) autonomous killerbots. Here’s another view, via Jason Kottke, from a post worth reading.

Written by LeisureGuy

28 December 2017 at 10:01 am

A two-shave post: yesterday’s (102) and today’s (Old Type)

with one comment

Above is yesterday’s shave, and you’ll note a nice touch: the lid of the tub of soap is tilted to align with the tilt of the 102’s handle. It was an accident, but I like it.

I realized as I splashed on the aftershave I should have used Barrister & Mann’s Reserve Spice instead of their Reserve Classic, the better to match the spice theme of the soap.

The 102 is a truly great razor, IMO.

A tobacco theme, using Phoenix Artisan’s Cavendish aftershave splash to match the HTGAM Cavendish shaving soap from some years back. (Phoenix Artisan does offer a Cavendish soap now, in a regular tub, and I would guess it’s an improved formula.)

Still, this was a fine shave. I didn’t shake out the synthetic Edwin Jagger quite enough, but got a fine lather nonetheless. The large (5″ diameter) work surface offered by the soap helped.

The RazoRock Old Type is another superb razor: extremely comfortable, reluctant to nick, and highly efficient.

Yesterday, what with one thing and another, I simply forgot to post the shave.

Written by LeisureGuy

28 December 2017 at 9:13 am

Posted in Shaving
