Later On

A blog written for those whose interests more or less match mine.

Archive for the ‘Books’ Category

The Town That Went Feral

leave a comment »

In the New Republic, Patrick Blanchfield reviews a brief history of an effort to put libertarianism into practice in Grafton, NH. (Like all previous attempts, it was an utter failure, and for the same reason: a reliance on mere logic, with no consideration given to experience — and as Oliver Wendell Holmes Jr. observed, “The life of the law has not been logic; it has been experience.”)

The review begins:

In its public-education campaigns, the U.S. National Park Service stresses an important distinction: If you find yourself being attacked by a brown or grizzly bear, YES, DO PLAY DEAD. Spread your arms and legs and cling to the ground with all your might, facing downward; after a few attempts to flip you over (no one said this would be easy), the bear will, most likely, leave. By contrast, if you find yourself being attacked by a black bear, NO, DO NOT PLAY DEAD. You must either flee or, if that’s not an option, fight it off, curved claws and 700 psi-jaws and all.

But don’t worry—it almost never comes to this. As one park service PSA noted this summer, bears “usually just want to be left alone. Don’t we all?” In other words, if you encounter a black bear, try to look big, back slowly away, and trust in the creature’s inner libertarian. Unless, that is, the bear in question hails from certain wilds of western New Hampshire. Because, as Matthew Hongoltz-Hetling’s new book suggests, that unfortunate animal may have a far more aggressive disposition, and relate to libertarianism first and foremost as a flavor of human cuisine.

Hongoltz-Hetling is an accomplished journalist based in Vermont, a Pulitzer nominee and George Polk Award winner. A Libertarian Walks Into a Bear: The Utopian Plot to Liberate an American Town (and Some Bears) sees him traversing rural New England as he reconstructs a remarkable, and remarkably strange, episode in recent history. This is the so-called Free Town Project, a venture wherein a group of libertarian activists attempted to take over a tiny New Hampshire town, Grafton, and transform it into a haven for libertarian ideals—part social experiment, part beacon to the faithful, Galt’s Gulch meets the New Jerusalem. These people had found one another largely over the internet, posting manifestos and engaging in utopian daydreaming on online message boards. While their various platforms and bugbears were inevitably idiosyncratic, certain beliefs united them: that the radical freedom of markets and the marketplace of ideas was an unalloyed good; that “statism” in the form of government interference (above all, taxes) was irredeemably bad. Left alone, they believed, free individuals would thrive and self-regulate, thanks to the sheer force of “logic,” “reason,” and efficiency. For inspirations, they drew upon precedents from fiction (Ayn Rand loomed large) as well as from real life, most notably a series of micro-nation projects ventured in the Pacific and Caribbean during the 1970s and 1980s.

None of those micro-nations, it should be observed, panned out, and things in New Hampshire don’t bode well either—especially when the humans collide with a newly brazen population of bears, themselves just “working to create their own utopia,” property lines and market logic be damned. The resulting narrative is simultaneously hilarious, poignant, and deeply unsettling. Sigmund Freud once described the value of civilization, with all its “discontents,” as a compromise product, the best that can be expected from mitigating human vulnerability to “indifferent nature” on one hand and our vulnerability to one another on the other. Hongoltz-Hetling presents, in microcosm, a case study in how a politics that fetishizes the pursuit of “freedom,” both individual and economic, is in fact a recipe for impoverishment and supercharged vulnerability on both fronts at once. In a United States wracked by virus, mounting climate change, and ruthless corporate pillaging and governmental deregulation, the lessons from one tiny New Hampshire town are stark indeed.


“In a country known for fussy states with streaks of independence,” Hongoltz-Hetling observes, “New Hampshire is among the fussiest and the streakiest.” New Hampshire is, after all, the Live Free or Die state, imposing neither an income nor a sales tax, and boasting, among other things, the highest per capita rate of machine gun ownership. In the case of Grafton, the history of Living Free—so to speak—has deep roots. The town’s Colonial-era settlers started out by ignoring “centuries of traditional Abenaki law by purchasing land from founding father John Hancock and other speculators.” Next, they ran off Royalist law enforcement, come to collect lumber for the king, and soon discovered their most enduring pursuit: the avoidance of taxes. As early as 1777, Grafton’s citizens were asking their government to be spared taxes and, when they were not, just stopped paying them.

Nearly two and a half centuries later, Grafton has become something of a magnet for seekers and quirky types, from adherents of the Unification Church of the Reverend Sun Myung Moon to hippie burnouts and more. Particularly important for the story is one John Babiarz, a software designer with a Krusty the Klown laugh, who decamped from Big-Government-Friendly Connecticut in the 1990s to homestead in New Hampshire with his equally freedom-loving wife, Rosalie. Entering a sylvan world that was, Hongoltz-Hetling writes, “almost as if they had driven through a time warp and into New England’s revolutionary days, when freedom outweighed fealty and trees outnumbered taxes,” the two built a new life for themselves, with John eventually coming to head Grafton’s volunteer fire department (which he describes as a “mutual aid” venture) and running for governor on the libertarian ticket.

Although John’s bids for high office failed, his ambitions remained undimmed, and in 2004 he and Rosalie connected with . . .

Continue reading.

Written by Leisureguy

26 November 2022 at 6:15 pm

Digital Books wear out faster than Physical Books

leave a comment »

I have experienced for myself the greater longevity of physical books. Brewster Kahle writes at the Internet Archive Blogs:

Ever try to read a physical book passed down in your family from 100 years ago? Probably worked well. Ever try reading an ebook you paid for 10 years ago? Probably a different experience. From the leasing business model of mega publishers to physical device evolution to format obsolescence, digital books are fragile and threatened.

For those of us tending libraries of digitized and born-digital books, we know that they need constant maintenance—reprocessing, reformatting, re-invigorating or they will not be readable or read. Fortunately this is what libraries do (if they are not sued to stop it). Publishers try to introduce new ideas into the public sphere. Libraries acquire these and keep them alive for generations to come.

And, to serve users with print disabilities, we have to keep up with the ever-improving tools they use.

Mega-publishers are saying electronic books do not wear out, but this is not true at all. The Internet Archive processes and reprocesses the books it has digitized as new optical character recognition technologies come around, as new text understanding technologies open new analysis, as formats change from djvu to daisy to epub1 to epub2 to epub3 to pdf-a and on and on. This work takes thousands of computer-months and programmer-years. This is what libraries have signed up for—our long-term custodial roles.

Also, the digital media they reside on changes, too—from Digital Linear Tape to PATA hard drives to SATA hard drives to SSDs. If we do not actively tend our digital books they become unreadable very quickly.

Then there are cataloging and metadata to maintain. If we do not keep up with the ever-changing expectations of digital learners, then our books will not be found. This is ongoing and expensive.

Our paper books have lasted hundreds of years on our shelves and are still readable. Without active maintenance, we will be lucky if our digital books last a decade.

Also, how we use books and periodicals . . .

Continue reading.

Written by Leisureguy

23 November 2022 at 5:34 pm

The library of Alexandria and its reputation

leave a comment »

Peter Gainsford writes in the Kiwi Hellenist:

Many people are aware that the library of Alexandria is hugely overblown. Sure, there’ll always be people insisting that it was a magical place that held the secrets of Göbekli Tepe, Doggerland, and blond blue-eyed Europeans building pyramids in Mexico and Bolivia: there’s no point engaging with people like that. The thing is, pretty much everyone has heard of it.

Last week the History subreddit paid some attention to a piece I wrote in 2015 dispelling some myths about the Alexandrian library. Which is nice. Some people misread it and thought I was claiming it was true that ‘the burning of the library of Alexandria was “the most destructive fire in the history of human culture”’. That’s a pity, but understandable. (One reader was angry at my claiming to be a Kiwi and a hellenist: that was entertaining.)

On a more serious note, several readers pointed out that there were other library losses in history that were far more destructive. And that’s absolutely correct. Any time books are destroyed that don’t exist in other copies in other libraries, that’s a catastrophic and irreversible loss.

You can argue about whether specific incidents belong in this category. The destruction of the House of Wisdom in Baghdad in 1258 didn’t exactly put an end to the Abbasid knowledge economy and book culture, any more than the Alexandrian fire did in hellenised Egypt.

But some tragedies really are catastrophically destructive. The fire at the . . .

Continue reading.

Written by Leisureguy

22 November 2022 at 6:53 pm

Posted in Books, Daily life, History

Why women aren’t from Venus, and men aren’t from Mars

leave a comment »

In Nature, Emily Cooke interviews Gina Rippon:

Early research into schizophrenia alerted neuroscientist Gina Rippon to what she now calls the myth of the gendered brain, a term she used in the title of her first book. By examining examples taken from brain–behaviour research during the late eighteenth and early nineteenth centuries, right up to contemporary studies, the book, published in 2019, investigates the desire to find biological explanations for gendered societal norms. Rippon argues that our brains are not fixed as male or female at birth, but are instead highly plastic, changing constantly throughout our lives and influenced by the gendered world in which we live.

The Gendered Brain: The New Neuroscience that Shatters the Myth of the Female Brain was written in part, she says, to address dubious research, or what is sometimes called neurotrash. Rippon first encountered it in the 2000s. At the time, she was working at the Aston Brain Centre, part of Aston University in Birmingham, UK. Shocked by the misuse of sex and gender reporting in neuroscience, she became set on changing the rhetoric. Rippon is now professor emeritus of cognitive neuroimaging at Aston University.

What is neurotrash?

It is what we’d generally call pseudoscience — bringing a kind of scientific legitimacy to an argument.

Early brain images were very seductive, with people thinking, ‘Brilliant, we can find the God spot,’ for instance. Images were hijacked by self-help gurus, relationship counsellors and even those espousing single-sex education. Just adding a picture of the brain in, say, book chapters on why boys and girls are different gave tremendous credibility. Also, the beginning of this century saw ‘neuro’ everything. Just put ‘neuro’ in front for that sexy science-y feel — for example, neuromarketing or neuroaesthetics.

The word neurotrash highlights misleading information: telling stories that might be partly true, sustaining stereotypes and feeding myth continuation, for example about the right brain and left brain. This is the idea that the brain is a ‘game of two halves’, when in fact the whole of your brain is working for you the whole of the time.

These stories were often well written and certainly more accessible than arcane journals. They also resonated with people’s experiences. We believed that men and women were different, and here were the scientists saying ‘you’re right, and this is why’.

How did your own research in the field take shape?

I began my career in the 1980s, and became interested in sex differences in the brain and how different regions could be better configured for various tasks — making me one of the people I subsequently criticized.

When setting up my own laboratory, I had a range of cognitive tests, such as verbal fluency tasks or visuospatial tasks, that would allegedly differentiate men from women reliably. However, over a period of 18 months I frustratingly didn’t find any differences, so became dispirited. The research made me realize that the whole right-brain, left-brain idea is based on very shaky evidence — possibly not something to hang my future research career on. So I stopped doing that sort of work and moved on, becoming involved in dyslexia research.

In 2006, shortly after I’d joined Aston, the engineer Julia King became the university’s first female vice-chancellor. She was interested in the under-representation of women in science, and wanted to know what researchers at Aston were doing that might be relevant to understanding this.

Aware that brain imaging was being used to talk publicly about neuroscience, I reviewed how the field pursued the belief in the male versus the female brain. Horrified by the discipline’s misuse, I wrote a review and started a public conversation.

At the 2010 British Science Festival, I gave a talk about the so-called differences between women’s and men’s brains, showing that, when you look at the data, they’re not that different after all. I was trying to dispel the stereotypical myths that men are ‘left-brained’ — logical, rational and good at spatial tasks — and women are ‘right-brained’ — emotional, nurturing and good at verbal tasks.

We’re not from Mars or Venus (to quote relationship counsellor John Gray’s 1992 book), we’re all from Earth! I assumed that people would thank me and just move on, but it caused an absolute furore and gave me early exposure to media backlash.

One favourite comment (now . . .

Continue reading.

Written by Leisureguy

22 November 2022 at 11:20 am

AI images for Shakespeare’s plays

leave a comment »

One image per play.

Written by Leisureguy

17 November 2022 at 5:55 pm

Posted in Art, Books, Software, Technology

How to Argue

leave a comment »

More useful videos in this post on Open Culture.

Written by Leisureguy

17 November 2022 at 5:04 pm

The complicated business legacy of GE’s Jack Welch

leave a comment »

My view is that Jack Welch not only ate away at the foundation of GE, leading to its subsequent collapse, but also inflicted the same sort of damage on capitalism itself, as a major contributor to the era of hypercapitalism.

In Fast Company, Kaushik Viswanath reviews (no paywall) William D. Cohan’s book Power Failure, a history of General Electric (and thus a close look at Jack Welch). Viswanath writes:

Plenty of ink has been spilled on General Electric, the storied 130-year-old conglomerate that had its origins with Thomas Edison. With his new book Power Failure, William D. Cohan, a journalist and author of several books on American business, adds nearly 800 pages to this corpus, with the aim of delivering the definitive and final history of the company. And there is an air of finality to it: Although GE survives today, it has fallen far from the status it held in the ’90s as the world’s most valuable company, and awaits being broken up into three separate companies next year.

Cohan tells the story chronologically, beginning with the founding of the company in the 19th century. He shines a light on a figure whose talent for business gets overshadowed in most other accounts by Edison’s genius for invention: Charles Albert Coffin. In 1883, Coffin, the CEO of a Massachusetts shoe company, bankrolled a struggling maker of dynamos in town. With Coffin’s guidance and capital, Thomson-Houston Electric Company achieved success, becoming a formidable competitor to the Edison General Electric Company. Coffin then conspired with Edison’s financial backers, including J.P. Morgan, to merge the two companies against Edison’s wishes, leaving a furious Edison with a mere 10% stake—and Coffin the president of the new General Electric Company, formed in 1892.

The early history of GE is fascinating not only for . . .

Continue reading. (no paywall)

Written by Leisureguy

17 November 2022 at 9:52 am

The complexity of J. Edgar Hoover

leave a comment »

People are more complicated than we want them to be, particularly when we are feeling judgmental. Kai Bird has a fascinating review (no paywall) of a new biography of J. Edgar Hoover in the Washington Post, which begins:

On Oct. 7, 1964, President Lyndon Johnson’s longtime aide Walter Jenkins walked into the YMCA near the White House after a party at the Newsweek magazine office and had sex in the bathroom with a homeless Army veteran. The vice squad arrested Jenkins, booked him and released him. A week later, the story made headlines on the eve of the presidential election that pitted Johnson against Republican Barry Goldwater. By then, a near-suicidal Jenkins had checked into George Washington University Hospital and the Republicans were “punching hard,” writes Beverly Gage in “G-Man,” her masterful account of the life and controversial career of FBI Director J. Edgar Hoover. The Goldwater campaign demanded to know if Jenkins’s conduct had compromised national security. Forced to act, Johnson ordered Hoover, his old friend and onetime neighbor, to investigate the scandal. Hoover was annoyed. This was politics, and for decades he had tried to insulate the FBI from partisan politics. But he did what he was told to do by his president.

It turned out that Jenkins, the father of six children, had been arrested in the same bathroom five years earlier. Johnson was astonished that Jenkins could have hidden his proclivities. Hoover was not. He thought such temptations were commonplace. Four days into the investigation he told Johnson that Jenkins had been under enormous stress and required medical attention. The FBI chief had already sent a bouquet of flowers to Jenkins’s hospital room. Attached was a sympathy card wishing him a speedy recovery. “With less than two weeks to go before the election,” Gage writes, “Hoover issued a report absolving Jenkins of any national security violations,” and on Election Day, Johnson rolled to victory in one of the nation’s biggest presidential landslides.

In Gage’s biography, Hoover emerges as a strangely tortured man who wielded power within the Justice Department for an astonishing 48 years. His response to Jenkins revealed a softer side and, Gage explains, raised an “innuendo that Hoover might have more in common with Jenkins than he wished to acknowledge.” In a memo, Hoover wrote that he liked Jenkins and felt sorry for him. “It is a pitiful case,” he observed, “and I think it is time for people to follow the admonition of the Bible about persons throwing the first stone and that none are without sin.”

Hoover’s story illustrates the unique power of biography to enter the life of another human being. The genre can provoke a rare response: It can persuade one to change one’s mind. This magical leap can happen when a good biographer is able to seduce the reader into understanding another soul. “G-Man” is Gage’s first biography, and she turns out to be a marvelous biographer.

After reading Gage, I have changed my mind about Hoover. He is not the caricature villain I thought I knew when I came of age in the turbulent 1960s. Hoover was a man of profound contradictions. While he had enough empathy to send flowers to Jenkins, he also orchestrated . . .

Continue reading. (no paywall)

Written by Leisureguy

13 November 2022 at 5:59 pm

The Gentleman’s Companion to the Toilet; or, A Treatise on Shaving

with one comment

I used Clive Thompson’s Weird Old Book Finder and searched on “shaving,” whereupon it delivered the book named above, from 1844.

I was particularly struck by this sentence in the preface:

To remove the beard pleasantly and expeditiously, (two very desirable things,) the razor requires to be held in a certain position, and moved in a certain direction, and if that is not attended to, the best razor that was ever made will not cut smoothly. 

Even in 1844 the distinction between comfort (“pleasantly”) and efficiency (“expeditiously”) was noted, along with the understanding that both are important for a good shave. 

Written by Leisureguy

12 November 2022 at 12:36 pm

Posted in Books, Daily life, Shaving

“How an 18th-Century Philosopher Helped Solve My Midlife Crisis”

leave a comment »

Alison Gopnik has a very interesting essay (no paywall) in the Atlantic:

In 2006, I was 50—and I was falling apart.

Until then, I had always known exactly who I was: an exceptionally fortunate and happy woman, full of irrational exuberance and everyday joy.

I knew who I was professionally. When I was 16, I’d discovered cognitive science and analytic philosophy, and knew at once that I wanted the tough-minded, rigorous, intellectual life they could offer me. I’d gotten my doctorate at 25 and had gone on to become a professor of psychology and philosophy at UC Berkeley.

I knew who I was personally, too. For one thing, I liked men. I was never pretty, but the heterosexual dance of attraction and flirtation had always been an important part of my life, a background thrum that brightened and sharpened all the rest. My closest friends and colleagues had all been men.

More than anything, though, I was a mother. I’d had a son at 23, and then two more in the years that followed. For me, raising children had been the most intellectually interesting and morally profound of experiences, and the happiest. I’d had a long marriage, with a good man who was as involved with our children as I was. Our youngest son was on his way to college.

I’d been able to combine these different roles, another piece of good fortune. My life’s work had been to demonstrate the scientific and philosophical importance of children, and I kept a playpen in my office long after my children had outgrown it. Children had been the center of my life and my work—the foundation of my identity.

And then, suddenly, I had no idea who I was at all.

My children had grown up, my marriage had unraveled, and I decided to leave. I moved out of the big, professorial home where I had raised my children, and rented a room in a crumbling old house. I was living alone for the first time, full of guilt and anxiety, hope and excitement.

I fell in love—with a woman, much to my surprise—and we talked about starting a new life together. And then my lover ended it.

Joy vanished. Grief took its place. I’d chosen my new room for its faded grandeur: black-oak beams and paneling, a sooty brick fireplace in lieu of central heating. But I hadn’t realized just how dark and cold the room would be during the rainy Northern California winter. I forced myself to eat the way I had once coaxed my children (“just three more bites”), but I still lost 20 pounds in two months. I measured each day by how many hours had gone by since the last crying jag (“There now, no meltdowns since 11 this morning”).

I couldn’t work. The dissolution of my own family made the very thought of children unbearable. I had won a multimillion-dollar grant to investigate computational models of children’s learning and had signed a contract to write a book on the philosophy of childhood, but I couldn’t pass a playground without tears, let alone design an experiment for 3-year-olds or write about the moral significance of parental love.

Everything that had defined me was gone. I was no longer a scientist or a philosopher or a wife or a mother or a lover.

My doctors prescribed Prozac, yoga, and meditation. I hated Prozac. I was terrible at yoga. But meditation seemed to help, and it was interesting, at least. In fact, researching meditation seemed to help as much as actually doing it. Where did it come from? Why did it work?

I had always been curious about Buddhism, although, as a committed atheist, I was suspicious of anything religious. And turning 50 and becoming bisexual and Buddhist did seem far too predictable—a sort of Berkeley bat mitzvah, a standard rite of passage for aging Jewish academic women in Northern California. But still, I began to read Buddhist philosophy.

In 1734, in Scotland, a 23-year-old was falling apart.

As a teenager, he’d thought . . .

Continue reading. (no paywall)

Written by Leisureguy

11 November 2022 at 9:23 pm

Was Kurt Vonnegut a nice man?

with 6 comments

Recently I have been thinking about regret — in particular, the statement made by some (Édith Piaf being a prime example), “I regret nothing.”

Nothing? Not one instance of being unkind? Even inadvertently? I can think of more examples than I want of hurting someone by failing to be kind, and I regret every one.

But then it struck me that “I regret nothing” is exactly the sentiment of a sociopath, particularly a narcissistic sociopath.

The above came to mind when I read Dorian Lynskey’s profile of Kurt Vonnegut in UnHerd, which begins:

In 1999, the director Baz Luhrmann had a novelty hit with “Everybody’s Free (To Wear Sunscreen)”, a spoken-word litany of whimsical advice for young people: enjoy your youth, keep your old love letters, floss, and so on. The text derived from a column by a journalist called Mary Schmich but it was widely rumoured to be from a commencement address by a celebrated author who was born 100 years ago this week: Kurt Vonnegut. Despite having quit writing two years earlier, he was still delighting students with his witty speeches, of which this appeared to be one. Vonnegut set the record straight but graciously told Schmich: “I would have been proud had the words been mine.”

Nothing illustrates an author’s reputation as clearly as misattributed work. The Sunscreen confusion proved that one of his era’s most scathing satirists had been recast as the cuddly hipster grandpa of American letters. This certainly chimed with one strand of Vonnegut’s work, which is summed up by a famous line from his 1965 novel God Bless You, Mr Rosewater, or Pearls Before Swine (“God damn it babies, you’ve got to be kind”) but that was by no means the whole picture.

Like Dolly Parton, Alan Bennett, George Michael and Anthony Bourdain, Vonnegut has become simplified into an avatar of kindness, his wrinkles ironed flat by the heat of sainthood. This happened long before his death in 2007 and he was a willing conspirator. George Saunders recently spoke about his own reputation as literature’s Mr Nice Guy and gave himself some advice: “one: don’t believe it; two, interrupt it.” The first is easier than the second. One of Vonnegut’s most famous lines is from 1961’s Mother Night: “We are what we pretend to be, so we must be careful about what we pretend to be.” Vonnegut often pretended to be nicer than he was, which was good for both his ego and his income.

If you Google Vonnegut, one of the most-asked questions that comes up is: “Was Vonnegut a nice person?” Tough one. He could certainly be warm, wise and generous, but he could also be a greedy and disloyal business partner, a selfish, unfaithful husband and a crotchety, intimidating father. He suffered from depression and suicidal ideation; his work often flirts with nihilism. Robert B. Weide’s recent documentary Kurt Vonnegut: Unstuck in Time (the title quotes Vonnegut’s masterpiece Slaughterhouse-Five) is candid about the writer’s failings as a family man but Weide, who considered Vonnegut a close friend and mentor, still sands off a lot of rough edges.

In his more objective biography And So It Goes, Charles J. Shields quotes the private notes that  . . .

Continue reading.

Written by Leisureguy

10 November 2022 at 1:06 pm

Using the Astronomicum Caesareum Book

leave a comment »

Written by Leisureguy

3 November 2022 at 3:43 pm

All Possible Plots by Major Authors

leave a comment »

For the readers, God bless them. In The Fence magazine:

We praise canonical authors for their boundless imagination. Then why do all their plots feel the same?

Anthony Trollope

Your happiness is put on hold when it transpires your fiancé failed to correctly cash a cheque. This lasts for 130 chapters. Everyone else is ordained.

Evelyn Waugh

The spectre of God’s grace haunts your attempts to hang out with people posher than yourself.

Henry James

You declined an invitation, not wishing, to wit, for it to be understood that you might have deliberately allowed yourself to be put at a disadvantage. Now your ruin is certain.

Graham Greene

Your only desire is to preserve an inconceivably small piece of your dignity. You are denied this, because of a prior engagement with the British Foreign Office. This novel is called The Way of the Thing.

W Shakespeare (I)

You become King. This turns out to have been a very big mistake.

Samuel Richardson

‘Dearest Mama, An alluring yet unvirtuous rake has designs on my Innocence and has placed me in a sack to be transported to Covent Garden. Fortunately the sack is so designed as to allow the writing of several letters.’

David Foster Wallace

. . .

Continue reading.

Written by Leisureguy

3 November 2022 at 1:49 pm

Posted in Books

Science Over Capitalism: Kim Stanley Robinson and the Imperative of Hope

leave a comment »

James Bradley’s interview with Kim Stanley Robinson is excerpted from the book Tomorrow’s Parties: Life in the Anthropocene and appears in The MIT Press Reader:

There is no question Kim Stanley Robinson is one of the most important writers working today. Across almost four decades and more than 20 novels, his scrupulously imagined fiction has consistently explored questions of social justice, political and environmental economy, and utopian possibility.

Robinson is probably best known for his Mars trilogy, which envisions the settlement and transformation of Mars over several centuries, and the ethical and political challenges of building a new society. Yet it is possible his most significant legacy will turn out to be the remarkable sequence of novels that began with “2312.” Published across less than a decade, these six books reimagine both our past and our future in startlingly new ways, emphasizing the indivisibility of ecological and economic systems and placing the climate emergency center stage.

The most recent, “The Ministry for the Future,” published in 2020, is a work of extraordinary scale and ambition. Simultaneously a deeply confronting vision of the true scale of the climate crisis, a future history of the next 50 years, and a manifesto outlining the revolutionary change that will be necessary to avert catastrophe, it is by turns terrifying, exhilarating, and finally, perhaps surprisingly, guardedly hopeful. It is also one of the most important books published in recent years.

This interview was conducted between January and March 2021, beginning in the immediate aftermath of the attack on the United States Capitol and the inauguration of President Biden, and ending as a second wave of the COVID pandemic began to gather pace in many countries around the world. As we bounced questions back and forth across the Pacific, a drumbeat of impending disaster grew louder by the day: atmospheric carbon dioxide reached 417 ppm, a level 50 percent higher than preindustrial levels; a study showed the current system responsible for the relative warmth of the Northern Hemisphere — the Atlantic meridional overturning circulation — at its weakest level in a thousand years; and Kyoto’s cherry blossoms bloomed earlier than they have at any time since records began in the ninth century CE.


James Bradley: In several of your recent novels, you’ve characterized the first few decades of the 21st century as a time of inaction and indecision — in “2312,” for instance, you called them “the Dithering” — but in “The Ministry for the Future,” you talk about the 2030s as “the zombie years,” a moment when “civilization had been killed but it kept walking the Earth, staggering toward some fate even worse than death.” I wonder whether you could talk a little bit about that idea. What’s brought us to this point? And what does it mean for a civilization to be dead?

Kim Stanley Robinson: I’m thinking now that my sense of our global civilization dithering, and also trying to operate on old ideas and systems that are clearly inadequate to the present crisis, has been radically impacted by the COVID pandemic, which I think has been somewhat of a wake-up call for everyone — showing that we are indeed in a global civilization in every important sense (food supply, for instance), and also that we are utterly dependent on science and technology to keep eight billion people alive.

So “2312” was written in 2010. In that novel, I provided a timeline of sorts, looking backward from 2312, that was notional and intended to shock, also to fill the many decades it takes to make three centuries, and in a way that got my story in place the way I wanted it. In other words, it was a literary device, not a prediction. But it’s interesting now to look back and see me describing “the Dithering” as lasting so long. These are all affect states, not chronological predictions; I think it’s very important to emphasize science fiction’s double action, as both prophecy and metaphor for our present. As prophecy, SF is always wrong; as metaphor, it is always right, being an expression of the feeling of the time of writing.

So following that, “The Ministry for the Future” was written in 2019, before the pandemic. It expresses both fears and hopes specific to 2019 — and now, because of the shock of the pandemic, it can serve as an image of “how it felt before.” It’s already a historical artifact. That’s fine, and I think it might be possible that the book can be read better now than it could have been in January 2020 when I finished it.

Now I don’t think there will be a period of “zombie years,” and certainly not the 2030s. The pandemic as a shock has sped up civilization’s awareness of the existential dangers of climate change. Now, post COVID, a fictional future history might speak of the “Trembling Twenties” as it’s described in “The Ministry for the Future,” but it also seems it will be a period of galvanized, spasmodic, intense struggle for control over history, starting right now. With that new feeling, the 2030s seem very far off and impossible to predict at all.

JB: In “The Ministry for the Future,” the thing that finally triggers change is the catastrophic heat wave that opens the book. It’s a profoundly upsetting and very powerful piece of writing, partly because an event of the sort it depicts is likely to be a reality within a decade or so. But as somebody whose country has already experienced catastrophic climate disaster in the form of fire and flood and seen little or no change in our political discourse, I found myself wondering whether the idea that such a disaster would trigger change mightn’t be too optimistic. Do you think it will take catastrophe to create real change? Or will the impetus come from elsewhere?

KSR: People are good at . . .

Continue reading.

Written by Leisureguy

28 October 2022 at 6:32 pm

“How communism got me into reading as a child”

leave a comment »

Claudia Befu writes at Story Voyager:

One of the most vivid memories I have from my childhood is barging into the house on a Saturday or Sunday afternoon and asking my mother:

‘Did it start?’

On the days when the answer was ‘It already finished a long time ago’ I started crying.

‘Why didn’t you call me?’

‘You were playing.’

‘But I wanted to see the cartoon!’

I grew up in communism, and we only had cartoons on TV on Saturday and Sunday from 1 pm to 1:05 pm. Usually, it was one episode of ‘Tom and Jerry’, ‘Bolek a Lolek’, or some other party-approved cartoon.

As I grew up and started to play outdoors with other kids from the neighborhood, I usually missed the weekly episodes, and I was devastated.

The advantage of growing up with communist TV 📺 

I am already on day 38 of my 100-day TV detox challenge, and I can’t believe how time is flying. Things have been very busy at work lately, and this newsletter filled up the gap left by not watching Netflix in my free time. I also started to meet more people and generally spend quality time with my husband.

Aside from a couple of documentaries and some TikTok and YouTube videos, I haven’t watched anything during this time.

Between weeks two and four, I automatically thought about watching a series or a movie whenever there was some unstructured time. I am surprised at how deeply ingrained watching entertainment is in my psyche. But about one week ago, my brain stopped craving series, and now I don’t think about it as often.

Besides, I can’t watch anything right now. I feel physically ill every time I think of starting a Netflix series.

How did I get to this?

This question made me go down the rabbit hole on the TV detox topic and look at my life through the TV lens.

Perhaps one of the most remarkable things is that I grew up watching very little TV.

It wasn’t by choice but by design. The communist TV diet was rationed like our food, hot water and electricity.

For example, a family of four could only buy one liter of cooking oil and half a kilogram of sugar per month. This was enough fat and sugar for the whole family for an entire month.

Hot water was dispensed twice weekly because showering every other day was more than enough. And electricity was cut for some hours during the night since everyone was sleeping anyway.

We had around seven to nine hours of TV every weekend and about two hours in the evening during the week. Of course, some TV entertainment was allowed on weekends, such as 5 minutes of cartoons or party-approved Romanian film productions.

But during the week, the two hours of TV were filled with news about the dictator.

Almost every evening, we would watch Nicolae Ceausescu pour cement into the foundation of yet another communist building while his wife observed him with a watchful eye. When he wasn’t pouring cement, he would walk through a laboratory wearing a white doctor’s coat or a factory wearing a safety helmet.

His wife, Elena Ceausescu, was always next to him, sporting her version of the ‘Thatcher’ helmet hair and her Chanel knock-off suits made in Romania.

Left without much choice, I gorged on the Encyclopaedia TV program that ran once a week, which inspired me from a very young age to become an astronaut. But, as you can conclude, the inspiration wasn’t strong enough.

This strict TV diet also had its advantages. As I grew up, my parents read a lot to us, and after I learned how to read at the ripe age of six, I started reading books myself, and I didn’t stop for the next six years.

Everyone who knew me during that time remembers me holding a book in my hand. Or a stash of books if they saw me on my way back from the library. Without a TV to distract me, I fully embraced the magic of books and developed a lifelong love for reading.

Do you doubt I read so much as a child just because I didn’t have anything age-appropriate to watch on TV?

Let me introduce you to the next chapter of my life.

The glory of capitalist TV

In the autumn of 1989, about three years after I started reading books, communism fell, and suddenly we had twelve hours of TV programs every day.

I remember watching my first . . .

Continue reading.

Written by Leisureguy

28 October 2022 at 3:25 pm

6 common errors in thinking

leave a comment »

Woo-Kyoung Ahn, John Hay Whitney Professor of Psychology at Yale University, has an excerpt from her book Thinking 101: How to Reason Better to Live Better in Inc. The excerpt begins:

WHEN I WAS a graduate student at the University of Illinois at Urbana-Champaign, doing research in cognitive psychology, our lab group went out every now and then for nachos and beers. It was a great opportunity for us to ask our adviser about things that wouldn’t likely come up in our more formal meetings. At one of those gatherings, I summoned up the courage to ask him a question that had been on my mind for some time: “Do you think cognitive psychology can make the world a better place?” I had asked a simple yes-or-no question, so he chose a simple answer: “Yes.”

Over the course of the next 30 years, I’ve tried to answer that question myself by working on problems that I hope have real-world applications. In my research at Yale University, where I’ve been a professor of psychology since 2003, I’ve examined some of the biases that can lead us astray–and developed strategies to correct them in ways that are directly applicable to situations people encounter in their daily lives.

I also saw how “thinking problems” cause troubles that go far beyond our individual concerns. These errors and biases contribute to a wide range of societal issues, including political polarization, complicity in climate change, and ethnic profiling. They can also come into play for people who run businesses–how they hire staff, interact with their colleagues, set strategies.

I introduced a course called “Thinking” to show students how psychology can help them recognize and tackle some of these real-world problems and make better decisions. Now I’ve written a book, Thinking 101, to make these lessons more widely available. And here I’m presenting a sample of the kind of material you’ll find in it.

My book is not about what is wrong with people. Thinking problems happen because we are wired in very particular ways. Reasoning errors are mostly byproducts of our highly evolved cognition, which has allowed us to survive and thrive as a species. As a result, de-biasing is notoriously challenging.

To avoid these errors in running a business, merely learning what they are and making a mental note not to commit them isn’t enough. Fortunately, there are actionable strategies you can adopt to change your thinking and help your team work better. These strategies can also help us figure out which things we can’t control, and show us how solutions that might seem promising can ultimately backfire.

1. Don’t Be Throttled by Things That Have Always Worked

From antiquity into the late 19th century, Western healers believed that if you drew out a patient’s “bad” blood when they were ill, their ailments would get better. George Washington presumably died from this treatment when his doctor drew 1.7 liters of blood to treat a throat infection. By the time Washington was born, we had already figured out that the earth is round, and Sir Isaac Newton had formulated the three physical laws of motion, but our intelligent ancestors still thought draining blood was the bomb.

Still, if we were in their situation, we might not have been much different. Picture yourself in the year 1850, with excruciating back pain. You’ve heard that in 1820, King George IV was bled 150 ounces and went on to live for another 10 years. You’ve heard that your neighbor’s insomnia was cured by bloodletting. And you’ve heard that about three quarters of people who got sick and had blood drawn got better (I am making up these numbers). So, you try bloodletting and you actually do feel better.

But here’s the catch. Suppose there are 100 people who got sick but did not have their blood drawn, and 75 of these people also got better. Now you can see that three-quarters of sick people get better whether their blood is drawn or not. But people neglected to check what happens to those who don’t follow this practice. They focused only on the confirming evidence.
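The control-group check the excerpt describes is simple arithmetic. A toy sketch, using the excerpt’s own explicitly made-up numbers:

```python
# Toy numbers from the excerpt (the author says she is making them up):
# bloodletting looks effective only until you check the unbled group.
bled_recovered, bled_total = 75, 100
unbled_recovered, unbled_total = 75, 100  # the group nobody thought to check

rate_bled = bled_recovered / bled_total
rate_unbled = unbled_recovered / unbled_total

print(f"recovered after bloodletting: {rate_bled:.0%}")
print(f"recovered without it:         {rate_unbled:.0%}")
# Identical 75% recovery either way: the "confirming evidence"
# carried no information about whether the treatment worked.
```

The point of the sketch is that confirmation bias lives in the missing second row: people collected only the first ratio and never the second.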

Confirmation bias can easily lead us to an exaggerated and invalid view of ourselves. Once we start believing that we are depressed, we may act like a depressed person, making deeply pessimistic predictions about the future and avoiding any fun–which would make anybody feel depressed. And once you start doubting your competency, you may avoid risks that could have led to greater career opportunities, and then, no surprise, your career will end up looking like you lack competency.

These vicious cycles can work at the societal level. Traditionally, almost all scientists were men. Most people who were allowed to continue in the field did a good job. Thus, we developed the notion that men are good at science. Women were hardly given a chance to prove that they could be good scientists, too. So we had little evidence that could disconfirm the belief that only men are good at science. And society continues to operate based on that assumption.

It’s not difficult to see that any stereotype based on race, age, sexual orientation, or socioeconomic background can work the same way. According to a 2020 report from Citibank, had our society invested equally in the education, housing, wages, and businesses of both White and Black Americans over the past 20 years, America would have been $16 trillion richer. If that number is too large to grasp, consider that the gross domestic product of the United States was $21.43 trillion in 2019.

2. Keep in Mind That Examples Are Just Examples

I use a lot of examples in my teaching because cognitive psychology research tells me it’s useful to do so. Vivid examples are more convincing, easier to understand, and harder to forget than decontextualized, abstract explanations. But they can lead us to ignore important statistical principles.

Take the phenomenon known as the Sports Illustrated cover jinx. Right after an individual or a team appears on the cover of Sports Illustrated, their performance will often begin to decline. The August 31, 2015, issue of SI has a cover photo of Serena Williams, looking at the ball she’d just tossed in the air to serve. The headline reads, “All Eyes on Serena: the Slam.” No sooner did the issue hit the newsstands than Serena lost in the US Open, without reaching the final. Check Wikipedia for a long list of the teams and athletes who experienced the SI cover jinx all the way back to 1954, the year the magazine launched.

If the jinx is real, why does it happen? Perhaps those who make the cover get arrogant and let their guards down. Or they might become overly anxious because of the spotlight it shines on them. But rather than blaming the athletes themselves, the jinx may be explained by a purely statistical phenomenon known as regression toward the mean.

Whether people are taking tests, or engaging in sports, music, or any other activity, random factors that affect performance always come into play, often giving a result that is better or worse than usual. Athletes are affected by playing conditions, the strength of the competition, quality of rest and eating, the bounce of the ball, variability in refereeing. Those who performed well enough to be featured on SI’s cover have likely had many random factors aligned in their favor for a stretch. But statistically, it can’t last forever, and it won’t. And when someone is playing at an extremely high level, even a little bad luck can mean a loss, and hence the jinx. That is, unusually high scores–or unusually low–tend to regress toward the average the next time one tries the same thing, whether you became arrogant or anxious or not.
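Regression toward the mean is easy to demonstrate with a small simulation. The sketch below assumes a simple performance = skill + luck model (my assumption for illustration, not anything from the book): select the top 1 percent on one outing, then watch their average fall on the next outing with fresh luck.

```python
# Illustrative simulation of regression toward the mean.
# Model assumption: observed performance = stable skill + random luck.
import random

random.seed(0)
N = 100_000
skill = [random.gauss(0, 1) for _ in range(N)]
perf1 = [s + random.gauss(0, 1) for s in skill]  # first outing
perf2 = [s + random.gauss(0, 1) for s in skill]  # second outing, fresh luck

# "Cover athletes": top 1% of performers on the first outing.
cutoff = sorted(perf1)[int(0.99 * N)]
top = [i for i in range(N) if perf1[i] >= cutoff]

avg1 = sum(perf1[i] for i in top) / len(top)
avg2 = sum(perf2[i] for i in top) / len(top)
print(f"selected group, first outing:  {avg1:.2f}")
print(f"same group, second outing:     {avg2:.2f}")  # noticeably lower
```

No jinx is built into the model: the same people, with the same skill, score lower on average the second time simply because the luck that helped select them does not repeat.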

The regression fallacy can happen in job interviews and auditions, and this is where the power of specific examples can be problematic. Many hiring decisions are made after face-to-face interviews. Those who have made the short list have already passed a threshold, so there is not much variance among the candidates, meaning that random factors can be enough to shift the final decisions. Many things can go well or badly for the candidates during an interview, and many of them are out of their control. The interviewer could be in a bad mood because of the news they heard in their car on the way to work. I know of one candidate who showed up with mismatched shoes because they happened to be lying next to each other when she was rushing out of the house; just imagine how self-conscious she must have been throughout the interview.

On top of all these random factors, the inherent problem with these encounters is that interviewers observe only a thin slice of the person’s performance. And this impression drawn from that particular day can make the decision-makers ignore the records that reflect the candidate’s skills over many years. A person who looks brilliant during an interview may not be as awesome once they are hired. And the candidate who was nervous because of her mismatched shoes could turn out to be the big catch the company missed. Given regression toward the mean, that is what we should expect.

But how can we avoid committing the regression fallacy ourselves? What should interviewers do, for instance? If possible, the most straightforward method would be to evaluate candidates solely on the basis of their résumés.

Doing away with job interviews might not be feasible for hiring decisions that require you to see the candidate in action. Résumés and recommendation letters may feel too impersonal and vague; we may believe that we can make a much better decision if we can set our eyes on the real person even for a brief moment. The problem is that once we do, it is hard to keep that one impression from overly affecting us. We just need to remind ourselves of the regression toward the mean, and make multiple observations of applicants. It takes more time and effort to see them in different settings, but in the end, it might be cheaper and easier than hiring the wrong person.

3. . .

Continue reading.

Written by Leisureguy

24 October 2022 at 12:15 pm

Posted in Books, Daily life, Education

A student asked her cosmology professor the meaning of life. Here was his response.

leave a comment »

  • Brian Thomas Swimme is a professor in the Philosophy, Cosmology, and Consciousness (PCC) Department at CIIS in San Francisco, CA.
  • In this excerpt of his book Cosmogenesis: An Unveiling of the Expanding Universe, Swimme recounts a time one of his students asked him about the meaning of life.

From Big Think:

I had just finished my lecture on Einstein’s special theory of relativity. The mathematical equations for one of his basic ideas, the so-called invariance of the space-time interval, filled the blackboards. I still had twenty minutes to spare. Perhaps I had galloped through the details too fast. I tended to overprepare for this course since it was loaded with some of the best students on campus, including Oona Fitzgerald who had scored a perfect 1600 on her SATs.

We were on the fourth floor of Thompson Hall, which had earned the nickname “the Boeing complex” because of the close relationship the corporation had established with the Departments of Chemistry, Physics, and Mathematics. Over the years, a significant number of professors and students had worked there. The Seattle-based company had funded part of Thompson Hall’s construction when the demand to maintain the university’s English Gothic architecture had led to extraordinary cost overruns.

I could have ended the class right there. My quota of chalk had already been transformed into the mathematical equations I had written out. I dropped the three leftover stubs into the wire-mesh holder at the corner of the blackboard and opened the class for questions. Oona Fitzgerald raised her hand, her round, freckled face beaming. “What’s the meaning of life?” she asked. This evoked some tentative laughter, and she smiled as if she might be joking. But after glancing around, she faced me again and waited. It would have been simple enough to avoid her question with a light remark, but I wanted to honor her sincerity. The bit of courage I needed came when I remembered Dr. Barker’s response to the same question I myself had asked a few years earlier in my quantum mechanics course. His irritated reply—“Science doesn’t deal with meaning”—left me feeling foolish. As if no real scientist would ask such a question. Only an amateurish pretender. Years later, and his words were still with me.

As I leaned back on my desk and reflected on Oona’s question, the strangest feeling arose. The students could see I had taken the question to heart. The mood in the room shifted. A tingling grew inside me. It was as if, unknown to me, I had been waiting for this, and yet I felt like a criminal faced with a forbidden act, something that should be avoided but that was too alluring to ignore.

I told the students what I thought was an important truth, that almost none of us knew our true identity. Just as amazing, we forgot that we did not know our true identity. This strange situation came from the tiny worlds in which we lived. We thought of ourselves as Americans or Chinese, as Republicans or Democrats, as believers or atheists. Each of those identities might be true, but each is secondary truth. There is a deeper truth. We are universe. The universe made us. In a most primordial way, we are cosmological beings.

Then I said it.

“To take this in, you need to ride inside the mathematical symbols.”

I did not know what I meant by saying you need to ride inside the mathematical symbols. I just said it.

“Begin with the primal light discovered in 1964 by Penzias and Wilson. This light, this cosmic microwave background radiation, arrives here from all directions. We know that each of these photons comes from a place near the origin of the cosmos, so if we trace these particles of light backward we are led to the birthplace of the universe. Which means, since this light comes from all directions, that we have discovered our origin in a colossal sphere of light. This colossal sphere, fourteen billion light-years away from us in every direction, is the origin of our universe. And thus the origin of each of us.”

I held out my arms as if clutching a gigantic ball.

“We can speculate about  . . .

Continue reading.

Written by Leisureguy

20 October 2022 at 9:01 pm

Why Do Americans Own More Guns Per Capita Than Anyone Else?

leave a comment »

In Nautilus Brian Gallagher does a one-question interview (no paywall) with Jennifer Carlson, a 2022 MacArthur Grant-winning sociologist at the University of Arizona and author of the forthcoming book Merchants of the Right: Gun Sellers and the Crisis of American Democracy.

Why do Americans own more guns per capita than anyone else?

The legal structure makes it possible. The social structure makes it urgent. If you talk to people who own and carry guns, their number one reason for doing so is self-protection. This is really clear if you walk into a gun store and start talking to people. It’s very clear from the survey data. That’s actually historically new. Even as recently as the 1990s, people were saying hunting was the number one reason they owned guns. That’s not to say protection wasn’t an element before, but that it’s so central to defining what it means to own and carry a gun now is really important.

I write about this in my book Citizen Protectors. The politics of guns became reconfigured under what’s been called the “war on crime,” this central focus on crime as a dominant problem in American society—immigration, poverty, and so on, become a problem of crime. On guns, if you look back to the 1960s, there’s this survey data that I always refer to, and it’s the question of, “Should handguns be banned in the United States?” It’s a litmus test of the place that guns occupy in the American imaginary. Handguns are both the self-defense gun of choice, but also the dominant crime gun. The 1960s were the last time that more people responding to the survey questions said that they supported a ban over opposing a ban.

Now 75 percent of Americans oppose this ban, which gets us to what happened in 2020, which is that part of what has happened under the war on crime is that a lot of resources in the US got invested into the criminal justice system, policing, prisons, and what have you. And at the same time social supports, welfare, all sorts of entitlements, got rolled back. You have this moment where you have this social safety net receding. So when we think about, “What is the appeal of guns?” Well, guns are that last remaining safety net for a lot of people. Even gun sellers that I interviewed were like, “Yeah, we know you can’t shoot a virus.” They joked about that. And said, “This is the only guarantee that people have.”

When 2020 happened, it became . . .

Continue reading. (no paywall)

Written by Leisureguy

18 October 2022 at 11:51 am

List of common misconceptions

leave a comment »

Written by Leisureguy

16 October 2022 at 5:55 am

Why I reactivated the function on my smartwatch that tells me that I’ve been sitting too long and it’s time to move around a bit

with one comment

I’m gradually working my way through the book Undo It!, by Dean Ornish MD and Anne Ornish. Ornish, like Greger, endorses lifestyle medicine to prevent and treat chronic diseases since research has shown that this approach is in general more effective and less costly than attempting to treat the diseases with medications.

For one thing, changes in lifestyle have only benign side effects, whereas some of the side effects of medications can be harmful or at least bothersome. Moreover, when someone takes medication to treat a chronic disease (such as hypertension, high cholesterol, type 2 diabetes, and others), they generally must continue the medication indefinitely — and some medications are expensive.

Ornish, like Greger, recommends a whole-food plant-based diet (since a ton of research has shown that the diet is optimal in terms of health) and also a regular program of exercise such as walking, running, bicycling, and so on. Ornish, however, adds two more recommendations:

  1. Stress management — learning how to reduce stress by avoiding occasions of stress when possible and also by using techniques such as meditation and yoga. The book explains why: stress is destructive to health, causing a cascade of ill effects in one’s body (and mind).
  2. Love, intimacy, and social connection — LISC, as Ornish calls it, more or less acts as the opposite of stress, increasing one’s resilience, promoting a healthy response in the body, and minimizing the risk of loneliness and depression, both of which cause damaging physical changes in the body.

Both my Amazfit devices have a feature that I believe is fairly common among smartwatches. I can turn on a “move reminder.” After I’ve been sitting an hour, the reminder will vibrate to tell me it’s time to move. I have set that to run from 8:00am to 9:00pm, and when I get the signal, I do get up and move around, generally to do some household tasks. (I also do Nordic walking each day.)
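The reminder rule described above amounts to two checks: the current time is inside the active window, and an hour has passed without movement. A hypothetical sketch (the names and thresholds are illustrative, not Amazfit’s actual firmware logic):

```python
# Minimal sketch of a "move reminder": vibrate after an hour of sitting,
# but only within a configured daily window (here 8:00 am to 9:00 pm).
from datetime import datetime, time, timedelta

WINDOW_START, WINDOW_END = time(8, 0), time(21, 0)
SIT_LIMIT = timedelta(hours=1)

def should_remind(now: datetime, last_movement: datetime) -> bool:
    in_window = WINDOW_START <= now.time() <= WINDOW_END
    sat_too_long = now - last_movement >= SIT_LIMIT
    return in_window and sat_too_long

# Sitting since 9:00, now 10:05 -> time to get up.
print(should_remind(datetime(2022, 10, 15, 10, 5),
                    datetime(2022, 10, 15, 9, 0)))   # True
```

The same two-condition shape explains why the watch stays quiet at night: outside the window, the elapsed sitting time is simply ignored.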

I had turned this off because if I’m reading or writing, I don’t like to interrupt it, but I turned it back on after reading this passage in Chapter 2 of Undo It!

From Undo It!, page 45:

Being sedentary enhances stasis and illness. One of the reasons exercise is beneficial in so many ways is that it literally and figuratively keeps everything moving. Your heart pumps blood with sufficient force to circulate your blood throughout your arteries, but the pressure in your veins is substantially lower. 

When you exercise—walking, for example—the muscles in your arms and legs help to squeeze blood through your veins. It’s one reason the Queen’s Guards outside of Buckingham Palace in London are taught to bounce up and down on their toes when standing in one place for prolonged periods of time—otherwise, the blood would pool in their legs and they would pass out. In 2017, five guards actually did faint from standing around too long. 

Spending a lot of time sitting increases your risk of a stroke due to blood clot formation. According to some studies, it increases your risk of premature death from all causes as much as smoking does! 

Sitting for more than eight hours a day is associated with a 90 percent increased risk of type 2 diabetes. Those who sit the most have a 147 percent increased relative risk of cardiovascular events compared to those who sit the least. 

Women who sit more than six hours a day are 37 percent more likely to die prematurely than those who sit less than three hours a day, even if they exercise regularly. The time spent sitting was independently associated with total mortality, regardless of physical activity level. 

The combination of both sitting more than six hours a day and being less physically active was associated with a 94 percent increase in all-cause premature death rates in women and a 48 percent increase in men compared with those who reported sitting less than three hours a day and being most active. 

Why? Because even if you exercise at the gym after work, your blood has not been flowing very well earlier in the day while you’ve been sitting, which increases the likelihood of a blood clot forming during that time. Also, blood sugar, cholesterol levels, blood pressure, and other biomarkers are higher in people who are sedentary. 

Researchers recently found that sitting for several hours at a desk significantly reduced blood flow to the brain. However, getting up and taking just a two-minute walk every half hour actually increased the brain’s blood flow. 

We’ve evolved to move and forage much of the time (e.g., walking) and also to have bursts of intense exercise. Our muscles have both fast-twitch and slow-twitch muscle fibers. So work out regularly and avoid prolonged sitting. Studies show that both are important. 

One study found that the more breaks you take during the day after sitting for twenty minutes—even just getting up, walking around a minute or two, and sitting down again—the better your health. On average, each additional ten breaks per day were associated with 0.8 cm lower waist circumference, 0.3 mm lower systolic blood pressure, 3.7 percent lower triglycerides, 0.6 percent lower glucose, and 4.2 percent lower insulin. 

Talk on a portable phone so you can walk around your office while having conversations—and your energy level will likely be higher as well. Take a break from sitting every twenty or thirty minutes. If you work at a desk, try a standing desk, or improvise with a high table or counter. Walk with your colleagues for meetings rather than sitting in a conference room. I invested in a treadmill desk so I can walk while doing my email or talking on the phone. 

Some of the reasons not moving your body increases the risk of so many different illnesses are the effects of being sedentary on your lymphatic system—the garbage sewers of your body. It helps rid your body of toxins, waste, and other unwanted materials. Besides the lymphatic vessels, your lymphatic system includes your tonsils, appendix, thymus, and spleen. These are important parts of your immune system. 

The primary function of your lymphatic system is to transport lymph, a fluid containing infection-fighting white blood cells, throughout the body. When cells in your immune system have gone to battle, the dead cells are removed via your lymphatics. 

Lymph is then transported through larger lymphatic vessels to lymph nodes, where it is cleaned by white blood cells called lymphocytes. After that, lymph continues down your lymphatic system before emptying ultimately into the right or the left subclavian vein on either side of your neck. 

Pressure in your lymphatic system is even lower than in your veins [and the lymphatic system does not have the benefit of a heart muscle – LG] and relies on the contraction of your skeletal muscles—as happens in walking, for example—to squeeze the lymphatic fluid along. Also, when you take a deep breath, your diaphragm and lungs act as a bellows mechanism that changes pressure at the thoracic duct to pump the lymphatic system. 

When people are sedentary, and when they breathe in a shallow way (which is common when they feel chronically stressed), their muscles aren’t contracting enough to keep their lymph flowing. Because of this, the lymph can leak into their tissues and cause swelling, or edema, which predisposes them to illness. Also, when lymph is not flowing, it can leak back into the blood, causing inflammation and other problems.

Written by Leisureguy

15 October 2022 at 10:05 pm
