Later On

A blog written for those whose interests more or less match mine.

Archive for May 18th, 2021

Sleep Evolved Before Brains. Hydras Are Living Proof.


Veronique Greenwood writes in Quanta:

The hydra is a simple creature. Less than half an inch long, its tubular body has a foot at one end and a mouth at the other. The foot clings to a surface underwater — a plant or a rock, perhaps — and the mouth, ringed with tentacles, ensnares passing water fleas. It does not have a brain, or even much of a nervous system.

And yet, new research shows, it sleeps. Studies by a team in South Korea and Japan showed that the hydra periodically drops into a rest state that meets the essential criteria for sleep.

On the face of it, that might seem improbable. For more than a century, researchers who study sleep have looked for its purpose and structure in the brain. They have explored sleep’s connections to memory and learning. They have numbered the neural circuits that push us down into oblivious slumber and pull us back out of it. They have recorded the telltale changes in brain waves that mark our passage through different stages of sleep and tried to understand what drives them. Mountains of research and people’s daily experience attest to human sleep’s connection to the brain.

But a counterpoint to this brain-centric view of sleep has emerged. Researchers have noticed that molecules produced by muscles and some other tissues outside the nervous system can regulate sleep. Sleep affects metabolism pervasively in the body, suggesting that its influence is not exclusively neurological. And a body of work that’s been growing quietly but consistently for decades has shown that simple organisms with less and less brain spend significant time doing something that looks a lot like sleep. Sometimes their behavior has been pigeonholed as only “sleeplike,” but as more details are uncovered, it has become less and less clear why that distinction is necessary.

It appears that simple creatures — including, now, the brainless hydra — can sleep. And the intriguing implication of that finding is that sleep’s original role, buried billions of years back in life’s history, may have been very different from the standard human conception of it. If sleep does not require a brain, then it may be a profoundly broader phenomenon than we supposed.

Recognizing Sleep

Sleep is not the same as hibernation, or coma, or inebriation, or any other quiescent state, wrote the French sleep scientist Henri Piéron in 1913. Though all involved a superficially similar absence of movement, each had distinctive qualities, and that daily interruption of our conscious experience was particularly mysterious. Going without it made one foggy, confused, incapable of clear thought. For researchers who wanted to learn more about sleep, it seemed essential to understand what it did to the brain.

And so, in the mid-20th century, if you wanted to study sleep, you became an expert reader of electroencephalograms, or EEGs. Putting electrodes on humans, cats or rats allowed researchers to say with apparent precision whether a subject was sleeping and what stage of sleep they were in. That approach produced many insights, but it left a bias in the science: Almost everything we learned about sleep came from animals that could be fitted with electrodes, and the characteristics of sleep were increasingly defined in terms of the brain activity associated with them.

This frustrated Irene Tobler, a sleep physiologist working at the University of Zurich in the late 1970s, who had begun to study the behavior of cockroaches, curious whether invertebrates like insects sleep as mammals do. Having read Piéron and others, Tobler knew that sleep could be defined behaviorally too.

She distilled a set of behavioral criteria to identify sleep without the EEG. A sleeping animal does not move around. It is harder to rouse than one that’s simply resting. It may take on a different pose than when awake, or it may seek out a specific location for sleep. Once awakened it behaves normally rather than sluggishly. And Tobler added a criterion of her own, drawn from her work with rats: A sleeping animal that has been disturbed will later sleep longer or more deeply than usual, a phenomenon called sleep homeostasis. . .

Continue reading. There’s more.

Written by Leisureguy

18 May 2021 at 4:26 pm

Fox News, Republicans, and the Destruction of Democracy


That is from this post by Kevin Drum, which is worth reading. The post concludes with:

Correlation is not causation blah blah blah. By itself, this isn’t proof of the baneful effects of Rupert Murdoch’s media empire. However, there’s plenty of other evidence and this is one more straw on the camel’s back. Fox News is responsible more than any other single entity for the destruction of American politics over the past two decades.

Read the whole thing.

Written by Leisureguy

18 May 2021 at 3:24 pm

The attack on American foundational principles


Heather Cox Richardson writes:

I wanted to note that on this day in 1954, the Supreme Court handed down the Brown v. Board of Education of Topeka, Kansas, decision, declaring racial segregation in public schools unconstitutional. A unanimous court decided that segregation denied Black children the equal protection of the laws guaranteed by the Fourteenth Amendment, which was ratified in 1868 in the wake of the Civil War. Brown v. Board was a turning point in establishing the principle of racial equality in modern America.

Since the 1860s, we have recognized that equality depends upon ensuring that all Americans have a right to protect their own interests by having a say in their government.

Today, that principle is under attack.

In 1965, President Lyndon B. Johnson urged Congress to pass the Voting Rights Act to “help rid the Nation of racial discrimination in every aspect of the electoral process and thereby insure the right of all to vote.” And yet, in 2013, the Supreme Court gutted that law, and in the wake of the 2020 election in which voters gave Democrats control of the government, Republican-dominated states across the country are passing voter suppression laws.

Today, Senators Joe Manchin (D-WV) and Lisa Murkowski (R-AK) begged their colleagues to reinstate the Voting Rights Act. In 2006 a routine reauthorization of the law got through the Senate with a vote of 98-0; now it is not clear it can get even the ten Republican votes it will need to get through the Senate, so long as the filibuster remains intact.

But here’s the thing: Once you give up the principle of equality before the law, you have given up the whole game. You have admitted the principle that people are unequal, and that some people are better than others. Once you have replaced the principle of equality with the idea that humans are unequal, you have granted your approval to the idea of rulers and servants. At that point, all you can do is to hope that no one in power decides that you belong in one of the lesser groups.

In 1858, Abraham Lincoln, then a candidate for the Senate, warned that arguments limiting American equality to white men and excluding black Americans were the same arguments “that kings have made for enslaving the people in all ages of the world…. Turn in whatever way you will—whether it come from the mouth of a King, an excuse for enslaving the people of his country, or from the mouth of men of one race as a reason for enslaving the men of another race, it is all the same old serpent.” Either people—men, in his day—were equal, or they were not.

Lincoln went on, “I should like to know if . . .

Continue reading.

Written by Leisureguy

18 May 2021 at 2:28 pm

Ratatouille-ish, again


Engineers used to say that “NTSC” (officially the National Television System Committee, the color-television standard implemented in US sets) stood for “Never Twice the Same Color.” My “recipes” — which really are just descriptions of what I did, not prescriptions for what you must do — are much like that, due in part to my disinclination to measure except by eye and my inclination to use what I have on hand. I’m making the short-cut ratatouilley thing again (short-cut = using frozen roasted vegetables instead of roasting my own — my own roasted vegetables are much better, but a bit of trouble given my small oven). This is what it looks like about 10 minutes into cooking time (which will be 30-40 minutes):

I used my 4-qt sauté pan, and into it I put:

• 2 Tbsp olive oil (and this I did measure)
• 1 ginormous red onion, chopped
• 4 dried chipotle peppers, cut into pieces with kitchen shears
• 4 dried Sanaam chili peppers, cut into pieces
• 1 Tbsp Herbes de Provence
• 1 Tbsp dried thyme
• 2 Tbsp Mexican oregano
• 1.5 Tbsp ground black pepper

I cooked that for a while, then added:

• 10 medium-to-large domestic white mushrooms, sliced
• cloves from 1 head of hardneck garlic, chopped small and allowed to rest

I cooked that until the mushrooms started to lose their water, then added:

• 1 small can no-salt-added tomato paste

I cooked that, stirring constantly with my wooden spatula, until the tomato paste darkened a bit. It stuck somewhat, so I added a splash of red wine vinegar to deglaze the pan. Then I added:

• 1 18-oz can diced tomatoes (540ml = 18.26 fl. oz)
• 1 bag mixed roasted vegetables (500g, or 17.6 oz)
• 1 jar (drained) pitted Kalamata olives (315 ml = 10.65 fl. oz)
• about 1/3 cup red vermouth
• about 1/4 cup red wine vinegar
• several good dashes fish sauce (for umami; you could also use tamari, soy sauce, or Worcestershire sauce)
• 1 lemon, diced after cutting off and discarding the ends

The olive jar had an odd label: it said that the weight was 315 ml, but milliliters are a measure of volume, not weight.

I simmered it on my Max Burton 18XL induction burner, which has a simmer setting and a timer — very convenient. (Some induction ranges have a timer for each burner.)

Additional things I might include another time:

• a Parmesan rind (you can get one at Whole Foods, for example: good flavor and lots of umami)
• Kelp noodles (for the iodine, you know; the link is to a local brand, but they are readily available)
• Pesto
• Corn kernels would be colorful, but I don’t eat much corn — too starchy (like potatoes and rice)

UPDATE: Some topping ideas:

• Roasted pepitas (pumpkin seeds)
• Roasted sunflower seeds
• Bragg’s Nutritional Yeast (cheese taste, high in B12)


Written by Leisureguy

18 May 2021 at 1:51 pm

Weird dreams train us for the unexpected, says new theory


Linda Geddes writes in the Guardian:

It’s a common enough scenario: you walk into your local supermarket to buy some milk, but by the time you get to the till, the milk bottle has turned into a talking fish. Then you remember you’ve got your GCSE maths exam in the morning, but you haven’t attended a maths lesson for nearly three decades.

Dreams can be bafflingly bizarre, but according to a new theory of why we dream, that’s the whole point. By injecting some random weirdness into our humdrum existence, dreams leave us better equipped to cope with the unexpected.

The question of why we dream has long divided scientists. Dreams’ subjective nature, and the lack of any means of recording them, makes it fiendishly difficult to prove why they occur, or even how they differ between individuals.

“While various hypotheses have been put forward, many of these are contradicted by the sparse, hallucinatory, and narrative nature of dreams, a nature that seems to lack any particular function,” said Erik Hoel, a research assistant professor of neuroscience at Tufts University in Massachusetts, US.

Inspired by recent insights into how machine “neural networks” learn, Hoel has proposed an alternative theory: the overfitted brain hypothesis.

A common problem when it comes to training artificial intelligence (AI) is that it becomes too familiar with the data it’s trained on, because it assumes that this training set is a perfect representation of anything it might encounter. Scientists try to fix this “overfitting” by introducing some chaos into the data, in the form of noisy or corrupted inputs.
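
To make the overfitting idea concrete, here is a toy sketch — my own illustration, not anything from the article — that fits an overly flexible model to a handful of points, then refits it after augmenting the training data with noisy copies, the kind of deliberate corruption described above:

    # Toy illustration of overfitting and noise injection (mine, not
    # from the article): fit a degree-7 polynomial to 8 noisy samples
    # of a curve, then refit after adding jittered copies of the data.
    import numpy as np

    rng = np.random.default_rng(0)

    # A small "training set": 8 samples of an underlying curve.
    x_train = np.linspace(0, 1, 8)
    y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.1, 8)

    # Held-out points the model never sees, to judge generalisation.
    x_test = np.linspace(0, 1, 200)
    y_test = np.sin(2 * np.pi * x_test)

    # A degree-7 polynomial can pass through all 8 points exactly:
    # it memorises the training set, noise and all.
    plain_fit = np.polynomial.Polynomial.fit(x_train, y_train, deg=7)

    # Noise injection: 20 jittered copies of the data, same model.
    # The corruption discourages memorising any exact point.
    x_aug = np.concatenate(
        [x_train + rng.normal(0, 0.02, 8) for _ in range(20)])
    y_aug = np.concatenate(
        [y_train + rng.normal(0, 0.10, 8) for _ in range(20)])
    noisy_fit = np.polynomial.Polynomial.fit(x_aug, y_aug, deg=7)

    def test_error(model):
        # Mean squared error on the held-out points.
        return float(np.mean((model(x_test) - y_test) ** 2))

    print("test error, plain fit:     ", test_error(plain_fit))
    print("test error, noise-injected:", test_error(noisy_fit))

The jitter scales here are arbitrary; the point is only that training on corrupted copies of familiar data tends to leave the model less wedded to the exact examples it has seen, which is the analogy Hoel draws with dreaming.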

Hoel suggests that our brains do something similar when we dream. Particularly as we get older, our days become statistically pretty similar to one another, meaning our “training set” is limited. But we still need to be able to generalise our abilities to new and unexpected circumstances – whether it’s our physical movements and reactions, or our mental processes and understanding. We can’t inject random noise into our brains while we’re awake, because we need to concentrate on the tasks at hand, and perform them as accurately as possible. But sleep is a different matter.

By creating a weirded version of the world, dreams may make our understanding of it less simplistic and more well-rounded. “It is the very strangeness of dreams in their divergence from waking experience that gives them their biological function,” Hoel said.

Already, there’s some evidence from neuroscience research to support this, he argues. For instance, one of the most reliable ways of prompting dreams about something that happens in real life is to repetitively perform a new task, such as learning to juggle, or repeatedly training on a ski simulator, while you are awake. Overtraining on the task triggers this overfitting phenomenon, meaning your brain attempts to generalise beyond its training set while you sleep by creating dreams. This may help explain why we often get better at physical tasks such as juggling, following a good night’s sleep.

Although Hoel’s hypothesis is still untested, an advantage is that . . .

Continue reading. If weird dreams are a survival advantage, they would certainly be favored by natural selection (and thus we would have them today).

Written by Leisureguy

18 May 2021 at 12:35 pm

Speaking of cultural memes and their workings: Sinead O’Connor Remembers Things Differently


For those over a certain age, Sinead O’Connor’s ripping apart the photograph of the Pope is an indelible memory — but from a temporal distance it looks different from how it seemed at the time. Amanda Hess writes in the NY Times:

Sinead O’Connor is alone, which is how she prefers to be. She has been riding out the pandemic in a tiny village on an Irish mountaintop, watching murder shows, buying fairy-garden trinkets online and mainlining American news on CNN. On a recent overcast afternoon, she had a navy hijab arranged over her shaved head and a cigarette permanently installed between her fingertips, and when she leaned over an iPad inside her all-glass conservatory, she looked as if she had been hermetically sealed into her own little world.

“I’m lucky,” she said, “because I enjoy my own company.”

Her cottage was appointed in bright, saturated colors that leapt out from the monotonous backdrop of the Irish sky with the surreal quality of a pop-up book. Bubble-gum roses lined the windows, and the Hindu goddess Durga stretched her eight arms across a blanket on a cozy cherry couch. When O’Connor, 54, gave me a little iPad tour during our video interview, the place seemed to fold in on itself: The flowers were fake ones she bought on Amazon.com, and her pair of handsome velvet chairs weren’t made for sitting.

“Deliberately, I bought uncomfortable chairs, because I don’t like people staying long,” she said. “I like being on my own.” But she disclosed this with such an impish giggle that it sounded almost like an invitation.

O’Connor is, no matter how hard she tries to fight it, irresistible. She exudes a tender familiarity, thanks to her cherubic smile, her loose tongue and the fact that she happens to possess one of the most iconic heads in pop culture memory. In the early ’90s, O’Connor became so famous that the very dimensions of her skull seemed inscribed in the public consciousness. If you remember two things about her, it’s that she vaulted to fame with that enduring close-up in the video for her version of “Nothing Compares 2 U” — and then, that she stared down a “Saturday Night Live” camera, tore up a photo of Pope John Paul II and killed her career.

But O’Connor doesn’t see it that way. In fact, the opposite feels true. Now she has written a memoir, “Rememberings,” that recasts the story from her perspective. “I feel that having a No. 1 record derailed my career,” she writes, “and my tearing the photo put me back on the right track.”

O’Connor saw herself as a protest-singing punk. When she ascended to the top of the pop charts, she was trapped. “The media was making me out to be crazy because I wasn’t acting like a pop star was supposed to act,” she told me. “It seems to me that being a pop star is almost like being in a type of prison. You have to be a good girl.” And that’s just not Sinead O’Connor.

“CRAZY” IS A word that does some dirty cultural work. It is a flip way of referencing mental illness, yes. But it’s also a slippery label that has little to do with how a person’s brain works and everything to do with how she is culturally received. Calling someone crazy is the ultimate silencing technique. It robs a person of her very subjectivity.

By the time O’Connor appeared on “S.N.L.,” in October 1992, she had already been branded as insane — for boycotting the Grammy Awards where she was up for record of the year (they recognized only “material gain,” she said) and refusing to play “The Star-Spangled Banner” before her concerts (because national anthems “have nothing to do with music in general”). But now her reputation felt at permanent risk.

“I’m not sorry I did it. It was brilliant,” she said of her protest against abuse in the Catholic Church. “But it was very traumatizing,” she added. “It was open season on treating me like a crazy bitch.”

Soon after the show, O’Connor appeared at a Bob Dylan tribute concert, and when the crowd booed, she was so taken aback she thought, at first, that they were making fun of her outfit. Joe Pesci threatened to smack her in an “S.N.L.” monologue, and later, on that same stage, Madonna mocked her in a gently condescending fashion, play-scowling and ripping up a photograph of the tabloid-star sex offender Joey Buttafuoco. O’Connor was condemned by the Anti-Defamation League and a group called the National Ethnic Coalition of Organizations, which hired a steamroller to crush hundreds of her albums outside of her record company’s headquarters. The Washington Times named her “the face of pure hatred” and Frank Sinatra called her “one stupid broad.”

Now O’Connor’s memoir arrives at a time when the culture seems eager to reassess these old judgments. The top comment on a YouTube rip of O’Connor’s “Behind the Music” episode is: “Can we all just say she was right!” Few cultural castaways have been more vindicated by the passage of time: child sexual abuse, and its cover-up within the Catholic Church, is no longer an open secret. John Paul II finally acknowledged the church’s role in 2001, nearly a decade after O’Connor’s act of defiance.

But the overreaction to O’Connor was not just about whether she was right or wrong; it was about the kinds of provocations we accept from women in music. “Not because I was famous or anything, but because I was a human being, I had a right to put my hand up and say what I felt,” O’Connor said. Some artists are skilled at shocking in a way designed to sell more records, and others at tempering their political rage into palatable music, but “Sinead is not the tempering type,” her friend Bob Geldof, the musician and activist, told me. “In that, she is very much an Irish woman.”

To understand why O’Connor may have seen her cultural blacklisting as liberating, you have to understand just how deeply she was misapprehended throughout her career. She was still a teenager, starting work on her fierce, ethereal first record, “The Lion and the Cobra,” when an executive — “a square unto high heaven” — called her to lunch and told her to dress more femininely and grow out her close-cropped hair. So she marched to a barber and shaved it all off. “I looked like an alien,” she writes in the book — a look that was a kind of escape hatch from looking like a human woman. When O’Connor became pregnant in the midst of recording, she writes that the executive called a doctor and tried to coerce her into having an abortion, which she refused. Her first son, Jake, arrived just before the album did.

Later, when “Nothing Compares 2 U” made her a star, O’Connor said the song’s writer, Prince, terrorized her. She had pledged to reveal the details “when I’m an old lady and I write my book,” and now she has: She writes that Prince summoned her to his macabre Hollywood mansion, chastised her for swearing in interviews, harangued his butler to serve her soup though she repeatedly refused it, and sweetly suggested a pillow fight, only to thump her with something hard he’d slipped into his pillowcase. When she escaped on foot in the middle of the night, she writes, he stalked her with his car, leapt out and chased her around the highway.

Prince is the type of artist who is hailed as crazy-in-a-good-way, as in, “You’ve got to be crazy to be a musician,” O’Connor said, “but there’s a difference between being crazy and being a violent abuser of women.” Still, the fact that her best-known song was written by this person does not faze her at all. “As far as I’m concerned,” she said, “it’s my song.”

O’CONNOR’S STATEMENT ON “S.N.L.” was more personal than most knew. In the book, she details how her mother physically abused her throughout her childhood. “I won the prize in kindergarten for being able to curl up into the smallest ball, but my teacher never knew why I could do it so well,” she writes. There is a reason, in the “Nothing Compares 2 U” video, she begins to cry when she hits the line about her mama’s flowers. O’Connor was 18 when her mother died, and on that day, she took down the one photograph on her mom’s bedroom wall: the image of the pope. O’Connor carefully saved the photo, waiting for the right moment to destroy it.

“Child abuse is an identity crisis and fame is an identity crisis, so I went straight from one identity crisis into another,” she said. And when she tried to call attention to child abuse through her fame, she was vilified. “People would say that she’s fragile,” Geldof said. “No, no, no. Many people would have collapsed under the weight of being Sinead O’Connor, had it not been Sinead.”

Instead, O’Connor felt freed. “I could just be me. Do what I love. Be imperfect. Be mad, even,” she writes in the book. “I’m not a pop star. I’m just a troubled soul who needs to scream into mikes now and then.” She sees the backlash as having pushed her away from the wrong life, in mainstream pop, and forced her to make a living performing live, which is where she feels most comfortable as an artist.

“Rememberings” is a document of a difficult life, but it is also deliciously funny, starting with the title. (“As I’ve said, I can’t remember many details because I was constantly stoned,” she writes.) It is loaded with charming stories from the height of her fame. She rejects the Red Hot Chili Peppers singer Anthony Kiedis’s claim that they had a thing (“Only in his mind”) but confirms a fling with Peter Gabriel (to discover the profane term she assigns to their affair, you’ll have to read it.) . . .

Continue reading. There’s a lot more.

And the report linked above is worth reading as well. It is by Jon Pareles and appeared in the November 1, 1992, issue of the NY Times. That report begins:

You think it’s easy to get booed at Madison Square Garden? Maybe it is for a visiting hockey team, but at a rock concert, drawing boos qualifies as a perverse kind of achievement. Sinead O’Connor, who was booed (as well as cheered) at the Bob Dylan tribute on Oct. 16, once again showed that she has a gift that’s increasingly rare: the ability to stir full-fledged outrage. She has stumbled onto the new 1990’s taboo: taking on an authority figure.

O’Connor was booed because, 13 days earlier, she had torn up a photograph of Pope John Paul II on NBC’s “Saturday Night Live,” saying, “Fight the real enemy.” Compounding her impropriety, she dropped her scheduled Dylan song and reprised “War,” Bob Marley’s anti-racism song, its lyrics drawn from a speech by Haile Selassie. Her expression was timorous, defiant, martyred, and she made all the late-edition newspapers and television news.

Meanwhile, the tabloids happily reported, Madonna (no stranger to recontextualized Christian symbols) told The Irish Times: “I think there is a better way to present her ideas rather than ripping up an image that means a lot to other people.” She added, “If she is against the Roman Catholic Church and she has a problem with them, I think she should talk about it.”

She did: last week, O’Connor released an open letter, linking her being abused as a child to “the history of my people” and charging, “The Catholic church has controlled us by controlling education, through their teachings on sexuality, marriage, birth control and abortion, and most spectacularly through the lies they taught us with their history books.” The letter concluded, “My story is the story of countless millions of children whose families and nations were torn apart for money in the name of Jesus Christ.” Proselytizing as imperialism as child abuse — quite a leap.

Madonna’s reaction may have been professional jealousy. After Madonna had herself gowned, harnessed, strapped down and fully stripped to promote her album “Erotica” and her book “Sex,” O’Connor stole the spotlight with one photograph of a fully clothed man. But the other vilification that descended on O’Connor showed she had struck a nerve.

Sex, which used to be a guaranteed shocker, has become a popular entertainment, with triple-X tapes on home VCR’s and lubricious innuendo in every sitcom. Visual and telephone sex, sex as commercial spectacle, may have moved in where fear of AIDS has made physical sex far less casual. Looking is safe; touching is not.

But as public standards of viewable sexual behavior have changed, a new kind of taboo is gaining force: challenging authority and its religious version, blasphemy. (Another button-pusher, sexual harassment, has more to do with power and authority than with titillation.) In an American culture that used to prize the loner, the wiseguy, the maverick, defense of authority is on the rise, whether it’s a backlash against permissiveness or fear of impending anarchy.

Anti-authority sentiments raise hackles highest when the challenge comes from insubordinate blacks (like Ice-T with “Cop Killer”) or women, like O’Connor. If a heavy-metal band took a picture of the Pope, hung it on an upside-down cross and burned it, the act would likely be greeted with yawns — that old bit again? But waifish female 25-year-olds like O’Connor don’t have the same prerogative. While bullies like Axl Rose are lionized as rock-and-roll rebels simply for lashing out at the press — like so many losing political candidates — O’Connor draws real outrage because she doesn’t know her place.

Not that O’Connor isn’t a loose cannon. She has a penchant for the impassioned but mis-targeted gesture: boycotting the Grammy Awards, refusing to perform on a “Saturday Night Live” show featuring Andrew Dice Clay, refusing to let “The Star-Spangled Banner” be played before a concert, singing a Bob Marley song at a Bob Dylan tribute. Tearing up the Pope’s photograph may have been the best way she could envision to condemn Catholicism, but she surely would have thought twice about tearing up a photograph of Louis Farrakhan or the Lubavitcher Rebbe.

She baffles the likes of Madonna by making her gestures without game plans or tie-ins. “War” doesn’t appear on her new album, “Am I Not Your Girl?” — a collection of standards accompanied by orchestra and sung in the voice of a terrified child who believes every unhappy word.

Yet for all O’Connor’s sincerity . . .

Continue reading.

Written by Leisureguy

18 May 2021 at 12:24 pm

You are a network, not just a body, mind, or your social role


Kathleen Wallace, professor of philosophy at Hofstra University in Hempstead, New York, works on ethics and the metaphysics of personal identity and has an interesting piece in Aeon. I imagine, given the content, that it is an extract from her book The Network Self: Relation, Process, and Personal Identity (2019). It is fairly long, and it begins:

Who am I? We all ask ourselves this question, and many like it. Is my identity determined by my DNA or am I a product of how I’m raised? Can I change, and if so, how much? Is my identity just one thing, or can I have more than one? Since its beginning, philosophy has grappled with these questions, which are important to how we make choices and how we interact with the world around us. Socrates thought that self-understanding was essential to knowing how to live, and how to live well with oneself and with others. Self-determination depends on self-knowledge, on knowledge of others and of the world around you. Even forms of government are grounded in how we understand ourselves and human nature. So the question ‘Who am I?’ has far-reaching implications.

Many philosophers, at least in the West, have sought to identify the invariable or essential conditions of being a self. A widely taken approach is what’s known as a psychological continuity view of the self, where the self is a consciousness with self-awareness and personal memories. Sometimes these approaches frame the self as a combination of mind and body, as René Descartes did, or as primarily or solely consciousness. John Locke’s prince/cobbler thought experiment, wherein a prince’s consciousness and all his memories are transferred into the body of a cobbler, is an illustration of the idea that personhood goes with consciousness. Philosophers have devised numerous subsequent thought experiments – involving personality transfers, split brains and teleporters – to explore the psychological approach. Contemporary philosophers in the ‘animalist’ camp are critical of the psychological approach, and argue that selves are essentially human biological organisms. (Aristotle might also be closer to this approach than to the purely psychological.) Both psychological and animalist approaches are ‘container’ frameworks, positing the body as a container of psychological functions or the bounded location of bodily functions.

All these approaches reflect philosophers’ concern to focus on what the distinguishing or definitional characteristic of a self is, the thing that will pick out a self and nothing else, and that will identify selves as selves, regardless of their particular differences. On the psychological view, a self is a personal consciousness. On the animalist view, a self is a human organism or animal. This has tended to lead to a somewhat one-dimensional and simplified view of what a self is, leaving out social, cultural and interpersonal traits that are also distinctive of selves and are often what people would regard as central to their self-identity. Just as selves have different personal memories and self-awareness, they can have different social and interpersonal relations, cultural backgrounds and personalities. The latter are variable in their specificity, but are just as important to being a self as biology, memory and self-awareness.

Recognising the influence of these factors, some philosophers have pushed against such reductive approaches and argued for a framework that recognises the complexity and multidimensionality of persons. The network self view emerges from this trend. It began in the later 20th century and has continued in the 21st, when philosophers started to move toward a broader understanding of selves. Some philosophers propose narrative and anthropological views of selves. Communitarian and feminist philosophers argue for relational views that recognise the social embeddedness, relatedness and intersectionality of selves. According to relational views, social relations and identities are fundamental to understanding who persons are.

Social identities are traits of selves in virtue of membership in communities (local, professional, ethnic, religious, political), or in virtue of social categories (such as race, gender, class, political affiliation) or interpersonal relations (such as being a spouse, sibling, parent, friend, neighbour). These views imply that it’s not only embodiment and not only memory or consciousness of social relations but the relations themselves that also matter to who the self is. What philosophers call ‘4E views’ of cognition – for embodied, embedded, enactive and extended cognition – are also a move in the direction of a more relational, less ‘container’, view of the self. Relational views signal a paradigm shift from a reductive approach to one that seeks to recognise the complexity of the self. The network self view further develops this line of thought and says that the self is relational through and through, consisting not only of social but also physical, genetic, psychological, emotional and biological relations that together form a network self. The self also changes over time, acquiring and losing traits in virtue of new social locations and relations, even as it continues as that one self.

How do you self-identify? You probably have many aspects to yourself and would resist being reduced to or stereotyped as any one of them. But you might still identify yourself in terms of your heritage, ethnicity, race, religion: identities that are often prominent in identity politics. You might identify yourself in terms of other social and personal relationships and characteristics – ‘I’m Mary’s sister.’ ‘I’m a music-lover.’ ‘I’m Emily’s thesis advisor.’ ‘I’m a Chicagoan.’ Or you might identify personality characteristics: ‘I’m an extrovert’; or commitments: ‘I care about the environment.’ ‘I’m honest.’ You might identify yourself comparatively: ‘I’m the tallest person in my family’; or in terms of one’s political beliefs or affiliations: ‘I’m an independent’; or temporally: ‘I’m the person who lived down the hall from you in college.’ Some of these are more important than others, some are fleeting. The point is that who you are is more complex than any one of your identities. Thinking of the self as a network is a way to conceptualise this complexity and fluidity.

Let’s take a concrete example. Consider Lindsey: she is spouse, mother, novelist, English speaker, Irish Catholic, feminist, professor of philosophy, automobile driver, psychobiological organism, introverted, fearful of heights, left-handed, carrier of Huntington’s disease (HD), resident of New York City. This is not an exhaustive set, just a selection of traits or identities. Traits are related to one another to form a network of traits. Lindsey is an inclusive network, a plurality of traits related to one another. The overall character – the integrity – of a self is constituted by the unique interrelatedness of its particular relational traits, psychobiological, social, political, cultural, linguistic and physical.

Figure 1 below is based on an approach to modelling ecological networks; the nodes represent traits, and the lines are relations between traits (without specifying the kind of relation). . .

Continue reading. There’s much more.

I’ll point out that many of these relationships are culturally shaped and determined — one’s teachers, for example, or one’s doctor, dentist, team members, and so on. And that a network can also achieve an identity as a memeplex, a cluster of mutually supportive memes, and act in some ways like a living organism — things such as corporations or military units or sports teams or tribal communities or political parties or churches or . . .
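
For the concretely minded, here’s a minimal sketch — my own construction, not anything from Wallace’s article or book — of a trait network like the one she describes in her Figure 1, modelled as a simple undirected graph: traits are nodes, relations are (unlabelled) edges, and traits can be gained or lost while the one network persists:

    # A toy "network self" (my construction, not Wallace's): traits
    # are nodes, relations between traits are undirected edges.
    from collections import defaultdict

    network_self = defaultdict(set)

    def relate(trait_a, trait_b):
        # Record a symmetric relation between two traits.
        network_self[trait_a].add(trait_b)
        network_self[trait_b].add(trait_a)

    # A few of Lindsey's traits from the article, with hypothetical
    # relations among them (the article leaves the kinds unspecified).
    relate("mother", "spouse")
    relate("mother", "carrier of HD")
    relate("novelist", "English speaker")
    relate("novelist", "professor of philosophy")
    relate("feminist", "professor of philosophy")
    relate("Irish Catholic", "resident of New York City")

    def lose_trait(trait):
        # The self changes over time: dropping a trait removes its
        # relations but leaves the rest of the network intact.
        for other in network_self.pop(trait, set()):
            network_self[other].discard(trait)

    lose_trait("resident of New York City")  # e.g., Lindsey moves away

    # A crude proxy for how central a trait is: its degree.
    for trait, related in sorted(network_self.items()):
        print(f"{trait}: related to {len(related)} other trait(s)")

A labelled or weighted graph — distinguishing social from biological from psychological relations — would be the obvious next refinement, but even this bare version captures the claim that the self is the interrelation of its traits rather than a container for them.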

The whole article is worth reading, and judging from it, the book also looks to be of interest.

Written by Leisureguy

18 May 2021 at 12:05 pm

Amla vs. Drugs for Cholesterol, Inflammation, and Blood-Thinning


I recently resumed including a teaspoon of powdered amla (Indian gooseberries) in my breakfast. Here’s one reason why:

Written by Leisureguy

18 May 2021 at 9:28 am

l’Occitane Cade all the way, with Baili’s superb BR171


A synthetic knot worked so well on the Cade soap last time that I have made it a practice to use a synthetic with this shaving soap (as I also do with Barrister & Mann Reserve shaving soaps). The lather was gratifying, and the Baili 171 did a fine job. (As I noted earlier, the razor is sold on Amazon with a case as the Baili BD171 for $11 vs. without a case as the Baili BR171 for $6 from (say) Groomatorium. Baili itself now calls this model “Victor,” and Victor is available in two colors: gold (BR173) and silver (BR171).)

I mention the details about the razor because it’s a really excellent razor at a modest price, and it should be better known. My shave was very comfortable and, after three passes, produced a very smooth result, to which I applied a splash (several sprays into my palm) of Cade EDT.

Written by Leisureguy

18 May 2021 at 9:17 am

Posted in Shaving
