Later On

A blog written for those whose interests more or less match mine.

Archive for the ‘Philosophy’ Category

No absolute time: Newton got it wrong, Hume saw it right, and Einstein learned from Hume how relativity would work

Matias Slavov, a postdoctoral researcher in philosophy at Tampere University in Finland, writes in Aeon:

In 1915, Albert Einstein wrote a letter to the philosopher and physicist Moritz Schlick, who had recently composed an article on the theory of relativity. Einstein praised it: ‘From the philosophical perspective, nothing nearly as clear seems to have been written on the topic.’ Then he went on to express his intellectual debt to ‘Hume, whose Treatise of Human Nature I had studied avidly and with admiration shortly before discovering the theory of relativity. It is very possible that without these philosophical studies I would not have arrived at the solution.’

More than 30 years later, his opinion hadn’t changed, as he recounted in a letter to his friend, the engineer Michele Besso: ‘In so far as I can be aware, the immediate influence of D Hume on me was greater. I read him with Konrad Habicht and Solovine in Bern.’ We know that Einstein studied Hume’s Treatise (1738-40) in a reading circle with the mathematician Conrad Habicht and the philosophy student Maurice Solovine around 1902-03. This was in the process of devising the special theory of relativity, which Einstein eventually published in 1905. It is not clear, however, what it was in Hume’s philosophy that Einstein found useful to his physics. We should therefore take a closer look.

In Einstein’s autobiographical writing from 1949, he expands on how Hume helped him formulate the theory of special relativity. It was necessary to reject the erroneous ‘axiom of the absolute character of time, viz, simultaneity’, since the assumption of absolute simultaneity

unrecognisedly was anchored in the unconscious. Clearly to recognise this axiom and its arbitrary character really implies already the solution of the problem. The type of critical reasoning required for the discovery of this central point [the denial of absolute time, that is, the denial of absolute simultaneity] was decisively furthered, in my case, especially by the reading of David Hume’s and Ernst Mach’s philosophical writings.

In the view of John D Norton, professor of the history and philosophy of science at the University of Pittsburgh, Einstein learned an empiricist theory of concepts from Hume (and plausibly from Mach and the positivist tradition). He then implemented concept empiricism in his argument for the relativity of simultaneity. The result is that different observers will not agree whether two events are simultaneous or not. Take the openings of two windows, a living room window and a kitchen window. There is no absolute fact to the matter of whether the living room window opens before the kitchen window, or whether they open simultaneously or in reverse order. The temporal order of such events is observer-dependent; it is relative to the designated frame of reference.

Once the relativity of simultaneity was established, Einstein was able to reconcile the seemingly irreconcilable aspects of his theory, the principle of relativity and the light postulate. This conclusion required abandoning the view that there is such a thing as an unobservable time that grounds temporal order. This is the view that Einstein got from Hume.

Hume’s influence on intellectual culture is massive. This includes all areas of philosophy and a variety of scientific disciplines. A poll conducted with professional philosophers a few years ago asked them to name the philosopher, no longer living, with whom they most identify. Hume won, by a clear margin. In Julian Baggini’s estimation, contemporary ‘scientists, who often have little time for philosophy, often make an exception for Hume’. Before saying more about Hume’s permanent relevance, we should go back to the 18th-century early modern context. His influence is due to his radical empiricism, which can’t be fully understood without examining the era in which he worked.

The dominant theory of cognition of early modern philosophy was idea theory. Ideas denote both mental states and the material of our thinking. A mental state is, for example, a toothache, and the material of our thinking is thoughts, for example, of a mathematical object such as a triangle. The clearest proponent of the theory of ideas was the French philosopher René Descartes, for whom philosophical enquiry is essentially an investigation of the mind’s ideas. In one of his letters, he explains why ideas are so important: ‘I am certain that I can have no knowledge of what is outside me except by means of the ideas I have within me.’ If we wish to gain any certainty in our investigations of any aspect of the world – whether the object of our investigation is the human mind or some natural phenomenon – we need to have a clear and distinct idea of the represented object in question.

Hume’s theory of ideas differs from Descartes’s because he rejects innatism. This view goes back to Plato’s doctrine of anamnesis, which maintains that all learning is a form of recollection as everything we learn is in us before we are taught. The early modern version of innatism emphasises that the mind is not a blank slate, but we are equipped with some ideas before our birth and sensory perception. Hume starts at the same point as his fellow Briton and predecessor, John Locke. The mind begins to have ideas when it begins to perceive. To ask when a human being acquires ideas in the first place ‘is to ask when he begins to perceive; having ideas and perception being the same thing,’ writes Locke in An Essay Concerning Human Understanding (1689). Drawing on this insight, Hume devised his copy principle.

Perception, for Hume, is divided into ideas and impressions. The difference between the two is  . . .

Continue reading.

You can get a Kindle edition of Hume’s books and essays for 77¢.

Written by LeisureGuy

16 January 2021 at 4:33 pm

Possession as a web made of memes and a look at identity: How we swim in the ocean of cultural entities and understandings

A few mornings ago, for some reason, I found myself pondering the idea of possession. I was looking at one of my razors — not the Fendrihan Mk II I was using but the Fine aluminum slant on the shelf — and thinking that it was nice to own it, when I realized that “being owned” is not discoverable from the razor itself. “Ownership” exists not in the natural world but only in the meme-universe of human cultural knowledge: the meaning is supplied by the cultural knowledge of the observer, not by the object itself.

One example is what you see here: black markings on a white background:

이 문장에는 의미가 있습니다 (한국어를 아는 경우에만 해당).

No matter how closely you examine those markings, they remain simply black marks (unless, of course, you have the cultural knowledge to interpret them).

I then encountered the following post by Maria Popova in Brain Pickings. The post addresses how our identities are not from nature but are formed from cultural elements.

“A person’s identity,” Amin Maalouf wrote as he contemplated what he so poetically called the genes of the soul, “is like a pattern drawn on a tightly stretched parchment. Touch just one part of it, just one allegiance, and the whole person will react, the whole drum will sound.” And yet we are increasingly pressured to parcel ourselves out in various social contexts, lacerating the parchment of our identity in the process. As Courtney Martin observed in her insightful On Being conversation with Parker Palmer and Krista Tippett, “It’s never been more asked of us to show up as only slices of ourselves in different places.” Today, as Whitman’s multitudes no longer compose an inner wholeness but are being wrested out of us fragment by fragment, what does it really mean to be a person? And how many types of personhood do we each contain?

In the variedly stimulating 1976 volume The Identities of Persons (public library), philosopher Amelie Rorty considers the seven layers of personhood, rooted in literature but extensible to life. She writes:

Humans are just the sort of organisms that interpret and modify their agency through their conception of themselves. This is a complicated biological fact about us.

Rorty offers a brief taxonomy of those conceptions before exploring each in turn:

Characters are delineated; their traits are sketched; they are not presumed to be strictly unified. They appear in novels by Dickens, not those by Kafka. Figures appear in cautionary tales, exemplary novels and hagiography. They present narratives of types of lives to be imitated. Selves are possessors of their properties. Individuals are centers of integrity; their rights are inalienable. Presences are descendants of souls; they are evoked rather than presented, to be found in novels by Dostoyevsky, not those by Jane Austen.

Depending on which of these we adopt, Rorty argues, we become radically different entities, with different powers and proprieties, different notions of success and failure, different freedoms and liabilities, different expectations of and relations to one another, and most of all a different orientation toward ourselves in the emotional, intellectual, and social spaces we inhabit.

And yet we ought to be able to interpolate between these various modalities of being:

Worldliness consists of [the] ability to enact, with grace and aplomb, a great variety of roles.

Rorty begins with the character, tracing its origin to Ancient Greek drama:

Since the elements out of which characters are composed are repeatable and their configurations can be reproduced, a society of characters is in principle a society of repeatable and indeed replaceable individuals.

Characters, Rorty points out, don’t have identity crises because they aren’t expected to have a core unity beneath their assemblage of traits. What defines them is which of these traits become manifested, and this warrants the question of social context:

To know what sort of character a person is, is to know what sort of life is best suited to bring out his potentialities and functions… Not all characters are suited to the same sorts of lives: there is no ideal type for them all… If one tries to force the life of a bargainer on the character of a philosopher, one is likely to encounter trouble, sorrow, and the sort of evil that comes from mismatching life and temperament. Characters formed within one society and living in circumstances where their dispositions are no longer needed — characters in time of great social change — are likely to be tragic. Their virtues lie useless or even foiled; they are no longer recognized for what they are; their motives and actions are misunderstood. The magnanimous man in a petty bourgeois society is seen as a vain fool; the energetic and industrious man in a society that prizes elegance above energy is seen as a bustling boor; the meditative person in an expansive society is seen as melancholic… Two individuals of the same character will fare differently in different polities, not because their characters will change through their experiences (though different aspects will become dominant or recessive) but simply because a good fit of character and society can conduce to well-being and happiness, while a bad fit produces misery and rejection.

Rorty’s central point about character takes it out of the realm of the literary and the philosophical, and into the realm of our everyday lives, where the perennial dramas of who we are play out: . . .

Continue reading.

Written by LeisureGuy

13 January 2021 at 9:33 am

The Roots of Josh Hawley’s Rage

Josh Hawley, like many on the Christian Right, has the attitude toward belief that Henry Ford had toward car color: Ford said that you could have a car in any color you wanted so long as you wanted black, and Hawley and his ilk say you can believe anything you like so long as you believe as they tell you to.

Katherine Stewart reports in the NY Times:

In today’s Republican Party, the path to power is to build up a lie in order to overturn democracy. At least that is what Senator Josh Hawley was telling us when he offered a clenched-fist salute to the pro-Trump mob before it ransacked the Capitol, and it is the same message he delivered on the floor of the Senate in the aftermath of the attack, when he doubled down on the lies about electoral fraud that incited the insurrection in the first place. How did we get to the point where one of the bright young stars of the Republican Party appears to be at war with both truth and democracy?

Mr. Hawley himself, as it happens, has been making the answer plain for some time. It’s just a matter of listening to what he has been saying.

In multiple speeches, an interview and a widely shared article for Christianity Today, Mr. Hawley has explained that the blame for society’s ills traces all the way back to Pelagius — a British-born monk who lived 17 centuries ago. In a 2019 commencement address at The King’s College, a small conservative Christian college devoted to “a biblical worldview,” Mr. Hawley denounced Pelagius for teaching that human beings have the freedom to choose how they live their lives and that grace comes to those who do good things, as opposed to those who believe the right doctrines.

The most eloquent summary of the Pelagian vision, Mr. Hawley went on to say, can be found in the Supreme Court’s 1992 opinion in Planned Parenthood v. Casey. Mr. Hawley specifically cited Justice Anthony Kennedy’s words reprovingly: “At the heart of liberty,” Kennedy wrote, “is the right to define one’s own concept of existence, of meaning, of the universe, and of the mystery of human life.” The fifth century church fathers were right to condemn this terrifying variety of heresy, Mr. Hawley argued: “Replacing it and repairing the harm it has caused is one of the challenges of our day.”

In other words, Mr. Hawley’s idea of freedom is the freedom to conform to what he and his preferred religious authorities know to be right. Mr. Hawley is not shy about making the point explicit. In a 2017 speech to the American Renewal Project, he declared — paraphrasing the Dutch Reformed theologian and onetime prime minister Abraham Kuyper — “There is not one square inch of all creation over which Jesus Christ is not Lord.” Mr. Kuyper is perhaps best known for his claim that Christianity has sole legitimate authority over all aspects of human life.

“We are called to take that message into every sphere of life that we touch, including the political realm,” Mr. Hawley said. “That is our charge. To take the Lordship of Christ, that message, into the public realm, and to seek the obedience of the nations. Of our nation!”

Mr. Hawley has built his political career among people who believe that Shariah is just around the corner even as they attempt to secure privileges for their preferred religious groups to discriminate against those of whom they disapprove. Before he won election as a senator, he worked for Becket, a legal advocacy group that often coordinates with the right-wing legal juggernaut the Alliance Defending Freedom. He is a familiar presence on the Christian right media circuit.

The American Renewal Project, which hosted the event where Mr. Hawley delivered the speech I mentioned earlier, was founded by David Lane, a political organizer who has long worked behind the scenes to connect conservative pastors and Christian nationalist figures with politicians. The choice America faces, according to Mr. Lane, is “to be faithful to Jesus or to pagan secularism.”

The line of thought here is starkly binary and nihilistic. It says that human existence in an inevitably pluralistic, modern society committed to equality is inherently worthless. It comes with the idea that a right-minded elite of religiously pure individuals should aim to capture the levers of government, then use that power to rescue society from eternal darkness and reshape it in accord with a divinely-approved view of righteousness.

At the heart of Mr. Hawley’s condemnation of our terrifyingly Pelagian world lies a dark conclusion about the achievements of modern, liberal, pluralistic societies. When he was still attorney general, William Barr articulated this conclusion in a speech at the University of Notre Dame Law School, where he blamed “the growing ascendancy of secularism” for amplifying “virtually every measure of social pathology,” and maintained that “free government was only suitable and sustainable for a religious people.”

Christian nationalists’ acceptance of President Trump’s spectacular turpitude these past four years was a good measure of just how dire they think our situation is. Even a corrupt sociopath was better, in their eyes, than the horrifying freedom that religious moderates and liberals, along with the many Americans who don’t happen to be religious, offer the world.

That this neo-medieval vision is incompatible with constitutional democracy is clear. But in case you’re in doubt, consider where

Continue reading.

Written by LeisureGuy

11 January 2021 at 3:31 pm

I have to agree with Baruch Spinoza

The brief video summary gives the reasons I agree, and I would further say that the only appropriate prayers are those of praise or gratitude. Asking God to make your football team win is (IMO) so misguided that one doesn’t know where to begin — it’s on a par with asking God to make Lima the capital of Bolivia because that’s the answer you gave on the mid-term. Petitions to God are, in my view, never appropriate — at least for those who believe that God is all-loving, all-wise, and all-powerful. The only petitionary prayer that makes sense is, “Thy will be done. Amen.”

As for Spinoza’s omissions, I don’t see them as significant. So far as rituals are concerned, one can celebrate any number of things — obvious examples are the winter solstice, when the celebration is the turn toward longer days; the vernal equinox, after which daylight lasts longer than night; the summer solstice, the peak of the sun and the beginning of summer; and the autumnal equinox, when harvest celebrations might begin. And of course one has rituals to recognize births, marriages, deaths, and anniversaries (such as birthdays or Bastille Day or the like). In fact, the equinoxes and solstices are often co-opted by religions, which time religious rituals to (roughly) coincide with those natural demarcations.

And belonging to a group does not require religion — one’s family is a start, and there are groups based on shared enthusiasms (sports fans, game players, literary discussion groups, bowling leagues), shared experiences (classmates, veterans organizations), shared location (neighborhood groups, civic organizations), shared outlooks (political parties and organizations, environmental groups).

At any rate, I was struck by Spinoza’s view of the world in which we find ourselves.

The video is from an Open Culture post by Josh Jones, which begins:

The so-called Enlightenment period encompasses a surprisingly diverse collection of thinkers, if not always in ethnic or national origin, at least in intellectual disposition, including perhaps the age’s most influential philosopher, the “philosopher’s philosopher,” writes Assad Meymandi. Baruch Spinoza did not fit the image of the bewigged philosopher-gentleman of means we tend to popularly associate with Enlightenment thought.

He was born to a family of Sephardic Portuguese Marranos, Jews who were forced to convert to Catholicism but who reclaimed their Judaism when they relocated to Calvinist Amsterdam. Spinoza himself was “excommunicated by Amsterdam Jewry in 1656,” writes Harold Bloom in a review of Rebecca Goldstein’s Betraying Spinoza: “The not deeply chagrined 23-year-old Spinoza did not become a Calvinist, and instead consorted with more liberal Christians, particularly Mennonites.” . . .

Continue reading. There’s more, including links.

Written by LeisureGuy

7 January 2021 at 11:43 am

American Death Cult: Why has the Republican response to the pandemic been so mind-bogglingly disastrous?

Jonathan Chait wrote this back in July 2020 in New York. And just a reminder: the US as of today has seen 20 million cases and more than 346,000 deaths due to Covid-19.

Last October, the Nuclear Threat Initiative and the Johns Hopkins Center for Health Security compiled a ranking system to assess the preparedness of 195 countries for the next global pandemic. Twenty-one panel experts across the globe graded each country in 34 categories composed of 140 subindices. At the top of the rankings, peering down at 194 countries supposedly less equipped to withstand a pandemic, stood the United States of America.

It has since become horrifyingly clear that the experts missed something. The supposed world leader is in fact a viral petri dish of uncontained infection. By June, after most of the world had beaten back the coronavirus pandemic, the U.S., with 4 percent of the world’s population, accounted for 25 percent of its cases. Florida alone was seeing more new infections a week than China, Japan, Korea, Vietnam, Thailand, Malaysia, Indonesia, the Philippines, Australia, and the European Union combined.

During its long period of decline, the Ottoman Empire was called “the sick man of Europe.” The United States is now the sick man of the world, pitied by the same countries that once envied its pandemic preparedness — and, as recently as the 2014 Ebola outbreak, relied on its expertise to organize the global response.

Our former peer nations are now operating in a political context Americans would find unfathomable. Every other wealthy nation in the world has successfully beaten back the disease, at least significantly, and at least for now. New Zealand’s health minister was forced to resign after allowing two people who had tested positive for COVID-19 to attend a funeral. The Italian Parliament heckled Prime Minister Giuseppe Conte when he briefly attempted to remove his mask to deliver a speech. In May — around the time Trump cheered demonstrators into the streets to protest stay-at-home orders — Boris Johnson’s top adviser set off a massive national scandal, complete with multiple calls for his resignation, because he’d been caught driving to visit his parents during lockdown. If a Trump official had done the same, would any newspaper even have bothered to publish the story?

It is difficult for us Americans to imagine living in a country where violations so trivial (by our standards) provoke such an uproar. And if you’re tempted to see for yourself what it looks like, too bad — the E.U. has banned U.S. travelers for health reasons.

The distrust and open dismissal of expertise and authority may seem uniquely contemporary — a phenomenon of the Trump era, or the rise of online misinformation. But the president and his party are the products of a decades-long war against the functioning of good government, a collapse of trust in experts and empiricism, and the spread of a kind of magical thinking that flourishes in a hothouse atmosphere that can seal out reality. While it’s not exactly shocking to see a Republican administration be destroyed by incompetent management — it happened to the last one, after all — the willfulness of it is still mind-boggling and has led to the unnecessary sickness and death of hundreds of thousands of people and the torpedoing of the reelection prospects of the president himself. Like Stalin’s purge of 30,000 Red Army members right before World War II, the central government has perversely chosen to disable the very asset that was intended to carry it through the crisis. Only this failure of leadership and management took place in a supposedly advanced democracy whose leadership succumbed to a debilitating and ultimately deadly ideological pathology.

How did this happen? In 1973, Republicans trusted science more than religion, while Democrats trusted religion more than science. The reverse now holds true. In the meantime, working-class whites left the Democratic Party, which has increasingly taken on the outlook of the professional class with its trust in institutions and empiricism. The influx of working-class whites (especially religiously observant ones) has pushed Republicans toward increasingly paranoid varieties of populism.

This is the conventional history of right-wing populism — that it was a postwar backlash against the New Deal and the Republican Party’s inability or unwillingness to roll it back. The movement believed the government had been subverted, perhaps consciously, by conspirators seeking to impose some form of socialism, communism, or world government. Its “paranoid style,” so described by historian Richard Hofstadter, became warped with anti-intellectualism, reflecting a “conflict between businessmen of certain types and the New Deal bureaucracy, which has spilled over into a resentment of intellectuals and experts.” Its followers seemed prone to “a disorder in relation to authority, characterized by an inability to find other modes for human relationship than those of more or less complete domination or submission.” Perhaps this sounds like someone you’ve heard of.

But for all the virulence of conservative paranoia in American life, without the sanction of a major party exploiting and profiting from paranoia, and thereby encouraging its growth, the worldview remained relatively fringe. Some of the far right’s more colorful adherents, especially the 100,000 reactionaries who joined the John Birch Society, suspected the (then-novel, now-uncontroversial) practice of adding small amounts of fluoride to water supplies to improve dental health was, in fact, a communist plot intended to weaken the populace. Still, the far right lacked power. Republican leaders held Joe McCarthy at arm’s length; Goldwater captured the nomination but went down in a landslide defeat. In the era of Sputnik, science was hardly a countercultural institution. “In the early Cold War period, science was associated with the military,” says sociologist Timothy O’Brien who, along with Shiri Noy, has studied the transformation. “When people thought about scientists, they thought about the Manhattan Project.” The scientist was calculating, cold, heartless, an authority figure against whom the caring, feeling liberal might rebel. Radicals in the ’60s often directed their protests against the scientists or laboratories that worked with the Pentagon.

But this began to change in the 1960s, along with everything else in American political and cultural life. New issues arose that tended to pit scientists against conservatives. Goldwater’s insouciant attitude toward the prospect of nuclear war with the Soviets provoked scientists to explain the impossibility of surviving atomic fallout and the formation of Scientists and Engineers for Johnson-Humphrey. New research by Rachel Carson about pollution and by Ralph Nader on the dangers of cars and other consumer products made science the linchpin of a vast new regulatory state. Business owners quickly grasped that stopping the advance of big government meant blunting the cultural and political authority of scientists. Expertise came to look like tyranny — or at least it was sold that way.

One tobacco company conceded privately in 1969 that it could not directly challenge the evidence of tobacco’s dangers but could make people wonder how solid the evidence really was. “Doubt,” the memo explained, “is our product.” In 1977, the conservative intellectual Irving Kristol urged business leaders to steer their donations away from public-interest causes and toward the burgeoning network of pro-business foundations. “Corporate philanthropy,” he wrote, “should not be, cannot be, disinterested.” The conservative think-tank scene exploded with reports questioning whether pollution, smoking, driving, and other profitable aspects of American capitalism were really as dangerous as the scientists said.

The Republican Party’s turn against science was slow and jagged, as most party-identity changes tend to be. The Environmental Protection Agency had been created under Richard Nixon, and its former administrator, Russell Train, once recalled President Gerald Ford promising to support whatever auto-emissions guidelines his staff deemed necessary. “I want you to be totally comfortable in the fact that no effort whatsoever will be made to try to change your position in any way,” said Ford — a pledge that would be unimaginable for a contemporary Republican president to make. Not until Ronald Reagan did Republican presidents begin letting business interests overrule experts, as when his EPA used a “hit list” of scientists flagged by industry as hostile. And even Reagan toggled between giving business a free hand and listening to his advisers (as he did when he signed a landmark 1987 agreement to phase out substances that were depleting the ozone layer and a plan the next year to curtail acid rain).

The party’s rightward tilt accelerated in the 1990s. “With the collapse of the Soviet Union, Cold Warriors looked for another great threat,” wrote science historians Naomi Oreskes and Erik Conway. “They found it in environmentalism,” viewing climate change as a pretext to impose government control over the whole economy. Since the 1990s was also the decade in which scientific consensus solidified that greenhouse-gas emissions were permanently increasing temperatures, the political stakes of environmentalism soared.

The number of books criticizing environmentalism increased fivefold over the previous decade, and more than 90 percent cited evidence produced by right-wing foundations. Many of these tracts coursed with the same lurid paranoia as their McCarthy-era counterparts. This was when the conspiracy theory that is currently conventional wisdom on the right — that scientists across the globe conspired to exaggerate or falsify global warming data in order to increase their own power — first took root.

This is not just a story about elites. About a decade after business leaders launched their attack on science from above, a new front opened from below: Starting in the late 1970s,  . . .

Continue reading.

Written by LeisureGuy

1 January 2021 at 4:55 pm

Plato in Sicily: Philosophy in Practice

Nick Romeo, a journalist and author who teaches philosophy for Erasmus Academy, and Ian Tewksbury, a Classics graduate student at Stanford University, write in Aeon:

In 388 BCE, Plato was nearly forty. He had lived through an oligarchic coup, a democratic restoration, and the execution of his beloved teacher Socrates by a jury of his fellow Athenians. In his youth, Plato seriously contemplated an entry into Athens’ turbulent politics, but he determined that his envisioned reforms of the city’s constitution and educational practices were vanishingly unlikely to be realised. He devoted himself instead to the pursuit of philosophy, but he retained a fundamental concern with politics, ultimately developing perhaps the most famous of all his formulations: that political justice and human happiness require kings to become philosophers or philosophers to become kings. As Plato approached the age of forty, he visited Megara, Egypt, Cyrene, southern Italy, and, most consequentially of all, the Greek-speaking city-state of Syracuse, on the island of Sicily.

In Syracuse, Plato met a powerful and philosophically-minded young man named Dion, the brother-in-law of Syracuse’s decadent and paranoid tyrant, Dionysius I. Dion would become a lifelong friend and correspondent. This connection brought Plato to the inner court of Syracuse’s politics, and it was here that he decided to test his theory that if kings could be made into philosophers – or philosophers into kings – then justice and happiness could flourish at last.

Syracuse had a reputation for venality and debauchery, and Plato’s conviction soon collided with the realities of political life in Sicily. The court at Syracuse was rife with suspicion, violence and hedonism. Obsessed with the idea of his own assassination, Dionysius I refused to allow his hair to be cut with a knife, instead having it singed with coal. He forced visitors – even his son Dionysius II and his brother Leptines – to prove that they were unarmed by having them stripped naked, inspected and made to change clothes. He slew a captain who’d had a dream of killing him, and he put to death a soldier who handed Leptines a javelin to sketch a map in the dust. This was an inauspicious candidate for the title of philosopher-king.

Plato’s efforts did not fare well. He angered Dionysius I with his philosophical critique of the lavish hedonism of Syracusan court life, arguing that, instead of orgies and wine, one needed justice and moderation to produce true happiness. However sumptuous the life of a tyrant might be, if it was dominated by insatiable grasping after sensual pleasures, he remained a slave to his passions. Plato further taught the tyrant the converse: a man enslaved to another could preserve happiness if he possessed a just and well-ordered soul. Plato’s first visit to Sicily ended in dark irony: Dionysius I sold the philosopher into slavery. He figured that if Plato’s belief were true, then his enslavement would be a matter of indifference since, in the words of the Greek biographer Plutarch, ‘he would, of course, take no harm of it, being the same just man as before; he would enjoy that happiness, though he lost his liberty.’

Fortunately, Plato was soon ransomed by friends. He returned to Athens to found the Academy, where he likely produced many of his greatest works, including The Republic and The Symposium. But his involvement in Sicilian politics continued. He returned to Syracuse twice, attempting on both later trips to influence the mind and character of Dionysius II at the urging of Dion.

These three episodes are generally omitted from our understanding of Plato’s philosophy or dismissed as the picaresque inventions of late biographers. However, this is a mistake that overlooks the philosophical importance of Plato’s Italian voyages. In fact, his three trips to Sicily reveal that true philosophical knowledge entails action; they show the immense power of friendship in Plato’s life and philosophy; and they suggest that Plato’s philosopher-king thesis is not false so much as incomplete.

These key events are cogently expressed in Plato’s often-overlooked Seventh Letter. The Seventh Letter has proved an enigma for scholars since at least the great German philologists of the 19th century. While the majority of scholars have accepted its authenticity, few have given its theory of political action a prominent place in the exegesis of Plato. In the past three decades, some scholars have even moved to write it out of the Platonic canon, with the most recent Oxford commentary terming it The Pseudo-Platonic Seventh Letter (2015). Each age has its own Plato, and perhaps given the apolitical quietism of many academics, it makes sense that contemporary academics often neglect Plato’s discussion of political action. Nonetheless, most scholars – even those who wished it to be a forgery – have found the letter authentic, based on historical and stylistic evidence. If we return to the story of Plato’s Italian journeys, which Plato himself tells in The Seventh Letter, we’re able to resurrect the historical Plato who risked his life in order to unite philosophy and power.

While The Seventh Letter focuses on the story of Plato’s three voyages to Syracuse, it begins with a brief synopsis of his early life. Like most members of the Athenian elite, his first ambition was to enter politics and public life. In Plato’s 20s, however, Athens underwent a series of violent revolutions, culminating in the restoration of the democracy and the execution of his teacher Socrates in 399 BCE. ‘Whereas at first I had been full of zeal for public life,’ Plato wrote, ‘when I noted these changes and saw how unstable everything was, I became in the end quite dizzy.’ He decided that the time was too chaotic for meaningful action, but he didn’t abandon the desire to engage in political life. Instead, in his own words, he was ‘waiting for the right time’. He was also waiting for the right friends.

When Plato first arrived in Sicily, a trip that likely took more than a week by boat on the rough and dangerous Mediterranean, he immediately noticed the islanders’ extravagant way of life. He was struck by their ‘blissful life’, one ‘replete … with Italian feasts’, where ‘existence is spent in gorging food twice a day and never sleeping alone at night.’ No one can become wise, Plato believed, if he lives a life primarily focused on sensual pleasure. Status-oriented hedonism creates a society devoid of community, one in which the stability of temperance is sacrificed to the flux of competitive excess. Plato writes:

Nor could any State enjoy tranquility, no matter how good its laws when its men think they must spend their all on excesses, and be easygoing about everything except the drinking bouts and the pleasures of love that they pursue with professional zeal. These States are always changing into tyrannies, or oligarchies, or democracies, while the rulers in them will not even hear mention of a just and equitable constitution.

Though the Syracusan state was in disarray, Plato’s friend Dion offered him a unique opportunity to influence the Sicilian kings. Dion didn’t partake in the ‘blissful life’ of the court. Instead, according to Plato, he lived ‘his life in a different manner’, because he chose ‘virtue worthy of more devotion than pleasure and all other kinds of luxury’. While today we might not associate friendship with political philosophy, many ancient thinkers understood the intimate connection between the two. Plutarch, a subtle reader of Plato, expresses this link nicely:

[L]ove, zeal, and affection … which, though they seem more pliant than the stiff and hard bonds of severity, are nevertheless the strongest and most durable ties to sustain a lasting government.

Plato saw in Dion ‘a zeal and attentiveness I had never encountered in any young man’. The opportunity to extend these bonds to the summit of political power would present itself 20 years later, after Plato escaped slavery and Dionysius I had died.

Dionysius II, the elder tyrant’s son, also didn’t appear likely to become a philosopher king. Although Dion wanted his brother-in-law Dionysius I to give Dionysius II a liberal education, the older king’s fear of being deposed made him reluctant to comply. He worried that if his son received a sound moral education, conversing regularly with wise and reasonable teachers, he might overthrow him. So Dionysius I kept Dionysius II confined and uneducated. As he grew older, courtiers plied him with wine and women. Dionysius II once held a 90-day long drunken debauch, refusing to conduct any official business: ‘drinking, singing, dancing, and buffoonery reigned there without control,’ Plutarch wrote.

Nonetheless, Dion used all his influence to persuade the young king to invite Plato to Sicily and place himself under the guidance of the Athenian philosopher. Dionysius II began sending Plato letters urging him to visit, and Dion as well as various Pythagorean philosophers from southern Italy added their own pleas. But Plato was nearly 60 years old, and his last experience in Syracusan politics must have left him reluctant to test fate again. Not heeding these entreaties would have been an easy and understandable choice.

Dion wrote to Plato that this was  . . .

Continue reading. There’s more.

Later in the article:

He writes in The Seventh Letter:

I set out from home … dreading self-reproach most of all; lest I appear to myself only theory and no deed willingly undertaken … I cleared myself from reproach on the part of Philosophy, seeing that she would have been disgraced if I, through poorness of spirit and timidity, had incurred the shame of cowardice …

This reveals a conception of philosophy in which ‘theory’ is damaged by a lack of corresponding ‘deed’. The legitimacy of philosophy requires the conjunction of knowledge and action.

Written by LeisureGuy

30 December 2020 at 1:15 pm

Cellphones cripple social skills

Ron Srigley writes in MIT Technology Review:

A few years ago, I performed an experiment in a philosophy class I was teaching. My students had failed a midterm test rather badly. I had a hunch that their pervasive use of cell phones and laptops in class was partly responsible. So I asked them what they thought had gone wrong. After a few moments of silence, a young woman put up her hand and said: “We don’t understand what the books say, sir. We don’t understand the words.” I looked around the class and saw guileless heads pensively nodding in agreement.

I extemporized a solution: I offered them extra credit if they would give me their phones for nine days and write about living without them. Twelve students—about a third of the class—took me up on the offer. What they wrote was remarkable, and remarkably consistent. These university students, given the chance to say what they felt, didn’t gracefully submit to the tech industry and its devices.

The usual industry and education narrative about cell phones, social media, and digital technology generally is that they build community, foster communication, and increase efficiency, thus improving our lives. Mark Zuckerberg’s recent reformulation of Facebook’s mission statement is typical: the company aims to “give people the power to build community and bring the world closer together.”

Without their phones, most of my students initially felt lost, disoriented, frustrated, and even frightened. That seemed to support the industry narrative: look how disconnected and lonely you’ll be without our technology. But after just two weeks, the majority began to think that their cell phones were in fact limiting their relationships with other people, compromising their own lives, and somehow cutting them off from the “real” world. Here is some of what they said.

“You must be weird or something”

“Believe it or not, I had to walk up to a stranger and ask what time it was. It honestly took me a lot of guts and confidence to ask someone,” Janet wrote. (Her name, like the others here, is a pseudonym.) She describes the attitude she was up against: “Why do you need to ask me the time? Everyone has a cell phone. You must be weird or something.” Emily went even further. Simply walking by strangers “in the hallway or when I passed them on the street” caused almost all of them to take out a phone “right before I could gain eye contact with them.”

To these young people, direct, unmediated human contact was experienced as ill-mannered at best and strange at worst. James: “One of the worst and most common things people do nowadays is pull out their cell phone and use it while in a face-to-face conversation. This action is very rude and unacceptable, but yet again, I find myself guilty of this sometimes because it is the norm.” Emily noticed that “a lot of people used their cell phones when they felt they were in an awkward situation, for an example [sic] being at a party while no one was speaking to them.”

The price of this protection from awkward moments is the loss of human relationships, a consequence that almost all the students identified and lamented. Without his phone, James said, he found himself forced to look others in the eye and engage in conversation. Stewart put a moral spin on it. “Being forced to have [real relations with people] obviously made me a better person because each time it happened I learned how to deal with the situation better, other than sticking my face in a phone.” Ten of the 12 students said their phones were compromising their ability to have such relationships.

Virtually all the students admitted that ease of communication was one of the genuine benefits of their phones. However, eight out of 12 said they were genuinely relieved not to have to answer the usual flood of texts and social-media posts. Peter: “I have to admit, it was pretty nice without the phone all week. Didn’t have to hear the fucking thing ring or vibrate once, and didn’t feel bad not answering phone calls because there were none to ignore.”

Indeed, the language they used indicated that they experienced this activity almost as a type of harassment. “It felt so free without one and it was nice knowing no one could bother me when I didn’t want to be bothered,” wrote William. Emily said that she found herself “sleeping more peacefully after the first two nights of attempting to sleep right away when the lights got shut off.” Several students went further and claimed that communication with others was in fact easier and more efficient without their phones. Stewart: “Actually I got things done much quicker without the cell because instead of waiting for a response from someone (that you don’t even know if they read your message or not) you just called them [from a land line], either got an answer or didn’t, and moved on to the next thing.”

Technologists assert that their instruments make us more productive. But for the students, phones had the opposite effect. “Writing a paper and not having a phone boosted productivity at least twice as much,” Elliott claimed. “You are concentrated on one task and not worrying about anything else. Studying for a test was much easier as well because I was not distracted by the phone at all.” Stewart found he could “sit down and actually focus on writing a paper.” He added, “Because I was able to give it 100% of my attention, not only was the final product better than it would have been, I was also able to complete it much quicker.” Even Janet, who missed her phone more than most, admitted, “One positive thing that came out of not having a cell phone was that I found myself more productive and I was more apt to pay attention in class.”

Some students felt not only distracted by their phones, but morally compromised. Kate: “Having a cell phone has actually affected my personal code of morals and this scares me … I regret to admit that I have texted in class this year, something I swore to myself in high school that I would never do … I am disappointed in myself now that I see how much I have come to depend on technology … I start to wonder if it has affected who I am as a person, and then I remember that it already has.” And James, though he says we must continue to develop our technology, said that “what many people forget is that it is vital for us not to lose our fundamental values along the way.”

Other students were worried that their cell-phone addiction was depriving them of a relationship to the world. Listen to James: “It is almost like . . .

Continue reading.

Written by LeisureGuy

27 December 2020 at 7:59 am

How Claude Shannon Invented the Future

David Tse writes in Quanta:

Science seeks the basic laws of nature. Mathematics searches for new theorems to build upon the old. Engineering builds systems to solve human needs. The three disciplines are interdependent but distinct. Very rarely does one individual simultaneously make central contributions to all three — but Claude Shannon was a rare individual.

Despite being the subject of the recent documentary The Bit Player — and someone whose work and research philosophy have inspired my own career — Shannon is not exactly a household name. He never won a Nobel Prize, and he wasn’t a celebrity like Albert Einstein or Richard Feynman, either before or after his death in 2001. But more than 70 years ago, in a single groundbreaking paper, he laid the foundation for the entire communication infrastructure underlying the modern information age.

Shannon was born in Gaylord, Michigan, in 1916, the son of a local businessman and a teacher. After graduating from the University of Michigan with degrees in electrical engineering and mathematics, he wrote a master’s thesis at the Massachusetts Institute of Technology that applied a mathematical discipline called Boolean algebra to the analysis and synthesis of switching circuits. It was a transformative work, turning circuit design from an art into a science, and is now considered to have been the starting point of digital circuit design.

Next, Shannon set his sights on an even bigger target: communication.

Communication is one of the most basic human needs. From smoke signals to carrier pigeons to the telephone to television, humans have always sought methods that would allow them to communicate farther, faster and more reliably. But the engineering of communication systems was always tied to the specific source and physical medium. Shannon instead asked, “Is there a grand unified theory for communication?” In a 1939 letter to his mentor, Vannevar Bush, Shannon outlined some of his initial ideas on “fundamental properties of general systems for the transmission of intelligence.” After working on the problem for a decade, Shannon finally published his masterpiece in 1948: “A Mathematical Theory of Communication.”

The heart of his theory is a simple but very general model of communication: A transmitter encodes information into a signal, which is corrupted by noise and then decoded by the receiver. Despite its simplicity, Shannon’s model incorporates two key insights: isolating the information and noise sources from the communication system to be designed, and modeling both of these sources probabilistically. He imagined the information source generating one of many possible messages to communicate, each of which had a certain probability. The probabilistic noise added further randomness for the receiver to disentangle.
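
(An aside from me, not from Tse’s article: Shannon’s source → transmitter → noisy channel → receiver pipeline is easy to sketch in a few lines of Python. Everything below is my own illustrative choice rather than Shannon’s construction: the function names encode, channel, and decode, the flip probability p, and the use of a simple repetition code are all stand-ins; the sketch only makes the shape of the model concrete.)

import random

def encode(bits, r=3):
    """Transmitter: a toy repetition code that repeats each bit r times."""
    return [b for b in bits for _ in range(r)]

def channel(signal, p=0.1):
    """Noisy channel: flip each transmitted bit independently with probability p."""
    return [b ^ 1 if random.random() < p else b for b in signal]

def decode(received, r=3):
    """Receiver: majority vote over each group of r repeated bits."""
    return [int(sum(received[i:i + r]) > r // 2) for i in range(0, len(received), r)]

random.seed(0)
message = [random.randint(0, 1) for _ in range(1000)]   # the source's message, as bits
recovered = decode(channel(encode(message)))
errors = sum(m != r for m, r in zip(message, recovered))
print(f"residual bit-error rate: {errors / len(message):.3f}")

(With this toy code, a channel that flips about 10 percent of the bits leaves only a few percent of the decoded bits wrong, at the cost of transmitting three times as many bits; Shannon’s theory is about how far that trade-off between rate and reliability can be pushed.)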

Before Shannon, the problem of communication was primarily viewed as a deterministic signal-reconstruction problem: how to transform a received signal, distorted by the physical medium, to reconstruct the original as accurately as possible. Shannon’s genius lay in his observation that the key to communication is uncertainty. After all, if you knew ahead of time what I would say to you in this column, what would be the point of writing it?

This single observation shifted the communication problem from the physical to the abstract, allowing Shannon to model the uncertainty using probability. This came as a total shock to the communication engineers of the day.

Given that framework of uncertainty and probability, Shannon set out in his landmark paper to systematically determine the fundamental limit of communication. His answer came in three parts. Playing a central role in all three is the concept of an information “bit,” used by Shannon as the basic unit of uncertainty. A portmanteau of “binary digit,” a bit could be either a 1 or a 0, and Shannon’s paper is the first to use the word (though he said the mathematician John Tukey used it in a memo first).

First, Shannon came up with a formula for the minimum number of bits per second to represent the information, a number he called its entropy rate, H. This number quantifies the uncertainty involved in determining which message the source will generate. The lower the entropy rate, the less the uncertainty, and thus the easier it is to compress the message into something shorter. For example, texting at the rate of 100 English letters per minute means sending one of 26¹⁰⁰ possible messages every minute, each represented by a sequence of 100 letters. One could encode all these possibilities into 470 bits, since 2⁴⁷⁰ ≈ 26¹⁰⁰. If the sequences were equally likely, then Shannon’s formula would say that the entropy rate is indeed 470 bits per minute. In reality, some sequences are much more likely than others, and the entropy rate is much lower, allowing for greater compression.
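
(Another aside from me: the arithmetic in that example is easy to check, and the same few lines show why a non-uniform source has a lower entropy rate. The skewed letter probabilities below are made up purely for illustration; they are not real English letter frequencies.)

from math import log2

# Uniform case: 100 letters per minute, 26 equally likely letters each.
# log2(26**100) = 100 * log2(26) ≈ 470 bits per minute.
print(f"uniform entropy rate: {100 * log2(26):.1f} bits per minute")

# Made-up skewed distribution over 26 letters: five common letters plus
# 21 equally likely rare ones. Per-letter entropy H = -sum(p * log2(p)).
probs = [0.12, 0.09, 0.08, 0.075, 0.07] + [0.565 / 21] * 21
assert abs(sum(probs) - 1.0) < 1e-9
H = -sum(p * log2(p) for p in probs)
print(f"skewed per-letter entropy: {H:.2f} bits (uniform would be {log2(26):.2f})")
print(f"skewed entropy rate: {100 * H:.1f} bits per minute")

(The uniform case reproduces the roughly 470 bits per minute in the article; the skewed source comes out lower, which is the point: the less uniform the source, the fewer bits per minute are needed.)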

Second, he provided a formula for the maximum number of bits per second that can be reliably communicated in the face of noise, which he called the system’s capacity, C. This is the maximum rate at which the receiver can resolve the message’s uncertainty, effectively making it the speed limit for communication.
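
(A last aside from me: the article defines capacity but gives no formula for it. The standard textbook special case, which is not in Tse’s piece, is the binary symmetric channel: each transmitted bit is flipped with probability p, and the capacity works out to C = 1 - H₂(p) bits per channel use, where H₂ is the entropy of a coin with bias p. A minimal sketch:)

from math import log2

def binary_entropy(p):
    """H2(p) = -p*log2(p) - (1-p)*log2(1-p): the uncertainty of a coin with bias p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity, in bits per channel use, of a binary symmetric channel
    that flips each transmitted bit with probability p."""
    return 1 - binary_entropy(p)

for p in (0.0, 0.01, 0.1, 0.5):
    print(f"flip probability {p:4}: capacity ≈ {bsc_capacity(p):.3f} bits per use")

(At p = 0 the channel is noiseless and carries a full bit per use; at p = 0.5 the output says nothing about the input and the capacity drops to zero, which matches the idea of capacity as the rate at which the receiver can resolve the message’s uncertainty.)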

Finally, he showed that reliable communication of the information from the source in the face of noise is possible if and only if  . . .

Continue reading.

Written by LeisureGuy

22 December 2020 at 4:30 pm

America, We Have a Problem

Thomas B. Edsall writes in the NY Times:

The turbulence that followed the Nov. 3 election has roiled American politics, demonstrating an ominous vulnerability in our political system.

Donald Trump used the 41-day window between the presidential election and the Dec. 14 meeting of the Electoral College to hold the country in thrall based on his refusal to acknowledge Joe Biden’s victory and his own defeat.

Most troubling to those who opposed Trump, and even to some who backed him, was the capitulation by Republicans in the House and Senate. It took six weeks from Election Day for Mitch McConnell, the Senate majority leader, to acknowledge on Tuesday that “the Electoral College has spoken. Today I want to congratulate President-elect Joe Biden.”

Trump’s refusal to abide by election law was widely viewed as conveying an implicit threat of force. Equally alarming, Trump, with no justification, focused his claims of voter fraud on cities with large African-American populations in big urban counties, including Detroit in Wayne County, Milwaukee in Milwaukee County, Philadelphia in Philadelphia County and Atlanta in Fulton County.

Bob Bauer, a senior legal adviser to the Biden campaign, told reporters that the Trump campaign’s “targeting of the African-American community is not subtle. It is extraordinary,” before adding, “It’s quite remarkable how brazen it is.”

Viewing recent events through a Trump prism may be too restrictive to capture the economic, social and cultural turmoil that has grown more corrosive in recent years.

On Oct. 30, a group of 15 eminent scholars (several of whom I also got a chance to talk to) published an essay — “Political Sectarianism in America” — arguing that the antagonism between left and right has become so intense that words and phrases like “affective polarization” and “tribalism” were no longer sufficient to capture the level of partisan hostility.

“The severity of political conflict has grown increasingly divorced from the magnitude of policy disagreement,” the authors write, requiring the development of “a superordinate construct, political sectarianism — the tendency to adopt a moralized identification with one political group and against another.”

Political sectarianism, they argue,

consists of three core ingredients: othering — the tendency to view opposing partisans as essentially different or alien to oneself; aversion — the tendency to dislike and distrust opposing partisans; and moralization — the tendency to view opposing partisans as iniquitous. It is the confluence of these ingredients that makes sectarianism so corrosive in the political sphere.

There are multiple adverse outcomes that result from political sectarianism, according to the authors. It “incentivizes politicians to adopt antidemocratic tactics when pursuing electoral or political victories” since their supporters will justify such norm violation because “the consequences of having the vile opposition win the election are catastrophic.”

Political sectarianism also legitimates

a willingness to inflict collateral damage in pursuit of political goals and to view copartisans who compromise as apostates. As political sectarianism has surged in recent years, so too has support for violent tactics.

In a parallel line of analysis, Jack Goldstone, a professor of public policy at George Mason University, and Peter Turchin, a professor of ecology and evolutionary biology at the University of Connecticut, contend that a combination of economic and demographic trends point to growing political upheaval. Events of the last six weeks have lent credibility to their research: On Sept. 10, they published an essay, “Welcome To The ‘Turbulent Twenties,’” making the case that the United States is “heading toward the highest level of vulnerability to political crisis seen in this country in over a hundred years.” There is, they wrote, “plenty of dangerous tinder piled up, and any spark could generate an inferno.”

Goldstone and Turchin do not believe that doomsday is inevitable. They cite previous examples of countries reversing downward trends, including the United States during the Great Depression:

To be sure, the path back to a strong, united and inclusive America will not be easy or short. But a clear pathway does exist, involving a shift of leadership, a focus on compromise and responding to the world as it is, rather than trying desperately to hang on to or restore a bygone era.

The Goldstone-Turchin argument is based on a measure called a “political stress indicator,” developed by Goldstone in his 1991 book, “Revolution and Rebellion in the Early Modern World.” According to Goldstone, the measure “predicted the 1640s Puritan Revolution, the French Revolution of 1789, and the European Revolutions of 1830 and 1848.”

Goldstone wrote that . . .

Continue reading.
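
A note on the “political stress indicator” mentioned above: Goldstone’s measure is usually presented as a simple product of three pressures. The sketch below is my own gloss of the formulation commonly cited in Goldstone’s and Turchin’s work; the component names and definitions are not taken from the column excerpted above, so treat the exact operationalization as an assumption.

% Goldstone's political stress indicator, in the product form commonly cited
% (my paraphrase; the exact operationalization is an assumption, not from the excerpt above).
\[
  \Psi \;=\; \mathrm{MMP} \times \mathrm{EMP} \times \mathrm{SFD}
\]
% MMP: mass mobilization potential (popular immiseration, urbanization, age structure)
% EMP: elite mobilization potential (elite overproduction and intra-elite competition)
% SFD: state fiscal distress (debt burden and declining confidence in state institutions)

The point of the multiplicative form is that stress stays low unless popular, elite, and fiscal pressures all rise together.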

Written by LeisureGuy

16 December 2020 at 6:17 pm

Life begins at 40: the demographic and cultural roots of the midlife crisis

leave a comment »

Dante seems to be thinking about a man’s midlife crisis in the opening lines of The Divine Comedy, where he describes a man in midlife who has lost his way:

When I had journeyed half of our life’s way,
I found myself within a shadowed forest,
for I had lost the path that does not stray.

Ah, it is hard to speak of what it was,
that savage forest, dense and difficult,
which even in recall renews my fear:

so bitter—death is hardly more severe!
But to retell the good discovered there,
I’ll also tell the other things I saw.

I cannot clearly say how I had entered
the wood; I was so full of sleep just at
the point where I abandoned the true path.

But when I’d reached the bottom of a hill—
it rose along the boundary of the valley
that had harassed my heart with so much fear—

I looked on high and saw its shoulders clothed
already by the rays of that same planet
which serves to lead men straight along all roads.

At this my fear was somewhat quieted;
for through the night of sorrow I had spent,
the lake within my heart felt terror present.

And just as he who, with exhausted breath,
having escaped from sea to shore, turns back
to watch the dangerous waters he has quit,

so did my spirit, still a fugitive,
turn back to look intently at the pass
that never has let any man survive.

I let my tired body rest awhile.
Moving again, I tried the lonely slope—
my firm foot always was the one below.

And almost where the hillside starts to rise—
look there!—a leopard, very quick and lithe,
a leopard covered with a spotted hide.

He did not disappear from sight, but stayed;
indeed, he so impeded my ascent
that I had often to turn back again.

The time was . . .

Years ago I read Daniel J. Levinson’s The Seasons of a Man’s Life, a report on a 10-year psychological study he had done to discover the character of the stages of a man’s life. He specifically studied the lives of men and not women because (a) he was a man and he was interested to learn more about what he had experienced and what he could expect, and (b) he strongly suspected that the stages of a woman’s life would be different, with the impacts of post-puberty fertility (which brings the possibility and often the actuality of childbirth and the changes that entails) and menopause (which ends fertility and brings further change).

As I recall from having read the book decades ago, he discovered that men must deal with ten-year milestones: points in a man’s life when certain key tasks must either have been fulfilled or be abandoned, and when he sets his priorities for the next part of his life. They go like this:

Around 10 years of age, childhood ends and in the next decade the youth must explore possibilities to learn more about the adult world.

Around age 20 the exploration ends because it’s time to make choices and build the framework and direction for one’s life as an independent adult. This is a big transition, and it brings a fair amount of stress.

Around age 30, there’s another transition, but at this point a man is usually established in his adult life, so this one is more a matter of course corrections and adjustments to priorities.

Around age 40 is another big transition, as stressful as at age 20 and perhaps even more so: at age 20 one more or less expects the stress of making big choices, but at age 40 most would expect their lives to be set. However, at this time mortality becomes clearer, and most want to figure out what their true life accomplishment will be.

Around age 50 is like around age 30: course corrections, but not the big stress.

And around age 60 is another stressful transition — it’s passing-the-torch time, and looking forward to what you will be doing as the younger generation supplants you.

Basically, he found that the transitions at ages divisible by 20 (20, 40, 60, and presumably 80) are the stressful ones, while those at 10, 30, 50, and presumably 70 are not so traumatic and upsetting.

It’s an interesting book and worth reading. Gail Sheehy interviewed him at length and basically took what she wanted of his study’s findings to write her book, Passages: Predictable Crises of Adult Life, which I also read. I prefer Levinson’s book. It might be interesting to read it again.

With that as preamble, take a look at the 2019 Wilkins–Bernal–Medawar lecture given by Mark Jackson at The Royal Society:

Abstract

In 1965, the psychoanalyst and social scientist Elliott Jaques introduced a term, the ‘midlife crisis’, that continues to structure Western understandings and experiences of middle age. Following Jaques’s work, the midlife crisis became a popular means of describing how—and why—men and women around the age of 40 became disillusioned with work, disenchanted with relationships and detached from family responsibilities. Post-war sociological and psychological studies of middle age regarded the midlife crisis as a manifestation of either biological or psychological change, as a moment in the life course when—perhaps for the first time—people felt themselves to be declining towards death. Although the midlife crisis has often been dismissed as a myth or satirized in novels and films, the concept has persisted not only in stereotypical depictions of rebellion and infidelity at midlife, but also in research that has sought to explain the particular social, physical and emotional challenges of middle age. In the spirit of the pioneering research of John Wilkins, John Bernal and Peter Medawar, each of whom in different ways emphasized the complex interrelations between science and society, I want to argue that the emergence of the midlife crisis—as concept and experience—during the middle decades of the twentieth century was not coincidental. Rather it was the product of historically specific demographic changes and political aspirations—at least in the Western world—to keep alive the American dream of economic progress and material prosperity.


Introduction

In 1965, the Canadian-born psychoanalyst and social scientist Elliott Jaques introduced a term, the ‘midlife crisis’, that continues to shape Western accounts of ageing, love and loss. Working at the Tavistock Institute of Human Relations in London, Jaques was well known for his studies of organizational structures, introducing terms such as ‘corporate culture’ into contemporary discussions of occupational hierarchies and working practices.1 His research was based on empirical studies of institutions such as factories, churches and hospitals. But it was also shaped by his practice as a psychoanalyst and by the theories of Sigmund Freud and Melanie Klein. Psychoanalytical influences are particularly evident in his formulation of the midlife crisis. Jaques had begun to think about the concept in 1952—at the age of 35—when a period of personal reflection was prompted by the conclusion of his own analytical sessions with Klein and by reading Dante’s Inferno—a poetic account of a midlife journey into, and eventually through, darkness and depression. The ‘beautiful lines’ at the start of the Inferno, Jaques wrote many years later, ‘melded with my own inner experiences of the midlife struggle with its vivid sense of the meaning of personal death’.2

When Jaques first presented his paper on ‘death and the midlife crisis’ to the British Psychoanalytical Society in 1957, it generated only a muted response and was not accepted by the International Journal of Psycho-Analysis until eight years later.3 In the published version, which was based on a study of 300 creative artists as well as case histories from his clinical practice, Jaques argued that during the middle years of life, when the ‘first phase of adult life has been lived’, adjustment to a new set of circumstances was necessary: work and family had been established; parents had grown old; and children were ‘at the threshold of adulthood’. The challenge of coping with these pressures, when combined with personal experiences of ageing, triggered awareness of the reality of death: ‘The paradox is that of entering the prime of life, the stage of fulfilment, but at the same time the prime and fulfilment are dated. Death lies beyond.’4

According to Jaques, those who reached midlife without having successfully established themselves in terms of marriage and occupation were ‘badly prepared for meeting the demands of middle age’. As a result, they were likely to display what became the clichéd features of a midlife crisis: disillusionment with life; dissatisfaction with work; a desperation to postpone mental and physical decline; detachment from family responsibilities; and infidelity with a younger, more athletic accomplice. It was psychological immaturity, Jaques argued, that generated a depressive crisis around the age of 35 that was energetically masked by a manic determination to thwart advancing years:

The compulsive attempts, in many men and women reaching middle age, to remain young, the hypochondriacal concern over health and appearance, the emergence of sexual promiscuity in order to prove youth and potency, the hollowness and lack of genuine enjoyment of life, and the frequency of religious concern, are familiar patterns. They are attempts at a race against time.5

Jaques warned that, for those who did not work carefully through the psychological anguish of midlife, impulsive strategies intended to protect against the tragedy of death were unlikely to be successful: ‘These defensive fantasies’, he insisted, ‘are just as persecuting, however, as the chaotic and hopeless internal situation they are meant to mitigate.’6

As Jaques’s turn of phrase became more popular on both sides of the Atlantic, he was regularly cited as the originator of the term.7 His concept of the midlife crisis shaped research into the life course and informed self-help and therapeutic approaches to the individual and relational challenges of middle age. Already by the late 1960s, Jaques’s work was framing attempts to understand and resolve the ‘search for meaning’ that was thought to typify the midlife identity crisis.8 In Britain, the impact of the midlife crisis on marriage inspired efforts to address the personal, familial and social determinants—and consequences—of rising levels of divorce. It also influenced the psychoanalytical approaches to resolving marital tensions adopted by Henry V. Dicks and his colleagues at the Tavistock Clinic.9 Elsewhere, the midlife crisis became a notable motif in the work of researchers investigating the impact of life transitions on marriage trajectories, personal identity, and health in men and women—most notably in studies by American authors such as Roger Gould, Gail Sheehy, George Vaillant and Daniel Levinson.10 The fantasies of middle-aged men hoping to retain their youthful vigour also figured in literary and cinematic treatments of marriage, love and loss during the middle years—most famously in novels by Sloan Wilson, David Ely, John Updike and David Nobbs.11

In the post-war decades, the midlife crisis was understood in two principal ways. On the one hand, problems of midlife were read in terms of  . . .

Continue reading.

Written by LeisureGuy

16 December 2020 at 3:38 pm

Impatience: a deep cause of Western failure in handling the pandemic

leave a comment »

Branko Milanovic writes at Global Inequality:

In October 2019, Johns Hopkins University and the Economist Intelligence Unit published the Global Epidemic Preparedness Report (Global Health Security Report). Never was a report on an important global topic better timed. And never was it more wrong.

The report argued that the best prepared countries are the following three: the US (in reality, the covid outcome, as of mid-December 2020, was almost 1000 deaths per million), UK (the same), and the Netherlands (almost 600). Vietnam was ranked No. 50 (while its current covid fatalities per million are 0.4), China was ranked 51st (covid fatalities are 3 per million), Japan was ranked 21st (20). Indonesia (deaths: 69 per million) and Italy (almost 1100 deaths per million) were ranked the same; Singapore (5 deaths per million) and Ireland (428 deaths per million) were ranked next to each other. People who were presumably most qualified to figure out how to be best prepared for a pandemic have colossally failed.

Their mistake confirms how unexpected and difficult it is to explain the debacle of Western countries (where I include not only the US and Europe, but also Russia and Latin America) in the handling of the pandemic. There was no shortage of possible explanations produced ever since the failure became obvious: incompetent governments (especially Trump), administrative confusion, “civil liberties”, initial underestimation of the danger, dependence on imports of PPE…The debate will continue for years. To use a military analogy: the covid debacle is like the French debacle in 1940. If one looks at any objective criteria (number of soldiers, quality of equipment, mobilization effort), the French defeat should have never happened. Similarly, if one looks at the objective criteria regarding covid, as the October report indeed did, the death rates in the US, Italy or UK are simply impossible to explain: neither by the number of doctors or nurses per capita, by health expenditure, by the education level of the population, by total income, by quality of hospitals…

The failure is most starkly seen when contrasted with East Asian countries which, whether democratic or authoritarian, have had outcomes that are not moderately but several orders of magnitude superior to those of Western countries. How was this possible? People have argued that it might be due to Asian countries’ prior exposure to epidemics like SARS, or Asian collectivism as opposed to Western individualism.

I would like to propose another deeper cause of the debacle. It is a soft cause. It is a speculation. It cannot be proven empirically. It has never been measured and perhaps it is impossible to measure with any degree of exactness. That explanation is impatience.

When one looks at Western countries’ reaction to the pandemic, one is struck by its stop-and-go character. Lockdown measures were imposed, often reluctantly, in the Spring when the epidemic seemed to be at the peak, just to be released as soon as there was an improvement. The improvement was perceived by the public as the end of the epidemic. The governments were happy to participate in that self-deception. Then, in the Fall, the epidemic came back with a vengeance, and again the tough measures were imposed half-heartedly, under pressure, and with the (already once-chastened) hope that they could be rescinded for the holidays.

Why didn’t governments and the public go from the beginning for strong measures whose objective would not have been merely to “flatten the curve” but to either eradicate the virus or, as was done in East Asia, drive it down so far that only sporadic bursts remained? Those flare-ups could then be dealt with using drastic measures, as in June when Beijing closed its largest open market, supplying several million people, after a few cases of covid were linked to it.

The public, and thus, I think, the governments, were unwilling to take the East Asian approach to the pandemic because of a culture of impatience: a desire to solve all problems quickly and to bear only very limited costs. That delusion, however, did not work with covid.

I think that impatience can be related to  . . .

Continue reading.

Written by LeisureGuy

16 December 2020 at 2:30 pm

Philosophical counseling

leave a comment »

Well, I never, as people once said. Reasonio offers an unusual service:

As an APPA-certified philosophical counselor, I engage in 1-on-1 coaching and philosophical counseling work with clients.  My clients come from a variety of life situations and professions, but what they have in common is a desire to work through issues that are holding them back or presenting problems in their lives, relationships, or careers.

My work involves collaborating with clients to identify, understand, and address central problems and issues.  We also employ resources, insights, and models derived from philosophical sources.

My past and current clients include corporate executives, startup-CEOs, medical professionals, psychotherapists, consultants, retail employees, government workers, retirees, students, as well as many others.

The scope of my practice includes matters such as moral dilemmas, interpersonal relationships, existential crises and concerns, discordant or underdeveloped belief systems, emotional issues, work-life balance, career decisions and coaching, transition between life-stages, and realization of human potential. I also assist clients with engaging in self-reflection, productive decision-making, and realizing their own capacities for incorporating Philosophy to improve their lives, relationships, and careers.

The majority of my sessions are conducted virtually via Skype, but I also meet clients for face-to-face sessions in my local area.

My standard rate for a 50-minute session is US$120.00, but I offer a sliding-scale fee for lower-income clients. I work with a limited number of clients eligible for those rates in any given time period.

I also suggest perusing my Scope of Practice, which discusses the discipline and practice of philosophical counseling. It also highlights areas of my particular expertise and concentration.

If you are interested in considering a course of Philosophical Counseling with me, I’m happy to work with you. Email me at greg@reasonio.com for more information or to schedule an initial session.

What Is Philosophical Counseling?

As far back as the great philosophical schools in antiquity, philosophers have been enrolled in practical roles as counselors and advisers, assisting people in making difficult decisions, improving their lives and relationships, and developing greater self-understanding. Throughout history, practitioners have used philosophy in very practical ways, generating models for fulfilling and thoughtful ways of living. During the last several centuries, however, philosophy became more and more restricted to an academic, largely theoretical discipline removed from the issues of everyday life. In recent decades, many of its great practitioners deliberately steered philosophy solidly back to its practical roots and concerns.

This resulted in the emergence of philosophical counseling as a recognized discipline, community, and set of practices. Philosophical counseling bears similarities to, but is distinct from, other types of counseling, such as psychiatric, psychological, or religious counseling. Psychiatric and psychological counseling focus largely upon diagnosis and treatment of mental illness, while philosophical counseling works from a non-medical, non-clinical perspective. (It is worth noting that, historically, great theorists and practitioners in psychiatry and psychology often had training in, and drew upon resources from, philosophy.) Philosophical counseling also differs from religious counseling in drawing primarily upon philosophical resources and involving no religious or theological commitments.

Philosophical counseling is also similar to – and in some cases can overlap with – disciplines such as life coaching and professional coaching.  It differs, however, in the types of training and credentialing required for philosophical counseling, and in the more rigorous reliance upon philosophical models, resources, and perspectives in its practice.  An explicit philosophical framework situates the type of work carried out in coaching within broader horizons in philosophical counseling.

My Credentials, Practice, and Approaches.

My certification in philosophical counseling is granted by the American Philosophical Practitioners Association, and I have successfully assisted a number of clients over the years. Since earning a Ph.D. in Philosophy from Southern Illinois University Carbondale in 2002, I have designed and taught philosophy courses for over a decade and a half, with a particularly strong focus on Ethics, Critical Thinking, Practical Rationality, Philosophy of Emotion, and the History of Philosophy.

Positions I have held in recent years have involved me in . . .

Continue reading. There’s more.

Written by LeisureGuy

12 December 2020 at 8:23 pm

The Fatal Flaw at the Heart of Our Civilization

leave a comment »

umair haque writes at Medium:

Continue reading.

Written by LeisureGuy

27 November 2020 at 10:09 am

A History of Philosophy from Ancient Greece to Modern Times (81 Video Lectures)

leave a comment »

Philosophy is endlessly fascinating, and I think this series will provide a good foundation. Watch one a day, starting today, and well before Valentine’s Day — well, 4 days before — you’ll have a better idea of what philosophy has been in the West.

This is via OpenCulture, which notes:

You can watch 81 video lectures tracing the history of philosophy, moving from Ancient Greece to modern times. Arthur Holmes presented this influential course at Wheaton College for decades, and now it’s online for you. The lectures are all streamable above, or available through this YouTube playlist.

Philosophers covered in the course include: Plato, Aquinas, Hobbes, Descartes, Spinoza, Hume, Kant, Hegel, Nietzsche, Sartre and more.

A History of Philosophy has been added to our list of Free Online Philosophy courses, a subset of our meta collection, 1,500 Free Online Courses from Top Universities.

Written by LeisureGuy

22 November 2020 at 11:04 am

Rebecca Newberger Goldstein interview: “Plato, Gödel, Spinoza, Ahab”

leave a comment »

Richard Marshall does the interview at the blog 3:16:

‘Philosophical advances in epistemology and in ethics profoundly shape our points of view. We don’t see them precisely because we see with them. It’s like the fish who responds to the question “How’s the water today?” with “Water? What’s water?”’

‘Gödel proved that, in any formal system that is rich enough to express arithmetic, there are truths that are expressible within it that can’t be proved. That’s the First Incompleteness Theorem. The second is that one of the things that can’t be proved within such a formal system is its own consistency.’

‘Since Gödel had interpreted his First Incompleteness Theorem in the light of his mathematical realism, then yes, postmodernists are barking up the wrong tree. The proof shows that there is a mathematical truth—the Gödel sentence—that is not provable within the system. He’s not in any way attacking the notion of objective truth in mathematics. Quite the contrary.’
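
[A gloss from me, not part of the interview: the two theorems Goldstein summarizes here can be stated compactly. The formulation below is a standard textbook one, with F standing for any formal system of the kind she describes.]

% Standard statement of Gödel's incompleteness theorems (my gloss, not from the interview).
% Assume F is a consistent, effectively axiomatizable formal system that can express
% elementary arithmetic (e.g., Peano Arithmetic).
\begin{align*}
\textbf{First:}\quad  & \text{there is an arithmetical sentence } G_F \text{ with } F \nvdash G_F \text{ and } F \nvdash \lnot G_F; \\
                      & \text{Gödel's own } G_F \text{ is, moreover, true in the standard natural numbers.} \\
\textbf{Second:}\quad & F \nvdash \operatorname{Con}(F), \text{ i.e. } F \text{ cannot prove the sentence expressing its own consistency.}
\end{align*}
% Note: two-sided unprovability in the first theorem uses Rosser's refinement;
% Gödel's original argument assumed the slightly stronger condition of ω-consistency.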

‘Spinoza, the great monist, objected to dualities in just about every domain. He was everywhere interested in fusing, in unifying, what seemed to be polarities, collapsing dualities. The supernatural is fused with the natural, the ontological with the logical (which is entailed in his a priori methodology), the mental with the physical, the intellectual with the emotional, the normative with the descriptive.’

‘Maybe Reality really is inconsistent with the reality of the self, and maybe then, driven by our conatus, we ought to resist Reality. That’s the path that Ahab takes, and it doesn’t end well for him, nor for those under his leadership, who have relinquished their wills to his. After all, why is it even worthwhile to struggle to know Reality, to struggle after anything at all, if the self that’s motivating the struggle is ultimately nothing at all?’

Rebecca Newberger Goldstein is an American philosopher, novelist and public intellectual. She has written ten books, both fiction and nonfiction. Here she discusses the relationship between science and philosophy, the nature of philosophical progress, Gödel’s theorems, Incompleteness, Einstein, Realism and postmodernism, Gödel and Wittgenstein, Spinoza, Spinoza and his appeal to imaginative writers, the Spinoza War, and Moby Dick.

3:16: What made you become a philosopher?

Rebecca Newberger Goldstein: I started out mostly interested in math and physics. There were problems that obsessed me in both fields—How can math be a priori? How can quantum mechanics be connected to a reality that is recognizably our own? I discovered, while in college, that these were actually philosophical problems. My senior year in college I went to speak to one of my professors, Sidney Morgenbesser, an amazing guy. He was the Socrates of his day. Socrates never published anything, but instead engaged in deeply probing dialogues. That was Sidney’s m.o. as well. Over the course of several long conversations Sidney convinced me that I was a philosopher, rather than a scientist. His clinching argument was that scientists are like the bourgeoisie, never questioning the system, whereas philosophers were like socialists, always examining the system critically: even if it seemed to be working, ought it to be working? That argument struck home. I basically applied to graduate school in philosophy on the strength of Sidney’s metaphor.

3:16: In your book ‘Plato at the Googleplex’ you have Plato brought into our century to discuss philosophy with our contemporaries. So first, can you say what you take philosophy to be and why you think it hasn’t gone away and isn’t going away soon?

RNG: I think that philosophy is the systematic and rigorous investigation of one of the deepest urges that we humans have, which is the urge to get our bearings in the most general sense possible. In particular, (1) We want to know where we are. What is the nature of this universe in which we find ourselves? What fundamentally different kinds of things, belonging to different ontological categories, are there, and how do they operate? (2) We want to know our own place in the universe, whether we’re made of the same stuff, are a special category of being, and whether the universe itself has any attitudes towards us. (3) We want to know what we are meant to do with our lives. We want to know what it is to live a life worthy for a human to live, a good life.

The first two questions are ontological, the third is normative. Science, as it eventually developed, has the lead role in answering the ontological questions, but science is often pushed on by philosophy, in its critical probing role. And the more subtle the science becomes, the bigger the role for philosophy, since it falls to philosophy’s special techniques to figure out what is and isn’t entailed by the science. For example, what changes are we required to make to our concept of time in light of Relativity Theory? And as far as the normative questions go, there the ball falls squarely in philosophy’s court, though it’s sometimes advancements in science and technology that generate the balls, that is, new normative questions.

Consider the normative questions generated by advances in AI, or the question of what obligations we have to future generations, given the threat of climate change. As long as science and technology advance, philosophers will have new questions to work on.

3:16:  You place Plato in rooms having philosophical debates with all sorts of people but you never have him decisively concluding a debate or argument. For those hostile to philosophy this might be a hostage to fortune – it suggests that philosophy can’t decide anything and actually we would be better off listening to scientists, historians and novelists. Why should we heed the philosopher in our time if not even Plato can answer its questions? What do you say to those physicists who say philosophy is just pointless and has been supplanted by – well, in their case – physics?  . . .

Continue reading.

Written by LeisureGuy

19 November 2020 at 12:36 pm

Posted in Books, Philosophy

The Next Decade Could Be Even Worse

leave a comment »

Graeme Wood talks with Peter Turchin and writes it up for the Atlantic:

Peter Turchin, one of the world’s experts on pine beetles and possibly also on human beings, met me reluctantly this summer on the campus of the University of Connecticut at Storrs, where he teaches. Like many people during the pandemic, he preferred to limit his human contact. He also doubted whether human contact would have much value anyway, when his mathematical models could already tell me everything I needed to know.

But he had to leave his office sometime. (“One way you know I am Russian is that I cannot think sitting down,” he told me. “I have to go for a walk.”) Neither of us had seen much of anyone since the pandemic had closed the country several months before. The campus was quiet. “A week ago, it was even more like a neutron bomb hit,” Turchin said. Animals were timidly reclaiming the campus, he said: squirrels, woodchucks, deer, even an occasional red-tailed hawk. During our walk, groundskeepers and a few kids on skateboards were the only other representatives of the human population in sight.

The year 2020 has been kind to Turchin, for many of the same reasons it has been hell for the rest of us. Cities on fire, elected leaders endorsing violence, homicides surging—to a normal American, these are apocalyptic signs. To Turchin, they indicate that his models, which incorporate thousands of years of data about human history, are working. (“Not all of human history,” he corrected me once. “Just the last 10,000 years.”) He has been warning for a decade that a few key social and political trends portend an “age of discord,” civil unrest and carnage worse than most Americans have experienced. In 2010, he predicted that the unrest would get serious around 2020, and that it wouldn’t let up until those social and political trends reversed. Havoc at the level of the late 1960s and early ’70s is the best-case scenario; all-out civil war is the worst.

The fundamental problems, he says, are a dark triad of social maladies: a bloated elite class, with too few elite jobs to go around; declining living standards among the general population; and a government that can’t cover its financial positions. His models, which track these factors in other societies across history, are too complicated to explain in a nontechnical publication. But they’ve succeeded in impressing writers for nontechnical publications, and have won him comparisons to other authors of “megahistories,” such as Jared Diamond and Yuval Noah Harari. The New York Times columnist Ross Douthat had once found Turchin’s historical modeling unpersuasive, but 2020 made him a believer: “At this point,” Douthat recently admitted on a podcast, “I feel like you have to pay a little more attention to him.”

Diamond and Harari aimed to describe the history of humanity. Turchin looks into a distant, science-fiction future for peers. In War and Peace and War (2006), his most accessible book, he likens himself to Hari Seldon, the “maverick mathematician” of Isaac Asimov’s Foundation series, who can foretell the rise and fall of empires. In those 10,000 years’ worth of data, Turchin believes he has found iron laws that dictate the fates of human societies.

The fate of our own society, he says, is not going to be pretty, at least in the near term. “It’s too late,” he told me as we passed Mirror Lake, which UConn’s website describes as a favorite place for students to “read, relax, or ride on the wooden swing.” The problems are deep and structural—not the type that the tedious process of democratic change can fix in time to forestall mayhem. Turchin likens America to a huge ship headed directly for an iceberg: “If you have a discussion among the crew about which way to turn, you will not turn in time, and you hit the iceberg directly.” The past 10 years or so have been discussion. That sickening crunch you now hear—steel twisting, rivets popping—is the sound of the ship hitting the iceberg.

“We are almost guaranteed” five hellish years, Turchin predicts, and likely a decade or more. The problem, he says, is that there are too many people like me. “You are ruling class,” he said, with no more rancor than if he had informed me that I had brown hair, or a slightly newer iPhone than his. Of the three factors driving social violence, Turchin stresses most heavily “elite overproduction”—the tendency of a society’s ruling classes to grow faster than the number of positions for their members to fill. One way for a ruling class to grow is biologically—think of Saudi Arabia, where princes and princesses are born faster than royal roles can be created for them. In the United States, elites overproduce themselves through economic and educational upward mobility: More and more people get rich, and more and more get educated. Neither of these sounds bad on its own. Don’t we want everyone to be rich and educated? The problems begin when money and Harvard degrees become like royal titles in Saudi Arabia. If lots of people have them, but only some have real power, the ones who don’t have power eventually turn on the ones who do.

In the United States, Turchin told me, you can see more and more aspirants fighting for a single job at, say, a prestigious law firm, or in an influential government sinecure, or (here it got personal) at a national magazine. Perhaps seeing the holes in my T-shirt, Turchin noted that a person can be part of an ideological elite rather than an economic one. (He doesn’t view himself as a member of either. A professor reaches at most a few hundred students, he told me. “You reach hundreds of thousands.”) Elite jobs do not multiply as fast as elites do. There are still only 100 Senate seats, but more people than ever have enough money or degrees to think they should be running the country. “You have a situation now where there are many more elites fighting for the same position, and some portion of them will convert to counter-elites,” Turchin said.

Donald Trump, for example, may appear elite (rich father, Wharton degree, gilded commodes), but Trumpism is a counter-elite movement. His government is packed with credentialed nobodies who were shut out of previous administrations, sometimes for good reasons and sometimes because the Groton-Yale establishment simply didn’t have any vacancies. Trump’s former adviser and chief strategist Steve Bannon, Turchin said, is a “paradigmatic example” of a counter-elite. He grew up working-class, went to Harvard Business School, and got rich as an investment banker and by owning a small stake in the syndication rights to Seinfeld. None of that translated to political power until he allied himself with the common people. “He was a counter-elite who used Trump to break through, to put the white working males back in charge,” Turchin said.

Elite overproduction creates counter-elites, and counter-elites look for allies among the commoners. If commoners’ living standards slip—not relative to the elites, but relative to what they had before—they accept the overtures of the counter-elites and start oiling the axles of their tumbrels. Commoners’ lives grow worse, and the few who try to pull themselves onto the elite lifeboat are pushed back into the water by those already aboard. The final trigger of impending collapse, Turchin says, tends to be state insolvency. At some point rising insecurity becomes expensive. The elites have to pacify unhappy citizens with handouts and freebies—and when these run out, they have to police dissent and oppress people. Eventually the state exhausts all short-term solutions, and what was heretofore a coherent civilization disintegrates.

Turchin’s prognostications would be easier to dismiss as barstool theorizing if the disintegration were not happening now, roughly as the Seer of Storrs foretold 10 years ago. If the next 10 years are as seismic as he says they will be, his insights will have to be accounted for by historians and social scientists—assuming, of course, that there are still universities left to employ such people.

Turchin was born in 1957 in Obninsk, Russia, a city built by the Soviet state as a kind of nerd heaven, where scientists could collaborate and live together. His father, Valentin, was a physicist and political dissident, and his mother, Tatiana, had trained as a geologist. They moved to Moscow when he was 7 and in 1978 fled to New York as political refugees. There they quickly found a community that spoke the household language, which was science. Valentin taught at the City University of New York, and Peter studied biology at NYU and earned a zoology doctorate from Duke.

Turchin wrote a dissertation on . . .

Continue reading. There’s more.

And see also “The Wealthy and Privileged Can Revolt, Too,” by Noah Smith in Bloomberg.

Written by LeisureGuy

13 November 2020 at 12:11 pm

An interesting approach to a theory of everything: Karl Friston’s free energy principle

leave a comment »

Shaun Raviv has a long and absorbing article in Wired that is well worth reading and pondering. It describes a principle that accounts for life at all scales, from microbes to animals to memes and cultural dynamics — the latter akin, as the article notes, to Hari Seldon’s psychohistory in the Isaac Asimov Foundation series. The article begins:

When King George III of England began to show signs of acute mania toward the end of his reign, rumors about the royal madness multiplied quickly in the public mind. One legend had it that George tried to shake hands with a tree, believing it to be the King of Prussia. Another described how he was whisked away to a house on Queen Square, in the Bloomsbury district of London, to receive treatment among his subjects. The tale goes on that George’s wife, Queen Charlotte, hired out the cellar of a local pub to stock provisions for the king’s meals while he stayed under his doctor’s care.

More than two centuries later, this story about Queen Square is still popular in London guidebooks. And whether or not it’s true, the neighborhood has evolved over the years as if to conform to it. A metal statue of Charlotte stands over the northern end of the square; the corner pub is called the Queen’s Larder; and the square’s quiet rectangular garden is now all but surrounded by people who work on brains and people whose brains need work. The National Hospital for Neurology and Neurosurgery—where a modern-day royal might well seek treatment—dominates one corner of Queen Square, and the world-renowned neuroscience research facilities of University College London round out its perimeter. During a week of perfect weather last July, dozens of neurological patients and their families passed silent time on wooden benches at the outer edges of the grass.

On a typical Monday, Karl Friston arrives on Queen Square at 12:25 pm and smokes a cigarette in the garden by the statue of Queen Charlotte. A slightly bent, solitary figure with thick gray hair, Friston is the scientific director of University College London’s storied Functional Imaging Laboratory, known to everyone who works there as the FIL. After finishing his cigarette, Friston walks to the western side of the square, enters a brick and limestone building, and heads to a seminar room on the fourth floor, where anywhere from two to two dozen people might be facing a blank white wall waiting for him. Friston likes to arrive five minutes late, so everyone else is already there.

His greeting to the group is liable to be his first substantial utterance of the day, as Friston prefers not to speak with other human beings before noon. (At home, he will have conversed with his wife and three sons via an agreed-upon series of smiles and grunts.) He also rarely meets people one-on-one. Instead, he prefers to hold open meetings like this one, where students, postdocs, and members of the public who desire Friston’s expertise—a category of person that has become almost comically broad in recent years—can seek his knowledge. “He believes that if one person has an idea or a question or project going on, the best way to learn about it is for the whole group to come together, hear the person, and then everybody gets a chance to ask questions and discuss. And so one person’s learning becomes everybody’s learning,” says David Benrimoh, a psychiatry resident at McGill University who studied under Friston for a year. “It’s very unique. As many things are with Karl.”

At the start of each Monday meeting, everyone goes around and states their questions. Friston walks in slow, deliberate circles as he listens, his glasses perched at the end of his nose, so that he is always lowering his head to see the person who is speaking. He then spends the next few hours answering the questions in turn. “A Victorian gentleman, with Victorian manners and tastes,” as one friend describes Friston, he responds to even the most confused questions with courtesy and rapid reformulation. The Q&A sessions—which I started calling “Ask Karl” meetings—are remarkable feats of endurance, memory, breadth of knowledge, and creative thinking. They often end when it is time for Friston to retreat to the minuscule metal balcony hanging off his office for another smoke.

Friston first became a heroic figure in academia for devising many of the most important tools that have made human brains legible to science. In 1990 he invented statistical parametric mapping, a computational technique that helps—as one neuroscientist put it—“squash and squish” brain images into a consistent shape so that researchers can do apples-to-apples comparisons of activity within different crania. Out of statistical parametric mapping came a corollary called voxel-­based morphometry, an imaging technique that was used in one famous study to show that the rear side of the hippocampus of London taxi drivers grew as they learned “the knowledge.” . . .

Continue reading. The segment presented here does not touch on Friston’s idea — that comes later in the article.

Written by LeisureGuy

9 November 2020 at 11:42 am

Descartes’s dictum, updated

leave a comment »

Written by LeisureGuy

31 October 2020 at 10:37 am

Posted in Medical, Philosophy

John Gray: ‘What can we learn from cats? Don’t live in an imagined future’

leave a comment »

Tim Adams interviews John Gray in the Guardian:

What’s it like to be a cat? John Gray has spent a lifetime half-wondering. The philosopher – to his many fans the intellectual cat’s pyjamas, to his critics the least palatable of furballs – has had feline companions at home since he was a boy in South Shields. In adult life – he now lives in Bath with his wife Mieko, a dealer in Japanese antiquities – this has principally been two pairs of cats: “Two Burmese sisters, Sophie and Sarah, and two Birman brothers, Jamie and Julian.” The last of them, Julian, died earlier this year, aged 23. Gray, currently catless, is by no means a sentimental writer, but his new book, Feline Philosophy: Cats and the Meaning of Life, is written in memory of their shared wisdom.

Other philosophers have been enthralled by cats over the years. There was Schrödinger and his box, of course. And Michel de Montaigne, who famously asked: “When I am playing with my cat, how do I know she is not playing with me?” The rationalist René Descartes, Gray notes, once “hurled a cat out of the window in order to demonstrate the absence of conscious awareness in non-human animals; its terrified screams were mechanical reactions, he concluded.”

One impulse for this book was a conversation with a fellow philosopher, who assured Gray that he “had taught his cat to be vegan”. (Gray had only one question: “Did the cat ever go out?” It did.) When he informed another philosopher that he was writing about what we can learn from cats, that man replied: “But cats have no history.” “And,” Gray wondered, “is that necessarily a disadvantage?”

Elsewhere, Gray has written how Ludwig Wittgenstein once observed “if lions could talk we would not understand”, to which the zookeeper John Aspinall responded: “He hasn’t spent long enough with lions.” If cats could talk, I ask Gray, do you think we would understand?

“Well, the book is in some ways an experiment in that respect,” he says. “Of course, it’s not a scientific inquiry. But if you live with a cat very closely for a long time – and it takes a long time, because they’re slow to trust, slow to really enter into communication with you – then you can probably imagine how they might philosophise.”

Gray believes that humans turned to philosophy principally out of anxiety, looking for some tranquillity in a chaotic and frightening world, telling themselves stories that might provide the illusion of calm. Cats, he suggests, wouldn’t recognise that need because they naturally revert to equilibrium whenever they’re not hungry or threatened. If cats were to give advice, it would be for their own amusement.

Readers of Gray will recognise this book as a postscript or coda to Straw Dogs: Thoughts on Humans and Other Animals, the 2002 bestseller in which he elegantly dismantled the history of western philosophy – with its illusory faith in our species living somehow “above” evolving life and outside the constraints of nature. That book aimed its fire particularly at the prevailing belief of our time: that of the inevitably steady forward progress of humankind brought about by liberal democracy. When the book came out, as George W Bush was demanding “regime change” in Iraq, it struck a particular nerve. In the two decades since, its argument that the advance of rational enlightened thought might not offer any kind of lasting protection against baser tribal instincts or environmental destruction or human folly has felt like prophecy.

Gray never bought the idea that his book was a handbook for despair. His subject was humility; his target any ideology that believed it possessed anything more than doubtful and piecemeal answers to vast and changing questions. The cat book is written in that spirit. If like me you read with a pencil to hand, you will be underlining constantly with a mix of purring enjoyment and frequent exclamation marks. “Consciousness has been overrated,” Gray will write, coolly. Or “the flaw in rationalism is the belief that human beings can live by applying a theory”. Or “human beings quickly lose their humanity but cats never stop being cats”. He concludes with a 10-point list of how cats might give their anxious, unhappy, self-conscious human companions hints “to live less awkwardly”. These range from “never try to persuade human beings to be reasonable”, to “do not look for meaning in your suffering” to “sleep for the joy of sleeping”.

Does he see that 10-point plan, offered half in earnest (“as a cat would offer it”) as an answer to those people who criticised Straw Dogs for offering little in place of what it debunked? . . .

Continue reading.

Written by LeisureGuy

27 October 2020 at 4:01 pm

How Ayn Rand Destroyed Sears; or, The Folly of Capitalistic Competition.

leave a comment »

Alfie Kohn wrote a good book worth reading: No Contest: The Case Against Competition. (He wrote a second good book worth reading, Punished by Rewards: The Trouble with Gold Stars, Incentive Plans, A’s, Praise, and Other Bribes, but that’s not the one I’m talking about.) In it, he describes research findings that demonstrate the drastic costs of competition.

Of course, Libertarians will have none of this. Blinded by Ayn Rand, they insist that the free market, unhindered by any restrictions save the laws enforcing contracts, will solve any problem efficiently, and that the commonsense observation and history of what actually happens when corporations and companies operate free of regulation and oversight must simply be ignored.

Sometimes, though, the effects of untrammeled competition are not so easily ignored, as when those effects result in the destruction of a once-towering company.

Leigh Phillips and Michal Rozworski wrote a book, The People’s Republic of Walmart: How the World’s Biggest Corporations are Laying the Foundation for Socialism, that argues that centralized planning on a vast scale can work (an idea that gives Libertarians the heebie-jeebies). In it, they include a description of what happened when free-market forces were released within Sears.

An extract from their book:

While companies like Walmart operate within the market, internally, as in any other firm, everything is planned. There is no internal market. The different departments, stores, trucks and suppliers do not compete against each other in a market; everything is coordinated.

It is no small irony then, that one of Walmart’s main competitors, the venerable, 120-plus-year-old Sears, Roebuck & Company, destroyed itself by embracing the exact opposite of Walmart ’s galloping socialization of production and distribution: by instituting an internal market.

The Sears Holdings Corporation reported losses of some $2 billion in 2016, and some $10.4 billion in total since 2011, the last year that the business turned a profit. In the spring of 2017, it was in the midst of closing another 150 stores, in addition to the 2,125 already shuttered since 2010—more than half its operation—and had publicly acknowledged “substantial doubt” that it would be able to keep any of its doors open for much longer. The stores that remain open, often behind boarded-up windows, have the doleful air of late-Soviet retail desolation: leaking ceilings, inoperative escalators, acres of empty shelves, and aisles shambolically strewn with abandoned cardboard boxes half-filled with merchandise. A solitary brand-new size-9 black sneaker lies lonesome and boxless on the ground, its partner neither on a shelf nor in a storeroom. Such employees as remain have taken to hanging bedsheets as screens to hide derelict sections from customers.

The company has certainly suffered in the way that many other brick-and-mortar outlets have in the face of the challenge from discounters such as Walmart and from online retailers like Amazon. But the consensus among the business press and dozens of very bitter former executives is that the overriding cause of Sears’s malaise is the disastrous decision by the company’s chairman and CEO, Edward Lampert, to disaggregate the company’s different divisions into competing units: to create an internal market.

From a capitalist perspective, the move appears to make sense. As business leaders never tire of telling us, the free market is the fount of all wealth in modern society. Competition between private companies is the primary driver of innovation, productivity and growth. Greed is good, per Gordon Gekko’s oft-quoted imperative from Wall Street. So one can be excused for wondering why it is, if the market is indeed as powerfully efficient and productive as they say, that all companies did not long ago adopt the market as an internal model.

Lampert, libertarian and fan of the laissez-faire egotism of Russian American novelist Ayn Rand, had made his way from working in warehouses as a teenager, via a spell with Goldman Sachs, to managing a $15 billion hedge fund by the age of 41. The wunderkind was hailed as the Steve Jobs of the investment world. In 2003, the fund he managed, ESL Investments, took over the bankrupt discount retail chain Kmart (launched the same year as Walmart). A year later, he parlayed this into a $12 billion buyout of a stagnating (but by no means troubled) Sears.

At first, the familiar strategy of merciless, life-destroying post-acquisition cost cutting and layoffs did manage to turn around the fortunes of the merged Kmart-Sears, now operating as Sears Holdings. But Lampert’s big wheeze went well beyond the usual corporate raider tales of asset stripping, consolidation and chopping-block use of operations as a vehicle to generate cash for investments elsewhere. Lampert intended to use Sears as a grand free market experiment to show that the invisible hand would outperform the central planning typical of any firm.

He radically restructured operations, splitting the company into thirty, and later forty, different units that were to compete against each other. Instead of cooperating, as in a normal firm, divisions such as apparel, tools, appliances, human resources, IT and branding were now in essence to operate as autonomous businesses, each with their own president, board of directors, chief marketing officer and statement of profit or loss. An eye-popping 2013 series of interviews by Bloomberg Businessweek investigative journalist Mina Kimes with some forty former executives described Lampert’s Randian calculus: “If the company’s leaders were told to act selfishly, he argued, they would run their divisions in a rational manner, boosting overall performance.”

He also believed that the new structure, called Sears Holdings Organization, Actions, and Responsibilities, or SOAR, would improve the quality of internal data, and that in so doing it would give the company an edge akin to statistician Paul DePodesta’s use of unconventional metrics at the Oakland Athletics baseball team (made famous by the book, and later film starring Brad Pitt, Moneyball). Lampert would go on to place DePodesta on Sears’s board of directors and hire Steven Levitt, coauthor of the pop neoliberal economics bestseller Freakonomics, as a consultant. Lampert was a laissez-faire true believer. He never seems to have got the memo that the story about the omnipotence of the free market was only ever supposed to be a tale told to frighten young children, and not to be taken seriously by any corporate executive.

And so if the apparel division wanted to use the services of IT or human resources, they had to sign contracts with them, or alternately to use outside contractors if it would improve the financial performance of the unit—regardless of whether it would improve the performance of the company as a whole. Kimes tells the story of how .  . .

Continue reading.

In my view, Libertarians love logic but fail to recognize its limitations. As Oliver Wendell Holmes, Jr. observed with respect to the law, “The life of the law has not been logic: it has been experience.” Logic can readily take you to a place or to conclusions that experience shows are bad, that don’t work. Generally this happens because the logic is using false assumptions, of which Libertarians have an abundant supply — cf. the Sears story above, a tragic clash between logic and experience.

Written by LeisureGuy

23 October 2020 at 4:45 pm
