Later On

A blog written for those whose interests more or less match mine.

Archive for the ‘Education’ Category

No absolute time: Newton got it wrong, Hume saw it right, and Einstein learned from Hume how relativity would work

Matias Slavov, a postdoctoral researcher in philosophy at Tampere University in Finland, writes in Aeon:

In 1915, Albert Einstein wrote a letter to the philosopher and physicist Moritz Schlick, who had recently composed an article on the theory of relativity. Einstein praised it: ‘From the philosophical perspective, nothing nearly as clear seems to have been written on the topic.’ Then he went on to express his intellectual debt to ‘Hume, whose Treatise of Human Nature I had studied avidly and with admiration shortly before discovering the theory of relativity. It is very possible that without these philosophical studies I would not have arrived at the solution.’

More than 30 years later, his opinion hadn’t changed, as he recounted in a letter to his friend, the engineer Michele Besso: ‘In so far as I can be aware, the immediate influence of D Hume on me was greater. I read him with Konrad Habicht and Solovine in Bern.’ We know that Einstein studied Hume’s Treatise (1738-40) in a reading circle with the mathematician Conrad Habicht and the philosophy student Maurice Solovine around 1902-03. This was in the process of devising the special theory of relativity, which Einstein eventually published in 1905. It is not clear, however, what it was in Hume’s philosophy that Einstein found useful to his physics. We should therefore take a closer look.

In Einstein’s autobiographical writing from 1949, he expands on how Hume helped him formulate the theory of special relativity. It was necessary to reject the erroneous ‘axiom of the absolute character of time, viz, simultaneity’, since the assumption of absolute simultaneity

unrecognisedly was anchored in the unconscious. Clearly to recognise this axiom and its arbitrary character really implies already the solution of the problem. The type of critical reasoning required for the discovery of this central point [the denial of absolute time, that is, the denial of absolute simultaneity] was decisively furthered, in my case, especially by the reading of David Hume’s and Ernst Mach’s philosophical writings.

In the view of John D Norton, professor of the history and philosophy of science at the University of Pittsburgh, Einstein learned an empiricist theory of concepts from Hume (and plausibly from Mach and the positivist tradition). He then implemented concept empiricism in his argument for the relativity of simultaneity. The result is that different observers will not agree whether two events are simultaneous or not. Take the openings of two windows, a living room window and a kitchen window. There is no absolute fact of the matter as to whether the living room window opens before the kitchen window, or whether they open simultaneously or in reverse order. The temporal order of such events is observer-dependent; it is relative to the designated frame of reference.
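
To make the window example concrete, here is a minimal numerical sketch of the relativity of simultaneity (my illustration, not from the article; it applies the Lorentz time transformation t′ = γ(t − vx/c²) to two invented events, with a made-up observer speed and window separation):

    import math

    C = 299_792_458.0  # speed of light, m/s

    def moving_frame_time(t, x, v):
        """Time coordinate of the event (t, x) for an observer moving at speed v."""
        gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
        return gamma * (t - v * x / C ** 2)

    # Both windows open at t = 0 in the house's rest frame, one kilometre apart.
    v = 0.5 * C  # an observer passing at half the speed of light
    t_living_room = moving_frame_time(0.0, 0.0, v)
    t_kitchen = moving_frame_time(0.0, 1000.0, v)
    print(t_kitchen - t_living_room)  # ≈ -1.9e-06 s: for this observer the kitchen window opens first

Swap the sign of v and the order reverses; that is the observer-dependence Norton describes.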

Once the relativity of simultaneity was established, Einstein was able to reconcile the seemingly irreconcilable aspects of his theory, the principle of relativity and the light postulate. This conclusion required abandoning the view that there is such a thing as an unobservable time that grounds temporal order. This is the view that Einstein got from Hume.

Hume’s influence on intellectual culture is massive. This includes all areas of philosophy and a variety of scientific disciplines. A poll conducted with professional philosophers a few years ago asked them to name the philosopher, no longer living, with whom they most identify. Hume won, by a clear margin. In Julian Baggini’s estimation, contemporary ‘scientists, who often have little time for philosophy, often make an exception for Hume’. Before saying more about Hume’s permanent relevance, we should go back to the 18th-century early modern context. His influence is due to his radical empiricism, which can’t be fully understood without examining the era in which he worked.

The dominant theory of cognition of early modern philosophy was idea theory. Ideas denote both mental states and the material of our thinking. A mental state is, for example, a toothache, and the material of our thinking is thoughts, for example, of a mathematical object such as a triangle. The clearest proponent of the theory of ideas was the French philosopher René Descartes, for whom philosophical enquiry is essentially an investigation of the mind’s ideas. In one of his letters, he explains why ideas are so important: ‘I am certain that I can have no knowledge of what is outside me except by means of the ideas I have within me.’ If we wish to gain any certainty in our investigations of any aspect of the world – whether the object of our investigation is the human mind or some natural phenomenon – we need to have a clear and distinct idea of the represented object in question.

Hume’s theory of ideas differs from Descartes’s because he rejects innatism. This view goes back to Plato’s doctrine of anamnesis, which maintains that all learning is a form of recollection as everything we learn is in us before we are taught. The early modern version of innatism emphasises that the mind is not a blank slate, but we are equipped with some ideas before our birth and sensory perception. Hume starts at the same point as his fellow Briton and predecessor, John Locke. The mind begins to have ideas when it begins to perceive. To ask when a human being acquires ideas in the first place ‘is to ask when he begins to perceive; having ideas and perception being the same thing,’ writes Locke in An Essay Concerning Human Understanding (1689). Drawing on this insight, Hume devised his copy principle.

Perception, for Hume, is divided into ideas and impressions. The difference between the two is  . . .

Continue reading.

You can get a Kindle edition of Hume’s books and essays for 77¢.

Written by LeisureGuy

16 January 2021 at 4:33 pm

C.S. Lewis On The Reading of Old Books

C.S. Lewis, in his introduction to Athanasius’ On the Incarnation:

There is a strange idea abroad that in every subject the ancient books should be read only by the professionals, and that the amateur should content himself with the modern books. Thus I have found as a tutor in English Literature that if the average student wants to find out something about Platonism, the very last thing he thinks of doing is to take a translation of Plato off the library shelf and read the Symposium. He would rather read some dreary modern book ten times as long, all about “isms” and influences and only once in twelve pages telling him what Plato actually said.

The error is rather an amiable one, for it springs from humility. The student is half afraid to meet one of the great philosophers face to face. He feels himself inadequate and thinks he will not understand him. But if he only knew, the great man, just because of his greatness, is much more intelligible than his modern commentator.

The simplest student will be able to understand, if not all, yet a very great deal of what Plato said; but hardly anyone can understand some modern books on Platonism. It has always therefore been one of my main endeavours as a teacher to persuade the young that firsthand knowledge is not only more worth acquiring than secondhand knowledge, but is usually much easier and more delightful to acquire.

This mistaken preference for the modern books and this shyness of the old ones is nowhere more rampant than in theology. Wherever you find a little study circle of Christian laity you can be almost certain that they are studying not St. Luke or St. Paul or St. Augustine or Thomas Aquinas or Hooker or Butler, but M. Berdyaev or M. Maritain or M. Niebuhr or Miss Sayers or even myself.

Now this seems to me topsy-turvy. Naturally, since I myself am a writer, I do not wish the ordinary reader to read no modern books. But if he must read only the new or only the old, I would advise him to read the old. And I would give him this advice precisely because he is an amateur and therefore much less protected than the expert against the dangers of an exclusive contemporary diet.

A new book is still on its trial and the amateur is not in a position to judge it. It has to be tested against the great body of Christian thought down the ages, and all its hidden implications (often unsuspected by the author himself) have to be brought to light.

Often it cannot be fully understood without the knowledge of a good many other modern books. If you join at eleven o’clock a conversation which began at eight you will often not see the real bearing of what is said. Remarks which seem to you very ordinary will produce laughter or irritation and you will not see why—the reason, of course, being that the earlier stages of the conversation have given them a special point.

In the same way sentences in a modern book which look quite ordinary may be directed at some other book; in this way you may be led to accept what you would have indignantly rejected if you knew its real significance. The only safety is to have a standard of plain, central Christianity (“mere Christianity” as Baxter called it) which puts the controversies of the moment in their proper perspective. Such a standard can be acquired only from the old books.

It is a good rule, after reading a new book, never to allow yourself another new one till you have read an old one in between. If that is too much for you, you should at least read one old one to every three new ones.

Every age has its own outlook. It is specially good at seeing certain truths and specially liable to make certain mistakes. We all, therefore, need the books that will correct the characteristic mistakes of our own period. And that means the old books.

All contemporary writers share to some extent the contemporary outlook—even those, like myself, who seem most opposed to it. Nothing strikes me more when I read the controversies of past ages than the fact that both sides were usually assuming without question a good deal which we should now absolutely deny. They thought that they were as completely opposed as two sides could be, but in fact they were all the time secretly united—united with each other and against earlier and later ages—by a great mass of common assumptions.

We may be sure that the characteristic blindness of the twentieth century—the blindness about which posterity will ask, “But how could they have thought that?”—lies where we have never suspected it, and concerns something about which there is untroubled agreement between Hitler and President Roosevelt or between Mr. H. G. Wells and Karl Barth. None of us can fully escape this blindness, but we shall certainly increase it, and weaken our guard against it, if we read only modern books. Where they are true they will give us truths which we half knew already. Where they are false they will aggravate the error with which we are already dangerously ill.

The only palliative is to keep the clean sea breeze of the centuries blowing through our minds, and this can be done only by reading old books. Not, of course, that there is any magic about the past. People were no cleverer then than they are now; they made as many mistakes as we. But not the same mistakes. They will not flatter us in the errors we are already committing; and their own errors, being now open and palpable, will not endanger us. Two heads are better than one, not because either is infallible, but because they are unlikely to go wrong in the same direction. To be sure, the books of the future would be just as good a corrective as the books of the past, but unfortunately we cannot get at them.

I myself was first led into reading the Christian classics, almost accidentally, as a result of my English studies. Some, such as Hooker, Herbert, Traherne, Taylor and Bunyan, I read because they are themselves great English writers; others, such as Boethius, St. Augustine, Thomas Aquinas and Dante, because they were “influences.” George MacDonald I had found for myself at the age of sixteen and never wavered in my allegiance, though I tried for a long time to ignore his Christianity.

They are, you will note, a mixed bag, representative of many Churches, climates and ages. And that brings me to yet another reason for reading them. The divisions of Christendom are undeniable and are by some of these writers most fiercely expressed. But if any man is tempted to think—as one might be tempted who read only contemporaries—that “Christianity” is a word of so many meanings that it means nothing at all, he can learn beyond all doubt, by stepping out of his own century, that this is not so.

Measured against the ages “mere Christianity” turns out to be no insipid interdenominational transparency, but something positive, self-consistent, and inexhaustible. I know it, indeed, to my cost. In the days when I still hated Christianity, I learned to recognise, like some all too familiar smell, that almost unvarying something which met me, now in Puritan Bunyan, now in Anglican Hooker, now in Thomist Dante. It was there (honeyed and floral) in François de Sales; it was there (grave and homely) in Spenser and Walton; it was there (grim but manful) in Pascal and Johnson; there again, with a mild, frightening, Paradisial flavour, in Vaughan and Boehme and Traherne.

In the urban sobriety of the eighteenth century one was not safe—Law and Butler were two lions in the path. The supposed “Paganism” of the Elizabethans could not keep it out; it lay in wait where a man might have supposed himself safest, in the very centre of The Faerie Queene and the Arcadia. It was, of course, varied; and yet—after all—so unmistakably the same; recognisable, not to be evaded, the odour which is death to us until we allow it to become life:

an air that kills
From yon far country blows.

We are all rightly distressed, and ashamed also, at the divisions of Christendom. But those who have always lived within the Christian fold may be too easily dispirited by them. They are bad, but such people do not know what it looks like from without. Seen from there, what is left intact despite all the divisions, still appears (as it truly is) an immensely formidable unity. I know, for I saw it; and well our enemies know it. That unity any of us can find by going out of his own age.

It is not enough, but it is more than you had thought till then. Once you are well soaked in it, if you then venture to speak, you will have an amusing experience. You will be thought a Papist when you are actually reproducing Bunyan, a Pantheist when you are quoting Aquinas, and so forth. For you have now got on to the great level viaduct which crosses the ages and which looks so high from the valleys, so low from the mountains, so narrow compared with the swamps, and so broad compared with the sheep-tracks.

The present book is something of an experiment. The translation is intended for the world at large, not only for theological students. If it succeeds, other translations of other great Christian books will presumably follow. In one sense, of course, it is not the first in the field. Translations of the Theologia Germanica, the Imitation, the Scale of Perfection, and the Revelations of Lady Julian of Norwich, are already on the market, and are very valuable, though some of them are not very scholarly.

But it will be noticed that these are all books of  . . .

Continue reading. There’s more.

Written by LeisureGuy

15 January 2021 at 12:48 pm

Posted in Books, Daily life, Education

How to Talk with a Conspiracy Theorist (and Why People Believe Conspiracy Theories in the First Place) and What Experts Recommend

Josh Jones writes in Open Culture:

Why do people pledge allegiance to views that seem fundamentally hostile to reality? Maybe believers in shadowy, evil forces and secret cabals fall prey to motivated reasoning. Truth for them is what they need to believe in order to get what they want. Their certainty in the justness of a cause can feel as comforting as a warm blanket on a winter’s night. But conspiracy theories go farther than private delusions of grandeur. They have spilled into the streets, into the halls of the U.S. Capitol building and various statehouses. Conspiracy theories about a “stolen” 2020 election are out for blood.

As distressing as such recent public spectacles seem at present, they hardly come near the harm accomplished by propaganda like Plandemic—a short film that claims the COVID-19 crisis is a sinister plot—part of a wave of disinformation that has sent infection and death rates soaring into the hundreds of thousands.

We may never know the numbers of people who have infected others by refusing to take precautions for themselves, but we do know that the number of people in the U.S. who believe conspiracy theories is alarmingly high.

A Pew Research survey of adults in the U.S. “found that 36% thought that these conspiracy theories” about the election and the pandemic “were probably or definitely true,” Tanya Basu writes at the MIT Technology Review. “Perhaps some of these people are your family, your friends, your neighbors.” Maybe you are a conspiracy theorist yourself. After all, “it’s very human and normal to believe in conspiracy theories…. No one is above [them]—not even you.” As Cass Sunstein (author of Conspiracy Theories and Other Dangerous Ideas) says in the Vox video above, we all resist facts that contradict cherished beliefs and the communities of people who hold them.

So how do we distinguish between reality-based views and conspiracy theories if we’re all so prone to the latter? Standards of logical reasoning and evidence still help separate truth from falsehood in laboratories. When it comes to the human mind, emotions are just as important as data. “Conspiracy theories make people feel as though they have some sort of control over the world,” says Daniel Romer, a psychologist and research director at the University of Pennsylvania’s Annenberg Public Policy Center. They’re airtight, as Wired shows below, and it can be useless to argue. . .

Continue reading. There’s much more, including more brief videos worth viewing.

Later in the article:

[A]n abridged version of MIT Technology Review’s ten tips for reasoning with a conspiracy theorist, and read Basu’s full article here.

  1. Always, always speak respectfully: “Without respect, compassion, and empathy, no one will open their mind or heart to you. No one will listen.”
  2. Go private: Using direct messages when online “prevents discussion from getting embarrassing for the poster, and it implies a genuine compassion and interest in conversation rather than a desire for public shaming.”
  3. Test the waters first: “You can ask what it would take to change their mind, and if they say they will never change their mind, then you should take them at their word and not bother engaging.”
  4. Agree: “Conspiracy theories often feature elements that everyone can agree on.”
  5. Try the “truth sandwich”: “Use the fact-fallacy-fact approach, a method first proposed by linguist George Lakoff.”
  6. Or use the Socratic method: This “challenges people to come up with sources and defend their position themselves.”
  7. Be very careful with loved ones: “Biting your tongue and picking your battles can help your mental health.”
  8. Realize that some people don’t want to change, no matter the facts.
  9. If it gets bad, stop: “One r/ChangeMyView moderator suggested ‘IRL calming down’: shutting off your phone or computer and going for a walk.”
  10. Every little bit helps. “One conversation will probably not change a person’s mind, and that’s okay.”

Written by LeisureGuy

13 January 2021 at 10:55 am

Why poor people find Trump attractive

This is a Twitter thread that seems to have been deleted. It is by @jpbrammer and was posted 18 Nov 2016. I have typed it out from screengrabs of the tweets.

So I’m a Mexican-American from a poor rural (mostly white) town in Oklahoma. Missing from this debate? How poor whites see themselves.

If you’re wondering how poor exploited white people could vote for a dude with a golden elevator who will fuck them over, here’s how.

They don’t see themselves as poor. They don’t base their identity on it. They see themselves as “temporarily embarrassed millionaires.”

The stigma against poverty is incredibly strong. It is shameful to be poor, to not have the comforts of the middle class. So they pretend —

that they aren’t poor. They are willing to lie to make it seem that they aren’t poor. They purchase things to make it seem like they’re not.

In my town, wealth wasn’t associated with greed, but with hard work and inherent goodness. You are blessed if you have material wealth.

When they see Trump they don’t see an extortionist who is rich because of the very conditions that keep their own communities in poverty.

They see someone who worked hard and was justly rewarded with wealth. Most men, especially, think they too could be Trump were it not for

the unfair obstacles put in their way. White men who don’t consider themselves successful enough have so many excuses for their “failures.”

The idea that immigrants are the reason they are poor and not wealthy like Trump is so appealing. It takes all the shame and blame away.

And here we have a man who, they think, “tells it like it is” and is willing to name the things stealing prosperity out of their hands.

If these people saw themselves as an exploited class of people, if American culture didn’t stigmatize poverty so much, it might be different.

But America has so entangled wealth with goodness and poverty with moral deficiency that they can’t build that identity. They won’t.

Trump is rich, and so according to American criteria, he is also:
1. Wise
2. Fair
3. Moral
4. Deserving
5. Strong
6. Clever
He *has* to be.

Capitalism and the American Dream teach that poverty is a temporary state that can be transcended with hard work and cleverness.

To fail to transcend poverty, and to admit that you are poor, is to admit that you are neither hardworking nor clever. It’s cultural brainwashing.

So if an exploited class of people don’t want to admit they’re exploited and they blame themselves for their oppression, what manifests?

Xenophobia. Hatred of anyone who is “different,” queer people, people of color. These people are eroding the “goodness” of America.

And if they would just stop ruining America, then the perfect design of America could work again and deliver prosperity.

I’m telling you, as someone who has spent almost his entire life in this environment, that if you think cities are a “bubble…” Good God.

How you balance those realities, and what conclusions you reach to improve the lives of both, well, I’m not smart enough to have the answer.

Still, we need to understand the identity working-class white people have built for themselves, one diametrically opposed to, well, reality.

Because Trump won’t make them rich. Even if he deports all the brown people, it won’t bring them what they’re hoping for.

It strikes me that once a person falls into accepting an illusion as true, they become vulnerable to more deceptions because they’ve lost touch with the testing ground of reality: false hopes, false dreams, and false statements have more power over those who already live in self-deception or who already believe a false vision.

Written by LeisureGuy

10 January 2021 at 3:01 pm

New article on Medium: “Choosing Which Student Goes Next”

The article will be primarily of interest to teachers. If you teach or know someone who does, take a look.

Written by LeisureGuy

4 January 2021 at 10:49 am

Diet Drift and a Hard Reset: Learning to recover from failure

I just had another article published on Medium. It discusses something I’ve blogged about, though with an emphasis on the learning aspect.

Written by LeisureGuy

27 December 2020 at 5:18 pm

The Vatican’s Latinist

John Byron Kuhner wrote in The New Criterion in 2017:

In 1970, the Procurator General of the Discalced Carmelite Order, Finian Monahan, was summoned to the Vatican for a meeting. The subject of the meeting was a promising young American priest by the name of Reginald Foster. The head Latinist of the Vatican’s State Department had tapped Foster to write papal correspondence, which was at the time composed entirely in Latin. Foster wanted the job but was bound by a vow of obedience, and the decision would be made by his superiors. Monahan intended to resist. Foster, thirty years of age, had proven himself to be both supremely intellectually gifted and utterly reliable—a precious thing at a time when the Catholic Church’s religious orders were hemorrhaging priests. Monahan thought Latin was a dead end. He didn’t want to lose one of his best to a Vatican department that would only get less and less important every year. He said Foster would go to the Vatican “over my dead body.”

Foster remembers the meeting vividly. “So we arrive there, and we’re ushered into this office, and who do we find there but Ioannes Benelli,” Foster says, using Benelli’s Latin name, as was customary at the Vatican at that time. He continues:

Benelli was Paul VI’s hatchet man—whenever he wanted something to get done, he called on Benelli. He was very energetic—got things done, and no nonsense. Everyone was terrified of him. I was too, and now here we were in the room with him, and he turns to Monahan and says, “This is Foster?” The General said yes. Then Benelli said, “Thank you very much, we won’t be needing you anymore.” And he took me by the hand and brought me down to the State Department and that was the end of that. Monahan didn’t say a word. I was now working for the Pope, and it was like I was more or less out of the Carmelite Order. A lot of the time the Order didn’t even really know what I was doing.

Foster would spend the next forty years at the Vatican, part of a small team of scribes who composed the pope’s correspondence, translated his encyclicals, and wrote copy for internal church documents. His somewhat unique position between the Carmelite Order and the Vatican bureaucracy meant that in fact he had a great deal of freedom for a priest. Later in his career his loose tongue—some in the church called it a loose cannon—would attract the notice of journalists looking for interesting copy. “Sacred language?” he said when asked about Latin as the “sacred language” of the church. “In the first century every prostitute in Rome spoke it fluently—and much better than most people in the Roman Curia.” The Minnesota Star Tribune quoted him as saying “I like to say mass in the nude,” which caused a small Curial kerfuffle (Foster claims he was misquoted). He appeared in Bill Maher’s movie Religulous, which featured him agreeing with the proposition that the Vatican itself was at odds with the message of Jesus, that the pope should not be living in a palace, and that hell and “that Old Catholic stuff” was “finished” and “gone.” Foster says the pope received complaints from bishops and cardinals about his appearance. “They said ‘Who is this Latinist of yours and what the hell is he doing?’ They would have fired me for sure. But by the time the film came out I was sick and a few months away from retirement anyway. So they just waited it out and let me go quietly.” He had already been fired from his post at the Pontifical Gregorian University for allowing dozens of students to take his classes without paying for them.

Besides being the Pope’s Latinist and “one of the Vatican’s most colorful characters” (as the Catholic News Service called him), Foster has been a tireless champion of Latin in the classroom. Indeed, Foster’s greatest legacy may be as a teacher. “The most influential Latin teacher in the last half-century is Reggie Foster,” says Dr. Nancy Llewellyn, professor of Latin at Wyoming Catholic College. “That’s not just my opinion—that’s a fact. For decades, he had the power to change lives like no other teacher in our field. I saw him for an hour in Rome in 1985 and that one hour completely changed my life. His approach was completely different from every other Latin teacher out there, and it was totally transformative.”

For Foster, a humanist par excellence, Latin was not something to be dissected by linguistic analysis or to serve as the raw data for a theory of gender or poetics: it was a language, a medium of human connection. I first met Foster in 1995, at his summer school, and couldn’t get enough: I returned seven times. No one on Earth was reading as much Latin as he and his students were, but he was more like an old-school newspaper editor than an academic: he wanted the story. But for that you actually had to know Latin, and know it well. Foster was ruthless about ignorance, and equally ruthless about anything that to him looked like mere academic posturing. “I don’t care about your garbage literary theory!” he barked at his students one day. “I can tell in about ten seconds if you know the Latin or if you are making it all up.” “Latin is the best thing that ever happened to humanity. It leaves you zero room for nonsense. You don’t have to be a genius. But it requires laser-sharp concentration and total maturity. If you don’t know what time of day it is, or what your name is, or where you are, don’t try Latin because it will smear you on the wall like an oil spot.” The number of Foster’s students runs into the thousands, and many of them are now themselves some of the most dedicated teachers in the field. “When I was in college I asked people, ‘Hey, we all know Latin is a language. Does anybody actually speak it anymore?’ And they told me there was one guy, some guy at the Vatican, who still spoke the language, and that was Fr. Foster,” says Dr. Michael Fontaine, a professor of Classics at Cornell University. “I said to myself, ‘I have to study with this guy.’ And that changed everything for me.” Dr. Paul Gwynne, professor of Medieval and Renaissance Studies at the American University of Rome, said of Foster, “He is not just the best Latin teacher I’ve ever seen, he’s simply the best teacher I’ve ever seen. Studying Latin with the Pope’s apostolic secretary, for whom the language is alive, using the city of Rome as a classroom . . . it changed my whole outlook on life, really.”

Time seems to bend around Foster, and past and present intertwine. When I wrote to Fr. Antonio Salvi, the current head of the Vatican’s Latin department, for comment about Foster, he responded entirely in Latin, beginning with four words that sounded like an old soldier praising Cato—“Probus vir, parvo contentus.” An upright man. Content with little. And in many ways Foster’s life resembles that of a medieval saint: at the age of six, he would play priest, ripping up old sheets as vestments. He entered seminary at thirteen. He said he wanted only three things in life: to be a priest, to be a Carmelite, and to do Latin. He has spent his entire life in great personal poverty. His cell had no mattress: he slept on the tile floor with a thin blanket. His clothes were notorious in Rome: believing that . . .

Continue reading.

Full disclosure: The Younger Daughter teaches Latin and Classical Greek.

Written by LeisureGuy

27 December 2020 at 8:42 am

Posted in Daily life, Education, Religion

Cellphones cripple social skills

Ron Srigley writes in MIT Technology Review:

A few years ago, I performed an experiment in a philosophy class I was teaching. My students had failed a midterm test rather badly. I had a hunch that their pervasive use of cell phones and laptops in class was partly responsible. So I asked them what they thought had gone wrong. After a few moments of silence, a young woman put up her hand and said: “We don’t understand what the books say, sir. We don’t understand the words.” I looked around the class and saw guileless heads pensively nodding in agreement.

I extemporized a solution: I offered them extra credit if they would give me their phones for nine days and write about living without them. Twelve students—about a third of the class—took me up on the offer. What they wrote was remarkable, and remarkably consistent. These university students, given the chance to say what they felt, didn’t gracefully submit to the tech industry and its devices.

The usual industry and education narrative about cell phones, social media, and digital technology generally is that they build community, foster communication, and increase efficiency, thus improving our lives. Mark Zuckerberg’s recent reformulation of Facebook’s mission statement is typical: the company aims to “give people the power to build community and bring the world closer together.”

Without their phones, most of my students initially felt lost, disoriented, frustrated, and even frightened. That seemed to support the industry narrative: look how disconnected and lonely you’ll be without our technology. But after just two weeks, the majority began to think that their cell phones were in fact limiting their relationships with other people, compromising their own lives, and somehow cutting them off from the “real” world. Here is some of what they said.

“You must be weird or something”

“Believe it or not, I had to walk up to a stranger and ask what time it was. It honestly took me a lot of guts and confidence to ask someone,” Janet wrote. (Her name, like the others here, is a pseudonym.) She describes the attitude she was up against: “Why do you need to ask me the time? Everyone has a cell phone. You must be weird or something.” Emily went even further. Simply walking by strangers “in the hallway or when I passed them on the street” caused almost all of them to take out a phone “right before I could gain eye contact with them.”

To these young people, direct, unmediated human contact was experienced as ill-mannered at best and strange at worst. James: “One of the worst and most common things people do nowadays is pull out their cell phone and use it while in a face-to-face conversation. This action is very rude and unacceptable, but yet again, I find myself guilty of this sometimes because it is the norm.” Emily noticed that “a lot of people used their cell phones when they felt they were in an awkward situation, for an example [sic] being at a party while no one was speaking to them.”

The price of this protection from awkward moments is the loss of human relationships, a consequence that almost all the students identified and lamented. Without his phone, James said, he found himself forced to look others in the eye and engage in conversation. Stewart put a moral spin on it. “Being forced to have [real relations with people] obviously made me a better person because each time it happened I learned how to deal with the situation better, other than sticking my face in a phone.” Ten of the 12 students said their phones were compromising their ability to have such relationships.

Virtually all the students admitted that ease of communication was one of the genuine benefits of their phones. However, eight out of 12 said they were genuinely relieved not to have to answer the usual flood of texts and social-media posts. Peter: “I have to admit, it was pretty nice without the phone all week. Didn’t have to hear the fucking thing ring or vibrate once, and didn’t feel bad not answering phone calls because there were none to ignore.”

Indeed, the language they used indicated that they experienced this activity almost as a type of harassment. “It felt so free without one and it was nice knowing no one could bother me when I didn’t want to be bothered,” wrote William. Emily said that she found herself “sleeping more peacefully after the first two nights of attempting to sleep right away when the lights got shut off.” Several students went further and claimed that communication with others was in fact easier and more efficient without their phones. Stewart: “Actually I got things done much quicker without the cell because instead of waiting for a response from someone (that you don’t even know if they read your message or not) you just called them [from a land line], either got an answer or didn’t, and moved on to the next thing.”

Technologists assert that their instruments make us more productive. But for the students, phones had the opposite effect. “Writing a paper and not having a phone boosted productivity at least twice as much,” Elliott claimed. “You are concentrated on one task and not worrying about anything else. Studying for a test was much easier as well because I was not distracted by the phone at all.” Stewart found he could “sit down and actually focus on writing a paper.” He added, “Because I was able to give it 100% of my attention, not only was the final product better than it would have been, I was also able to complete it much quicker.” Even Janet, who missed her phone more than most, admitted, “One positive thing that came out of not having a cell phone was that I found myself more productive and I was more apt to pay attention in class.”

Some students felt not only distracted by their phones, but morally compromised. Kate: “Having a cell phone has actually affected my personal code of morals and this scares me … I regret to admit that I have texted in class this year, something I swore to myself in high school that I would never do … I am disappointed in myself now that I see how much I have come to depend on technology … I start to wonder if it has affected who I am as a person, and then I remember that it already has.” And James, though he says we must continue to develop our technology, said that “what many people forget is that it is vital for us not to lose our fundamental values along the way.”

Other students were worried that their cell-phone addiction was depriving them of a relationship to the world. Listen to James: “It is almost like . . .

Continue reading.

Written by LeisureGuy

27 December 2020 at 7:59 am

“Why I chose to study classics”

Charlotte Higgins writes in Prospect:

In my final Classical Musing column, I wanted to try to set down what it is about classics that I find worthwhile. In my day job, I am a writer on the Guardian. Journalism is, by definition, about the events of the day, which rush past at a bewildering speed. The literature of the deep past offers a respite and, to an extent, an escape. This was certainly true on a personal level in the early months of the pandemic, when I was on leave, immersed in writing a book of retellings of stories from the Greek myths.

Disappearing into the world of Arachne, Penelope and Medea offered a defence against the anxieties of the moment. But classics also offers a new perspective on the modern world; a different lens through which to see our own times. Encountering the literature of the past is a dynamic process. When we read, we cannot leave behind our contemporary baggage; we see ourselves reflected back in those old books. In turn, reading ancient texts can cast light on our own moment. You can understand a lot about power in the time of an epidemic by reading Sophocles’ Oedipus Tyrannus; you can get an intriguing perspective on modern patriarchy by reading Aeschylus’s The Kindly Ones.

I’m often asked why I chose to study classics. It was mostly because I fell in love with the stories. The world of Catullus and Euripides was so familiar and yet so thrillingly alien, so hard to decode. I have also come to realise, though, that it was also because I wanted to escape having to compete intellectually with my father and brothers, who were all gifted in the sciences. Honesty is required: I also liked classics partly because Latin and Greek sounded impressive, especially to my parents, neither of whom were from the kind of background where Greek epigrams ran in the veins. These days, I would put it in terms of cultural capital. Classics held lots of traditional cachet, and I wanted to partake of it.

This cachet, of course, was unfairly bestowed. The discipline is currently beginning to face up to its historical role in shaping a damaging worldview that put the Graeco-Roman world at the centre of a rhetoric of white European and north American exceptionalism and superiority. But this history does not mean classics should be abandoned—or written off by the left, as it sometimes is, as “elitist”. (It’s always worth remembering that Karl Marx was a classicist, and that his PhD in Greek materialist philosophy palpably shaped his theories of historical materialism.) It means, rather, that classics should constantly be reshaped, opened out, rethought—and some really exciting scholarship in the field is doing exactly that.

The death of classics has been predicted for centuries. It is indeed suffering setbacks. As I sat down to write this column, I heard that an attempt to reintroduce a route for trainee teachers to qualify in Latin in Scotland has been unsuccessful. This kind of knockback is being energetically fought by educators and organisations such as Advocating Classics Education. Classics in the culture at large continues to find a ready audience: one thinks of the popularity of Madeline Miller’s novels, Mary Beard’s history books and television programmes, and Emily Wilson’s brilliant Odyssey translation. Classics generates creativity—I think not just of  . . .

Continue reading.

Written by LeisureGuy

19 December 2020 at 11:04 am

Posted in Books, Daily life, Education

Jack Reacher, Prospero, and Lee Child

I’m a big Lee Child fan, right up there close to Barry Eisler (who I think is a bit better). I liked this article in Crime Reads by Heather Martin (author of a biography of Lee Child/Jim Grant):

The boy who would one day become Lee Child was eleven years old when the light dawned. He was on the number 16 bus, heading home from King Edward’s School to Handsworth Wood in Birmingham and reading Ian Fleming’s Dr No. He may have been smoking—he’d found his first pack of ten Rothman’s King Size on the very same route just a few months earlier. He loved it from the first drag.

Earlier that day, he’d been reading ‘Theseus and the Minotaur’ at school. His Latin master rated the eleven-year-old Jim Grant highly: ‘Has gone from good to very good,’ read his report in Autumn 1965; then in Summer 1966: ‘Clever boy: with clever classmates would shoot ahead.’ 

Even on his own he was doing OK, and it struck him like an epiphany: ‘Theseus and the Minotaur’ and Dr No ‘were exactly the same story in every beat and every plot point’. The same stories were being told over and over again, and the better they were the more often they were told, and the more often people wanted to hear them and the more popular they became. Increase of appetite grows by what it feeds on. ‘It was the study of Latin poetry that taught me all I needed to know about genre writing,’ he would tell an assembly of academics at the Graduate Center on New York’s Fifth Avenue more than fifty years later. Together with, he might have added, a healthy dollop of James Bond.

In 2010 the author of the bestselling Jack Reacher series wrote an essay on Theseus for Thrillers: 100 Must Reads (ed. David Morrell and Hank Wagner), laying bare the prototype:

I first read this tale, in Latin, as a schoolboy. There was something about the story elements that nagged at me. I tried to reduce the specifics to generalities and arrived at a basic shape: Two superpowers in an uneasy standoff; a young man of rank acting alone and shouldering personal responsibility for a crucial outcome; a strategic alliance with a young woman from the other side; a major role for a gadget; an underground facility; an all-powerful opponent with a grotesque sidekick; a fight to the death; an escape; the cynical abandonment of the temporary female ally; the return home to a welcome that was partly grateful and partly scandalised.

My own Eureka moment came when I re-read The Tempest. Unbearably pretentious—I could hear Child’s judgement ringing in my ears. But I couldn’t really agree. This wasn’t a Bloomian question of canon or hierarchy; only a time traveller from the 2400s can say whether or not we are still reading Reacher in four centuries’ time. But we routinely tout the universality of Shakespeare. We marvel at how his language has shaped and permeates ours, and how it continues to do so. If this means anything, then it must be that his writing always already inhabits the writing of others, as do the Greek myths—and not just some select others, an intellectual elite approved by the arbiters of taste, but embracing all humankind equally. Shakespeare spoke to the rude mechanicals and the groundlings as much as the court, just as Child addresses the professionals at the centre of the literary universe as much as the one-book-a-year consumers that in his preferred analogy he typically locates on the outer rings of Saturn.

Naturally, having delved into the dark backward and abysm of time to write Lee Child’s biography and full knowing his undying love for the ‘sheer incandescent beauty of Shakespeare’s verse’, there was an element of confirmation bias in my perception of a bond between these disparate writers. Both are men of Warwickshire, and both were educated by the King Edward’s Foundation of grammar schools. Coventry, the city of Jim Grant’s birth, features prominently in Henry IV, Part 1, and just as the teenage Shakespeare would regularly walk there from Stratford-upon-Avon to watch the mystery plays, so too would Jim—a fan since the age of nine, when his father took him to see Henry IV, Part 2—catch the bus to Stratford to seek out the latest production by the Royal Shakespeare Company. As a teenager he interned there, and in 1970, on the cusp of Sixth Form, was backstage when Ian Richardson and Ben Kingsley played Prospero and Ariel, and Peter Brook first staged his seismic Dream. In 1973 he did sound and lighting for the Brook-inspired school production of The Tempest and at Sheffield University, where he spent four years in the Drama Studio while ostensibly studying Law, took as one of his early aliases ‘Richard Strange’, drawn directly from the song in which Ariel leads the distraught Ferdinand to believe that his father, Alonso, lies full fathom five beneath the ocean waves, transformed into something ‘rich and strange’ with bones of coral made and pearls that were his eyes. The programme notes for Ibsen’s A Doll’s House record that design was by Jim Grant and publicity by Richard Strange; Grant received a rave review for his stagecraft in the Sheffield Evening Telegraph.

But as always, too, there were echoes of  . . .

Continue reading.

Written by LeisureGuy

17 December 2020 at 5:42 pm

Posted in Books, Education

The Disadvantages of an Elite Education

The subtitle is “Our best universities have forgotten that the reason they exist is to make minds, not careers,” but that’s certainly not true of my alma mater, St. John’s College (Annapolis MD and Santa Fe NM). William Deresiewicz wrote this piece in June of 2008, but it’s still interesting:

It didn’t dawn on me that there might be a few holes in my education until I was about 35. I’d just bought a house, the pipes needed fixing, and the plumber was standing in my kitchen. There he was, a short, beefy guy with a goatee and a Red Sox cap and a thick Boston accent, and I suddenly learned that I didn’t have the slightest idea what to say to someone like him. So alien was his experience to me, so unguessable his values, so mysterious his very language, that I couldn’t succeed in engaging him in a few minutes of small talk before he got down to work. Fourteen years of higher education and a handful of Ivy League degrees, and there I was, stiff and stupid, struck dumb by my own dumbness. “Ivy retardation,” a friend of mine calls this. I could carry on conversations with people from other countries, in other languages, but I couldn’t talk to the man who was standing in my own house.

It’s not surprising that it took me so long to discover the extent of my miseducation, because the last thing an elite education will teach you is its own inadequacy. As two dozen years at Yale and Columbia have shown me, elite colleges relentlessly encourage their students to flatter themselves for being there, and for what being there can do for them. The advantages of an elite education are indeed undeniable. You learn to think, at least in certain ways, and you make the contacts needed to launch yourself into a life rich in all of society’s most cherished rewards. To consider that while some opportunities are being created, others are being cancelled and that while some abilities are being developed, others are being crippled is, within this context, not only outrageous, but inconceivable.

I’m not talking about curricula or the culture wars, the closing or opening of the American mind, political correctness, canon formation, or what have you. I’m talking about the whole system in which these skirmishes play out. Not just the Ivy League and its peer institutions, but also the mechanisms that get you there in the first place: the private and affluent public “feeder” schools, the ever-growing parastructure of tutors and test-prep courses and enrichment programs, the whole admissions frenzy and everything that leads up to and away from it. The message, as always, is the medium. Before, after, and around the elite college classroom, a constellation of values is ceaselessly inculcated. As globalization sharpens economic insecurity, we are increasingly committing ourselves—as students, as parents, as a society—to a vast apparatus of educational advantage. With so many resources devoted to the business of elite academics and so many people scrambling for the limited space at the top of the ladder, it is worth asking what exactly it is you get in the end—what it is we all get, because the elite students of today, as their institutions never tire of reminding them, are the leaders of tomorrow.


The first disadvantage of an elite education, as I learned in my kitchen that day, is that it makes you incapable of talking to people who aren’t like you. Elite schools pride themselves on their diversity, but that diversity is almost entirely a matter of ethnicity and race. With respect to class, these schools are largely—indeed increasingly—homogeneous. Visit any elite campus in our great nation and you can thrill to the heartwarming spectacle of the children of white businesspeople and professionals studying and playing alongside the children of black, Asian, and Latino businesspeople and professionals. At the same time, because these schools tend to cultivate liberal attitudes, they leave their students in the paradoxical position of wanting to advocate on behalf of the working class while being unable to hold a simple conversation with anyone in it. Witness the last two Democratic presidential nominees, Al Gore and John Kerry: one each from Harvard and Yale, both earnest, decent, intelligent men, both utterly incapable of communicating with the larger electorate.

But it isn’t just a matter of class. My education taught me to believe that people who didn’t go to an Ivy League or equivalent school weren’t worth talking to, regardless of their class. I was given the unmistakable message that such people were beneath me. We were “the best and the brightest,” as these places love to say, and everyone else was, well, something else: less good, less bright. I learned to give that little nod of understanding, that slightly sympathetic “Oh,” when people told me they went to a less prestigious college. (If I’d gone to Harvard, I would have learned to say “in Boston” when I was asked where I went to school—the Cambridge version of noblesse oblige.) I never learned that there are smart people who don’t go to elite colleges, often precisely for reasons of class. I never learned that there are smart people who don’t go to college at all.

I also never learned that there are smart people who aren’t “smart.” The existence of multiple forms of intelligence has become a commonplace, but however much elite universities like to sprinkle their incoming classes with a few actors or violinists, they select for and develop one form of intelligence: the analytic. While this is broadly true of all universities, elite schools, precisely because their students (and faculty, and administrators) possess this one form of intelligence to such a high degree, are more apt to ignore the value of others. One naturally prizes what one most possesses and what most makes for one’s advantages. But social intelligence and emotional intelligence and creative ability, to name just three other forms, are not distributed preferentially among the educational elite. The “best” are the brightest only in one narrow sense. One needs to wander away from the educational elite to begin to discover this.

What about people who aren’t bright in any sense? I have a friend who . . .

Continue reading. Audio at the link, plus much more in the article.

Written by LeisureGuy

12 December 2020 at 8:43 pm

Posted in Daily life, Education, Memes

Dolly Parton knows effective educational intervention

Erick Moore has a good Facebook post:

In 1990, the high school dropout rate in Dolly Parton’s hometown of Sevierville, Tennessee, was 34% (research shows that most kids make up their minds in fifth or sixth grade not to graduate). That year, all fifth and sixth graders from Sevierville were invited by Parton to attend an assembly at Dollywood. They were asked to pick a buddy, and if both students completed high school, Dolly Parton would personally hand them each a $500 check on their graduation day. As a result, the dropout rate for those classes fell to 6%, and it has generally stayed near that level to this day.

Shortly after the success of the Buddy Program, Parton learned from teachers in the school district that problems in education often begin during first grade, when kids arrive at different developmental levels. That year the Dollywood Foundation paid the salaries of additional teacher’s assistants in every first-grade class for the next two years, under the agreement that if the program worked, the school system would adopt and fund it after the trial period.

During the same period, in 1995, Parton founded the Imagination Library. The idea was that children from her rural hometown and from low-income families often start school at a disadvantage and, as a result, will be unfairly compared to their peers for the rest of their lives, effectively discouraging them from pursuing higher education. The objective of the Imagination Library was that every child in Sevier County would receive one book every month, mailed and addressed to the child, from the day they were born until the day they started kindergarten, 100% free of charge. What began as a hometown initiative now serves children in all 50 states, Australia, Canada, and the United Kingdom, mailing thousands of free books to children around the world every month.

On March 1, 2018 Parton donated her 100 millionth book at the Library of Congress: a copy of “Coat of Many Colors” dedicated to her father, who never learned to read or write.

Happy 74th Birthday Dolly Parton!

Written by LeisureGuy

2 December 2020 at 5:55 pm

“I should have loved biology”

James Somers describes in this post how schools shortchanged him on biology — particularly poignant given the previous post on Lake Tanganyika’s cichlids. He writes:

I should have loved biology but I found it to be a lifeless recitation of names: the Golgi apparatus and the Krebs cycle; mitosis, meiosis; DNA, RNA, mRNA, tRNA.

In the textbooks, astonishing facts were presented without astonishment. Someone probably told me that every cell in my body has the same DNA. But no one shook me by the shoulders, saying how crazy that was. I needed Lewis Thomas, who wrote in The Medusa and the Snail:

For the real amazement, if you wish to be amazed, is this process. You start out as a single cell derived from the coupling of a sperm and an egg; this divides in two, then four, then eight, and so on, and at a certain stage there emerges a single cell which has as all its progeny the human brain. The mere existence of such a cell should be one of the great astonishments of the earth. People ought to be walking around all day, all through their waking hours calling to each other in endless wonderment, talking of nothing except that cell.

I wish my high school biology teacher had asked the class how an embryo could possibly differentiate—and then paused to let us really think about it. The whole subject is in the answer to that question. A chemical gradient in the embryonic fluid is enough of a signal to slightly alter the gene expression program of some cells, not others; now the embryo knows “up” from “down”; cells at one end begin producing different proteins than cells at the other, and these, in turn, release more refined chemical signals; …; soon, you have brain cells and foot cells.

How come we memorized chemical formulas but didn’t talk about that? It was only in college, when I read Douglas Hofstadter’s Gödel, Escher, Bach, that I came to understand cells as recursively self-modifying programs. The language alone was evocative. It suggested that the embryo—DNA making RNA, RNA making protein, protein regulating the transcription of DNA into RNA—was like a small Lisp program, with macros begetting macros begetting macros, the source code containing within it all of the instructions required for life on Earth. Could anything more interesting be imagined?

Someone should have said this to me:

Imagine a flashy spaceship lands in your backyard. The door opens and you are invited to investigate everything to see what you can learn. The technology is clearly millions of years beyond what we can make.

This is biology.

–Bert Hubert, “Our Amazing Immune System”

In biology class, biology wasn’t presented as a quest for the secrets of life. The textbooks wrung out the questing. We were nowhere acquainted with real biologists, the real questions they had, the real experiments they did to answer them. We were just given their conclusions.

For instance I never learned that a man named Oswald Avery, in the 1940s, puzzled over two cultures of Streptococcus bacteria. One had a rough texture when grown in a dish; the other was smooth, and glistened. Avery noticed that when he mixed the smooth strain with the rough strain, every generation after was smooth, too. Heredity in a dish. What made it work? This was one of the most exciting mysteries of the time—in fact of all time.

Most experts thought that protein was somehow responsible, that traits were encoded soupily, via differing concentrations of chemicals. Avery suspected a role for nucleic acid. So, he did an experiment, one we could have replicated on our benches in school. Using just a centrifuge, water, detergent, and acid, he purified nucleic acid from his smooth strep culture. Precipitated with alcohol, it became fibrous. He added a tiny bit of it to the rough culture, and lo, that culture became smooth in the following generations. This fibrous stuff, then, was “the transforming principle”—the long-sought agent of heredity. Avery’s experiment set off a frenzy of work that, a decade later, ended in the discovery of the double helix.

In his “Mathematician’s Lament,” Paul Lockhart describes how school cheapens mathematics by robbing us of the questions. We’re not just asked, hey, how much of the triangle . . .

Continue reading.
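Somers's image of the cell as a recursively self-modifying program is easy to play with in code. Here is a minimal toy sketch of the differentiation step he describes — a chemical gradient switching cells into different expression programs, which then emit their own, more refined signals. It is my own illustration, not from his post, and every name and threshold in it is invented:

```python
# Toy model: a chemical gradient differentiates a 1-D "embryo".
# All cell-fate names and thresholds here are invented for illustration.

def differentiate(n_cells=10):
    # A morphogen gradient: high at one end of the embryo, low at the other.
    gradient = [1.0 - i / (n_cells - 1) for i in range(n_cells)]

    cells = []
    for concentration in gradient:
        # The same "genome" runs in every cell; the local signal level
        # decides which expression program switches on.
        if concentration > 0.66:
            program = "head"
        elif concentration > 0.33:
            program = "trunk"
        else:
            program = "foot"
        cells.append(program)

    # Differentiated cells now emit signals their neighbors read in the
    # next round; a cell straddling two programs takes on a new fate.
    # This crudely approximates the "macros begetting macros" step.
    refined = []
    for i, program in enumerate(cells):
        neighbors = cells[max(0, i - 1):i + 2]
        refined.append("boundary" if len(set(neighbors)) > 1 else program)
    return refined

print(differentiate())
# ['head', 'head', 'head', 'boundary', 'boundary', 'trunk',
#  'boundary', 'boundary', 'foot', 'foot']
```

The real machinery is unimaginably richer, of course; the point is only that a dumb gradient plus a conditional already yields position-dependent fates, and each round of signaling refines the pattern further.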

Written by LeisureGuy

1 December 2020 at 3:05 pm

Learn How to Play Chess Online: Free Chess Lessons for Beginners, Intermediate Players & Beyond

leave a comment »

When I first learned to play chess I knew no chess players, so I tried to figure out the game from the instructions that came with the little plastic set I received. I knew only checkers, and from that I got the (mistaken) idea that knights capture pieces by jumping over them (rather than, as they actually do, by landing on them).

Nowadays the resources available are manifold. For those who already know the game, I have at the right a link to an excellent on-line book providing instruction in tactics: Predator at the Chessboard. But that assumes you know the game and have some experience playing it. What if you have never played?

Open Culture has an excellent round-up of free on-line resources for those who want to learn the game from scratch. And I'll note that there are many opportunities to play on-line, so your opponent need not be physically present or even nearby.

Written by LeisureGuy

1 December 2020 at 10:24 am

Empire of fantasy: British colonialism in children’s literature

leave a comment »

In Aeon, Maria Sachiko Cecire has a very interesting essay adapted from material published in her book Re-Enchanted: The Rise of Children's Fantasy Literature in the Twentieth Century. The essay begins:

Much has changed in the fantasy genre in recent decades, but the word ‘fantasy’ still conjures images of dragons, castles, sword-wielding heroes and premodern wildernesses brimming with magic. Major media phenomena such as Harry Potter and Game of Thrones have helped to make medievalist fantasy mainstream, and if you look in the kids’ section of nearly any kind of store today you’ll see sanitised versions of the magical Middle Ages packaged for youth of every age. How did fantasy set in pseudo-medieval, roughly British worlds achieve such a cultural status? Ironically, the modern form of this wildly popular genre, so often associated with escapism and childishness, took root in one of the most elite spaces in the academic world.

The heart of fantasy literature grows out of the fiction and scholarly legacy of two University of Oxford medievalists: J R R Tolkien and C S Lewis. It is well known that Tolkien and Lewis were friends and colleagues who belonged to a writing group called the Inklings where they shared drafts of their poetry and fiction at Oxford. There they workshopped what would become Tolkien’s Middle-earth books, beginning with the children’s novel The Hobbit (1937), and followed in the 1950s with The Lord of the Rings and Lewis’s Chronicles of Narnia series, which was explicitly aimed at children. Tolkien’s influence on fantasy is so important that in the 1990s the American scholar Brian Attebery defined the genre ‘not by boundaries but by a centre’: Tolkien’s Middle-earth. ‘Tolkien’s form of fantasy, for readers in English, is our mental template’ for all fantasy, he suggests in Strategies of Fantasy (1992). Lewis’s books, meanwhile, are iconic as both children’s literature and fantasy. Their recurring plot structure of modern-day children slipping out of this world to save a magical, medieval otherworld has become one of the most common approaches to the genre, identified in Farah Mendlesohn’s taxonomy of fantasy as the ‘portal-quest’.

What is less known is that Tolkien and Lewis also designed and established the curriculum for Oxford’s developing English School, and through it educated a second generation of important children’s fantasy authors in their own intellectual image. Put in place in 1931, this curriculum focused on the medieval period to the near-exclusion of other eras; it guided students’ reading and examinations until 1970, and some aspects of it remain today. Though there has been relatively little attention paid to the connection until now, these activities – fantasy-writing, often for children, and curricular design in England’s oldest and most prestigious university – were intimately related. Tolkien and Lewis’s fiction regularly alludes to works in the syllabus that they created, and their Oxford-educated successors likewise draw upon these medieval sources when they set out to write their own children’s fantasy in later decades. In this way, Tolkien and Lewis were able to make a two-pronged attack, both within and outside the academy, on the disenchantment, relativism, ambiguity and progressivism that they saw and detested in 20th-century modernity.

Tolkien articulated his anxieties about the cultural changes sweeping across Britain in terms of ‘American sanitation, morale-pep, feminism, and mass-production’, calling ‘this Americo-cosmopolitanism very terrifying’ and suggesting in a 1943 letter to his son Christopher that, if this was to be the outcome of an Allied Second World War win, he wasn’t sure that victory would be better for the ‘mind and spirit’ – and for England – than a loss to Nazi forces.

Lewis shared this abhorrence for ‘modern’ technologisation, secularisation and the swiftly dismantling hierarchies of race, gender and class. He and Tolkien saw such broader shifts reflected in . . .

Continue reading. There’s much more.

Written by LeisureGuy

30 November 2020 at 10:17 am

Posted in Books, Education, Memes, Politics

Esperanto Scrabble tiles

leave a comment »

At last. They also have sign language tiles (Australian, British, and US) and Aurebesh.

Written by LeisureGuy

28 November 2020 at 9:08 pm

Posted in Esperanto, Games

Cognitive Biases that Interfere with Critical Thinking and Scientific Reasoning

leave a comment »

Dr. Hershey H. Friedman, Professor of Business at the Murray Koppelman School of Business, Brooklyn College of the City University of New York, has an interesting PDF with the title shown. It's free, and you can download it from that link. It begins:

Abstract: It is clear that people do not always behave in a rational manner. Sometimes they are presented with too much information or they may want to make a quick decision. This may cause them to rely on cognitive shortcuts known as heuristics (rules of thumb). These cognitive shortcuts often result in cognitive biases; at least 175 cognitive biases have been identified by researchers. This paper describes many of these biases starting with actor-observer bias and ending with zero-risk bias. It also describes how one can overcome them and thereby become a more rational decision maker.

***

Many of the fundamental principles of economic theory have recently been challenged. Economic theory is largely based on the premise of the “rational economic man.” Rational man makes decisions based solely on self-interest and wants to maximize his utility. However, the rational man theory may be a theory that is dead or rapidly dying. After the Great Recession of 2008, Alan Greenspan, former Chairman of the Federal Reserve, told Congress: “I made a mistake in presuming that the self-interests of organizations, specifically banks and others, were such that they were best capable of protecting their own shareholders” (Ignatius, 2009). Nouriel Roubini, a prominent economist known as Dr. Doom for predicting the housing market collapse in 2006, stated that “The rational man theory of economics has not worked” (Ignatius, 2009). Kahneman (2011: 374) avows: “Theories can survive for a long time after conclusive evidence falsifies them, and the rational-agent model certainly survived the evidence we have seen, and much other evidence as well.”

Kahneman (2011: 269) describes how he was handed an essay written by the Swiss economist Bruno Frey that stated: “The agent of economic theory is rational, selfish, and his tastes do not change.” Kahneman was astonished that economists could believe this given that it was quite obvious to psychologists that “people are neither fully rational nor completely selfish, and that their tastes are anything but stable. Our two disciplines seemed to be studying different species, which the behavioral economist Richard Thaler later dubbed Econs and Humans.”

Many economists now realize that man does not always behave in a rational manner. Thaler and Mullainathan (2008) describe how, in experiments involving "ultimatum" games, we see evidence that people do not behave as traditional economic theory predicts they will. People will act "irrationally" and reject offers they feel are unfair:

In an ultimatum game, the experimenter gives one player, the proposer, some money, say ten dollars. The proposer then makes an offer of x, equal or less than ten dollars, to the other player, the responder. If the responder accepts the offer, he gets x and the proposer gets 10 − x. If the responder rejects the offer, then both players get nothing. Standard economic theory predicts that proposers will offer a token amount (say twenty-five cents) and responders will accept, because twenty-five cents is better than nothing. But experiments have found that responders typically reject offers of less than 20 percent (two dollars in this example).

This is why we must also draw on insights from the discipline of psychology. Ariely (2008) uses the latest research to demonstrate that people are predictably irrational; they use heuristics or rules of thumb to make decisions. Heuristics may be seen as “cognitive shortcuts” that humans utilize when there is a great deal of required information to collect in order to make a correct decision but time (or desire to do the extensive research) and/or money is limited (Caputo, 2013). Using rules of thumb may help a person make quick decisions but might lead to a systematic bias. Smith (2015) lists 67 cognitive biases that interfere with rational decision making. A cognitive bias is defined as:

a systematic error in thinking that affects the decisions and judgments that people make. Sometimes these biases are related to memory. The way you remember an event may be biased for a number of reasons and that in turn can lead to biased thinking and decision-making. In other instances, cognitive biases might be related to problems with attention. Since attention is a limited resource, people have to be selective about what they pay attention to in the world around them (Chery, 2016).

Download to continue reading.
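The ultimatum game's payoff rule is simple enough to sketch in a few lines of code. Here is a minimal simulation — my own construction, not anything from Friedman's paper. The $10 pot and the roughly-20% rejection threshold come from the passage quoted above; everything else is illustrative:

```python
# Toy ultimatum game: a textbook-rational responder versus the
# responder actually observed in experiments.

def play(offer, pot=10.0, rejection_threshold=0.0):
    """Return (proposer_payoff, responder_payoff) for one round.

    A purely "rational" responder has rejection_threshold == 0.0,
    since any positive amount beats nothing. Experimentally,
    responders tend to reject offers below roughly 20% of the pot.
    """
    if offer >= rejection_threshold * pot:
        return pot - offer, offer   # offer accepted
    return 0.0, 0.0                 # offer rejected: both get nothing

# Standard theory: a token offer of $0.25 is accepted.
print(play(0.25))                           # (9.75, 0.25)

# Observed behavior: the same token offer is rejected as unfair.
print(play(0.25, rejection_threshold=0.2))  # (0.0, 0.0)

# An offer at the fairness threshold ($2 of $10) goes through.
print(play(2.00, rejection_threshold=0.2))  # (8.0, 2.0)
```

The gap between the first two results is, in miniature, the gap between Thaler's Econs and Humans: the second responder gives up twenty-five cents for the satisfaction of punishing an unfair offer.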

Written by LeisureGuy

26 November 2020 at 12:13 pm

A History of Philosophy from Ancient Greece to Modern Times (81 Video Lectures)

leave a comment »

Philosophy is endlessly fascinating, and I think this series will provide a good foundation. Watch one a day, starting today, and well before Valentine's Day — well, 4 days before — you'll have a better idea of what philosophy has been in the West.
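(If you want to check my arithmetic, here's a quick sketch, taking "today" as this post's date:

```python
from datetime import date, timedelta

start = date(2020, 11, 22)            # watch lecture 1 today
last = start + timedelta(days=80)     # lecture 81 falls 80 days later
print(last)                             # 2021-02-10
print((date(2021, 2, 14) - last).days)  # 4 (days before Valentine's Day)
```

Lecture 81 lands on 10 February, four days before Valentine's Day.)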

This is via OpenCulture, which notes:

You can watch 81 video lectures tracing the history of philosophy, moving from Ancient Greece to modern times. Arthur Holmes presented this influential course at Wheaton College for decades, and now it’s online for you. The lectures are all streamable above, or available through this YouTube playlist.

Philosophers covered in the course include Plato, Aquinas, Hobbes, Descartes, Spinoza, Hume, Kant, Hegel, Nietzsche, Sartre, and more.

A History of Philosophy has been added to our list of Free Online Philosophy courses, a subset of our meta collection, 1,500 Free Online Courses from Top Universities.

Written by LeisureGuy

22 November 2020 at 11:04 am

Sex, Lies, And Regret: Giancarlo Granda Reels From Eight Years With The Falwells

leave a comment »

Josh Kovensky reports at TPM:

Giancarlo Granda first noticed that something was off about Becki Falwell while in bed with her.

It was around three or four in the morning, Granda recalls, when he woke up and noticed Becki staring at him without blinking.

It’s what he’d later come to call “the look” — something that, eight years later, still creeps him out.

But Granda is neither squeamish nor skittish. Nor was he, as a 21-year-old, primed to let something a little odd, like Becki's "dark, black eyes" fixating on him in the night, give him pause.

After all, he was waking up at Cheeca Lodge, an exclusive resort on Islamorada in the Florida Keys, just weeks after having first met Becki poolside at Miami's glitzy Fontainebleau hotel.

Becki and her husband Jerry Jr., who, Granda recalls, was sleeping on the couch that night, had invited Granda to the resort for a good time. Granda was only dimly aware of who he was with: education administrators at a Christian university.

But where others saw bible thumpers with an axe to grind against modernity, Granda says he saw a "hot cougar," an outlet for his own business idea, and, eventually, a second family. Across days of in-person interviews outside Washington, D.C., over weeks of phone calls, and through dozens of supporting documents, Granda provided TPM with the most detailed account yet of his entanglement with the Falwells, which would contribute to Jerry losing his position as president of Liberty University and leave Granda feeling besieged by embarrassing articles and wanting to change his name.

Born to a family of first-generation Cuban and Mexican immigrants, Granda, with his chiseled jaw and rigorous workout schedule, projects masculinity. It wasn't always that way. Late in high school, Granda went through a period in which his personal, athletic, and academic lives collapsed due to an obsession with gaming; he left the school baseball team to spend all his time on first-person shooters like Halo and Call of Duty.

But Granda found redemption from that period by working out and working hard as a pool attendant at the Fontainebleau, which allowed him to study part-time for a bachelor’s degree at Florida International University. It also gave him the physique that, according to Granda, Becki Falwell snapped pictures of as he worked poolside in March 2012, before she invited Granda back to her hotel room, pouring him a glass of Jack Daniels as she asked if it was okay if her husband Jerry watched.

That brief triptych of his life, from baseball infielder, to gamer, to pool attendant, had given him a business idea: Big Brothers, Big Sisters, but for gamers. He shared his plan with the Falwells.

“I think it’s an excellent idea,” Granda recalls Jerry telling him over breakfast at Cheeca Lodge. The evangelical scion then suggested he “partner up with Liberty University.”

Granda was elated. He and the Falwells made plans for a trip to New York City later in April — one full month after their first meeting. Granda told his buddies at the Fontainebleau what was happening: he had befriended a wealthy, well-connected couple who were going to take him to New York and introduce him to deep-pocketed investors, giving his business idea — and separate real estate ambitions — legs.

Granda had told his sister about the relationship from the beginning; she remembers asking her brother, months in, if Jerry was "the pastor guy who is super conservative?"

“And he’s like, ‘No, it’s the son,’” she recalls.

It was Jerry, the tall, good ol’ boy son of the famous televangelist, who stoked Granda’s business ambitions while Becki, the dark-haired daughter of millionaire donors to Liberty, flirted with Granda and drew him away for private liaisons.

But as the chance encounter turned into a years-long relationship, Granda’s sense of good fortune wouldn’t last. Granda would come to see these early meetings not as encouragement, but as “grooming.” Publicity around their arrangement was not a useful link to someone well-connected, but “psychological torture.” Two jaded, middle-aged evangelicals taking advantage of someone decades younger, simply out of ennui.

“I’m standing up for my 20-year-old self,” Granda, now 29, reflects.

Carnival ménage à trois

The Falwells have a dramatically different narrative.

In a lawsuit filed against Liberty University in late October, Jerry alleged that Granda carried on a sporadic affair with Becki from 2012 to 2014 in which he was not involved. The couple befriended the 20-year-old, the lawsuit says, because they were "impressed" by his "entrepreneurial attitude and ambition." Later, the lawsuit claims, Granda sought to extort the Falwells.

The Falwells responded to initial inquiries from TPM, and appeared willing to discuss their relationship with Granda. But after Jerry filed his lawsuit against Liberty, the Falwells stopped replying to TPM’s requests for an on-the-record interview.

Granda shared records with TPM that he said back up his version of events. And other reporting suggests that Granda may not be alone in his account. Politico reported earlier this month that Becki told a neighbor about a separate liaison she had had with a Liberty student, saying that Jerry would only be angry about the encounter if he didn’t get to watch. In a statement to Politico, the Falwells called the story “completely false.”

But in 2012, Granda knew none of that. He was seduced by New York.

In Manhattan, the three stayed at the trendy Gansevoort Hotel, sharing drinks on the hotel’s rooftop by a year-round pool with a giant mosaic of Marilyn Monroe at the bottom.

The Falwells squired Granda around Manhattan, taking him to . . .

Continue reading.

Written by LeisureGuy

18 November 2020 at 9:50 am

The peril of pursuing perfection

with 4 comments

I have written about the difficulty adult beginners face in playing piano: they are hyperconscious of the mistakes they make, and they don't want to play until they can play without making any. But studying our mistakes is how we learn.

I just came across this story from Art and Fear, by David Bayles and Ted Orland:

The ceramics teacher announced on opening day that he was dividing the class into two groups. All those on the left side of the studio, he said, would be graded solely on the quantity of work they produced, all those on the right solely on its quality. His procedure was simple: on the final day of class he would bring in his bathroom scales and weigh the work of the “quantity” group: fifty pounds of pots rated an “A”, forty pounds a “B”, and so on. Those being graded on “quality”, however, needed to produce only one pot – albeit a perfect one – to get an “A”. Well, came grading time and a curious fact emerged: the works of highest quality were all produced by the group being graded for quantity. It seems that while the “quantity” group was busily churning out piles of work – and learning from their mistakes – the “quality” group had sat theorizing about perfection, and in the end had little more to show for their efforts than grandiose theories and a pile of dead clay.

Written by LeisureGuy

16 November 2020 at 3:42 pm
