Later On

A blog written for those whose interests more or less match mine.

Archive for January 13th, 2022

Judge Tosses Teen Rape Conviction, Says 148 Days in Jail Is ‘Plenty of Punishment’


Just to be clear: the judge reversed his own decision because he decided that raping someone was not that big a deal. (The victim was a minor, 16 years old; the rapist an adult, 18.) Zoe Richards reports in Yahoo! News:

An Illinois judge had a shocking outburst in court Wednesday, kicking a prosecutor out with minimal explanation as outrage grows over the judge’s decision to reverse his own ruling on a teen accused of rape.

“Mr. Jones, get out,” Adams County Judge Robert Adrian fumed as he ordered Josh Jones of the Adams County State’s Attorney’s Office to leave the courtroom. The Muddy River News, whose journalist was in court, reported that Jones was set to appear in an unrelated case but had apparently “liked” a Facebook post supporting domestic violence survivors in the wake of Adrian’s extraordinary ruling in the rape case.

“I’m not on social media, but my wife is,” Adrian said. “She saw the thumbs up you gave to people attacking me.”

He added: “I can’t be fair with you today. Get out.”

Adrian declined The Daily Beast’s request for comment about the outburst on Wednesday, citing a Supreme Court rule that urges judges to abstain from public comment about pending or impending proceedings.

The move comes as critics slam Adrian for tossing out a sexual assault conviction for 18-year-old Drew Clinton, who allegedly stuffed a pillow in a girl’s face as he raped her at a graduation party in May last year.

Clinton’s accuser, 16-year-old Cameron Vaughan, broke her silence Tuesday, days after Adrian’s reversal on Jan. 3.

“I woke up at my friend’s place with a pillow over my face so I couldn’t be heard and Drew Clinton inside of me,” Vaughan said, according to WGEM. “I asked him to stop multiple times and he wouldn’t.”

After finally pushing him off, Vaughan said, Clinton jumped up to play video games “as if nothing had happened.”

During a bench trial in October, Clinton was found guilty of one count of criminal sexual assault. But last week, Adrian changed his mind and sensationally declared the teen “not guilty” during a sentencing hearing.

According to a copy of last week’s hearing transcript, the judge insisted that . . .

Continue reading.

Written by Leisureguy

13 January 2022 at 4:28 pm

The Psychedelic Jelly


This is quite cool. When I lived in Monterey, the Monterey Bay Aquarium was a great local institution, and the Monterey Bay Aquarium Research Institute (MBARI) was very active. After I moved to Pacific Grove (adjacent to Monterey, a move of about 6 blocks), I was just a block from MBARI offices.

Written by Leisureguy

13 January 2022 at 4:20 pm

The Riemann Hypothesis, Explained


This is quite a good video. It appears in an article in Quanta, “Mathematicians Clear Hurdle in Quest to Decode Primes,” by Kevin Hartnett, which is interesting in its own right, but the video also works as a standalone.

Written by Leisureguy

13 January 2022 at 2:36 pm

Posted in Math, Video

Black-eyed peas and bok choy


Photo is from recipe; I ate mine before I thought of taking a photo.

Not black-eyed peas and baby bok choy in the same dish, though that does sound good, but two different dishes. The first one is just something I made up, but I like it.

Black-eyed peas Mexicanish

• 1/2 avocado diced
• 1/2 – 1 jalapeño, chopped fine
• about 1/2 cup cilantro, chopped small
• juice of 1/2 lime (can substitute 1/2 lemon)
• pinch of salt — I used rosemary salt
• about 1/2 cup cooked, drained, and chilled black-eyed peas
• about 1/3 cup red onion, chopped small

Stir to mix. Eat and enjoy.

Stir-Fried Baby Bok Choy

The other recipe, Stir-Fried Baby Bok Choy, I found on the web. Yesterday I bought some Shanghai baby bok choy and some regular baby bok choy, so I’m going to cook them together: variety.


Written by Leisureguy

13 January 2022 at 1:46 pm

Could Small Still Be Beautiful?


Bryce T. Bauer writes in Craftsmanship:

1. “Economics as a Form of Brain Damage”
2. The Schumacher Center for a New Economics
3. The New Economics of Land Ownership
4. The New Economics of Business Financing
5. The New Economics of Currency
6. The New Economics of Entrepreneurship
7. Challenges to the New Economy

Four decades ago, just as some of the forces that have caused today’s problems with globalization and inequality began to take hold, a British economist by the name of E.F. Schumacher took America by storm with a set of contrary ideas about how an economy should work.

Schumacher aimed squarely at supporting everyday people and the communities where they lived. For a brief period in the mid-1970s, his name enjoyed headline status — and his book, “Small Is Beautiful: Economics as if People Mattered,” joined a pantheon of powerful, call-to-action works of the time. Schumacher’s book was taken so seriously that, a few years after its publication, it was listed alongside such enduring critiques as Rachel Carson’s “Silent Spring” and Paul R. Ehrlich’s “The Population Bomb.”

While “Small Is Beautiful” hasn’t endured with quite the same power those works have enjoyed, its ideas have still seeped into the thinking of some of the nation’s latter-day acolytes of social and environmental sustainability, including Wendell Berry, Jane Jacobs, and Bill McKibben. Schumacher’s work also inspired a small think-tank focused on turning the small towns and bucolic countryside of the Massachusetts Berkshires into a laboratory for further exploration of his theories.

Given how rarely Schumacher’s once-popular ideas are discussed today, one can’t help but wonder—were his perceptions all wrong? Or, as the director of the institute devoted to sustaining his ideas maintains (and as Schumacher himself said), was their time yet to come? If the latter, might that time be now? Every day, it seems, more and more experts join the argument that the accelerating dominance of global companies — in a world struggling with income inequality, resource depletion, and the growing ravages of climate change — has put us on an unsustainable path. If that bleak outlook is correct, maybe it’s time to give Schumacher’s ideas a second look.

“ECONOMICS AS A FORM OF BRAIN DAMAGE”

When “Small Is Beautiful” came out, in 1973, Schumacher had already worked for several decades as an economist. In the years after its publication, he toured the United States speaking to crowds across the country and meeting with political leaders, including an address before 50 members of Congress and a meeting with President Jimmy Carter. At the time, America was being wrenched by many of the ills he said modern economics would cause. The 1970s was a decade marked by oil and gas shocks, labor unrest and stagflation, a growing concern over the environment, and the discord of the Vietnam War. Schumacher was attuned to what it all portended. (In fact, the first use of the term “global warming” occurred just two years after Schumacher’s book was published.) Schumacher wrote “we do well to ask why it is that all these terms — pollution, environment, ecology, etc. — have so suddenly come into prominence…is this a sudden fad, a silly fashion, or perhaps a sudden failure of nerve?”

Born in Bonn, Germany, Schumacher had fled Nazi Germany to England in 1937. During the Second World War, when Great Britain began interning Germans, including Jewish refugees, Schumacher and his family moved to the countryside, where he worked on a farm until his writing caught the notice of John Maynard Keynes, the British economist who launched the 20th century’s activist alternative to unfettered, free-market economics.

The core of Schumacher’s argument lay in his book’s subtitle: “Economics as if People Mattered.” For far too long, economists had approached the problem of development in a way that focused too much on goods over people, emphasizing the elimination of labor instead of job creation. He accused these experts of treating consumption as the end itself, always to be maximized.

In Schumacher’s view, the economy would not benefit from the standard methods of stimulation; if anything, it should be de-intensified. If this could be managed, Schumacher believed, it would allow time “for any piece of work — enough to make a really good job of it, to enjoy oneself, to produce real equality, even to make things beautiful.”

The opportunity to work this way — which is central to any artisan or tradesman, and to his or her ability to produce top-notch, innovative work — clearly has only declined further in the years since Schumacher made this observation. And if anything, his critique might be even more timely today. In a new book, “Tightrope: Americans Reaching for Hope,” veteran New York Times journalists Nicholas Kristof and Sheryl WuDunn argue that the growing scarcity of jobs that offer such visceral satisfactions is part of what’s plunged America’s working class into unprecedented levels of despair, drug addiction, and suicide.

To be truly helpful, Schumacher argued, development funds in poor areas should be spent on “intermediate technology” — that is, technology that’s cheap, resilient, and simple enough to be used by workers in areas that lack access to education, ready capital, and sophisticated infrastructure. Technology that’s too expensive, and too complex to be readily used in developing economies, he said, destroys “the possibilities of self-reliance.”

Whenever he traveled to speak about these ideas in the U.S., crowds met his stops — 2,000 in Chicago, 500 in Minneapolis, 200 at the Colorado School of Mines in Golden, 600 in an overflow crowd at the Helena, Montana, Civic Center — and his book was, at one point, reportedly selling 30,000 copies a month. His ideas also inspired a government “Office of Appropriate Technology” in California, where then-governor Jerry Brown introduced Schumacher during a 1977 tour of America. (That organization still exists, in slightly altered form in Montana, as the National Center for Appropriate Technology.) During Gov. Brown’s more idealistic days, he once said, “if you want to understand my philosophy, read this,” as he brandished a copy of “Small Is Beautiful.”

“The 60s was a generation that wanted to do things different…and there was Schumacher saying I was a conventional economist and I was mistaken,” says Susan Witt, who became the executive director and co-founder of what’s now called the E.F. Schumacher Center for a New Economics. “I didn’t take into account human beings. I didn’t take into account their spiritual lives. I didn’t take into account concern for the earth and I’ve had to re-think my economics. Those essays in ‘Small Is Beautiful’ touched a generation.”

One of those touched by Schumacher’s ideas was . . .

Continue reading.

Written by Leisureguy

13 January 2022 at 12:07 pm

The superb iKon slant with one of the Doppelgänger CK-6 soaps


I gave up trying to keep track of the Phoenix Artisan’s Doppelgänger family of soaps. The idea is that the fragrance is a knock-off of a famous fragrance, but it’s impossible for me to remember which colors go with which fragrances. They do smell good, and they also produce that great CK-6 lather, this morning ably assisted by the PA Star Craft shaving brush. 

This is the first slant iKon made, and it is terrific. It’s now sold with the B1 coating, but the head is the same. Correct angle and light pressure are essential for this razor, and with those it performs flawlessly. I have a totally smooth face now, with no nicks.

A splash of the matching aftershave, beefed up with a couple of squirts of hydrating gel, and I am up and running — a bit late, but I already did some early-morning blogging when I woke up early (before heading back to bed), and those posts struck me as quite interesting.

Written by Leisureguy

13 January 2022 at 11:07 am

Posted in Shaving

To Hell and Back: Allison Cornish on the Divine Comedy


I believe that anyone who has reached (say) middle age — and many at other junctures in their lives — will feel a thrill of recognition on reading the opening lines of Dante’s Divine Comedy:

In the middle of the journey of our life
I found myself again in a dark forest,
for I had lost the pathway straight and right.

Ah how hard it is to describe, this forest
savage and rough and overwhelming, for
to think of it renews my fear before it! …

How I got there, I cannot rightly say,
I was so full of sleep at that point still
at which I had abandoned the true way.

The Octavian Report interviews Allison Cornish:

Written some 700 years ago, Dante’s Divine Comedy remains one of the greatest works of world literature. Religion, politics, history, love, war, money: it has it all. The three-book epic plumbs the depths of hell and reaches for the highest clouds of paradise, while always remaining grounded in the here and now. In an interview with The Octavian Report, Allison Cornish—who’s an NYU professor, president of the Dante Society of America, and author of the book Vernacular Translation in Dante’s Italy—explains why The Divine Comedy has stood the test of time, what makes it so influential, and why its politics resonate today. . .

Octavian Report: What first got you interested in medieval Italian literature?

Allison Cornish: I was an English major at Berkeley, and toward the end of my time there, I had to take a class in medieval literature. So I studied Beowulf with Alain Renoir, who was the son of the filmmaker [Jean Renoir] and grandson of the painter [Pierre-Auguste Renoir]. He said, “you should go to graduate school,” so I went to Cornell. And I had already started studying Italian and French, so I guess I came to it through language first.

OR: Why did you zero in on Dante?

Cornish: The Divine Comedy is just a book like no other—it’s the book of books, in a certain way. Like the Bible, which of course it models itself on. It’s very conscious about the written word, and how we use it to get in touch with reality. The mega plot of Dante’s work is that we’re reading a book. And education is also stressed. Those two things are what Dante the narrator needs; he needs to be led out of the dark wood by a book. That book turns out to be Virgil’s—which, of course, is a book that’s not really from his culture.

So in many ways, the Divine Comedy is a book about books. When I talk about books these days with students, I try not to say the word “books,” because students today are… I hate to say less bookish, but they respond to and are active in so much other media. Yet Dante still speaks to them. Some of these students have told me they want to do a project about fame, fame and one’s legacy. Those used to seem like antiquated ideas. But they totally understand them because of social media. And Dante offers insights into their life online as a kind of legacy, a kind of afterlife.

OR: Was Dante the first person to make himself the main character in an epic?

Cornish: Of an epic, yes.

OR: What do we know about him and why he wrote the Divine Comedy?

Cornish: We don’t know that much about Dante that he doesn’t tell us himself. Independently known facts about him are very few. We do know that he existed, that he served in government, that he was sent into exile, when he died, and that he became a fairly famous figure later on. We don’t know that much else. We know he was married to a woman named Gemma Donati. Does he ever mention her? No. Do we actually know who the character Beatrice is based on? People think it could be a woman called Beatrice Portinari, who was the daughter of a banker and married another banker. But there’s no clear evidence that it’s her.

Dante crafted The Divine Comedy into an autobiography of sorts. He took lyric poems, which everyone was writing—love poetry was the fashion and the sort of pop music of the time—and he compiled them into an autobiographical narrative, always emphasizing that life is like a book. But who was he, really? I don’t know. We would probably say upper middle class. Florentine. His father was probably some kind of banker. Dante himself got into government, and to be in government then you had to join a guild. In his case, it was a guild for pharmacists and painters, who had in common the fact that they both ground their minerals in a mortar and pestle. Was Dante an actual pharmacist? Or a painter? I don’t know. It was just something you had to do to get involved in government.

We also know that in 1301, he was sent to see the pope in Rome, and that later various shenanigans led to his being exiled. We know that he tried to get back into the city in various ways. Writing letters, maybe even plotting conspiracies. He was very hopeful that the Holy Roman Emperor, Henry VII, would come down and take over Florence and bring him back in. But Dante finally gave up hope of all that and went from court to court and sought patronage from other lords before he died in Ravenna in 1321.

OR: But the book became really famous, right?

Cornish: Yes, it was an instant best seller and we have evidence that people knew about “The Inferno” before Dante died.

OR: How big was the original audience?

Cornish: I don’t know how to put a number on that. One of the things that’s always said about Dante is that he was the first to write in Italian, and that this fact was marvelous because it brought learning to the people. That has to be historicized a little bit. First of all, at the time, people were demanding access to literary culture in the language that they could read—Italian—without having gone to school, which was the only place you’d learn Latin. But a lot of other stuff was already being written in the vernacular; Dante arrived at a moment when lots of translations [into Italian] were being written.

But the thing is, to write an epic of this scope and ambition, and to do it in a language that’s really tied to a very local place—Florence and Tuscany, not even all of Italy—was remarkable. It really localized something that was universalist in its scope. That’s the paradox Dante embodied. On the one hand, he insisted on the local and the personal and the “I” and used phrases like “my girlfriend” and “my language.” On the other, his work also took us all the way to the stars and beyond.

OR: Why is it a comedy, given its brutality?

Cornish: That is actually the only part of the title that Dante himself gave the work. He called it “My Comedy.” The “divine” part was added later. As for why it’s called a comedy, part of it is that it has a happy ending. Dante seems to be juxtaposing it to The Aeneid, which he calls a tragedy.

The other thing that “comedy” suggested then was a low style, having to do with servants and lower-class people—cooks, stable boys, that kind of thing—as well as a lot of vulgarity. And remember, the vernacular in which Dante wrote was seen as the language of women. It was “the mother tongue,” something you’d learn from your nursemaid.

OR: Why is it that everyone knows “The Inferno” so much better than the other parts of The Divine Comedy?

Cornish: Well, “The Inferno”’s door is open. The gate to hell is wide, and it’s easy to get into it. There’s a lot of action, there’s a lot of horror, and there’s a lot of seductive people to root for, who seem to be rebelling against the order that they’ve been placed in. Meanwhile, “Purgatory” is a mountain and requires work, and “Paradiso” requires even more. Some people say that “Purgatory” is where the lectures begin. Of course, it’s not all lectures, you’re also meeting people. But it’s more difficult.

OR: Do you have a favorite section? . . .

Continue reading.

Written by Leisureguy

13 January 2022 at 7:16 am

Paraconsistent Logics Find Structure in Our Inconsistent World


Zach Weber, associate professor of philosophy at the University of Otago in New Zealand and author of Paradoxes and Inconsistent Mathematics (2021), has in Aeon what I suspect is an extract from that book. He writes:

Here is a dilemma you may find familiar. On the one hand, a life well lived requires security, safety and regularity. That might mean a family, a partner, a steady job. On the other hand, a life well lived requires new experiences, risk and authentic independence, in ways incompatible with a family or partner or job. Day to day, it can seem not just challenging to balance these demands, but outright impossible. That’s because, we sense, the demands of a good life are not merely difficult; sometimes, the demands of a good life actually contradict. ‘Human experience,’ wrote the novelist George Eliot in 1876, ‘is usually paradoxical.’

One aim of philosophy is to help us make sense of our lives, and one way philosophy has tried to help in this regard is through logic. Formal logic is a perhaps overly literal approach, where ‘making sense’ is cashed out in austere mathematical symbolism. But sometimes our lives don’t make sense, not even when we think very hard and carefully about them. Where is logic then? What if, sometimes, the world truly is senseless? What if there are problems that simply cannot be resolved consistently?

Formal logic as we know it today grew out of a project during the 17th-century Enlightenment: the rationalist plan to make sense of the world in mathematical terms. The foundational assumption of this plan is that the world does make sense, and can be made sense of: there are intelligible reasons for things, and our capacity to reason will reveal these to us. In his book La Géométrie (1637), René Descartes assumed that the world could be covered by a fine-mesh grid so precise as to reduce geometry to analysis; in his Ethics (1677), Baruch Spinoza proposed a view of Nature and our place in it so precise as to be rendered in proofs; and in a series of essays written around 1679, G W Leibniz envisioned a formal language capable of expressing every possible thought in structure-preserving, crystalline symbols – a characteristica universalis – that obeys precise algebraic rules, allowing us to use it to find answers – a calculus ratiocinator.

Rationalism dreams big. But dreams are cheap. The startling thing about this episode is that, by the turn of the 20th century, Leibniz’s aspirations seemed close to coming true due to galvanic advances across the sciences, so much so that the influential mathematician David Hilbert was proposing something plausible when in 1930 he made the rationalist assumption a credo: ‘We must know, we will know.’

Hilbert’s credo was based in part on the spectacular successes of logicians in the late 19th century carving down to the bones of pure mathematics (geometry, set theory, arithmetic, real analysis) to find the absolute certainty of deductive validity. If logic itself can be understood in exacting terms, then the project of devising a complete and consistent theory of the world (or at least, the mathematical basis thereof) appeared to be in reach – a way to answer every question, as Hilbert put it, ‘for the honour of human understanding itself’.

But even as Hilbert was issuing his credo and elaborating his plans for solving the Entscheidungsproblem – of building what we would now call a computer that can mechanically decide the truth or falsity of any sentence – all was not well. Indeed, all had not been well for some time.

Already in 1902, on the verge of completing his life’s work, the logician Gottlob Frege received an ominous letter from Bertrand Russell. Frege had been working to provide a foundation for mathematics in pure logic – to reduce complex questions about arithmetic and real analysis to the basic question of formal, logical validity. If this programme, known as logicism, were successful then the apparent certainty of logical deduction, the inescapable truth of the conclusions of sound derivations, would percolate up, so to speak, into all mathematics (and any other area reducible to mathematics). In 1879, Frege had devised an original ‘concept notation’ for quantified logic exactly for this goal, and had used it for his Basic Laws of Arithmetic (two volumes of imposing symbolism, published in 1893 and 1903). Russell shared this logicist goal, and in his letter to Frege, Russell said, in essence, that he had liked Frege’s recent book very much, but had just noticed one little oddity: that one of the basic axioms upon which Frege had based all his efforts seemed to entail a contradiction.

Frege had assumed what he called ‘Basic Law V’, which says, in effect: Sets are collections of things that share a property. For example, the set of all triangles comprises all and only the triangles. This seemed obvious enough for Frege to assume as a self-evident logical truth. But from Basic Law V, Russell showed that Frege’s system could prove a statement of the form P and not-P as a theorem. It is called Russell’s Paradox:

Let R be the collection of all things with the property of ‘not being a self-member’. (For example, the set of triangles is not itself a triangle, so it is an R.) What about R itself? If R is in R, then it is not, by definition of R; if R is not in R, then it is, again by definition. It must be one or the other – so it is both: R is in and is not in R, self-membered and not, a contradiction.
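As an aside of my own (a toy model, not anything from Weber’s article), Russell’s construction can be mimicked in a few lines of Python, with predicates standing in for sets; Python exhibits the circularity as infinite recursion rather than as an explicit contradiction:

```python
import sys

# Toy model: a "set" is its membership test, so "x is in s" just means s(x).
def member(x, s):
    return s(x)

# R = the collection of all things that are not members of themselves.
def R(x):
    return not member(x, x)

# Asking whether R is a member of itself chases its own tail forever:
# R(R) calls member(R, R), which calls R(R) again, and so on.
sys.setrecursionlimit(100)   # keep the inevitable failure quick
try:
    member(R, R)
except RecursionError:
    print("no consistent answer to whether R is in R")
```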

The whole system was in fact inconsistent, and thus – in Frege and Russell’s view – absurd. Nonsense. In a few short lines, Frege’s life work had been shown to be a failure.

He would continue to work for another two decades, but his grand project was destroyed. Russell would also spend the next decades trying to come to terms with his own simple discovery, first writing the monumental but flawed Principia Mathematica (three volumes, 1910-13) with Alfred North Whitehead, then eventually pivoting away from logic without ever really solving the problem. Years would pass, with some of the best minds in the world trying mightily to overcome the contradiction Russell had found, without finding a fully satisfactory solution.

By 1931, a young logician named Kurt Gödel had leveraged a similar paradox out of Russell’s own system. Gödel found a statement that the system could neither prove nor refute – that is, could not decide – except at the cost of inconsistency. Gödel’s incompleteness theorems show that there cannot be a complete, consistent and computable theory of the world – or even just of numbers! Any complete and computable theory will be inconsistent. And so the Enlightenment rationalist project, from Leibniz to Hilbert’s programme, was shown to be impossible.

Or so goes the standard story. But the lesson that we must give up on a full understanding of the world in which we live is an enormous pill to swallow. Nearly a century has passed since these events, a century filled with novel advances in logic, and some philosophers and logicians think it is time for a reappraisal.

If the world were a perfect place, we would not need logic. Logic tells us what follows from things we already believe, things we are already committed to. Logic helps us work around our fallible and finite limitations. In a perfect world, the infinite consequences of our beliefs would lie transparently before us. ‘God has no need of any arguments, even good ones,’ said the logician Robert Meyer in 1976: all the truths are apparent before God, and He does not need to deduce one from another. But we are not gods and our world is not perfect. We need logic because we can go wrong, because things do go wrong, and we need guidance. Logic is most important for making sense of the world when the world appears to be senseless.

The story just told ends in failure in part because the logic that Frege, Russell and Hilbert were using was classical logic. Frege assumed something obvious and got a contradiction, but classical logic makes no allowance for contradiction. Because of the classical rule of ex contradictione quodlibet (‘from a contradiction everything follows’), any single contradiction renders the entire system useless. But logic is a theory of validity: an attempt to account for what conclusions really do follow from given premises. As contemporary ‘anti-exceptionalists about logic’ have noted, theories of logic are like everything else in science and philosophy. They are developed and debated by people, and all along there have been disagreements about what the correct theory of logic is. Through that ongoing debate, many have suggested that a single contradiction leading to arbitrary nonsense seems incorrect. Perhaps, then, the rule of ex contradictione itself is wrong, and should not be part of our theory of logic. If so, then perhaps Frege didn’t fail after all.

Over the past decades, logicians have developed mathematically rigorous systems that can handle inconsistency not by eradicating or ‘solving’ it, but by accepting it. Paraconsistent logics create a new opportunity for theories that, on the one hand, seem almost inalienably true (like Frege’s Basic Law V) but, on the other, are known to contain some inconsistencies, such as blunt statements of the form P and not-P. In classical logic, there is a hard choice: give up any inconsistent theory as irrational, or else devolve into apparent mysticism. With these new advances in formal logic, there may be a middle way, whereby sometimes an inconsistency can be retained, not as some mysterious riddle, but rather as a stone-cold rational view of our contradictory world.

Paraconsistent logics have been most famously promoted by Newton da Costa since the 1960s, and Graham Priest since the 1970s. Though viewed initially (and still) with some scepticism, ‘paraconsistent logics’ now have an official mathematics classification code (03B53, according to the American Mathematical Society) and there have been five World Congress of Paraconsistency meetings since 1997. These logics are now studied by researchers across the globe, and hold out the prospect of accomplishing the impossible: recasting the very laws of logic itself to make sense of our sometimes seemingly senseless situation. If it works, it could ground a new sort of Enlightenment project, a rationalism that rationally accommodates some apparent irrationality. On this sort of approach, truth is beholden to rationality; but rationality is also ultimately beholden to truth.
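To make this concrete, here is a small sketch of my own (using standard textbook definitions of Priest’s “Logic of Paradox,” not anything from the article): three truth values, with both T and B (“both true and false”) counting as acceptable, under which a contradiction no longer entails everything:

```python
# Truth values of Priest's LP, ordered F < B < T, where B means
# "both true and false". T and B are the designated ("acceptable") values.
F, B, T = 0, 1, 2
DESIGNATED = {B, T}

def neg(a):
    return 2 - a          # negation swaps T and F, and fixes B

def conj(a, b):
    return min(a, b)      # conjunction takes the lesser value

# Ex contradictione quodlibet: from P and not-P, conclude any Q.
# Valid in classical logic; search all LP valuations for a counterexample.
def explosion_is_valid():
    for p in (F, B, T):
        for q in (F, B, T):
            if conj(p, neg(p)) in DESIGNATED and q not in DESIGNATED:
                return False   # premises acceptable, conclusion not
    return True

print(explosion_is_valid())    # False: P = B, Q = F blocks the inference
```

Assigning P the value B makes “P and not-P” acceptable without forcing an arbitrary Q to be true, which is exactly the middle way described above: the contradiction is retained without the system collapsing into triviality.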

That might sound a little perplexing, so let’s start with a very ordinary example. Suppose . . .

Continue reading. This strikes me as an exciting concept. I personally have been stymied by the way that (classical) logic sometimes leads to a dead end or an unresolved knot. This is an interesting approach that holds the promise of offering guidance in a (classically) inconsistent world (cf. quantum mechanics).

Written by Leisureguy

13 January 2022 at 6:54 am

A century of quantum mechanics questions the fundamental nature of reality


Which is the “true” view of reality: the underlying structure of quantum mechanics, with strange (but observable) phenomena, such as the double-slit experiment? or the emergent reality we experience in our daily life? I thought of Neo’s situation in the Matrix: the underlying structure of bodies embedded in machines that feed and control them vs. the experience of their daily lives, lived in ignorance of those machines. Do you want the red pill? or the blue one?

Tom Siegfried writes in Science News:

Scientists are like prospectors, excavating the natural world seeking gems of knowledge about physical reality. And in the century just past, scientists have dug deep enough to discover that reality’s foundations do not mirror the world of everyday appearances. At its roots, reality is described by the mysterious set of mathematical rules known as quantum mechanics.

Conceived at the turn of the 20th century and then emerging in its full form in the mid-1920s, quantum mechanics is the math that explains matter. It’s the theory for describing the physics of the microworld, where atoms and molecules interact to generate the world of human experience. And it’s at the heart of everything that made the century just past so dramatically unlike the century preceding it. From cell phones to supercomputers, DVDs to pdfs, quantum physics fueled the present-day electronics-based economy, transforming commerce, communication and entertainment.

But quantum theory taught scientists much more than how to make computer chips. It taught that reality isn’t what it seems.

“The fundamental nature of reality could be radically different from our familiar world of objects moving around in space and interacting with each other,” physicist Sean Carroll suggested in a recent tweet. “We shouldn’t fool ourselves into mistaking the world as we experience it for the world as it really is.”

In a technical paper backing up his tweet, Carroll notes that quantum theory consists of equations that describe mathematical entities roaming through an abstract realm of possible natural events. It’s plausible, Carroll argues, that this quantum realm of mathematical possibilities represents the true, fundamental nature of reality. If so, all the physical phenomena we perceive are just a “higher-level emergent description” of what’s really going on.

“Emergent” events in ordinary space are real in their own way, just not fundamental, Carroll allows. Belief that the “spatial arena” is fundamental “is more a matter of convenience and convention than one of principle,” he says.

Carroll’s perspective is not the only way of viewing the meaning of quantum math, he acknowledges, and it is not fully shared by most physicists. But everybody does agree that quantum physics has drastically remodeled humankind’s understanding of nature. In fact, a fair reading of history suggests that quantum theory is the most dramatic shift in science’s conception of reality since the ancient Greeks deposed mythological explanations of natural phenomena in favor of logic and reason. After all, quantum physics itself seems to defy logic and reason. [And see the next post on the limits of logic and reason — and how those break down. – LG]

It doesn’t, of course. Quantum theory represents the ultimate outcome of superior logical reasoning, arriving at truths that could never be discovered merely by observing the visible world.

It turns out that in the microworld — beyond the reach of the senses — phenomena play a game with fantastical rules. Matter’s basic particles are not tiny rocks, but more like ghostly waves that maintain multiple possible futures until forced to assume the subatomic equivalent of substance. As a result, quantum math does not describe a relentless cause-and-effect sequence of events as Newtonian science had insisted. Instead science morphs from dictator to oddsmaker; quantum math tells only probabilities for different possible outcomes. Some uncertainty always remains.

The quantum revolution

The discovery of quantum uncertainty was what first impressed the world with the depth of the quantum revolution. German physicist Werner Heisenberg, in 1927, astounded the scientific community with the revelation that deterministic cause-and-effect physics failed when applied to atoms. It was impossible, Heisenberg deduced, to measure both the location and velocity of a subatomic particle at the same time. If you measured one precisely, some uncertainty remained for the other.
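[In symbols, Heisenberg's relation puts a hard floor under the product of the two uncertainties:

\[ \Delta x \, \Delta p \;\geq\; \frac{\hbar}{2} \]

where \(\Delta x\) is the uncertainty in position, \(\Delta p\) the uncertainty in momentum, and \(\hbar\) the reduced Planck constant. Sharpen the measurement of one and the other must blur. – LG]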

“A particle may have an exact place or an exact speed, but it can not have both,” as Science News Letter, the predecessor of Science News, reported in 1929. “Crudely stated, the new theory holds that chance rules the physical world.” Heisenberg’s uncertainty principle “is destined to revolutionize the ideas of the universe held by scientists and laymen to an even greater extent than Einstein’s relativity.”

Heisenberg’s breakthrough was the culmination of a series of quantum surprises. First came German physicist Max Planck’s discovery, in 1900, that light and other forms of radiation could be absorbed or emitted only in discrete packets, which Planck called quanta. A few years later Albert Einstein argued that light also traveled through space as packets, or particles, later called photons. Many physicists dismissed such early quantum clues as inconsequential. But in 1913, the Danish physicist Niels Bohr used quantum theory to explain the structure of the atom. Soon the world realized that reality needed reexamining.
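[Planck's quanta obey a simple rule: a packet of radiation at frequency \(\nu\) carries energy

\[ E = h\nu \]

where \(h\) is Planck's constant. Einstein's photon argument applied the same relation to light traveling through space. – LG]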

By 1921, awareness of the quantum revolution had begun to expand beyond the confines of physics conferences. In that year,  . . .

Continue reading.

Written by Leisureguy

13 January 2022 at 6:43 am

How Imaginary Numbers Were Invented

leave a comment »

This video touches on the always tricky question, “What is ‘reality’?”, as does the next post.
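A quick sketch (my own, not taken from the video) of the historical puzzle the video covers: Cardano's formula for the cubic can force a detour through imaginary numbers even when every root is real. Python's built-in complex arithmetic makes the classic example, Bombelli's cubic, easy to check; the variable names here are just illustrative.

```python
import cmath

# Cardano's formula for the depressed cubic x^3 = p*x + q,
# applied to Bombelli's classic example x^3 = 15x + 4.
p, q = 15, 4

# The radicand (q/2)^2 - (p/3)^3 = 4 - 125 = -121 is negative,
# so the square root is imaginary: 11i.
disc = cmath.sqrt((q / 2) ** 2 - (p / 3) ** 3)

# Principal cube roots of 2 + 11i and 2 - 11i are 2 + i and 2 - i;
# their imaginary parts cancel when summed.
u = (q / 2 + disc) ** (1 / 3)
v = (q / 2 - disc) ** (1 / 3)
x = u + v

assert abs(x - 4) < 1e-9  # the real root x = 4 (check: 4^3 = 15*4 + 4)
print(f"x ≈ {x.real:.6f}")
```

The imaginary quantities appear only in the intermediate steps and vanish in the final answer, which is what convinced mathematicians the new numbers were worth taking seriously.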

Written by Leisureguy

13 January 2022 at 6:31 am

Posted in Daily life, Math, Video
