Later On

A blog written for those whose interests more or less match mine.

A Theory of Reality as More Than the Sum of Its Parts


Emergence fascinates me, since the emergent phenomenon can seem totally different from—almost unrelated to—that from which it emerges (e.g., life from atoms). Natalie Wolchover writes in Quanta:

In his 1890 opus, The Principles of Psychology, William James invoked Romeo and Juliet to illustrate what makes conscious beings so different from the particles that make them up.

“Romeo wants Juliet as the filings want the magnet; and if no obstacles intervene he moves towards her by as straight a line as they,” James wrote. “But Romeo and Juliet, if a wall be built between them, do not remain idiotically pressing their faces against its opposite sides like the magnet and the filings. … Romeo soon finds a circuitous way, by scaling the wall or otherwise, of touching Juliet’s lips directly.”

Erik Hoel, a 29-year-old theoretical neuroscientist and writer, quoted the passage in a recent essay in which he laid out his new mathematical explanation of how consciousness and agency arise. The existence of agents — beings with intentions and goal-oriented behavior — has long seemed profoundly at odds with the reductionist assumption that all behavior arises from mechanistic interactions between particles. Agency doesn’t exist among the atoms, and so reductionism suggests agents don’t exist at all: that Romeo’s desires and psychological states are not the real causes of his actions, but merely approximate the unknowably complicated causes and effects between the atoms in his brain and surroundings.

Hoel’s theory, called “causal emergence,” roundly rejects this reductionist assumption.

“Causal emergence is a way of claiming that your agent description is really real,” said Hoel, a postdoctoral researcher at Columbia University who first proposed the idea with Larissa Albantakis and Giulio Tononi of the University of Wisconsin, Madison. “If you just say something like, ‘Oh, my atoms made me do it’ — well, that might not be true. And it might be provably not true.”

Using the mathematical language of information theory, Hoel and his collaborators claim to show that new causes — things that produce effects — can emerge at macroscopic scales. They say coarse-grained macroscopic states of a physical system (such as the psychological state of a brain) can have more causal power over the system’s future than a more detailed, fine-grained description of the system possibly could. Macroscopic states, such as desires or beliefs, “are not just shorthand for the real causes,” explained Simon DeDeo, an information theorist and cognitive scientist at Carnegie Mellon University and the Santa Fe Institute who is not involved in the work, “but it’s actually a description of the real causes, and a more fine-grained description would actually miss those causes.”

“To me, that seems like the right way to talk about it,” DeDeo said, “because we do want to attribute causal properties to higher-order events [and] things like mental states.”

Hoel and collaborators have been developing the mathematics behind their idea since 2013. In a May paper in the journal Entropy, Hoel placed causal emergence on a firmer theoretical footing by showing that macro scales gain causal power in exactly the same way, mathematically, that error-correcting codes increase the amount of information that can be sent over information channels. Just as codes reduce noise (and thus uncertainty) in transmitted data — Claude Shannon’s 1948 insight that formed the bedrock of information theory — Hoel claims that macro states also reduce noise and uncertainty in a system’s causal structure, strengthening causal relationships and making the system’s behavior more deterministic.
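To see the Shannon side of that analogy concretely, here is a small, hypothetical Python sketch, my own toy example rather than anything from Hoel’s paper: it compares the information a single bit carries across a noisy binary symmetric channel with the information carried after a simple 3-bit repetition code and majority-vote decoding. The flip probability and helper functions are illustrative assumptions.

```python
# Illustrative sketch (not from Hoel's paper): Shannon's observation that coding
# reduces uncertainty. A binary symmetric channel flips each bit with probability p;
# a 3-bit repetition code with majority-vote decoding lowers the effective flip
# probability, so the decoded bit carries more information about the input bit.
import math

def binary_entropy(p):
    """Entropy H(p) in bits of a biased coin with heads probability p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_information_bsc(p):
    """I(input; output) for a binary symmetric channel with a uniform input bit."""
    return 1.0 - binary_entropy(p)

p = 0.2                                  # assumed raw flip probability of the channel
p_majority = 3 * p**2 * (1 - p) + p**3   # flip probability after majority vote over 3 copies

print(f"uncoded bit:      I = {mutual_information_bsc(p):.3f} bits")           # ~0.28 bits
print(f"3-bit repetition: I = {mutual_information_bsc(p_majority):.3f} bits")  # ~0.52 bits
```

The repetition code buys its reduced uncertainty by spending three channel uses per decoded bit, so the transmission rate drops; the analogy to causal emergence concerns the gain in reliability per symbol, not the rate.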

“I think it’s very significant,” George Ellis, a South African cosmologist who has also written about top-down causation in nature, said of Hoel’s new paper. Ellis thinks causal emergence could account for many emergent phenomena such as superconductivity and topological phases of matter. Collective systems like bird flocks and superorganisms — and even simple structures like crystals and waves — might also exhibit causal emergence, researchers said.

The work on causal emergence is not yet widely known among physicists, who for centuries have taken a reductionist view of nature and largely avoided further philosophical thinking on the matter. But at the interfaces between physics, biology, information theory and philosophy, where puzzles crop up, the new ideas have generated excitement. Their ultimate usefulness in explaining the world and its mysteries — including consciousness, other kinds of emergence, and the relationships between the micro and macro levels of reality — will come down to whether Hoel has nailed the notoriously tricky notion of causation: Namely, what’s a cause? “If you brought 20 practicing scientists into a room and asked what causation was, they would all disagree,” DeDeo said. “We get mixed up about it.”

A Theory of Cause

In a fatal drunk driving accident, what’s the cause of death? Doctors name a ruptured organ, while a psychologist blames impaired decision-making abilities and a sociologist points to permissive attitudes toward alcohol. Biologists, chemists and physicists, in turn, see ever more elemental causes. “Famously, Aristotle had a half-dozen notions of causes,” DeDeo said. “We as scientists have rejected all of them except things being in literal contact, touching and pushing.”

The true causes, to a physicist, are the fundamental forces acting between particles; all effects ripple out from there. Indeed, these forces, when they can be isolated, appear perfectly deterministic and reliable — physicists can predict with high precision the outcomes of particle collisions at the Large Hadron Collider, for instance. In this view, causes and effects become hard to predict from first principles only when there are too many variables to track.

Furthermore, philosophers have argued that causal power existing at two scales at once would be twice what the world needs; to avoid double-counting, the “exclusion argument” says all causal power must originate at the micro level. But it’s almost always easier to discuss causes and effects in terms of macroscopic entities. When we look for the cause of a fatal car crash, or Romeo’s decision to start climbing, “it doesn’t seem right to go all the way down to microscopic scales of neurons firing,” DeDeo said. “That’s where Erik [Hoel] is jumping in. It’s a bit of a bold thing to do to talk about the mathematics of causation.”

Friendly and large-limbed, Hoel grew up reading books at Jabberwocky, his family’s bookstore in Newburyport, Massachusetts. He studied creative writing as an undergraduate and planned to become a writer. (He still writes fiction and has started a novel.) But he was also drawn to the question of consciousness — what it is, and why and how we have it — because he saw it as an immature scientific subject that allowed for creativity. For graduate school, he went to Madison, Wisconsin, to work with Tononi — the only person at the time, in Hoel’s view, who had a truly scientific theory of consciousness.

Tononi conceives of consciousness as information: bits that are encoded not in the states of individual neurons, but in the complex networking of neurons, which link together in the brain into larger and larger ensembles. Tononi argues that this special “integrated information” corresponds to the unified, integrated state that we experience as subjective awareness. Integrated information theory has gained prominence in the last few years, even as debates have ensued about whether it is an accurate and sufficient proxy for consciousness. But when Hoel first got to Madison in 2010, he became Tononi’s first collaborator on the theory.

Tononi tasked Hoel with exploring the general mathematical relationship between scales and information. The scientists later focused on how the amount of integrated information in a neural network changes as you move up the hierarchy of spatiotemporal scales, looking at links between larger and larger groups of neurons. They hoped to figure out which ensemble size might be associated with maximum integrated information — and thus, possibly, with conscious thoughts and decisions. Hoel taught himself information theory and plunged into the philosophical debates around consciousness, reductionism and causation.

Hoel soon saw that understanding how consciousness emerges at macro scales would require a way of quantifying the causal power of brain states. He realized, he said, that “the best measure of causation is in bits.” He also read the works of the computer scientist and philosopher Judea Pearl, who in the 1990s developed a logical language for studying causal relationships called causal calculus. With Albantakis and Tononi, Hoel devised a measure of causal power called “effective information,” which indicates how effectively a particular state influences the future state of a system. (Effective information can be used to help calculate integrated information, but it is simpler and more general and, as a measure of causal power, does not rely on Tononi’s other ideas about consciousness.)
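As a rough illustration of what measuring causation in bits can look like, here is a minimal sketch of effective information applied to a discrete system described by a transition probability matrix: following the definition used in Hoel, Albantakis and Tononi’s work, intervene with a uniform (maximum-entropy) distribution over the system’s states and take the mutual information between that intervention and the resulting next state. The function name and the two toy matrices below are my own illustrative choices.

```python
# Hypothetical sketch of effective information (EI) for a discrete system whose
# dynamics are given by tpm[i][j] = P(next state = j | current state = i).
# EI is the mutual information between a uniform ("try every state equally often")
# intervention distribution and the resulting distribution over next states.
import numpy as np

def effective_information(tpm):
    tpm = np.asarray(tpm, dtype=float)
    n = tpm.shape[0]
    intervention = np.full(n, 1.0 / n)   # maximum-entropy intervention over states
    effect = intervention @ tpm          # resulting distribution over next states
    ei = 0.0
    for i in range(n):
        for j in range(n):
            if tpm[i, j] > 0:
                ei += intervention[i] * tpm[i, j] * np.log2(tpm[i, j] / effect[j])
    return ei

# Deterministic two-state system: each state reliably causes the other.
print(effective_information([[0, 1],
                             [1, 0]]))        # 1.0 bit

# Completely noisy two-state system: the current state says nothing about the next.
print(effective_information([[0.5, 0.5],
                             [0.5, 0.5]]))    # 0.0 bits
```

A fully deterministic, non-degenerate system scores the maximum (one bit for two states), while a fully noisy one scores zero, matching the idea that effective information tracks how reliably the present state picks out the future one.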

The researchers showed that in simple models of neural networks, the amount of effective information increases as you coarse-grain over the neurons in the network — that is, treat groups of them as single units. The possible states of these interlinked units form a causal structure, where transitions between states can be mathematically modeled using so-called Markov chains. At a certain macroscopic scale, . . .

Continue reading.
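To make the coarse-graining step concrete, here is a rough, hypothetical comparison in the spirit of the toy models in Hoel and collaborators’ papers; the transition matrices are my own inventions, not taken from the article. Three of four micro states behave identically and noisily, so grouping them into a single macro state yields deterministic macro dynamics whose effective information exceeds that of the micro description.

```python
# Hypothetical micro-versus-macro comparison (illustrative matrices, not from the article).
# Micro system: 4 states. States 0, 1, 2 behave identically and noisily (each jumps to
# one of 0, 1, 2 at random); state 3 maps to itself. Macro system: group {0, 1, 2} into
# "OFF" and {3} into "ON"; the coarse-grained dynamics are deterministic.
import numpy as np

def effective_information(tpm):
    """Same effective-information measure sketched above: mutual information between
    a uniform intervention on states and the resulting next state."""
    tpm = np.asarray(tpm, dtype=float)
    n = tpm.shape[0]
    intervention = np.full(n, 1.0 / n)
    effect = intervention @ tpm
    return sum(intervention[i] * tpm[i, j] * np.log2(tpm[i, j] / effect[j])
               for i in range(n) for j in range(n) if tpm[i, j] > 0)

micro_tpm = [[1/3, 1/3, 1/3, 0],
             [1/3, 1/3, 1/3, 0],
             [1/3, 1/3, 1/3, 0],
             [0,   0,   0,   1]]

macro_tpm = [[1, 0],   # OFF -> OFF
             [0, 1]]   # ON  -> ON

print(f"micro EI = {effective_information(micro_tpm):.3f} bits")   # ~0.81 bits
print(f"macro EI = {effective_information(macro_tpm):.3f} bits")   # 1.00 bit
```

That gap, with the macro description’s effective information exceeding the micro description’s, is what the authors call causal emergence.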

Written by LeisureGuy

1 June 2017 at 2:02 pm

Posted in Daily life, Math, Memes, Science
