Later On

A blog written for those whose interests more or less match mine.

Archive for the ‘Books’ Category

The Theory and Practice of Civic Engagement, by Eric Liu

James Fallows writes in the Atlantic:

If you happen to be in Redlands, California, on Thursday evening, September 21, I suggest you go by the headquarters of the tech company Esri to hear a talk by my friend Eric Liu, on the practical possibilities for civic engagement in our politically troubled age.

If you don’t happen to be in Redlands, I recommend getting Eric’s book, You Are More Powerful Than You Think. It addresses a central question of this politically troubled age: what, exactly, citizens who are unhappy with national politics can do, other than write a check or await the next chance to vote.

This is a question I wrestled with immediately after last year’s election, in this Atlantic article, and in a commencement speech a few months later. But Eric, author of several previous books about the theory and practice of citizenship (including The Gardens of Democracy and A Chinaman’s Chance) and head of the Citizen University network, based in Seattle, has devoted his useful and enlightening new book to just this topic, in the age of Trump. He described some of its principles in an NYT interview with David Bornstein a few months ago. Essentially his topic is how to bridge the gap between thinking, “something should be done,” and actually taking steps to do that something, on your own and with others. This also is the ongoing theme of Citizen University, which emphasizes that citizenship is a job in addition to being a status.

I’ll leave the details, of which there are many, to Eric — on the podium in Redlands or in the pages of his book. The high-concept part of his argument flows from these three axioms:

  • Power creates monopolies, and is winner-take-all. → You must change the game.
  • Power creates a story of why it’s legitimate. → You must change the story.
  • Power is assumed to be finite and zero-sum. → You must change the equation.

He goes on, in practical terms, to illustrate what these mean. The political question of this era (as discussed here) is how the resilient qualities of American civic society match up against the challenges presented by the lurches of Donald Trump. Can the judiciary adhere to pre-2017 standards? How will the Congress fare in its ongoing search for a soul? Will states and cities maintain their policies on the environment, on standards of justice, on treatment of refugees and immigrants? And how, fundamentally, can citizens play a more active and powerful role in the affairs of their nation? These and others are central struggles of our time. And Eric Liu’s book is part of the effort to push the outcome in a positive direction.

Written by LeisureGuy

21 September 2017 at 7:33 pm

Posted in Books, Daily life, Politics

Why Hillary Clinton’s Book Is Actually Worth Reading

James Fallows has a very interesting column on Hillary Clinton and her book. It begins:

Most books by politicians are bad. They’re bad because they are cautious, or pious, or boring, or some even-worse combination of all three.

They’re cautious because over the years politicians learn they have more to lose than gain by taking “interesting” or edgy stands. (Something I learned when working as a campaign and White House speechwriter: In “normal” writing, your goal is to make your meaning as clear as possible, ideally in a memorable way. For a politician, the goal is to make the meaning just clear enough that most people will still agree with you. Clearer than that, and you’re in trouble.)

They’re pious because in one way or another the “revealing” stories about the authors are really campaign ads—for future elections by politicians with a big race still ahead of them, or for history’s esteem by senior figures looking back. Thus politicians’ biographies fall into the general categories of humble-brag (most of them) or braggy-brag (Trump’s).

And they’re boring because they’re necessarily often about policy. That’s hard enough to make interesting in the hands of very skillful writers, from Rachel Carson and Jane Jacobs to John Hersey and Michael Lewis. If politicians turning out books on “Our Schools: A New Blueprint!” were comparably skilled as writers, they’d be making their livings without having to bother with PACs and polls.

Of course there are exceptions. Some autobiographical books manage to be interesting because they’re written early enough not to be swathed in campaign caution (Barack Obama’s Dreams From My Father), or come from a quirky-enough sensibility to avoid normal constraints (Jimmy Carter’s Why Not the Best?), or are from performers talented enough to work subversively within the constraints (Al Franken’s Giant of the Senate, which is the kind of book Will Rogers might have written if he had made it into the Senate). And of course some all-out manifestos with an edge can shape the evolution of politics. Barry Goldwater’s Conscience of a Conservative didn’t get him into the White House, but it competed with the works of Ayn Rand on many conservatives’ bookshelves and lastingly shaped a movement.

I don’t know whether Hillary Clinton’s previous books were good or bad. I didn’t read them, because I assumed they were normal politician-books. But What Happened is not a standard work in this oeuvre. It’s interesting; it’s worth reading; and it sets out questions that the press, in particular, has not done enough to face.

* * *

On the overall interesting-ness of the book, I refer you to Megan Garber’s extensive analysis of the different personas Hillary Clinton has presented through her now very long public career, and the much less guarded one that comes through in What Happened. By the previously mentioned depressing standards of most political books, this one isn’t cautious (because the author convincingly claims she’s not running for anything any more), it’s not (very) pious (because she favors an acid-humor tone), and most of it is not boring (because most of it is not directly about policy).

As an example of why it’s interesting, consider the opening scene, about how Clinton dealt with the inauguration ceremony in which she might have expected to be sworn in herself, but instead sat there watching Donald Trump take the oath. She’d wondered whether she had to show up at all, and talked with former presidents George W. Bush and Jimmy Carter, each of whom had called her right after the news of her loss sank in: . . .

Continue reading.

Written by LeisureGuy

15 September 2017 at 10:09 am

What Happens When War Is Outlawed

Louis Menand has a very interesting essay and book review in the New Yorker:

On August 27, 1928, in Paris, with due pomp and circumstance, representatives of fifteen nations signed an agreement outlawing war. The agreement was the unanticipated fruit of an attempt by the French Foreign Minister, Aristide Briand, to negotiate a bilateral treaty with the United States in which each nation would renounce the use of war as an instrument of policy toward the other. The American Secretary of State, Frank Kellogg, had been unenthusiastic about Briand’s idea. He saw no prospect of going to war with France and therefore no point in promising not to, and he suspected that the proposal was a gimmick designed to commit the United States to intervening on France’s behalf if Germany attacked it (as Germany did in 1914). After some delay and in response to public pressure, Kellogg told Briand that his idea sounded great. Who wouldn’t want to renounce war? But why not make the treaty multilateral, and have it signed by “all the principal powers of the world”? Everyone would renounce the use of war as an instrument of policy.

Kellogg figured that he had Briand outfoxed. France had mutual defense treaties with many European states, and it could hardly honor those treaties if it agreed to renounce war altogether. But the agreement was eventually worded in a way that left sufficient interpretive latitude for Briand and other statesmen to see their way clear to signing it, and the result was the General Treaty for the Renunciation of War, also known as the Paris Peace Pact or the Kellogg-Briand Pact. By 1934, sixty-three countries had joined the Pact—virtually every established nation on earth at the time.

The Treaty of Versailles, signed in 1919, gets bad press. It imposed punitive conditions on Germany after the First World War and is often blamed for the rise of Hitler. The Kellogg-Briand Pact does not get bad press. It gets no press. That’s because the treaty went into effect on July 24, 1929, after which the following occurred: Japan invaded Manchuria (1931); Italy invaded Ethiopia (1935); Japan invaded China (1937); Germany invaded Poland (1939); the Soviet Union invaded Finland (1939); Germany invaded Denmark, Norway, Belgium, the Netherlands, Luxembourg, and France and attacked Great Britain (1940); and Japan attacked the United States (1941), culminating in a global war that produced the atomic bomb and more than sixty million deaths. A piece of paper signed in Paris does not seem to have presented an obstacle to citizens of one country engaging in the organized slaughter of the citizens of other countries.

In modern political history, therefore, the Paris Peace Pact, if it is mentioned at all, usually gets a condescending tip of the hat or is dutifully registered in a footnote. Even in books on the law of war, little is made of it. There is not a single reference to it in the political philosopher Michael Walzer’s “Just and Unjust Wars,” a classic work published in 1977. The summary on the U.S. State Department’s Web site is typical: “In the end, the Kellogg-Briand Pact did little to prevent World War II or any of the conflicts that followed. Its legacy remains as a statement of the idealism expressed by advocates for peace in the interwar period.”

The key term in that sentence is “idealism.” In international relations, an idealist is someone who believes that foreign policy should be based on universal principles, and that nations will agree to things like the outlawry of war because they perceive themselves as sharing a harmony of interests. War is bad for every nation; therefore, it is in the interests of all nations to renounce it.

An alternative theory is (no surprise) realism. A realist thinks that a nation’s foreign policy should be guided by a cold consideration of its own interests. To a realist, the essential condition of international politics is anarchy. There is no supreme law governing relations among sovereign states. When Germany invades France, France cannot take Germany to court. There are just a lot of nations out there, each trying to secure and, if possible, extend its own power. We don’t need to judge the morality of other nations’ behavior. We only need to ask whether the interests of our nation are affected by it. We should be concerned not with some Platonic harmony of interests but with the very real balance of power.

A standard way to write the history of twentieth-century international relations is to cast as idealists figures like Woodrow Wilson, who, in 1917, entered the United States into a European war to make the world “safe for democracy,” and the other liberal internationalists who came up with the League of Nations and the Kellogg-Briand Pact. The Second World War proved these people spectacularly wrong about how nations behave, and they were superseded by the realists.

To the realists, such Wilsonian ideas as world government and the outlawry of war were quixotic. Nations should recognize that conflict is endemic to the international arena, and they should not expend blood and treasure in the name of an abstraction. Containment, the American Cold War policy of preventing the Soviet Union from expanding without otherwise intervening in its affairs, was a realist policy. Communists could run their own territories however they liked as long as they stayed inside their boxes. If our system was better, theirs would eventually implode; if theirs was better, ours would. The author of that policy, the diplomat George Kennan, called the Kellogg-Briand Pact “childish, just childish.”

And yet since 1945 nations have gone to war against other nations very few times. When they have, most of the rest of the world has regarded the war as illegitimate and, frequently, has organized to sanction or otherwise punish the aggressor. In only a handful of mostly minor cases since 1945—the Russian seizure of Crimea in 2014 being a flagrant exception—has a nation been able to hold on to territory it acquired by conquest.

Historians have suggested several reasons for this drop in the incidence of interstate war. The twenty years after the Second World War were a Pax Americana. By virtue of the tremendous damage suffered in the war by all the other powers, the United States became a global hegemon. America kept the peace (on American terms, of course) because no other country had the military or economic capacity to challenge it. This is the “great” America that some seventy-five million American voters in the last Presidential election were born in, and that many of them have been convinced can be resurrected by shutting the rest of the world out—which would be a complete reversal of the policy mind-set that made the United States a dominant power back when those voters were children.

By the nineteen-seventies, the rest of the world had caught up, and students of international affairs began to predict that, in the absence of a credible global policeman, there would be a surge in the number of armed conflicts around the world. When this didn’t happen, various explanations were ventured. One was that the existence of nuclear weapons had changed the calculus that nations used to judge their chances in a war. Nuclear weapons now operated as a general deterrence to aggression.

Other scholars proposed that the spread of democracy—including, in the late nineteen-eighties and early nineties, the Velvet Revolution in Eastern Europe and the dismembering of the Soviet Union—made the world a more peaceable place. Historically, democracies have not gone to war with other democracies. It was also argued that globalization, the interconnectedness of international trade, had rendered war less attractive. When goods are the end products of a worldwide chain of manufacture and distribution, a nation that goes to war risks cutting itself off from vital resources.

In “The Internationalists” (Simon & Schuster), two professors at Yale Law School, Oona A. Hathaway and Scott J. Shapiro, present another explanation for the decline in interstate wars since 1945. They think that nations rarely go to war anymore because war is illegal, and has been since 1928. In their view, the signing of the Kellogg-Briand Pact was not a Dr. Seuss parable with funny characters in striped trousers and top hats. The treaty did what its framers intended it to do: it effectively ended the use of war as an instrument of national policy.

Then what about the Japanese invasion of Manchuria, the Italian invasion of Ethiopia, and so on, down to the Japanese bombing of Pearl Harbor? Those actions were carried out by nations that were among the Pact’s original signatories, and they clearly violated its terms. According to Hathaway and Shapiro, the invasions actually turned out to be proof of the Pact’s effectiveness, because the Second World War was fought to punish aggression. The Allied victory was the triumph of Kellogg-Briand.

O.K., so what about the deterrent effect of nuclear weapons? The spread of democracy? Free trade and globalization? Isn’t the Kellogg-Briand Pact just a case of post hoc ergo propter hoc—an exercise in feel-good diplomacy that happened to find confirmation many years later in a state of global affairs made possible by other means? On the contrary, Hathaway and Shapiro argue. If war had not been outlawed, none of those other things—deterrence, democracy, trade—would have been possible. The Kellogg-Briand Pact is the explanation that explains all other explanations.

Genuine originality is unusual in political history. “The Internationalists” is an original book. There is something sweet about the fact that it is also a book written by two law professors in which most of the heroes are law professors. Sweet but significant, because one of the points of “The Internationalists” is that ideas matter.

This is something that can be under-recognized in political histories, where the emphasis tends to be on material conditions and relations of power. Hathaway and Shapiro further believe that ideas are produced by human beings, something that can be under-recognized in intellectual histories, which often take the form of books talking to books. “The Internationalists” is a story about individuals who used ideas to change the world.

The cast is appropriately international. Many of the characters are barely known outside scholarly circles, and they are all sketched in as personalities, beginning with the seventeenth-century Dutch polymath Hugo Grotius, who is said to have been the most insufferable pedant of his day. They include the nineteenth-century Japanese philosopher and government official Nishi Amane; the brilliant academic rivals Hans Kelsen, an Austrian Jew, and Carl Schmitt, a book-burning Nazi; the American lawyer Salmon Levinson, who began the outlawry movement in the nineteen-twenties and then got written out of its history by men with bigger egos; and the Czech émigré Bohuslav Ečer and the Galician émigré Hersch Lauterpacht, who helped formulate the arguments that made possible the prosecution of Nazi leaders at Nuremberg and laid the groundwork for the United Nations.

The book covers an enormous stretch of historical ground, from 1603, when a Dutch trader attacked and looted a Portuguese ship in the waters outside Singapore, to the emergence of the Islamic State. The general argument is that it made sense to outlaw war in 1928 because war had previously been deemed a legitimate instrument of national policy.

The key figure in the early part of the story is Grotius, who, in contriving a legal justification for an obviously brigandly Dutch seizure of Portuguese goods off Singapore, eventually produced a volume, “On the Laws of War and Peace,” published in 1625, that Hathaway and Shapiro say became “the textbook on the laws of war.” Grotius argued that wars of aggression are legal as long as states provide justification for them, but that even when the justifications prove to be shams the winners have a right to keep whatever they have managed to seize. In Grotius’s system, to use Hathaway and Shapiro’s formulations, might makes right and possession is ten-tenths of the law.

That doesn’t sound like much of a legal order, but it placed some constraints on what nations could do. For one thing, . . .

Continue reading.

Written by LeisureGuy

11 September 2017 at 3:32 pm

Posted in Books, Government, Law, Military

Naomi Oreskes on the Politics of Climate Change

A Five Books interview with Naomi Oreskes, Professor of the History of Science and Affiliated Professor of Earth and Planetary Sciences at Harvard University:

The risks of climate change are increasingly clear and urgent. And yet, in the United States and some other countries, policies to significantly reduce greenhouse gas emissions do not seem to be working. The US President has called climate change a hoax and pulled the United States out of the Paris Agreement. And about 6.5 percent of global GDP — about 5 trillion dollars a year — goes to subsidising fossil fuels. How did we get into this situation in the first place?

Scientists have known for a long time that an increase in atmospheric greenhouse gases—produced by burning fossil fuel—could change the climate. By the late 1970s, it was clear that greenhouse gases were accumulating in the atmosphere, and scientists concluded that this would produce observable effects, probably by the end of the century. However, the effects came sooner than they expected: in 1988, scientists at NASA, led by James Hansen, concluded that anthropogenic climate change was underway.

Hansen’s work got a good deal of attention. He testified in Congress. It was reported in the New York Times. And that same year the Intergovernmental Panel on Climate Change was created, in anticipation that the world would need good scientific information to inform policy decisions on the issue. Most scientists involved at the time thought that there would soon be a political response. And there was, but it was not the one they expected.

Until that time, there was no political resistance to climate science. Many climate scientists were Republicans, and throughout most of the post-war period, Republican political and business leaders had supported scientific research as strongly as, if not more strongly than, Democratic leaders did. But, in the 1980s—just as the reality of climate change was being established scientifically—some people began to realise that if anthropogenic climate change was as dangerous as scientists thought, it would require government action to deal with it. In particular it would require government intervention in the marketplace, such as regulation or taxation to reduce or even eliminate the use of fossil fuels.

In this sense, it was similar to acid rain and stratospheric ozone depletion, as well as to the problem of tobacco use. If you were a liberal Democrat, and you didn’t have any particular objection to government intervention in the marketplace, that wasn’t a problem for you, and there was no particular reason to object to the scientific findings. But if you were a conservative Republican who objected to those interventions, then it was a problem for you.

Some conservatives — particularly a group of Cold War scientists with links to the Reagan administration, who feared that government intervention in the marketplace was the slippery slope to socialism — began to question the science around all these issues. In our work, we discovered that they had also worked with the tobacco industry, on the grounds that controlling tobacco would lead to an increase in government control of our lives in general. Today that argument is often referred to as the problem of the “nanny state,” but they thought it was much more nefarious than that. They equated government control of the marketplace with Soviet-style totalitarianism. In this, they took inspiration from the neo-liberal economist Milton Friedman and his mentor, Friedrich von Hayek.

Working with the tobacco industry, they developed a set of strategies and tactics intended to undermine the scientific evidence of the harms of smoking and to prevent the government from controlling tobacco or even trying to discourage its use. They then applied those strategies and tactics to climate change.

At first, their arguments were taken up by conservative and libertarian think tanks in Washington, DC, such as the Competitive Enterprise Institute, the Cato Institute, and the George C. Marshall Institute, which started to promote doubt and uncertainty about climate science. But soon, the fossil fuel industry was funding them. An alliance developed between powerful fossil fuel companies, such as Exxon Mobil and Peabody Coal, and think tanks such as Cato, to promote doubt about climate science and prevent government action. In the mid-1990s, it became their goal to prevent the US from signing the Kyoto Protocol to the UN Framework Convention on Climate Change.

Although the UNFCCC had been signed in 1992 by a Republican President — George H.W. Bush — by the late 1990s nearly all Republicans had aligned against it. And things went downhill from there. As the scientific evidence of climate change became stronger, and Democrats accepted it and started to propose legislation to deal with it, Republicans became more and more entrenched in rejecting it. Things went from bad to worse, as, at first, only extremists in the Republican party went into fully-fledged denial, but by the late 2000s, climate change denial had become routine. If you look at the candidates who ran in the Republican primary in 2016, only John Kasich had a position consistent with the findings of the scientific community. Donald Trump, of course, infamously claims climate change is a hoax, but Ted Cruz propagated the canard that warming had stopped in the 2000s. So in various ways, most Republicans in recent years have taken positions that refuse to accept the scientific evidence. And here we are. There are other elements to the story too, like the various advertising campaigns that fossil fuel companies ran to cast doubt upon climate science, but that is the core of the matter. In short, a confluence of economic interest and political ideology, which came to dominate conservative thinking in the USA, has led to the wholesale rejection of the findings of climate scientists by American conservatives as individuals and by the Republican party as an institution.

You’ve recently published a paper about ExxonMobil’s communications strategy from 1977 to 2014. What do your findings tell us about the state of the politics of climate change?

It tells us that things are bad for a reason. Many people want to say that we’re in this mess because people don’t think straight, are irrational, or aren’t clear-headed about dangers that they think are far in the future. And, of course, there’s an element of that in this story, but there’s also a very big elephant in the room, which is the long history of organised systematic climate change denial. My paper, co-written with Geoffrey Supran, speaks to that. Other people have already written about some of the activities that ExxonMobil was involved with in the past, such as the Global Climate Coalition, a group that in the 1990s worked to prevent the United States from signing on to Kyoto by trying to challenge the scientific basis for it. We tried to do something a little more systematic than what had been done before. Two years ago, the Los Angeles Times and Inside Climate News published a series of investigative journalism pieces in which they looked at archival documents that reflected the work that ExxonMobil had done on the issue of climate change going back to the 1970s. They showed that the company was well aware as long ago as 1979 of climate change as a risk that would affect their business and had some interesting and serious climate science research going on even within the company. They also found that company employees were collaborating with academics at New York University and in government laboratories to try and better understand the potential threat and what it might mean for the petroleum industry.

When the Los Angeles Times and Inside Climate News published their articles, ExxonMobil claimed they were false and wrong, and that the reporters had cherry-picked the documents. On its website, the corporation issued a challenge. It posted a set of documents that it claimed supported its position and refuted the ICN and LA Times reporting. And it challenged the public, saying “read the documents” and make up your own mind.

Geoffrey Supran and I took up the challenge. We read all the documents that Inside Climate News published, we read all the documents that ExxonMobil claimed refuted the Inside Climate News findings, and we also read a set of advertorials – paid advertisements – that ExxonMobil had taken out mostly in the 1990s and early 2000s. And we compared these different communications. What our comprehensive comparison shows beyond any reasonable doubt is that inside ExxonMobil there was a conversation going on that was fully consistent with the evolving science that climate change was real, that it was a serious threat, and that it could lead to oil and gas assets being stranded, but in public ExxonMobil made a decision to run a series of advertisements aimed at the American people in which the message was a message of uncertainty and doubt. Since we published our paper, ExxonMobil has continued to mislead the public about its history of misleading the public.

Your first book choice is The Great Derangement by Indian novelist Amitav Ghosh. Those who know him as a novelist may wonder what he has to say about the politics of climate change.

Quite a lot, as anyone who reads the book will see. It’s absolutely fascinating on a number of levels. First, we have a famous, articulate and politically astute novelist taking up the issue of climate change. I think that’s extremely important because one of the arguments that Amitav makes in this book, which I agree with one hundred percent, is that for too long this problem has been discussed as a scientific question; it’s mostly been covered by science journalists and written up in the science pages of the newspapers. But it’s fundamentally no longer a scientific question. The science — the key scientific issues — has been resolved now for a long time, but it’s a political question because we have to do something about it. It’s an economic question because it has to do with how we run our economies based on fossil fuels, and it’s also a deeply historical question.

Amitav looks at the long history of fossil fuel exploitation and the way it’s linked to colonialism and post-colonialism, and makes the argument that if we’re going to fix this problem, we have to understand the larger historical, economic, and social context as well. The book is also an explicit call for humanists — writers and authors and novelists and others — to become engaged and think through: How did we get into this situation? And how do we get out of it? And as Amitav says, it’s a kind of derangement. We’re on a path that is going to lead to tremendous destruction — what has just happened this week in Houston and Mumbai and Barbuda is exhibit A — and yet most of us are going about our lives as if nothing particularly special is happening. And, as we know, American politicians are going about their lives still in many cases in denial about the basic framework of this problem.

You called Houston exhibit A, but if more and more extreme weather events are part of climate change one could say it’s exhibit F.

You’re right, I only said exhibit A in the sense that it’s the most obvious and immediate right in this moment in American life. But, of course, you’re absolutely right. We’ve had Superstorm Sandy, Hurricane Katrina, the Russian fires of a few years ago, and the European heat waves of 2003, not to mention the recent floods in South Asia. There have been all kinds of incidents where we have seen what I call the human face of global warming. We’ve seen how climate change is already impacting people – causing damage and causing death – but somehow we don’t assimilate that. This is the point that Amitav Ghosh is calling ‘the great derangement’, that there is something frankly deranged about having all these things happening in front of our faces that are terrifically costly – both in terms of monetary damages and impacts on people’s lives – and yet somehow we don’t connect the dots. As you say, we could call this exhibit F and we have not connected the dots from A to B to C to D to E to F and also, I would say, to ExxonMobil and all of the fossil fuels companies that even today are continuing to explore for still more oil and gas reserves. That is a kind of craziness.

Roy Scranton, the author of your second choice, Learning to Die in the Anthropocene, writes: “civilizations have throughout history marched blindly toward disaster, because humans are wired to believe that tomorrow will be much like today”. He also says that “The biggest problem climate change poses isn’t how the Department of Defense should plan for resource wars, whether we should put up sea walls to protect [Manhattan], or when we should abandon Miami. It won’t be addressed by buying a Prius, or signing a treaty…The biggest problem we face is a philosophical one: understanding that this civilization is already dead.” It seems a very negative place to start.

It is a very dark book and, by recommending it, I’m not suggesting I necessarily agree with everything in it or even necessarily agree with his ultimately bleak assessment, but I do think it’s an extremely important book. I say that for two reasons. . .

Continue reading.

Written by LeisureGuy

11 September 2017 at 2:05 pm

Yuval Noah Harari (author of “Sapiens: A Brief History of Humankind”) answers some interesting questions

Andrew Anthony writes in the Guardian:

Last week, on his Radio 2 breakfast show, Chris Evans read out the first page of Sapiens, the book by the Israeli historian Yuval Noah Harari. Given that radio audiences at that time in the morning are not known for their appetite for intellectual engagement – the previous segment had dealt with Gary Barlow’s new tour – it was an unusual gesture. But as Evans said, “the first page is the most stunning first page of any book”.

If DJs are prone to mindless hyperbole, this was an honourable exception. The subtitle of Sapiens, in an echo of Stephen Hawking’s great work, is A Brief History of Humankind. In grippingly lucid prose, Harari sets out on that first page a condensed history of the universe, followed by a summary of the book’s thesis: how the cognitive revolution, the agricultural revolution and the scientific revolution have affected humans and their fellow organisms.

It is a dazzlingly bold introduction, which the remainder of the book lives up to on almost every page. Although Sapiens has been widely and loudly praised, some critics have suggested that it is too sweeping. Perhaps, but it is an intellectual joy to be swept along.

It’s one of those books that can’t help but make you feel smarter for having read it. Barack Obama and Bill Gates have undergone that experience, as have many others in the Davos crowd and Silicon Valley. The irony, perhaps, is that one of the book’s warnings is that we are in danger of becoming an elite-dominated global society.

At the centre of the book is the contention that what made Homo sapiens the most successful human species, supplanting rivals such as Neanderthals, was our ability to believe in shared fictions. Religions, nations and money, Harari argues, are all human fictions that have enabled collaboration and organisation on a massive scale.

Originally published in Hebrew in 2011, the book was translated into English three years later and became an international bestseller. It ranges across a multitude of disciplines with seemingly effortless scholarship, bringing together a keen understanding of history, anthropology, zoology, linguistics, philosophy, theology, economics, psychology, neuroscience and much else besides.

Yet the author of this accomplished and far-reaching book is a young Israeli historian whose career, up until that point, had been devoted to the relative academic backwater of medieval military history. Apparently, Sapiens is based on an introductory course to world history that Harari had to teach, after senior colleagues dodged the task. The story goes that the major Israeli publishers weren’t interested. No one saw international stardom beckoning.

Last year, Harari’s follow-up, Homo Deus: A Brief History of Tomorrow, was published in the UK, becoming another bestseller. It develops many of the themes explored in Sapiens, and in particular examines the possible impact of biotechnological and artificial intelligence innovation on Homo sapiens, heralding perhaps the beginning of a new bionic or semi-computerised form of human.

Again, it’s an exhilarating book that takes the reader deep into questions of identity, consciousness and intelligence, grappling with what kinds of choices and dilemmas a fully automated world will present us with.

Now 41, Harari grew up in a secular Jewish family in Haifa. He studied history at the Hebrew University of Jerusalem and completed his doctorate at Oxford. He is a vegan and he meditates for two hours a day, often going on extended retreats. He says it helps him focus on the issues that really matter. He lives with his husband on a moshav, an agricultural co-operative, outside Jerusalem. Being gay, he says, helped him to question received opinions. “Nothing should be taken for granted,” he has said, “even if everybody believes it.”

One of the pleasures of reading his books is that he continually calls on readers, both explicitly and implicitly, to think about what we know and what we think we know. And he has little time for fashionable stances.

He writes and speaks like a man who is not excessively troubled by doubt. If that makes him sound arrogant, let me clarify: he arrives at his conclusions after a great deal of research and contemplation. However, once he’s persuaded himself – and he says he always leaves it to the evidence to decide his thinking – he doesn’t hold back in his efforts to persuade the reader. It makes for what Jared Diamond called “unforgettably vivid language”.

Harari is a naturally gifted explainer, invariably ready with the telling anecdote or memorable analogy. As a result, it’s tempting to see him less as a historian than as some kind of all-purpose sage. We asked public figures and readers to pose questions for Harari, and many of these (below) were of a moral or ethical nature, seeking answers about what should be done, rather than about what has happened. But the Israeli seems used to the role, and perfectly happy to give his best shot at replying. A historian of the distant past and the near future, he has carved out a whole new discipline of his own. It’s a singular achievement by an impressively multiple-minded man.

We are living through a fantastically rapid globalisation. Will there be one global culture in the future or will we maintain some sort of deliberate artificial tribal groupings?
Helen Czerski, physicist
I’m not sure if it will be deliberate but I do think we’ll probably have just one system, and in this sense we’ll have just one civilisation. In a way this is already the case. All over the world the political system of the state is roughly identical. All over the world capitalism is the dominant economic system, and all over the world the scientific method or worldview is the basic worldview through which people understand nature, disease, biology, physics and so forth. There are no longer any fundamental civilisational differences.

Andrew Anthony: Are you saying that the much-maligned Francis Fukuyama was correct in his analysis of the end of history?
It depends how you understand the end of history. If you mean the end of ideological clashes, then no, but if you mean the creation of a single civilisation which encompasses the whole world, then I think he was largely correct.

What is the biggest misconception humanity has about itself?
Lucy Prebble, playwright
Maybe it is that by gaining more power over the world, over the environment, we will be able to make ourselves happier and more satisfied with life. Looking again from a perspective of thousands of years, we have gained enormous power over the world and it doesn’t seem to make people significantly more satisfied than in the stone age.

Is there a real possibility that environmental degradation will halt technological progress?
TheWatchingPlace, posted online
I think it will be just the opposite – that, as the ecological crisis intensifies, the pressure for technological development will increase, not decrease. I think that the ecological crisis in the 21st century will be analogous to the two world wars in the 20th century in serving to accelerate technological progress.

As long as things are OK, people would be very careful in developing or experimenting in genetic engineering on humans or giving artificial intelligence control of weapon systems. But if you have a serious crisis, caused for example by ecological degradation, then people will be tempted to try all kinds of high-risk, high-gain technologies in the hope of solving the problem, and you’ll have something like the Manhattan Project in the second world war.

What role does morality play in a future world of artificial intelligence, artificial life and immortality? Will an aspiration to do what is good and right still motivate much of the race?
Andrew Solomon, writer . . .

Continue reading.

The “fictions” Harari talks about are memes by another name. Their meme character is evident from his descriptions. And I like the picture he draws of those who are so caught up in memes that they ignore the non-meme world (i.e., physical reality). As you might expect, this does such people no good at all. They’re trapped in the meme equivalent of a Venus flytrap.

I’m not sure he sees how memes evolve independently of us.

Written by LeisureGuy

5 September 2017 at 5:31 pm

I was born in poverty in Appalachia. ‘Hillbilly Elegy’ doesn’t speak for me.

Betsy Rader is an employment lawyer at Betsy Rader Law LLC, located in Chagrin Falls, Ohio. She is running as a Democrat to represent Ohio’s 14th Congressional District in the U.S. House. She writes in the Washington Post:

J.D. Vance’s book “Hillbilly Elegy,” published last year, has been assigned to students and book clubs across the country. Pundits continue to cite it as though the author speaks for all of us who grew up in poverty. But Vance doesn’t speak for me, nor do I believe that he speaks for the vast majority of the working poor.

From a quick glance at my résumé, you might think me an older, female version of Vance. I was born in Appalachia in the 1960s and grew up in the small city of Newark, Ohio. When I was 9, my parents divorced. My mom became a single mother of four, with only a high school education and little work experience. Life was tough; the five of us lived on $6,000 a year.

Like Vance, I attended Ohio State University on scholarship, working nights and weekends. I graduated at the top of my class and, again like Vance, attended Yale Law School on a financial-need scholarship. Today, I represent people who’ve been fired illegally from their jobs. And now that I’m running for Congress in Northeast Ohio, I speak often with folks who are trying hard but not making much money.

A self-described conservative, Vance largely concludes that his family and peers are trapped in poverty due to their own poor choices and negative attitudes. But I take great exception when he makes statements such as: “We spend our way into the poorhouse. We buy giant TVs and iPads. Our children wear nice clothes thanks to high-interest credit cards and payday loans. We purchase homes we don’t need, refinance them for more spending money, and declare bankruptcy. . . . Thrift is inimical to our being.”

Who is this “we” of whom he speaks? Vance’s statements don’t describe the family in which I grew up, and they don’t describe the families I meet who are struggling to make it in America today. I know that my family lived on $6,000 per year because as children, we sat down with pen and paper to help find a way for us to live on that amount. My mom couldn’t even qualify for a credit card, much less live on credit. She bought our clothes at discount stores.

Thrift was not inimical to our being; it was the very essence of our being.

With lines like “We choose not to work when we should be looking for jobs,” Vance’s sweeping stereotypes are shark bait for conservative policymakers. They feed into the mythology that the undeserving poor make bad choices and are to blame for their own poverty, so taxpayer money should not be wasted on programs to help lift people out of poverty. Now these inaccurate and dangerous generalizations have been made required college reading.

Here is the simple fact: Most poor people work. Seventy-eight percent of families on Medicaid include a household member who is working. People work hard in necessary and important jobs that often don’t pay them enough to live on. For instance, child-care workers earn an average of $22,930 per year, and home health aides average $23,600. (Indeed, it is a sad irony that crucial jobs around caretaking and children have always paid very little.)

The problem with living in constant economic insecurity is not a lack of thrift, it is that people in these circumstances are always focused on the current crisis. They can’t plan for the future because they have so much to deal with in the present. And the future seems so bleak that it feels futile to sacrifice for it. What does motivate most people is the belief that the future can be better and that we have a realistic opportunity to achieve it. But sometimes that takes help.

Yes, I worked hard, but I didn’t just pull myself up by my bootstraps. And neither did Vance. The truth is that people helped us out: My public school’s guidance counselor encouraged me to go to college. The government helped us out: I received scholarships and subsidized federal loans to help pay my educational expenses. The list of helpers goes on. . .

Continue reading.

Written by LeisureGuy

2 September 2017 at 1:34 pm

Posted in Books, Daily life, Education

Learning to Learn: You, Too, Can Rewire Your Brain

John Schwartz reports in the NY Times:

The studio for what is arguably the world’s most successful online course is tucked into a corner of Barb and Phil Oakley’s basement, a converted TV room that smells faintly of cat urine. (At the end of every video session, the Oakleys pin up the green fabric that serves as the backdrop so Fluffy doesn’t ruin it.)

This is where they put together “Learning How to Learn,” taken by more than 1.8 million students from 200 countries, the most ever on Coursera. The course provides practical advice on tackling daunting subjects and on beating procrastination, and the lessons engagingly blend neuroscience and common sense.

Dr. Oakley, an engineering professor at Oakland University in Rochester, Mich., created the class with Terrence Sejnowski, a neuroscientist at the Salk Institute for Biological Studies, and with the University of California, San Diego.

Prestigious universities have spent millions and employ hundreds of professionally trained videographers, editors and producers to create their massive open online courses, known as MOOCs. The Oakleys put together their studio with equipment that cost $5,000. They figured out what to buy by Googling “how to set up a green screen studio” and “how to set up studio lighting.” Mr. Oakley runs the camera and teleprompter. She does most of the editing. The course is free ($49 for a certificate of completion — Coursera won’t divulge how many finish).

“It’s actually not rocket science,” said Dr. Oakley — but she’s careful where she says that these days. When she spoke at Harvard in 2015, she said, “the hackles went up”; she crossed her arms sternly by way of grim illustration.

This is home-brew, not Harvard. And it has worked. Spectacularly. The Oakleys never could have predicted their success. Many of the early sessions had to be trashed. “I looked like a deer in the headlights,” Dr. Oakley said. She would flub her lines and moan, “I just can’t do this.” Her husband would say, “Come on. We’re going to have lunch, and we’re going to come right back to this.” But he confessed to having had doubts, too. “We were in the basement, worrying, ‘Is anybody even going to look at this?’”

Dr. Oakley is not the only person teaching students how to use tools drawn from neuroscience to enhance learning. But her popularity is a testament to her skill at presenting the material, and also to the course’s message of hope. Many of her online students are 25 to 44 years old, likely to be facing career changes in an unforgiving economy and seeking better ways to climb new learning curves.

Dr. Oakley’s lessons are rich in metaphor, which she knows helps get complex ideas across. The practice is rooted in the theory of neural reuse, which states that metaphors use the same neural circuits in the brain as the underlying concept does, so the metaphor brings difficult concepts “more rapidly on board,” as she puts it.

She illustrates her concepts with goofy animations: There are surfing zombies, metabolic vampires and an “octopus of attention.” Hammy editing tricks may have Dr. Oakley moving out of the frame to the right and popping up on the left, or cringing away from an animated, disembodied head that she has put on the screen to discuss a property of the brain.

Sitting in the Oakleys’ comfortable living room, with its solid Mission furniture and mementos of their world travels, Dr. Oakley said she believes that just about anyone can train himself to learn. “Students may look at math, for example, and say, ‘I can’t figure this out — it must mean I’m really stupid!’ They don’t know how their brain works.”

Her own feelings of inadequacy give her empathy for students who feel hopeless. “I know the hiccups and the troubles people have when they’re trying to learn something.” After all, she was her own lab rat. “I rewired my brain,” she said, “and it wasn’t easy.”

As a youngster, she was not a diligent student. “I flunked my way through elementary, middle school and high school math and science,” she said. She joined the Army out of high school to help pay for college and received extensive training in Russian at the Defense Language Institute. Once out, she realized she would have a better career path with a technical degree (specifically, electrical engineering), and set out to tackle math and science, training herself to grind through technical subjects with many of the techniques of practice and repetition that she had used to let Russian vocabulary and declension soak in.

Along the way, she met Philip Oakley — in, of all places, Antarctica. It was 1983, and she was working as a radio operator at the Amundsen-Scott South Pole Station. (She has also worked as a translator on a Russian trawler. She’s been around.) Mr. Oakley managed the garage at the station, keeping machinery working under some of the planet’s most punishing conditions.

She had noticed him largely because, unlike so many men at the lonely pole, he hadn’t made any moves on her. “You can be ugly as a toad out there and you are the most popular girl,” she said. She found him “comfortably confident.” After he left a party without even saying hello, she told a friend she’d like to get to know him better. The next day, he was waiting for her at breakfast with a big smile on his face. Three weeks later, on New Year’s Eve, he walked her over to the true South Pole and proposed at the stroke of midnight. A few weeks after that, they were “off the ice” in New Zealand and got married.

Dr. Oakley recounts her journey in both of her best-selling books: “A Mind for Numbers: How to Excel at Math and Science (Even if You Flunked Algebra)” and, out this past spring, “Mindshift: Break Through Obstacles to Learning and Discover Your Hidden Potential.” The new book is about learning new skills, with a focus on career switchers. And yes, she has a MOOC for that, too. . . .

Continue reading. The column includes four techniques that may be helpful (I’ve added a minimal timer sketch of my own after them):

Four Techniques to Help You Learn

FOCUS/DON’T The brain has two modes of thinking that Dr. Oakley simplifies as “focused,” in which learners concentrate on the material, and “diffuse,” a neural resting state in which consolidation occurs — that is, the new information can settle into the brain. (Cognitive scientists talk about task-positive networks and default-mode networks, respectively, in describing the two states.) In diffuse mode, connections between bits of information, and unexpected insights, can occur. That’s why it’s helpful to take a brief break after a burst of focused work.

TAKE A BREAK To accomplish those periods of focused and diffuse-mode thinking, Dr. Oakley recommends what is known as the Pomodoro Technique, developed by one Francesco Cirillo. Set a kitchen timer for a 25-minute stretch of focused work, followed by a brief reward, which includes a break for diffuse reflection. (“Pomodoro” is Italian for tomato — some timers look like tomatoes.) The reward — listening to a song, taking a walk, anything to enter a relaxed state — takes your mind off the task at hand. Precisely because you’re not thinking about the task, the brain can subconsciously consolidate the new knowledge. Dr. Oakley compares this process to “a librarian filing books away on shelves for later retrieval.”

As a bonus, the ritual of setting the timer can also help overcome procrastination. Dr. Oakley teaches that even thinking about doing things we dislike activates the pain centers of the brain. The Pomodoro Technique, she said, “helps the mind slip into focus and begin work without thinking about the work.”

“Virtually anyone can focus for 25 minutes, and the more you practice, the easier it gets.”

PRACTICE “Chunking” is the process of creating a neural pattern that can be reactivated when needed. It might be an equation or a phrase in French or a guitar chord. Research shows that having a mental library of well-practiced neural chunks is necessary for developing expertise.

Practice brings procedural fluency, says Dr. Oakley, who compares the process to backing up a car. “When you first are learning to back up, your working memory is overwhelmed with input.” In time, “you don’t even need to think more than ‘Hey, back up,’ ” and the mind is free to think about other things.

Chunks build on chunks, and, she says, the neural network built upon that knowledge grows bigger. “You remember longer bits of music, for example, or more complex phrases in French.” Mastering low-level math concepts allows tackling more complex mental acrobatics. “You can easily bring them to mind even while your active focus is grappling with newer, more difficult information.”

KNOW THYSELF Dr. Oakley urges her students to understand that people learn in different ways. Those who have “racecar brains” snap up information; those with “hiker brains” take longer to assimilate information but, like a hiker, perceive more details along the way. Recognizing the advantages and disadvantages, she says, is the first step in learning how to approach unfamiliar material.
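
Since the Pomodoro Technique above is really just an alternating loop of focused work and diffuse-mode breaks, here is a minimal Python sketch of such a timer. This is my own illustration, not anything from the column or from Dr. Oakley; the 25/5 split and the four-cycle default are the commonly cited values, and every name in it is made up for the example.

    # A minimal Pomodoro timer sketch (illustrative only; assumes 25/5 intervals).
    import time

    def pomodoro(work_minutes=25, break_minutes=5, cycles=4):
        """Alternate focused-mode work with diffuse-mode breaks."""
        for cycle in range(1, cycles + 1):
            print(f"Cycle {cycle}: focus for {work_minutes} minutes.")
            time.sleep(work_minutes * 60)   # focused mode: stay on the task
            print("Break: walk, stretch, or listen to a song.")
            time.sleep(break_minutes * 60)  # diffuse mode: let the brain consolidate
        print("Set complete. Take a longer break before the next one.")

    if __name__ == "__main__":
        pomodoro()

Setting the timer and then actually walking away during the break is the whole trick: the reward that follows each 25-minute block is what gives the diffuse mode time to do its consolidation work.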

See also Mindset, by Carol Dweck.

Written by LeisureGuy

2 September 2017 at 11:44 am

Posted in Books, Education, Science
