Archive for August 15th, 2021
FDR and the social safety net
Heather Cox Richardson’s August 14 newsletter:
On this day in 1935, President Franklin Delano Roosevelt signed the Social Security Act into law. While FDR’s New Deal had put in place new measures to regulate business and banking and had provided temporary work relief to combat the Depression, this law permanently changed the nature of the American government.
The Social Security Act is known for its payments to older Americans, but it did far more than that. It established unemployment insurance; aid to homeless, dependent, and neglected children; funds to promote maternal and child welfare; and public health services. It was a sweeping reworking of the relationship of the government to its citizens, using the power of taxation to pool funds to provide a basic social safety net.
The driving force behind the law was FDR’s Secretary of Labor, Frances Perkins. She was the first woman to hold a position in the U.S. Cabinet and still holds the record for the longest tenure in that job: she served from 1933 to 1945.
She brought to the position a vision of government very different from that of the Republicans who had run it in the 1920s. While men like President Herbert Hoover had harped on the idea of a “rugged individualism” in which men worked their way up, providing for their families on their own, Perkins recognized that people in communities had always supported each other. The vision of a hardworking man supporting his wife and children was more myth than reality: her own husband suffered from bipolar disorder, making her the family’s primary support.
As a child, Perkins spent summers with her grandmother, with whom she was very close, in the small town of Newcastle, Maine, where she witnessed a supportive community. In college, at Mount Holyoke, she majored in chemistry and physics, but after a professor required students to tour a factory to observe working conditions, Perkins became committed to improving the lives of those trapped in industrial jobs. After college, Perkins became a social worker and, in 1910, earned a master’s degree in economics and sociology from Columbia University. She became the head of the New York office of the National Consumers League, urging consumers to use their buying power to demand better conditions and wages for the workers who made the products they were buying.
The next year, in 1911, she witnessed the Triangle Shirtwaist Fire in which 146 workers, mostly women and girls, died. They were trapped in the building when the fire broke out because the factory owner had ordered the doors to the stairwells and exits locked to make sure no one slipped outside for a break. Unable to escape the smoke and fire in the factory, the workers—some of them on fire—leaped from the 8th, 9th, and 10th floors of the building, dying on the pavement.
The Triangle Shirtwaist Fire turned Perkins away from voluntary organizations to improve workers’ lives and toward using the government to adjust the harsh conditions of industrialization. She began to work with the Democratic politicians at Tammany Hall, who presided over communities in the city that mirrored rural towns and who exercised a form of social welfare for their voters, making sure they had jobs, food, and shelter and that wives and children had a support network if a husband and father died. In that system, the voices of women like Perkins were valuable, for their work in the immigrant wards of the city meant that they were the ones who knew what working families needed to survive.
The overwhelming unemployment, hunger, and suffering caused by the Great Depression made Perkins realize that state governments alone could not adjust the conditions of the modern world to create a safe, supportive community for ordinary people. She came to believe, as she said: “The people are what matter to government, and a government should aim to give all the people under its jurisdiction the best possible life.”
Through her Tammany connections Perkins met FDR, and when he asked her to be his Secretary of Labor, she told him that she wanted the federal government to provide unemployment insurance, health insurance, and old-age insurance. She later recalled: “I remember he looked so startled, and he said, ‘Well, do you think it can be done?’”
Creating federal unemployment insurance became her primary concern. Congressmen had little interest in passing such legislation. They said they worried that unemployment insurance and federal aid to dependent families would undermine a man’s willingness to work. But Perkins recognized that those displaced by the Depression had added new pressure to the idea of old-age insurance.
In Long Beach, California, Dr. Francis Townsend had looked out of his window one day to see elderly women rooting through garbage cans for food. Appalled, he came up with a plan to help the elderly and stimulate the economy at the same time. Townsend proposed that the government provide every retired person over 60 years old with $200 a month, on the condition that they spend it within 30 days to keep the money circulating through the economy.
Townsend’s plan was wildly popular. More than that, though, it sparked people across the country to start coming up with their own plans for protecting the elderly and the nation’s social fabric, and together, they began to change the public conversation about social welfare policies.
They spurred Congress to action. Perkins recalled that Townsend “startled the Congress of the United States because the aged have votes. The wandering boys didn’t have any votes; the evicted women and their children had very few votes. If the unemployed didn’t stay long enough in any one place, they didn’t have a vote. But the aged people lived in one place and they had votes, so every Congressman had heard from the Townsend Plan people.”
FDR put together a committee to come up with . . .
The Afghanistan occupation and the Japan occupation
Noah Smith has an interesting post:
Everyone is talking about the Taliban’s swift reconquest of Afghanistan in the wake of the U.S. withdrawal. As usual, most Americans understand this event only through the lens of their domestic political viewpoints. Conservatives who just a few years ago were praising Trump’s new “America first” attitude and his withdrawal from Iraq are now wailing that Biden’s withdrawal from Afghanistan is the sign of a dying, decaying empire (that can of course only be restored by a conservative return to cultural and electoral dominance). Some on the left are decrying the U.S. “defeat”, raising the question of whether they think we should have continued occupying Afghanistan forever in order to secure “victory”. Others on the left are bewildered, bereft of talking points except to call for accepting a bunch of Afghan refugees (which of course is something we really ought to do).
Americans are approaching the situation this way because America is an insular country. Americans are among the least likely people to travel abroad, and our foreign language ability is among the world’s worst. Even our economy is unusually closed. When asked to identify Iran on a map, here is how Americans responded: . . .
I’m just sad they didn’t include the Western hemisphere on the map. Anyway, you get the point.
I’m no foreign policy expert, but I have lived overseas (about 4 years in Japan). That experience taught me how insular my own views of the world had been, and gave me a desire to bring the same perspective to my fellow countrymen. Realistically, though, this won’t happen, so instead all I can do is offer my thoughts on a blog.
Basically, my thought is this: Military occupations are much less able to transform countries than Americans tend to think. In particular, we should never go into a war expecting the outcome to look like post-WW2 Japan.
The Afghanistan War
I supported the Afghanistan War in 2001. I’m not a military interventionist in general — I strongly opposed the Iraq War just two years later, and protested against it. But in 2001, the case for war in Afghanistan seemed strong. America had suffered a huge, devastating attack on our territory; the terrorist group who perpetrated the attack was still at large; the Taliban government of Afghanistan was sheltering those terrorists. The case for war, as I saw it, had nothing to do with the odious nature of the Taliban regime — there are lots of odious regimes in the world, and we don’t go invading them just because they’re nasty and bad, nor should we. Instead, it was about eliminating a clear and present threat to the United States, and about punishing those who had been responsible for it.
Ten years later, that case for war still seemed strong. Bin Laden slept with the fishes. The leadership of al Qaeda had all been killed or captured, except for Ayman al-Zawahiri, a cranky old man who we seemed to leave in place in order to alienate as many people as possible before al Qaeda finally slipped into the history books. Though no one will ever say al Qaeda is dead, the centralized, competent organization that attacked us on 9/11 is certainly gone, and the name is now basically just a franchise used by a ragtag bunch of scattered local Islamist gangs who usually lose the wars they’re fighting in. Mullah Omar, the Taliban leader who chose to shelter and support al Qaeda, bought the farm in 2013 (though we didn’t know it til 2015).
In other words, America did what I saw us as having come to do. The threat (al Qaeda) was eliminated, and the punitive expedition seemed to have inflicted sufficient punishment on the people who sheltered them. Accordingly, the U.S. began to draw down troops, and by 2015 our military presence in the country was relatively minor. . .
Continue reading. There’s much more, and the bit about Japan is interesting (and convincing).
New edition of “The History of Jazz”
Ted Gioia writes at The Honest Broker:
Back in the early 1990s, Sheldon Meyer of Oxford University Press asked me to write a full history of jazz, from its origins to the current day—a book that would serve as the publishing house’s flagship work on the subject.
When Oxford University Press published Marshall Stearns’s The Story of Jazz in 1956, it was a milestone in music scholarship. For the first time, a major academic press was embracing jazz as a legitimate field of study. But by the 1990s, Stearns’s book was terribly out of date, and Oxford needed a new work to replace it in their offerings. My book was envisioned as that replacement.
I told Meyer that I would need at least 4-5 years to deliver a book on such an expansive topic. He accepted this timeline—he was a wise editor who took a long-term view of publishing, a rarity nowadays, and that’s why so many books he edited went on to win the Pulitzer or Bancroft prizes. I was blessed to have him as my editor, and wanted to work with him on this project. I managed to complete the manuscript in the promised time frame, and in 1997 my book The History of Jazz was published, a few days after my 40th birthday.
In retrospect, I view this moment as the key turning point in my vocation as a music historian. The History of Jazz would prove to be the bestselling jazz book of the next quarter century, selling hundreds of thousands of copies in English and various translations. It brought me into contact with readers all over the world, and put me in an enviable position. Music tends to be a young person’s game, and that’s true for writers as well as performers. Yet I somehow reversed the trend, finding a much larger readership after the age of 40 than I’d ever enjoyed as a young man. In the aftermath, everyone from the White House to the United Nations would contact me for guidance and advice on jazz-oriented projects, and I still hear daily from readers of this book who share their own jazz stories from all over the world. I never take that for granted, and have always felt gratitude to Sheldon and Oxford, but especially to these readers, who have stayed with me through so many subsequent books.
But the history of jazz is not a static subject. The music continues to morph and evolve. So I wrote an updated and expanded second edition of The History of Jazz, released in 2011. And ten years later, another upgrade was very much necessary. A few weeks ago, the new third edition of The History of Jazz was released—which has allowed me to bring this exciting story, once again, up to the current day.
Below is an extract from the new edition for my subscribers. It looks at the extraordinary conjunction of events spurring a resurgence of interest in jazz in the current moment.
For more information on the book, you may want to check out my recent interview for NPR, conducted by Natalie Weiner.
How Jazz Was Declared Dead—Then Came Roaring Back to Life
by Ted Gioia (from The History of Jazz, 2021 Edition)
I’ve heard many predictions about jazz over the years. The prognosticators typically serve up grim forecasts about the genre’s inevitable decline into irrelevancy or its survival on life support as a kind of musical museum exhibit celebrating past glories. Such prophecies aren’t much fun to consider—but they haven’t been very accurate either. None of these seers has anticipated what’s actually now happening on the jazz scene, a development as delightful as it has been unexpected. Jazz has somehow rediscovered its roots as populist music, embarking on a new and unscripted dialogue with mainstream culture. To some extent, jazz has even turned into a kind of talisman for forward-looking sounds in commercial music—with the same mass-market periodicals that published obituaries for the genre just a short while ago now proclaiming its hot new status.
Artists as different from each other as Kamasi Washington, Esperanza Spalding, Shabaka Hutchings, and Robert Glasper have shown that they can draw on the full range of current-day song styles without losing their jazz roots, and attract a young crossover audience who are energized and excited by this give-and-take. Pop culture stars, from Kendrick Lamar to Lady Gaga, have returned the favor, seeking out ways of uplifting their own artistry by incorporating jazz ingredients into their music. In the process, the whole notion of jazz as a niche genre for snobbish insiders has gotten overturned. Jazz is showing up with increasing frequency in tourist guides, suggested as the preferred evening’s entertainment in New York or London or Tokyo or some other travel destination. And even for stay-at-homes watching movies from the comfort of their couch, a surprising number of Hollywood offerings—La La Land, Green Book, Whiplash, Miles Ahead, Born to Be Blue, Soul—have served up jazz stories and songs with mainstream appeal.
Of course, not every jazz old-timer celebrates the music’s newfound popularity. Just as complaints could be heard in the 1980s and 1990s when the music gained wider respectability and made an alliance with academic and nonprofit institutions, a whole litany of different grievances has been raised now that the genre has seemingly reversed course and returned to the people. But the lessons of jazz history are fairly clear by now: complaints and denunciations by entrenched insiders are almost always a sign that something important is underway. In this instance, the new discourse between jazz and popular music seems not just a passing trend but the sign of an emerging ethos that might prove lasting and transformative.
It’s hard to pinpoint the moment a trend reverses. And in the case of jazz, it sometimes seemed as if its alleged downturn would never end—at least judging by the pessimistic media pronouncements on the art form made during the early years of the twenty-first century. Jazz’s problem, they declared, wasn’t like a bad haircut, something you could grow out of, or an embarrassing tattoo that a laser might zap away, but more like a death sentence. I still recall my dismay when The Atlantic ran an otherwise favorable review of one of my books under the dispiriting headline “The End of Jazz,” followed by a subhead that promised to explain “how America’s most vibrant music became a relic.” I was miffed, but I could hardly blame the author. He was simply stating the consensus view among opinion leaders.
That was back in 2012, but the notion that jazz was dead had been bouncing around for quite some time. In 2007, Esquire had published a similar article, proclaiming in its headline not only the “Death of Jazz,” but adding that the genre had been in decline since John Coltrane’s demise forty years earlier. Around that same time, critic Marc Myers published an article on his JazzWax website entitled “Who Killed Jazz and When?,” which reached a similar conclusion, but pinpointed an even earlier cause of decline—specifically, the decision by jazz bands in the late 1940s to stop playing for dancers. When CNN tackled the same matter, in an article entitled “When Jazz Stopped Being Cool,” the guilty parties were now the Beatles and rock & roll. Other pundits focused on different root causes for the music’s obsolescence, with everyone from elitist fans to narcissistic performers getting a share of the blame. But the final result was, as they saw it, hardly open to debate: jazz had been on life support for too long, and it was time to put the dear old thing out of its misery.
It’s now been several years since I’ve seen any of those anguished obituaries for jazz, and instead a different kind of news story has taken its place. Big font headlines now proclaim . . .