Becky Bach reports in Pacific Standard:
Thousands of veterans suffering from post-traumatic stress disorder rely on the Department of Veterans Affairs for relief. They might be better served, however, if they tapped the hard-won wisdom of incarcerated Vietnam veteran Michael “Doc” Piper.
Piper knows, though the VA has yet to acknowledge, that community service could be the most effective treatment available for PTSD, a debilitating condition marked by nightmares, anxiety, flashbacks, pain, anger, self-blame, alienation, and depression.
Despite his confinement in Soledad Correctional Training Facility, a California state prison, 67-year-old Piper is a professional volunteer. From a 106-square-foot former broom closet with no Internet access, Piper helps fellow incarcerated veterans access VA benefits. By helping others, Piper says he’s been able to cope with his anger, nightmares, and flashbacks. But he’s not the only one who understands the power of community service.
Mission Continues, a St. Louis-based organization, is generating national attention, including a June 2013 Time cover story, for its success helping veterans who served in Iraq and Afghanistan integrate into society. It’s a dire need: The VA has treated nearly 300,000 veterans from those conflicts for PTSD symptoms, according to a November report. Mission Continues, which was founded in 2007 by a group of veterans, places veterans in six-month service fellowships in community organizations across the country.
Fellows paint hospital walls, collect food donations, and plant gardens, developing career and life skills in the process. And although in-depth studies are lacking, an investigation (PDF) by Washington University in St. Louis social scientist Monica Matthieu found that Mission Continues helps.
Matthieu and her team surveyed 27 Mission Continues fellows, many of whom have been diagnosed with PTSD. Following their fellowship, 71 percent continued their education and 86 percent were able to find employment.
The VA currently assaults PTSD with a grab bag of treatments. It recommends (PDF) a combination of drugs, most commonly anti-depressants, and therapies including individual and group psychotherapy, hypnosis, and meditation. The department’s 2010 guidelines also recommend social and family skills training, job training, education, and spiritual support. VA therapists even teach stress-tolerance techniques.
For example, . . .
The creator of The Wire, David Simon, delivered an impromptu speech about the divide between rich and poor in America at the Festival of Dangerous Ideas in Sydney, and how capitalism has lost sight of its social compact. This is an edited extract:
America is a country that is now utterly divided when it comes to its society, its economy, its politics. There are definitely two Americas. I live in one, on one block in Baltimore that is part of the viable America, the America that is connected to its own economy, where there is a plausible future for the people born into it. About 20 blocks away is another America entirely. It’s astonishing how little we have to do with each other, and yet we are living in such proximity.
There’s no barbed wire around West Baltimore or around East Baltimore, around Pimlico, the areas in my city that have been utterly divorced from the American experience that I know. But there might as well be. We’ve somehow managed to march on to two separate futures and I think you’re seeing this more and more in the west. I don’t think it’s unique to America.
I think we’ve perfected a lot of the tragedy and we’re getting there faster than a lot of other places that may be a little more reasoned, but my dangerous idea kind of involves this fellow who got left by the wayside in the 20th century and seemed to be almost the butt end of the joke of the 20th century; a fellow named Karl Marx.
I’m not a Marxist in the sense that I don’t think Marxism has a very specific clinical answer to what ails us economically. I think Marx was a much better diagnostician than he was a clinician. He was good at figuring out what was wrong or what could be wrong with capitalism if it wasn’t attended to and much less credible when it comes to how you might solve that.
You know if you’ve read Capital or if you’ve got the CliffsNotes, you know that his imaginings of how classical Marxism – of how his logic would work when applied – kind of devolve into such nonsense as the withering away of the state and platitudes like that. But he was really sharp about what goes wrong when capital wins unequivocally, when it gets everything it asks for.
That may be the ultimate tragedy of capitalism in our time, that it has achieved its dominance without regard to a social compact, without being connected to any other metric for human progress.
We understand profit. In my country we measure things by profit. We listen to the Wall Street analysts. They tell us what we’re supposed to do every quarter. The quarterly report is God. Turn to face God. Turn to face Mecca, you know. Did you make your number? Did you not make your number? Do you want your bonus? Do you not want your bonus?
And that notion that capital is the metric, that profit is the metric by which we’re going to measure the health of our society is one of the fundamental mistakes of the last 30 years. I would date it in my country to about 1980 exactly, and it has triumphed.
Capitalism stomped the hell out of Marxism by the end of the 20th century and was predominant in all respects, but the great irony of it is that the only thing that actually works is not ideological, it is impure, has elements of both arguments and never actually achieves any kind of partisan or philosophical perfection.
It’s pragmatic, it includes the best aspects of socialistic thought and of free-market capitalism and it works because we don’t let it work entirely. And that’s a hard idea to think – that there isn’t one single silver bullet that gets us out of the mess we’ve dug for ourselves. But man, we’ve dug a mess.
After the second world war, the west emerged with the American economy coming out of its wartime extravagance, emerging as the best product. It was the best product. It worked the best. It was demonstrating its might not only in terms of what it did during the war but in terms of just how facile it was in creating mass wealth.
Plus, it provided a lot more freedom and was doing the one thing that guaranteed that the 20th century was going to be – and forgive the jingoistic sound of this – the American century.
It took a working class that had no discretionary income at the beginning of the century, which was working on subsistence wages. It turned it into a consumer class that not only had money to buy all the stuff that they needed to live but enough to buy a bunch of shit that they wanted but didn’t need, and that was the engine that drove us.
It wasn’t just that we could supply stuff, or that we had the factories or know-how or capital, it was that we created our own demand and started exporting that demand throughout the west. And the standard of living made it possible to manufacture stuff at an incredible rate and sell it.
And how did we do that?
We did that by not giving in to either side. That was the New Deal. That was the Great Society. That was all of that argument about collective bargaining and union wages and it was an argument that meant neither side gets to win.
Labour doesn’t get to win all its arguments, capital doesn’t get to. But it’s in the tension, it’s in the actual fight between the two, that capitalism actually becomes functional, that it becomes something that every stratum in society has a stake in, that they all share.
The unions actually mattered. The unions were part of the equation. It didn’t matter that they won all the time, it didn’t matter that they lost all the time, it just mattered that they had to win some of the time and they had to put up a fight and they had to argue for the demand and the equation and for the idea that workers were not worth less, they were worth more.
Ultimately we abandoned that and believed in the idea of trickle-down and the idea of the market economy and the market knows best, to the point where now libertarianism in my country is actually being taken seriously as an intelligent mode of political thought. It’s astonishing to me. But it is. People are saying I don’t need anything but my own ability to earn a profit. I’m not connected to society. I don’t care how the road got built, I don’t care where the firefighter comes from, I don’t care who educates the kids other than my kids. I am me. It’s the triumph of the self. I am me, hear me roar.
That we’ve gotten to this point is astonishing to me because basically in winning its victory, in seeing that Wall come down and seeing the former Stalinist state’s journey towards our way of thinking in terms of markets or being vulnerable, you would have thought that we would have learned what works. Instead we’ve descended into what can only be described as greed. This is just greed. This is an inability to see that we’re all connected, that the idea of two Americas is implausible, or two Australias, or two Spains or two Frances.
Societies are exactly what they sound like. If everybody is invested and if everyone just believes that they have “some”, it doesn’t mean that everybody’s going to get the same amount. It doesn’t mean there aren’t going to be people who are the venture capitalists who stand to make the most. It’s not each according to their needs or anything that is purely Marxist, but it is that everybody feels as if, if the society succeeds, I succeed, I don’t get left behind. And there isn’t a society in the west now, right now, that is able to sustain that for all of its population.
And so in my country you’re seeing a horror show. You’re seeing . . .
Paul Krugman patiently deals with the GOP’s benighted worldview yet again:
Six years have passed since the United States economy entered the Great Recession, four and a half since it officially began to recover, but long-term unemployment remains disastrously high. And Republicans have a theory about why this is happening. Their theory is, as it happens, completely wrong. But they’re sticking to it — and as a result, 1.3 million American workers, many of them in desperate financial straits, are set to lose unemployment benefits at the end of December.
Now, the G.O.P.’s desire to punish the unemployed doesn’t arise solely from bad economics; it’s part of a general pattern of afflicting the afflicted while comforting the comfortable (no to food stamps, yes to farm subsidies). But ideas do matter — as John Maynard Keynes famously wrote, they are “dangerous for good or evil.” And the case of unemployment benefits is an especially clear example of superficially plausible but wrong economic ideas being dangerous for evil.
Here’s the world as many Republicans see it: Unemployment insurance, which generally pays eligible workers between 40 and 50 percent of their previous pay, reduces the incentive to search for a new job. As a result, the story goes, workers stay unemployed longer. In particular, it’s claimed that the Emergency Unemployment Compensation program, which lets workers collect benefits beyond the usual limit of 26 weeks, explains why there are four million long-term unemployed workers in America today, up from just one million in 2007.
Correspondingly, the G.O.P. answer to the problem of long-term unemployment is to increase the pain of the long-term unemployed: Cut off their benefits, and they’ll go out and find jobs. How, exactly, will they find jobs when there are three times as many job-seekers as job vacancies? Details, details.
Proponents of this story like to cite academic research — some of it from Democratic-leaning economists — that seemingly confirms the idea that unemployment insurance causes unemployment. They’re not equally fond of pointing out that this research is two or more decades old, has not stood the test of time, and is irrelevant in any case given our current economic situation.
The view of most labor economists now is that unemployment benefits have only a modest negative effect on job search — and in today’s economy have no negative effect at all on overall employment. On the contrary, unemployment benefits help create jobs, and cutting those benefits would depress the economy as a whole.
Ask yourself how, exactly, ending unemployment benefits would create more jobs. It’s true that some of the currently unemployed, finding themselves even more desperate than before, might manage to snatch jobs away from those who currently have them. But what would give businesses a reason to employ more workers as opposed to replacing existing workers?
You might be tempted to argue that more intense competition among workers would lead to lower wages, and that cheap labor would encourage hiring. But that argument involves a fallacy of composition. Cut the wages of some workers relative to those of other workers, and those accepting the wage cuts may gain a competitive edge. Cut everyone’s wages, however, and nobody gains an edge. All that happens is a general fall in income — which, among other things, increases the burden of household debt, and is therefore a net negative for overall employment.
The point is that employment in today’s American economy is limited by demand, not supply. Businesses aren’t failing to hire because they can’t find willing workers; they’re failing to hire because they can’t find enough customers. And slashing unemployment benefits — which would have the side effect of reducing incomes and hence consumer spending — would just make the situation worse.
Still, don’t expect prominent Republicans to change their views, except maybe to come up with additional reasons to punish the unemployed. For example, Senator Rand Paul recently cited research suggesting that the long-term unemployed have a hard time re-entering the work force as a reason to, you guessed it, cut off long-term unemployment benefits. You see, those benefits are actually a “disservice” to the unemployed. . .
Continue reading. The GOP consistently claims that the way to help those who need help is to give them no help at all, and that they become much better off as a result. I don’t get it. When someone needs help, they need help. It’s not hard to understand.
The NY Times has an excellent editorial:
Beyond new state efforts to restrict women’s access to proper reproductive health care, another, if quieter, threat is posed by mergers between secular hospitals and Catholic hospitals operating under religious directives from the nation’s Roman Catholic bishops. These directives, which oppose abortions, inevitably collide with a hospital’s duty to provide care to pregnant women in medical distress. This tension lies at the heart of a federal lawsuit filed last week by the American Civil Liberties Union.
The suit was brought on behalf of a Michigan woman, Tamesha Means, who says she was subjected to substandard care at a Catholic hospital — the only hospital in her county — after her water broke at 18 weeks of pregnancy. Doctors in such circumstances typically induce labor or surgically remove the fetus to reduce the woman’s chances of infection. But according to the complaint, doctors acting in accordance with the bishops’ directives did not inform Ms. Means that her fetus had virtually no chance of surviving or that terminating her pregnancy was the safest treatment option.
Despite acute pain and bleeding, Ms. Means was sent home twice, and when she returned a third time with a fever from her untreated infection, she miscarried even as the paperwork was being prepared to discharge her again. The fetus died soon after.
The case has gained attention because Ms. Means is not suing the hospital for medical negligence but the United States Conference of Catholic Bishops. The A.C.L.U. is arguing, on her behalf, that having issued the mandates and made them conditions of hospital affiliation, the conference is responsible for “the unnecessary trauma and harm” that Ms. Means and “other pregnant women in similar situations have experienced at Catholic-sponsored hospitals.”
How the suit will play out is unclear, but it showcases an important issue. Catholic hospitals account for about 15 percent of the nation’s hospital beds and, in many communities, are the only hospital facilities available. Allowing religious doctrine to prevail over the need for competent emergency care and a woman’s right to complete and accurate information about her condition and treatment choices violates medical ethics and existing law.
The problem Ms. Means encountered is not unique or limited to her particular medical needs. In 2010, the Diocese of Phoenix punished a nun and stripped a hospital of its affiliation after doctors there performed an abortion to save a mother’s life.
In a statement last Friday, the president of the bishops’ group, Archbishop Joseph Kurtz, said that the religious directives did not encourage or require substandard medical treatment. He also portrayed the case as an attack on religious freedom — the same unpersuasive argument the bishops are making against the new federal health care law’s requirement that all plans include contraception coverage.
The bishops are free to worship as they choose and advocate for their beliefs. But those beliefs should not shield the bishops from legal accountability when church-affiliated hospitals following their rules cause patients harm.
I blogged this on Saturday, but in case you didn’t see it, here it is.
Quite easy in practice, once you get in the habit of accepting the cash register receipt, folding it up, and putting it in your pocket so you can later enter the total.
A sobering column from TomDispatch.com:
Sometimes a single story has a way of standing in for everything you need to know. In the case of the up-arming, up-armoring, and militarization of police forces across the country, there is such a story. Not the police, mind you, but the campus cops at Ohio State University now possess an MRAP; that is, a $500,000, 18-ton, mine-resistant, ambush-protected armored vehicle of a sort used in the Afghan War and, as Hunter Stuart of the Huffington Post reported, built to withstand “ballistic arms fire, mine fields, IEDs, and nuclear, biological, and chemical environments.” Sounds like just the thing for bouts of binge drinking and post-football-game shenanigans.
That MRAP came, like so much other equipment police departments are stocking up on — from tactical military vests, assault rifles, and grenade launchers to actual tanks and helicopters — as a freebie via a Pentagon-organized surplus military equipment program. As it happens, police departments across the country are getting MRAPs like OSU’s, including the Dakota County Sheriff’s Office in Minnesota. It’s received one of 18 such decommissioned military vehicles already being distributed around that state. So has Warren County which, like a number of counties in New York state, some quite rural, is now deploying Afghan War-grade vehicles. (Nationwide, rural counties have received a disproportionate percentage of the billions of dollars worth of surplus military equipment that has gone to the police in these years.)
When questioned on the utility of its new MRAP, Warren County Sheriff Bud York suggested, according to the Post-Star, the local newspaper, that “in an era of terrorist attacks on U.S. soil and mass killings in schools, police agencies need to be ready for whatever comes their way… The vehicle will also serve as a deterrent to drug dealers or others who might be contemplating a show of force.” So, breathe a sigh of relief, Warren County is ready for the next al-Qaeda-style show of force and, for those fretting about how to deal with such things, there are now 165 18-ton “deterrents” in the hands of local law enforcement around the country, with hundreds of requests still pending.
You can imagine just how useful an MRAP is likely to be if the next Adam Lanza busts into a school in Warren County, assault rifle in hand, or takes over a building at Ohio State University. But keep in mind that we all love bargains and that Warren County vehicle cost the department less than $10. (Yes, you read that right!) A cornucopia of such Pentagon “bargains” has, in the post-9/11 years, played its part in transforming the way the police imagine their jobs and in militarizing the very idea of policing in this country.
Just thinking about that MRAP at OSU makes me feel like I grew up in Neolithic America. After all, when I went to college in the early 1960s, campus cops were mooks in suits. Gun-less, they were there to enforce such crucial matters as “parietal hours.” (If you’re too young to know what they were, look it up.) At their worst, they faced what in those still civilianized (and sexist) days were called “panty raids,” but today would undoubtedly be seen as potential manifestations of a terrorist mentality. Now, if there is a sit-in or sit-down on campus, as infamously at the University of California, Davis, during the Occupy movement, expect that the demonstrators will be treated like enemies of the state and pepper-sprayed or perhaps Tased. And if there’s a bona fide student riot in town, the cops will now roll out an armored vehicle (as they did recently in Seattle).
By the way, don’t think it’s just the weaponry that’s militarizing the police. It’s a mentality as well that, like those weapons, is migrating home from our distant wars. It’s a sense that the U.S., too, is a “battlefield” and that, for instance, those highly militarized SWAT teams spreading to just about any community you want to mention are made up of “operators” (a “term of art” from the special operations community) ready to deal with threats to American life.
Embedding itself chillingly in our civilian world, that battlefield is proving mobile indeed. As Chase Madar wrote for TomDispatch the last time around, it leads now to the repeated handcuffing of six- and seven-year-olds in our schools as mini-criminals for offenses that once would have been dealt with by a teacher or principal, not a cop, and at school, not in jail or court. Today, Madar returns to explain just how this particular nightmare is spreading into every crevice of American life. Tom
The Over-Policing of America
Police Overkill Has Entered the DNA of Social Policy
By Chase Madar
If all you’ve got is a hammer, then everything starts to look like a nail. And if police and prosecutors are your only tool, sooner or later everything and everyone will be treated as criminal. This is increasingly the American way of life, a path that involves “solving” social problems (and even some non-problems) by throwing cops at them, with generally disastrous results. Wall-to-wall criminal law encroaches ever more on everyday life as police power is applied in ways that would have been unthinkable just a generation ago.
By now, the militarization of the police has advanced to the point where “the War on Crime” and “the War on Drugs” are no longer metaphors but bland understatements. There is the proliferation of heavily armed SWAT teams, even in small towns; the use of shock-and-awe tactics to bust small-time bookies; the no-knock raids to recover trace amounts of drugs that often result in the killing of family dogs, if not family members; and in communities where drug treatment programs once were key, the waging of a drug version of counterinsurgency war. (All of this is ably reported on journalist Radley Balko’s blog and in his book, The Rise of the Warrior Cop.) But American over-policing involves far more than the widely reported up-armoring of your local precinct. It’s also the way police power has entered the DNA of social policy, turning just about every sphere of American life into a police matter.
The School-to-Prison Pipeline
It starts in our schools, where discipline is increasingly outsourced to police personnel. What not long ago would have been seen as normal childhood misbehavior – doodling on a desk, farting in class, a kindergartener’s tantrum – can leave a kid in handcuffs, removed from school, or even booked at the local precinct. Such “criminals” can be as young as seven-year-old Wilson Reyes, a New Yorker who was handcuffed and interrogated under suspicion of stealing five dollars from a classmate. (Turned out he didn’t do it.)
Though it’s a national phenomenon, Mississippi currently leads the way in turning school behavior into a police issue. The Hospitality State has imposed felony charges on schoolchildren for “crimes” like throwing peanuts on a bus. Wearing the wrong color belt to school got one child handcuffed to a railing for several hours. All of this goes under the rubric of “zero-tolerance” discipline, which turns out to be just another form of violence legally imported into schools.
Despite a long-term drop in youth crime, the carceral style of education remains in style. Metal detectors — a horrible way for any child to start the day — are installed in ever more schools, even those with sterling disciplinary records, despite the demonstrable fact that such scanners provide no guarantee against shootings and stabbings.
Every school shooting, whether in Sandy Hook, Connecticut, or Littleton, Colorado, only leads to more police in schools and more arms as well. It’s the one thing the National Rifle Association and Democratic senators can agree on. There are plenty of successful ways to run an orderly school without criminalizing the classroom, but politicians and much of the media don’t seem to want to know about them. The “school-to-prison pipeline,” a jargon term coined by activists, is entering the vernacular.
Go to Jail, Do Not Pass Go
Even as simple a matter as getting yourself from point A to point B can quickly become a law enforcement matter as travel and public space are ever more aggressively policed. Waiting for a bus? Such loitering just got three Rochester youths arrested. Driving without a seat belt can easily escalate into an arrest, even if the driver is a state judge. (Notably, all four of these men were black.) If the police think you might be carrying drugs, warrantless body cavity searches at the nearest hospital may be in the offing — you will be sent the bill later.
Air travel entails increasingly intimate pat-downs and arbitrary rules that many experts see as nothing more than “security theater.” As for staying at home, it carries its own risks as Harvard professor Henry Louis Gates found out when a Cambridge police officer mistook him for a burglar and hauled him away — a case that is hardly unique.
Overcriminalization at Work
Office and retail work might seem like an unpromising growth area for police and prosecutors, but criminal law has found its way into the white-collar workplace, too.
You know, we’re watching this happen. It’s going on, and we are just watching (probably because the FBI and NSA seem to respect no bounds of privacy). But at least it’s being done out in the open, before our very eyes.
I was just watching a kind of Chinese Godfather, quite good but with bouts of graphic violence—New World, on Netflix Watch Instantly—and I got to thinking about how you rise in such an organization—or, indeed, in politics, in the business world, wherever.
Setting aside pure luck and favoritism, advancement means getting to know people—not merely as acquaintances, and probably not as friends, but knowing a lot of people very well, so that you know how they will respond to various things, what pleases them, what they fear, what they want, and all that. And not just individuals, either: you have to learn who wields the real power in each group, how the groups are connected, and what influences them. You have to know whom to stroke, whom to ignore, and whom to fight.
It takes a lot of knowing, and that is probably why those who achieve power and position generally are older: while there are books on negotiating and managing and the like, the practical skills, naturally enough, require practice.
And it struck me that what is being learned are patterns. Our pattern recognition engine has to work overtime—that is, for a long time—to suss out the complex patterns of a large organization, including the outside influences on it.
I realize that patterns turn up everywhere, but much of our learning and culture is pattern-based: when you learn a game (say, Go or Chess), you learn the rules that govern allowable patterns, and then you play games until you start to recognize patterns, at which point you start to learn. And as you learn, you are discovering more and more complex patterns—some you know, some you’re just starting to work out, some remain to be discovered. And the board games are simple compared to the complexities of a large organization and all the patterns, internal and external, that are learned in order to achieve a prominent position in the organization.
Obviously, our pattern-recognition abilities are formidable, and I got to wondering why. Pattern-recognition of a sort occurs at the most basic level of life, as proteins “recognize” molecules and all the subcellular making and matching of patterns goes on. But take it up to the level of vertebrates: animals clearly recognize and use patterns to some degree: predators use the patterns of their prey in order to hunt.
In animals there is not the kind of conscious recognition we bring to patterns—granted, that may not matter—but the question is how we got so very good at patterns, so good that we can speak (patterns), make music (patterns), appreciate music (patterns), and so on.
Well, being able to learn/recognize any pattern is an evolutionary advantage over not being able to learn/recognize one, so as soon as the ability appeared, it would encounter strongly favorable natural selection. And the more patterns an organism can recognize, the better off it is in comparison to its more limited fellows. Thus it would seem that pattern-recognition capability would advance quickly, with the ability to recognize more patterns generally a benefit.
So in modern humans we have advanced pattern-recognition ability, which means we can recognize extremely complex patterns involving many aspects (visual, lingual, action, context, etc.), but of course the more complex patterns take longer to work out.
I imagine someone will bring up Alexander the Great, but of course the pattern he created was quite fragile—lack of sufficient experience in making/recognizing patterns?—and it all fell apart when he died: making a pattern that would endure was not something he achieved.
And it was an excellent movie. It truly is a Chinese Godfather of a sort. And during the movie, the viewer has to work out the patterns in play. Worth seeing, but watch out for the violence: not a kid’s movie by any means. If you see the movie, you’ll understand why the above occurred to me.
And, it occurs to me, human culture is a way to preserve and pass on the patterns we’ve worked out (some of which correspond to reality, some of which do not).
Interesting article in Salon by Lisa Wade:
Of all people in America, adult, white, heterosexual men have the fewest friends. Moreover, the friendships they have, if they’re with other men, provide less emotional support and involve lower levels of self-disclosure and trust than other types of friendships. When men get together, they’re more likely to do stuff than have a conversation. Friendship scholar Geoffrey Greif calls these “shoulder-to-shoulder” friendships, contrasting them to the “face-to-face” friendships that many women enjoy. If a man does have a confidant, three-quarters of the time it’s a woman, and there’s a good chance she’s his wife or girlfriend.
When I first began researching this topic I thought, surely this is too stereotypical to be true. Or, if it is true, I wondered, perhaps the research is biased in favor of female-type friendships. In other words, maybe we’re measuring male friendships with a female yardstick. It’s possible that men don’t want as many or the same kinds of friendships as women.
But they do. When asked about what they desire from their friendships, men are just as likely as women to say that they want intimacy. And, just like women, their satisfaction with their friendships is strongly correlated with the level of self-disclosure. Moreover, when asked to describe what they mean by intimacy, men say the same thing as women: emotional support, disclosure and having someone to take care of them.
Men desire the same level and type of intimacy in their friendships as women, but they aren’t getting it.
In an effort to understand why men’s friendships are less intimate than women’s, psychologist Niobe Way interviewed boys about their friendships in each year of high school. She found that younger boys spoke eloquently about their love for and dependence on their male friends. In fact, research shows that boys are just as likely as girls to disclose personal feelings to their same-sex friends and they are just as talented at being able to sense their friends’ emotional states.
But, at about age 15 to 16 — right at the same age that the suicide rate of boys increases to four times the rate of girls — boys start reporting that they don’t have friends and don’t need them. Because Way interviewed young men across each year of high school, she was able to document this shift. One boy, Justin, said this in his first year, when he was 15:
[My best friend and I] love each other… that’s it… you have this thing that is deep, so deep, it’s within you, you can’t explain it. It’s just a thing that you know that person is that person… I guess in life, sometimes two people can really, really understand each other and really have a trust, respect and love for each other.
By his senior year, however, this is what he had to say about friendship:
[My friend and I] we mostly joke around. It’s not like really anything serious or whatever… I don’t talk to nobody about serious stuff… I don’t talk to nobody. I don’t share my feelings really. Not that kind of person or whatever… It’s just something that I don’t do.
During these years, young men are . . .
Juan Cole posts at Informed Comment:
Jon Schwartz ( @tinyrevolution ) posted this to Twitter. It is a side-by-side comparison of a passage from “1984” with the news report from a former senior FBI official that the FBI can turn on the laptop cameras of individuals without activating the red light that shows the camera is operating.
The Washington Post broke the story. If the FBI is doing this without a warrant it is yet another nail in the coffin of the US 4th Amendment, which guarantees people the right not to have government snoop through their personal effects without evidence of wrongdoing and a judge’s permission.
The sphere is the ideal shape for a slow-melting ice cube: maximum volume for minimum melting area = least dilution. I remember that my tutor Duncan McDonald commented at a party that the best punch he had ever enjoyed was at a party in the Highlands: the bowl contained the punch and one enormous cubical block of ice. He dipped out a cup and found that it was straight single-malt scotch. The idea of the ice cube was, as with the sphere, to minimize surface area and maximize volume, only they couldn’t make an ice sphere—or it was too difficult.
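The geometry claim is easy to check numerically: for a fixed volume, a sphere exposes roughly 19 percent less surface than a cube of the same volume. A quick sketch, using nothing beyond the standard volume and surface-area formulas:

```python
from math import pi

def sphere_area_for_volume(v):
    # V = (4/3)*pi*r^3  ->  r = (3V / 4*pi)^(1/3);  A = 4*pi*r^2
    r = (3 * v / (4 * pi)) ** (1 / 3)
    return 4 * pi * r ** 2

def cube_area_for_volume(v):
    # V = s^3  ->  s = V^(1/3);  A = 6*s^2
    s = v ** (1 / 3)
    return 6 * s ** 2

v = 100.0  # any volume; the ratio is the same at every scale
print(sphere_area_for_volume(v) / cube_area_for_volume(v))  # ~0.806
```

So the Highlands punch bowl's big cube was a good second-best: less surface than lots of small cubes, though still about a fifth more than a sphere of equal volume.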
Well, it’s child’s play for me, now that I’ve purchased (in the interests of research) one set of these Tovolo molds and one set of these Stone Cask Ice Rounds. I’ve been using them—well, one—in my iced tea, and it is very nice: it’s as large a sphere as will fit in the glass (and it occurs to me that there may be some standard interior diameters for drinking glasses?), and it does melt more slowly than do my usual small ice cubes. Of the two, I give the nod to Stone Cask: not so fussy, easier to use, better spheres. Just do as it says and don’t fill to more than 1/2″ from the fill hole: otherwise, the water expands up through the hole and then freezes, in effect bolting the ice sphere to that half of the mold. You have to break off the bolt to get it out.
I also got this enormous-cube tray, and I do like the enormous cubes: very nice in a Manhattan. And indeed the dilution is noticeably slowed.
UPDATE: One drawback to having one enormous ice cube/sphere: you get no clink from the cubes.
And, unfortunately for her, Missouri is one of the 24 states that refused to expand Medicaid (though it would have cost them nothing), so many people in those states are out of luck:
I think it’s worth noting that the legislators who refused to accept the expansion of Medicaid do not fall into that gap, and they couldn’t care less about those who do. Obviously.
Here’s the story, by Charles Ornstein in Pacific Standard:
For Missouri public radio reporter Harum Helmy, the Affordable Care Act is more than just a story she covers. It is also a story she’s living.
“I know — an uninsured health reporter,” she wrote to me last month. “The joke’s not lost on me.”
Helmy, 23, a part-time reporter/producer for KBIA in Columbia, Missouri, recently completed her coursework at the University of Missouri. She’s on her first professional job. At the station, she covers Obamacare, among other things. But she doesn’t make much money, and if the law worked as it was intended, she would be covered by Missouri’s Medicaid program beginning January 1.
That wasn’t meant to be.
As signed by President Obama, the Affordable Care Act (ACA) would have required every state to expand its Medicaid program for the poor to include adults earning less than 138 percent of the federal poverty level. Those earning more than that, up to four times the poverty level, would qualify for subsidies to purchase health insurance in marketplaces.
But the Supreme Court ruled last year that states could opt out of the Medicaid expansion without consequence, and Missouri along with 24 other states have done just that. The problem is that the law didn’t include subsidies for people in those states who earn less than the federal poverty level to buy coverage through the exchanges—they were supposed to be covered under Medicaid.
That’s the gap in which Helmy sits.
She earns less than the poverty level ($10 an hour for 20 hours per week) and qualifies neither for Medicaid nor a subsidy. Helmy was born in Texas and is a U.S. citizen, though her parents live in Indonesia. While she attended classes at the university, her parents paid for her health coverage.
According to the Kaiser Family Foundation, a non-partisan think tank, “In states that do not expand Medicaid, nearly 5 million poor uninsured adults have incomes above Medicaid eligibility levels but below poverty and may fall into a ‘coverage gap’ of earning too much to qualify for Medicaid but not enough to qualify for Marketplace premium tax credits.” In Missouri, 193,000 people, including Helmy, fall into the gap, Kaiser estimates.
On paper, the Medicaid expansion seems like a great deal. The federal government has agreed to pick up 100 percent of the cost of the expansion for the first three years, phasing down to 90 percent in 2020. But officials in states that have declined to take part view Medicaid as a broken program. They don’t trust the federal government to keep its funding pledge and do not believe they have adequate state funds to cover their portion.
Missouri Governor Jay Nixon, a Democrat, wants to expand Medicaid in his state, but the Republican-controlled Legislature won’t go along with it.
Helmy discussed her situation in a podcast in October (around the 12-minute mark). “I would just get a little bit personal here and say I’m one of those people,” she said. “I’m in this weird gap where I need insurance, my employer doesn’t give me insurance, but I don’t make enough to get a subsidy.”
I asked her what it felt like to be affected by the act. . . .
I think as people in the 24 states become aware that their legislators turned down free money that would have enabled them to have healthcare insurance, and they see that people making less than they do are covered and people making more than they do are covered, while they themselves are left out in the cold, they are going to get amazingly angry. We may see some interesting surprises in the November 2014 elections in those states.
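The coverage-gap rule the article describes reduces to a simple conditional on income relative to the federal poverty level. Here is a sketch of that logic; the default FPL figure is the 2013 single-person guideline ($11,490), used purely for illustration, and the real eligibility rules have many more wrinkles (household size, immigration status, and so on):

```python
def coverage_status(income, fpl=11490, state_expanded=False):
    """Classify a person under the ACA rules described above.

    income: annual income in dollars
    fpl: federal poverty level for the household (2013 single-person
         guideline used as an illustrative default)
    state_expanded: did the state accept the Medicaid expansion?
    """
    if state_expanded and income < 1.38 * fpl:
        return "Medicaid"          # expansion covers up to 138% of FPL
    if fpl <= income <= 4 * fpl:
        return "marketplace subsidy"  # subsidies run from 100% to 400% of FPL
    if not state_expanded and income < fpl:
        return "coverage gap"      # too poor for a subsidy, no Medicaid
    return "no assistance"

# Helmy: $10/hour, 20 hours/week, ~ $10,400/year, in a non-expansion state
print(coverage_status(10 * 20 * 52))  # -> coverage gap
```

Run the same income through an expansion state (`state_expanded=True`) and she would qualify for Medicaid, which is exactly the gap the article describes.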
A series of charts that compares various aspects of healthcare:
Total expenditures, with proportion covered by government and not
Out-of-pocket expenses by country
Proportion of people with public insurance that buy private insurance
Total healthcare expenditures by country
Proportion of people who do not seek medical aid because of costs
Costs for routine doctor visits by country
Prescription drug and screening costs
And yet I believe that many still hold to the idea that the US offers “the best healthcare system in the world.”
Quite a few conservative voices are raised in an outcry about contraception coverage under the Affordable Care Act, which requires companies with more than 50 employees to provide health insurance coverage that includes, among other things, contraceptive services. NOTE: There is no requirement that any individual actually use such services: that is left up to the personal religious convictions of the individual. Businesses, however, hate to pay money, so some are saying that they should not be required to pay for insurance that includes contraceptive coverage because they (the business owners) don’t believe in it. In other words, the business owners want to impose their own religious views on their employees, which I think is wrong.
So a question to ask those espousing such a view: “Do you believe that a large business owned by Christian Scientists should be allowed not to offer healthcare insurance at all, since the owners do not believe in medical services in general?”
UPDATE: Emily Baxter has an interesting post at ThinkProgress on the general topic of healthcare insurance and religious liberty: Despite The Fights Over Religious Liberty, Obamacare Doesn’t Have To Be ‘Girl Versus God’
I put together a little spreadsheet to help me find out (and control) how much we spend at the supermarket. I call it “grocery”, but it includes all household items I buy there: cleaning supplies, paper products, kitty litter, and the like, along with food. I simply tuck the cash-register tape into my pocket and enter those at home when I have time. I realize that there’s probably a wonderful smartphone app for this, but I am still computer-oriented.
Next to each purchase entry I note any items that significantly added to the cost or that struck me as worth noting.
The USDA provides the average spent for food at four budget levels in a series of tables updated monthly: here they are. Example: for a family of two aged 51-70, the cost per month for food in Oct 2013 averages:
Thrifty plan: $362.40
Low-cost plan: $467.70
Moderate-cost plan: $582.90
Liberal plan: $700.10
Since a fair amount of non-food spending is included when using the totals from the supermarket cash-register receipts, I went with the Liberal plan. Our actual amount spent purely on food is less than the Liberal plan. (We probably spend close to the Moderate-cost plan.)
In fact, I am pretty sloppy on some of the details. Besides including money spent at the supermarket on things like laundry detergent, paper towels, kitty litter, and aluminum foil as if those were “food,” I generally don’t enter into the spreadsheet the cost of restaurant meals. I think of those as “entertainment,” and so it doesn’t get counted. It’s not a problem for us: we eat out rarely, so that cost is not significant in our situation. We just pay for that from our personal spending allowance. You should do what fits your own circumstances.
I’ve found that in practice it’s quite easy to accept the cash register receipts, fold them up, and keep them in my shirt pocket until I’m ready to enter the totals.
Here’s the spreadsheet. Enjoy. After you’ve used it a while, I’d be interested in hearing your thoughts.
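For anyone who would rather script this than keep a spreadsheet, the bookkeeping amounts to summing receipt totals by month and comparing against the USDA figure. A minimal sketch; the budget constant is the Liberal-plan figure quoted above, and the receipt entries are made up for illustration:

```python
from collections import defaultdict

# Oct 2013 Liberal-plan figure for two adults aged 51-70, from the USDA tables
USDA_LIBERAL_TWO_ADULTS_51_70 = 700.10

def monthly_totals(receipts):
    """receipts: list of (date string 'YYYY-MM-DD', register total) pairs."""
    totals = defaultdict(float)
    for date, amount in receipts:
        totals[date[:7]] += amount  # group by 'YYYY-MM'
    return dict(totals)

# Hypothetical cash-register-tape entries
receipts = [
    ("2013-10-03", 87.45),
    ("2013-10-11", 102.30),
    ("2013-10-19", 95.60),
    ("2013-10-27", 110.25),
]

for month, spent in sorted(monthly_totals(receipts).items()):
    diff = spent - USDA_LIBERAL_TWO_ADULTS_51_70
    status = "over" if diff > 0 else "under"
    print(f"{month}: ${spent:.2f} ({status} budget by ${abs(diff):.2f})")
```

As with the spreadsheet, the totals include non-food household items, so comparing against the Liberal plan rather than a lower one makes the same allowance.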
Good NY Times column from Paul Krugman:
Much of the media commentary on President Obama’s big inequality speech was cynical. You know the drill: it’s yet another “reboot” that will go nowhere; none of it will have any effect on policy, and so on. But before we talk about the speech’s possible political impact or lack thereof, shouldn’t we look at the substance? Was what the president said true? Was it new? If the answer to these questions is yes — and it is — then what he said deserves a serious hearing.
And once you realize that, you also realize that the speech may matter a lot more than the cynics imagine.
First, about those truths: Mr. Obama laid out a disturbing — and, unfortunately, all too accurate — vision of an America losing touch with its own ideals, an erstwhile land of opportunity becoming a class-ridden society. Not only do we have an ever-growing gap between a wealthy minority and the rest of the nation; we also, he declared, have declining mobility, as it becomes harder and harder for the poor and even the middle class to move up the economic ladder. And he linked rising inequality with falling mobility, asserting that Horatio Alger stories are becoming rare precisely because the rich and the rest are now so far apart.
This isn’t entirely new terrain for Mr. Obama. What struck me about this speech, however, was what he had to say about the sources of rising inequality. Much of our political and pundit class remains devoted to the notion that rising inequality, to the extent that it’s an issue at all, is all about workers lacking the right skills and education. But the president now seems to accept progressive arguments that education is at best one of a number of concerns, that America’s growing class inequality largely reflects political choices, like the failure to raise the minimum wage along with inflation and productivity.
And because the president was willing to assign much of the blame for rising inequality to bad policy, he was also more forthcoming than in the past about ways to change the nation’s trajectory, including a rise in the minimum wage, restoring labor’s bargaining power, and strengthening, not weakening, the safety net.
And there was this: “When it comes to our budget, we should not be stuck in a stale debate from two years ago or three years ago. A relentlessly growing deficit of opportunity is a bigger threat to our future than our rapidly shrinking fiscal deficit.” Finally! Our political class has spent years obsessed with a fake problem — worrying about debt and deficits that never posed any threat to the nation’s future — while showing no interest in unemployment and stagnating wages. Mr. Obama, I’m sorry to say, bought into that diversion. Now, however, he’s moving on.
Still, does any of this matter? The conventional pundit wisdom of the moment is that Mr. Obama’s presidency has run aground, even that he has become irrelevant. But this is silly. In fact, it’s silly in at least three ways.
First, much of the current conventional wisdom involves extrapolating from Obamacare’s shambolic start, and assuming that things will be like that for the next three years. They won’t. HealthCare.gov is working much better, people are signing up in growing numbers, and the whole mess is already receding in the rear-view mirror.
Second, . . .
Basically, the GOP thinks that employers should not have to meet a minimum wage, no matter how small. Rep. Joe Barton (R-TX) and others have called for an end to the minimum wage—and pretty clearly believe that without it, employers would happily pay even lower wages.
I find this attitude astonishing. Obviously, not all conservatives think this way: rather than very low minimum wages combined with lots of government assistance, some are proposing a higher minimum wage—one that people could live on. Government assistance programs then would cost less and demand would also increase, helping the economy. And having a livable minimum wage does not, as it turns out, reduce employment.
Look at these posts (and charts): it’s an important issue. When companies like Walmart and McDonald’s pay their employees so poorly that the companies themselves recommend that employees apply for food stamps and collect food contributions for the holidays, they are obviously not paying enough. For some in the GOP to say that there should be no minimum wage at all is staggering. But: Joe Barton…
Why should there be a law defining a minimum wage? For the same reason we have most laws: there are some very bad actors out there. That’s why we need laws against robbery and rape, and it’s why we need a law setting a minimum to legal wages: to stop bad actors or, if they go ahead anyway, to provide a means to punish them.
Everyone seems to love Nelson Mandela now, though at the time the US and South Africa and others considered him little more than a terrorist. Juan Cole points out that the US and Israel strongly supported Apartheid (possibly because both nations have an underclass (African-Americans for the US, Palestinians for Israel) and thus aligned themselves with the white overlords of South Africa):
The attempt to make Nelson Mandela respectable is an ongoing effort of Western government spokesmen and the Western media.
He wasn’t respectable in the business circles of twentieth-century New York or Atlanta, or inside the Beltway of Washington, D.C. He wasn’t respectable for many of the allies of the United States in the Cold War, including Britain and Israel.
I visited Soweto in 2012 and went to Mandela’s old house. It was a moving experience. I don’t want him to be reduced to a commercialized icon on this day of all days.
We should remember that for much of the West in the Cold War, South Africa’s thriving capitalist economy was what was important. Its resources were important. Its government, solely staffed by Afrikaners and solely for Afrikaners, was seen as a counter-weight to Soviet and Communist influence in Africa. Washington in the 1980s obsessed about Cuba’s relationship to Angola (yes).
That the Afrikaners treated black Africans like dirt and discriminated against them viciously, denying them the franchise or any hint of equality, was considered in Western capitals at most an unfortunate idiosyncrasy that could not be allowed to interfere with the West’s dependence on Pretoria in fighting the international Left.
The African National Congress had attempted nonviolent protest in the 1950s, but the white Afrikaner government outlawed all those techniques and replied with deadly force. In the early 1960s when Nelson Mandela turned to sabotage, the United States was a nakedly capitalist country engaged in an attempt to ensure that peasants and workers did not come to power. It was a deeply racist society that practiced Apartheid, a.k.a. Jim Crow in its own South.
The US considered the African National Congress to be a form of Communism, and sided with the racist Prime Ministers Hendrik Verwoerd and P.W. Botha against Mandela.
Decades later, in the 1980s, the United States was still supporting the white Apartheid government of South Africa, where a tiny minority of Afrikaners dominated the economy and refused to allow black Africans to shop in their shops or fraternize with them, though they were happy to employ them in the mines. Ronald Reagan declared Nelson Mandela, then still in jail, a terrorist, and the US did not get around to removing him from the list until 2008! Reagan, while delivering pro forma denunciations of Apartheid or enforced black separation and subjugation, nevertheless opposed sanctions with teeth on Pretoria. Reagan let the racist authoritarian P.W. Botha come to Washington and met with him.
Likewise British PM Margaret Thatcher befriended Botha and castigated Mandela’s ANC as terrorists. As if the Afrikaners weren’t terrorizing the black majority! She may have suggested to Botha that he release Mandela for PR purposes, but there is not any doubt on whose side she stood.
The Israeli government had extremely warm relations with Apartheid South Africa, to the point where Tel Aviv offered the Afrikaners a nuclear weapon (presumably for brandishing at the leftist states of black Africa). That the Israelis accuse Iran of being a nuclear proliferator is actually hilarious if you know the history. Iran doesn’t appear ever to have attempted to construct a nuclear weapon, whereas Israel has hundreds and seems entirely willing to share.
In the US, the vehemently anti-Palestinian Anti-Defamation League in San Francisco spied on American anti-Apartheid activists on behalf of the Apartheid state. If the ADL ever calls you a racist, you can revel in the irony.
Ronald Reagan imagined that there were “moderates” in the Botha government. There weren’t. He wanted “constructive engagement” with them. It failed. The Afrikaners imposed martial law. Reagan tried to veto Congressional sanctions on Pretoria in 1986 but Congress over-rode him.
Nelson Mandela was a socialist who believed in the ideal of economic equality or at least of a decent life for everyone in society. He was also a believer in parliamentary government. So, he was a democratic socialist.
The current Republican Party . . .
In Salon Joan Walsh also points to the weirdness of the current GOP’s praise for Mandela, when the GOP basically opposes everything Mandela stood for:
I tried to honor Nelson Mandela on the day of his death, and love my political enemies. But the white-washing of Mandela’s legacy, as well as the role of the United States in supporting both apartheid and Mandela’s long imprisonment, has to be rebutted.
It began on Mandela’s 95th birthday in July, when House Speaker John Boehner had the audacity to declare in a tribute “At times it can almost feel like we are talking about an old friend.”
It got much worse when Sen. Ted Cruz announced Thursday night: “Nelson Mandela will live in history as an inspiration for defenders of liberty around the globe.”
But Cruz’s political heroes opposed Mandela as a terrorist and a communist, and there’s little doubt the red-baiting Texas senator would have done the same had he been in Congress back then. (The Daily Beast’s Peter Beinart and Foreign Policy’s Sam Kleiner (from July) have the two best pieces about “apartheid amnesia” I’ve read.)
It’s shocking how little American leaders of both parties did to oppose the rise and consolidation of the brutal apartheid regime in the ‘50s and ’60s, but it was Richard Nixon who developed closer ties. The anti-apartheid movement of the 1970s and ’80s – where Barack Obama got his political start; I covered the University of Wisconsin’s successful divestment movement with the Daily Cardinal in 1978 — was demonized as the far left at the time. Moderates proposed alternatives like the Sullivan Principles, named after Rev. Leon Sullivan, a General Motors board member, which tried (and failed) to impose a code of conduct on companies doing business in South Africa (Sullivan eventually agreed they weren’t enough).
Ronald Reagan made it a priority to fight domestic and international divestment efforts — efforts that, in the end, helped pressure the South African government to enter negotiations and free Nelson Mandela. Reagan vetoed an amazingly (if belatedly) bipartisan bill to impose tough sanctions on the apartheid regime. Of course then-Congressman Dick Cheney had voted against the sanctions in 1986, and he defended his position while running for vice president in 2000, telling ABC: ”The ANC was then viewed as a terrorist organization. … I don’t have any problems at all with the vote I cast 20 years ago.”
The Heritage Foundation was a clubhouse for apartheid backers; as late as 1990, when Mandela had been freed from prison and traveled to the U.S., Heritage suggested he was a terrorist, “not a freedom fighter.” Grover Norquist advised pro-apartheid South African student groups and declared that the issue “is the one foreign policy debate that the Left can get involved in and feel that they have the moral high ground,” while insisting that it was a “complicated situation.” It was not.
As late as 2003, the National Review attacked Mandela for opposing the Iraq war. His “vicious anti-Americanism and support for Saddam Hussein should come as no surprise,” NR wrote, “given his longstanding dedication to communism and praise for terrorists.”
It’s also disrespecting Mandela to leave his radicalism out of his tributes. For a time he believed ending apartheid would require armed resistance, and although he eventually renounced violence, he refused to do so as a condition of being released from prison. He was a revolutionary who believed in a radical redistribution of wealth, and a global warrior against poverty, to the end. Yes, it’s important to remember his legacy of reconciliation, and love, toward white South Africans who had brutalized him. But it’s equally important to remember the commitment to equality that let him endure prison, and adopt reconciliation as the best strategy to achieve freedom and justice.
So I’m not dwelling on the hypocrisy of the right at this point, but I can’t ignore it either. . .
The common thought is yes, but there are counterarguments, as Zazie Todd points out in an article in Pacific Standard:
Did you over-indulge during Thanksgiving? Could a dog be the answer? A new meta-analysis by an international team investigates whether dog owners are more physically active than people without dogs.
Contrary to popular belief, not all pet dogs are walked. And it’s possible that dog owners spend time walking their pets at the expense of participating in sports or going to the gym. A 155-pound adult burns 493 kcal per hour playing soccer or using a rowing machine at a moderate pace, compared to 211 kcal walking. So the question is whether, on average, people who own dogs get more physical activity than those who don’t.
This is of particular interest to public health specialists who want to know how to leverage your dog to make you more active. The scientists, writing in the Journal of Physical Activity and Health, explain that “considering the large proportion of dog owners, and that many dogs enjoy being walked, dog walking could provide a potentially viable strategy for increasing population levels of physical activity.”
The team, led by the University of Western Australia’s Hayley Christian, analyzed 29 research studies conducted between 1990 and 2010, mainly in the United States and Australia. They looked at dog owners and non-dog owners of all ages, from children to seniors.
Their results showed that dog owners . . .
Christian Christensen has an interesting post at Informed Comment:
The Public Professor: Dissent in Commodified Higher Education
Or…What Kind of University Will My Daughter Attend in 2027?
The following is the text of my public Professorial Installation lecture given at Uppsala University. These lectures have been given at Uppsala University for centuries, and are intended for a broad audience.
In 1967, in a piece entitled The Responsibility of Intellectuals, Noam Chomsky wrote the following:
It is the responsibility of intellectuals to speak the truth and to expose lies. This, at least, may seem enough of a truism to pass over without comment. Not so, however. For the modern intellectual, it is not at all obvious.
About one week from today my daughter will celebrate her second birthday. This means that she will be entering university – should she choose to go – in the year 2027. Part of my talk will be about what I see as the role of the university professor in a highly mediated environment, and in relation to what Noam Chomsky said about public intellectuals. But, it is also in part about my daughter, and the future of universities in what is rapidly becoming a highly commercialized academic environment.
As a media and communications scholar, many people take it for granted that I am able to communicate effectively in public fora. Communication to the public is not, of course, the central role of the communications scholar. We analyze and investigate various phenomena related to media and communications, but that does not necessarily mean that we are “good communicators” ourselves. In actual fact, this is probably one of the weaknesses of those of us who work in academia: that is, our inability to take the fascinating and critical ideas that we discuss in our journal articles and in our books, and translate them into what we might want to call, “popular language.”
In the academic world, the presentation of intellectual material in popular form is generally looked down upon. I am educated in the United States, where the position of the “public intellectual” is significantly less defined (and respected) than it is here in Sweden and Europe. It is, I feel, a central duty for those of us working within academia to take the material that we do research on and to discuss it publicly, to make public – in some form and in some way – the knowledge that we have spent years gathering and shaping.
What does this issue – being a public professor – have to do with my daughter, and what does this have to do with social media? I see these three issues as inter-linked. One of the things that I am most worried about in relation to my daughter starting university in 2027 is whether or not the university will come to exist in a form that we recognize today. What I mean by this is: a space within contemporary society not entirely dictated by commercial interests and considerations. It is one of the things that I am grateful for: that, as an employee of a university, at least to some extent, I work within a space where my thinking can be divorced from purely profit-making and commercial considerations.
Spaces such as these are becoming increasingly rare. The media, urban spaces, politics are all zones where the communication that we encounter (from text to visuals to speech) are soaked in the logic of the commercial. We are surrounded by advertising, from the moment we wake up in the morning, to the time we spend walking on the streets, to the very logos that we wear on our bodies in the form of clothing. Our media systems are almost exclusively commercial, and even countries with a history of public service broadcasting have seen that history slowly erased, replaced with a commercialized reality.
As capitalism continues its march forward, there exists a drive to locate new elements of our existence that have yet to be turned into products to be bought and sold. Even our personal experiences have become fair game. The social media site Facebook essentially commodifies various elements of our private life: our thoughts, our pictures, our likes, our dislikes, our families, our friendships.
However, I do believe that social media – and I recognize that the very term “social media” is problematic – provide opportunities. I do not wish to stand here and sound like a techno-phobe or neo-Luddite, and one of the positive byproducts of the development of the internet, digital technologies and social media has been the ability of what we might wish to call “ordinary citizens” to make their voices heard. Now, again, let me say that this ability has been vastly overblown by the mainstream media. The vast majority of blog posts, videos on YouTube, postings to Facebook and tweets on Twitter fall into digital black holes, never to be seen or heard by the billions of users around the globe.
But, I myself have a blog. I use Facebook. I use Twitter. This is because opportunities do exist. Recent events in north Africa and the global Occupy Wall Street movement have shown that digital technologies can be utilized by ordinary citizens – those not wealthy or privileged enough to own a newspaper or television station – for the greater good. Digital media use is not the ONLY factor in these cases, but it is A factor that cannot simply be dismissed. In the same way I would argue that academics, those of us employed as public sector workers, should make the most of these technologies in order to spread the information that we gather. To spread the research, the knowledge, the critical thinking that we have spent years and years cultivating.
Universities have become increasingly commodified: universities in the UK charge students tuition fees, and we in Sweden have begun to charge international students tuition fees, things that have been done in my own country, the United States, for a number of years. Commodification was, for a long period, seen as anathema to higher education in Europe, but, as time has gone by, we have seen the increasing commodification of university life. In the same way, departments that are considered to be “unprofitable” – in other words, they do not have large numbers of students, or do not produce “cutting edge” research that attracts the interest of outside financiers – simply begin to disappear. Language departments and niche intellectual areas of inquiry struggle financially, and are therefore not “of value” to universities.
If we look forward to 2027, when my daughter will begin at university, it is critical to ask whether the departments I have just discussed will still exist. Will the majority of universities, for example, have a French department? Will universities and their leaders be willing to stand up and defend the existence of departments that are, in fact, vital symbols of what a university SHOULD be in a modern society? That is: a space, a bastion for free thinking, outside of market constraints and outside of market logic.
What will the 2027 university look like? . . .
I have also noticed how activities and interests tend nowadays to be devalued and restricted if they aren’t part of some commercial enterprise.