Later On

A blog written for those whose interests more or less match mine.

Archive for the ‘Memes’ Category

Parts & Recreation: Revell’s world of plastic models


Ed Sexton, a former race car driver and a longtime manager at Revell, practicing his favorite hobby: building tiny plastic model cars.

Jeff Greenwald writes in Craftsmanship magazine:

1. “Very Much an Art”   
2. A Uniquely American Industry
3. Industrial Ikebana   
4. Models of Obsession  
5. Could Revell Take on Lego?  
6. A Physical Story

My first plastic model, financed by weeks of snow shoveling, was Revell’s 1965 Gemini spacecraft. The kit had 93 parts, including two Lilliputian astronauts that I manipulated—with real envy—into the impossibly cramped capsule that would carry them into orbit. I remember bits of the process: the pages of the Long Island Press, spread over the kitchen table; the dizzying aroma of Testor’s glue; the UNITED STATES decals that seemed permanently attached to their backing until they suddenly slid off, in useless fragments, onto the painted plastic.

Over the years I built scores of models. I was a geeky adolescent outsider, sneaking into American pop culture through tiny plastic doors. While my peers were collecting Beatles singles, I exulted in the 1966 Batmobile that perched on my desk, honoring me with its silver rocket tubes and fine orange piping. A panoply of popular movie monsters snarled on my bookshelves. Each one had taken hours to assemble, but what else was I doing? Pong was still six years away.

Five decades later, in November 2014, Warner Brothers re-released the entire original series of 120 Batman episodes. The news inspired an immediate visit to the neighborhood hobby shop, even though I hadn’t been inside one in decades.

In the 1960s and 70s, plastic models had sprung—as effortlessly as Pop-Tarts—from the aerospace programs, car designers and TV shows they mimicked. What were today’s inspirations? Once I arrived in the hobby shop, what amazed me most was that plastic models still existed—thousands of them, including a vintage Batmobile.  Yet unlike the models I built as a kid, most of these now bore a “Made in China” disclaimer. Even Revell, a company whose very logo looks like an American flag, had outsourced. But Revell’s home office was still in Illinois, apparently going strong. How could this be?

“VERY MUCH AN ART”

Sprawled over the flatlands some 30 miles northwest of Chicago, the boundaries of Elk Grove Village embrace the largest industrial park in the United States. More than 3,600 businesses have set up branches or headquarters in this former farming community. Next to Chicago itself, it’s the second largest manufacturing area in the country. Incongruously, the town still hosts its namesake: a herd of elk imported from the plains of Montana in the 1920s, now living in resigned boredom near the eastern edge of the Busse Woods Forest Preserve.

Brian Eble, vice president of marketing for Revell—still America’s premier model company—met me at the breakfast buffet of Elk Grove’s Comfort Inn, hand outstretched. Eble grew up on an Illinois farm and looks like a middle-aged superhero: close-cropped gray hair, a strong jaw, broad shoulders. An avid builder as a kid, he spent breakfast waxing philosophical about how model making had changed since our childhoods.

“Take a model car,” he suggested. “They used to carve the originals out of bass wood, and fashion the mold from that. Now, of course, it’s all done with computers. But the magic is the same. You’re taking a real car,” he said, lifting his java, “and shrinking it down to the size of this cup.

“Here’s the question,” he said. “How do you infuse craftsmanship into . . .

Continue reading. There’s much more, including many more photos.

Written by LeisureGuy

20 January 2021 at 2:07 pm

“Sense of Entitlement”: Rioters Faced No Consequences Invading State Capitols. No Wonder They Turned to the U.S. Capitol Next.


Economics has the term “moral hazard,” which refers to the lack of incentive to guard against a risk when one is protected from its consequences, e.g., by a bailout. This issue was discussed extensively during the 2008 bailout of the big banks, and indeed, since the banks were shielded from the consequences of their actions, they quickly returned to their old (and profitable) ways.

It strikes me that the lack of consequences for various offenses against the government (starting with, say, the 2014 armed Bundy standoff) has over time resulted in the insurrection in DC — and indeed many of the participants think they should not in any way face consequences for their actions.

Jeremy Kohler reports in ProPublica:

The gallery in the Idaho House was restricted to limited seating on the first day of a special session in late August. Lawmakers wanted space to socially distance as they considered issues related to the pandemic and the November election.

But maskless protesters shoved their way past Idaho State Police troopers and security guards, broke through a glass door and demanded entry. They were confronted by House Speaker Scott Bedke, a Republican. He decided to let them in and fill the gallery.

“You guys are going to police yourselves up there, and you’re going to act like good citizens,” he told the invaders, according to a YouTube video of the incident.

“I just thought that, on balance, it would be better to let them go in and defuse it … rather than risk anyone getting hurt or risk tearing up anything else,” Bedke said of the protesters in an interview last week. He said he talked to cooler heads in the crowd “who saw that it was a situation that had gotten out of control, and I think on some level they were very apologetic.”

That late-summer showdown inside the Statehouse in Boise on Aug. 24 showed supporters of President Donald Trump how they could storm into a seat of government to intimidate lawmakers with few if any repercussions. The state police would say later that they could not have arrested people without escalating the potential for violence and that they were investigating whether crimes were committed. No charges have been filed. The next day, anti-government activist Ammon Bundy and two others were arrested when they refused to leave an auditorium in the Statehouse and another man was arrested when he refused to leave a press area.

In a year in which state governments around the country have become flashpoints for conservative anger about the coronavirus lockdown and Trump’s electoral defeat, it was right-wing activists — some of them armed, nearly all of them white — who forced their way into state capitols in Idaho, Michigan and Oregon. Each instance was an opportunity for local and national law enforcement officials to school themselves in ways to prevent angry mobs from threatening the nation’s lawmakers.

But it was Trump supporters who did the learning. That it was possible — even easy — to breach the seats of government to intimidate lawmakers. That police would not meet them with the same level of force they deployed against Black Lives Matter protesters. That they could find sympathizers on the inside who might help them.

And they learned that criminal charges, as well as efforts to make the buildings more secure, were unlikely to follow their incursions. In the three cases, police made only a handful of arrests.

The failure to stop state capitol invasions is especially chilling after the attack on the U.S. Capitol last week, which left five dead, including a police officer, as lawmakers met to certify the election of President-elect Joe Biden.

Experts and elected officials said the lack of action by lawmakers and police created an environment that encouraged political violence. The FBI has warned of armed protests occurring in all 50 state capitols in the run-up to the inauguration on Wednesday. Authorities in both Washington and state capitols have dramatically strengthened security.

“Eventually, you get to the point of entitlement where you can get away with anything and there will never be any accountability,” the Idaho House minority leader, Ilana Rubel, a Democrat, said. “I don’t know that (Bedke) was wrong under the circumstances, but it adds up to creating a sense of entitlement.”

Bedke said he saw no correlation between the events in Boise and Washington. But domestic terror experts said in interviews that the statehouse invasions likely created a sense of impunity among right-wing activists. The feeling grew throughout the year as Trump praised gun-carrying activists at state capitols as “very good people” and emboldened the insurrectionists in Washington.

Amy Cooter, a Vanderbilt University sociologist and expert in the militia movement, said the U.S. Capitol attack may have been less likely to occur if the violence in state capitols had been met with harsher punishment.

What’s more, she said that authorities who failed to take action against protesters earlier may find it difficult to do so now.

While many Trump supporters already see their First Amendment rights as being under attack, they may see efforts to block them from state capitols as an attack on their Second Amendment rights, she said, further legitimizing their need to stand up to what they perceive as tyranny.

When officials acquiesce to demands, “it typically makes these folks feel like those are ‘constitutional’ officials who support their general aims, which can then embolden them against officials they believe to be the opposite, that is, officials they believe to be betraying their oaths to the people,” Cooter said.

If extremist groups “believe they have been given allowances in the past and are not moving forward, this can further reinforce that notion of officials who are derelict in their duty, officials who should be removed and, depending on what group we’re talking about, possibly officials who should be confronted with force.”

Days after Trump tweeted “LIBERATE MICHIGAN,” protesters taking part in an “American Patriot Rally” outside the Michigan Capitol in Lansing on April 30 swarmed into the building demanding an end to the stay-at-home order put in place by Gov. Gretchen Whitmer to combat the COVID-19 pandemic.

The group, which numbered in the hundreds, included several heavily armed men. Few wore face coverings or observed social distancing. A line of state police troopers and other Capitol employees held the mob back from entering the House floor.

“We had hundreds of individuals storm our Capitol building,” state Rep. Sarah Anthony said in an interview. “No, lives were not lost, blood was not shed, property was not damaged, but I think they saw how easy it was to get into our building and they could get away with that type of behavior and there would be little to no consequences.”

Some armed invaders entered the Senate gallery. While none of the protesters faced charges, two of the men seen in a photo posted by state Sen. Dayna Polehanki looking down on lawmakers would be among the 14 people charged months later in a plot to kidnap Whitmer and bomb the state Capitol.

“It made national and international . . .

Continue reading. There’s much more — other statehouses, for example.

Written by LeisureGuy

19 January 2021 at 12:55 pm

Turn Off the Gaslight: Manipulation through mindgames


Ramani Durvasula, a licensed clinical psychologist in private practice in California, professor of psychology at California State University, Los Angeles, visiting professor at the University of Johannesburg, and author of numerous books, writes in Aeon:

‘He didn’t mean anything by it, stop making such a big deal out of it.’

‘Here, let me take care of it, you don’t know what you are doing.’

‘You’re too sensitive.’

‘Stop overreacting.’

‘You keep imagining things.’

‘That’s not how it happened.’

‘Your memory seems to be slipping.’

Such comments undermine our trust in ourselves and our belief in what we know. More than that, they trespass on our sense of identity. The more we hear such phrases, the more we stop trusting ourselves. When another person becomes a gatekeeper to our reality, then we’re in a precarious spot – vulnerable to further manipulation and control. This reality-doubting is called ‘gaslighting’.

As a psychologist in practice, I often see my role as the person who turns off the gaslights. I work with survivors of relationships with high-conflict, antagonistic, rigid, entitled, dysregulated people. These might be their partners, parents, adult children, siblings or colleagues. Once we remove the gaslight, and the house lights come on, my clients recognise that this one difficult person in their lives was the tip of a dysfunctional iceberg.

The term gaslighting derives from theatre and film. Patrick Hamilton’s play Gas Light (1938) was adapted as the British film Gaslight in 1940 and the American classic of the same name in 1944. To this day, Gaslight, a reference to the flickering gaslights featured in the drama, remains a masterclass on how one predatory partner captivates and then slowly undermines the other.

The play and films introduced the term ‘gaslighting’ into our vernacular to refer to a specific type of manipulation – one in which a person’s reality itself is hijacked by another. This can also be manifested by minimisation, deflection, denial and coercive control. The term is now ubiquitous, and we apply it not just to close relationships but also to any reality-bending that is generated by institutions, media and leaders. The genius of the films was to remind us that gaslighting is actually a grooming process, not just a singular event. It’s a process of establishing and then exploiting trust and authority to achieve an endgame of control and dominance.

The backstory (spoiler alert) is the murder of a famed London opera singer. The murderer fails to leave with the jewels he’s come for because he’s interrupted by the victim’s niece, Paula (played by Ingrid Bergman). Years pass, and Paula meets Gregory (Charles Boyer), who unbeknown to her is the murderer. They marry after a quick courtship, and he insists on moving back to the house where the murder occurred, slowly manipulating her reality, including the flickering lights, all with the intent of retrieving the jewels, at last.

In Gaslight, we witness the architecture of abusive relationships. These are relationships that proceed too quickly, too intensely – ‘she was swept off her feet’. Paula was primed to miss the red flags because she’d endured the traumatic loss of her beloved aunt and, upon returning to London, was living in a space associated with grief. Gaslight also shows us the danger of romanticising behaviours such as showing up out of nowhere and surprising a new partner, of insisting on spending time with her alone and creating their own little world together, which can be harbingers of more insidious abusive relationship dynamics such as stalking and isolation. The relationship creates a dynamic in which it is simpler and safer for Paula to doubt herself than to question him.

A therapist bears witness and validates the pain of her clients, hoping to engender insight, change, and the ability to steer one’s own life. I have spent decades turning off the gaslights that flicker and glow in my clients’ lives. They experienced the denial of childhood trauma by parents and family, or the invalidation of controlling spouses who acted as judge and jury on their emotional states. My clients have been told by another person or persons how they feel for so long that they no longer feel able to identify their own emotions. To work with clients being gaslighted means dismantling childhood and religious teachings, societal frameworks and cultural codes of conduct. Year after year, I listen to stories of ‘wonderful’ childhoods that devolve into a Eugene O’Neill play under the harsh glare of sunlight and therapeutic interpretation.

Where this struck me most was in working with clients who have endured gaslighted marriages for 20 years or longer. (My specific focus is in an area called narcissistic abuse, a phenomenon whereby people become riddled with self-doubt, anxiety and confusion after being in a relationship with an unempathic, entitled, arrogant, egocentric, manipulative partner, family member or other individual.) These were marriages littered with a range of patterns including control, infidelity, a malignant neglect, deceit, an adult life spent having their realities and voices erased.

There was the moment in therapy, when the word ‘abuse’ would come out of my mouth, and the reactions were almost universal:

‘Abuse, no, that’s not me, it was just difficult, in fact, I think maybe my expectations were too high.’

‘He only pushed me once, we were both really mad.’

Over the years of their marriages, the self-gaslighting started to become reflexive. My clients fell into the propaganda that they termed marriage, and a chorus of enablers allowed them to maintain the delusion and the illusion. Once the word ‘abuse’ entered the conversation, a transformation occurred, a new narrative entered the room.

Some would terminate therapy. They would say: ‘Thank you for returning my reality to me, but I won’t leave the relationship, and now I understand I was fighting the wrong battles.’ Others used the therapeutic validation as a call to arms: once the gaslight was turned off, once they no longer fell into the narcissist’s reality, the mortar went out of the bricks of the relationship, and the next time the partner threatened divorce, they smiled and said ‘Sounds good.’

To watch a client come out of gaslighting is to witness someone come back into their own (or come into it for the first time). But I also witnessed clients become isolated. Nobody around them wanted to hear about it, and they would often face gaslighting outside of their marriage. ‘Are you sure it happened that way?’ or ‘That’s just your version of the events.’ They were rarely told that their reality was valid. Many of them were looking for a simple benediction that would strengthen their resolve. However, I wasn’t just seeing this in marriages. My clients who experienced abuse in childhood were still hearing family members tell them in adulthood: ‘Just let it go, he’s dead, and far worse abuse has happened to other people.’ The gaslighting of childhood was sustained in adulthood and made the trauma far more difficult to release. These lights can flicker for a lifetime.

Deconstruction of gaslighting as a concept is something that philosophers have done better than psychologists. Recent papers by Andrew Spear (2019) and Kate Abramson (2014) addressed this phenomenon through a dispassionate lens, and proposed that gaslighting is a multistep process of indoctrination. It is comprised of initially drawing in a target; establishing trust and authority (or capitalising on existing trust – for example, a family member or a spouse); slowly dismantling that person’s sense of trust in herself through doubt and questioning or by manipulating elements of the physical environment (eg, moving or hiding objects – and then denying it); eroding a sense of self-trust and self-knowledge in the victim so the victim is less likely to doubt the gaslighter’s word; and finally winning over the victim’s agreement with the gaslighter’s reality. Ultimately, this robs the victim of his or her autonomy and cements the victim’s ongoing consent.

Traditional conceptualisations of gaslighting focus on the emotional abuse inherent in doubting a person’s reality with a goal of destabilising the victim. This isn’t just about the gaslighter’s need for control and capitulation, but their need for consent. The ultimate ‘agreement’ of their victim renders a picture of the relationship to the world that looks consensual and cooperative. The impact of gaslighting is most acutely observed in cult members or others who seem brainwashed – they espouse agreement with the tenets of the cult leader, and over time it appears as though the views of the cult are their own. Once that kind of agreement and acceptance are issued, it is far more difficult for the victim to exit from the situation or relationship.

There is a menacing simplicity to the gaslighter’s motivations – by and large, they appear to be motivated by power and control, which is likely a compensatory offset of their own sense of insecurity. Gaslighters project their own insecurity onto their victims and magnify any insecurity that their victims already have. To achieve this, . . .

Continue reading. There’s much more. And gaslighting is much more common than many (including current victims) realize.

Written by LeisureGuy

18 January 2021 at 4:16 pm

Rep. Watson Coleman: “I’m 75. I had cancer. I got covid-19 because my GOP colleagues dismiss facts.”


Bonnie Watson Coleman, a Democrat representing New Jersey’s 12th Congressional District in the U.S. House of Representatives, writes in the Washington Post:

Over the past day, a lot of people have asked me how I feel. They are usually referring to my covid-19 diagnosis and my symptoms. I feel like I have a mild cold. But even more than that, I am angry.

I am angry that after I spent months carefully isolating myself, a single chaotic day likely got me sick. I am angry that several of our nation’s leaders were unwilling to deal with the small annoyance of a mask for a few hours. I am angry that the attack on the Capitol and my subsequent illness have the same cause: my Republican colleagues’ inability to accept facts.

When I left for Washington last week, it was my first trip there in several months. I had a list of things to accomplish, including getting my picture taken for the card I use when voting on the House floor. For the past two years, I appeared on that card completely bald as a result of the chemotherapy I underwent to eliminate the cancer in my right lung. It was because of that preexisting condition that I relied so heavily on the proxy voting the House agreed to last year, when we first began to understand the danger of covid-19.

I was nervous about spending a week among so many people who regularly flout social distancing and mask guidelines, but I could not have imagined the horror of what happened on Jan. 6.

To isolate as much as possible, I planned to spend much of my day in my apartment, shuttling to the House floor to vote. But the building shares an alley with the Republican National Committee, where, we’d later learn, law enforcement found a pipe bomb. I was evacuated from that location early in the afternoon.

The next best option would have been my office in the Cannon House Office Building, where just three of my staffers worked at their desks to ensure safe distancing. Before I arrived, security evacuated that building as well, forcing us to linger in the hallways and cafeteria spaces of the House complex. As I’m sure you can imagine, pushing the occupants of an entire building into a few public spaces doesn’t make for great social distancing. Twice, I admonished groups of congressional staff to put on their masks. Some of these staffers gave me looks of derision, but slowly complied.

My staff and I then decided that the Capitol building would likely be the safest place to go, since it would be the most secure and least likely to be crowded. I’ve spent a lot of time since in utter disbelief at how wrong those assumptions turned out to be.

Everyone knows what happened next: A mob broke through windows and doors and beat a U.S. Capitol Police officer, then went on a rampage. Members and staff took cover wherever we could, ducking into offices throughout the building, then were told to move to a safer holding location.

I use “safer” because, while we might have been protected from the insurrectionists, we were not safe from the callousness of members of Congress who, having encouraged the sentiments that inspired the riot, now ignored requests to wear masks.

I’ve been asked if I will share the names of those members. You’ve probably seen video of some of them laughing at my colleague and friend Rep. Lisa Blunt Rochester (D-Del.) as she tries to distribute masks. But it’s not their names that matter.

What matters are facts, both about the covid-19 pandemic and the conduct of the 2020 election:

You can, in fact, breathe through a mask. Doctors have been doing it for decades. It is occasionally annoying — my glasses tend to fog, and when I wear makeup and a mask, I end up with smudged lipstick. That is a small price to pay for the safety of those around me.

You can, in fact, count on a mask to reduce the chances of spreading the virus. Studies of how many droplets escape into the air and the rates of infection following the implementation of mask mandates both prove effectiveness.

Refusing to wear a mask is not, in fact, an act of self-expression. It’s an act of public endangerment. The chaos you create . . .

Continue reading.

Alex London (@ca_london) noted on Twitter:

Members of Congress who feel they have to carry a gun to protect their colleagues but won’t wear a mask to protect their colleagues, don’t want to protect their colleagues. They’re just hoping they get to kill someone.

Written by LeisureGuy

17 January 2021 at 10:16 am

Good insight: The far right embraces violence because it has no real political program


Suzanne Schneider, a historian at the Brooklyn Institute for Social Research and author of the forthcoming book “The Apocalypse and the End of History,” writes in the Washington Post:

More than a week has passed since a pro-Trump mob overran the U.S. Capitol, but we are still struggling to come to terms with the day’s events. Much of the difficulty stems from the fact that the Trump mob was both menacing and ridiculous, dangerous and utterly delusional. On one hand, there was an absurdist quality to many participants: conspiracy theorists, neo-Nazis, militia members, fans of animal pelts. Yet our cosplaying revolutionaries were not playing at all, leaving five dead and dozens wounded. Some said they were intent on genuine violence: “We will storm the government buildings, kill cops, kill security guards, kill federal employees and agents, and demand a recount,” a user reportedly wrote on 8kun the day before the assault.

We cannot make sense of the Capitol attack simply by trying to assess whether its perpetrators were really out for blood or just acting out a game of make-believe for the benefit of the cameras. The Trump insurrectionists exposed that a politics of spectacle, built upon delusion, is no less dangerous than “the real thing.” Precisely because they lack an affirmative political vision, far-right movements fetishize violence as the premier form of civic participation. It is what is offered to the masses in lieu of actual power. The result is violence that becomes almost casual, shorn of any political rationale and reflecting a reality in which human beings are just as disposable as their video game counterparts.

Events from recent years make it clear that the binary between fantasy and danger is a false one. Consider, for instance, the mass shooters who live-stream their rampages on Facebook or gaming platforms such as Twitch, a growing trend from Florida to New Zealand to Germany. Performative violence of this sort is no less real for being optimized for our new media ecosystem. If anything, performative violence gains its horrific quality because it treats human beings as means to an end — props that frame the protagonists’ moment of glory. The attack on the Capitol exists on a spectrum with these acts of violence, offering yet another instance of live action role play directed against real human bodies. The truly frightening thing about cosplaying in this regard is that it is part of a politics of delusion that is acted out in the real world. That many who participated in the attack are having trouble grasping the legal consequences that came along with their live-streamed insurrection testifies to this sense of confusion between material life and the revolutionaries they played on TV.

What does the growing prevalence of this mode of violence as spectacle — and the groups that embrace it — mean? In 1936, the German-Jewish critic Walter Benjamin observed that “fascism sees its salvation in giving these masses not their right, but instead a chance to express themselves.” That is to say, fascists used art in the service of politics to deflect people from pursuing the redistributive demands that historically came alongside mass political movements. Today, too, such performances furnish excitement and purpose for participants while leaving alone the underlying power structures that oppress them. Benjamin noted the rise of fascist aesthetics in contemporary film, visual arts, and ceremonies and other civic rituals; today, we encounter a much-reduced range of aesthetic expression. To the extent that the far right makes art, composes music or writes literature, it is so poor in quality that it can be read only as kitsch. What is left, and what is truly glorified within the emerging far-right imagination, is violence. Ours might be a hollowed-out fascism, a reality TV version of the 20th century’s premier political horror, but that does not make it any less dangerous. Kitsch can also kill.

For far-right leaders today, inciting violence against the nation’s “enemies” offers the fan base a pathway to political participation that preserves the anti-democratic character of the movement, as if to say: We do not need you to govern, only to harm. It is no wonder, then, that intimations of violence have become a common mode of personal expression among adherents of current far-right movements: Cue a thousand photos of extremists decked out in tactical gear, toting their professional-grade death tools and looking eager to reenact some bit of revolutionary drama. The insurrectionist wearing the “Camp Auschwitz” sweatshirt seemed ready to take up his guard duties against political prisoners but not to stop the certification of Biden’s victory. Violence has become the central act through which the far right understands political agency, which is why fantasies about harming the nation’s “enemies” — journalists, activists, opposition politicians — abound within the right-wing imaginary.

Violence is not, in this sense, ancillary to far-right politics but central to preserving the vast inequalities that even its “moderate” supporters wish to maintain. Beyond the tax cuts and deregulation so favored by his plutocratic backers, President Trump’s signature accomplishments were notable for their gratuitous cruelty: the ban on travel from Muslim nations, family separation at the southern border, home invasions and deportations by Immigration and Customs Enforcement that served no material interest beyond offering his fan base reasons to cheer. These are not disjointed parts of the right-wing agenda, as Jacob Hacker and Paul Pierson have recently argued, but rather co-dependent, which is one reason the growth of white nationalism has mirrored the uptick in economic inequality. Acts of violence, particularly against people of color, are the spoonful of sugar that helps the GOP’s economic platform — notoriously unpopular among its base — go down. Violence does the deflective work Benjamin identified with fascist aesthetics.

The events this month also underscored that “freedom” — that most signature of conservative values — has been refashioned to contain violence at its core: freedom to carry a weapon and use it at will, to infect others around you during a pandemic, to die of preventable disease rather than submit to a national health-care system. Moreover, the primacy of violence within the right’s political vision also helps explain why our authorized death dispensers — police officers and military personnel — have become demigods in certain circles. (That’s why it was so shocking to see the Trump mob engage Capitol Police officers in battle, violating the unmatched sanctity of blue lives.) The right fringe also likes to . . .

Continue reading.

Written by LeisureGuy

16 January 2021 at 7:09 pm

Possession as a web made of memes and a look at identity: How we swim in the ocean of cultural entities and understandings


A few mornings ago, for some reason, I found myself pondering the idea of possession. I was looking at one of my razors — not the Fendrihan Mk II I was using but the Fine aluminum slant on the shelf — and thinking that it was nice to own it, when I realized that “being owned” is not discoverable from the razor itself. “Ownership” exists not in the natural world but only in the meme-universe of human cultural knowledge; it is not a property of the object but comes from the cultural knowledge of the observer.

One example of this consists of what you see here: black markings on a white background:

이 문장에는 의미가 있습니다 (한국어를 아는 경우에만 해당).

No matter how closely you examine those markings, they remain simply black marks (unless, of course, you have the cultural knowledge to interpret them).

I then encountered the following post by Maria Popova in Brain Pickings. The post addresses how our identities are not from nature but are formed from cultural elements.

“A person’s identity,” Amin Maalouf wrote as he contemplated what he so poetically called the genes of the soul, “is like a pattern drawn on a tightly stretched parchment. Touch just one part of it, just one allegiance, and the whole person will react, the whole drum will sound.” And yet we are increasingly pressured to parcel ourselves out in various social contexts, lacerating the parchment of our identity in the process. As Courtney Martin observed in her insightful On Being conversation with Parker Palmer and Krista Tippett, “It’s never been more asked of us to show up as only slices of ourselves in different places.” Today, as Whitman’s multitudes no longer compose an inner wholeness but are being wrested out of us fragment by fragment, what does it really mean to be a person? And how many types of personhood do we each contain?

In the variedly stimulating 1976 volume The Identities of Persons (public library), philosopher Amelie Rorty considers the seven layers of personhood, rooted in literature but extensible to life. She writes:

Humans are just the sort of organisms that interpret and modify their agency through their conception of themselves. This is a complicated biological fact about us.

Rorty offers a brief taxonomy of those conceptions before exploring each in turn:

Characters are delineated; their traits are sketched; they are not presumed to be strictly unified. They appear in novels by Dickens, not those by Kafka. Figures appear in cautionary tales, exemplary novels and hagiography. They present narratives of types of lives to be imitated. Selves are possessors of their properties. Individuals are centers of integrity; their rights are inalienable. Presences are descendants of souls; they are evoked rather than presented, to be found in novels by Dostoyevsky, not those by Jane Austen.

Depending on which of these we adopt, Rorty argues, we become radically different entities, with different powers and proprieties, different notions of success and failure, different freedoms and liabilities, different expectations of and relations to one another, and most of all a different orientation toward ourselves in the emotional, intellectual, and social spaces we inhabit.

And yet we ought to be able to interpolate between these various modalities of being:

Worldliness consists of [the] ability to enact, with grace and aplomb, a great variety of roles.

Rorty begins with the character, tracing its origin to Ancient Greek drama:

Since the elements out of which characters are composed are repeatable and their configurations can be reproduced, a society of characters is in principle a society of repeatable and indeed replaceable individuals.

Characters, Rorty points out, don’t have identity crises because they aren’t expected to have a core unity beneath their assemblage of traits. What defines them is which of these traits become manifested, and this warrants the question of social context:

To know what sort of character a person is, is to know what sort of life is best suited to bring out his potentialities and functions… Not all characters are suited to the same sorts of lives: there is no ideal type for them all… If one tries to force the life of a bargainer on the character of a philosopher, one is likely to encounter trouble, sorrow, and the sort of evil that comes from mismatching life and temperament. Characters formed within one society and living in circumstances where their dispositions are no longer needed — characters in time of great social change — are likely to be tragic. Their virtues lie useless or even foiled; they are no longer recognized for what they are; their motives and actions are misunderstood. The magnanimous man in a petty bourgeois society is seen as a vain fool; the energetic and industrious man in a society that prizes elegance above energy is seen as a bustling boor; the meditative person in an expansive society is seen as melancholic… Two individuals of the same character will fare differently in different polities, not because their characters will change through their experiences (though different aspects will become dominant or recessive) but simply because a good fit of character and society can conduce to well-being and happiness, while a bad fit produces misery and rejection.

Rorty’s central point about character takes it out of the realm of the literary and the philosophical, and into the realm of our everyday lives, where the perennial dramas of who we are play out: . . .

Continue reading.

Written by LeisureGuy

13 January 2021 at 9:33 am

A Game Designer’s Analysis Of QAnon


In watching the videos of the attack on the Capitol, it struck me that it seemed somewhat like a multiplayer online game acted out in real life (especially given that such games usually seem to involve combat — as if we are doing simulation training to make people adopt violence as the standard way of solving problems). The parallels — and the effects of learning behaviors from online games — are discussed in a very interesting article on Medium.

Friedrich Nietzsche famously wrote, “He who fights with monsters should look to it that he himself does not become a monster. And if you gaze long into an abyss, the abyss also gazes into you.” First I’ll observe that the QAnon worldview does indeed have its converts fighting monsters (cannibalistic pedophile Satan-worshiping liberals), and as we saw on Wednesday, some of the QAnon faithful have indeed become monsters. Moreover, as the following article points out, those playing the game QAnon are being played by the game.

One thing I gleaned from the article is why teaching by the Socratic method is so effective.

Reed Berkowitz writes:

I am a game designer with experience in a very small niche. I create and research games designed to be played in reality. I’ve worked in Alternate Reality Games (ARGs), LARPs, experience fiction, interactive theater, and “serious games”. Stories and games that can start on a computer, and finish in the real world. Fictions designed to feel as real as possible. Games that teach you. Puzzles that come to life all around the players. Games where the deeper you dig, the more you find. Games with rabbit holes that invite you into wonderland and entice you through the looking glass.

When I saw QAnon, I knew exactly what it was and what it was doing. I had seen it before. I had almost built it before. It was gaming’s evil twin. A game that plays people. (cue ominous music)

QAnon has often been compared to ARGs and LARPs and rightly so. It uses many of the same gaming mechanisms and rewards. It has a game-like feel to it that is evident to anyone who has ever played an ARG, online role-play (RP) or LARP before. The similarities are so striking that it has often been referred to as a LARP or ARG. However this beast is very very different from a game.

It is the differences that shed the light on how QAnon works and many of them are hard to see if you’re not involved in game development. QAnon is like the reflection of a game in a mirror, it looks just like one, but it is inverted.

Guided Apophenia

In one of the very first experience fictions (XF) I ever designed, the players had to explore a creepy basement looking for clues. The object they were looking for was barely hidden and the clue was easy. It was Scooby Doo easy. I definitely expected no trouble in this part of the game.

But there was trouble. I didn’t know it then, but its name was APOPHENIA.

Apophenia is “the tendency to perceive a connection or meaningful pattern between unrelated or random things (such as objects or ideas).”

As the participants started searching for the hidden object, there were, on the dirt floor, little random scraps of wood.

How could that be a problem!?

It was a problem because three of the pieces made the shape of a perfect arrow pointing right at a blank wall. It was uncanny. It had to be a clue. The investigators stopped and stared at the wall and were determined to figure out what the clue meant and they were not going one step further until they did. The whole game was derailed. Then, it got worse. Since there obviously was no clue there, the group decided the clue they were looking for was IN the wall. The collection of ordinary tools they found conveniently laying around seemed to enforce their conclusion that this was the correct direction. The arrow was pointing to the clue and the tools were how they would get to it. How obvious could it be?

I stared in horror because it all fit so well. It was better and more obvious than the clue I had hidden. I could see it. It was all random chance but I could see the connections that had been made were all completely logical. I had a crude backup plan and I used it quickly before these well-meaning players started tearing apart the basement wall with crowbars looking for clues that did not exist.

These were normal people and their assumptions were normal and logical and completely wrong.

In most ARG-like games apophenia is the plague of designers and players, sometimes leading participants to wander further and further away from the plot and causing designers to scramble to get them back or (better yet) incorporate their ideas. In role-playing games, ARGs, video games, and really anything where the players have agency, apophenia is going to be an issue.

This happens because in real games there are actual solutions to actual puzzles and a real plot created by the designers. It’s easy to get off track because there is a track. A great game runner (often called a puppet-master) can use one or two of these speculations to create an even better game, but only as much as the plot can be adjusted for in real time or planned out before-hand. It can create amazing moments in a game, but it’s not easy. For instance, I wish I could have instantly entombed something into that wall in the basement because it would have worked so well, but I was out of luck!

If you are a designer, and have puzzles, and have a plot, then apophenia is a wild card you always have to be concerned about.

QAnon is a mirror reflection of this dynamic. Here apophenia is the point of everything. There are no scripted plots. There are no puzzles to solve created by game designers. There are no solutions.

QAnon grows on the wild misinterpretation of random data, presented in a suggestive fashion in a milieu designed to help the users come to the intended misunderstanding. Maybe “guided apophenia” is a better phrase. Guided because the puppet masters are directly involved in hinting about the desired conclusions. They have pre-seeded the conclusions. They are constantly getting the player lost by pointing out unrelated random events and creating a meaning for them that fits the propaganda message Q is delivering.

There is no reality here. No actual solution in the real world. Instead, this is a breadcrumb trail AWAY from reality. Away from actual solutions and towards a dangerous psychological rush. It works very well because when you “figure it out yourself” you own it. You experience the thrill of discovery, the excitement of the rabbit hole, the acceptance of a community that loves and respects you. Because you were convinced to “connect the dots yourself” you can see the absolute logic of it. This is the conclusion you arrived at. More about this later.

Everyone on the board agrees with you because it’s highly likely they were the ones that pointed it out to you just for that purpose. (more on this later)

“Hey, what’s that?!”

“It looks like an arrow, pointing at the wall.”

“Why do you think it’s there? Do people just leave arrows pointing to things randomly? What does your common sense say about that?”

“It says there must be something there.”

“Yes. You are right. Maybe you should look at it more closely?”

Every cloud has a shape that can look like something else. Everything that flickers is also a jumble of Morse code. The more information that is out there, the easier it is to allow apophenia to guide us into anything. This is about looking up at the sky and someone pointing out constellations.

The difference is that these manufactured connections lead to the desired conclusions Q’s handlers have created. When players arrive at the “correct” answers they are showered with adoration, respect, and social credit. Like a teenage RP, the “correct” answer is the one that the group respects the most and makes the story the most enjoyable. The idea that bolsters the theory. The correct answer is the one that provides the poster with the most credit.

It’s like a Darwinian fiction lab, where the best stories and the most engaging and satisfying misinterpretations rise to the top and are then elaborated upon for the next version.

Even Q-Anon was only one of several “anons” including FBIanon and CIAanon, etc, etc. Q rose to the top, so it got its own YouTube channels. That tested, so it moved to Reddit. The theories that didn’t work, disappeared while others got up-voted. It’s ingenious. It’s AI with a group-think engine. The group, led by the puppet masters, decide what is the most entertaining and gripping explanation, and that is amplified. It’s a Slenderman board gone amok.

Let’s go back to the arrow on the ground again.

It was not an arrow on the ground, pointing to a clue in a wall. It was just some random bits of wood. They did not discover an arrow. They created it. They saw random pieces of wood and applied their intelligence to it, and this is everything.

It’s easy for people to forget that they are not discovering the story, but creating it from random data.

Propaganda and Manipulation

Another major difference between QAnon and an actual game, is that Q is almost pure propaganda. That IS the sole purpose of this. It’s not advertising a product, it’s not for fun, and it’s not an art project. There is no doubt about the political nature of the propaganda either. From ancient tropes about Jews and Democrats eating babies (blood-libel re-booted) to anti-science hysteria, this is all the solid reliable stuff of authoritarianism. This is the internet’s re-purposing of hatred’s oldest hits. The messaging is spot on. The “drops” implanted in an aspic of anti-Semitic, misogynist, and grotesque posts on posting boards that, indeed, have been implicated in many of the things the fake conspiracy is supposed to be guilty of!

Q is also operating in conjunction with . . .

Written by LeisureGuy

11 January 2021 at 12:30 pm

Why poor people find Trump attractive


This is a Twitter thread that seems to have been deleted. It is by @jpbrammer and was posted 18 Nov 2016. I have typed it out from screengrabs of the tweets.

So I’m a Mexican-American from a poor rural (mostly white) town in Oklahoma. Missing from this debate? How poor whites see themselves.

If you’re wondering how poor exploited white people could vote for a dude with a golden elevator who will fuck them over, here’s how.

They don’t see themselves as poor. They don’t base their identity on it. They see themselves as “temporarily embarrassed millionaires.”

The stigma against poverty is incredibly strong. It is shameful to be poor, to not have the comforts of the middle class. So they pretend —

that they aren’t poor. They are willing to lie to make it seem that they aren’t poor. They purchase things to make it seem like they’re not.

In my town, wealth wasn’t associated with greed, but with hard work and inherent goodness. You are blessed if you have material wealth.

When they see Trump they don’t see an extortionist who is rich because of the very conditions that keep their own communities in poverty.

They see someone who worked hard and was justly rewarded with wealth. Most men, especially, think they too could be Trump were it not for

the unfair obstacles put in their way. White men who don’t consider themselves successful enough have so many excuses for their “failures.”

The idea that immigrants are the reason they are poor and not wealthy like Trump is so appealing. It takes all the shame and blame away.

And here we have a man who, they think, “tells it like it is” and is willing to name the things stealing prosperity out of their hands.

If these people saw themselves as an exploited class of people, if American culture didn’t stigmatize poverty so much, it might be different.

But America has so entangled wealth with goodness and poverty with moral deficiency that they can’t build that identity. They won’t.

Trump is rich, and so according to American criteria, he is also:
1. Wise
2. Fair
3. Moral
4. Deserving
5. Strong
6. Clever
He *has* to be.

Capitalism and the American Dream teach that poverty is a temporary state that can be transcended with hard work and cleverness.

To fail to transcend poverty, and to admit that you are poor, is to admit that you are neither hardworking nor clever. It’s cultural brainwashing.

So if an exploited class of people don’t want to admit they’re exploited and they blame themselves for their oppression, what manifests?

Xenophobia. Hatred of anyone who is “different,” queer people, people of color. These people are eroding the “goodness” of America.

And if they would just stop ruining America, then the perfect design of America could work again and deliver prosperity.

I’m telling you, as someone who has spent almost his entire life in this environment, that if you think cities are a “bubble…” Good God.

How you balance those realities, and what conclusions you reach to improve the lives of both, well, I’m not smart enough to have the answer.

Still, we need to understand the identity working-class white people have built for themselves, one diametrically opposed to, well, reality.

Because Trump won’t make them rich. Even if he deports all the brown people, it won’t bring them what they’re hoping for.

It strikes me that once a person falls into accepting an illusion as true, they become vulnerable to more deceptions because they’ve lost touch with the testing ground of reality — false hopes, false dreams, and false statements have more power over those who already live in self-deception or who already believe a false vision.

Written by LeisureGuy

10 January 2021 at 3:01 pm

Arnold Schwarzenegger points out similarities between Capitol Hill insurrection and Austria’s Kristallnacht


Written by LeisureGuy

10 January 2021 at 11:27 am

I have to agree with Baruch Spinoza


The brief video summary gives the reasons I agree, and I would further say that the only appropriate prayers are those of praise or gratitude. Asking God to make your football team win is (IMO) so misguided that one doesn’t know where to begin — it’s on a par with asking God to make Lima the capital of Bolivia because that’s the answer you gave on the mid-term. Petitions to God are, in my view, never appropriate — at least for those who believe that God is all-loving, all-wise, and all-powerful. The only petitionary prayer that makes sense is, “Thy will be done. Amen.”

So far as Spinoza’s omissions are concerned, I don’t see them as significant. So far as rituals go, one can celebrate any number of things — obvious examples are the winter solstice, when the celebration is the turn toward longer days; the vernal equinox, after which daylight lasts longer than night; the summer solstice, the peak of the sun and the beginning of summer; and the autumnal equinox, when harvest celebrations might begin. And of course one has rituals to recognize births, marriages, deaths, and anniversaries (such as birthdays or Bastille Day or the like). In fact, the equinoxes and solstices are often co-opted by religions, which time religious rituals to (roughly) coincide with those natural demarcations.

And belonging to a group does not require religion — one’s family is a start, and there are groups based on shared enthusiasms (sports fans, game players, literary discussion groups, bowling leagues), shared experiences (classmates, veterans organizations), shared location (neighborhood groups, civic organizations), shared outlooks (political parties and organizations, environmental groups).

At any rate, I was struck by Spinoza’s view of the world in which we find ourselves.

The video is from an Open Culture post by Josh Jones, which begins:

The so-called Enlightenment period encompasses a surprisingly diverse collection of thinkers, if not always in ethnic or national origin, at least in intellectual disposition, including perhaps the age’s most influential philosopher, the “philosopher’s philosopher,” writes Assad Meymandi. Baruch Spinoza did not fit the image of the bewigged philosopher-gentleman of means we tend to popularly associate with Enlightenment thought.

He was born to a family of Sephardic Portuguese Marranos, Jews who were forced to convert to Catholicism but who reclaimed their Judaism when they relocated to Calvinist Amsterdam. Spinoza himself was “excommunicated by Amsterdam Jewry in 1656,” writes Harold Bloom in a review of Rebecca Goldstein’s Betraying Spinoza: “The not deeply chagrined 23-year-old Spinoza did not become a Calvinist, and instead consorted with more liberal Christians, particularly Mennonites.” . . .

Continue reading. There’s more, including links.

Written by LeisureGuy

7 January 2021 at 11:43 am

New article on Medium: “Choosing Which Student Goes Next”


The article will be primarily of interest to teachers. If you teach or know someone who does, take a look.

Written by LeisureGuy

4 January 2021 at 10:49 am

Another 52 interesting things


Early last month I posted some of the 52 things Tom Whitwell had learned the previous year (with a link to his full list). I just learned that a year ago he posted a similar list, which begins:

  1. Each year humanity produces 1,000 times more transistors than grains of rice and wheat combined. [Mark P Mills]

Continue reading. There are 41.5 more.

Written by LeisureGuy

3 January 2021 at 11:47 am

What New Science Techniques Tell Us About Ancient Women Warriors


The past is a foreign country; they do things differently there.
— L.P. Hartley, writer (30 Dec 1895-1972)

A NY Times article suggests how people attempt to project their own cultural and social conventions onto other societies even when it is totally inappropriate. To be fair, such projection is generally done from ignorance rather than ill will, though ill will quickly arises if the conventions are questioned. I think this is because people construct their identities from memes, generally taken from their cultural/social environment, so those conventions tend to be viewed as natural law with a heavy moral overlay. To deny them can feel to some as if their identity is in danger.

Of course, culture and social convention are subject to evolution and thus change over time. As an example of a change in social/cultural conventions, take the author of the article mentioned below, Mx. Annalee Newitz. Some decades back we went through a cultural shift away from the requirement that the marital status of women must be signified in the honorific: back then one had to use “Miss” for unmarried women, “Mrs.” for married women. (Men, of course, were called “Mr.” regardless of their marital status.)

The inequality was obvious, so in a relatively short period of time, the honorific “Ms.” (pronounced “mizz”) became common, readily adopted because it solved the problem of knowing which honorific to use when you did not know the woman’s marital status. (“Miss” and “Mrs.” were outliers among honorifics in requiring knowledge of marital status, since other honorifics — Mr., Capt., Dr., Rev., Prof., etc. — required no knowledge of marital status.) Indeed, in Southern speech, “mizz” was long since commonly used for both “Miss” and “Mrs.”

“Mx.” eases the burden of knowledge one step further: “Mx.” (pronounced “mix”) is an honorific that applies to a person without regard to gender — in effect, it is the honorific equivalent of “human” or “person.”

I think it will prove quite useful and will be quickly adopted by those whose names are ambiguous as to gender and who thus frequently receive the wrong honorific (“Mr.” when “Ms.” is right, or “Ms.” when “Mr.” is right; “Mx.” finesses the problem altogether). I’m thinking of names like Shirley (remember the sports columnist Shirley Povich?), Pat, Robin, Leslie, Sandy, Kim, Marion (John Wayne’s real name), Charlie, Evelyn, Sue, and so on.

In short, “Mx.” makes no comment regarding gender while still showing respect for the person.

Mx. Newitz writes:

Though it’s remarkable that the United States finally is about to have a female vice president, let’s stop calling it an unprecedented achievement. As some recent archaeological studies suggest, women have been leaders, warriors and hunters for thousands of years. This new scholarship is challenging long-held beliefs about so-called natural gender roles in ancient history, inviting us to reconsider how we think about women’s work today.

In November a group of anthropologists and other researchers published a paper in the academic journal Science Advances about the remains of a 9,000-year-old big-game hunter buried in the Andes. Like other hunters of the period, this person was buried with a specialized tool kit associated with stalking large game, including projectile points, scrapers for tanning hides and a tool that looked like a knife. There was nothing particularly unusual about the body — though the leg bones seemed a little slim for an adult male hunter. But when scientists analyzed the tooth enamel using a method borrowed from forensics that reveals whether a person carries the male or female version of a protein called amelogenin, the hunter turned out to be female.

With that information in hand, the researchers re-examined evidence from 107 other graves in the Americas from roughly the same period. They were startled to discover that out of 26 graves with hunter tools, 10 belonged to women. Bonnie Pitblado, an archaeologist at the University of Oklahoma, Norman, told Science magazine that the findings indicate that “women have always been able to hunt and have in fact hunted.” The new data calls into question an influential dogma in the field of archaeology. Nicknamed “man the hunter,” this is the notion that men and women in ancient societies had strictly defined roles: Men hunted, and women gathered. Now, this theory may be crumbling.

While the Andean finding was noteworthy, this was not the first female hunter or warrior to be found by re-examining old archaeological evidence using fresh scientific techniques. Nor was this sort of discovery confined to one group, or one part of the world.

Three years ago, scientists re-examined the remains of a 10th-century Viking warrior excavated in Sweden at the end of the 19th century by Hjalmar Stolpe, an archaeologist. The skeleton had been regally buried at the top of a hill, with a sword, two shields, arrows and two horses. For decades, beginning with the original excavation, archaeologists assumed the Viking was a man. When researchers in the 1970s conducted a new anatomical evaluation of the skeleton, they began to suspect that the Viking was in fact a woman. But it wasn’t until 2017, when a group of Swedish archaeologists and geneticists extracted DNA from the remains, that the sex of the warrior indeed proved to be female.

The finding led to controversy over whether the skeleton was really a warrior, with scholars and pundits protesting what they called revisionist history. Although the genetic sex determination thus was indisputable (the bones of the skeleton had two X chromosomes), these criticisms led the Swedish researchers to examine the evidence yet again, and present a second, more contextual analysis in 2019. Their conclusion again was that the person had been a warrior.

The naysayers raised fair points. In archaeology, as the researchers admitted, we can’t always know why a community buried someone with particular objects. And one female warrior does not mean that many women were leaders, just as the reign of Queen Elizabeth I was not part of a larger feminist movement.

Challenges to “man the hunter” have emerged in new examinations of the early cultures of the Americas as well. In the 1960s, an archaeological dig uncovered in the ancient city of Cahokia, in what is now southwestern Illinois, a 1,000-to-1,200-year-old burial site with two central bodies, one on top of the other, surrounded by other skeletons. The burial was full of shell beads, projectile points and other luxury items. At the time, the archaeologists concluded that this was a burial of two high-status males flanked by their servants.

But in 2016 archaeologists conducted a fresh examination of the grave. The two central figures, it turned out, were a male and a female; they were surrounded by other male-female pairs. Thomas Emerson, who conducted the study with colleagues from the Illinois State Archaeological Survey at the University of Illinois, alongside scientists from other institutions, said the Cahokia discovery demonstrated the existence of male and female nobility. “We don’t have a system in which males are these dominant figures and females are playing bit parts,” as he put it.

Armchair history buffs love to obsess about  . . .

Continue reading.

Written by LeisureGuy

2 January 2021 at 4:54 pm

American Death Cult: Why has the Republican response to the pandemic been so mind-bogglingly disastrous?

leave a comment »

Jonathan Chait wrote this back in July 2020 in New York. And just a reminder: the US as of today has seen 20 million cases and more than 346,000 deaths due to Covid-19.

Last October, the Nuclear Threat Initiative and the Johns Hopkins Center for Health Security compiled a ranking system to assess the preparedness of 195 countries for the next global pandemic. Twenty-one panel experts across the globe graded each country in 34 categories composed of 140 subindices. At the top of the rankings, peering down at 194 countries supposedly less equipped to withstand a pandemic, stood the United States of America.

It has since become horrifyingly clear that the experts missed something. The supposed world leader is in fact a viral petri dish of uncontained infection. By June, after most of the world had beaten back the coronavirus pandemic, the U.S., with 4 percent of the world’s population, accounted for 25 percent of its cases. Florida alone was seeing more new infections a week than China, Japan, Korea, Vietnam, Thailand, Malaysia, Indonesia, the Philippines, Australia, and the European Union combined.

During its long period of decline, the Ottoman Empire was called “the sick man of Europe.” The United States is now the sick man of the world, pitied by the same countries that once envied its pandemic preparedness — and, as recently as the 2014 Ebola outbreak, relied on its expertise to organize the global response.

Our former peer nations are now operating in a political context Americans would find unfathomable. Every other wealthy nation in the world has successfully beaten back the disease, at least significantly, and at least for now. New Zealand’s health minister was forced to resign after allowing two people who had tested positive for COVID-19 to attend a funeral. The Italian Parliament heckled Prime Minister Giuseppe Conte when he briefly attempted to remove his mask to deliver a speech. In May — around the time Trump cheered demonstrators into the streets to protest stay-at-home orders — Boris Johnson’s top adviser set off a massive national scandal, complete with multiple calls for his resignation, because he’d been caught driving to visit his parents during lockdown. If a Trump official had done the same, would any newspaper even have bothered to publish the story?

It is difficult for us Americans to imagine living in a country where violations so trivial (by our standards) provoke such an uproar. And if you’re tempted to see for yourself what it looks like, too bad — the E.U. has banned U.S. travelers for health reasons.

The distrust and open dismissal of expertise and authority may seem uniquely contemporary — a phenomenon of the Trump era, or the rise of online misinformation. But the president and his party are the products of a decades-long war against the functioning of good government, a collapse of trust in experts and empiricism, and the spread of a kind of magical thinking that flourishes in a hothouse atmosphere that can seal out reality. While it’s not exactly shocking to see a Republican administration be destroyed by incompetent management — it happened to the last one, after all — the willfulness of it is still mind-boggling and has led to the unnecessary sickness and death of hundreds of thousands of people and the torpedoing of the reelection prospects of the president himself. Like Stalin’s purge of 30,000 Red Army members right before World War II, the central government has perversely chosen to disable the very asset that was intended to carry it through the crisis. Only this failure of leadership and management took place in a supposedly advanced democracy whose leadership succumbed to a debilitating and ultimately deadly ideological pathology.

How did this happen? In 1973, Republicans trusted science more than religion, while Democrats trusted religion more than science. The reverse now holds true. In the meantime, working-class whites left the Democratic Party, which has increasingly taken on the outlook of the professional class with its trust in institutions and empiricism. The influx of working-class whites (especially religiously observant ones) has pushed Republicans toward increasingly paranoid varieties of populism.

This is the conventional history of right-wing populism — that it was a postwar backlash against the New Deal and the Republican Party’s inability or unwillingness to roll it back. The movement believed the government had been subverted, perhaps consciously, by conspirators seeking to impose some form of socialism, communism, or world government. Its “paranoid style,” so described by historian Richard Hofstadter, became warped with anti-intellectualism, reflecting a “conflict between businessmen of certain types and the New Deal bureaucracy, which has spilled over into a resentment of intellectuals and experts.” Its followers seemed prone to “a disorder in relation to authority, characterized by an inability to find other modes for human relationship than those of more or less complete domination or submission.” Perhaps this sounds like someone you’ve heard of.

But for all the virulence of conservative paranoia in American life, without the sanction of a major party exploiting and profiting from paranoia, and thereby encouraging its growth, the worldview remained relatively fringe. Some of the far right’s more colorful adherents, especially the 100,000 reactionaries who joined the John Birch Society, suspected the (then-novel, now-uncontroversial) practice of adding small amounts of fluoride to water supplies to improve dental health was, in fact, a communist plot intended to weaken the populace. Still, the far right lacked power. Republican leaders held Joe McCarthy at arm’s length; Goldwater captured the nomination but went down in a landslide defeat. In the era of Sputnik, science was hardly a countercultural institution. “In the early Cold War period, science was associated with the military,” says sociologist Timothy O’Brien who, along with Shiri Noy, has studied the transformation. “When people thought about scientists, they thought about the Manhattan Project.” The scientist was calculating, cold, heartless, an authority figure against whom the caring, feeling liberal might rebel. Radicals in the ’60s often directed their protests against the scientists or laboratories that worked with the Pentagon.

But this began to change in the 1960s, along with everything else in American political and cultural life. New issues arose that tended to pit scientists against conservatives. Goldwater’s insouciant attitude toward the prospect of nuclear war with the Soviets provoked scientists to explain the impossibility of surviving atomic fallout and the formation of Scientists and Engineers for Johnson-Humphrey. New research by Rachel Carson about pollution and by Ralph Nader on the dangers of cars and other consumer products made science the linchpin of a vast new regulatory state. Business owners quickly grasped that stopping the advance of big government meant blunting the cultural and political authority of scientists. Expertise came to look like tyranny — or at least it was sold that way.

One tobacco company conceded privately in 1969 that it could not directly challenge the evidence of tobacco’s dangers but could make people wonder how solid the evidence really was. “Doubt,” the memo explained, “is our product.” In 1977, the conservative intellectual Irving Kristol urged business leaders to steer their donations away from public-interest causes and toward the burgeoning network of pro-business foundations. “Corporate philanthropy,” he wrote, “should not be, cannot be, disinterested.” The conservative think-tank scene exploded with reports questioning whether pollution, smoking, driving, and other profitable aspects of American capitalism were really as dangerous as the scientists said.

The Republican Party’s turn against science was slow and jagged, as most party-identity changes tend to be. The Environmental Protection Agency had been created under Richard Nixon, and its former administrator, Russell Train, once recalled President Gerald Ford promising to support whatever auto-emissions guidelines his staff deemed necessary. “I want you to be totally comfortable in the fact that no effort whatsoever will be made to try to change your position in any way,” said Ford — a pledge that would be unimaginable for a contemporary Republican president to make. Not until Ronald Reagan did Republican presidents begin letting business interests overrule experts, as when his EPA used a “hit list” of scientists flagged by industry as hostile. And even Reagan toggled between giving business a free hand and listening to his advisers (as he did when he signed a landmark 1987 agreement to phase out substances that were depleting the ozone layer and a plan the next year to curtail acid rain).

The party’s rightward tilt accelerated in the 1990s. “With the collapse of the Soviet Union, Cold Warriors looked for another great threat,” wrote science historians Naomi Oreskes and Erik Conway. “They found it in environmentalism,” viewing climate change as a pretext to impose government control over the whole economy. Since the 1990s was also the decade in which scientific consensus solidified that greenhouse-gas emissions were permanently increasing temperatures, the political stakes of environmentalism soared.

The number of books criticizing environmentalism increased fivefold over the previous decade, and more than 90 percent cited evidence produced by right-wing foundations. Many of these tracts coursed with the same lurid paranoia as their McCarthy-era counterparts. This was when the conspiracy theory that is currently conventional wisdom on the right — that scientists across the globe conspired to exaggerate or falsify global warming data in order to increase their own power — first took root.

This is not just a story about elites. About a decade after business leaders launched their attack on science from above, a new front opened from below: Starting in the late 1970s,  . . .

Continue reading.

Written by LeisureGuy

1 January 2021 at 4:55 pm

The Endgame of the Reagan Revolution

leave a comment »

Heather Cox Richardson writes a good summary of modern American political history:

And so, we are at the end of a year that has brought a presidential impeachment trial, a deadly pandemic that has killed more than 338,000 of us, a huge social movement for racial justice, a presidential election, and a president who has refused to accept the results of that election and is now trying to split his own political party.

It’s been quite a year.

But I had a chance to talk with history podcaster Bob Crawford of the Avett Brothers yesterday, and he asked a more interesting question. He pointed out that we are now twenty years into this century, and asked what I thought were the key changes of those twenty years. I chewed on this question for awhile and also asked readers what they thought. Pulling everything together, here is where I’ve come out.

In America, the twenty years since 2000 have seen the end game of the Reagan Revolution, begun in 1980.

In that era, political leaders on the right turned against the principles that had guided the country since the 1930s, when Democratic President Franklin Delano Roosevelt guided the nation out of the Great Depression by using the government to stabilize the economy. During the Depression and World War Two, Americans of all parties had come to believe the government had a role to play in regulating the economy, providing a basic social safety net and promoting infrastructure.

But reactionary businessmen hated regulations and the taxes that leveled the playing field between employers and workers. They called for a return to the pro-business government of the 1920s, but got no traction until the 1954 Brown v. Board of Education decision, when the Supreme Court, under the former Republican governor of California, Earl Warren, unanimously declared racial segregation unconstitutional. That decision, and others that promoted civil rights, enabled opponents of the New Deal government to attract supporters by insisting that the country’s postwar government was simply redistributing tax dollars from hardworking white men to people of color.

That argument echoed the political language of the Reconstruction years, when white southerners insisted that federal efforts to enable formerly enslaved men to participate in the economy on terms equal to white men were simply a redistribution of wealth, because the agents and policies required to achieve equality would cost tax dollars and, after the Civil War, most people with property were white. This, they insisted, was “socialism.”

To oppose the socialism they insisted was taking over the East, opponents of black rights looked to the American West. They called themselves Movement Conservatives, and they celebrated the cowboy who, in their inaccurate vision, was a hardworking white man who wanted nothing of the government but to be left alone to work out his own future. In this myth, the cowboys lived in a male-dominated world, where women were either wives and mothers or sexual playthings, and people of color were savage or subordinate.

With his cowboy hat and western ranch, Reagan deliberately tapped into this mythology, as well as the racism and sexism in it, when he promised to slash taxes and regulations to free individuals from a grasping government. He promised that cutting taxes and regulations would expand the economy. As wealthy people—the “supply side” of the economy—regained control of their capital, they would invest in their businesses and provide more jobs. Everyone would make more money.

From the start, though, his economic system didn’t work. Money moved upward, dramatically, and voters began to think the cutting was going too far. To keep control of the government, Movement Conservatives at the end of the twentieth century ramped up their celebration of the individualist white American man, insisting that America was sliding into socialism even as they cut more and more domestic programs, insisting that the people of color and women who wanted the government to address inequities in the country simply wanted “free stuff.” They courted social conservatives and evangelicals, promising to stop the “secularization” they saw as a partner to communism.

After the end of the Fairness Doctrine in 1987, talk radio spread the message that Black and Brown Americans and “feminazis” were trying to usher in socialism. In 1996, that narrative got a television channel that personified the idea of the strong man with subordinate women. The Fox News Channel told a story that reinforced the Movement Conservative narrative daily until it took over the Republican Party entirely.

The idea that people of color and women were trying to undermine society was enough of a rationale to justify keeping them from the vote, especially after Democrats passed the Motor Voter law in 1993, making it easier for poor people to register to vote. In 1997, Florida began the process of purging voter rolls of Black voters.

And so, 2000 came.

In that year, the presidential election came down to the electoral votes in Florida. Democratic candidate Al Gore won the popular vote by more than 540,000 votes over Republican candidate George W. Bush, but Florida would decide the election. During the required recount, Republican political operatives led by Roger Stone descended on the election canvassers in Miami-Dade County to stop the process. It worked, and the Supreme Court upheld the end of the recount. Bush won Florida by 537 votes and, thanks to its electoral votes, became president. Voter suppression was a success, and Republicans would use it, and after 2010, gerrymandering, to keep control of the government even as they lost popular support.

Bush had promised to unite the country, but his installation in the White House gave new power to the ideology of the Movement Conservative leaders of the Reagan Revolution. He inherited a budget surplus from his predecessor Democrat Bill Clinton, but immediately set out to get rid of it by cutting taxes. A balanced budget meant money for regulation and social programs, so it had to go. From his term onward, Republicans would continue to cut taxes even as budgets operated in the red, the debt climbed, and money moved upward.

The themes of Republican dominance and tax cuts were the backdrop of the terrorist attack of September 11, 2001. That attack gave the country’s leaders a sense of mission after the end of the Cold War and, after launching a war in Afghanistan to stop al-Qaeda, they set out to export democracy to Iraq. This had been a goal for Republican leaders since the Clinton administration, in the belief that the United States needed to spread capitalism and democracy in its role as a world leader. The wars in Afghanistan and Iraq strengthened the president and the federal government, creating the powerful Department of Homeland Security, for example, and leading Bush to assert the power of the presidency to interpret laws through signing statements.

The association of the Republican Party with patriotism enabled Republicans in this era to call for increased spending for the military and continued tax cuts, while attacking Democratic calls for domestic programs as wasteful. Increasingly, Republican media personalities derided those who called for such programs as dangerous, or anti-American.

But while Republicans increasingly looked inward to their party as the only real Americans and asserted power internationally, changes in technology were making the world larger. The Internet put the world at our fingertips and enabled researchers to decode the human genome, revolutionizing medical science. Smartphones made communication easy. Online gaming created communities and empathy. And as many Americans were increasingly embracing rap music and tattoos and LGBTQ rights, as well as recognizing increasing inequality, books were pointing to the dangers of the power concentrating at the top of societies. In 1997, J.K. Rowling began her exploration of the rise of authoritarianism in her wildly popular Harry Potter books, but her series was only the most famous of a number of books in which young people conquered a dystopia created by adults.

In Bush’s second term, his ideology created a perfect storm. His . . .

Continue reading. There’s much more.

When ‘The American Way’ Met the Coronavirus

leave a comment »

Bryce Covert writes in the NY Times:

The end of the year has been awkward for Gov. Andrew Cuomo. As he promotes his new, self-congratulatory book about navigating New York through its first coronavirus wave in the spring, he is also battling a new surge of cases.

He’s not been too happy. At a news conference in late November, he lashed out at his constituents.

“I just want to make it very simple,” he said. “If you socially distanced and you wore a mask, and you were smart, none of this would be a problem. It’s all self-imposed. It’s all self-imposed. If you didn’t eat the cheesecake, you wouldn’t have a weight problem.”

His blunt rhetoric exemplifies how political leaders — in Washington and in red and blue states — are responding to the Covid-19 crisis. They’ve increasingly decided to treat the pandemic as an issue of personal responsibility — much as our country confronts other social ills, like poverty or joblessness.

Yes, it’s absolutely critical that we wear masks and continue to keep our distance. But these individual actions were never meant to be our primary or only response to the pandemic.

Instead, more than 10 months into this crisis, our government has largely failed to act. There is no national infrastructure for testing or tracing. States have been put in a bind by federal failure, but even so, many governors have dithered on taking large-scale actions to suppress the current surge.

As Governor Cuomo excoriated New Yorkers about mask-wearing, he took no responsibility for not shutting down indoor dining for weeks, well into the new spike.

“We’re putting a lot of faith in individual actions and individual collective wisdom to do the right thing,” Rachel Werner, the executive director of the Leonard Davis Institute of Health Economics at the University of Pennsylvania, told me, “but it’s without any leadership.”

It’s no great mystery what the government could do to control the virus. Every expert I spoke to agreed on the No. 1 priority: testing.

“The primary thing we really should have had is ubiquitous testing, and the government has just not chosen to do that,” said Ashish Jha, the dean of the School of Public Health at Brown University and an early adviser to the White House Covid task force.

States can do only so much with their limited resources to roll out a testing regime; it requires the resources and heft of the federal government. And while the year-end relief legislation provides more money for testing, it’s not nearly enough, particularly for producing and sustaining rapid testing.

Dr. Jha said that early in his time on the task force there was a lot of interest in building a robust testing system. “But it was killed by the political leadership in the White House,” he said.

Then, the Trump administration allowed financial aid to businesses and households to dry up during some of the worst months of the pandemic — and only just struck a last-minute deal to partly revive it.

The inconsistent aid had a cascading effect. Governors like Mr. Cuomo, who don’t have the budgetary ability of the federal government to extend substantial business relief, ended up in a difficult situation as the virus surged in late summer and fall. New York had to keep high-risk businesses open, it was argued, so that they could earn whatever meager revenue they could. But what is “the economy” worth if it comes at the cost of our physical well-being, our very lives?

Calling Covid restrictions “Orwellian,” Kayleigh McEnany, the White House press secretary, said on “Fox & Friends” in late November that “the American people are a freedom-loving people” who “make responsible health decisions as individuals.” That, she said, is “the American way.”

I agree with her on one point: It is the American way to champion individualism over collective obligation. In 2019, 34 million Americans officially lived below the poverty line in this country, with many millions more struggling just above it — and that number has only increased since then. We could lift every family out of poverty by sending out regular checks; other countries use taxes to fund benefits that significantly reduce their poverty rates. Poverty, then, is a policy choice.

The pandemic gave us a crystal-clear window into this. The government’s initial response kept poverty from rising. But once stimulus checks and enhanced unemployment benefits started expiring, millions of people were pushed into destitution. It took Congress months to reach a deal to send more help, and even so, the latest relief bill cut back on stimulus spending and slashed supplementary federal unemployment benefits in half.

“We’ve basically had a complete abdication of the federal response,” Gregg Gonsalves, an assistant professor in epidemiology of microbial diseases at Yale, told me when asked about the interplay between public health and economic struggles.

If we want people to take individual actions to help curb the spread of the virus, we also need to invest in their ability to do so. The government could send every household masks — a plan the Trump administration nixed early on. It could pay Americans to stay home if they feel sick, test positive or work for a business that should close for public health reasons, to avoid choosing between their health and their bills.

“If you want people to do the right thing you have to make it easy, and we’ve made it hard,” Dr. Gonsalves noted. States, too, have been told they’re on their own, with Congressional Republicans refusing to agree to the money Democrats want to send to help fill the vast hole left by the pandemic. In response, some governors seem to be prioritizing businesses over public health, handing out ineffectual curfews to restaurants and bars rather than just shutting them down.

But to help . . .

Continue reading.

The meme of the independent individual, beholden to no one, going his or her own way with no community and no responsibilities to anyone else, is constantly promoted in movies, books, and stories; think of the Sergio Leone/Clint Eastwood westerns as the archetype.

Written by LeisureGuy

29 December 2020 at 12:28 pm

Cellphones cripple social skills

leave a comment »

Ron Srigley writes in MIT Technology Review:

A few years ago, I performed an experiment in a philosophy class I was teaching. My students had failed a midterm test rather badly. I had a hunch that their pervasive use of cell phones and laptops in class was partly responsible. So I asked them what they thought had gone wrong. After a few moments of silence, a young woman put up her hand and said: “We don’t understand what the books say, sir. We don’t understand the words.” I looked around the class and saw guileless heads pensively nodding in agreement.

I extemporized a solution: I offered them extra credit if they would give me their phones for nine days and write about living without them. Twelve students—about a third of the class—took me up on the offer. What they wrote was remarkable, and remarkably consistent. These university students, given the chance to say what they felt, didn’t gracefully submit to the tech industry and its devices.

The usual industry and education narrative about cell phones, social media, and digital technology generally is that they build community, foster communication, and increase efficiency, thus improving our lives. Mark Zuckerberg’s recent reformulation of Facebook’s mission statement is typical: the company aims to “give people the power to build community and bring the world closer together.”

Without their phones, most of my students initially felt lost, disoriented, frustrated, and even frightened. That seemed to support the industry narrative: look how disconnected and lonely you’ll be without our technology. But after just two weeks, the majority began to think that their cell phones were in fact limiting their relationships with other people, compromising their own lives, and somehow cutting them off from the “real” world. Here is some of what they said.

“You must be weird or something”

“Believe it or not, I had to walk up to a stranger and ask what time it was. It honestly took me a lot of guts and confidence to ask someone,” Janet wrote. (Her name, like the others here, is a pseudonym.) She describes the attitude she was up against: “Why do you need to ask me the time? Everyone has a cell phone. You must be weird or something.” Emily went even further. Simply walking by strangers “in the hallway or when I passed them on the street” caused almost all of them to take out a phone “right before I could gain eye contact with them.”

To these young people, direct, unmediated human contact was experienced as ill-mannered at best and strange at worst. James: “One of the worst and most common things people do nowadays is pull out their cell phone and use it while in a face-to-face conversation. This action is very rude and unacceptable, but yet again, I find myself guilty of this sometimes because it is the norm.” Emily noticed that “a lot of people used their cell phones when they felt they were in an awkward situation, for an example [sic] being at a party while no one was speaking to them.”

The price of this protection from awkward moments is the loss of human relationships, a consequence that almost all the students identified and lamented. Without his phone, James said, he found himself forced to look others in the eye and engage in conversation. Stewart put a moral spin on it. “Being forced to have [real relations with people] obviously made me a better person because each time it happened I learned how to deal with the situation better, other than sticking my face in a phone.” Ten of the 12 students said their phones were compromising their ability to have such relationships.

Virtually all the students admitted that ease of communication was one of the genuine benefits of their phones. However, eight out of 12 said they were genuinely relieved not to have to answer the usual flood of texts and social-media posts. Peter: “I have to admit, it was pretty nice without the phone all week. Didn’t have to hear the fucking thing ring or vibrate once, and didn’t feel bad not answering phone calls because there were none to ignore.”

Indeed, the language they used indicated that they experienced this activity almost as a type of harassment. “It felt so free without one and it was nice knowing no one could bother me when I didn’t want to be bothered,” wrote William. Emily said that she found herself “sleeping more peacefully after the first two nights of attempting to sleep right away when the lights got shut off.” Several students went further and claimed that communication with others was in fact easier and more efficient without their phones. Stewart: “Actually I got things done much quicker without the cell because instead of waiting for a response from someone (that you don’t even know if they read your message or not) you just called them [from a land line], either got an answer or didn’t, and moved on to the next thing.”

Technologists assert that their instruments make us more productive. But for the students, phones had the opposite effect. “Writing a paper and not having a phone boosted productivity at least twice as much,” Elliott claimed. “You are concentrated on one task and not worrying about anything else. Studying for a test was much easier as well because I was not distracted by the phone at all.” Stewart found he could “sit down and actually focus on writing a paper.” He added, “Because I was able to give it 100% of my attention, not only was the final product better than it would have been, I was also able to complete it much quicker.” Even Janet, who missed her phone more than most, admitted, “One positive thing that came out of not having a cell phone was that I found myself more productive and I was more apt to pay attention in class.”

Some students felt not only distracted by their phones, but morally compromised. Kate: “Having a cell phone has actually affected my personal code of morals and this scares me … I regret to admit that I have texted in class this year, something I swore to myself in high school that I would never do … I am disappointed in myself now that I see how much I have come to depend on technology … I start to wonder if it has affected who I am as a person, and then I remember that it already has.” And James, though he says we must continue to develop our technology, said that “what many people forget is that it is vital for us not to lose our fundamental values along the way.”

Other students were worried that their cell-phone addiction was depriving them of a relationship to the world. Listen to James: “It is almost like . . .

Continue reading.

Written by LeisureGuy

27 December 2020 at 7:59 am

Backstory to Apple’s new M1 System on a Chip: How an obscure British PC maker invented ARM and changed the world

leave a comment »

Image by Jason Torchinsky

Jason Torchinsky has a fascinating article (with videos) in Ars Technica. It begins:

Let’s be honest: 2020 sucks. So much of this year has been a relentless slog of bad news and miserable events that it’s been hard to keep up. Yet most of us have kept up, and the way most of us do so is with the small handheld computers we carry with us at all times. At least in America, we still call these by the hilariously reductive name “phones.”

We can all use a feel-good underdog story right now, and luckily our doomscrolling 2020 selves don’t have to look very far. That’s because those same phones, and so much of our digital existence, run on the same thing: the ARM family of CPUs. And with Apple’s release of a whole new line of Macs based on their new M1 CPU—an ARM-based processor—and with those machines getting fantastic reviews, it’s a good time to remind everyone of the strange and unlikely source these world-controlling chips came from.

If you were writing reality as a screenplay, and, for some baffling reason, you had to specify what the most common central processing unit used in most phones, game consoles, ATMs, and other innumerable devices was, you’d likely pick one from one of the major manufacturers, like Intel. That state of affairs would make sense and fit in with the world as people understand it; the market dominance of some industry stalwart would raise no eyebrows or any other bits of hair on anyone.

But what if, instead, you decided to make those CPUs all hail from a barely-known company from a country usually not the first to come to mind as a global leader in high-tech innovations (well, not since, say, the 1800s)? And what if that CPU owed its existence, at least indirectly, to an educational TV show? Chances are the producers would tell you to dial this script back a bit; come on, take this seriously, already.

And yet, somehow, that’s how reality actually is.

In the beginning, there was TV

The ARM processor, the bit of silicon that controls over 130 billion devices all over the world and without which modernity would effectively come to a crashing halt, has a really strange origin story. Its journey is peppered with bits of seemingly bad luck that ended up providing crucial opportunities, unexpected technical benefits that would prove absolutely pivotal, and a start in some devices that would be considered abject failures.

But everything truly did sort of get set in motion by a TV show—a 1982 BBC program called The Computer Programme. This was an attempt by the BBC to educate Britons about just what the hell all these new fancy machines that looked like crappy typewriters connected to your telly were all about.

The show was part of a larger Computer Literacy Project started by the British government and the BBC as a response to fears that the UK was deeply and alarmingly unprepared for the new revolution in personal computing that was happening in America. Unlike most TV shows, the BBC wanted to feature a computer on the show that would be used to explain fundamental computing concepts and teach a bit of BASIC programming. The concepts included graphics and sound, the ability to connect to teletext networks, speech synthesis, and even some rudimentary AI. As a result, the computer needed for the show would have to be pretty good—in fact, the producers’ demands were initially so high that nothing on the market really satisfied the BBC’s aspirations.

So, the BBC put out a call to the UK’s young computer industry, which was then dominated by Sinclair, a company that made its fortune in calculators and tiny televisions. Ultimately, it was a much smaller upstart company that ended up getting the lucrative contract: Acorn Computers.

An Acorn blooms

Acorn was a Cambridge-based firm that started in 1979 after developing computer systems originally designed to run fruit machines—we call them slot machines—then turning them into small hobbyist computer systems based on 6502 processors. That was the same CPU family used in the Apple II, Atari 2600, and Commodore 64 computers, among many others. This CPU’s design will become important later, so, you know, don’t forget about it.

Acorn had developed a home computer called the Atom, and when the BBC opportunity arose, they started plans for the Atom’s successor to be developed into what would become the BBC Micro.

The BBC’s demanding list of features ensured the resulting machine would be quite powerful for the era, though not quite as powerful as Acorn’s original Atom-successor design. That Atom successor would have featured two CPUs, a tried-and-true 6502 and an as-yet undecided 16-bit CPU.

Acorn later dropped that CPU but kept an interface system, called the Tube, that would allow for additional CPUs to be connected to the machine. (This too will become more important later.)

The engineering of the BBC Micro really pushed Acorn’s limits, as it was a pretty state-of-the-art machine for the era. This resulted in some fascinatingly half-ass but workable engineering decisions, like having to replicate the placement of an engineer’s finger on the motherboard with a resistor pack in order to get the machine to work.

Nobody ever really figured out why the machine only worked when a finger was placed on a certain point on the motherboard, but once they were able to emulate the finger touch with resistors, they were just satisfied it worked, and moved on.

Here, listen to one of the key engineers tell you himself: . . .

Continue reading. There’s much more, and it’s fascinating (to me, at any rate).

Written by LeisureGuy

24 December 2020 at 1:16 pm

“All I Want for Christmas,” Star-Trek style

leave a comment »

Written by LeisureGuy

24 December 2020 at 12:54 pm

Posted in Memes, Movies & TV, Music, Video

An English word that has come down directly from Proto-Indo-European

leave a comment »

Sevindj Nurkiyazova writes in Nautilus:

One of my favorite words is lox,” says Gregory Guy, a professor of linguistics at New York University. There is hardly a more quintessential New York food than a lox bagel—a century-old popular appetizing store, Russ & Daughters, calls it “The Classic.” But Guy, who has lived in the city for the past 17 years, is passionate about lox for a different reason. “The pronunciation in the Proto-Indo-European was probably ‘lox,’ and that’s exactly how it is pronounced in modern English,” he says. “Then, it meant salmon, and now it specifically means ‘smoked salmon.’ It’s really cool that that word hasn’t changed its pronunciation at all in 8,000 years and still refers to a particular fish.”

How scholars have traced the word’s pronunciation over thousands of years is also really cool. The story goes back to Thomas Young, also known as “The Last Person Who Knew Everything.” The 18th-century British polymath came up with the wave theory of light, first described astigmatism, and played a key role in deciphering the Rosetta Stone. Like some people before him, Young noticed eerie similarities between Indic and European languages. He went further, analyzing 400 languages spread across continents and millennia and proved that the overlap between some of them was too extensive to be an accident. A single coincidence meant nothing, but each additional one increased the chance of an underlying connection. In 1813, Young declared that all those languages belong to one family. He named it “Indo-European.”

Today, roughly half the world’s population speaks an Indo-European language. That family includes 440 languages spoken across the globe, including English. The word yoga, for example, which comes from Sanskrit, the language of ancient India, is a distant relative of the English word yoke. The nature of this relationship puzzled historical linguists for two centuries.

In modern English, well over half of all words are borrowed from other languages. To trace how language changes over time, linguists developed an ingenious toolkit. “Some parts of vocabulary are more stable and don’t change as much. The linguistic term [for these words] is ‘a core vocabulary.’ These are numbers, colors, family relations like ‘mother,’ ‘father,’ ‘sister,’ ‘brother,’ and basic verbs like ‘walk’ and ‘see,’” says Guy. “If you look at words of that sort in different languages, it becomes fairly clear which ones are related and which ones are not. For example, take the English word for number two, which is dva in Russian and deux in French, or the word night, which is nacht in German and noch in Russian.”

Analyzing the patterns of change that words undergo, moving from one language to another, showed how to unwind these changes and identify the possible originals. “Reconstructed vocabulary of Indo-European is based on a comparison of descendant languages,” explains Guy. “You collect words that mean more or less the same thing in all the languages, and if they look like each other in terms of their pronunciation, then it’s a good candidate for a descendant from a common ancestor.” The English word honey is madhu in Sanskrit and myod in Russian. Sanskrit and Russian haven’t shared a common ancestor since Indo-European, so these words had to come from the same source. (There are also the words mead in English, met in German and mjød in Danish that refer to an alcoholic drink made from honey.)

After discovering a word that might have existed in the Indo-European, linguists compared how its pronunciations changed from language to language. For example, sound [k] changes to [h] from Latin to Germanic, and the Latin word casa transforms into the English house while the French word cœur transforms into the English heart. With hints like that, linguists could undo the sound changes and trace the original pronunciation. In several thousand years, most words change beyond recognition, like the word wheel, which initially might have sounded “kʷékʷlos.” But there were some remarkable exceptions—like the timeless lox.

The family tree of the Indo-European languages sprawls across Eurasia, including such different species as English and Tocharian B, an extinct language once spoken on the territory of Xinjiang in modern China. In Tocharian B, the word for “fish/salmon” is laks, similar to German lachs, and Icelandic lax—the only ancestor all these languages share is the Proto-Indo-European. In Russian, Czech, Croatian, Macedonian, and Latvian, the [k] sound changed to [s], resulting in the word losos.

This kind of millennia-long semantic consistency also appears in other words. For example, the Indo-European porkos, similar to modern English pork, meant a young pig. “What is interesting about the word lox is that it simply happened to consist of sounds that didn’t undergo changes in English and several other daughter languages descended from Proto-Indo-European,” says Guy. “The sounds that change across time are unpredictable, and differ from language to language, and some may not happen to change at all.”

The word lox was one of the clues that eventually led linguists to discover who the Proto-Indo-Europeans were, and where they lived. The fact that those distantly related Indo-European languages had almost the same pronunciation of a single word meant that the word—and the concept behind it—had most likely existed in the Proto-Indo-European language. “If they had a word for it, they must have lived in a place where there was salmon,” explains Guy. “Salmon is a fish that lives in the ocean, reproduces in fresh water and swims up to rivers to lay eggs and mate. There are only a few places on the planet where that happens.”

In reconstructed Indo-European, there were words for bear, honey, oak tree, and snow, and, which is . . .

Continue reading.

There’s also a good discussion of this in David Anthony’s fascinating book The Horse, the Wheel, and Language: How Bronze-Age Riders from the Eurasian Steppes Shaped the Modern World.

See also this earlier post and this one as well.
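The regularity Guy describes is what makes the comparative method work, and a toy sketch can make the logic concrete. The little Python script below is emphatically not real historical linguistics: the correspondence table covers only word-initial consonants, the spellings are rough, and the word list is my own invention for the demonstration. Still, it shows the idea: once you observe that Latin c- regularly answers to English h- (and p- to f-, d- to t-), a proposed cognate pair either fits the established pattern or it does not.

    # A toy illustration of the "regular sound correspondence" idea in the article.
    # The rule table and word pairs are simplified assumptions chosen for the demo;
    # this is not a real reconstruction procedure.

    # Regular word-initial correspondences (Latin side : English side)
    correspondences = {
        "c": "h",   # canis : hound, cornu : horn, cor/coeur : heart
        "p": "f",   # pater : father, piscis : fish, pes : foot
        "d": "t",   # duo : two, dens : tooth, decem : ten
    }

    # Candidate cognate pairs to test against the pattern (the last is deliberately wrong)
    candidate_pairs = [
        ("cornu", "horn"),
        ("pater", "father"),
        ("duo",   "two"),
        ("canis", "cat"),   # "cat" is not the English cognate of canis; "hound" is
    ]

    def fits_pattern(latin: str, english: str) -> bool:
        """Return True if the pair shows the expected initial-consonant correspondence."""
        expected = correspondences.get(latin[0])
        return expected is not None and english.startswith(expected)

    for latin, english in candidate_pairs:
        verdict = "fits the pattern" if fits_pattern(latin, english) else "does not fit"
        print(f"{latin:7s} : {english:7s} -> {verdict}")

Real reconstruction applies the same test across whole sound systems and hundreds of cognate sets, which is why an isolated resemblance counts for little while a recurring correspondence counts for a great deal.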

Written by LeisureGuy

24 December 2020 at 11:36 am

Posted in Books, Daily life, Memes
