Later On

A blog written for those whose interests more or less match mine.

Archive for October 22nd, 2019

Why does the NY Times so clearly despise Hillary Clinton?


The NY Times' incessant drumbeat about the Whitewater non-scandal was just the beginning. Kevin Drum has a brief post worth reading on how the NY Times still refuses to acknowledge its role in the 2016 election.

Written by LeisureGuy

22 October 2019 at 4:49 pm

Posted in Election, NY Times

Some news about the Best Healthcare System in the World™


From the NY Times Evening Briefing today:

More than a million children have lost health insurance in the last two years, according to new Census Bureau data, disappearing from the rolls of the two main state-federal health programs for lower-income children.

There is growing evidence that administrative changes aimed at fighting fraud and waste — and rising fears of deportation in immigrant communities — are to blame.

At the same time, prices for the most popular plan offered through Obamacare will actually drop next year, and the number of insurers offering plans will go up, after several years of turmoil caused in part by President Trump’s aggressive efforts to undermine it.

Written by LeisureGuy

22 October 2019 at 4:46 pm

Trump still has the mind of a boy in junior high


And he doesn’t like to have anything he says challenged, much less corrected. That’s how he avoids learning.

Written by LeisureGuy

22 October 2019 at 4:24 pm

How Democrats Became the Party of Monopoly and Corruption


This is an extract published in Vice from Matt Stoller’s book Goliath: The 100-Year War Between Monopoly Power and Democracy:

In 1985, the Dow Jones average jumped 27.66 percent. Making money in stocks, as a journalist put it, “was easy.” With lower interest rates, low inflation, and “takeover fever,” investors could throw a dart at a list of stocks and profit. The next year was also very good. The average gain of a Big Board stock in 1986 was 14 percent, with equity market indexes closing at a record high.

For the top performers, the amounts of money involved were staggering. In 1987, Michael Milken awarded himself $550 million in compensation. In New York City, spending by bankers—a million dollars for curtains for a Fifth Avenue apartment, a thousand dollars for a vase of precious roses for a party—was obscene. A major financier announced in the Hamptons one night that “if you have less than 750 million, you have no hedge against inflation.” In Paris, a jeweler “dazzled his society guests when topless models displayed the merchandise between courses.” In west Los Angeles, the average price of a house in Bel Air rose to $4.6 million. There was so much money it was nicknamed “green smog.”

Ambitious men now wanted to change the world through finance. Bruce Wasserstein had been a “Nader’s Raider” consumer advocate; he now worked at First Boston as one of the most successful mergers and acquisitions bankers of the 1980s. Michael Lewis wrote his best-seller Liar’s Poker as a warning of what unfettered greed in finance meant, but instead of learning the lesson, students deluged him with letters asking if he “had any other secrets to share about Wall Street.” To them, the book was a “how-to manual.”

Finance was the center, but its power reached outward everywhere. The stock market was minting millionaires in a collection of formerly sleepy towns in California. Sunnyvale, Mountain View, Los Altos, Cupertino, Santa Clara, and San Jose in the 1960s had been covered with “apricot, cherry and plum orchards,” and young people there often took summer jobs at local canneries. Immediately after Reagan’s election, in December of 1980, Apple Computer went public, instantly creating 300 millionaires, and raising more money in the stock market than any company since Ford Motor had in its initial public offering of shares in 1956. A young Steve Jobs was instantly worth $217 million.

Meanwhile, the family farmer had lots of people who said they were friends at election time—even the glamorous music industry put on a giant “Farm Aid” concert in 1985 to raise money for bankrupt growers. But there was no populist leader like Congressman Wright Patman had been during the New Deal in the Democratic Party anymore. On the contrary, “new” Democrats like Dale Bumpers and Bill Clinton of Arkansas worked to rid their state of the usury caps meant to protect the “plain people” from the banker and financier. And the main contender for the Democratic nomination in 1988, the handsome Gary Hart, with his flowing—and carefully blow-dried—chestnut brown hair, spoke a lot about “sunrise” industries like semiconductors and high-tech, but had little in his vision incorporating the family farm.

It wasn’t just the family farmer who suffered. On the South Side of Chicago, U.S. Steel, having started mass layoffs in 1979, continued into the next decade, laying off more than 6,000 workers in that community alone. Youngstown, Johnstown, Gary—all the old industrial cities were going, in the words of the writer Studs Terkel, from “Steel Town” to “Ghost Town.” And the headlines kept on coming. John Deere idled 1,500 workers, GE’s turbine division cut 1,500 jobs, AT&T laid off 2,900 in its Shreveport plant, Eastern Air Lines fired 1,010 flight attendants, and docked pay by 20 percent. “You keep saying it can’t get worse, but it does,” said a United Auto Workers member.

And all the time, whether in farm country or steel country, the closed independent shop and the collapsed bank were as much monuments to the new political order as the sprouting number of Walmarts and the blizzard of junk-mail credit cards from Citibank. As Terkel put it, “In the thirties, an Administration recognized a need and lent a hand. Today, an Administration recognizes an image and lends a smile.”

Regional inequality widened, as airlines cut routes to rural, small, and even medium-sized cities. So did income inequality, the emptying farm towns, the hollowing of manufacturing as executives began searching for any way to be in any business but one that made things in America. It wasn’t just the smog and the poverty, the consumerism, the debt, and the shop-till-you-drop ethos. It was the profound hopelessness.

Within academic and political institutions, Americans were taught to believe their longing for freedom was immoral. Power was re-centralizing on Wall Street, in corporate monopolies, in shopping malls, in the way they paid for the new consumer goods made abroad, in where they worked and shopped. Yet policymakers, reading from the scripts prepared by Chicago School of Economics “experts,” spoke of these changes as natural, “scientific,” a result of consumer preferences, not the concentration of power.


By the time of the 1992 election, there was a sullen mood among the voters, similar to that of 1974. “People are outraged at what is going on in Washington. Part of it had to do with pay raises, part of it has to do with banks and S&Ls and other things that are affecting my life as a voter,” said a pollster. That year, billionaire businessman Ross Perot ran the strongest third-party challenge in American history, capitalizing on anger among white working-class voters, the Democrats who had switched over to Reagan in the 1980s. He did so by pledging straightforward protectionism for U.S. industry, attacking the proposed North American Free Trade Agreement (NAFTA), and political corruption. Despite a bizarre campaign in which he withdrew and then reentered the race, Perot did so well he shattered the Republican coalition, helping throw the election to the Democrats. There would be one last opportunity for the Democrats to rebuild their New Deal coalition of working-class voters.

The winner of the election, Bill Clinton, looked like he might do so. He had run a populist campaign using the slogan “Putting People First.” He attacked the failed economic theory of Reagan, criticized tax cuts for the rich and factory closings, and pledged to protect Americans from foreign and domestic threats. “For too long, those who play by the rules and keep the faith have gotten the shaft,” Clinton said. “And those who cut corners and cut deals have been rewarded.” His campaign’s internal slogan was “It’s the economy, stupid,” and the 1992 Democratic platform used the word “revolution” 14 times.

As a candidate, Clinton’s Democratic platform called for a “Revolution of 1992,” capturing the anger of the moment. But the platform was written by centrist Democratic Leadership Council boss Al From, and for the first time since 1880 there was no mention of antitrust or corporate power, despite a decade with the worst financial manipulation America had seen since the 1920s. This revolution would be against government, in government, around government. In 1993, a book came out on lobbying in Washington. Wayne Thevenot, a Clinton donor, laid out the new theme of the modern Democratic Party: “I gave up the idea of changing the world. I set out to get rich.”

Like Reagan, Clinton went after restrictions on banking. Reagan sought to free restrictions on finance by allowing banks and non-banks to enter new lines of business. Clinton continued this policy, but over the course of his eight years attacked restrictions on banks themselves. In 1994, the Clinton administration and a Democratic Congress passed the Riegle-Neal Interstate Banking and Branching Efficiency Act, which allowed banks to open up branches across state lines. Clinton appointed Robert Rubin as his treasury secretary, super-lawyer Eugene Ludwig to run the Office of the Comptroller of the Currency, and reappointed Alan Greenspan as the chairman of the Federal Reserve.

All three men worked hard through regulatory rulemaking to allow unfettered trading in derivatives, to break down the New Deal restrictions prohibiting commercial banks from entering the trading business, and to let banks take more risks with less of a cushion. Citigroup finally got an insurance arm, merging with financial conglomerate Travelers Group, approved by Greenspan, who granted the authority for the acquisition under the Bank Holding Company Act. In 1999, Clinton and a now-Republican Congress passed the Gramm-Leach-Bliley Act, which fully repealed the Glass-Steagall Act that had shattered the Houses of J.P. Morgan and Andrew Mellon. The very last bill Clinton signed was the Commodity Futures Modernization Act of 2000, which removed public rules limiting the use of exotic gambling instruments known as derivatives by now-enormous banks.

Clinton signed the Telecommunications Act of 1996, which he touted as “truly revolutionary legislation,” and this began the process of reconsolidating the old AT&T as the “Baby Bells” merged. At the signing ceremony, actress Lily Tomlin reprised her role as a Ma Bell operator. Huge pieces of the AT&T network came back together, as Baby Bells merged from seven to three. Clear Channel grew from 40 radio stations to 1,240. In 1996, the Communications Decency Act was signed, with Section 230 of the Act protecting certain internet businesses from being liable for wrongdoing that occurred on their platform. While not well understood at the time, Section 230 was one policy lever that would enable a powerful set of internet monopolies to emerge in the next decade.

Clinton also sped up the corporate takeover of rural America by allowing a merger wave in farm country. Food companies had always had some power in America, but before the Reagan era, big agribusinesses were confined to one or two stages of the food system. In the 1990s, the agricultural sector consolidated under a small number of sprawling conglomerates that organized the entire supply chain. Cargill, an agricultural conglomerate that was the largest privately owned company in America, embarked on a series of mergers and joint ventures, buying the grain-trading operations of its rival, Continental Grain Inc., as well as Akzo Salt, thus becoming one of the largest salt production and marketing operations in the world.

Monsanto consolidated the specialty chemicals and seed markets, buying up DeKalb Genetics and cotton-seed maker Delta & Pine Land. ConAgra, marketing itself as selling at every link of the supply chain from “farm gate to dinner plate,” bought International Home Foods (the producer of Chef Boyardee pasta and Gulden’s mustard), Knott’s Berry Farm Foods, Gilroy Foods, Hester Industries, and Signature Foods. As William Heffernan, a rural sociologist at the University of Missouri, put it in 1999, a host of formal and informal alliances such as joint ventures, partnerships, contracts, agreements, and side agreements ended up concentrating power even further into “clusters of firms.” He identified three such clusters—Cargill/Monsanto, ConAgra, and Novartis/ADM—as controlling the global food supply.

The increase in power of these trading corporations meant that profit would increasingly flow to middlemen, not farmers themselves. Montana senator Conrad Burns complained his state’s farmers were “getting less for our products on the farm now than we did during the Great Depression.” The Montana state legislature passed a resolution demanding vigorous antitrust investigations into the meatpacking, grain-handling, and food retail industries, and the state farmer’s union asked for a special unit at the Department of Justice to review proposed agricultural mergers. There was so little interest in the Clinton antitrust division that when Burns held a Senate Commerce Committee hearing on concentration in the agricultural sector, the assistant attorney general for antitrust, Joel Klein, didn’t bother to show up. “Their failure to be here to explain their policies to rural America,” said Burns, “speaks volumes about what their real agenda is.”


In the Reagan era, Walmart had already become the most important chain store in America, surpassing the importance of A&P at the height of its power. But it was during the Clinton administration that the company became a trading giant. First, the corporation jumped in size, replacing the auto giant GM as the top private employer in America, growing to 825,000 employees in 1998 while planting a store in every state. The end of antitrust enforcement in the retail space meant that Walmart could wield its buying power to restructure swaths of industries and companies, from pickle producers to Procter & Gamble. Clinton allowed Walmart to reorder world trade itself. Even in the mid-1990s, only a small percentage of its products were made abroad. But the passage of NAFTA—which eliminated tariffs on Mexican imports—as well as Clinton’s embrace of Chinese imports, allowed Walmart to force its suppliers to produce where labor and environmental costs were lowest. From 1992 to 2000, America’s trade deficit with China jumped from $18 billion to $84 billion, while it went from a small trade surplus to a $25 billion trade deficit with Mexico. And Walmart led the way. By 2003, consulting firm Retail Forward estimated more than half of Walmart merchandise was made abroad.

Clinton administration officials were proud of Walmart, and this new generation of American trading monopolies, dubbing them part of a wondrous “New Economy” underpinned by information technology. “And if you think about what this new economy means,” said Clinton deputy treasury secretary Larry Summers in 1998 at a conference for investment bankers focusing on high-tech, “whether it is AIG in insurance, McDonald’s in fast-food, Walmart in retailing, Microsoft in software, Harvard University in education, CNN in television news—the leading enterprises are American.”

It was also under Clinton that the last bastion of the New Deal coalition—a congressional majority held by the Democrats since the late 1940s—fell apart as the last few holdout southern Democrats were finally driven from office or switched to the Republican Party. And it was under Clinton that the language of politics shifted from that of equity, justice, and potholes to the finance-speak of redistribution, growth and investment, and infrastructure decay.

The Democratic Party embraced not just the tactics, but the ideology of the Chicago School. As one memo from Clinton’s Council of Economic Advisors put it, “Large size is not the same as monopoly power. For example, an ice cream vendor at the beach on a hot day probably has more market power than many multi-billion-dollar companies in competitive industries.”

During the 12 years of the Reagan and Bush administrations, there were 85,064 mergers valued at $3.5 trillion. Under just seven years of Clinton, there were 166,310 deals valued at $9.8 trillion. This merger wave was larger than that of the Reagan era, and larger even than any since the turn of the twentieth century, when the original trusts were created. Hotels, hospitals, banks, investment banks, defense contractors, technology, oil—everything was merging.

The Clinton administration organized this new concentrated American economy through regulatory appointments and through non-enforcement of antitrust laws. Sometimes it even seemed they had put antitrust enforcement itself up for sale. In 1996, Thomson Corporation bought West Publishing, creating a monopoly in digital access to court opinions and legal publishing; the owner of West had given a half a million dollars to the Democratic Party and personally lobbied Clinton to allow the deal. The DOJ even approved the $81 billion Exxon and Mobil merger, restoring a chunk of the Rockefeller empire. . .

Continue reading.

See also “The Con-Artist Wing of the Democratic Party,” by the same author, Matt Stoller.

Written by LeisureGuy

22 October 2019 at 2:53 pm

Raw and red-hot: Inflammation and chronic illness


Jonathan Shaw writes in Harvard Magazine:

In 2007, associate professor of medicine Samia Mora and colleagues published a study of exercise that sought to understand why physical activity is salutary. They already knew that exercise reduces the risk of cardiovascular disease as much as cholesterol-lowering statin drugs do. By analyzing biomarkers in the blood of 27,055 women participating in a long-term study, and other objective measures, they hoped to tease out the source of this effect. How much of the benefit was attributable to improved blood pressure? To lower body weight? Or to something else? The women had donated blood in the 1990s when they entered the study. Eleven years later, the researchers analyzed this frozen blood to see if they could find anything that correlated with long-term cardiovascular outcomes such as heart attack and stroke. “We were actually surprised that reduced inflammation was the biggest explainer, the biggest contributor to the benefit of activity,” says Mora, “because we hadn’t hypothesized that. We knew that regular exercise does reduce inflammation over the long term, but we also knew that acute exercise transiently increases inflammatory biomarkers during and immediately after exertion.” About a third of the benefit of regular exercise, they found, is attributable to reduced inflammation.

The anti-inflammatory effect of exercise was much greater than most people had expected. That raised another question: whether inflammation might also play a dominant role in other lifestyle illnesses that have been linked to cardiovascular disease, such as diabetes and dementia.

In 2017, two cardiologists at Brigham and Women’s Hospital in Boston, who suspected such a link, published the results of a human clinical trial that will forever change the way people think about inflammation. The trial, which involved more than 10,000 patients in 39 countries, was primarily designed to determine whether an anti-inflammatory drug, by itself, could lower rates of cardiovascular disease in a large population, without simultaneously lowering levels of cholesterol, as statin drugs do. The answer was yes. But the researchers went a step further, building into the trial additional tests seeking to clarify what effect the same anti-inflammatory drug, canakinumab, might have on illnesses seemingly unrelated to cardiovascular disease: arthritis, gout, and cancer. Only the researchers themselves, and their scientific colleagues, were unsurprised by the outcome. Lung cancer mortality dropped by as much as 77 percent. Reports of arthritis and gout also fell significantly.

In medicine, believing something is true is not the same as being able to prove it. Because the idea that inflammation—constant, low-level, immune-system activation—could be at the root of many noncommunicable diseases is a startling claim, it requires extraordinary proof. Can seemingly unconnected illnesses of the brain, the vasculature, lungs, liver, and joints really share a deep biological link? Evidence has been mounting that these common chronic conditions—Alzheimer’s, cancer, arthritis, asthma, gout, psoriasis, anemia, Parkinson’s disease, multiple sclerosis, diabetes, and depression among them—are indeed triggered by low-grade, long-term inflammation. But it took that large-scale human clinical trial to dispel any lingering doubt: the immune system’s inflammatory response is killing people by degrees.

Now the pertinent question is why, and what can be done about it. The pharmaceutical industry is deeply interested in finding ways to stop inflammation with medicines like canakinumab, an orphan drug that blocks a specific pro-inflammatory pathway called IL-1beta. But some researchers suggest that the inflammatory process—a normal and necessary part of the natural immune response—has itself been misunderstood. Scientists know that the process can be turned on and off, but have only recently understood that this doesn’t mean normal physiology will resume once the inflammation caused by infection, injury, or irritant has been shut down. Instead, the restoration of health is an active phase of the inflammatory process itself, facilitated by a little-known class of molecules called pro-resolving mediators—the protectins, resolvins, maresins, and lipoxins—brimming with marvelous, untapped, regenerative capacities.

Origins of Atherosclerosis

The 2017 clinical trial, called CANTOS (Canakinumab Anti-Inflammatory Thrombosis Outcomes Study), is the result of a long-term collaboration between Paul Ridker and Peter Libby, who suspected as long ago as the 1980s that inflammation played a role in cardiovascular disease. Ridker, an epidemiologist who is Braunwald professor of medicine, came to this conclusion through studies of cardiac patients. He is the physician-scientist who first demonstrated that a molecule called C-reactive protein (CRP), easily measured by a simple and now ubiquitous blood test, could be used like a thermometer to take the temperature of a patient’s inflammation. Elevated CRP, he discovered years ago, predicts future cardiovascular events, including heart attacks. Although nobody knows what it does biologically, this marker is downstream from IL-1beta, and thus provides a reliable yardstick of that pro-inflammatory pathway’s level of activation.

Libby, the Mallinckrodt professor of medicine, is a bench scientist and clinician with expertise in the study of heart disease. In the 1980s, orthodoxy within the cardiovascular establishment held that circulating fats (including cholesterol) build up in the arteries of patients with progressive cardiovascular disease. But no one knew why or how the plaques formed. It took work by some of the most distinguished cardiology researchers of the era to lay the groundwork that eventually produced an understanding of the molecular mechanisms that drive deposition of those plaques.

Today, in his Harvard Medical School (HMS) office, Libby sketches the origins of atherosclerosis. The interior walls of blood vessels, he explains, are made from smooth muscle cells, lined in turn with endothelial cells that are in direct contact with circulating blood. When a problem arises, caused by anything from cholesterol to bacteria, the vascular system recruits white blood cells, the immune system’s front-line guardians, to the site. Two Harvard professors of pathology, Michael Gimbrone Jr. and the late Ramzi Cotran, figured out that naturally occurring adhesion molecules could attract these white blood cells and get them to stick to the endothelium lining the arteries. Their experiments implicated a pro-inflammatory signal called interleukin-1 (IL-1), which is produced by both circulating and tissue-based immune cells.

Libby, then at Tufts, followed their work closely. IL-1 had been discovered in 1977 by one of his Tufts colleagues, Charles Dinarello, who had been focused on understanding what causes fever, one of the cardinal signs of inflammation. The others, described by Aulus Cornelius Celsus in the first century C.E., are redness (rubor), which occurs when the endothelial lining of arteries dilates to permit more blood flow; swelling (tumor), caused by endothelial cells leaking protein, which carries water; and pain (dolor). By measuring the factors in rabbits’ blood, Dinarello was able to isolate and then clone the specific factor—called a pyrogen—that causes fever: interleukin-1.

But before this inflammatory pyrogen even had a name, Dinarello gave some to Libby, whose lab was down the hall. Libby discovered that arterial wall cells not only responded to IL-1, but could secrete it. This was heretical, he explains, because it was thought that only “a properly pedigreed immunological cell” could produce such signals. Now it was clear that the cells of the artery walls were capable of summoning an immune response. Libby further found that IL-1, by altering gene expression in local blood vessel cells, amplifies its signal at the site of the disease. But, “No one in cardiology was interested in inflammation” then, Libby notes. In fact, none of his early work on IL-1 appears in the cardiology literature: “My papers were rejected,” he recalls; “my grants turned down.” In 1986, he published in the American Journal of Pathology his first paper showing that the lining of the arteries could produce IL-1.

The Evolution of Excessive Inflammation

Today, inflammation is a focus of intense research in many fields. Roni Nowarski, assistant professor of neurology and immunology, explains that inflammation is important across a range of seemingly distinct pathologies because immune cells are everywhere, even resident in organs, where they play an important role in monitoring and maintaining health. The paradigm that everyone knows—that the immune system’s front line consists of circulating white blood cells that patrol the body to guard against infection and injury—is a bit misleading. An important arm of the immune system resides outside the blood vessels. Pac-Man-like macrophages occupy tissues, where they engulf and digest invading pathogens, debris, and dying cells. An invaluable role of these tissue macrophages is to “act as sensors,” Nowarski says. “They have hard-wired mechanisms to detect signals that are out of the ordinary,” and so play a critical role in maintaining a healthy equilibrium. “If there is any fluctuation,” he notes, “the role of these cells is to return the system to this point of homeostasis.”

These tissue-based white blood cells can also call for backup. When that happens . . .

Continue reading.

Written by LeisureGuy

22 October 2019 at 1:10 pm

Under digital surveillance: How American schools spy on millions of kids


The US seems to be putting in place the tools to transition to an authoritarian state (for the public’s own good, of course). Law enforcement agencies (police departments, ICE, TSA, and others) are already aggressive (and sometimes lethal) in their interactions with citizens and visitors, and now a new generation of students is learning how to live under constant surveillance by the state, which seems likely to be a useful skill given the direction the US is going.

Lois Beckett reports in the Guardian:

For Adam Jasinski, a technology director for a school district outside of St Louis, Missouri, monitoring student emails used to be a time-consuming job. Jasinski used to do keyword searches of the official school email accounts for the district’s 2,600 students, looking for words like “suicide” or “marijuana”. Then he would have to read through every message that included one of the words. The process would occasionally catch some concerning behavior, but “it was cumbersome”, Jasinski recalled.

Last year Jasinski heard about a new option: following the school shooting in Parkland, Florida, the technology company Bark was offering schools free, automated, 24-hour-a-day surveillance of what students were writing in their school emails, shared documents and chat messages, and sending alerts to school officials any time the monitoring technology flagged concerning phrases.

The automated alerts were a game-changer, said Jason Buck, the principal of the Missouri district’s middle school. One Friday evening last fall, Buck was watching television at home when Bark alerted him that one of his students had just written an email to another student talking about self-harm. The principal immediately called the first student’s mother: “Is the student with you?” he asked. “Are they safe?”

Before his school used Bark, the principal said, school officials would not know about cyberbullying or a student talking about hurting themselves unless one of their friends decided to tell an adult about it. Now, he said, “Bark has taken that piece out of it. The other student doesn’t have to feel like they’re betraying or tattling or anything like that.”

Although students at his school are aware they’re being monitored, they were surprised at first at how quickly school administrators could follow up on what they had typed, Buck said. “It’s not, ‘Hey, I sent this email two days ago,’ [it’s] ‘You just sent this email three minutes ago, let’s talk.’”

Bark and similar tech companies are now monitoring the emails and documents of millions of American students, across thousands of school districts, looking for signs of suicidal thoughts, bullying or plans for a school shooting.

The new school surveillance technology doesn’t turn off when the school day is over: anything students type in official school email accounts, chats or documents is monitored 24 hours a day, whether students are in their classrooms or their bedrooms.

Tech companies are also working with schools to monitor students’ web searches and internet usage, and, in some cases, to track what they are writing on public social media accounts.

Parents and students are still largely unaware of the scope and intensity of school surveillance, privacy experts say, even as the market for these technologies has grown rapidly, fueled by fears of school shootings, particularly in the wake of the Parkland shooting in February 2018, which left 17 people dead.

Digital surveillance is just one part of a booming, nearly $3bn-a-year school security industry in the United States, where Republican lawmakers have blocked any substantial gun control legislation for a quarter century.

“Schools feel massive pressure to demonstrate that they’re doing something to keep kids safe. This is something they can spend money on, roll out and tell parents, this is what we’re doing,” said Chad Marlow, a privacy expert at the American Civil Liberties Union.

Unlike gun control, Marlow said, “Surveillance is politically palatable, and so they’re pursuing surveillance as a way you can demonstrate action, even though there’s no evidence that it will positively impact the problem.”

Huge growth

There is still no independent evaluation of whether this kind of surveillance technology actually works to reduce violence and self-harm. Privacy experts say pervasive monitoring may hurt children, and may be particularly dangerous for students with disabilities and students of color.

Despite the lack of research evidence, tech companies are marketing school monitoring technologies with bold claims of hundreds of lives saved, mostly through prevention of youth suicide attempts.

Gaggle, a leading provider of school email and shared document monitoring, says its technology is currently used to monitor 4.5 million students across 1,400 school districts. The company claims that in the last academic year alone its technology “helped districts save the lives of more than 700 students who were planning or actually attempting suicide”.

Bark says it works with at least 1,400 school districts across the country, and claims its technology has helped prevent “16 credible school shootings” and detected “twenty thousand severe self-harm situations”.

Securly, another leading provider, says its products are used to protect 10 million students across 10,000 individual schools. In the past year, Securly said it helped school officials intervene in 400 situations that presented an “imminent threat”.

The companies’ statistics on lives saved are based on their own anecdotal data, and have not been independently evaluated.

“I heard from a lot of districts that in the weeks after Parkland, they were getting nonstop email solicitations from all sorts of brand new, or fairly new companies specializing in social media, that were saying, ‘We can fix your problems,’ and a lot of them were adopting it,” Amelia Vance, the director of education privacy at the Future of Privacy Forum, said.

“Some people think that technology is magic, that artificial intelligence will save us,” Vance said. “A lot of the questions and a lot of the privacy concerns haven’t [been] thought of, let alone addressed.”

How it works

In Florence, South Carolina, school officials intervened after a middle school student started writing about suicide while working on an in-class English assignment. The phrases she typed in a Google document triggered an alert from Gaggle, the surveillance company working with the school district. “Within minutes”, the student was pulled out of class for a conversation with school officials, according to Dr Richard O’Malley, the district superintendent.

In Cincinnati, Ohio, the school district’s chief information officer had to call the police in the middle of the night to conduct a wellness check on a student who had been flagged by Gaggle for writing about self-harm. The situation was serious enough that the student was hospitalized to receive mental health services, the chief information officer, Sarah Trimble-Oliver, said.

In rural Weld county, Colorado, a school official got an alert from GoGuardian, a company that monitors students’ internet searches, that a student was doing a Google search for “how to kill myself” late one evening. The official worked with a social worker to call law enforcement to conduct an in-person safety check at the student’s home, said Dr Teresa Hernandez, the district’s chief intervention and safety officer. When the student’s mother answered the door, she was confused, and said that her child had been upstairs sleeping since 9pm. “We had the search history to show, actually, no, that’s not what was going on,” Hernandez said.

Federal law requires that American public schools block access to harmful websites, and that they “monitor” students’ online activities. What exactly this “monitoring” means has never been clearly defined: the . . .

Continue reading.

I’m sure this information will be useful to law enforcement agencies as well.

Written by LeisureGuy

22 October 2019 at 12:57 pm

Breaking News: A New Gillette DE Razor!

with 6 comments

The Heritage DE razor by Gillette will ship 1 November. Yes, I did pre-order one. Sharpologist has more info. Note that the “stainless steel” in the description refers to the blade, not the razor, which is probably chrome-plated zinc alloy (like the Edwin Jagger). It might be brass (like the old Gillette razors), but I doubt it.

I would say that this is a strong indication of the growth of the traditional shaving movement. In the prefaces to the Guide, I list several such indications, most recently in the preface to the 7th edition:

I’ve seen considerable change over the years of writing and revising the book: more vendors, more forums, more new razors, more men using a double-edge (DE) razor and true lather. The remarkable growth in traditional shaving with a DE razor gets its energy and impetus from a promise fulfilled: DE shaves really are better, and cheaper, and actually enjoyable.

The rapid growth of the wet-shaving movement, using single-blade razors—double-edge (DE) safety razors, the focus of this book, and also straight razors (SR) and single-edge (SE) safety razors—is due to several factors:

  • Increasing awareness by men of their skin’s health and health issues.
  • Increasing visibility and availability of traditional shaving tools: more vendors and more products, including razors.
  • Increasing prices of multiblade cartridges and canned foam, coupled with an economy that has required many to find cheaper lifestyles.

Over the past couple of years the third reason has come into high relief as the global economy stuttered and faltered, and in this edition I identify many good low-cost options for razors, brushes, soaps, and shaving creams.

I would add that using a DE razor and true lather is more enjoyable than using a cartridge razor and canned foam, a reason I explore in this comment from earlier this morning.

Written by LeisureGuy

22 October 2019 at 10:12 am

Posted in Shaving
