Archive for March 2017
Jeff Guo reports in the Washington Post:
Economists have long argued that automation, not trade, is responsible for the bulk of the six million jobs shed by the manufacturing sector over the last 25 years. Now, they have put a precise figure on some of the losses.
Industrial robots alone have eliminated up to 670,000 American jobs between 1990 and 2007, according to new research from MIT’s Daron Acemoglu and Boston University’s Pascual Restrepo.
The number is stunning on the face of it, and many have interpreted the study as an indictment of technological change — a sign that “robots are winning the race for American jobs.” But the bigger takeaway is that the nation has been ill-equipped to deal with the upheaval caused by automation.
The researchers estimate that half of the job losses resulted from robots directly replacing workers. The rest of the jobs disappeared from elsewhere in the local community. It seems that after a factory sheds workers, that economic pain reverberates, triggering further unemployment at, say, the grocery store or the neighborhood car dealership.
In a way, this is surprising. Economists understand that automation has costs, but they have largely emphasized the benefits: Machines make things cheaper, and they free up workers to do other jobs. For instance, 41 percent of Americans were farmers a century ago, but thanks to tractors and mechanical harvesters, only 2 percent work in agriculture today. The rest of us can now aspire to be programmers or anesthesiologists or DJs or drone pilots.
The latest study reveals that for manufacturing workers, the process of adjusting to technological change has been much slower and more painful than most experts thought. “We were looking at a span of 20 years, so in that timeframe, you would expect that manufacturing workers would be able to find other employment,” Restrepo said. Instead, not only did the factory jobs vanish, but other local jobs disappeared too. Acemoglu and Restrepo say that every industrial robot eliminated about three manufacturing positions, plus three more jobs from around town.
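The per-robot multiplier in that last sentence is simple arithmetic, sketched below. The robot count in the comment is hypothetical, chosen only to show the shape of the calculation; it is not a figure from the study.

```python
# Per the Acemoglu-Restrepo estimate quoted above: each industrial robot
# eliminated about three manufacturing positions plus three more local jobs.
MFG_JOBS_PER_ROBOT = 3
LOCAL_JOBS_PER_ROBOT = 3

def estimated_job_losses(robots: int) -> int:
    """Total estimated jobs lost: direct manufacturing plus local spillover."""
    return robots * (MFG_JOBS_PER_ROBOT + LOCAL_JOBS_PER_ROBOT)

# A hypothetical 100,000 robots would imply roughly 600,000 lost jobs,
# half of them outside the factory gates.
```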
If we are to make it through the next wave of automation, which is predicted to upend even more industries, we may have to rethink our policies about work and education — and learn from the industries that have coped the best.
The research from Acemoglu and Restrepo joins the work of David Autor, David Dorn and Gordon Hanson, who have shown that the harms of trade with China were similarly concentrated in certain communities. The laid-off manufacturing workers couldn’t quickly find new jobs, so the economic pain lingered in their neighborhoods. Experts still believe that trade and automation can benefit Americans overall, contributing to lower prices and creating new kinds of jobs. But this evidence draws attention to the losers — the dislocated factory workers who just can’t bounce back.
The United States does have a program to retrain workers who lost their jobs to overseas competition, but research shows that most of them turn to other parts of the government safety net, such as Social Security, disability benefits and Medicaid. None of these efforts, though, seem to be doing enough for communities that have lost their manufacturing bases, where people have reduced earnings for the rest of their lives.
Perhaps that much was obvious. After all, anecdotes about the Rust Belt abound. But the new findings bolster the conclusion that these economic dislocations are not brief setbacks, but can hurt areas for an entire generation.
Acemoglu and Restrepo’s paper is also notable for its specificity. It has been difficult to pinpoint the impacts of technology on employment, in part because the effects have been so widespread. “When economists talk about automation, we’re actually talking about a bunch of stuff — we’re talking about capital, software, machinery, robots, artificial intelligence,” Restrepo said.
Many of these changes are invisible, or at least taken for granted, which is why false narratives persist, like the idea that trade with China caused the vast majority of job losses in the past decade. . .
Doug Garr writes at Backchannel.
Last August, Andrea Giacobbe logged on to Skyscanner, a European metasearch engine like Expedia and Travelocity that scans multiple travel websites and surfaces the cheapest fare. Giacobbe, a 52-year-old management consultant, was looking to book a flight from New York City to Genoa, Italy—a trip he’s made numerous times for family visits. He’d always relied on Skyscanner for a discount.
This time, the cheapest fare wasn’t that cheap: It was for an Alitalia flight that made two stops, through Milan and Rome, for $2,050. Surprised at the high quote, he decided to call Alitalia. Immediately, the airline offered a $1,550 flight with only one stop in Rome. It was cheaper. It would get there faster. They even offered him a discounted car rental.
“It blew my mind,” recalled Giacobbe. He hadn’t called Alitalia directly in years. He was accustomed to almost always relying on Skyscanner for the best deal.
Giacobbe’s experience is becoming more typical. Over the past several years, the conventional wisdom has been that cruising the net would yield the best prices in the travel, hotel, and car rental spaces. There’s been a tidal shift in the travel industry, to a point where most of us use aggregators to book our trips. Who bothers talking to a human being—a travel agent? You’re just going to be stuck in a long option queue.
Most of us rely on metasearch engines, like Priceline, Expedia, or Travelocity, which typically use dozens (sometimes as many as 200) of online travel agents (OTAs) and aggregators to find the best deals. (A metasearch engine and an aggregator are interchangeable terms — they both scour other sites and compile data under one roof. An OTA is an actual travel agency that does the booking and is the lone site responsible for everything you buy through it.) We rely on these sites because we assume they have the secret sauce — the most powerful search engines, tweaked by superstar programmers armed with the most sophisticated algorithms — to guide us to the cheapest options. With a single search, you can feel assured that you are paying a rock-bottom price.
Over time, however, the convention has flipped. As competition among the sites heated up, the hard-to-believe cheap fares required some filtering. A too-good-to-be-true fare ($99 to Europe from California) usually came with a catch (the $400 indirect ticket home). And as the business models on which these aggregators rely get tighter, the deals are getting worse. How can you be certain you’re getting the lowest quote? The short answer is, you can’t.
While reporting this piece I spoke to several software engineers and hotel-chain executives, as well as academics and researchers who have spent a considerable amount of time and effort digging into the issue. Their conclusion is that the industry is in flux, and that really good bargains—for hotels, flights, and car rentals—are often largely illusory. “Hotels are not giving the aggregators as many good deals as they did in the past,” a former software engineer who used to work for Priceline told me. (He didn’t want his name used because he is still seeking work in the industry.) “You might as well call Sheraton’s front desk.”
And good luck finding the delinquent parties: The number of players behind each transaction has ratcheted up. For every potential deal, there are likely to be multiple aggregators in the food chain, with each site taking a cut and ultimately driving up the final cost. My ex-Priceline source told me that aggregators explain away price fluctuations by citing the ebb and flow of supply and demand, which varies greatly in seasonal resort areas. But really, it’s a breakdown in the system. Just as airlines and hotels began trimming travel agent commissions more than 20 years ago, now history is repeating itself. “The airlines don’t want to pay the aggregators anymore,” he told me.
Which means the people who are paying them are us.
As recently as the 1990s, before the ubiquity of the internet, you called your travel agent and he or she took care of everything — your flights, the hotel, the rental car. American Airlines, Hyatt, Hertz and the like paid the travel agents to offer their services. But slowly, the landscape changed. The airlines and hotel chains stopped paying. Travel agents stopped offering their services for “free.” The consumer shouldered their fees, and travel agents became irrelevant to all but wealthy boutique travelers who didn’t need to worry about cost.
Technology aided in this inevitable disruption, and ultimately helped create some of the pricing chaos we see today. As the internet became the first stop for travel shopping, we started searching for bargains via keystroke commands. We stopped talking to hotel clerks, rental car agents, and airline reservation agents, and we boasted to our friends about our low-cost vacations to Lake Como.
All of this was made possible by the web aggregators.
Web aggregators work like this: . . .
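The excerpt cuts off there, but the mechanism the article has already described (fan a query out to many OTAs, collect their quotes, surface the cheapest) can be sketched in a few lines. Everything below is invented for illustration: the OTA names, the route string, and the prices are not from the article.

```python
def cheapest_fare(route, otas):
    """Query each OTA lookup function for the given route and return the
    (ota_name, price) pair with the lowest quote, or None if no OTA
    has inventory for that route."""
    quotes = [(name, lookup(route)) for name, lookup in otas.items()]
    quotes = [(name, price) for name, price in quotes if price is not None]
    return min(quotes, key=lambda q: q[1]) if quotes else None

# Invented OTAs with hard-coded quotes for a single route (illustration only).
otas = {
    "ota_a": lambda route: 2050 if route == "JFK-GOA" else None,
    "ota_b": lambda route: 2120 if route == "JFK-GOA" else None,
}
```

Note that nothing in this flow sees the airline's own direct price, which is how a caller to Alitalia can beat every "best" quote the aggregator surfaces.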
Dan Grazier writes at the Project on Government Oversight:
The F-35 still has a long way to go before it will be ready for combat. That was the parting message of Dr. Michael Gilmore, the now-retired Director of Operational Test and Evaluation, in his last annual report.
The Joint Strike Fighter Program has already consumed more than $100 billion and nearly 25 years. Just to finish the basic development phase will require at least an extra $1 billion and two more years. Even with this massive investment of time and money, Dr. Gilmore told Congress, the Pentagon, and the public, “the operational suitability of all variants continues to be less than desired by the Services.”
Dr. Gilmore detailed a range of remaining and sometimes worsening problems with the program, including hundreds of critical performance deficiencies and maintenance problems. He also raised serious questions about whether the Air Force’s F-35A can succeed in either air-to-air or air-to-ground missions, whether the Marine Corps’ F-35B can conduct even rudimentary close air support, and whether the Navy’s F-35C is suitable to operate from aircraft carriers.
He found, in fact, that “if used in combat, the F-35 aircraft will need support to locate and avoid modern threat ground radars, acquire targets, and engage formations of enemy fighter aircraft due to unresolved performance deficiencies and limited weapons carriage availability.”
In a public statement, the F-35 Joint Program Office attempted to dismiss the Gilmore report by asserting, “All of the issues are well-known to the JPO, the U.S. services, our international partners, and our industry.”
JPO’s acknowledgement of the numerous issues is fine as far as it goes, but there’s no indication that the Office has any plan—including cost and schedule re-estimates—to fix those currently known problems without cutting corners. Nor, apparently, do they have a plan to cope with and fund the fixes for the myriad unknown problems that will be uncovered during the upcoming, much more rigorous, developmental and operational tests of the next four years. Such a plan is essential, and should be driven by the pace at which problems are actually solved rather than by unrealistic pre-existing schedules.
What will it take to fix the numerous problems identified by Dr. Gilmore, and how do we best move forward with the most expensive weapon program in history, a program that has been unable to live up to its own very modest promises? . . .
Continue reading. It’s a very thorough analysis. Here are the sections with links:
Electronics Used to Justify Cost Not Delivering Capabilities
Ineffective as a Fighter
Ineffective as an Interdiction Bomber
Ineffective as a Close Air Support Platform
Navy’s F-35 Unsuitable for Carrier Operations
Price Tag Is the Only Thing Stealthy about the F-35
Combat Effectiveness at Risk
Can the F-35 Be Where It’s Needed, When It’s Needed?
F-35 Reliability Problems
Officials Hiding Truth about F-35’s Problems and Delays from Taxpayers
It occurs to me that this is not unrelated to the immediately previous post on Tversky and Kahneman.
Tamsin Shaw in the NY Review of Books reviews Michael Lewis’s new book about Tversky and Kahneman:
The Undoing Project: A Friendship That Changed Our Minds
by Michael Lewis
Norton, 362 pp., $28.95
We are living in an age in which the behavioral sciences have become inescapable. The findings of social psychology and behavioral economics are being employed to determine the news we read, the products we buy, the cultural and intellectual spheres we inhabit, and the human networks, online and in real life, of which we are a part. Aspects of human societies that were formerly guided by habit and tradition, or spontaneity and whim, are now increasingly the intended or unintended consequences of decisions made on the basis of scientific theories of the human mind and human well-being.
The behavioral techniques that are being employed by governments and private corporations do not appeal to our reason; they do not seek to persuade us consciously with information and argument. Rather, these techniques change behavior by appealing to our nonrational motivations, our emotional triggers and unconscious biases. If psychologists could possess a systematic understanding of these nonrational motivations they would have the power to influence the smallest aspects of our lives and the largest aspects of our societies.
Michael Lewis’s The Undoing Project seems destined to be the most popular celebration of this ongoing endeavor to understand and correct human behavior. It recounts the complex friendship and remarkable intellectual partnership of Daniel Kahneman and Amos Tversky, the psychologists whose work has provided the foundation for the new behavioral science. It was their findings that first suggested we might understand human irrationality in a systematic way. When our thinking errs, they claimed, it does so predictably. Kahneman tells us that thanks to the various counterintuitive findings—drawn from surveys—that he and Tversky made together, “we now understand the marvels as well as the flaws of intuitive thought.”
Kahneman presented their new model of the mind to the general reader in Thinking, Fast and Slow (2011), where he characterized the human mind as the interrelated operation of two systems of thought: System One, which is fast and automatic, including instincts, emotions, innate skills shared with animals, as well as learned associations and skills; and System Two, which is slow and deliberative and allows us to correct for the errors made by System One.
Lewis’s tale of this intellectual revolution begins in 1955 with the twenty-one-year-old Kahneman devising personality tests for the Israeli army and discovering that optimal accuracy could be attained by devising tests that removed, as far as possible, the gut feelings of the tester. The testers were employing “System One” intuitions that skewed their judgment and could be avoided if tests were devised and implemented in ways that disallowed any role for individual judgment and bias. This is an especially captivating episode for Lewis, since his best-selling book, Moneyball (2003), told the analogous tale of Billy Beane, general manager of the Oakland Athletics baseball team, who used new forms of data analytics to override the intuitive judgments of baseball scouts in picking players.
The Undoing Project also applauds the story of the psychologist Lewis Goldberg, a colleague of Kahneman and Tversky in their days in Eugene, Oregon, who discovered that a simple algorithm could more accurately diagnose cancer than highly trained experts who were biased by their emotions and faulty intuitions. Algorithms—fixed rules for processing data—unlike the often difficult, emotional human protagonists of the book, are its uncomplicated heroes, quietly correcting for the subtle but consequential flaws in human thought.
The most influential of Kahneman and Tversky’s discoveries, however, is “prospect theory,” since this has provided the most important basis of the “biases and heuristics” approach of the new behavioral sciences. They looked at the way in which people make decisions under conditions of uncertainty and found that their behavior violated expected utility theory—a fundamental assumption of economic theory that holds that decision-makers reason instrumentally about how to maximize their gains. Kahneman and Tversky realized that they were not observing a random series of errors that occur when people attempted to do this. Rather, they identified a dozen “systematic violations of the axioms of rationality in choices between gambles.” These systematic errors make human irrationality predictable.
Lewis describes, with sensitivity to the political turmoil that constantly assailed them in Israel, the realization by Kahneman and Tversky that emotions powerfully influence our intuitive analysis of probability and risk. We particularly aim, on this account, to avoid negative emotions such as regret and loss. . . .
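The quantitative core of prospect theory is its value function: outcomes are valued relative to a reference point, the curve is concave for gains, and losses loom larger than equal gains. A minimal sketch, using the parameter estimates from Tversky and Kahneman's 1992 cumulative prospect theory paper (alpha = beta = 0.88, lambda = 2.25); those numbers are not given in the review itself.

```python
def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Tversky & Kahneman (1992) value function: concave for gains,
    convex and steeper for losses (loss aversion)."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** beta)
```

With these parameters a $100 loss is weighted 2.25 times as heavily as a $100 gain, which is the loss aversion, and the preference for avoiding regret, that the review describes.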
I just posted about the problems Johnson & Johnson is facing for selling a known carcinogen for cosmetic use. They are not alone. The site of Schultz & Myers, a personal-injury law firm, contains this information:
It’s about time you take a look at the label on the back of your beauty products. While drugs in the United States must be approved by the FDA before hitting the market, the U.S. government has no authority over the cosmetic industry.
As long as they don’t contain any ingredients that are classified as drugs, products do not have to be approved by the U.S. government. As a result, popular cosmetic companies continue to use ingredients that may be considered dangerous, many of which are banned in other countries.
This is far from a comprehensive list of ingredients that we use in the United States that have been banned in the European Union as well as Canada and Japan.
Baby Powder: Talcum Powder
Talc was a popular mineral used in powdered cosmetics and deodorants until it was discovered that, in its natural state, it could contain asbestos. After asbestos was found to be a carcinogen, regulation of talc became strict.
Consumer talcum products have been asbestos-free since the 1970s; however, studies have noted a link between asbestos-free talc and ovarian cancer. While the link has not been proven, the possibility that asbestos-free talc is carcinogenic has prompted the European Union to ban talc-based cosmetics altogether. It is still widely used in the United States.
Acne Medication: Salicylic Acid
Salicylic acid is typically used as an acne treatment, and was banned in the EU in February of 2014 due to its close relation to acetylsalicylic acid (aspirin). Aspirin has been linked to salicylate poisoning and Reye’s syndrome in children and young adults.
While Reye’s syndrome is described as “sudden (acute) brain damage and liver function problems of unknown cause,” it has become a very uncommon occurrence since aspirin is no longer recommended for routine use in children.
Toothpaste: Triclosan
Triclosan keeps gingivitis at bay, but it might lead to weakened immune systems and even birth defects. The European Union has already banned the chemical, which is often found in toothpaste and antibacterial soaps.
The United States may not be far behind on this ban. Minnesota Governor Mark Dayton recently signed a bill that bans triclosan products in the state effective January 2017.
Nail Polish: Formaldehyde
Formaldehyde is a preservative—and known carcinogen—often used in nail polish. In addition to cancer, formaldehyde is known to cause severe allergic reactions. Canada has banned the use of formaldehyde in personal care products, but the chemical may still be lurking in Missouri nail polish aisles.
Skin Lightener: Hydroquinone
The skin-lightening ingredient hydroquinone is known to be effective in fading liver spots, freckles, and acne scars. However, there has been some recent controversy over the ingredient, as some evidence suggests that it could be carcinogenic.
Eye Shadow: Butylparaben
Butylparaben is an antimicrobial preservative used in eye shadow, foundation, facial moisturizer, and anti-aging treatments to prevent decomposition. Because butylparaben mimics estrogen, it has been linked to several health problems typically associated with estrogenic substances.
It has been reported that butylparaben can decrease sperm function in men—potentially leading to sterility. Additionally, a study of rats exposed to a high concentration of butylparaben during pregnancy reported a proportionate increase in pups born with malformed reproductive organs.
Should I Stop Using Beauty Products that Include These Ingredients?
As we’ve said, some of these chemicals are known carcinogens or are known to be bad for your health in other ways, while for others only a correlation has been observed. In other words, the official stance in the United States is that they “might” be safe. The decision is ultimately up to you, but make sure you’re checking ingredients and that you know what you’re putting on your skin. . .
Lorenzo Franceschi-Bicchierai reports at Motherboard:
A year and a half ago, a Motherboard investigation revealed that several US government agencies weren’t using basic, easy-to-implement encryption technology, failing to protect their employees’ emails travelling across the internet. At the time, the Army, the Navy, and even the CIA and FBI didn’t use the widespread email encryption technology known as STARTTLS.
Since then, the FBI, NSA, CIA, the Director of National Intelligence and the Department of Homeland Security have all adopted it. But the Defense Information Systems Agency (DISA), the Pentagon’s branch that oversees email through the mail.mil service, among other technologies, still has not, according to an online testing tool.
And one of the most tech savvy people in Congress is starting to wonder what’s going on. In a letter sent to DISA last week, Sen. Ron Wyden (D-Oregon) slammed the agency for failing to turn STARTTLS on.
“I am concerned that DISA is not taking advantage of a basic, widely used, easily-enabled cybersecurity technology,” Wyden wrote in the letter, which was obtained by Motherboard. “Indeed, until DISA enables STARTTLS, unclassified email messages sent between the military and other organizations will be needlessly exposed to surveillance and potential compromise by third parties.”
DISA, which is responsible for providing email services to the Army, the Navy, the Marines and the Coast Guard, declined to comment.
“DISA did receive Senator Wyden’s letter and is in the process of providing a formal response back to the senator,” a DISA spokesperson said in an email. “As such, we will not comment further until Senator Wyden is provided that response.”
Historically, emails used to travel across the internet completely exposed. That’s why the famed security expert Bruce Schneier once said that email is nothing more than “a postcard that anyone can read along the way.” That has obviously changed in recent years, thanks to the adoption of an old protocol called STARTTLS, which adds an opportunistic layer of web encryption (TLS) over the email protocol SMTP. . .
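To make the mechanism concrete: what an online testing tool like the one cited above checks is whether a mail server, greeted over plain SMTP with an EHLO command, advertises the STARTTLS extension in its reply. A minimal sketch using Python's standard smtplib; the hostname in the comment is a placeholder, and the live check naturally requires network access.

```python
import smtplib

def advertises_starttls(ehlo_lines):
    """Return True if any line of a decoded EHLO reply names the STARTTLS
    extension, e.g. ["mail.example.com", "SIZE 10240000", "STARTTLS"]."""
    return any(line.split() and line.split()[0].upper() == "STARTTLS"
               for line in ehlo_lines)

def supports_starttls(host, port=25, timeout=10):
    """Connect to an SMTP server and ask whether it offers STARTTLS."""
    with smtplib.SMTP(host, port, timeout=timeout) as smtp:
        smtp.ehlo()
        return smtp.has_extn("starttls")

# Live usage (requires network access; hostname is a placeholder):
# supports_starttls("mail.example.com")
```

If the server never advertises STARTTLS, the client has no standard way to upgrade the connection, and the message crosses the internet as the "postcard" Schneier described.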