Archive for December 20th, 2009
China is moving quickly to not only catch up with the US but move ahead. Evan Osnos in the New Yorker:
On March 3, 1986, four of China’s top weapons scientists—each a veteran of the missile and space programs—sent a private letter to Deng Xiaoping, the leader of the country. Their letter was a warning: Decades of relentless focus on militarization had crippled the country’s civilian scientific establishment; China must join the world’s xin jishu geming, the “new technological revolution,” they said, or it would be left behind. They called for an élite project devoted to technology ranging from biotech to space research. Deng agreed, and scribbled on the letter, “Action must be taken on this now.” This was China’s “Sputnik moment,” and the project was code-named the 863 Program, for the year and month of its birth.
In the years that followed, the government pumped billions of dollars into labs and universities and enterprises, on projects ranging from cloning to underwater robots. Then, in 2001, Chinese officials abruptly expanded one program in particular: energy technology. The reasons were clear. Once the largest oil exporter in East Asia, China was now adding more than two thousand cars a day and importing millions of barrels; its energy security hinged on a flotilla of tankers stretched across distant seas. Meanwhile, China was getting nearly eighty per cent of its electricity from coal, which was rendering the air in much of the country unbreathable and hastening climate changes that could undermine China’s future stability. Rising sea levels were on pace to create more refugees in China than in any other country, even Bangladesh.
In 2006, Chinese leaders redoubled their commitment to new energy technology; they boosted funding for research and set targets for installing wind turbines, solar panels, hydroelectric dams, and other renewable sources of energy that were higher than goals in the United States. China doubled its wind-power capacity that year, then doubled it again the next year, and the year after. The country had virtually no solar industry in 2003; five years later, it was manufacturing more solar cells than any other country, winning customers from foreign companies that had invented the technology in the first place. As President Hu Jintao, a political heir of Deng Xiaoping, put it in October of this year, China must “seize preëmptive opportunities in the new round of the global energy revolution.”
A China born again green can be hard to imagine, especially for people who live here. After four years in Beijing, I’ve learned how to gauge the pollution before I open the curtains; by dawn on the smoggiest days, the lungs ache. The city government does not dwell on the details; its daily air-quality measurement does not even tally the tiniest particles of pollution, which are the most damaging to the respiratory system. Last year, the U.S. Embassy installed an air monitor on the roof of one of its buildings, and every hour it posts the results to a Twitter feed, with a score ranging from 1, which is the cleanest air, to 500, the dirtiest. American cities consider anything above 100 to be unhealthy. The rare times in which an American city has scored above 300 have been in the midst of forest fires. In these cases, the government puts out public-health notices warning that the air is “hazardous” and that “everyone should avoid all physical activity outdoors.” As I type this in Beijing, the Embassy’s air monitor says that today’s score is 500.
China is so big—and is growing so fast—that in 2006 it passed the United States to become the world’s largest producer of greenhouse gases. If China’s emissions keep climbing as they have for the past thirty years, the country will emit more of those gases in the next thirty years than the United States has in its entire history. So the question is no longer whether China is equipped to play a role in combating climate change but how that role will affect other countries. David Sandalow, the U.S. Assistant Secretary of Energy for Policy and International Affairs, has been to China five times in five months. He told me, “China’s investment in clean energy is extraordinary.” For America, he added, the implication is clear: “Unless the U.S. makes investments, we are not competitive in the clean-tech sector in the years and decades to come.” …
Continue reading. The US, of course, has the free market and the financial services industry, and that should solve all our problems.
Very interesting article in the New Yorker by Atul Gawande:
Cost is the spectre haunting health reform. For many decades, the great flaw in the American health-care system was its unconscionable gaps in coverage. Those gaps have widened to become graves—resulting in an estimated forty-five thousand premature deaths each year—and have forced more than a million people into bankruptcy. The emerging health-reform package has a master plan for this problem. By establishing insurance exchanges, mandates, and tax credits, it would guarantee that at least ninety-four per cent of Americans had decent medical coverage. This is historic, and it is necessary. But the legislation has no master plan for dealing with the problem of soaring medical costs. And this is a source of deep unease.
Health-care costs are strangling our country. Medical care now absorbs eighteen per cent of every dollar we earn. Between 1999 and 2009, the average annual premium for employer-sponsored family insurance coverage rose from $5,800 to $13,400, and the average cost per Medicare beneficiary went from $5,500 to $11,900. The costs of our dysfunctional health-care system have already helped sink our auto industry, are draining state and federal coffers, and could ultimately imperil our ability to sustain universal coverage.
What have we gained by paying more than twice as much for medical care as we did a decade ago? The health-care sector certainly employs more people and more machines than it did. But there have been no great strides in service. In Western Europe, most primary-care practices now use electronic health records and offer after-hours care; in the United States, most don’t. Improvement in demonstrated medical outcomes has been modest in most fields. The reason the system is a money drain is not that it’s so successful but that it’s fragmented, disorganized, and inconsistent; it’s neglectful of low-profit services like mental-health care, geriatrics, and primary care, and almost giddy in its overuse of high-cost technologies such as radiology imaging, brand-name drugs, and many elective procedures.
At the current rate of increase, the cost of family insurance will reach twenty-seven thousand dollars or more in a decade, taking more than a fifth of every dollar that people earn. Businesses will see their health-coverage expenses rise from ten per cent of total labor costs to seventeen per cent. Health-care spending will essentially devour all our future wage increases and economic growth. State budget costs for health care will more than double, and Medicare will run out of money in just eight years. The cost problem, people have come to realize, threatens not just our prosperity but our solvency.
So what does the reform package do about it? Turn to page 621 of the Senate version, the section entitled “Transforming the Health Care Delivery System,” and start reading…
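Incidentally, Gawande's twenty-seven-thousand-dollar figure is essentially just the 1999–2009 premium growth rate compounded forward another decade. A quick back-of-the-envelope check (the dollar figures are from the excerpt; the extrapolation is mine):

```python
# Figures from the excerpt: employer-sponsored family premiums
# rose from $5,800 (1999) to $13,400 (2009).
premium_1999 = 5_800
premium_2009 = 13_400

# Implied average annual growth rate over that decade
annual_rate = (premium_2009 / premium_1999) ** (1 / 10)  # roughly 8.7% per year

# Compound forward another ten years at the same rate
premium_2019 = premium_2009 * annual_rate ** 10

print(f"implied annual growth: {annual_rate - 1:.1%}")
print(f"projected premium a decade out: ${premium_2019:,.0f}")
```

Holding the rate constant lands near $31,000, consistent with the article's "twenty-seven thousand dollars or more."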
The first year of the Obama Administration is almost over, yet many mysteries surrounding the so-called “dark side” of the “war on terror” remain unsolved. Here are ten:
- Did former Vice-President Cheney know the full, clinical details of the Bush Administration’s interrogation and detention program for terror suspects? Did he have a supervisory role?
- How much did President Bush know about the alleged abuse? Cheney has said that the former President “knew a great deal about the program” and “basically authorized it.” Did he know, for instance, that one suspect was waterboarded a hundred and eighty-three times? Did he know that another died in C.I.A. custody after having been left to freeze overnight? If he did know, what was his reaction?
- The C.I.A. destroyed ninety-two videotapes of interrogation sessions. What exactly was on the tapes, and why were they destroyed? Are there written transcripts describing what was on the tapes? Did the tapes document potential evidence of a crime? If so, did their destruction constitute obstruction of justice? And if so, which officials authorized the tapes’ destruction?
- Have all the former C.I.A. prisoners been accounted for? Some seem not to have been sent to Guantánamo when the C.I.A.’s black-site prisons were closed, in 2006. Instead, it appears they may have been sent to other countries, including Egypt, Jordan, and Libya. If so, who were these prisoners, and where are they now?
- Who provided the “muscle” in the C.I.A. interrogation and detention program? Were the notional global “hit squads” authorized, or made operational? Were their activities fully briefed to Congress? Were they staffed by C.I.A. officers, Special Operations officers, private contractors, or others? If there were abuses, will anyone face any consequences?
- Vice-President Cheney and other defenders of “enhanced interrogation” techniques have insisted that coercion produced intelligence and saved lives. Many other experts have argued that the same information or better could have been obtained by less controversial methods. Will the public ever be able to access the record, in order to judge this on its own?
- A small handful of politically appointed lawyers during the Bush years approved many forms of prisoner abuse that would previously have been judged criminal. Those lawyers have fanned out to teach, practice law, and, in one case, sit on the federal bench. Will there be professional consequences for any of these lawyers? A report on them by the Justice Department has been pending release for the entire last year. Why has it been so delayed?
- Several contract psychologists designed and helped to implement the C.I.A.’s program of “enhanced” interrogation techniques. Will these psychologists face professional consequences? They have indicated they would like to tell their story—will they?
- Who forged the “yellowcake” Niger documents that helped spur the U.S. into the war in Iraq?
- Who are the chief financiers of terror, and do any of them have state sponsors?
Bonus question: Where is Osama bin Laden?
There’s been some back and forth in the liberal blogosphere about Matt Taibbi’s latest masterpiece in Rolling Stone – the one about how Obama stuffed his administration with Wall Streeters. Taibbi, indeed, has two tiny things wrong: 1) He confused two guys named Jamie Rubin in one sentence and 2) He seemed to imply that Karen Kornbluh was a non-Rubinite, even though she was actually Bob Rubin’s former deputy chief of staff. The former is a minor screw up that doesn’t negate the thrust of the piece, and the latter actually gives Obama a benefit of the doubt he doesn’t deserve.
Of course, the criticism of Taibbi from liberals and conservatives in the Washington Establishment runs much deeper than a few non-germane factual errors. And it is telling – not about Taibbi, but about the rot, corruption and elitism that now define the Washington Establishment.
When you read criticism from the American Prospect, you see the magazine acknowledging the factual accuracy of everything Taibbi has reported (accuracy which Reuters verifies and that Taibbi himself defends) – but you see some other stuff, too. You see an ugly form of jealousy at a reporter who doesn’t feel (as the Prospect so often does) the need to obsequiously worship Democratic politicians. You also see rage at a writer for being way more talented than almost any other writer in journalism. Even more important, you see an obnoxious Beltway elitism that suggests Taibbi doesn’t get it.
I guess we’ll find out. Katherine Harmon at Scientific American:
Bacteria, viruses and fungi have been primarily cast as the villains in the battle for better human health. But a growing community of researchers is sounding the warning that many of these microscopic guests are really ancient allies.
Having evolved along with the human species, most of the minuscule beasties that live in and on us are actually helping to keep us healthy, just as our well-being promotes theirs. In fact, some researchers think of our bodies as superorganisms, rather than one organism teeming with hordes of subordinate invertebrates.
The human body has some 10 trillion human cells—but 10 times that number of microbial cells. So what happens when such an important part of our bodies goes missing?
With rapid changes in sanitation, medicine and lifestyle in the past century, some of these indigenous species are facing decline, displacement and possibly even extinction. In many of the world’s larger ecosystems, scientists can predict what might happen when one of the central species is lost, but in the human microbial environment—which is still largely uncharacterized—most of these rapid changes are not yet understood. "This is the next frontier and has real significance for human health, public health and medicine," says Betsy Foxman, a professor of epidemiology at the University of Michigan (U.M.) School of Public Health in Ann Arbor.
Meanwhile, each new generation in developed countries comes into the world with fewer of these native populations. "They’re actually missing some component of their microbiota that they’ve evolved to have," Foxman says…
Dark matter may have been "felt" for the first time deep in a Minnesota mine, physicists say.
Detectors in the mine, part of the Cryogenic Dark Matter Search experiment, were tripped recently by what might be weakly interacting massive particles, or WIMPs.
WIMPs are among the most popular candidates for dark matter, the invisible material that scientists think makes up more than 80 percent of the mass in the universe.
Recently detectors in the mine recorded two hits with "characteristics consistent with those expected from WIMPs," according to a statement posted on the Cryogenic Dark Matter Search Web site.
There is a one-in-four chance, however, that the particles detected are not dark matter but ordinary subatomic particles such as neutrons, the team cautions. (Related: "Dark Matter Proof Found Over Antarctica?")
Mike Shull, an astrophysicist at the University of Colorado at Boulder, also urged restraint in interpreting the results.
"I regard this as interesting but very much an interim ‘progress report’ on a promising technique," said Shull, who did not participate in the research.
"I hope they’ve detected [WIMPs]," he added, "It’s exciting if it’s true."
Scientists have predicted that WIMPs can interact with normal atoms but only weakly and very rarely—hence the name.
When such an interaction happens, a WIMP careens like a billiard ball off an atom, the theory goes. But the collision leaves behind a unique signature in the form of a small amount of heat, which can be detected.
The smashup also creates charged atoms, or ions, that are detectable.
The Cryogenic Dark Matter Search experiment uses 30 detectors made of …
Chiropractors, homeopaths, naturopaths, acupuncturists, and other alternative medicine practitioners constantly criticize mainstream medicine for “only treating the symptoms,” while alternative medicine allegedly treats “the underlying causes” of disease.
Nope. Not true. Exactly backwards. Think about it. When you go to a doctor with a fever, does he just treat the symptom? No, he tries to figure out what’s causing the fever and if it’s pneumonia, he identifies which microbe is responsible and gives you the right drugs to treat that particular infection. If you have abdominal pain, does the doctor just give you narcotics to treat the symptom of pain? No, he tries to figure out what’s causing the pain and if he determines you have acute appendicitis he operates to remove your appendix.
I guess what they’re trying to say is that something must have been wrong in the first place to allow the disease to develop. But they don’t have any better insight into what that something might be than scientific medicine does. All they have is wild, imaginative guesses. And they all disagree with one another. The chiropractor says if your spine is in proper alignment you can’t get sick. Acupuncturists talk about the proper flow of qi through the meridians. Energy medicine practitioners talk about disturbances in energy fields. Nutrition faddists claim that people who eat right won’t get sick. None of them can produce any evidence to support those claims. No alternative medicine has been scientifically shown to prevent disease or to cure it. If it had, it would have been incorporated into conventional medicine and would no longer be “alternative.”
Are these practitioners treating the underlying cause, or are they simply applying their one chosen tool to treat everything? Chiropractors treat every patient with chiropractic adjustments. What if a doctor used one treatment for everything? You have pneumonia? Here’s some penicillin. You have a broken leg? Here’s some penicillin. You have diabetes? Here’s some penicillin. Acupuncturists only know to stick needles in people. Homeopaths only know to give out ridiculously high dilutions that amount to nothing but water. Therapeutic touch practitioners only know to smooth out the wrinkles in imaginary energy fields. They are not trying to determine any underlying cause: they are just using one treatment indiscriminately.
How do you define “cause”? …
In a remarkable bit of good timing, I’ve been re-reading Shelby Foote’s history of the Civil War, in particular his first volume (covering 1861-1862). It provides some much-needed perspective on the current situation with health care reform.
Like President Obama, President Lincoln was seen by many of his supporters as something of a disappointment once in office. This was largely due to the number and types of compromises he needed to make, most notably with the institution of slavery. In his first inaugural address, Lincoln came out and said that he was not bound and determined to end slavery, that the President does not in any case have the power to unilaterally change the law of the land, and that his first priority was the preservation of the Union, even if the price of that preservation was to accept the continuation of slavery. During the war, when pressed by a group of ministers about why he had not more forcefully worked to end slavery, he reiterated that his overriding priority was to preserve the Union, and added that there were four slave states which had stayed loyal and which were currently contributing 50,000 soldiers to the war effort; these, he pointed out, were states and soldiers which he could not afford to lose in a dispute over slavery.
When Lincoln finally issued the Emancipation Proclamation, its scope was remarkably circumscribed: it did not call for the emancipation of slaves in loyal states (for this, Lincoln would need the participation of Congress, and in any event, as described above, he did not seek such an act for fear of worsening the Union’s position in the war); it did not call for the emancipation of slaves in those areas under military control by the Union; it limited emancipation to those areas which would be brought under military control subsequent to January 1, 1863, which was about 3 months after the Proclamation itself was issued. As one historian noted, this meant the Proclamation carefully excused all of the slaves which the United States actually had any authority over at the time of issuance! As another historian noted, the Proclamation was in essence the offer of a bribe: any state then in rebellion which would lay down its arms and return to the Union would not be compelled to give up its slaves; any state conquered by force of arms after January 1, 1863 would be so compelled.
Needless to say, the Proclamation was seen by anti-slavery partisans of the time as wholly unacceptable, a compromise too far, and yet more evidence of the unfitness of their elected standard-bearer in the White House. And yet, as Foote points out, Lincoln is today hailed as the preserver of the Union, which he was, but as The Great Emancipator, which he was not. This is because the Proclamation, while useless in a practical sense at the moment of issuance, was the crucial starting point for the abolition of slavery, a project which was completed just a few years later.
I trust that the parallels with our own current situation are apparent (though I think that the Senate HCR bill is far more immediately useful now than the Emancipation Proclamation was then). I would also note that after winning election both Lincoln and Davis were widely condemned as weak, stupid, cowardly, vain, and tyrannical by their supporters, who wondered aloud why they had ever voted for those people and what was to become of a nation led by such a man. Apparently intense dissatisfaction with elected officials you have heretofore supported is an American tradition.
Take a look. God knows what their houses and lawns look like during the holiday season.
Are you a verbal learner or a visual learner? Chances are, you’ve pegged yourself or your children as either one or the other and rely on study techniques that suit your individual learning needs. And you’re not alone— for more than 30 years, the notion that teaching methods should match a student’s particular learning style has exerted a powerful influence on education. The long-standing popularity of the learning styles movement has in turn created a thriving commercial market amongst researchers, educators, and the general public. The wide appeal of the idea that some students will learn better when material is presented visually and that others will learn better when the material is presented verbally, or even in some other way, is evident in the vast number of learning-style tests and teaching guides available for purchase and used in schools. But does scientific research really support the existence of different learning styles, or the hypothesis that people learn better when taught in a way that matches their own unique style?
Unfortunately, the answer is no, according to a major new report published this month in Psychological Science in the Public Interest, a journal of the Association for Psychological Science. The report, authored by a team of eminent researchers in the psychology of learning—Hal Pashler (University of California, San Diego), Mark McDaniel (Washington University in St. Louis), Doug Rohrer (University of South Florida), and Robert Bjork (University of California, Los Angeles)—reviews the existing literature on learning styles and finds that although numerous studies have purported to show the existence of different kinds of learners (such as "auditory learners" and "visual learners"), those studies have not used the type of randomized research designs that would make their findings credible.
Nearly all of the studies that purport to provide evidence for learning styles fail …
This is interesting:
Most Americans fail to appreciate that the Civil Rights movement was about the overthrow of an entrenched political order in each of the Southern states, that the segregationists who controlled this order did not hesitate to employ violence (law enforcement, paramilitary, mob) to preserve it, and that for nearly a century the federal government tacitly or overtly supported the segregationist state governments. That the Civil Rights movement employed nonviolent tactics should fool us no more than it did the segregationists, who correctly saw themselves as being at war. Significant change was never going to occur within the political system: it had to be forced. The aim of the segregationists was to keep the federal government on the sidelines. The aim of the Civil Rights movement was to "capture" the federal government — to get it to apply its weight against the Southern states. As to why it matters: a major reason we were slow to grasp the emergence and extent of the insurgency in Iraq is that it didn’t — and doesn’t — look like a classic insurgency. In fact, the official Department of Defense definition of insurgency still reflects a Vietnam era understanding of the term. Looking at the Civil Rights movement as an insurgency is useful because it assists in thinking more comprehensively about the phenomenon of insurgency and assists in a more complete — and therefore more useful — definition of the term.
The link to his talk is broken, unfortunately. Video here.
A couple hundred thousand years ago, the planet became a much colder and drier place. In Africa, deserts expanded, species were wiped out and the human race was in deep trouble.
See, humans today may look pretty different from one another but, genetically speaking, there’s not much diversity at all within our species. In fact, chimpanzees, which look pretty much the same from one individual to the next, are much more genetically diverse than we are. To scientists, that suggests that humans have come through a genetic bottleneck—a point where our numbers shrunk dramatically, and a relatively small population had to rebuild the species. For about 20 years, genetic anthropologists have been comparing the genes of modern human populations. Over time, they’ve used bigger and bigger samples, and better and better analysis, to home in on when our bottleneck likely happened, and how many humans managed to slip through it.
Turns out, somewhere between 130,000 and 190,000 years ago, the human species was reduced to fewer than 1,000 breeding individuals—just a few thousand people in total. Ancient, naturally driven climate change pushed our species to the brink, said Curtis Marean, Ph.D., a professor with the Institute of Human Origins and the School of Human Evolution and Social Change at Arizona State University.
What saved us? According to Marean, the answer may be "shellfish".
"They’re a great source of protein," he said. "And shellfish are immune to colder ocean temperatures. In fact, when the water gets colder, those populations go up."
Marean used climate models to pinpoint locations in Africa where human hunter-gatherers could have hunkered down during a long glacial period that dried out the continent and expanded deserts. Of the four to six possible locations, he focused on an area along the coast of South Africa.
"That area has a super high diversity of below-ground tuberous plants, which have high carb loads. People are excellent foragers for them. You need a digging stick and there wouldn’t be a lot of animal competitors," he said. "And the tuberous plants are adapted to arid environments."
His team eventually found a site, dating to 164,000 years ago, that shows evidence of humans eating shellfish, working with natural pigments and creating technologically sophisticated tools. He thinks this could be the remnants of the humans of the bottleneck—ancestors of everyone alive today.
Other researchers have theorized that …
Take a look at these top 5 frugality blogs that MakeUseOf.com found, all with tips on saving money.
A University of California study on human subjects seems to indicate what food activists have long believed: high fructose corn syrup has special qualities which cause humans to pork up like animals in a feed lot. Oh, and it also may help cause life-threatening chronic diseases. The study was small, but frightening.
Over 10 weeks, 16 volunteers on a strictly controlled diet, including high levels of fructose, produced new fat cells around their heart, liver, and other digestive organs. They also showed signs of food-processing abnormalities linked to diabetes and heart disease. Another group of volunteers on the same diet, but with glucose sugar replacing fructose, did not have these problems.
People in both groups put on a similar amount of weight. However, researchers at the University of California who conducted the trial, said the levels of weight gain among the fructose consumers would be greater over the long term.
High fructose corn syrup is in nearly everything Americans eat, from fruit juices to bread to ketchup. It’s cheap, but is such cheap sweetness worth it in the long run, when it may actually be killing us?
Child diabetes blamed on food sweetener [Times Online]
Now will you check nutrition labels and put back on the shelf all foods and drinks containing HFCS?
UPDATE: As Mike points out in the comment below, the Times report has many problems. Look, as he suggests, at this corrected report.
Lots more information (and a different video) here. From the link:
Octopuses are masters of camouflage that can change their shape, colour and texture to perfectly blend into their environment. But the soft bodies that make them such excellent con artists also make them incredibly vulnerable, should they be spotted. Some species have solved that problem with their fierce intellect, which allows them to make use of other materials that are much harder. The veined octopus, for example, dons a suit of armour made of coconut shells.
The veined octopus (Amphioctopus marginatus) lives in sandy, exposed habitats that have little in the way of cover. To protect itself, it hides among the hollow husks of coconuts. It even carries its armour around with it, tucking the shell under its body, sitting on it like a bowl, and moving around on tip-tentacles.
I read fairly often that the global warming the earth is obviously experiencing is "natural variation", but that does not explain where all the heat is coming from. (The actual source of the heat—the sun’s rays trapped by the rapidly growing greenhouse gasses due to human activity—is well known, but the skeptics don’t accept that, which leaves them the burden of discovery for the source of the heat.) Steve Connor in The Independent points out a few things:
Climate sceptics who dispute the link between global warming and carbon dioxide emissions frequently argue that the increase in world temperatures over the past half century is part of a natural cycle. They cite previous periods in history when the climate has swung into extremes, such as the "medieval warm period" when vines grew in northern England and the Vikings settled in Greenland.
Or they quote the Little Ice Age, which happened somewhere between the 14th and 18th centuries, when the Thames froze over and the Bruegels painted their snowy winter landscapes. History shows that climate is a variable feast, they argue, and what we are getting now is just another side dish.
At the heart of this view is the belief that natural variations in the Sun’s activity are responsible for the warming of the past few decades. No one would dispute that the Sun is the main driver of climate – without it we would not have any climate. But what the sceptics are arguing is more subtle and complex.
They cite the work of two Danish scientists – Professor Eigil Friis-Christensen, director of the Danish National Space Centre, and Henrik Svensmark, who works in the same institute. Together, they have provided the rationale for believing that global warming has got more to do with natural variations in the cycle of sunspots on the solar surface than man-made emissions of CO2.
The theory is not that the intensity of the Sun has simply increased. Scientists are confident from 31 years of accurate, direct measurements of total solar radiation by satellites that there has been no overall increase in the amount of sunlight coming to Earth. Total solar irradiance, as it is called, has stayed remarkably constant and so cannot be held responsible for the warming of the past half century.
No, the theory of Friis-Christensen and Svensmark revolves around a far more subtle argument connected to the well-established 11-year cycle of sunspots that appear on the surface of the Sun. Sunspots are dark pools of magnetic activity that well up to the solar surface in periodic peaks of 11 years or so. When there are a lot of sunspots, the Sun is said to be more active.
In fact, 11 years is only the average length of the activity cycle, which can vary between seven and 17 years. Shorter cycles of 10 years or less are associated with a more magnetically active Sun, when the solar wind of charged particles streams out towards the Earth with greater-than-normal intensity.
When sunspots are most active there is also a slight increase in solar intensity, of about 0.1 per cent. But this is far too small to account for the warming of the past half century, and in any case this cyclical variation is not what Friis-Christensen and Svensmark are proposing as the cause of global warming.
The two Danes believe instead that there is a complex relationship between the length of the solar cycle and the amount of low-level cloud that forms in the Earth’s atmosphere. Because shorter cycles are associated with a more magnetically active Sun, this affects the cloud cover and hence the climate on Earth.
The crux of their argument relies on several unproven suppositions. The main one is that clouds are more likely to form when solar activity is at its lowest and fewer magnetic pulses reach Earth. A second is that there will be enough of these clouds to reflect sunlight and lower global temperatures significantly, perhaps accounting for that famous Little Ice Age.
Conservatives spend a lot of time whining these days about how Barack Obama is always blaming them for all the problems he faces. Personally, though, I’d say Obama has been remarkably restrained about the whole thing, especially when it comes to our disastrous fiscal situation. In a mere eight years, George Bush and the Republican Party managed to take a thriving economy and a federal surplus and turn it into a hair’s breadth escape from Great Depression II and an endless fiscal sinkhole. Rome may not have been built in a day, but it didn’t take much longer than that for the modern Republican Party to bankrupt America.
I think they take the right tone here:
While President Obama inherited a bad fiscal legacy, that does not diminish his responsibility to propose policies to address our fiscal imbalance and put the weight of his office behind them. Although policymakers should not tighten fiscal policy in the near term while the economy remains fragile, they and the nation at large must come to grips with the nation’s deficit problem. But we should all recognize how we got where we are today.
All clear now?
A freelance writer tells a story of how failure finally turned into success:
I had high-quality skills and a good education. I was fast on turnaround and very professional. I hustled and I delivered on my promises, every single time. I worked hard and built the business, putting in long hours and reinvesting a lot of the money I made.
I really, really wanted to make this work. But I was still having a hard time landing jobs. I was being turned down for gigs I should’ve gotten, for reasons I couldn’t put a finger on. My pay rate had hit a plateau, too. I knew I should be earning more. Others were, and I soaked up everything they could teach me, but still, there was something strange about it [...]
One day, I tossed out a pen name, because I didn’t want to be associated with my current business, the one that was still struggling to grow. I picked a name that sounded to me like it might convey a good business image. Like it might command respect.
Instantly, jobs became easier to get. There was no haggling. There were compliments, there was respect. Clients hired me quickly, and when they received their work, they liked it just as quickly. There were fewer requests for revisions — often none at all. Customer satisfaction shot through the roof. So did my pay rate.
Without knowing more about this, it’s impossible to say if this is really the whole story. But the writer is a woman, and the pen name she chose was "James Chartrand." And suddenly life changed. It’s all very plausible if you also remember stories like this and this.
Read the full story at that first link of the post. Amazing.
Very good breakfast at Toasties, and then The Wife and I went for a 5-minute walk. Not onerous at all.