Increase in October-April accumulated rainfall in Australia, 1900-2013. Graphic: CSIRO / BOM

27 January 2015 (CSIRO) – Northern Australian wet season (October to April) rainfall has shown wet and dry decades through the 20th century, but with a slight increase indicated by the linear trend over 1900-2012. In recent decades, increases are discernible across northern and central Australia, with the increase in summer rainfall most apparent since the early 1970s (Figure 4.2.4; Braganza, et al., 2011). The increase has been large enough to raise total Australian rainfall (averaged over the entire continent) by about 50 mm when comparing the 1900-to-1960 period with 1970-to-2013.

Climate Change in Australia: Projections for Australia’s NRM Regions

World oil supply type, New Policies scenario, 2000-2035. Graphic: IEA / Kjell Aleklett

By Kjell Aleklett
15 February 2015

(Aleklett's Energy Mix) – On Tuesday 10 February at 13:00 GMT the IEA released its “Oil Medium-Term Market Report 2015”. The day before the release I was contacted by Jens Ergon at Sveriges Television (“Sweden’s Television”, SVT), who wanted my opinion on the report. I had a number of hours to read through the 140 pages of the version provided to the media ahead of the report’s official release, which meant that I could comment on the report as soon as it was released. SVT has now reported some of those comments in an article by Jens Ergon, “The Price Crash Will Reshape The Oil Market”. The subtitle is, “American oil boom behind the falling price. But opinions vary widely on the future of oil.”

Let’s now go through the article together and I will make a few comments as we do so.

“The comprehensive fall in the price of oil has taken the world’s experts by surprise. Since the summer of 2014 the price of oil has more than halved, from over $100 per barrel to a price today of around $50. Last Tuesday the International Energy Agency, IEA, released its first report since the price fall. In the report, the development is described as the beginning of a new era. The IEA’s press release that accompanied the report stated, “The recent crash in oil prices will cause the oil market to rebalance in ways that challenge traditional thinking about the responsiveness of supply and demand.”

“What is surprising is not that the oil price has fallen, but the severity of the drop and that it continued for half a year, until just a few weeks ago”, commented Daniel Spiro, an economist at Oslo University who focuses on price developments for oil and natural resources.”

Aleklett: If we look back in time, there were similar price falls at the beginning of the 1980s and in 2008. For economists those price falls were as surprising then as this one is now. The interesting thing is that there has always been a good explanation for their occurrence, but economists are very poor at predicting when a fall will occur. I have chosen never to predict the price of oil but to say, always and only, that “the price will be what the market is prepared to pay”.

From end of cheap oil to price crash

“The price crash comes after a number of years of historically high prices. With the exception of the temporary fall that occurred with the financial crisis of 2008-9 the oil price rose steeply during the entire first decade of this century and stayed at around $100 until the summer of 2014. This contrasts with the price of around $20 per barrel during the 1990s. Some researchers have regarded the high oil prices as a sign of Peak Oil, i.e., that the rate of global oil production is near the maximum that is geologically possible. Others have doubted the concept of Peak Oil and asserted that the higher prices will only encourage new, if more expensive, oil production. Data such as that published by the IEA clearly show that the easily produced, so-called “conventional oil” reached maximum production several years ago – at around 70 million barrels per day. But during recent years the introduction of more expensive, so-called “unconventional oil” – such as from the Canadian oil sands and, foremost, US shale oil – has compensated for the fall in conventional oil and has allowed total world oil production to increase somewhat, to just over 74 million barrels per day.”

Aleklett: During the 1990s the oil price even fell below $10 per barrel. It was in 1998 that Colin J. Campbell and Jean H. Laherrère published their famous article, “The End of Cheap Oil”, in Scientific American, at the same time as The Economist wrote that the world was “Drowning in oil”. See my blog post, “How cheap is oil today?”. Colin and Jean wrote that cheap oil would reach a production maximum around 2004, and today we know that the conventional oil that was cheap in 1998 peaked in 2005. From 1998 until 2008 the price of oil rose from $10 per barrel to $147 per barrel. The era of cheap oil was over. Normally economists interpret such a price rise as a sign of scarcity, and in this case we can call that scarcity “Peak Oil”. Despite that, there are many who use any argument, no matter how contrived, to assert that Peak Oil lies far in the future.

Today, many people regard $50 per barrel oil as cheap and as a sign that we are, once again, “drowning in oil”. According to BP, production of crude oil and natural gas liquids totalled 82.6 Mb/d in 2006. If production had continued to grow at the same rate as before, that additional oil would have raised total production to 92.3 Mb/d in 2013. Instead, 2013 saw total production at 86.8 Mb/d. The increase of 4.2 Mb/d that we saw from 2006 to 2013 was not cheap oil. It came from deepwater fields, from Canada’s oil sands, and as NGL and shale oil from fracking in the USA. We can see now that Colin and Jean’s 1998 predictions have proven completely correct.
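As a quick check of the arithmetic above, here is a minimal sketch; the three production figures are the BP numbers as quoted in the text, and simple subtraction is all that is being demonstrated.

```python
# Back-of-the-envelope check of the production figures quoted above (Mb/d).
# The three input numbers are as cited from BP in the text; nothing else here
# comes from BP directly.
prod_2006 = 82.6    # crude oil + NGL production in 2006
prod_2013 = 86.8    # actual total production in 2013
trend_2013 = 92.3   # where the pre-2006 growth trend would have led by 2013

actual_increase = prod_2013 - prod_2006   # oil added between 2006 and 2013
shortfall = trend_2013 - prod_2013        # gap below the old growth trend

print(f"Actual increase 2006-2013: {actual_increase:.1f} Mb/d")  # 4.2
print(f"Shortfall versus trend:    {shortfall:.1f} Mb/d")        # 5.5
```

All of that 4.2 Mb/d increase, as noted above, came from expensive unconventional sources rather than cheap conventional oil.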

As you can see in the figure [above], conventional crude oil reached maximum production in 2005-6 at 70 Mb/d and today is down to 67-68 Mb/d if one includes deepwater oil production as conventional. But deepwater production is expensive. If one also includes oil from the oil sands and shale oil, then the total production rate reaches 74 Mb/d. Note that NGL are not included in these numbers. [more]

The crash in the price of oil may change the oil market – a look at the IEA’s “Oil Medium-Term Market Report 2015”

Severe water stress (left, red) since the 1930s mirrors the decline of large trees (right, red) seen throughout California, from the Sierra Nevada to the Coast Ranges. Graphic: McIntyre, et al., 2015

By Robert Sanders
20 January 2015

BERKELEY – Historical California vegetation data that more than once dodged the dumpster have now proved their true value, documenting that a changing forest structure seen in the Sierra Nevada has actually happened statewide over the past 90 years.

A team of scientists from the University of California, Berkeley, UC Davis and the U.S. Geological Survey compared unique forest surveys collected by UC Berkeley alumnus Albert Wieslander in the 1920s and ’30s with recent U.S. Forest Service data to show that the decline of large trees and increase in the density of smaller trees is not unique to the state’s mountains.

“Older, larger trees are declining because of disease, drought, logging and other factors, but what stands out is that this decline is statewide,” said study leader Patrick McIntyre, who began the research while a postdoctoral fellow at UC Berkeley and now manages biodiversity data for the California Department of Fish and Wildlife. “Forests are becoming dominated by smaller, more densely packed trees, and oaks are becoming more dominant as pines decline.”

The authors found that the density of large trees declined in all regions of California, with declines of up to 50 percent in the Sierra Nevada highlands, the south and central coast ranges, and Northern California.

“Based on our data, water stress helps to explain the decline of large trees,” McIntyre said. “Areas experiencing declines in large-tree density also experienced increased water stress since the 1930s.”

The increased density of smaller trees is usually attributed to fire suppression statewide, he noted. Scientists debate the cause of the decline of larger trees, which has been observed in other parts of the world as well, but many suspect that larger trees need more water than smaller trees to withstand droughts and disease.

Co-author David Ackerly, a professor of integrative biology, said that stressed forests and the loss of large trees could exacerbate the global carbon situation, especially since many are hoping that forests will soak up more and more fossil fuel emissions.

“There’s no question that if you are losing large trees, you are losing the standing carbon in the forest,” he said. “Loss of these big trees and the impact of drought stress become a big concern going forward in terms of its impact on the carbon cycle; they can turn a carbon sink into a source of carbon released to the atmosphere.”

The results may help forecast future forest responses to climate change, and in particular suggest that increased temperatures and changing water availability may lead to large-scale changes in forest composition throughout western North America.

The study was published online this week in the early edition of the Proceedings of the National Academy of Sciences.

Oaks taking over California forests

One change the study observed occurs repeatedly throughout California’s history, as documented by paleoclimatic records in pollen, McIntyre said. Oaks are becoming more prevalent, replacing pines. Pines tend to dominate during cooler, wetter periods.

“Our study shows that areas of greater water stress tend to be dominated more by oaks than by pines, a signal we see despite variation in logging and fire around the state,” McIntyre said.

The study might never have happened if Wieslander’s data, stored both in Sacramento and at Berkeley, had not been saved several times from the trash bin, said co-author Maggi Kelly, a UC Berkeley cooperative extension specialist and professor of environmental science, policy, and management (ESPM). Wieslander acquired the vegetation data while he worked for the California Forest Experiment Station, a Berkeley outpost of the U.S. Forest Service and the forerunner of UC Berkeley’s Department of Forestry, now part of ESPM.

“This is really an astonishingly broad and detailed depiction of vegetation in California at that time and it’s important that through its nearly 100-year life it has almost been lost a number of times,” she said. “Patrick’s is one of the largest and most comprehensive looks at this historic data set in comparison to comparable contemporary data.”

Most of the plots, maps and photos have been digitized thanks to efforts by Kelly, co-author James Thorne of UC Davis and campus librarians who saw future value in the data. Digitization and the study were funded by the Keck Foundation through the Berkeley Initiative on Global Change Biology (BiGCB), as part of an ongoing effort to create an ecological informatics engine, or EcoEngine, for analyzing historical digitized data relating to ecological change.

“All these records are now brought together in digital form in the EcoEngine, which will allow more people to plumb the data and ask more questions, such as, What about logging? What do the photographic records show?” Kelly said. “We need to remember that there are a lot of valuable collections of data that we can use to make inferences about the future.”

Other co-authors are Christopher Dolanc of UC Davis and Alan and Lorraine Flint of the USGS California Water Science Center in Sacramento.

RELATED INFORMATION

Warmer, drier climate altering forests statewide

Ruins of the city of Cantona in the Mexican state of Puebla, with the mountain Cerro Pizarro in the background. The city was abandoned almost 1,000 years ago, probably as a result of a prolonged drought. Photo: Ines Urdaneta / Wikimedia Commons

By Robert Sanders
27 January 2015

BERKELEY (UC Berkeley) – Archaeologists continue to debate the reasons for the collapse of many Mesoamerican cities and states, from Teotihuacan in central Mexico to the Yucatan Maya, and climate change is considered one of the major causes.

A UC Berkeley study sheds new light on this question, providing evidence that a prolonged period of below-average rainfall was partly responsible for the abandonment of one such city, Cantona, between A.D. 900 and A.D. 1050.

At its peak, Cantona, located in a dry, volcanic basin (La Cuenca Oriental) east of today’s Mexico City, was one of the largest cities in the New World, with 90,000 inhabitants. The area was a major source of obsidian, and the city may have played a military role alongside an important trade route from the Veracruz coast into the highlands.

To assess the climate in that area before and after Cantona’s collapse, UC Berkeley geographers analyzed sediment cores from a lake located 20 miles south of the former city. They found evidence of a 650-year period of frequent droughts that extended from around A.D. 500 to about A.D. 1150. This was part of a long-term drying trend in highland Mexico that started 2,200 years ago, around 200 B.C. The climate became wetter again in about A.D. 1300, just prior to the rise of the Aztec empire. [Cultural implications of late Holocene climate change in Cuenca Oriental, Mexico]

“The decline of Cantona occurred during this dry interval, and we conclude that climate change probably played a role, at least towards the end of the city’s existence,” said lead author Tripti Bhattacharya, a UC Berkeley graduate student.

Surprisingly, the population of Cantona increased during the early part of the dry period, perhaps because of political upheaval elsewhere that increased the importance of the heavily fortified city, she said. Teotihuacan, less than 100 miles to the west, was in decline at the time, also possibly because of more frequent droughts.

“In a sense the area became important because of the increased frequency of drought,” said UC Berkeley associate professor of geography Roger Byrne. “But when the droughts continued on such a scale, the subsistence base for the whole area changed and people just had to leave. The city was abandoned.”

Bhattacharya, Byrne and their colleagues report their findings in an article appearing this week in the early edition of the journal Proceedings of the National Academy of Sciences. The UC Berkeley researchers analyzed lake cores provided by scientists at the National Autonomous University of Mexico in Juriquilla, Querétaro, Mexico and the German Research Centre for Geosciences in Potsdam, Germany.

Political upheaval and climate change

Byrne emphasized that the area’s typical monsoon weather with wet summers and dry winters did not stop, but was interrupted by frequent short-term droughts, no doubt affecting crops and water supplies. Today the area is close to the northern limit of maize production without irrigation, and would have been particularly vulnerable to drier conditions, he said.

Byrne, a member of the Berkeley Initiative on Global Change Biology (BiGCB) and curator of fossil pollen in the Museum of Paleontology, has studied sediment cores from many lakes in Mexico and California, and is particularly interested in possible links between climate change and human activities.

Nearly 20 years ago, he learned of Cantona and traveled with students to the area three times to obtain cores from lakes near the site, most of which are maar lakes formed by explosive interactions between magma and groundwater. They are deep and often contain undisturbed, regularly layered sediments ideal for chronological studies.

German colleagues cored this particular lake, Aljojuca, in 2007, and Bhattacharya traveled to Potsdam to collect sediment samples. Oxygen isotope ratios in carbonate sediments are correlated with the ratio of precipitation to evaporation and thus indicate aridity. Organic material in the sediments was used for accelerator mass spectroscopy carbon-14 dating.
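For readers unfamiliar with the notation, here is a minimal sketch of the delta arithmetic behind such isotope records, assuming the usual VPDB carbonate reference standard; the sample ratios are invented for illustration and are not values from the Aljojuca cores.

```python
# Standard delta-notation arithmetic for oxygen-isotope records.
# The sample ratios below are invented for illustration only.

def delta_o18(r_sample: float, r_standard: float) -> float:
    """Return delta-18O in per mil relative to the chosen standard."""
    return (r_sample / r_standard - 1.0) * 1000.0

R_VPDB = 0.0020672  # approximate 18O/16O ratio of the VPDB carbonate standard

# Drier intervals (more evaporation relative to precipitation) leave lake
# carbonates enriched in the heavy isotope, i.e., with higher delta-18O.
print(f"wetter interval: {delta_o18(0.0020630, R_VPDB):+.2f} per mil")
print(f"drier interval:  {delta_o18(0.0020700, R_VPDB):+.2f} per mil")
```

Higher delta-18O values in the carbonate layers thus flag the drier intervals discussed above.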

“We can show that both the growth and decline of the site took place during a time period of frequent drought, which forces us to think in more nuanced ways about how political and social factors interact with environmental factors to cause social and cultural change,” Bhattacharya said. “That makes the study particularly interesting.”

Bhattacharya noted that more studies are necessary to reconstruct the prehistoric climate of highland Mexico. Such studies could reveal the causes of prehistoric climatic change and whether they were similar to the factors that regulate the region’s climate today, such as the El Niño/Southern Oscillation.

Co-authors include Harald Böhnel and Kurt Wogau of UNAM, Juriquilla; Ulrike Kienel of the German Research Center for Geosciences in Potsdam; B. Lynn Ingram of UC Berkeley; and Susan Zimmerman of Lawrence Livermore National Laboratory. The work was funded by the National Science Foundation.

Long dry spell doomed Mexican city 1,000 years ago

Land use and climate pollution, 1990-2010. Values are for average annual greenhouse gas emissions, in gigatons of CO2 equivalent, for the agriculture and forestry sectors. Graphic: Climate Central

By John Upton
3 February 2015

(Climate Central) – Efforts to restrain deforestation are working, which means pollution from farming has increased in importance.

The federal raids in Alta Floresta, Brazil, surprised locals in 2005. The year before, nearly 60,000 acres of rainforest had been torn out of the municipality. Now farmers and loggers were being arrested by armed police, accused of environmental crimes. “It was a radical operation,” the newly elected mayor later recalled during an interview with a Princeton University researcher. “All our economic activity stopped.”

A few years later, Brazil’s central bank made it harder for property owners there, and in 35 other blacklisted areas, to borrow money unless they proved they were protecting the rainforest. The campaign marked a sharp change from the 1970s, when the federal government, then a military dictatorship, had encouraged clearcutting. Now the federal government was cracking down on it — and doing so successfully. In 2010, fewer than 1,000 acres of Alta Floresta were deforested.

Efforts such as these to slow deforestation have delivered some of humanity’s few gains in its otherwise lackadaisical battle so far against global warming. A gradual slowdown in chainsawing and bulldozing, particularly in Brazil, helped reduce deforestation’s annual toll on the climate by nearly a quarter between the 1990s and 2010.

A new study describes how this trend has seen agriculture overtake deforestation as the leading source of land-based greenhouse gas pollution during the past decade. While United Nations climate negotiations focus heavily on forest protections, the researchers note that delegates to the talks ignore similar opportunities to reform farming.

“The decline in deforestation over the past decade or two is a success story,” Rob Jackson, a professor at Stanford University’s earth sciences school, said. He was not involved with the new study. The deforestation slowdown has, “in large part,” he said, been driven by new forestry rules in Brazil, by the U.N.’s Reducing Emissions from Deforestation and Forest Degradation (REDD) program, which funds forest conservation, and similar policies elsewhere.

The new study, led by the U.N. Food and Agriculture Organization and published in Global Change Biology, quantifies the reductions in climate pollution from the degradation and clearcutting of forests. Clearcutting most often clears space for agriculture, suggesting agriculture’s indirect climate impacts surpass the impacts of deforestation for timber and other commodities. The researchers aim to tally those indirect impacts later this year. This paper was an early step in a larger effort to better understand and report on the climate repercussions of how land is used. “Every year, we’ll have updates,” lead author Francesco Tubiello said. […]

“We’re seeing an expansion of agricultural lands in some areas because of the growing global population,” Jackson, who is a co-chair of the Global Carbon Project, which studies the global carbon cycle, said. “We’re also seeing intensification of agriculture.”

Although annual climate pollution from deforestation is declining, experts warn that recent gains could quickly be reversed. Deforestation in the Amazon rainforest spiked recently following nearly a decade of declines, for example, as farmers and loggers rushed to exploit loopholes in forest protection laws. Some parts of Central Africa are seeing deforestation in areas where it was not previously a problem. And cutting down trees can reduce moisture levels in a rainforest, which could cause parts of the Amazon to start dying off — even if everybody’s chainsaws simultaneously jammed. [more]

Farming Now Worse for Climate Than Clearing Forests

In this photo taken in September 2005 and provided by Millie Hawley, Kivalina, an Inupiat Eskimo village, is seen on a barrier island off the coast of northwest Alaska. Inupiat Eskimo villagers in the Chukchi Sea village of Kivalina rely on wild animals to survive, but a recent arrival associated with climate warming is causing health concerns. Photo: Millie Hawley / AP Photo

By Michael Walsh
25 February 2015

(Yahoo News) – Climate change is forcing an isolated Alaskan village, roughly 80 miles above the Arctic Circle, to relocate.

The very existence of Kivalina, a town with about 400 residents on a tiny barrier island off Alaska's northwest coast, is under threat as Arctic sea ice continues to melt into the surrounding Chukchi Sea.

Now the whaling community needs to figure out where to move the town and how to pay for it, after several previous attempts failed. It’s a dilemma that could become more common as global warming continues, scientists warn.

Colleen Swan, who was born and raised in Kivalina, says residents realized they were in serious trouble during 2004’s fall storm surges, when the ice that had typically protected the island had not formed yet — leaving them vulnerable.

“We need to get off this island. We can’t stay here. It’s not an option anymore,” she said in an interview with Yahoo News.

A defensive wall has been erected, but that can only buy a bit more time. Swan says the threat of climate change extends far beyond Kivalina — and people should be prepared. 

“We’re not the only ones that this is happening to, and it’s coming to an area near you,” Swan said. “You should become familiar with the environment around you and be aware of the disaster response plans in your area.”

Christine Shearer, the program director for an energy research organization called CoalSwarm, says it could cost up to $100 million to move the village, according to federal government estimates.

“There are four villages that need to be relocated imminently,” she said in an interview with Yahoo News. “The problem will likely get worse and more communities will be affected.”

According to BBC News, the U.S. Army Corps of Engineers estimates that Kivalina will be uninhabitable by 2025.

But Shearer, author of a book on the plight of Kivalina, says it’s a little strange to give a specific deadline, because tragedy can strike at any time.

“There could all of a sudden be a huge storm that causes a lot of damage or floods the village. The issue isn’t necessarily slow and steady erosion,” she said. [more]

The front line of climate change: Alaska village must relocate as Arctic sea ice thins

(a, b) Microplastics present in the mouth and among the mesenteries of coral polyps; (c) plastic fragments found in plankton tows in reef waters. Graphic: Hall, et al., 2015

By Oliver Milman
24 February 2015

(The Guardian) – Corals such as those found on the Great Barrier Reef are at risk from the estimated 5 trillion pieces of plastic in the world’s oceans, after researchers discovered that they ingest tiny fragments of plastic at a significant rate.

A study led by the ARC centre of excellence for coral reef studies at James Cook University found that corals consumed “microplastics” – plastics measuring under 5mm – at about the same rate as their normal food.

These small plastics were found deep within the gut cavity tissue of analysed corals, showing that they weren’t able to expel the fragments.

Dr Mia Hoogenboom, who worked on the research, said: “Corals are not very selective in what they eat and they are sensitive to a range of environmental stressors.

“We know in other animals that plastics block feeding activities, as well as soak up toxins. It’s quite worrying and it’s a reminder that we can manage this kind of stress on the reef at a local level, as well as looking at larger challenges such as climate change.”

Researchers took hard corals from different colonies on the central Great Barrier Reef and put them in separate chambers of water, with one chamber left empty of corals as a control to track what happened to the plastics.

Plastic fragments weighing 0.4g per litre of water were added, and the corals’ reactions were tested over different time periods during the course of a month.

Researchers found that the corals ingested plastics at about the same rate as their standard food, such as zooplankton.

Hoogenboom said that while corals benefited from the process of photosynthesis, they also required nutrients from consumed food and would suffer a “very slow process of starvation” should their stomachs become overloaded with plastic.

“In my opinion we need a general focus on cleaning up plastic pollution, to clean up beaches and reduce the amount of plastics in the waterways and into the oceans,” she said. “It’s a significant problem globally.”

Research published in December estimated that there are more than 5tn pieces of plastic, collectively weighing nearly 269,000 tonnes, floating in the world’s oceans.

Large pieces of plastic can strangle animals such as seals, while smaller pieces are ingested by fish and then travel up the food chain, all the way to humans.

It is expected this problem will worsen due to the rise of throwaway plastic, such as drinks containers and food packaging, with only 5% of the world’s plastic recycled at present.

A separate study published this month found that coastal populations dumped 8m tonnes of plastic rubbish into the oceans in 2010, equivalent to five full shopping bags of debris for every foot of coastline in the nearly 200 countries surveyed.

The coastline of Australia, including the Great Barrier Reef, is not short of plastic pollution, with a 2013 study finding that each square kilometre of Australia’s sea-surface water is contaminated by about 4,000 pieces of tiny plastic. [more]

Corals face 'slow starvation' from ingesting plastics pollution, experts find

Superbug Clostridium difficile is responsible for nearly 29,000 deaths in the U.S. each year. Graphic: ABC News

By Liz Neporent
26 February 2015

(ABC News) – A stubborn, hard-to-treat “super bug” causes more than 450,000 infections a year and is directly responsible for nearly 15,000 deaths in the United States, a report from the Centers for Disease Control and Prevention revealed today.

Clostridium difficile, or C. diff, is a bacterial infection that leads to inflammation of the colon, the agency explained. The bacterium is found in feces, the agency said, and is spread by hand contact or contaminated surfaces.

"It’s the most common infection picked up in hospitals," said ABC News Chief Health and Medical Editor Dr. Richard Besser. "The thing about this infection is you can pick it up and it can cause no problems. Then, you take an antibiotic and it takes over."

More than 80 percent of C. diff deaths were among people 65 or older, with residents of nursing homes especially vulnerable to infection, the report said.

“C. difficile infections cause immense suffering and death for thousands of Americans each year,” CDC Director Dr. Tom Frieden said.

Hospital stays and especially long-term antibiotic use seem to raise the risk of C. diff infection.

“Antibiotics kill off beneficial bacteria in the gut which fight infection, leaving space for C. diff to come in and release its toxins,” explained Dr. William Schaffner, an infectious disease expert with Vanderbilt University School of Medicine in Nashville, Tennessee.

Studies show that more than half of hospital patients receive antibiotics at some point in their stay, and up to 50 percent of antibiotic use is unnecessary. Over-prescribing antibiotics, combined with poor infection control, may allow the spread of C. diff and other bacteria within a facility and to other facilities when a sick patient is transferred, the CDC report speculated.

The CDC report said preventing and controlling C. diff should be a national priority. The infection accounts for up to $4.8 billion each year in excess health care costs, the agency reported. [more]

'Super Bug' Linked to Antibiotic Use Kills Nearly 15,000 Annually


RESULTS: A total of 15,461 cases of C. difficile infection were identified in the 10 geographic areas; 65.8% were health care–associated, but only 24.2% had onset during hospitalization. After adjustment for predictors of disease incidence, the estimated number of incident C. difficile infections in the United States was 453,000 (95% confidence interval [CI], 397,100 to 508,500). The incidence was estimated to be higher among females (rate ratio, 1.26; 95% CI, 1.25 to 1.27), whites (rate ratio, 1.72; 95% CI, 1.56 to 2.0), and persons 65 years of age or older (rate ratio, 8.65; 95% CI, 8.16 to 9.31). The estimated number of first recurrences of C. difficile infection was 83,000 (95% CI, 57,000 to 108,900), and the estimated number of deaths was 29,300 (95% CI, 16,500 to 42,100). The North American pulsed-field gel electrophoresis type 1 (NAP1) strain was more prevalent among health care–associated infections than among community-associated infections (30.7% vs. 18.8%, P<0.001).

Burden of Clostridium difficile Infection in the United States
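For context on the statistics quoted in those results, here is a hedged sketch of how a rate ratio and its 95% confidence interval are commonly computed from case counts and person-time, using a Wald interval on the log scale; the counts below are invented, since the paper’s underlying inputs are not reproduced in this excerpt.

```python
import math

def rate_ratio_ci(cases_a: int, time_a: float,
                  cases_b: int, time_b: float, z: float = 1.96):
    """Rate ratio of group A vs. group B with a Wald 95% CI on the log scale."""
    rr = (cases_a / time_a) / (cases_b / time_b)
    se_log = math.sqrt(1.0 / cases_a + 1.0 / cases_b)  # SE of log(rate ratio)
    return rr, rr * math.exp(-z * se_log), rr * math.exp(z * se_log)

# Invented example: infection counts and person-years, females vs. males.
rr, lo, hi = rate_ratio_ci(9000, 1_000_000, 7200, 1_008_000)
print(f"rate ratio {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")  # ~1.26 (1.22 to 1.30)
```

The paper’s own estimates come from a modeled adjustment for predictors of incidence, so this is only the basic form of the calculation, not the study’s actual method.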


Robert Kenner, director of 'Merchants of Doubt', poses for a portrait during the 2014 Toronto International Film Festival on 8 September 2014 in Toronto, Ontario. Photo: Maarten de Boer / Getty

By Greg Evans  
1 March 2015

(Newsweek) – In Merchants of Doubt, their 2010 book that vivisects bad science and industrial cynicism, science historians Naomi Oreskes and Erik M. Conway decried the uneven battle for the popular imagination fought, on one side, by scientists ill-equipped for high-volume cable-TV tussles and, on the other, by the “well-financed contrarians” bent on dismantling whatever lab results, peer-reviewed theories and settled science might lead to even the most benign corporate regulations.

The authors unraveled the deny-and-obfuscate tactics concocted in the 1950s by Mad Men and Big Tobacco to cloud understanding of what even the proto-mainstream media was beginning to grasp. “Cancer by the Carton,” read a 1953 headline in Reader's Digest. “Doubt,” countered a public relations memo exhumed decades later from Big Tobacco's yellowed files, “is our product.”

And doubt, argued Oreskes and Conway, became the mantra for purveyors of acid rain, ozone holes and, most significant, global warming. Keep the cigarettes burning, the CO2 combusting and the profits flowing for as long as possible.

Joining the fray is filmmaker Robert Kenner, whose surprisingly rollicking screen adaptation of Merchants of Doubt opens March 6 in New York and Los Angeles. It’s a worthy follow-up to his 2008 Oscar-nominated Food, Inc., which arrived when Americans were primed to point fat fingers at Big Agra. This time, Doubt lands amid a national debate over science—legit, pseudo or just plain bad—that intensifies with every foot of Boston snow or new case of Disneyland measles.

Along with corporate greed and Madison Avenue chicanery, Kenner’s film exposes a devoted and long-lived cadre of scientists (and their philosophical descendants) who established their careers during the A-bomb era and the Cold War’s Big Science rivalries. Anti-communist ideologues, these well-trained and often brilliant scientists, such as physicists S. Fred Singer and the late Frederick Seitz, saw (and see) corporate regulation as a pathway to socialism, an endgame more fearsome than any secondhand smoke or patchy ozone.

Kenner spoke to Newsweek as he was heading for the Ambulante Documentary Film Festival in Mexico City, the latest stop on his festival circuit after Telluride, Toronto and New York. He was prepared, he said, for more of the anti-science vitriol documented in his film. “It’s pretty amazing, this anger out there. … I'm going to be attacked. I just hope it only takes the form of written words.” [more]

Exposing the Doubt-Mongers Trying to Convince You Climate Change Isn’t Real

 
