Top: Multimodel mean summer (JJA) PDSI and standardized soil moisture (SM-30cm and SM-2m) over North America for 2050–2099 from 17 CMIP5 model projections using the RCP 8.5 emissions scenario. SM-30cm and SM-2m are standardized to the same mean and variance as the model PDSI over the calibration interval from the associated historical scenario (1931–1990). Dashed boxes represent the regions of interest: the Central Plains (105°W–92°W, 32°N–46°N) and the Southwest (125°W–105°W, 32°N–41°N). Bottom: Regional average time series of the summer season moisture balance metrics from the NADA and CMIP5 models. The observational NADA PDSI series (brown) is smoothed using a 50-year loess spline to emphasize the low-frequency variability in the paleo-record. Model time series (PDSI, SM-30cm, and SM-2m) are the multimodel means averaged across the 17 CMIP5 models, and the gray shaded area is the multimodel interquartile range for model PDSI. Graphic: Cook, et al., 2015
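The standardization described in the caption (rescaling each soil moisture series to the mean and variance of the model PDSI over the 1931–1990 calibration interval) is a simple linear transform. A minimal sketch, using synthetic series rather than the study's data:

```python
import numpy as np

def standardize_to_reference(series, reference, calib):
    """Rescale `series` so that over the calibration interval (boolean
    mask `calib`) it has the same mean and variance as `reference`."""
    s, r = series[calib], reference[calib]
    z = (series - s.mean()) / s.std()   # z-score relative to calibration stats
    return z * r.std() + r.mean()       # impose the reference mean and variance

# Synthetic example: hypothetical SM-30cm and model PDSI values, 1931-1999
years = np.arange(1931, 2000)
rng = np.random.default_rng(0)
sm = rng.normal(0.3, 0.05, years.size)   # soil moisture (fraction)
pdsi = rng.normal(0.0, 2.0, years.size)  # model PDSI (index units)
calib = (years >= 1931) & (years <= 1990)

sm_std = standardize_to_reference(sm, pdsi, calib)
```

Over the calibration years, `sm_std` now has exactly the PDSI mean and standard deviation, so the two metrics can be plotted on a common axis, as in the figure.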

By Eric Holthaus
12 February 2015

(Slate) – When it comes to drought in the West, we ain’t seen nothin’ yet. That’s the conclusion from a new study that links an increasing risk of decades-long drought episodes in the western United States to human-induced climate change. The study predicts drought severity outside the bounds of what’s thought to have occurred over the past 1,000 years, based on local tree-ring records.

“It’s certainly not good news,” said co-author Jason Smerdon, a climate scientist at Columbia University’s Lamont-Doherty Earth Observatory. The study was published Thursday in the inaugural issue of Science Advances, an open-access journal from AAAS, the same publisher as Science.

Smerdon’s study is the first to examine the future risk of “megadrought” in the southwest and central United States in the context of historical episodes of drought in the same regions. It suggests that the coming years are likely to see droughts worse than the epic dry periods that are thought to have caused profound changes to human settlement in the region over the last millennium.

“They’re ‘mega’ because they are droughts that lasted in these regions for multiple decades,” said Smerdon in an interview with Slate. “We haven’t seen anything like this since at least the 1400s.” In comparison, the current California drought is four years old, though drought has been present somewhere in the West in most of the last 15 years. [more]

The United States of Megadrought

ABSTRACT: In the Southwest and Central Plains of Western North America, climate change is expected to increase drought severity in the coming decades. These regions nevertheless experienced extended Medieval-era droughts that were more persistent than any historical event, providing crucial targets in the paleoclimate record for benchmarking the severity of future drought risks. We use an empirical drought reconstruction and three soil moisture metrics from 17 state-of-the-art general circulation models to show that these models project significantly drier conditions in the latter half of the 21st century compared to the 20th century and earlier paleoclimatic intervals. This desiccation is consistent across most of the models and moisture balance variables, indicating a coherent and robust drying response to warming despite the diversity of models and metrics analyzed. Notably, future drought risk will likely exceed even the driest centuries of the Medieval Climate Anomaly (1100–1300 CE) in both moderate (RCP 4.5) and high (RCP 8.5) future emissions scenarios, leading to drought conditions unprecedented during the last millennium.

Unprecedented 21st century drought risk in the American Southwest and Central Plains

An aerial view of the Aspen Free-Air Carbon dioxide and ozone Enrichment (Aspen FACE) experiment site once located near Rhinelander, Wisconsin. The circular plots consist of aspen and birch trees, surrounded by PVC pipes that allowed scientists to vent carbon dioxide and ozone gas into the air around the trees. Photo: Rick Anderson / Skypixs Aerials

By Kelly April Tyrrell
2 March 2015

(UW-Madison News) – In a high carbon dioxide world, the trees would come out ahead. Except for the munching bugs.

A new study published today [Monday, March 2, 2015] in Nature Plants shows that hungry, plant-eating insects may limit the ability of forests to take up elevated levels of carbon dioxide in the atmosphere, reducing their capacity to slow human-driven climate change.

The finding is significant because climate change models typically fail to consider changes in the activities of insects in the ecosystem, says Richard Lindroth, a professor of ecology at the University of Wisconsin-Madison and the leader of the study. The research suggests it’s time to add insects to the models.

Carbon dioxide typically makes plants grow faster and use nutrients more efficiently. But the amount of damage caused by leaf-munching bugs in the study nearly doubled under high carbon dioxide conditions, leading to an estimated 70 grams of carbon-sequestering biomass lost per square meter per year.

“This is the first time, at this scale, that insects have been shown to compromise the ability of forests to take up carbon dioxide,” Lindroth says.

In addition, as feeding increased, more nutrients moved from the canopy to the forest floor in the form of insect fecal material and chewed-on leaf scraps, mixing into the soil and likely altering the nutrient profile of the forest.

“Insects are munching on leaves and they’re pooping out remnants, so they are changing the timing of nutrient cycling as well as the quality,” Lindroth says.

John Couture, a former graduate student in Lindroth’s lab and the lead author of the study, spent three years with his team studying the impact on stands of aspen and birch of elevated carbon dioxide alone, elevated ozone (which is highly toxic to plants) alone, and elevated levels of both gases combined. The trees grew in what was once one of the largest simulated ecosystems in the world, the Aspen Free-Air Carbon dioxide and ozone Enrichment (Aspen FACE) experiment located near Rhinelander, Wisconsin.

Unlike a greenhouse or atmospheric chamber, the FACE site (now decommissioned) was a massive outdoor experimental area that allowed trees to grow under natural conditions, like natural soil, sunlight, and rainfall. The only artificial conditions were those that were experimentally manipulated.

The site consisted of a dozen stands of trees growing in 30-meter-diameter plots, surrounded by a network of PVC pipes designed to vent gases into the environment around them.

The trees were exposed to carbon dioxide and ozone at levels predicted for the year 2050, although Lindroth says the 560 parts-per-million carbon dioxide level studied is probably too low.

The trees were planted as saplings in the mid-1990s, and by the time Couture collected data for the study from 2006 through 2008, they had grown to resemble any number of the disturbed forest stands found throughout Wisconsin.

Couture and his team walked through each site, clipping leaves from the canopy using scissors at the end of pruner poles or from scaffolding near the top of the canopy. They also set out frass baskets — laundry baskets lined with sheets — to collect scraps of leaves dropped by messy, munching caterpillars and other bugs dining in the canopy, and to collect their fecal droppings.

Tens of thousands of leaves and countless frass baskets later, Couture measured the amount of leaf area consumed by the insects in each plot and sifted through the frass and food droppings in the baskets to assess just how much eating the bugs were doing, to measure the amount of nutrients leaving the trees via their droppings, and to assess the loss of tree biomass.

Why insects would do more munching in a carbon dioxide rich forest is in part a matter of chemistry. Because carbon dioxide is a limiting resource for plant growth, high levels of the gas change the way trees use other resources, like nitrogen, typically leading to less nutritious plants.

“It’s like a slice of Wonder Bread versus a slice of high density, protein-rich bakery bread; there’s a lot more protein in the bakery bread than the white bread,” says Couture. “Insects have a base level of nutrients they need in order to grow and to reach that, they can choose either to eat higher-nutrient food — unfortunately, insects don’t always have that choice — or to eat more.”

Overall, the team found high ozone plots were less hospitable to insects, reducing their munching behavior and leading to less biomass loss.

With the findings, the researchers created models allowing them to predict what could happen in forests under changing environmental conditions.

“The big question is, will northern forests grow faster under elevated carbon dioxide?” says Lindroth. “Carbon dioxide is a substrate for photosynthesis. It gets converted into sugars, which then become plant biomass. Will trees take up more carbon dioxide and thus help reduce its increase in the atmosphere?”

As humans continue to contribute more carbon dioxide to Earth’s atmosphere, the answer should be yes as trees act as sponges for the greenhouse gas. But it turns out, very hungry caterpillars and their bug brethren — in their own quest for food in an elevated carbon dioxide environment — may limit that growth and reduce the capacity of forests to slow climate warming.

Munching bugs thwart eager trees, reducing the carbon sink

ABSTRACT: Stimulation of forest productivity by elevated concentrations of CO2 is expected to partially offset continued increases in anthropogenic CO2 emissions. However, multiple factors can impair the capacity of forests to act as carbon sinks; prominent among these are tropospheric O3 and nutrient limitations1,2. Herbivorous insects also influence carbon and nutrient dynamics in forest ecosystems, yet are often ignored in ecosystem models of forest productivity. Here we assess the effects of elevated levels of CO2 and O3 on insect-mediated canopy damage and organic matter deposition in aspen and birch stands at the Aspen FACE facility in northern Wisconsin, United States. Canopy damage was markedly higher in the elevated CO2 stands, as was the deposition of organic substrates and nitrogen. The opposite trends were apparent in the elevated O3 stands. Using a light-use efficiency model, we show that the negative impacts of herbivorous insects on net primary production more than doubled under elevated concentrations of CO2, but decreased under elevated concentrations of O3. We conclude that herbivorous insects may limit the capacity of forests to function as sinks for anthropogenic carbon emissions in a high CO2 world.

Insect herbivory alters impact of atmospheric change on northern temperate forests

Rangers stand guard over a rhinoceros in South Africa. Photo: Air Shepherd

[Air Shepherd brings rare good news from the poaching front. –Des]

By Ben Guarino
2 March 2015

(The Dodo) – Rhino poaching in parts of South Africa has dropped to essentially nil, thanks to a combination of cheap drones and complex number-crunching.

"Since October 1," said Thomas Snitch, Ph.D., the University of Maryland professor who helped get the program off the ground, "not one rhino has been killed where we're flying."

Elsewhere, unfortunately, poaching is still a serious problem for elephants and rhinoceros. With more than 1,200 animals killed globally for their horns, 2014 was the worst year on record for illegal rhino hunting. We can't graft wings on a 7,000-pound mammal so that he can evade illegal hunters. But the next best thing? Drones, baby, drones — plus a University of Maryland supercomputer.

To plot a drone's course, the computer relies on the previous two years of rhino-tracking data, updated once a week. It can predict where the rhinos will be on a given evening with, typically, more than 90 percent accuracy. This info is emailed halfway around the globe from the Maryland computer — though Snitch hopes the program will soon be handed off to African universities — then downloaded and transferred to drone via USB stick.
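The article describes the prediction step only at a high level: rank locations by where tracked rhinos have actually been. As a purely illustrative sketch (nothing here is the actual Maryland system, and the grid cells are hypothetical), one could score grid cells by historical sighting frequency:

```python
from collections import Counter

def predict_hotspots(sightings, top_n=3):
    """Rank grid cells by how often rhinos were sighted there.
    `sightings` is a list of (grid_cell, date) tracking records."""
    counts = Counter(cell for cell, _ in sightings)
    return [cell for cell, _ in counts.most_common(top_n)]

# Hypothetical two-year tracking history
history = [("C4", "2013-06-01"), ("C4", "2013-07-08"), ("B2", "2014-06-02"),
           ("C4", "2014-07-01"), ("B2", "2014-07-09"), ("D1", "2014-08-03")]
print(predict_hotspots(history, top_n=2))  # ['C4', 'B2']
```

Drone flight paths would then be planned over the top-ranked cells for a given evening; the real system reportedly achieves better than 90 percent accuracy using far richer data than this toy example.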

Their trails thus blazed, the drones give a ranger team enough time to plot an interdiction course between poacher and rhino. The idea, according to Snitch, is that you don't have to find the poachers — just the animals most likely to end up in harm's way. As massive as a rhino or elephant is, however, blindly patrolling a preserve the size of Texas won't cut it.

The ace in the hole? "Poachers, like people all over the world, are lazy," Snitch said, speaking during a recent lecture at the New York University Digital Animals Conference. (Full disclosure: The Dodo was one of the conference's sponsors.) The hunters mainly stick close to roads and rivers, for easy access to and from a kill.

[…]

The drones in Air Shepherd's program are inexpensive. This is fighting poaching in "an African way," Snitch said, "not an American way." [more]

Rhino Poaching Grinds To A Halt Under Drones' Watch

Global sea level in December 2014. Graphic: NASA

(NASA) – Sea level rise is caused primarily by two factors related to global warming: the added water coming from the melting of land ice and the expansion of sea water as it warms. This chart tracks the change in sea level since 1993 as observed by satellites.

Sea Level

The average annual sea ice thickness, in meters, for the central Arctic Ocean. Red dots are submarine records. The green line is the long-term trend. Graphic: Lindsay and Schweiger, 2015

By Hannah Hickey
3 March 2015

(UW Today) – It’s no surprise that Arctic sea ice is thinning. What is new is just how long, how steadily, and how much it has declined. University of Washington researchers compiled modern and historic measurements to get a full picture of how Arctic sea ice thickness has changed.

The results, published in The Cryosphere, show a thinning in the central Arctic Ocean of 65 percent between 1975 and 2012. September ice thickness, when the ice cover is at a minimum, is 85 percent thinner for the same 37-year stretch.

“The ice is thinning dramatically,” said lead author Ron Lindsay, a climatologist at the UW Applied Physics Laboratory. “We knew the ice was thinning, but we now have additional confirmation on how fast, and we can see that it’s not slowing down.”

The study helps gauge how much the climate has changed in recent decades, and helps better predict an Arctic Ocean that may soon be ice-free for parts of the year.

The project is the first to combine all the available observations of Arctic sea ice thickness. The earlier period from 1975 to 1990 relies mostly on under-ice submarines. Those records are less common since 2000, but have been replaced by a host of airborne and satellite measurements, as well as other methods for gathering data directly on or under the ice.

“A number of researchers were lamenting the fact that there were many thickness observations of sea ice, but they were scattered in different databases and were in many different formats,” Lindsay said. The U.S. National Oceanic and Atmospheric Administration funded the effort to compile the various records and match them up for comparison.

The data also include measurements from NASA’s ICESat satellite, which operated from 2003 to 2008; IceBridge aircraft-based measurements that NASA is conducting until its next satellite launches; long-term under-ice moored observations in the Beaufort Sea from the Woods Hole Oceanographic Institution; and other measures from aircraft and instruments anchored to the seafloor.

The older submarine records were unearthed for science by former UW professor Drew Rothrock, who used the U.S. Navy submarine measures of ice thickness to first establish the thinning of the ice pack through the 1990s. Vessels carried upward-looking sonar to measure the ice draft so they knew where they could safely surface. Further analysis of those records found a 36 percent reduction in the average thickness in the quarter century between 1975 and 2000.

“This confirms and extends that study,” Lindsay said. The broader dataset and longer time frame show that what had looked like a leveling off in the late 1990s was only temporary. Instead, adding another 12 years of data almost doubles the amount of ice loss.

The observations included in the paper all have been entered in the Unified Sea Ice Thickness Climate Data Record that now includes around 50,000 monthly measurements standardized for location and time. The archive is curated by scientists at the UW Applied Physics Laboratory and stored at the U.S. National Snow and Ice Data Center.

Lindsay also is part of a UW group that produces a widely cited calculation of monthly sea-ice volume, which combines weather data, sea-surface temperatures, and satellite measurements of sea ice concentration to generate ice thickness maps. Critics have said those estimates of sea ice losses seemed too rapid and questioned their basis in a numerical model. But the reality may be changing even faster than the calculations suggest.

“At least for the central Arctic basin, even our most drastic thinning estimate was slower than measured by these observations,” said co-author Axel Schweiger, a polar scientist at the UW Applied Physics Laboratory.

The new study, he said, also helps confirm the methods that use physical processes to calculate the volume of ice each month.

“Using all these different observations that have been collected over time, it pretty much verifies the trend that we have from the model for the past 13 years, though our estimate of thinning compared to previous decades may have been a little slow,” Schweiger said.

The new paper only looks at observations up to the year 2012, when the summer sea ice level reached a record low. The two years since then have had slightly more sea ice in the Arctic Ocean, but the authors say they are not surprised.

“What we see now is a little above the trend, but it’s not inconsistent with it in any way,” Lindsay said. “It’s well within the natural variability around the long-term trend.”

Additional funding for the project was from the National Science Foundation and NASA.



On thin ice: Combined Arctic ice observations show decades of loss

ABSTRACT: Sea ice thickness is a fundamental climate state variable that provides an integrated measure of changes in the high-latitude energy balance. However, observations of mean ice thickness have been sparse in time and space, making the construction of observation-based time series difficult. Moreover, different groups use a variety of methods and processing procedures to measure ice thickness, and each observational source likely has different and poorly characterized measurement and sampling errors. Observational sources used in this study include upward-looking sonars mounted on submarines or moorings, electromagnetic sensors on helicopters or aircraft, and lidar or radar altimeters on airplanes or satellites. Here we use a curve-fitting approach to determine the large-scale spatial and temporal variability of the ice thickness as well as the mean differences between the observation systems, using over 3000 estimates of the ice thickness. The thickness estimates are measured over spatial scales of approximately 50 km or time scales of 1 month, and the primary time period analyzed is 2000–2012 when the modern mix of observations is available. Good agreement is found between five of the systems, within 0.15 m, while systematic differences of up to 0.5 m are found for three others compared to the five. The trend in annual mean ice thickness over the Arctic Basin is −0.58 ± 0.07 m decade⁻¹ over the period 2000–2012. Applying our method to the period 1975–2012 for the central Arctic Basin where we have sufficient data (the SCICEX box), we find that the annual mean ice thickness has decreased from 3.59 m in 1975 to 1.25 m in 2012, a 65% reduction. This is nearly double the 36% decline reported by an earlier study. These results provide additional direct observational evidence of substantial sea ice losses found in model analyses.
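The abstract's headline figures are easy to cross-check; the values below are taken directly from the abstract:

```python
# Mean annual ice thickness in the SCICEX box, from the abstract
t_1975, t_2012 = 3.59, 1.25  # meters

reduction = (t_1975 - t_2012) / t_1975
print(f"{reduction:.0%}")  # 65%, as stated

# The 2000-2012 trend of -0.58 m/decade over 1.2 decades implies
# roughly 0.7 m of loss in that window alone
loss_recent = 0.58 * 1.2
print(f"{loss_recent:.2f} m")  # 0.70 m
```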

Arctic sea ice thickness loss determined using subsurface, aircraft, and satellite observations

A polar bear is perched on the edge of an ice floe. Watching a polar bear gorge on birds' eggs high on the cliffs, driven there by a lack of ice and of opportunities to fish, Seaman felt the effects of climate change hit home. 'It's painful to see the devastation,' Seaman says. 'To know what is being lost, and what we may not get back.' Photo: Camille Seaman

By Duncan McCue
2 March 2015

(CBC News) – It was by chance that Camille Seaman first travelled north — a bumped flight on Alaska Airlines led to a free trip to Kotzebue on the Bering Strait.

Little did the San Francisco-based photographer know it was the beginning of a decade-long quest, an unshakable compulsion to take pictures of icebergs in some of the most extreme environments on Earth.

At the time, she simply wanted to document beauty in visually stunning vistas of the Arctic and Antarctica.

She wasn’t thinking about climate change then, but as the newest artist-in-residence at Denali National Park in central Alaska, it’s top of mind now. On her last Arctic sailing in 2011, she says there was almost no ice.

"There was nothing on the radar for ice," she said. "We could have kept going [to the North Pole], if we had enough fuel. It just shouldn’t be."

Part Native American, Seaman attributes her environmental awareness to her childhood spent near the Shinnecock Reservation in New York State, and especially to the teachings of her grandfather.

He would take her for long walks in the woods to introduce her, individually, to trees.

"He really required that you stop at each tree and acknowledge it physically, place your hands on it and feel its life force, its physical structure, and understand it’s a relative to you."

She later brought her grandfather’s sensibilities to her polar art.

While working as an expedition photographer aboard science vessels and commercial ships, she approached each photograph of an iceberg as if it were a portrait of an ancestor: "I've never met two which were alike."

She compiled those photos and stories in a new book titled Melting Away: A Ten-Year Journey through Our Endangered Polar Regions.

"It's painful to see the devastation," said Seaman. "It's painful to know what is being lost, and what we may not get back." [more]

Polar ice loss 'painful to see' for photographer Camille Seaman

Timeline of events leading up to the civil uprising in Syria that began in March 2011, along with a graph depicting the net urban influx (in millions) of Syrian IDPs and Iraqi refugees since 2005. Graphic: Kelley, et al., 2015

By Ian Sample
2 March 2015

(The Guardian) – The prolonged and devastating drought that sparked the mass migration of rural workers into Syrian cities before the 2011 uprising was probably made worse by greenhouse gas emissions, US scientists say.

The study is one of the first to implicate global warming from human activities as one of the factors that played into the Syrian conflict, which is estimated to have claimed more than 190,000 lives.

The severity of the 2006 to 2010 drought, and more importantly the failure of Bashar al-Assad’s regime to prepare for it or respond to it effectively, exacerbated other tensions, from unemployment to corruption and inequality, which erupted in the wake of the Arab spring revolutions, the scientists say.

“We’re not arguing that the drought, or even human-induced climate change, caused the uprising,” said Colin Kelley at the University of California in Santa Barbara. “What we are saying is that the long term trend, of less rainfall and warmer temperatures in the region, was a contributing factor, because it made the drought so much more severe.”

From 2006, the Fertile Crescent, where farming was born 12,000 years ago, faced the worst three-year drought in the instrumental record. Unsustainable agricultural policies meant that the drought led to the broad collapse of farming in northeastern Syria. Their livelihoods gone, an estimated 1 million to 1.5 million people surged into the cities.

The arrival of so many rural families came on the heels of a million Iraqi refugees who arrived after 2006, causing what Kelley refers to as a “huge population shock” in Syria’s most affected urban centres. Many of the displaced settled on the edges of cities, where already tough living conditions were made more challenging by poor access to water and electricity.

Writing in the journal Proceedings of the National Academy of Sciences, Kelley describes how the unsustainable farming practices in Syria led to a massive depletion of groundwater, which was crucial for irrigating land beyond the reach of the rivers.

But the dwindling groundwater was accompanied by a long term decline in rainfall in the region that affected farms watered from rivers. According to records Kelley studied, the Fertile Crescent, including Syria, witnessed a 13% drop in its winter rainfall since 1931. Another trend saw summer temperatures rising, which dried out much of the remaining moisture in the soils. [more]

Global warming contributed to Syria's 2011 uprising, scientists claim

Observed summer (May−October) and winter (November−April) near-surface temperature for the Syria area mean, CRU 3.1 data, with 5-year Butterworth low-pass filter (black) and least squares fit (blue). Graphic: Kelley, et al., 2015
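The smoothing and trend estimation named in the caption (a Butterworth low-pass filter plus a least-squares fit) can be sketched with standard tools; the temperature series below is synthetic, standing in for the CRU 3.1 data:

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Synthetic annual-mean temperature: weak warming trend plus noise
years = np.arange(1931, 2009)
rng = np.random.default_rng(1)
temp = 0.01 * (years - years[0]) + rng.normal(0.0, 0.3, years.size)

# Low-pass Butterworth filter: annual sampling (fs = 1/yr), 5-year cutoff
b, a = butter(N=4, Wn=1 / 5, btype="low", fs=1.0)
smoothed = filtfilt(b, a, temp)  # zero-phase filtering, suitable for plotting

# Least-squares linear fit for the long-term trend
slope, intercept = np.polyfit(years, temp, deg=1)
print(f"fitted trend: {slope * 100:.2f} degrees per century")
```

The filter order (4) and the exact cutoff convention are assumptions for the sketch; the paper's figure only specifies "5-year Butterworth low-pass".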

ABSTRACT: Before the Syrian uprising that began in 2011, the greater Fertile Crescent experienced the most severe drought in the instrumental record. For Syria, a country marked by poor governance and unsustainable agricultural and environmental policies, the drought had a catalytic effect, contributing to political unrest. We show that the recent decrease in Syrian precipitation is a combination of natural variability and a long-term drying trend, and the unusual severity of the observed drought is here shown to be highly unlikely without this trend. Precipitation changes in Syria are linked to rising mean sea-level pressure in the Eastern Mediterranean, which also shows a long-term trend. There has also been a long-term warming trend in the Eastern Mediterranean, adding to the drawdown of soil moisture. No natural cause is apparent for these trends, whereas the observed drying and warming are consistent with model studies of the response to increases in greenhouse gases. Furthermore, model studies show an increasingly drier and hotter future mean climate for the Eastern Mediterranean. Analyses of observations and model simulations indicate that a drought of the severity and duration of the recent Syrian drought, which is implicated in the current conflict, has become more than twice as likely as a consequence of human interference in the climate system.

Climate change in the Fertile Crescent and implications of the recent Syrian drought

Increase in October-April accumulated rainfall in Australia, 1900-2013. Graphic: CSIRO / BOM

27 January 2015 (CSIRO) – Northern Australian wet season (October to April) rainfall has shown wet and dry decades through the 20th century, but with a slight increase indicated in the linear trend over 1900-2012. In recent decades, increases are discernible across northern and central Australia, with the rise in summer rainfall most apparent since the early 1970s (Figure 4.2.4; Braganza, et al., 2011). The increase has been large enough to raise total Australian rainfall (averaged over the entire continent) by about 50 mm when comparing the 1900-to-1960 period with 1970-to-2013.
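The ~50 mm figure is simply the difference between two period means of continent-average annual rainfall. A minimal sketch with a synthetic series (not the Bureau's data):

```python
import numpy as np

def period_mean_difference(years, rainfall, early=(1900, 1960), late=(1970, 2013)):
    """Mean annual rainfall in the late period minus the early period."""
    years, rainfall = np.asarray(years), np.asarray(rainfall)
    m_early = rainfall[(years >= early[0]) & (years <= early[1])].mean()
    m_late = rainfall[(years >= late[0]) & (years <= late[1])].mean()
    return m_late - m_early

# Synthetic continent-average rainfall (mm): a step increase after 1970
years = np.arange(1900, 2014)
rain = np.where(years < 1970, 450.0, 500.0)
print(period_mean_difference(years, rain))  # 50.0
```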

Climate Change in Australia: Projections for Australia’s NRM Regions

World oil supply type, New Policies scenario, 2000-2035. Graphic: IEA / Kjell Aleklett

By Kjell Aleklett
15 February 2015

(Aleklett's Energy Mix) – On Tuesday 10 February at 13:00 GMT the IEA released its “Oil Medium-Term Market Report 2015”. The day before the release I was contacted by Jens Ergon at Sveriges Television (“Sweden’s Television”, SVT), who wanted my opinion on the report. I had a number of hours to read through the 140 pages of the version provided to media prior to the report’s official release, which meant that I could comment on the report as soon as it was released. SVT has now reported some of those comments in an article that Jens Ergon has written, “The Price Crash Will Reshape The Oil Market”. The subtitle is, “American oil boom behind the falling price. But opinions vary widely on the future of oil.”

Let’s now go through the article together and I will make a few comments as we do so.

“The comprehensive fall in the price of oil has taken the world’s experts by surprise. Since the summer of 2014 the price of oil has more than halved, from over $100 per barrel to around $50 today. Last Tuesday the International Energy Agency, IEA, released its first report since the price fall. In the report, the development is described as the beginning of a new era. The IEA’s press release that accompanied the report stated, “The recent crash in oil prices will cause the oil market to rebalance in ways that challenge traditional thinking about the responsiveness of supply and demand.”

“What is surprising is not that the oil price has fallen but the severity of the drop, and that it has continued for half a year, until just a few weeks ago”, commented Daniel Spiro, an economist at Oslo University who focuses on price developments for oil and natural resources.”

Aleklett: If we look back in time, there were similar price falls at the beginning of the 1980s and in 2008. For economists, those price falls were as surprising then as this one is now. The interesting thing is that there has always been a good explanation for their occurrence, but economists are very poor at predicting when a fall will occur. I have chosen never to predict the price of oil but to say always and only that, “the price will be what the market is prepared to pay”.

From end of cheap oil to price crash

“The price crash comes after a number of years of historically high prices. With the exception of the temporary fall that occurred with the financial crisis of 2008-9, the oil price rose steeply during the entire first decade of this century and stayed at around $100 until the summer of 2014. This contrasts with the price of around $20 per barrel during the 1990s. Some researchers have regarded the high oil prices as a sign of Peak Oil, i.e., that the rate of global oil production is near the maximum that is geologically possible. Others have doubted the concept of Peak Oil and asserted that the higher prices will only encourage new, if more expensive, oil production. Data such as that published by the IEA clearly show that the easily produced, so-called “conventional oil” reached maximum production several years ago – at around 70 million barrels per day. But during recent years the introduction of more expensive, so-called “unconventional oil” – such as from the Canadian oil sands and, foremost, US shale oil – has compensated for the fall in conventional oil and has allowed total world oil production to increase somewhat, to just over 74 million barrels per day.”

Aleklett: During the 1990s the oil price even fell below $10 per barrel. It was in 1998 that Colin J. Campbell and Jean H. Laherrère published their famous article, “The End of Cheap Oil” in the journal Scientific American at the same time as The Economist wrote that the world was “Drowning in oil”. See my blog post, “How cheap is oil today?”. Colin and Jean wrote that cheap oil would reach a production maximum in around 2004 and today we know that the conventional oil, that was cheap in 1998, peaked in 2005. From 1998 until 2008 the price of oil rose from $10 per barrel to $147 per barrel. The era of cheap oil was over. Normally economists interpret such a price rise as a sign of scarcity and in this case we can call this shortage “Peak Oil”. Despite that, there are many who use any argument, no matter how contrived, to assert that Peak Oil lies far in the future.

Today, many people regard $50 per barrel oil as cheap and as a sign that we are, once again, “drowning in oil”. According to BP, production of crude oil and natural gas liquids totalled 82.6 Mb/d in 2006. If production had continued to grow at the same rate during the following ten years, the additional oil would have raised total production to 92.3 Mb/d in 2013. Instead, 2013 saw total production at 86.8 Mb/d. The increase of 4.2 Mb/d we saw from 2008 to 2013 was not cheap oil. It came from deepwater, from Canada’s oilsands, and as NGL and shale oil from fracking in the USA. We can see now that Colin and Jean’s 1998 predictions have proven completely correct.
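The arithmetic in the paragraph above can be checked directly; all figures are the BP numbers Aleklett quotes:

```python
# BP figures quoted in the text (crude oil + NGL, million barrels per day)
prod_2006 = 82.6     # actual 2006 production
trend_2013 = 92.3    # 2013 production had the earlier growth continued
actual_2013 = 86.8   # actual 2013 production

actual_increase = actual_2013 - prod_2006
shortfall = trend_2013 - actual_2013

print(f"actual increase: {actual_increase:.1f} Mb/d")        # 4.2 Mb/d, as in the text
print(f"shortfall vs. the old trend: {shortfall:.1f} Mb/d")  # 5.5 Mb/d
```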

As you can see in the figure [above], conventional crude oil reached maximum production in 2005-6 at 70 Mb/d and today is down at 67-68 Mb/d if one includes deepwater oil production as conventional. But deepwater production is expensive. If one also includes oil from oilsands and shale oil, then the total production rate reaches 74 Mb/d. Note that NGLs are not included in these numbers. [more]

The crash in the price of oil may change the oil market – a look at the IEA’s “Oil Medium-Term Market Report 2015”

