Climate scientists predict more blazing heat, drought, fires, and millions of dead trees in the U.S. West – ‘The climate is changing, and these fires are a very strong indicator of that’
Posted by Jim at Friday, June 24, 2016
By Darryl Fears
23 June 2016
(Washington Post) – The burning sensation in the southwestern United States was diagnosed by climate scientists more than a year ago.
As California broiled in high temperatures and drought last year, academic institutions across the country released study after study that suggested rising temperatures and less moisture were part of a new normal for the state. One study by NASA predicted in February that the Southwest can expect to endure a 30-year megadrought starting as early as 2050. In early March, a study from Stanford University said California could face a drought every other year based on a 30-year trend of higher-than-normal temperatures and dwindling rainfall.
In August, Columbia University’s Earth Institute found evidence that global warming has contributed to California’s drought. And in September, NASA and Columbia teamed up to produce a study showing that five centuries have passed since the Golden State has been as dry as it currently is. Each of the studies drew on research that goes back decades.
None of the studies could explain exactly why the West is baking today. Temperatures reached 120 degrees Fahrenheit in parts of California this week. Since October, 26 million trees have died in six counties across 760,000 acres in the Sierra Nevada mountains that run along California’s spine, the result of a combination of heat, dryness, and a greedy little beetle, according to the latest estimate by the U.S. Forest Service.
That brings the number of dead trees to 66 million over four years of drought, the service said.
In Arizona, where temperatures reached 118 degrees in Yuma on Tuesday, two tourists who went for a hike died on the trail in the scorching heat. A third member of their party, the only one strong enough to struggle back for help, alerted authorities.
California, Nevada, New Mexico and Arizona are experiencing large wildfires fairly early in the season. Two large fires burning near Santa Barbara are threatening to combine, and another is burning north of San Francisco, stretching thin the personnel and material needed to fight them.
When a monster wildfire struck Colorado in 2012, Harris Sherman, then undersecretary at the Agriculture Department, which oversees the Forest Service, acknowledged that big fires driven by climate were here to stay. The wildfire season that ran from June to September expanded to include May and October.
Since then, it has gotten even worse. The season starts in March and ends in December. Once, it was rare to see 5 million cumulative acres burn in a year; recent seasons have recorded twice that. Last year’s wildfire season set a record with more than 10 million acres burned — more land than Maryland, the District and Delaware combined, the Forest Service said.
More than half the fires were in Alaska, where dryness due to historically low mountain snowpack and a freak lightning storm created perfect conditions for a huge blaze. But there were also mammoth fires in Washington and Oregon, where drought had left forests dry and ready to burn.
“The climate is changing, and these fires are a very strong indicator of that,” Sherman said in 2012, predicting what was to come. [more]
Pacific Northwest orcas are starving – ‘There simply aren’t enough salmon out there for them to eat’
Posted by Jim at Friday, June 24, 2016
By David Neiwert
24 June 2016
(Crosscut) – Vancouver photographer Mark Malleson took this photograph of the Southern Resident killer whale known as J-34, or Doublestuf, breaching while he was in the interior waters of the Salish Sea this spring. It’s a remarkable and frightening photo for orca lovers, because the male orca’s ribs appear to be protruding prominently.
That’s abnormal, especially for a resident killer whale at this time of year, when the orcas are typically well fed after a winter of preying on Chinook salmon. And so Malleson’s photo set off a number of alarm bells in the Northwest whale-watching community as it circulated on social media.
Subsequent photos taken of J-34 and his pod from a scientific drone suggested that, while the whales weren’t particularly plump, their girth was within their normal range. Nonetheless, veteran whale scientist Ken Balcomb is blunt about what he is seeing for the Southern Residents long-term: “These whales are starving,” he says. “There simply aren’t enough salmon out there for them to eat.”
Balcomb and the crew at San Juan Island’s Center for Whale Research have been observing the Southern Residents foraging this winter and spring, and the behavior has been disconcerting: The whales are much more spread out, meaning they are having to forage harder for individual fish. Many of them appear underfed, he says. It’s an especially alarming development following last year’s “baby boom,” in which nine new calves were born into the population, one of whom has apparently already vanished and is presumed dead.
Normally, at this time of year, the Southern Residents are being relatively well fed, since they typically hang out along the Continental Shelf between northern California and British Columbia for the winter and spring months, dining on the large runs of returning Chinook. Many of them spend inordinate amounts of time at the mouth of the Columbia River in the winter.
There is an established and powerful correlation between salmon abundance and orca populations. The uptick in Chinook runs on the Columbia and lower Snake over the past few years has been linked to the recent orca baby boom.
The spike in salmon numbers is largely attributed to good ocean conditions for the past 12 years, and to some degree to a federal court ruling requiring the Bonneville Power Administration to spill water over Columbia and lower Snake River dams at key times of the year to aid migrating salmon smolt in their downstream journey. But it is the continuing presence of those same four dams — Ice Harbor, Lower Monumental, Little Goose, and Lower Granite, located on the Snake between the Tri-Cities (Pasco, Kennewick, and Richland) and Lewiston, Idaho — that may ultimately doom the Southern Resident orca population. [more]
By Jodi Helmer
6 June 2016
(NPR) – Between December and March, beekeepers send millions of hives to California to pollinate almond trees. Not all of the hives make it back home.
"The number of beehive thefts is increasing," explains Jay Freeman, a detective with the Butte County Sheriff's Office.
In California, 1,734 hives were stolen during peak almond pollination season in 2016. In Butte County alone, the number of stolen hives jumped from 200 in 2015 to 400 this year, according to Freeman.
Denise Qualls, a California bee broker who arranges contracts between beekeepers and almond growers, isn't surprised that beehive thefts are on the rise.
It takes more than 2 million beehives to pollinate California almonds. Currently, beekeepers are paid $200 per hive for pollination services (compared with $130 per hive in 2010). [more]
Ulba river poisoned by mining city Ridder in Kazakhstan, pollution flowing 1,100 km downstream toward Siberia – ‘It is a real ecological disaster’
Posted by Jim at Friday, June 24, 2016
31 May 2016 (Siberian Times) – Alert as a dump containing cyanide at a zinc plant leaks into the Ulba River, sending pollution flowing towards Omsk.
An acidic smell causes breathing difficulties as far as 700 metres from the river, according to worried locals on the bank of the Ulba.
Omsk residents, around 1,100 kilometres from the source of the pollution at Kazzinc plant in Ridder, once called Leninogorsk, are stocking up on bottled water, evidently panicked by the impending risk as the polluted water - including cyanide - flows into Russia.
Officials in Russia and Kazakhstan have played down the threat, especially in Omsk, claiming the river will have cleansed the pollution by the time the leak reaches the western Siberian city.
But biologist Dr Sergey Solovyov, from Omsk, warned: "It is a real ecological disaster. Zinc has a negative impact on the human reproductive function. It impacts the gastrointestinal system and the nervous system." [more]
By Alexander L. Forrest
13 June 2016
(The Conversation) – In an age of rapid global population growth, demand for safe, clean water is constantly increasing. In 2010 the United States alone used 355 billion gallons of water per day. Most of the available fresh water on Earth’s surface is found in lakes, streams and reservoirs, so these water bodies are critical resources.
As a limnologist, I study lakes and other inland waters. This work is challenging and interesting because every lake is an ecosystem that is biologically, chemically and physically unique. They also are extremely sensitive to changes in regional and global weather and long-term climate patterns.
For these reasons, lakes are often called “sentinels of change.” Like the figurative canary in the coal mine, lakes may experience change to their ecosystem dynamics before we start to see shifts in the greater watersheds around them.
In a study I recently co-authored with Goloka Behari Sahoo, S. Geoffrey Schladow, John Reuter, Robert Coats and Michael Dettinger, we projected that future climate change scenarios will significantly alter natural mixing processes in Lake Tahoe in the Sierra Nevada range that are critical to the health of the lake’s ecosystem. This could potentially create a condition that we termed “climatic eutrophication.”
While many groups have studied the long-term impact of climate change on lakes, this process can now be added to the growing list of drivers of eutrophication. This is a potentially damaging phenomenon that could affect a number of vital deep-water lakes around the world, degrading water quality and harming fish populations.
Eutrophication is a condition that occurs when lakes and reservoirs become overfertilized. Cultural eutrophication is a well-understood process in which lake and reservoir ecosystems become overloaded with chemical nutrients, mainly nitrogen and phosphorus. These nutrients come from human activities, including fertilizer runoff from farms and releases from sewage systems and water treatment plants. Natural weathering processes, atmospheric deposition of air pollutants, and erosion also transport nutrients that are already present in the watershed into the water supply. [more]
ABSTRACT: Using water column temperature records collected since 1968, we analyzed the impacts of climate change on thermal properties, stability intensity, length of stratification, and deep mixing dynamics of Lake Tahoe using a modified stability index (SI). This new SI is easier to produce and is a more informative measure of deep lake stability than commonly used stability indices. The annual average SI increased at 16.62 kg/m2/decade, although the summer (May–October) average SI increased at a higher rate (25.42 kg/m2/decade) during the period 1968–2014. This resulted in the lengthening of the stratification season by approximately 24 d. We simulated the lake thermal structure over a future 100 yr period using a lake hydrodynamic model driven by statistically downscaled outputs of the Geophysical Fluid Dynamics Laboratory Model (GFDL) for two different greenhouse gas emission scenarios (the A2, in which greenhouse-gas emissions increase rapidly throughout the 21st century, and the B1, in which emissions slow and then level off by the late 21st century). The results suggest a continuation and intensification of the already observed trends. During 2014–2098, the length of the stratification season and the annual average lake stability are projected to increase by 38 d and 30.25 kg/m2/decade under GFDL A2, and by 12 d and 8.66 kg/m2/decade under GFDL B1. The consequences of this change bear the hallmarks of climate change induced lake warming and possible exacerbation of existing water quality, quantity, and ecosystem changes. The developed methodology could be extended and applied to other lakes as a tool to predict changes in stratification and mixing dynamics.
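As a rough consistency check, the abstract's per-decade trends can be converted into cumulative changes over each period. This is a back-of-the-envelope sketch assuming the trends are linear; the helper function is our own illustration, not from the study.

```python
# Convert a constant per-decade trend into a cumulative change
# over a period. Rate values below are taken from the abstract.

def cumulative_change(rate_per_decade, start_year, end_year):
    """Total change implied by a constant per-decade trend."""
    return rate_per_decade * (end_year - start_year) / 10.0

# Observed record, 1968-2014: annual-average SI trend of 16.62 kg/m2/decade
observed_si = cumulative_change(16.62, 1968, 2014)      # ~76.5 kg/m2 overall

# Projection, 2014-2098 under GFDL A2: 30.25 kg/m2/decade
projected_si_a2 = cumulative_change(30.25, 2014, 2098)  # ~254 kg/m2 overall

print(round(observed_si, 1), round(projected_si_a2, 1))
```

The same arithmetic applied to the ~24 d lengthening of the stratification season over 1968–2014 works out to roughly 5 d per decade.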
By Matt Wood
17 June 2016
(University of Chicago) – California mussel shells collected off the coast of Washington state in the 1970s are, on average, 32 percent thicker than modern specimens, according to a new study published by UChicago biologists.
Shells collected by Native Americans 1,000 to 1,300 years ago were also 27 percent thicker than modern shells, on average. The decreasing thickness over time, in particular the last few decades, is likely due to ocean acidification as a result of increased carbon in the atmosphere.
“Archival material provided by past researchers, the Makah Tribal Nation, and the Olympic National Park allowed us to document this intriguing and concerning pattern in shell thickness,” said Cathy Pfister, professor of ecology and evolution and lead author. The study was published June 15 in the Proceedings of the Royal Society B.
As humans burn fossil fuels, the oceans absorb a large portion of the additional carbon released into the atmosphere. This in turn causes pH levels of ocean water to drop, making it more acidic. Mussels, oysters and certain species of algae have difficulty producing their calcium carbonate shells and skeletons in such an environment, and can provide an early indicator of how increasing ocean acidification affects marine life.
In previous studies, Pfister and her colleagues documented declining pH levels in the waters surrounding Tatoosh Island off the coast of Washington. In 2011, they further analyzed carbon and oxygen isotopes taken from modern mussel shells, shells collected by the local Makah tribe between 668 and 1008 A.D., and shells collected by biologists in the 1970s.
For the new study, the researchers compared the thicknesses of the same sets of shells. On average, the shells provided by the Makah Cultural and Research Center were 27.6 percent thicker than modern counterparts. Shells from the 1970s were 32.2 percent thicker. Shells collected from a different Native American site in Sand Point, Wash., dated at between 2150 and 2420 years old, were almost 94 percent thicker than modern shells.
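Note that "percent thicker" and "percent thinner" are not symmetric: if 1970s shells are 32.2 percent thicker than modern ones, the modern shells are about 24 percent thinner than the 1970s shells. A quick sketch of the conversion (our own illustration, not from the paper):

```python
# If old shells are X% thicker than modern shells, the modern shells
# are thinner than the old ones by X / (100 + X) * 100 percent.

def pct_thinner(pct_thicker):
    return pct_thicker / (100.0 + pct_thicker) * 100.0

print(round(pct_thinner(32.2), 1))  # 1970s vs. modern -> ~24.4% thinner
print(round(pct_thinner(27.6), 1))  # Makah-era vs. modern -> ~21.6% thinner
```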
The long-term decline in thickness likely shows a response to ocean acidification, though the researchers also consider other environmental drivers including changes in food supply (e.g. plankton) for mussels.
The researchers also point out that their findings raise concerns about the California mussel’s ability to retain its role as a foundational species in these waters. Decreased shell thickness makes them increasingly vulnerable to predators and environmental disturbances. This in turn could affect interactions with hundreds of other species of organisms that live near mussel beds in tidal waters.
“The California mussel is a common species along the entire west coast of the United States, and their fate will be linked to that of a rich diversity of predators, including sea stars and sea otters, as well as myriad species that are part of the mussel bed habitat,” Pfister said. “It is imperative that we understand more about how these species will change as ocean conditions change.”
The study, “Historical baselines and the future of shell calcification for a foundation species in a changing ocean,” was supported by the SeaDoc Foundation, the National Science Foundation and the United States Department of Defense. Additional authors include Timothy Wootton from the University of Chicago; Kaustuv Roy from the University of California, San Diego; Sophie McCoy, who conducted the work as a graduate student at UChicago, now at Florida State University; Robert Paine from the University of Washington; Thomas Suchanek from the U.S. Geological Survey and the University of California, Davis; and Eric Sanford from the University of California, Davis.
ABSTRACT: Seawater pH and the availability of carbonate ions are decreasing due to anthropogenic carbon dioxide emissions, posing challenges for calcifying marine species. Marine mussels are of particular concern given their role as foundation species worldwide. Here, we document shell growth and calcification patterns in Mytilus californianus, the California mussel, over millennial and decadal scales. By comparing shell thickness across the largest modern shells, the largest mussels collected in the 1960s–1970s and shells from two Native American midden sites (∼1000–2420 years BP), we found that modern shells are thinner overall, thinner per age category and thinner per unit length. Thus, the largest individuals of this species are calcifying less now than in the past. Comparisons of shell thickness in smaller individuals over the past 10–40 years, however, do not show significant shell thinning. Given our sampling strategy, these results are unlikely to simply reflect within-site variability or preservation effects. Review of environmental and biotic drivers known to affect shell calcification suggests declining ocean pH as a likely explanation for the observed shell thinning. Further future decreases in shell thickness could have significant negative impacts on M. californianus survival and, in turn, negatively impact the species-rich complex that occupies mussel beds.
Video: Forest Service survey finds record 66 million dead trees in southern Sierra Nevada – ‘Tree die-offs of this magnitude are unprecedented and increase the risk of catastrophic wildfires’
Posted by Jim at Thursday, June 23, 2016
VALLEJO, California, 22 June 2016 (USFS) – The U.S. Forest Service today announced that it has identified an additional 26 million trees dead in California since October 2015. These trees are located in six counties across 760,000 acres in the southern Sierra Nevada region of the state, and are in addition to the 40 million trees that died statewide from 2010 to October 2015, bringing the total to at least 66 million dead trees. Four consecutive years of severe drought in California, a dramatic rise in bark beetle infestation and warmer temperatures are leading to historic levels of tree die-off.
"Tree die-offs of this magnitude are unprecedented and increase the risk of catastrophic wildfires that put property and lives at risk," said Agriculture Secretary Tom Vilsack. "While the fire risk is currently the most extreme in California because of the tree mortality, forests across the country are at risk of wildfire and urgently need restoration requiring a massive effort to remove this tinder and improve their health. Unfortunately, unless Congress acts now to address how we pay for firefighting, the Forest Service will not have the resources necessary to address the forest die-off and restore our forests. Forcing the Forest Service to pay for massive wildfire disasters out of its pre-existing fixed budget instead of from an emergency fund like all other natural disasters means there is not enough money left to do the very work that would help restore these high mortality areas. We must fund wildfire suppression like other natural disasters in the country."
Between 2010 and late 2015, Forest Service aerial detection surveys found that 40 million trees died across California - with nearly three quarters of that total succumbing to drought and insect mortality from September 2014 to October 2015 alone. The survey identified approximately 26 million additional dead trees since the last inventory in October 2015. The areas surveyed in May covered six southern Sierra counties: Fresno, Kern, Madera, Mariposa, Tuolumne, and Tulare. Photos and video of the May survey are available on the Forest Service multimedia webpage.
Last fall, Governor Brown declared a state of emergency on the unprecedented tree die-off in California and formed a Tree Mortality task force to help mobilize additional resources for the safe removal of dead and dying trees. The Forest Service is committing significant resources to restore impacted forests including reprioritizing $32 million in California to conduct safety-focused restoration along roads, trails and recreation sites. To date, the Forest Service has felled over 77,000 hazard trees, treated over 13,000 acres along 228 miles of roads around communities and recreation sites, and created 1,100 acres of fuel breaks. Work on another 15,000 acres is in progress.
Forest Service scientists expect to see continued elevated levels of tree mortality during 2016 in dense forest stands, stands impacted by root diseases or other stress agents and in areas with higher levels of bark beetle activity. Additional surveys across the state will be conducted throughout the summer and fall.
With the increasing size and costs of suppressing wildfires due to climate change and other factors, the very efforts that would protect watersheds and restore forests to make them more resilient to fire in the future are being squeezed out of the budget. Last year fire management alone consumed 56 percent of the Forest Service's budget.
Learn more about tree mortality and the work to restore our forests in California at the Forest Service's web page Our Changing Forests.
The mission of the U.S. Forest Service, part of the U.S. Department of Agriculture, is to sustain the health, diversity and productivity of the nation's forests and grasslands to meet the needs of present and future generations. The agency manages 193 million acres of public land, provides assistance to state and private landowners, and maintains the largest forestry research organization in the world. Public lands managed by the Forest Service contribute more than $13 billion to the economy each year through visitor spending alone and provide 20 percent of the nation's clean water supply.
For an interactive look at USDA's work in conservation and forestry over the course of this Administration, visit USDA Results: Caring for our Air, Land and Water.
By Matthew Chin
21 June 2016
(UCLA) – Even with this winter’s strong El Niño, the Sierra Nevada snowpack will likely take until 2019 to return to pre-drought levels, according to a new analysis led by UCLA hydrology researchers.
Additionally, they suggest their new method, which provides unprecedented detail and precision, could be useful for characterizing water in the snowpack of other mountain ranges, including those in western North America, the Andes, and the Himalayas. These areas currently have much less on-site monitoring than the Sierra Nevada.
The study was published online today in the American Geophysical Union journal Geophysical Research Letters.
“With the consecutive years of ongoing drought, the Sierra Nevada snowpack’s total water volume is in deficit and our analysis shows it will take a few years for a complete recovery, even if there are above-average precipitation years,” said the study’s principal investigator, Steve Margulis, professor of civil and environmental engineering at the UCLA Henry Samueli School of Engineering and Applied Science.
Much of California’s water comes from the melting of the Sierra Nevada snowpack. The winter of 2015 capped four consecutive years of drought that produced the largest cumulative snowpack deficit in the 65 years examined. The water volume of the snowpack in 2015 was just 2.9 cubic kilometers; a typical year is about 18.6 cubic kilometers.
“It is critical for regions like California, that rely on their regional snowpack for water supply, to understand the dynamics of the system,” Margulis said. “Our new tool could help not just California, but other regions, gain insight about their regional snowpack.”
The researchers created a dataset covering 31 years (from 1985 to 2015), using measurements from NASA Landsat satellites, which provide daily maps of the full Sierra Nevada snowpack at about 10 times sharper resolution than previously available. While there are on-site sensors throughout the mountain range, they are typically in the middle elevations and do not provide a full, high-resolution picture of the entire range, particularly at higher elevations, Margulis said. The researchers combined their new dataset with other snow survey data, collected by the state’s Department of Water Resources, to extend the time series of range-wide snowpack volumes back 65 years to 1951.
Using the data, the researchers applied probabilistic modeling methods to make predictions of snowpack water availability. Accounting for the four-year snowpack deficit from the 2012-2015 drought, the researchers say it will likely take until 2019 to get back to pre-drought conditions.
“Our larger goal is to build a very detailed, continuous picture of the historical snowpack, diagnose the primary factors that cause it to vary, and then ultimately improve models for predicting how much water will be available from it,” Margulis said. “This unprecedented information can help policy makers make more informed decisions with regard to this critical resource, especially as climate change affects it.”
Other authors include graduate students Gonzalo Cortés and Laurie Huning, both members of Margulis’ research group at UCLA; Manuela Girotto, a research scientist with NASA’s Goddard Space Flight Center and with the Universities Space Research Association in Columbia, Maryland; and Dongyue Li, graduate student, and Michael Durand, associate professor of earth sciences, both of The Ohio State University.
The research was supported by NASA and the National Science Foundation.
ABSTRACT: Analysis of the Sierra Nevada (USA) snowpack using a new spatially distributed snow reanalysis data set, in combination with longer term in situ data, indicates that water year 2015 was a truly extreme (dry) year. The range-wide peak snow volume was characterized by a return period of over 600 years (95% confidence interval between 100 and 4400 years) having a strong elevational gradient with a return period at lower elevations over an order of magnitude larger than those at higher elevations. The 2015 conditions, occurring on top of three previous drought years, led to an accumulated (multiyear) snowpack deficit of ~ −22 km3, the highest over the 65 years analyzed. Early estimates based on 1 April snow course data indicate that the snowpack drought deficit will not be overcome in 2016, despite historically strong El Niño conditions. Results based on a probabilistic Monte Carlo simulation show that recovery from the snowpack drought will likely take about 4 years.
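The flavor of the abstract's Monte Carlo recovery estimate can be illustrated with a toy simulation. This is our own sketch, not the authors' model: it draws annual peak-snowpack volumes around the roughly 18.6 cubic kilometer typical-year volume quoted above and counts the years until the cumulative surplus erases the ~22 km3 deficit. The lognormal spread (sigma) is an assumed parameter chosen for illustration only.

```python
import math
import random

def years_to_recover(rng, deficit_km3=22.0, mean_km3=18.6,
                     sigma=0.45, max_years=50):
    """One simulated trajectory: years until the snowpack deficit is erased."""
    # Pick mu so the lognormal's mean equals mean_km3.
    mu = math.log(mean_km3) - sigma ** 2 / 2.0
    surplus = 0.0
    for year in range(1, max_years + 1):
        # Each year adds (that year's snowpack minus a typical year).
        surplus += rng.lognormvariate(mu, sigma) - mean_km3
        if surplus >= deficit_km3:
            return year
    return max_years  # not recovered within the horizon

def median_recovery(trials=2000, seed=1, **kwargs):
    """Median recovery time across many simulated trajectories."""
    rng = random.Random(seed)
    results = sorted(years_to_recover(rng, **kwargs) for _ in range(trials))
    return results[trials // 2]

print(median_recovery())
```

The actual study uses a spatially distributed snow reanalysis and downscaled climate forcing rather than a simple random draw; this sketch only shows how a probabilistic recovery-time estimate can emerge from year-to-year precipitation variability.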
By Kate Ravilious
20 June 2016
(environmentalresearchweb) – Mapping high-latitude Arctic regions is a thankless task right now. Hillsides are vanishing overnight, new lakes and ponds are coming and going every week, and streams and rivers are changing course frequently. This restless landscape is due to permafrost thaw. Now a study reveals that in some regions the amount of land on the move has increased more than fourfold over the last 50 years.
The Arctic is warming roughly twice as fast as the rest of the world. Average temperatures in the Arctic have already risen by more than 3°C since 1900, and sea ice is melting so fast that most scientists believe we'll see an ice-free summer within the next 20 years. Permafrost, some of which has persisted for thousands of years, is rapidly turning to slush. Scars show where land has slumped, but until now few had measured how quickly this change is occurring.
Trevor Lantz and Rebecca Segal from the University of Victoria in Canada, and Steve Kokelj from the NWT Geological Survey, used aerial photos and satellite imagery to measure the impact of climate change on thaw slumping in the landscape in four ice-rich regions of northwestern Canada. Scouring the images for signs of slump activity, they were able to assess how much change there had been in the last 50 years or so.
The team found that the area impacted by slumps had increased between 2 and 407%, the average slump sizes had increased between 0.31 and 1.82ha, and slump growth rates had increased by 169 to 465 sq. m per year. Increased temperatures and precipitation have both contributed to these changes. [more]
ABSTRACT: Climate change is increasing the frequency and intensity of thermokarst, but the influences of regional climate and physiography remain poorly understood. Retrogressive thaw slumping is one of the most dynamic forms of thermokarst and affects many areas of glaciated terrain across northwestern Canada. In this study, we used air photos and satellite imagery to investigate the influence of climate and landscape factors on thaw slump dynamics. We assessed slump size, density, and growth rates in four regions of ice-rich terrain with contrasting climate and physiographic conditions: the Jesse Moraine, the Tuktoyaktuk Coastlands, the Bluenose Moraine, and the Peel Plateau. Observed increases in: (1) the area impacted by slumps (+2 to +407%), (2) average slump sizes (+0.31 to +1.82 ha), and (3) slump growth rates (+169 to +465 m2 yr−1) showed that thermokarst activity is rapidly accelerating in ice-rich morainal landscapes in the western Canadian Arctic, where slumping has become a dominant driver of geomorphic change. Differences in slump characteristics among regions indicate that slump development is strongly influenced by topography, ground ice conditions, and Quaternary history. Observed increases in slump activity occurred in conjunction with increases in air temperature and precipitation, but variation in slump activity among the four regions suggests that increased precipitation has been an important driver of change. Our observation that the most rapid intensification of slump activity occurred in the coldest environment (the Jesse Moraine on Banks Island) indicates that ice-cored landscapes in cold permafrost environments are highly vulnerable to climate change.