'Industrial-era warming has led to current summer temperatures that are unprecedented,' say researchers
Alexandra Byers · CBC News · Posted: Apr 15, 2019 5:00 AM CT | Last Updated: April 15
'I think the Dempster [Highway] is one of the most breathtaking places I've worked,' said paleoclimatologist Trevor Porter.
It was a warm day in late June 2013, and Trevor Porter was up the Dempster Highway, just past Tombstone Territorial Park.
"It was a little cloudy, sometimes the sun peeking through," he recalled. "I think the Dempster is one of the most breathtaking places I've worked. Big mountains all around you — big valleys."
Porter is a paleoclimatologist and an assistant professor at the University of Toronto Mississauga. On that day in 2013, he was a postdoctoral fellow at the University of Alberta and was on the Dempster specifically for the permafrost.
At two spots just off the highway, Porter and a small team drilled for core samples. Using a diamond-tipped drill, they spun and melted their way into the permafrost.
The team could only drill about a metre at a time, pulling out pieces seven centimetres in diameter and quickly putting them in a cooler. In all, they drilled five metres deep.
In those five metres was 13,600 years of Yukon history.
In June 2013, a team of researchers and students from the University of Alberta drilled a five-metre-long sample of permafrost from a peat plateau on the Dempster Highway in central Yukon.
The research on those cores was published last week in the journal Nature Communications.
Porter and his team were able to show that Yukon summer temperatures in recent decades are the warmest they've ever been in the 13,600-year period.
The report is just one of a series of recent studies indicating that Canada's North is warming faster than the rest of the world. It confirms previous, separate research, but was also a groundbreaking form of permafrost analysis.
Research is first of its kind
The technique centres on studying water isotopes frozen in layers of ice. Prior to this study, such analysis was typically done with ice cores from glaciers.
Five metres of permafrost contained 13,600 years worth of Yukon precipitation history. (Trevor Porter)
Different types of hydrogen and oxygen atoms preserved in the ice indicate either warm or cold temperatures.
By analyzing the different atoms, scientists can calculate the temperature at the time the water fell onto the surface.
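The isotope-to-temperature step can be pictured as a simple calibration. The sketch below is illustrative only: a linear transfer function is a common approach in paleothermometry, but the slope and intercept here are invented placeholders, not the study's actual calibration.

```python
# Illustrative sketch of an isotope-temperature transfer function.
# Water isotope ratios are expressed in delta notation (per mil);
# a less negative value generally indicates warmer conditions.
# The slope and intercept are hypothetical placeholder values.

def delta_to_temperature(delta_18o_permil, slope=0.5, intercept=10.0):
    """Estimate temperature (deg C) from a d18O value (per mil)
    using a linear calibration T = slope * d18O + intercept."""
    return slope * delta_18o_permil + intercept

print(delta_to_temperature(-20.0))  # cooler sample
print(delta_to_temperature(-15.0))  # warmer sample
```

Under this (hypothetical) calibration, a 5 per-mil shift in the isotope ratio translates into a 2.5-degree shift in estimated temperature, which is how layers of ancient precipitation become a temperature record.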
"[The study] kind of shows the world, shows the research community that, 'Wow, look, you can do this in places that don't have glaciers,'" said Porter.
"And that's really exciting for me."
The successful new technique and the study's striking findings for Yukon are the product of six years of analysis and data processing in academic labs, but their roots stem from that lush and muddy research trip in the summer of 2013.
'It still feels special to hold in my hands'
Porter and his team spent three weeks based out of Dawson City that year, working on a variety of different projects.
They lived in cabins next to the Yukon River, spent the day working hard in nature, and fell asleep exhausted but invigorated after dinners spent around a picnic table next to their open cook house.
Researchers and students from the University of Alberta at their base camp in Dawson City, Yukon in June 2013. (Trevor Porter)
"It's my favourite time of year, really. You know, we have a lot of very serious jobs around the university, teaching and grading and [going to] meetings," said Porter.
"But going into the field is just nothing else really matters."
He said they all felt the significance of the "incredible archive that's coming out of the ground."
Porter admitted he's used to working on samples that are "a lot older" than 13,600 years.
"But it still feels special to hold in my hands and say, 'Look. This is a really important time period that we're going to be able to understand with this sample.'"
Porter said his team recognized the significance of the 'archive' they were pulling out of the ground. (Trevor Porter)
After spending time in a chest freezer in Dawson, the cores were shipped to the University of Alberta.
They were split lengthwise. One half was preserved as an archive; the other was sliced into centimetre-thick half-moon pieces, which were allowed to thaw. The water was filtered, processed and analyzed.
The team needed to gather more context and information to be able to interpret the data, but Porter said they saw a very strong trend from the very beginning.
After calculating temperatures in central Yukon for the last 13,600 years, Porter and his co-authors said in a news release that it is clear "industrial-era warming has led to current summer temperatures that are unprecedented" and exceed "all previous maximum temperatures by nearly 2 degrees Celsius."
Porter and his team were able to show that high temperatures in Yukon are the warmest they've ever been in nearly 14,000 years.
Porter's technique works, but the warming trend it reveals also puts the very permafrost it relies on at risk.
"Some of these sites will reach these critical temperatures where they just start thawing, and you do start losing that record," he said.
"I guess that's really more justification for us to get out there and save these records before they're all gone."
By Benno Kass | Fox News
NYC set to become first US city to charge congestion tax on drivers
Starting in 2021, drivers will be charged approximately $10 for driving into parts of Manhattan as part of the state's new budget deal; Laura Ingle has the details.
Several major cities are now considering a so-called “congestion” tax, on the heels of New York approving the controversial first-in-the-nation fee on downtown drivers in a bid to ease gridlock.
New York state lawmakers earlier this month approved a congestion surcharge for drivers at all Manhattan points of entry below 60th Street, the culmination of a decade-long fight that began in 2007 when former New York City Mayor Michael Bloomberg began pushing the plan. Now supporters in Los Angeles, San Francisco, Philadelphia, Boston, Seattle and Portland are considering following New York’s lead, in an effort to cut down on traffic and pollution and raise money for public transportation.
NEW YORK OFFICIALS FACE BACKLASH OVER 'CONGESTION' TAX PUSH
But critics say the charges, in New York and beyond, would only punish commuters who have no choice but to drive into cities, as well as businesses that rely on downtown deliveries.
“Congestion itself is enough of a deterrent for folks driving into the city,” Republican Pennsylvania state Rep. Todd Stephens told Fox News.
Stephens represents suburban Montgomery County outside Philadelphia -- which is now looking at a congestion tax for the first time, according to the office of Democratic Mayor Jim Kenney. Spokeswoman Kelly Cofrancisco told The New York Times the city will examine New York’s experience “to see how this can help improve equity, safety, sustainability and mobility.”
Some local lawmakers are welcoming the idea. “I applaud my … colleagues in New York for their work to make the city more navigable and more accessible. As cities take on gridlock and congestion with meaningful policy solutions like congestion pricing, I’m certainly paying attention,” Democratic Councilmember Helen Gym said in a statement to Fox News.
Stephens, whose district includes many Philadelphia commuters, countered that a congestion tax would be “unfair” to workers, adding “some people have to drive.”
Also paying attention are advocates of the tax in Los Angeles, San Francisco and Portland – all cities that are conducting or planning impact studies on such a proposal.
A report recently issued by the Southern California Association of Governments recommended a $4 fee for entering West Los Angeles and Santa Monica during the weekday rush hour. It concludes that this change would reduce congestion by roughly 20 percent.
"Congestion pricing is a creative solution to gridlock that appropriately prices road use and also generates significant funding for transit while reducing pollution. The alternative is simply doing nothing and allowing congestion to persist unchallenged. I think it is positive that cities around the world, including some in California, are willing to try [innovative] solutions over inaction,” said Democratic California Assemblyman Richard Bloom, who represents parts of Los Angeles.
In Massachusetts, meanwhile, The Boston Herald reports that state lawmakers have introduced a new bill aimed at imposing a congestion tax in East Boston. The push may be difficult as Massachusetts GOP Gov. Charlie Baker vetoed such a bill last year and Boston Democratic Mayor Marty Walsh faced intense backlash from commuters at the time.
The analytics company Inrix, using over 350 million traffic data points, rates Boston as the most congested city in the country, with New York City, Los Angeles, Seattle, San Francisco, Philadelphia and Portland also ranking in the top 10.
Seattle has also been studying the issue since before New York approved its plan. Democratic Mayor Jenny Durkan pitched the idea as part of her budget push last year to make the city compliant with the Paris climate accord after President Trump withdrew the U.S. from the treaty.
Democratic Councilman Mike O’Brien, who spearheaded the push in his city, told Fox News in a statement, “As congestion and pollution increase in our growing city, we need tools such as congestion pricing to ensure our city continues to function well.” O’Brien stresses Seattle is assessing the possible impact on low-income households he said “should not disproportionately bear the impacts of congestion pricing. This is why Seattle started a study over a year ago to better understand who currently drives downtown, when they drive downtown, and why.”
As these cities consider the idea, they’re likely to encounter the same kind of resistance New York experienced. Amid that debate, the Independent Drivers Guild (representing over 70,000 app-based drivers) blasted the plan as a “sham” tax that “unjustly singles out low income for-hire drivers and their already highly-taxed riders.” Uber, meanwhile, lobbied for the idea.
AAA spokesman Robert Sinclair told Fox News that “as far as we are concerned it is all about the money, it is terribly unfair.”
Congestion pricing is an idea first popularized overseas. London introduced its congestion charge back in 2003; the fee to drive into Central London now stands at £11.50 ($15), and the city piled on another charge several years ago for older vehicles.
While the price has not been set in stone for New York, a report commissioned by Gov. Andrew Cuomo’s office recommends cars entering Manhattan during peak hours be charged $11.52, and trucks be charged $25.34 – on top of any bridge tolls.
Germany has inaugurated its biggest wind farm in the Baltic Sea. The official start of Arkona project operations comes as offshore wind utilization in the country has seen roughly 10 years of ups and downs.
The Arkona wind farm in the Baltic between Germany's Isle of Rügen and the Danish island of Bornholm has been generating electricity for some months, but was officially inaugurated on Tuesday in the presence of German Chancellor Angela Merkel.
She said renewables had made it onto the center stage of energy supply policies. By 2030, 65 percent of all electricity generated in Germany is to come from renewable sources.
The new offshore farm has a capacity of 385 megawatts — enough to provide some 400,000 households with electricity. Building the Arkona wind farm involved a workforce of roughly 5,000 and total investments to the tune of €1.2 billion ($1.36 billion), shared between German utility E.ON and Norwegian energy company Equinor.
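The "400,000 households" figure implies assumptions about how much electricity the farm actually delivers over a year. A rough, hypothetical reconstruction follows; the capacity factor and per-household consumption below are illustrative assumptions, not E.ON's published figures.

```python
# Rough arithmetic behind the "400,000 households" claim.
# Assumptions (illustrative, not from E.ON or Equinor):
capacity_mw = 385                 # installed capacity, from the article
capacity_factor = 0.45            # assumed typical for Baltic offshore wind
hours_per_year = 8760
household_kwh_per_year = 3500     # assumed average household consumption

annual_gwh = capacity_mw * capacity_factor * hours_per_year / 1000
households = annual_gwh * 1e6 / household_kwh_per_year
print(int(households))  # on the order of 400,000 under these assumptions
```

The point of the sketch is that the headline number depends heavily on the assumed capacity factor: offshore turbines produce far below nameplate capacity on average, but well above typical onshore rates.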
Reshaping the energy market
Wind energy in general has become a pillar of Germany's energiewende, the nation's large-scale and closely watched energy transition policy. It involves the gradual increase of renewables in the overall energy mix, while phasing out nuclear energy by 2022 and coal by 2038.
In 2018, renewable sources accounted for 35.2% of the country's energy mix. Within renewables, onshore wind claimed 14.5%, while offshore wind accounted for only 3%. But this is likely to change in the years ahead.
Promoting renewables has been agreed government policy in Germany for many years now
The utilization of offshore wind in Europe's biggest economy dates back to August 2009, when the Alpha Ventus farm in the North Sea recorded its first feed-in, becoming fully operational in 2010. E.ON, EWE and Vattenfall pumped some €250 million into the farm, with Berlin adding another €30 million in state aid.
Similar projects had experienced their share of problems as initial investments were huge and challenges such as hooking up offshore turbines to the land-based electricity grid resulted in temporary setbacks.
According to the Clean Energy Wire (CLEW) network, the UK and Germany have been at the forefront of bringing down costs for offshore wind operations by steadily adding capacity. The two nations together account for well over 60% of all offshore capacity installed globally, a CLEW dossier points out.
It also notes that the recent offshore wind push was largely helped by Germany's first offshore wind power auction in 2017 where bidders offered to build new facilities without receiving any financial support. Three years earlier, potential investors were still wary about costs and grid connection delays that left them wondering about the future of offshore power generation in general.
Although conditions for generating power at sea remain difficult, offshore wind farms have now become a viable business, which — as CLEW puts it — "can fully compete with other forms of power generation."
Onshore wind power facing resistance
What's also contributed to the latest push is that offshore wind projects have faced nowhere near the same level of public protest that has often delayed or even thwarted onshore expansion.
The German Wind Energy Association (BWE) and the VDMA Power Systems association told reporters in January of this year that businesses in the onshore sector were expecting severe headwinds in 2019 and beyond, saying that they would only be able to add about 2,000 megawatts in onshore wind capacity throughout the current year.
They said this was because of the problems arising from an increasing number of complaints by individuals and communities concerned about nature conservation and animal protection.
This is not to say that offshore projects are completely unproblematic. While environmental pressure groups welcome the use of the clean energy source involved, such offshore farms are more often than not built in fragile marine habitats "and are largely hidden from public scrutiny," says CLEW.
Nonetheless, the advantages should not be underestimated. An average offshore turbine already produces up to twice as much power as its onshore counterpart, and new and even more powerful models are in the pipeline.
The future speed of offshore wind expansion will hinge on how fast Germany can connect its northern parts (where wind power generation is strongest) with the industrial hubs in the south (where huge amounts of energy are needed).
"Only then can offshore wind unleash its full potential as a grid-stabilizing energy source that provides baseload power."
Study examines the incentives and goals driving corporate disclosure
April 18, 2019
Firms that value and practice environmental transparency in their reporting to stakeholders are in general better economic performers than those whose practices are more opaque.
Is honesty the best policy when it comes to being green?
It just might be, according to a new paper by Michel Magnan, a professor of accountancy at the John Molson School of Business.
In their article for Sustainability Accounting, Management and Policy Journal, Magnan and co-author Hani Tadros of Elon University in North Carolina looked at 78 US firms in environmentally sensitive industries from 1997 to 2010. They wanted to deepen their understanding of the driving forces behind the firms' disclosure of their environmental practices and management.
"There is tension out there," says Magnan, the Stephen A. Jarislowsky Chair in Corporate Governance. "Many people are skeptical and will adopt a cynical perspective regarding what corporations choose to disclose."
With public trust in business in broad decline, it may be natural to assume that most firms are padding their numbers or obscuring their environmental behaviour. But Magnan says he has found that is not the case.
Many are keenly aware of growing environmental concerns among members of the public, including investors and consumers of their products. In response, some are quite literally cleaning up their act.
What is said vs. what is done
The researchers separated the firms they studied into two groups based on the data they collected, including public information and the firms' annual disclosure reports and regulatory filings.
The companies whose environmental performance scored positively when compared to existing government regulations (meaning they respected guidelines on pollution, emissions and so on) were designated "high performers." Those that did poorly were designated "low performers."
"High- and low-performing firms will adopt different patterns when it comes to disclosure," explains Magnan. "High performers will provide more information because they are doing well and want to convey that message to their various stakeholders. Poor performers, meanwhile, will try to manage impressions in some way."
The researchers paid close attention to the usefulness of the information firms disclosed. They preferred data that was objectively verifiable and quantitative -- they called that "hard" information. "Soft" information generally consisted of vague statements, unattached to specifics.
They found that high-performing corporations were more likely to disclose hard information because they could afford to be forthcoming. They were using their disclosure as a way of building trust and earning public goodwill, which pays dividends down the line.
"If more disclosure raises your market value, it makes sense," Magnan says.
Look for good, clean facts
With stakeholders paying more attention to environmental issues, Magnan says there is added pressure on firms to come clean on their environmental performance. He sees corporate culture heading in that direction already.
"Some firms will be more forthcoming because that is their governance model, and they feel that it is better to be forthcoming early on," he says. "The costs will be less, and it shows they are operating in good faith."
Companies that engage in practices designed to obfuscate, deny or lie about poor environmental performances are likely to suffer serious consequences, he adds.
"In the short run, that kind of behaviour may help you, but in the long run it may come back to hurt you. Everything becomes public at some point."
Materials provided by Concordia University.
Hani Tadros, Michel Magnan. How does environmental performance map into environmental disclosure? Sustainability Accounting, Management and Policy Journal, 2019; 10 (1): 62 DOI: 10.1108/SAMPJ-05-2018-0125
Erica Evans, @elevanserica, Deseret News
Published: April 17, 2019 3:00 pm
SALT LAKE CITY — Anti-abortion Christians are among those protesting an Environmental Protection Agency proposal that would limit consideration of health benefits in regulating mercury and other toxic emissions from industrial sources like coal-burning power plants.
"Today, over 145,000 pro-life Christians from across the country — including over 94,000 from states that voted for President Trump — are calling on the Trump administration to stop its efforts to dismantle protections that defend children in the womb from mercury pollution," the Rev. Mitchell C. Hescox, president of The Evangelical Environmental Network, wrote for The Hill. "Dismantling these protections is wrong, and it does not square with our faith or the faith of millions of pro-life Americans."
Wednesday was the last day for public comment on changes to the Mercury and Air Toxics Standards, which the Trump administration proposed in December. The changes would overturn federal rules imposed by the Obama administration, which the Environmental Protection Agency now says are too costly to justify.
"Children in the womb are uniquely vulnerable to mercury — a potent neurotoxin — because a protective shield around the developing child’s brain, called the 'blood-brain barrier,' is not fully formed until the first year of life," wrote the Rev. Hescox. "Mercury passes across the mother’s placenta, enters the bloodstream of her child and then into the developing child’s brain, causing brain damage, developmental disabilities, neurological disorders, lowered intelligence and learning difficulties."
Children exposed to mercury during a mother’s pregnancy can experience lifelong IQ and motor function deficits, according to a 2017 study.
The EPA calculated it would cost industry between $7.4 billion and $9.6 billion annually to install the necessary technology to cut mercury, while the health benefits ranged from $4 million to $6 million, The New York Times reported.
The Obama administration, however, had also factored in "co-benefits" from reduced particulate matter linked to mercury emissions, including reduced premature deaths, sick days and hospital visits, adding an additional $80 billion in health benefits a year, according to the article.
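The dispute between the two analyses comes down to simple arithmetic: whether the $80 billion in co-benefits is counted. A sketch using the figures reported above (the midpoint taken for direct benefits is an assumption):

```python
# Back-of-the-envelope comparison using the figures reported above,
# all in billions of US dollars per year.
cost = 9.6              # upper end of the $7.4 billion to $9.6 billion range
direct_benefit = 0.005  # midpoint of the $4 million to $6 million estimate
co_benefits = 80.0      # particulate-matter co-benefits in the Obama-era analysis

# Counting only direct mercury benefits, costs dwarf benefits:
net_direct = direct_benefit - cost
# Counting co-benefits as well, benefits dominate:
net_with_co = (direct_benefit + co_benefits) - cost
print(round(net_direct, 3), round(net_with_co, 3))
```

The same rule can thus look like a multibillion-dollar loss or a roughly $70 billion annual gain, depending entirely on whether co-benefits are included, which is precisely what the proposed change would exclude.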
Hal Quinn, president of the National Mining Association, praised the EPA's proposed change, calling the mercury limits "perhaps the largest regulatory accounting fraud perpetrated on American consumers," the Times reported.
But anti-abortion advocates are less concerned with monetary costs and more concerned with costs to lives.
"As an evangelical Christian, I believe that all human life is sacred; that each person conceived is of equal and innate value, and that all human life is worthy of protection," the Rev. Jeremy Summers, chairman of the board of The Evangelical Environmental Network, wrote for The State. "Our commitment to Jesus Christ compels us to do all we can to protect unborn children from mercury poisoning. It is a pro-life concern, plain and simple."
Melody Zhang, a member of Young Evangelicals for Climate Action and the Climate Justice Coordinator for Sojourners, a Christian advocacy organization, testified before the EPA on March 15. She wrote that on the morning of the hearing, she gathered together with more than 30 other faith leaders "in prayer and petitioning against the rollback of the Mercury Rule."
"We are motivated by our Christian faith to act as responsible stewards of creation and to protect the abundance and diversity of the fullness of life by advocating for alternatives to fossil fuel energy which are more equitable and just, for the sake of the most vulnerable among us," Zhang testified before the EPA. "Any amount of exposure to mercury is a real threat to the development of any child in utero and early in life."
There are no known safe levels of mercury. Coal-fired electric utilities are America's largest source of mercury, according to a 2018 study. Once emitted, mercury is deposited in bodies of water, consumed by fish and passed into the human food supply.
The Economist reported that coal plants have spent billions in compliance costs and mercury emissions have fallen by nearly 90 percent since the regulation was implemented in 2011. As a result, 11,000 deaths have been prevented each year, according to Harold P. Wimmer, president and CEO of the American Lung Association.
"The rollback to MATS has galvanized people of different faiths to work together," Ricardo Simmons wrote for Daily Camera.
On March 20, 19 national religious organizations joined together in sending a letter to EPA Administrator Andrew Wheeler opposing the change, according to the Albany Times Union. Signatories included the Mormon Environmental Stewardship Alliance, Creation Justice Ministries, Evangelical Lutheran Church in America, The Friends Committee on National Legislation, National Baptist Convention USA, Inc., Union for Reform Judaism, United Church of Christ, Justice and Witness Ministries; and the United Methodist Church – General Board of Church and Society.
They wrote, “Limiting the benefits attributed to curbing mercury pollution in its cost-benefit analysis not only shortchanges the health of vulnerable populations but puts the health and well-being of all communities at risk.”
The U.S. Conference of Catholic Bishops also sent a letter to the EPA stating, “A human life — at any stage of development — has inestimable value because all persons are created in the image of God. Given the threat that these particular pollutants pose to unborn children, some of the most vulnerable among us, these principles must be upheld with a special importance.”
New evidence suggests one of the most important climate change data sets is getting the right answer.
By Chris Mooney April 17
A high-profile NASA temperature data set, which has pronounced the last five years the hottest on record and the globe a full degree Celsius warmer than in the late 1800s, has found new backing from independent satellite records — suggesting the findings are on a sound footing, scientists reported Tuesday.
If anything, the researchers found, the pace of climate change could be somewhat more severe than previously acknowledged, at least in the fastest warming part of the world — its highest latitudes.
“We may actually have been underestimating how much warmer [the Arctic’s] been getting,” said Gavin Schmidt, who directs NASA’s Goddard Institute for Space Studies, which keeps the temperature data, and who was a co-author of the new study released in Environmental Research Letters.
NASA’s flagship data set, known as GISTEMP, is one of two kept by agencies of the U.S. government, the other being maintained by the National Oceanic and Atmospheric Administration. Both data sets — along with several others maintained by institutions and academic groups around the world — are based on a merger of the records of thousands of thermometers spread across Earth’s land surfaces, and a growing volume of ocean measurements from buoys and other instruments.
As the data sets have shown not only steady global warming but also a string of new temperature records, they have come under increased scrutiny, with occasional criticism of the high-profile findings and how they are stitched together. However, the research groups have maintained that their methods are valid and that the different records agree considerably more than they disagree, suggesting that the warming trend they are showing is, more or less, correct.
Enter NASA’s Aqua satellite, which has been in orbit since 2002 and carries an infrared device that can independently measure temperatures at Earth’s surface — at a higher resolution, in fact, than the NASA climate data set itself.
The temperature record provided by the satellite, which runs from 2003 through 2018 at present, supports NASA’s finding that 2016 was the hottest year on record and, generally, that the warming trend continues just as the surface thermometers have claimed, finds the study led by NASA’s Joel Susskind.
[Hurricanes are strengthening faster in the Atlantic, and climate change is a big reason why, scientists say]
“What you end up with is a really impressive correspondence between the trends that you’re seeing in this satellite product, which is totally independent of the surface temperatures, and the interpretations of the weather stations,” said Schmidt, one of Susskind’s three co-authors.
Here is a figure from the study showing how closely NASA’s data set from the years 2003 through 2017 matches the findings of the Atmospheric Infra-Red Sounder on the Aqua satellite, or AIRS — and how those in turn track three other global temperature data sets:
Global mean anomalies for the AIRS and GISTEMP data sets for January 2003 through December 2017, along with three other selected data sets. (Susskind et al., Environmental Research Letters, 2019)
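The "really impressive correspondence" Schmidt describes is, in statistical terms, a high correlation between anomaly series. As a rough illustration, a Pearson correlation between two such series can be computed as below; the numbers are invented for the example, not the study's data.

```python
# Minimal sketch: quantifying agreement between two temperature-anomaly
# series as a Pearson correlation coefficient. Values are made up.
import statistics

def pearson(xs, ys):
    """Pearson correlation of two equal-length numeric sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical annual anomalies (deg C) for two independent products:
giss = [0.62, 0.65, 0.54, 0.68, 0.74, 0.90, 1.01, 0.92]
airs = [0.60, 0.66, 0.55, 0.70, 0.72, 0.92, 1.00, 0.93]
print(round(pearson(giss, airs), 3))
```

A value near 1.0 means the two records rise and fall together year by year, which is the kind of agreement the figure from the study shows between AIRS and the station-based data sets.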
Notably, AIRS sometimes shows more warming than the NASA data set, and especially does so in the Arctic, a region where measurements are scarce and warming is fastest. Shockingly, it even finds that over the Barents and Kara seas in the Arctic, the warming trend is at a rate of 2.5 degrees Celsius — or 4.5 degrees Fahrenheit — per decade.
This suggests that, if anything, Earth as a whole may be warming faster than NASA has until now claimed — not more slowly.
“These findings should help put to rest any lingering concerns that modern warming is somehow due to the location of sensors in urban heat islands or other measurement errors at the surface,” said Zeke Hausfather, a researcher at the University of California at Berkeley who works on another of the temperature data sets — called Berkeley Earth — and commented on the new study, with which he was not involved.
“The AIRS satellite data captures the whole surface of the planet and shows that, if anything, our surface measurements are slightly underestimating the rate of modern warming,” he said.
The study also reinforces "that the Arctic is warming much faster than the rest of the world, and that correctly estimating temperatures in the region is important to understanding what is happening to the world as a whole,” Hausfather said.
The new research “confirms (yet again) from an independent source that the surface temperature records over the past couple of decades are robust,” added Ed Hawkins, a climate researcher at the University of Reading in Britain, by email.
The methodologies used to calculate Earth’s temperature are being improved all the time — and the data sets are constantly updated with the most recent information. Lively debates will persist about how to deal with some of the problems involved in this process, such as that cities tend to be warmer than the countryside, and that records are far more numerous and reliable today than they were at the close of the 19th century or a little bit before it, when the data sets begin.
But the new study suggests none of this weakens the major conclusion: warming is ongoing, and Earth keeps pushing record temperature highs, at least in the context of the past 140 years or so.
“For all the issues that there are, the patterns are not just qualitatively right, they’re pretty much quantitatively right, too,” Schmidt said.
This paper presents Atmospheric Infra-Red Sounder (AIRS) surface skin temperature anomalies for the period 2003 through 2017, and compares them to station-based analyses of surface air temperature anomalies (principally the Goddard Institute for Space Studies Surface Temperature Analysis (GISTEMP)). The AIRS instrument flies on EOS Aqua, which was launched in 2002 and became stable in September 2002. AIRS surface temperatures are completely satellite-based and are totally independent of any surface-based measurements. We show in this paper that satellite-based surface temperatures can serve as an important validation of surface-based estimates and help to improve surface-based data sets in a way that can be extended back many decades to further scientific research. AIRS surface temperatures have better spatial coverage than those of GISTEMP, though at the global annual scale the two data sets are highly coherent. As in the surface-based analyses, 2016 was the warmest year yet.
The GISTEMP data set, and the totally independent satellite-based AIRS surface skin temperature data set, are very consistent with each other over the past 15 years. Both data sets demonstrate that the Earth's surface has been warming globally over this time period, and that 2016, 2017, and 2015 have been the warmest years in the instrumental record, in that order. In addition to being an independent data set, AIRS products complement those of GISTEMP because they are at a higher spatial resolution than those of GISTEMP and have more complete spatial coverage, despite a shorter record. Differences in the products (and lower temporal correlations) mostly reflect areas without much directly observed station data (the Arctic, Southern Ocean, sub-Saharan Africa) suggesting that the fault lies in the station-based products rather than with the AIRS data. Notably, surface-based data sets may be underestimating the changes in the Arctic.
The characteristics of the Earth's climate change continually. Complementary satellite-based surface temperature analyses serve as an important validation of surface-based estimates, and they may point the way to make improvements in surface-based products that can perhaps be extended back many decades.
To fight climate change, the city is forcing large buildings, like the Empire State Building and Trump Tower, to reduce greenhouse gas emissions.
Buildings like the Freedom Tower and the Empire State Building could face fines of up to millions of dollars per year if they do not significantly reduce emissions by 2030.
By William Neuman
April 17, 2019
New York City is about to embark on an ambitious plan to fight climate change that would force thousands of large buildings, like the Empire State Building and Trump Tower, to sharply reduce their greenhouse gas emissions.
The legislation, expected to be passed by the City Council on Thursday, would set emission caps for many different types of buildings, with the goal of achieving a 40 percent overall reduction of emissions by 2030. Buildings that do not meet the caps could face steep fines.
The effort comes as New York, among other states, has undertaken a number of initiatives to reduce carbon emissions, and as the country debates the merits and necessity of the Green New Deal, the congressional proposal to tackle climate change and create new high-paying jobs in clean energy industries.
But New York City’s move — opposed by real estate industry executives in part because of the associated costs to meet the new targets — may be unprecedented, according to John Mandyck, the chief executive of the Urban Green Council, an umbrella group that includes real estate developers and environmental groups, among others.
“This is huge,” Mr. Mandyck said. “I haven’t seen a city that has tackled climate change head-on in a way like this, setting specific targets for buildings and providing a path forward for how they can comply through innovative policy tools.”
Buildings are among the biggest sources of greenhouse gas emissions because they use lots of energy for heating, cooling and lighting, and they tend to be inefficient, leaking heat in the winter and cool air in the summer through old windows or inadequate insulation. An inventory of greenhouse gas emissions published in 2017 found that buildings accounted for 67 percent of the city’s emissions.
The legislation, part of a package of bills known as the Climate Mobilization Act, comes after little progress was made during years of efforts to nudge, cajole or provide incentives to building owners to make voluntary cuts in energy use.
Mayor Bill de Blasio, who called the bill “very aggressive” during an unrelated news conference last week, has said that he would sign the legislation, and that his administration had worked closely with the Council as it drafted the bill. The mayor has been toying with running for president, and he has sought opportunities to project himself on a national stage as a champion of progressive causes, including as a leader in fighting climate change.
“It’s going to revolutionize our ability to reduce emissions through our buildings which are really our No. 1 problem here in New York City,” the mayor said.
The cost to building owners will be high. Mark Chambers, the director of the Mayor’s Office of Sustainability, said the cumulative cost to building owners to make the upgrades needed to meet the caps would exceed $4 billion. He added that building owners would eventually recoup those costs through lower operating expenses.
Many types of buildings were exempted from the caps, including houses of worship and apartment houses with rent-regulated units and other types of affordable housing. Those buildings are required to carry out several energy-saving measures, such as insulating pipes, but those measures fall well short of the costly steps required to meet the caps.
Real estate industry executives say that while they support reducing emissions, they believe too many types of buildings were given exemptions, placing an undue burden for reducing the city’s greenhouse gas output on the remaining buildings.
Ed Ermler is the board president of a group of four co-op apartment buildings with a total of 437 units in Jackson Heights, Queens. He said that in recent years he has invested hundreds of thousands of dollars to install computerized boiler controls and other systems to make the 1950s-era buildings, called Roosevelt Terrace, more energy efficient, and yet the target that the city has set is still “totally unattainable.”
“To get down to even 20 percent from where I am today, with the technology that exists, there’s nothing more that I can do,” Mr. Ermler said. “It’s not like there’s this magic wand.”
He said that the majority of apartment owners in the complex are over 65 years old. “Most are on a fixed income and I have to be very cognizant about anything that I do because I don’t want to put an undue burden on people that can’t afford it.”
Carl Hum, the general counsel for the Real Estate Board of New York, which primarily represents large developers, expressed concern that the exemptions given to thousands of large buildings would undercut the goal.
He also said that some businesses have much higher energy consumption — such as media, technology or life sciences companies — and that the law would effectively discourage landlords from leasing space to them, for fear of paying fines when the energy use of their buildings exceeds the caps.
“The overall effect is going to be that an owner is going to think twice before she rents out any space: ‘Is the next tenant I’m renting to going to be an energy hog or not?’” Mr. Hum said. “There’s a clear business case to be made that having a storage facility is a lot better than having a building that’s bustling with businesses and workers and economic activity.”
Mr. Chambers said that a new city agency that would be created by the law, the Office of Building Energy and Emissions Performance, would consider appeals from building owners and be empowered to give variances for tenants that have high energy needs.
“They would come to the city and explain that they have an outsize energy usage because of this particular use type, and their compliance path would be adjusted as a result,” Mr. Chambers said.
The math of the law is complex. Many large buildings have been tracking and reporting their energy use and related factors for years — a process known as benchmarking — allowing the city to calculate the greenhouse gas emissions per square foot for individual buildings.
The caps set limits for different types of buildings, such as apartment houses or office buildings. For instance, the city’s benchmarking website shows that the Empire State Building produces 6.27 kilograms of carbon dioxide per square foot in the course of a year.
The cap would require it to emit just 4.53 kilograms of CO2 per square foot in 2030, or pay a fine. Fines for large buildings could reach into the millions of dollars a year, depending on how far above the cap they are.
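The arithmetic behind a fine can be sketched from the per-square-foot figures above. This is a back-of-envelope illustration only: the floor area and per-ton penalty rate below are assumptions for a hypothetical large office tower, not figures from the article.

```python
# Back-of-envelope fine calculation for the 2030 cap, using the article's
# per-square-foot figures. Floor area and penalty rate are ASSUMPTIONS
# for a hypothetical large office tower, not numbers from the article.
current_kg_per_sqft = 6.27    # reported current emissions intensity
cap_kg_per_sqft = 4.53        # reported 2030 cap for this building type
floor_area_sqft = 2_700_000   # assumed floor area
penalty_per_tonne = 268       # assumed dollars per metric ton over the cap

excess_kg = (current_kg_per_sqft - cap_kg_per_sqft) * floor_area_sqft
excess_tonnes = excess_kg / 1000
annual_fine = excess_tonnes * penalty_per_tonne
print(f"excess: {excess_tonnes:,.0f} t CO2/yr, fine: ${annual_fine:,.0f}/yr")
```

With these assumed inputs, a gap of 1.74 kilograms of CO2 per square foot compounds to several thousand tonnes a year, which is how a single large building's fine can reach into the millions of dollars.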
“New York City is a complicated, dynamic place with lots of old buildings and new buildings and hospitals and houses of worship and affordable apartments,” said the City Council speaker, Corey Johnson. “So to craft a bill that would make a significant difference while at the same time understanding the variation of building stock was a challenge.”
BY MIRANDA GREEN - 04/18/19 04:15 PM EDT
New York City approved a sweeping climate legislation package Thursday that is being compared to the Green New Deal.
In a 45-2 vote, the city legislature passed the Climate Mobilization Act, which aims to enact the largest carbon reduction measures of any city globally.
At the heart of the package is a bill that would require New York’s largest residential and commercial buildings to reduce carbon emissions 40 percent by 2030 and 80 percent by 2050. In comparison, the Green New Deal resolution introduced in Congress aims to get the U.S. electric grid running on 100 percent green energy by 2030.
The NYC bill’s requirements focus on buildings over 25,000 square feet, which represent just 2 percent of New York’s real estate yet account for about half of emissions from all buildings in the city. Overall, 70 percent of emissions in the city come from buildings.
Buildings that would be bound to the new emissions regulations include One World Trade Center and Trump Tower.
A report by the environmental group Climate Works for All Coalition ranked Trump Tower as the fourth-biggest energy user among all buildings in New York City.
Other measures included in the comprehensive climate package include requirements for certain structures to build green roofs or be equipped with solar panels, measures to ease the construction of wind projects, and a requirement for the city to study how to shut down its 24 utility-scale power plants and replace them with renewable energy sources and storage.
Supporters claim the bill would produce 40,000 new jobs, of which nearly 25,000 would be in construction.
“I am proud to be a co-sponsor of Introduction 1253 as it sets ambitious, comprehensive standards on New York City’s worst polluters, old buildings. By modernizing buildings to raise efficiency standards we will dramatically cut pollution long term,” said Council Member Ben Kallos, in a statement.
“Retrofitting for efficiency and sustainability will reduce our City’s carbon footprint and create thousands of much-needed, good paying jobs.”
The bill now awaits NYC Mayor Bill de Blasio's (D) signature.
Environment Group Take: http://alignny.org/wp-content/uploads/2015/11/Elite-Emissions-Final-version-02-1.pdf
"Washington Gov. Jay Inslee (D), who has made climate change the focal point of his presidential campaign, called on the Democratic National Committee (DNC) to hold a debate centered solely on the issue.
“Climate change is the biggest threat facing our nation. And it demands to be the sole focus of a nationally televised debate. Democrats must put this issue front and center,” he said in a petition to pressure the DNC to hold the debate.
“This can’t be a one-off question where candidates get to give a soundbite and move on: Climate change is at the heart of every issue that matters to voters, and voters deserve to hear what 2020 presidential candidates plan to do about it,” he added in a press release."
A BMW electric vehicle at a road-side charging station in Budapest, Hungary. Albert Lugosi/Wikimedia Commons
Electric vehicles will be cost-competitive with combustion-engine cars by 2022, according to a new report by transportation analysts at Bloomberg New Energy Finance (BNEF). The trend is due to the plunging price of EV batteries. Batteries made up 57 percent of an electric vehicle’s total costs in 2015. Today, that number is down to 33 percent, and it is expected to drop to 20 percent by 2025.
The new forecast highlights the rapid development of EV technologies globally. BNEF first suggested in 2017 that electric cars would become cost-competitive with conventional cars in 2026. A year later, BNEF changed it to 2024. The analysts argue that this competitive pricing will allow customers to make a choice between a combustion or electric car based on style, performance, and preference — not cost.
BNEF also notes that the cost of electric powertrain systems is also dropping. By 2030, costs for motors, inverters, and power electronics could be 25 to 30 percent cheaper than today. The affordability of EVs will also be helped by a modest rise in costs for combustion vehicles as they utilize lighter-weight materials or other technologies to meet fuel efficiency targets.
And as Nathaniel Bullard, an analyst with BNEF, writes on Bloomberg Opinion, cheaper electric vehicle batteries don’t “just mean cheaper electric passenger cars. It also means all sorts of other vehicles that weren’t previously practical to electrify now are.” Bullard points to a new all-electric excavator developed by Komatsu Ltd., Asia’s top construction equipment maker.
Every stage of plastic's life cycle, from fossil fuel extraction to disposal, produces greenhouse gases. A new study looked at ways to lower the toll.
By Phil McKenna
Apr 15, 2019
The greenhouse gas emissions associated with plastics are projected to be nearly four times greater by mid-century. Increasing the use of renewable energy, plant-based feedstocks and aggressive recycling and reducing demand can help lower the impact.
As concern over plastic waste grows, researchers are raising red flags about another problem: plastic's rapidly growing carbon footprint. Left unchecked, greenhouse gas emissions associated with plastics will be nearly four times greater by mid-century, when they are projected to account for nearly one-sixth of global emissions.
Not all plastics have the same carbon footprint, though. What they are made from, the source of the energy that powers their production, and how they are disposed of at the end of their life cycle all make a difference.
In a study published Monday in the scientific journal Nature Climate Change, researchers compared the life cycle emissions of different types of plastics, made from fossil fuels and from plants, and looked for ways to lower their total greenhouse gas emissions.
They found that there is no silver bullet. Every combination of plastics production and end-of-life disposal generates greenhouse gas emissions. But by combining four different approaches, they found they could lower emissions up to 93 percent compared to business as usual by 2050 if each measure was taken to the extreme.
The researchers found that using 100 percent renewable energy in plastics production with fossil fuel-based feedstock could reduce greenhouse gas emissions by half compared to business as usual in 2050, though it would still be roughly twice the emissions generated from plastics in 2015.
Adding both aggressive recycling that keeps plastics out of landfills and incinerators and efforts that reduce the growth in demand for plastics by half to that scenario could further lower greenhouse gas emissions from plastics to roughly the amount produced in 2015.
Using plant-based feedstocks instead of fossil fuels could lower it even more, the researchers found.
The most effective combination the researchers found was to use a plant-based feedstock (sugarcane in this case), with 100 percent renewable energy for production, recycling of all plastics rather than incinerating or dumping them in landfills, and reducing the annual growth in demand for plastics by half.
That combination could theoretically reach a 93 percent reduction compared to business as usual in 2050, or about a 74 percent reduction from 2015 levels, the researchers found.
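The two figures quoted for the same scenario, 93 percent below business-as-usual and 74 percent below 2015 levels, can be cross-checked with a few lines of arithmetic (2015 emissions normalized to 1):

```python
# Check the internal consistency of the two reduction figures:
# 93% below business-as-usual (BAU) in 2050 vs. 74% below 2015 levels.
e_2015 = 1.0          # normalize 2015 emissions to 1 unit
cut_vs_bau = 0.93     # best-case reduction expressed against BAU in 2050
cut_vs_2015 = 0.74    # the same scenario expressed against 2015 levels

# Remaining emissions are identical in both framings, so the BAU 2050
# level implied by the two numbers is their ratio of remainders.
bau_2050 = e_2015 * (1 - cut_vs_2015) / (1 - cut_vs_bau)
print(f"implied BAU 2050 emissions: {bau_2050:.1f}x 2015 levels")
```

The implied ratio of roughly 3.7 is consistent with the article's projection that, left unchecked, emissions from plastics will be nearly four times greater by mid-century.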
The Unprecedented Scale and Pace Required
The authors readily acknowledge that this would require implementing these strategies at "an unprecedented scale and pace" in an industry that is projected to have sustained 4 percent annual growth through 2050.
Less than 1 percent of the plastic produced globally was made from biological feedstocks and only 18 percent of plastic waste was recycled as of 2015.
Shifting all plastic from petroleum to bio-based feedstocks such as corn or sugarcane would also require as much as 5 percent of all arable land, the researchers said. Such a shift under current agriculture practices would put added pressure on food security and freshwater resources, though other plant-based feedstocks, such as crop residue like leaves and stalks, may yet emerge.
"The strategies that we tested are anywhere between unrealistic to ridiculous, to be honest," said Sangwon Suh, a professor of environmental science and management at the University of California, Santa Barbara, and a co-author of the study. "What we realized as a result of the study was the magnitude of the challenge that we are facing really requires an unprecedented level of effort to mitigate greenhouse gas emissions."
To limit global warming to 2 degrees Celsius or less, a target in the Paris climate agreement, scientists say greenhouse gas emissions must be reduced to near zero by mid-century.
"That is a tremendous challenge, and there is no room for us to increase our greenhouse gas emissions," Suh said.
Plastics Also Release Methane as They Degrade
The study looked at plastics' life cycle greenhouse gas emissions—emissions associated with every aspect of plastics, from extracting oil and natural gas for fossil fuel-based plastics, to manufacturing, to end of life processes, such as dumping plastic waste in landfills, recycling it or incinerating it.
Though not accounted for in the current study, roughly 3 percent of plastic waste produced each year ends up in oceans at the end of its life. A study published last fall concluded that as ocean plastic slowly degrades, it emits methane and other pollutants into the atmosphere. Methane, a short-lived climate pollutant, is also emitted during oil and gas extraction and is many times more potent than carbon dioxide.
The amount of methane currently released from decomposing plastic in the world's oceans is tiny, likely less than 1 percent of total methane emitted into the atmosphere each year from natural and manmade sources, according to Sarah-Jeanne Royer, a marine scientist at the University of Hawaii and lead author of the ocean plastics study. Unless waste management practices improve, however, the amount of plastic entering the oceans could be 10 times greater in 2025 than it was in 2015, according to a 2015 study in the journal Science.
Cost an Issue for Plant-Based Plastics
Efforts so far to replace polyethylene and polypropylene, two of the most common types of plastic used in everything from plastic bags and bottles to pipes and containers, have fallen short, said Geoffrey W. Coates, a chemistry and chemical biology professor at Cornell University.
"There is not a bio-based polymer that gets anywhere close to replicating those materials," he said.
Coates added, however, that for more niche applications, other solutions are being developed. Novomer, a company Coates co-founded, uses carbon dioxide as a feedstock to make polyurethane, a polymer used in paints, varnishes, and foams.
Cost also poses a challenge as plant-based plastics try to gain a foothold in an industry where petroleum-based plastics benefit from economies of scale and a half-century head start over newer bio-based alternatives. Developing bio-based plastics that can compete with petroleum-based plastics will come at a cost, said Marc Hillmyer, director of the Center for Sustainable Polymers at the University of Minnesota and co-founder of a company that is exploring new bio-based plastic materials.
"The economics of bio-based resources are not at parity with fossil fuel-based sources," Hillmyer said. "The real question is are we willing to pay for it?"
News Release 12-Apr-2019: The trouble with thaw
About one fourth of the Northern Hemisphere is covered in permafrost. Now, these permanently frozen beds of soil, rock, and sediment are actually not so permanent: They're thawing at an increasing rate.
Human-induced climate change is warming these lands, melting the ice, and loosening the soil. This may sound like any benign spring thaw, but the floundering permafrost can cause severe damage: forests are falling; roads are collapsing; and, in an ironic twist, the warmer soil is releasing even more greenhouse gases, which could exacerbate the effects of climate change.
From the first signs of thaw, scientists rushed to monitor emissions of the two most influential anthropogenic (human-generated) greenhouse gases (carbon dioxide and methane). But until recently, the threat of the third largest (nitrous oxide) has largely been ignored.
In the Environmental Protection Agency's (EPA) most recent report (from 2010), the agency rates these emissions as "negligible." Perhaps because the gas is hard to measure, few studies counter this claim.
Now, a recent paper shows that nitrous oxide emissions from thawing Alaskan permafrost are about twelve times higher than previously assumed. "Much smaller increases in nitrous oxide would entail the same kind of climate change that a large plume of CO2 would cause," says Jordan Wilkerson, first author and graduate student in the lab of James G. Anderson, the Philip S. Weld Professor of Atmospheric Chemistry at Harvard.
Since nitrous oxide is about 300 times more potent than carbon dioxide, this revelation could mean that the Arctic--and our global climate--are in more danger than we thought.
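The potency comparison works through global warming potentials (GWPs), which convert a mass of any greenhouse gas into the mass of CO2 with the same warming effect over a chosen horizon. A minimal sketch using approximate 100-year GWP values; the emission masses are invented for illustration:

```python
# CO2-equivalent comparison using approximate 100-year global warming
# potentials (GWP-100). The emission masses below are invented examples,
# not measurements from the study.
GWP_100 = {"CO2": 1, "CH4": 28, "N2O": 300}

emissions_tonnes = {"CO2": 1000.0, "CH4": 10.0, "N2O": 2.0}

# Convert each gas to tonnes of CO2-equivalent and total them up.
co2e = {gas: mass * GWP_100[gas] for gas, mass in emissions_tonnes.items()}
total = sum(co2e.values())
for gas, t in co2e.items():
    print(f"{gas}: {t:,.0f} t CO2e ({t / total:.0%} of total)")
```

In this toy mix, just 2 tonnes of N2O carry the warming effect of 600 tonnes of CO2, which is why even modest permafrost nitrous oxide fluxes matter.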
In August 2013, members of the Anderson lab (pre-Wilkerson) and scientists from the National Oceanic and Atmospheric Administration (NOAA) traveled to the North Slope of Alaska. They brought along a plane just big enough for one (small) pilot.
Flying low, no higher than 50 meters above the ground, the plane collected data on four different greenhouse gases over about 310 square kilometers, an area 90 times larger than Central Park. Using the eddy-covariance technique--which measures vertical windspeed and the concentration of trace gases in the atmosphere--the team could determine whether more gas went up than down.
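At its core, the eddy-covariance technique estimates vertical gas flux as the covariance between fluctuations in vertical wind speed and gas concentration: when upward gusts tend to carry higher concentrations, the net flux is upward. A toy sketch with invented sample values (not the study's data):

```python
# Toy eddy-covariance flux: the mean product of the fluctuations
# (deviations from the mean) of vertical wind speed w and gas
# concentration c. All sample values below are invented.
w = [0.3, -0.2, 0.5, -0.4, 0.1, -0.3]           # m/s, vertical wind speed
c = [402.1, 401.8, 402.4, 401.7, 402.0, 401.9]  # gas concentration

w_mean = sum(w) / len(w)
c_mean = sum(c) / len(c)
# Covariance of the fluctuations = turbulent vertical flux.
flux = sum((wi - w_mean) * (ci - c_mean) for wi, ci in zip(w, c)) / len(w)
print(f"flux: {flux:.4f} (concentration units x m/s)")
```

A positive value indicates net upward transport. In practice, the raw measurements are sampled at high frequency and corrected for density and instrument effects, which this sketch omits.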
In this case, what goes up, does not always come down: Greenhouse gases rise into the atmosphere where they trap heat and warm the planet. And, nitrous oxide poses a second, special threat: Up in the stratosphere, sunlight and oxygen team up to convert the gas into nitrogen oxides, which eat at the ozone. According to the EPA, atmospheric levels of the gas are rising, and the molecules can stay in the atmosphere for up to 114 years.
In Alaska, Anderson's field team focused on carbon dioxide, methane, and water vapor (a natural greenhouse gas). But, their little plane picked up nitrous oxide levels, too.
When Wilkerson joined the lab in 2013, the nitrous oxide data was still raw, untouched. So, he asked if he could analyze the numbers as a side-project. Sure, Anderson said, go right ahead. Both of them expected the data to confirm what everyone already seemed to know: Nitrous oxide is not a credible threat from permafrost.
Wilkerson ran the calculations. He checked his data. He sent it to Ronald Dobosy, the paper's second author, an Atmospheric Scientist and eddy-covariance expert at the Oak Ridge Associated Universities (ORAU) at NOAA.
After triple checks, Wilkerson had to admit: "This is widespread, pretty high emissions." In just one month, the plane recorded enough nitrous oxide to fulfill the expected cap for an entire year.
Still, the study only collected data on emissions during August. And, even though their plane covered more ground than any previous study, the data represents just 310 of the 14.5 million square kilometers in the Arctic, like using a Rhode Island-sized plot to represent the entire United States.
Even so, a few recent studies corroborate Wilkerson's findings. Other researchers have used chambers--covered, pie plate-sized containers planted into tundra--to monitor gas emissions over months and even years.
Other studies extract cylindrical "cores" from the permafrost. Back in a lab, the researchers warm the cores inside a controlled environment and measure how much gas the peat releases. The more they heated the soil, the more nitrous oxide leaked out.
Both chambers and cores cover even less ground (no more than 50 square meters) than Anderson's airborne system. But together, all three point to the same conclusion: Permafrost is emitting far more nitrous oxide than previously expected. "It makes those findings quite a bit more serious," Wilkerson says.
Wilkerson hopes this new data will inspire further research. "We don't know how much more it's going to increase," he says, "and we didn't know it was significant at all until this study came out."
Right now, eddy-covariance towers--the same technology the Anderson crew used in their plane--monitor both carbon dioxide and methane emissions across the Arctic. Anderson was the first to use airborne eddy-covariance to collect data on the region's nitrous oxide levels. And, apart from the small-scale but significant chamber and core studies, no one is watching for the most potent greenhouse gas.
Since the Arctic is warming at almost twice the rate of the rest of the planet, the permafrost is predicted to thaw at an ever-increasing rate. These warm temperatures could also bring more vegetation to the region. Since plants eat nitrogen, they could help decrease future nitrous oxide levels. But, to understand how plants might mitigate the risk, researchers need more data on the risk itself.
In the meantime, Wilkerson hopes researchers hurry to collect this data, whether by plane, tower, chamber, or core. Or better yet, all four. "This needs to be taken more seriously than it is right now," he says.
The permafrost may be stuck in a perpetual climate change cycle: As the planet warms, permafrost melts, warming the planet, melting the frost, and on and on. To figure out how to slow the cycle, we first need to know just how bad the situation is.
News Release 15-Apr-2019
UNIVERSITY PARK, Pa. -- The weather forecasts that provide storm warnings and help us plan our daily lives could one day extend up to five days further before reaching the limits of numerical weather prediction, scientists said.
"The obvious question that has been raised from the very beginning of our whole field is, what's the ultimate limit at which we can predict day-to-day weather in the future," said Fuqing Zhang, distinguished professor of meteorology and atmospheric science and director of the Center for Advanced Data Assimilation and Predictability Techniques at Penn State. "We believe we have found that limit and on average, that it's about two weeks."
Reliable forecasts are now possible nine to 10 days out for daily weather in the mid-latitudes, where most of Earth's population lives. New technology could add another four to five days over the coming decades, according to research published online in the Journal of the Atmospheric Sciences.
The research confirms a long-hypothesized predictability limit for weather prediction, first proposed in the 1960s by Edward Lorenz, a Massachusetts Institute of Technology mathematician, meteorologist and pioneer of chaos theory, scientists said.
"Edward Lorenz proved that one cannot predict the weather beyond some time horizon, even in principle," said Kerry Emanuel, professor of atmospheric science at MIT and coauthor of the study. "Our research shows that this weather predictability horizon is around two weeks, remarkably close to Lorenz's estimate."
Unpredictability in how weather develops means that even with perfect models and understanding of initial conditions, there is a limit to how far in advance accurate forecasts are possible, scientists said.
"We used state-of-the-art models to answer this most fundamental question," said Zhang, lead author on the study. "I think in the future we'll refine this answer, but our study demonstrates conclusively there is a limit, though we still have considerable room to improve forecasts before reaching the limit."
To test the limit, Zhang and his team used the world's two most advanced numerical weather prediction modeling systems -- the European Centre for Medium-Range Weather Forecasts model and the U.S. Next Generation Global Prediction System.
They provided a near-perfect picture of initial conditions and tested how the models could recreate two real-world weather events, a cold surge in northern Europe and flood-inducing rains in China. The simulations were able to predict the weather patterns with reasonable accuracy up to about two weeks, the scientists said.
Improvements in day-to-day weather forecasting have implications for things like storm evacuations, energy supply, agriculture and wildfires.
"We have made significant advances in weather forecasting for the past few decades, and we're able to predict weather five days in advance with high confidence now," Zhang said. "If in the future we can predict additional days with high confidence, that would have a huge economic and social benefit."
Researchers said better data collection, algorithms to integrate data into models and improved computing power to run experiments are all needed to further improve our understanding of initial conditions.
"Achieving this additional predictability limit will require coordinated efforts by the entire community to design better numerical weather models, to improve observations, and to make better use of observations with advanced data assimilation and computing techniques," Zhang said.
Y. Qiang Sun, a former graduate student in meteorology and atmospheric science at Penn State and now a postdoctoral fellow at Princeton University, co-led the study with Zhang. Also collaborating were researchers from the National Oceanic and Atmospheric Administration and the European Centre for Medium-Range Weather Forecasts.
The National Science Foundation and the Office of Naval Research partially funded this work.
Paris (AFP) April 15, 2019 - Disneyland Paris on Monday announced a series of measures to make Europe's biggest private tourist attraction more environmentally friendly, including banning plastic straws.
The theme park east of the French capital, which draws 15 million visitors a year, is like a small town in its own right, producing 19 tonnes of waste last year.
It currently recycles paper, glass and 18 other types of materials accounting for around half of all its waste, a level it aims to increase to 60 percent in 2020, said Nicole Ouimet-Herter, the park's environment manager.
Starting Thursday it will bin plastic straws, to be replaced with fully biodegradable paper versions that will be distributed only if patrons request them.
The announcement follows a vote last month in the European Parliament to ban single-use plastic products such as straws, cutlery and cotton buds from 2021.
It comes as pressure mounts on companies and citizens to wean themselves off the plastics blamed for clogging up oceans.
Disneyland Paris, owned by The Walt Disney Company, also announced several other initiatives to clean up its act.
Next week, shops in the park will stop handing out free plastic bags, offering instead the option of purchasing bags made of 80 percent recycled plastic for 1 or 2 euros ($1.13-$2.26).
And starting in June several of the park's hotels will no longer stock bathrooms with small bottles of shower gel or shampoo, replacing them with bigger ones that can be refilled.
Euro Disney, the park's operator, said it was also planning to install solar panels on the sprawling 22-square-kilometre site to get more power from renewables.
Currently, renewable energy sources account for only 10 percent of the electricity used.
"We are undertaking concrete actions to reduce our impact on the environment. But we also have the power to dazzle children and want to have a positive influence on them to encourage them to take care of nature," said Mireille Smeets, Euro Disney's director of corporate social responsibility.
By Marlowe Hood
Paris (AFP) April 15, 2019
A secluded mountain region thought to be free of plastic pollution is in fact blanketed by airborne microplastics on a scale comparable to a major city such as Paris, alarmed researchers reported Monday.
Over a five-month period in 2017-2018, an average of 365 tiny bits of plastic settled every day on each square metre of an uninhabited, high-altitude area in the Pyrenees straddling France and Spain, they reported in the journal Nature Geoscience.
"It is astounding and worrying that so many particles were found in the Pyrenees field site," said lead author Steve Allen, a doctoral student at the University of Strathclyde in Scotland.
The study focused on microplastics mostly between 10 and 150 micrometres across, including fragments, fibres and sheet-like pieces of film.
By comparison, a human hair is, on average, about 70 micrometres in width.
"We would never have anticipated that this study would reveal such high levels of microplastic deposits," added co-author Gael Le Roux, a researcher at EcoLab in Toulouse, in southwestern France.
Plastic litter has emerged in the last few years as a major environmental problem.
Up to 12 million tonnes of plastics are thought to enter the world's oceans every year, and millions more clog inland waterways and landfills.
Plastic takes decades to break down, and even then continues to persist in the environment.
Scientists are only now beginning to measure the damage to wildlife and potential impacts on human health.
A study earlier this year uncovered plastic fragments in the guts of animals living more than 10 kilometres below the ocean surface.
Two whales found beached since the start of the year -- one in the Philippines, the other in Sardinia, Italy -- had 40 and 20 kilos, respectively, of plastic in their stomachs.
Microplastics have also been found in tap water around the world, and even the furthest reaches of Antarctica.
- As polluted as Paris -
"Our most significant finding is that microplastics are transported through the atmosphere and deposited in a remote, high-altitude mountain location far from any major city," co-author Deonie Allen, also from EcoLab, told AFP.
"This means that microplastics are an atmospheric pollutant."
Researchers used two monitoring devices to independently measure particle concentration in an area long considered to be among the most pristine in western Europe.
The nearest village is seven kilometres (more than four miles) away, and the nearest city, Toulouse, is more than 100 kilometres.
While the scientists were able to identify the types of plastic, they could not say with certainty where they came from or how far they had drifted.
Analysing the pattern of air flows, they surmised that some particles had travelled at least 100 kilometres.
"But due to the lack of significant local plastic pollution sources, they probably travelled farther," Deonie Allen said.
Samples -- transported by wind, snow and rain -- were collected at the meteorological station of Bernadouze at an altitude of more than 1,500 metres.
The researchers were stunned to find that the concentrations of microplastic pollution were on a par with those found in major cities, including Paris and the southern Chinese industrial city of Dongguan.
"Our findings are within the range of those reported for greater Paris, and can thus be considered comparable," Deonie Allen told AFP.
"We did not expect the number of particles to be so high."
CHEYENNE, Wyo. — The first golden eagle in Yellowstone National Park fitted with a tracking device has died of lead poisoning, likely after consuming bullet fragments while scavenging the remains of an animal killed by a hunter, officials said Monday.
Wearing a GPS unit like a backpack, the adult, female eagle had flown outside Yellowstone into areas where hunters pursue game such as elk and deer.
The death of the bird was a setback for golden eagle research in Yellowstone but not the end. Several other golden eagles at the park have been fitted with tracking devices.
Doyle Rice, USA TODAY Published 8:00 a.m. ET April 16, 2019 | Updated 2:24 p.m. ET April 16, 2019
Fishing on the Gila River in New Mexico. The river has been named America's Most Endangered River of 2019. (Photo: Wick Beavers)
Sure, we know about endangered species, but did you know there are endangered rivers, too?
Environmental group American Rivers released its annual list of the USA's top 10 "most endangered" rivers Tuesday, and this year, the top "dishonor" goes to New Mexico’s Gila River, which earned the spot due to the grave threat that climate change and a proposed diversion project pose to the state’s last free-flowing river.
"New Mexicans can’t afford to dry up their last wild river,” said Matt Rice, Colorado Basin director for American Rivers. “Ruining the Gila River with an expensive diversion project doesn’t make sense when there are better, more cost-effective water supply options.”
These rivers aren't the nation's "worst" or most polluted rivers. According to American Rivers, three factors put rivers on the list: the significance of the river to human and natural communities; the magnitude of the threat to the river and its nearby communities, especially in light of a changing climate; and a major decision that the public can help influence in the coming year.
“Climate change is striking rivers and water supplies first and hardest,” said Bob Irvin, President and CEO of American Rivers, in a statement. “America’s Most Endangered Rivers is a call to action. We must speak up and take action, because climate change will profoundly impact every river and community in our country."
In addition to the 10 most troubled rivers, American Rivers also named the Cuyahoga River in Ohio as the "River of the Year." The title celebrates the progress Cleveland has made in cleaning up the Cuyahoga, 50 years after the river’s infamous fire that helped spark the USA's environmental movement.
American Rivers has been compiling an annual list of the nation's most endangered rivers since 1984.
Here is American Rivers' list of the top 10 most endangered rivers.
1. Gila River, New Mexico
2. Hudson River, New York
3. Upper Mississippi River, Illinois, Iowa, Missouri
4. Green-Duwamish River, Washington
5. Willamette River, Oregon
6. Chilkat River, Alaska
7. South Fork Salmon River, Idaho
8. Buffalo National River, Arkansas
9. Big Darby Creek, Ohio
10. Stikine River, Alaska
By Elizabeth Gribkoff
Apr 16 2019
Vermont’s largest electricity utility, Green Mountain Power, has announced plans to procure 100% of its electricity from renewable sources by 2030.
Three other utilities in the state — Burlington Electric Department, Swanton Electric Department and Washington Electric Co-op — already have 100% renewable portfolios.
Mary Powell, the chief executive of GMP, made the announcement Saturday in South Burlington. She said the utility, which serves 75% of the state, plans to have a “carbon free” electricity supply by 2025.
“The U.N. Intergovernmental Panel on Climate Change report makes clear, we have to act now, and take bold steps to cut carbon,” Powell said in a statement, referencing a bombshell IPCC report that came out last fall.
Capping global warming at 1.5 degrees Celsius would require global carbon dioxide emissions to decline 45 percent from 2010 levels in the next 12 years, the report's authors wrote.
Green Mountain Power’s current power supply is just over 60 percent renewable and 90 percent carbon free, which Josh Castonguay, the company’s chief innovation officer, referred to as “a good starting point.”
Nuclear, which accounted for an estimated 27.9 percent of the utility’s supply in 2018, does not count as a renewable source of energy under state law. Castonguay said there’s going to be a “transition period” over the next decade away from nuclear.
“That’s why we’re doing this phased-in approach,” he said.
GMP plans to have a fully renewable energy supply through a combination of increased locally generated solar and energy storage and by purchasing more wind and hydro energy, according to a statement from the company.
Almost all of the utility’s renewable electricity currently comes from hydro. GMP owns 44 of its own hydroelectric plants and has power purchase agreements in place to buy additional hydroelectricity. Castonguay said the utility will maximize how much power is coming from existing hydro-facilities and purchase more as needed.
GMP estimates that solar accounted for 1.7% of its power supply in 2018, after the sale of renewable energy credits. The utility has two existing solar and battery storage projects in Rutland and Panton, with three more planned projects approved by the Public Utility Commission in Essex, Ferrisburgh and Milton. And over 15,000 GMP customers have solar systems through third-party solar companies, according to the company’s press release.
Mary Powell, CEO of Green Mountain Power. Photo by John Herrick/VTDigger
“We continue to have net metering planned in our future…but we’re also looking at other solar opportunities” like community solar, said Castonguay, adding that battery storage was key for managing additional solar. GMP draws on stored power in home energy storage devices to reduce peak demand, which is when electricity is most costly.
Other areas of likely growth in renewables include off-shore wind and methane digesters, said Castonguay.
Kristin Kelly, a spokesperson for GMP, said that by aiming to meet the 100% renewable target in more than ten years, the utility will have time to meet it in a way that is “cost effective” for customers.
By law, the state’s Department of Public Service intervenes in rate cases pending before the PUC to represent the “public interest.” Riley Allen, the department’s deputy commissioner, said it feels GMP can meet its goals in a way that is “mindful of the affordability concerns.”
Vermont passed a law in 2015 that requires electrical utilities to source increasing amounts of electricity from renewable sources. By 2032, renewable electricity must make up 75% of their sales, with 10% of that from in-state, distributed generation, such as net-metered solar.
“We think our utilities are generally ahead of the game there,” Allen said of the renewable energy standard. “And given the availability of relatively low cost renewable resources, we think all that is achievable.”
GMP is playing catch up on renewables, compared to some of the state’s smaller electricity providers.
Burlington Electric Department and Washington Electric Co-op became 100 percent renewable in 2014, according to executives at the companies.
The Swanton Electric Department has essentially been renewably powered since its founding in 1894, according to Reg Beliveau, Swanton village manager. The department purchased a dam at Highgate Falls on the Missisquoi River that was built in the late 1700s.
“Since then, we’ve produced almost all of our electricity through that facility,” he said.
Olivia Campbell Andersen, director of trade group Renewable Energy Vermont, said in a statement Monday that REV “applauds GMP’s vision” to go 100% renewable, calling on the state to update energy laws to smooth the way for more in-state renewables.
For instance, developers currently face “redundant” permitting to put up solar canopies on parking lots, she said in an interview Tuesday.
“We’re kind of asking regulators to get out of the way because there are a lot of unintended barriers in our existing energy laws,” she said.
By Jennifer A Dlouhy
Order aims to undermine local authority to block gas pipelines
New York has wielded authority against at least three projects
Developers have been trying for six years to build a 124-mile natural gas pipeline from Pennsylvania to New York. Despite winning federal approval in 2014, the project is still no closer to reality.
Enter President Donald Trump, who on Wednesday is poised to issue two executive orders to promote energy infrastructure, including projects like the long-stalled Constitution Pipeline, according to an administration official. The orders will direct federal agencies to make specific changes meant to remove bottlenecks to natural gas transport in the Northeast and streamline federal reviews of border-crossing pipelines and other infrastructure.
One of the orders seeks to short-circuit regulators in New York who have denied the planned pipeline a crucial permit, invoking their powers under the Clean Water Act to reject projects they deem a threat to water supplies and the environment.
Other states and tribes have wielded the same power to restrict a coal export terminal and a hydropower project on the U.S. West Coast -- moves that have frustrated the industry.
The Clean Water Act wasn’t “intended to give a state veto power,” said Dena Wiggins, president of the Natural Gas Supply Association. “The actions New York is taking not only impact New York, they are impacting the entire Northeast, because we can’t get a pipeline through the state in order to provide gas service to the Northeast.”
Trump’s orders, slated to be unveiled during a visit to the International Union of Operating Engineers International Training and Education Center in Crosby, Texas, come as the president continues to chafe at regulatory barriers he says throttle the full potential of American “energy dominance.”
One of the executive orders is aimed at facilitating the shipments of U.S. natural gas in the Northeast, where limited pipeline capacity and legal constraints prompted the region to import natural gas from Russia last year, according to the administration official who asked not to be identified. It would ask the Department of Transportation to propose a new regulation to treat liquefied natural gas like other cryogenic liquids, allowing it to be shipped by rail in approved tank cars, something that isn’t allowed today.
Trump will also direct regulators to develop a master agreement for placing energy infrastructure in federal rights-of-way -- a move aimed at expediting approvals and renewals.
A separate executive order is aimed at bolstering cross-border energy infrastructure by limiting environmental reviews of the ventures, following long delays on high-profile projects such as TransCanada Corp.’s Keystone XL oil pipeline.
Trump is set to replace two executive orders -- one dating to 1968 -- and strip the State Department of some of its power to scrutinize the environmental impacts of border-crossing oil pipelines and other infrastructure, the official said. Under the president’s coming order, those permit decisions would rest with the president, and the department would instead be responsible for advising the president on the projects within 60 days of receiving an application.
Energy Department review of border-crossing electricity projects and Federal Energy Regulatory Commission scrutiny of natural gas pipelines that span U.S. borders would not be affected.
Trump’s orders represent a reprise of earlier presidential memos and orders Trump issued in 2017 in a bid to hasten permitting of U.S. energy infrastructure.
His action is unlikely to jump-start widespread construction, since it’s up to Congress -- not the president -- to restrict states’ authority under the Clean Water Act. And the initiative isn’t expected to solve legal problems thwarting several pipelines in the Mid-Atlantic U.S., which hinge on inadequate Interior Department reviews -- not formal objections from states.
Still it marks a formal push by Trump to rein in states that have emerged as a major barrier to constructing pipelines.
Even if it’s not a “silver bullet,” the pipeline order “will be construed as opening the door to overcoming these hurdles that states are throwing up,” said Christi Tezak, managing director at ClearView Energy Partners.
Trump’s pipeline order will direct the Environmental Protection Agency to revise its guidance and regulations for how states can use their certification power to vet projects that cross wetlands, rivers and other bodies of water.
Those Section 401 certifications -- named for a provision in the Clean Water Act -- are supposed to be approved, denied or waived by states and tribes within a year of a project application. Yet federal regulators have said the one-year clock can be restarted whenever developers submit new or revised applications for state review.
Fights over state permits have led to years-long delays and protracted federal court battles, with some analysts urging investors to consider “elevated risk premiums” in making decisions about projects designed to cross some particularly challenging states.
In addition to the Constitution pipeline, a joint venture between Cabot Oil & Gas Corp., Williams Cos., Duke Energy Corp. and AltaGas Ltd., New York also has denied water quality certifications for Millennium Pipeline Co. LLC’s CPV Valley lateral project in the state’s Orange County and National Fuel Gas Co.’s Northern Access project.
The denials have drawn scrutiny because of the state’s prime location, standing between a prolific shale gas formation and consumers throughout the Northeast U.S. that are hungry for it.
New York Governor Andrew Cuomo has said that “no corporation should be allowed to endanger our natural resources,” and vowed the state “will not relent in our fight to protect our environment.” And the state’s Department of Environmental Conservation said by email that New York “will continue to use all available means to vigorously oppose any efforts to reduce states’ ability to protect our water resources.”
Trump’s order can’t override provisions in the Clean Water Act that give states a powerful role vetting such projects, Tezak said.
Read: New York Threatens to Spoil Trump’s Fossil Fuel Push
“An executive order cannot take off the table the ability of a state to say no for reasons it believes are appropriate, and the venue for adjudicating that is not the White House” -- it’s the courts, Tezak said. But “what states will be constrained from doing is meaningfully extending the process.”
Environmentalists have blasted the plan.
“This executive order is nothing but an attempt to trample people’s rights to protect their air, water, and climate from polluting oil and gas pipelines,” said Greenpeace USA Climate Campaigner Rachel Rye Butler.
The president’s effort also has rankled some Republicans -- including governors in the Western U.S. -- who say his initiative threatens to infringe states’ rights.
The Western Governors Association warned Trump in a Jan. 31 letter that curtailing “the vital role of states in maintaining water quality within their boundaries would inflict serious harm to the division of state and federal authorities established by Congress.”
— With assistance by Ari Natter, and Stephen Cunningham
By Miranda Green - 04/10/19 05:35 PM EDT
President Donald Trump on Wednesday signed two executive orders meant to eliminate hurdles to new and existing natural gas pipeline construction across the U.S.
“In a few moments I will sign two groundbreaking executive orders to continue the revival of the American energy industry and will cut through destructive permitting delays and denials," Trump said at an event with engineers in Texas on Wednesday, before signing the two orders. "Where it will take you 20 years to get a permit, those days are gone.”
The actions aim to boost energy infrastructure and remove specific barriers blocking existing plans for cross-country natural gas transportation and interstate pipeline construction.
Speaking at the International Union of Operating Engineers International Training and Education Center in Crosby, Texas, Trump called the presidential orders “groundbreaking” measures to “continue the revival of the American energy industry.”
“We made a lot of progress in the last two and half years haven’t we. We took down a lot of barriers to production and the pumping,” Trump told the crowd.
The orders specifically take aim at key pipeline hold-ups, such as the Constitution Pipeline, a 124-mile natural gas pipeline project from Pennsylvania to New York.
The project received a federal permit in 2014 but has since been halted by state regulators. New York has refused to issue a key water permit to begin construction, arguing the pipeline would threaten groundwater reserves, which the state has the ability to regulate under the Clean Water Act (CWA).
One of the new executive orders will specifically limit such environmental reviews of the projects. Specifically, it will direct the Department of Energy (DOE) and the Environmental Protection Agency (EPA) to clarify a section of the CWA that gives states authority over their water quality permits.
Environmental groups and Democratic lawmakers were quick to push back on the president’s plan, arguing the move threatened the state powers.
Washington Gov. Jay Inslee (D), along with state Attorney General Bob Ferguson (D), called the orders “an unprecedented assault” on states’ rights to protect their water under the CWA.
“No amount of politicking will change the facts — states have full authority under the Clean Water Act to protect our waters and ensure the health and safety of our people. Washington will not allow this or any presidential administration to block us from discharging that authority lawfully and effectively,” Inslee, a 2020 presidential candidate, said in the statement.
On a call with reporters Tuesday night, a senior administration official said Trump’s orders were a way to “build out an energy infrastructure and provide a good, consistent, reliable path forward and relationship between business and federal government going forward.”
The second executive order will focus on easing restrictions on cross-state transport of crude oil and natural gas. It will ask the Department of Transportation to create a new rule that would classify liquefied natural gas (LNG) like other cryogenic liquids, newly allowing it to be shipped by rail in approved tank cars. Critics have long warned that cross-country LNG rail transport poses significant environmental and safety risks, since trains have derailed in the past with deadly results.
The official said the LNG safety standards determined 40 years ago bear “little resemblance” to the larger facilities that exist in the U.S. today.
The presidential orders also give Trump consolidated powers to scrutinize any potential environmental impacts of pipeline construction. The State Department until now has held those powers. Under the new order, the department will instead advise the president on the projects 60 days after a pipeline application is received.
“The president’s Executive Order clarifies that any decision to issue any cross border permit shall be issued only by the president,” the White House official said.
Last year the U.S. became the top producer of LNG globally, a fact Trump highlighted in his speech on Wednesday.
“With the help of the incredible workers in this room, the United States is now the number one producer of oil and natural gas, anywhere in the world, anywhere on the planet,” he said.
A number of fossil fuel companies and industry groups backed Trump’s plan Wednesday.
“Currently, the process for reviewing and approving new or expanded interstate natural gas pipelines is robust and transparent – two things that we continue to believe are essential – but procedural inefficiencies can delay a process that already spans several years,” said Don Santa, CEO of the Interstate Natural Gas Association of America.
The debate on pipeline permitting has been a lightning rod in national energy policy for much of a decade. No other pipeline has developed as much notoriety as the proposed TransCanada Corp. Keystone XL oil pipeline.
Last week Trump issued a new presidential permit for the construction of the Keystone XL pipeline with a facility in Montana, a move seen as a way to circumvent previous court orders halting development.
The order superseded a March 2017 permit that was invalidated by a Montana federal judge in November. That ruling is being appealed in the 9th Circuit. Separately, an injunction issued in December halted most pre-construction activities.
A White House spokesperson told The Hill that the new permit "dispels any uncertainty."
The move has already generated at least one lawsuit.
Rebecca Beitsch contributed.
Mjösa Tower, the world's tallest wooden building, under construction in Brumunddal, Norway.
Mass timber construction is on the rise, with advocates saying it could revolutionize the building industry and be part of a climate change solution. But some are questioning whether the logging and manufacturing required to produce the new material outweigh any benefits.
By Jim Robbins • April 9, 2019
The eight-story Carbon 12 building in Portland, Oregon is the tallest commercial structure in the United States to be built from something called mass timber.
If the many fervent boosters of this new construction material are right, however, it is only one of the first of many mass timber buildings, the beginning of a construction revolution. “The design community in Portland is enthralled with the material,” said Emily Dawson, an architect at Kaiser + Path, the locally based firm that designed Carbon 12.
The move to mass timber is even farther along in Europe. That’s because mass timber – large structural panels, posts, and beams glued under pressure or nailed together in layers, with the wood’s grain stacked perpendicular for extra strength – is not only prized as an innovative building material, superior to concrete and steel in many ways; it is also hoped that it will come into its own as a significant part of a climate change solution.
Among architects, manufacturers, and environmentalists, many want nothing less than to turn the coming decades of global commercial construction from a giant source of carbon emissions into a giant carbon sink by replacing concrete and steel construction with mass timber. That, they say, would avoid the CO2 generated in the production of those building materials and sequester massive amounts of carbon by tying up the wood in buildings for decades or even longer, perhaps in perpetuity.
“Say the typical steel and concrete building has an emissions profile of 2,000 metric tons of CO2,” said Andrew Ruff, of Connecticut-based Gray Organschi Architecture, a leading proponent of the laminated wood revolution. “With mass timber you can easily invert so you are sequestering 2,000 tons of CO2. Instead of adding to climate change you are mitigating climate change. That’s the goal.”
And it is taking off. Mass timber has a two-decade track record in Europe. The 18-story Mjösa Tower just opened last month in Norway. An 18-story mass timber building was recently built in Vancouver as well, and an 80-story high-rise is proposed for Chicago. There are new commercial mass timber buildings in London, Atlanta, and Minneapolis. Some 21 timber buildings over 50 meters (164 feet) tall will be completed in Europe by the end of the year, according to one report.
But there are big questions being asked about just how sustainable the new building material is – especially about how forests that produce mass timber are managed, and how much CO2 would be emitted in the logging, manufacture, and transport of the wood products used in the construction. So far, critics say, there aren’t good answers to these questions.
Carbon 12 in Portland, Oregon is the tallest building in the United States made with mass timber.
“We want to debunk the myth that mass timber is in any way, shape, or form related to some kind of environmental benefit,” said John Talberth, president of the Center for Sustainable Economy, which is based near Portland. “That’s simply not true.”
Yet proponents say mass timber does have real promise as a way to sequester massive amounts of CO2, if a fully sustainable life cycle comes together. “We are working with a large interdisciplinary team of climate scientists, carbon cycle researchers, metallurgists, and foresters to really understand the potential climate impacts of mass timber at scale,” said Ruff.
A lack of understanding of the full CO2 picture has not kept the field from taking off. The burgeoning demand for mass timber posts and beams has seen sawmills open in the timber towns of the U.S. Northwest and loggers go back to work to harvest the pine, fir, and spruce used in the manufacture. The first certified U.S. producer of mass timber – also known as cross-laminated timber – opened in Riddle, Oregon in 2015. Other producers have either recently opened or soon will. Analysts call it a revolution in building and the next great disruption of the construction industry, for a number of reasons that have nothing to do with the environmental aspects.
“Because its components are fabricated off-site to [precise specifications], it goes together really fast on site,” said Dawson. “So you can cut months off the construction time. It’s more predictable than concrete. You can work through cold weather and don’t have to worry about the temperature tolerances of concrete. It’s also a lot quieter than other kinds of construction, so you can be a good neighbor.” It’s stronger than steel, lighter, and, surprisingly, may be just as fire-resistant.
Architects say the exposed wood interiors in these buildings are warmer than other materials and far more aesthetically pleasing. Michael Green, who builds mass timber structures in British Columbia, said some people walk into buildings he has designed and want to hug the wooden interiors. The dense laminated beams also hold up well to fire, unlike other kinds of wood construction.
Mass timber can be cheaper than concrete and steel, depending on where it is sourced. And when production is scaled up across the globe, experts say, mass timber should be considerably cheaper.
The possible prodigious climate benefits, though, are what has many people taking mass timber seriously. These benefits come from two big facts about commercial construction. First, the building industry accounts for about 40 percent or more of global CO2 emissions. Second, the manufacture of concrete and the manufacture of steel each contribute about 5 percent of global emissions.
Using mass timber for commercial construction could greatly change that equation. But there are key questions about the life cycle of mass timber, and some say the industry doesn’t have enough data yet to back up its claim that it is a major climate change solution.
Glue is applied to create cross-laminated timber at the D.R. Johnson Lumber Company in Riddle, Oregon, one of the first mass timber-certified manufacturers in the U.S.
After the building has run its course, the beams would need to be stored without decomposing or re-used without releasing the CO2 in order to make the carbon equation work. And there are numerous unknowns about how much CO2 would be expelled in the logging, manufacturing, and transport of mass timber products. The forest products industry is already the largest source of CO2 emissions in Oregon because of fuel burned by logging equipment and hauling trucks, the burning of wood, and the decomposition of trees after they are cut.
Beverly Law, a professor of global change biology and terrestrial systems science at Oregon State University who headed up the Oregon forest study, says there hasn’t been a thorough analysis of carbon emitted by mass timber production because it is enormously complex to track the factors that produce CO2 in forest ecosystems and in production. Some of the data needed, she said, is incomplete or absent. It took her team of researchers more than a decade of analysis to figure out that the Oregon wood products industry was the largest emitter of CO2 in the state, Law said.
“We looked at long- and short-term products, what mills burn for heat, fuel burned for harvesting, transporting from forest to mills to end use, and emissions along the way,” she said. Another major issue is how long the wood will be in use, which is not yet known. In addition, Law said, any analysis of CO2 must account for how much the forest is taking up before and after logging, “and a lot of people don’t pay attention to that part of it. We just don’t have the information to run this through a life cycle assessment.”
The forestry part is what has some skeptical of how ecologically sound mass timber is and, if and when it’s scaled up, whether it will truly provide a planetary climate solution. In a letter to the city of Portland last year, representatives of Oregon environmental groups — including the Audubon Society, the Sierra Club, and Oregon Physicians for Social Responsibility — raised serious doubts about mass timber as a green climate solution and questioned the city’s plan to use it.
First and foremost, they said, is the need to certify that wood is logged sustainably and certified as such. “Without such a requirement,” the letter stated, the city “may be encouraging the already rampant clear-cutting of Oregon’s forests… In fact, because it can utilize smaller material than traditional timber construction, it may provide a perverse incentive to shorten logging rotations and more aggressively clear-cut.”
Such industrial-type forestry — large-scale plantings of trees selected to grow fast — creates a “biological desert,” said Talberth, of the Center for Sustainable Economy. “And it’s driving the extinction of thousands of species. Mass timber is mass extinction.”
“We must ensure that mass timber drives sustainable forestry management, otherwise all of these benefits are lost,” agreed Mark Wishnie, director of forestry and wood products at The Nature Conservancy. “To really understand the potential impact of the increased use of mass timber on climate we need to conduct a much more detailed set of analyses.”
Wishnie said The Nature Conservancy, the U.S. Forest Service, and a dozen universities and other research institutions are launching a new analysis of mass timber.
At the same time, he said, “there is enough data to say the [CO2] savings are significant.” He said the substitution of concrete and steel with wood and the long-term carbon storage in mass timber buildings make up about 75 percent of the total benefit, and the forestry end, if executed sustainably, about 25 percent.
While there is disagreement on many points, making the mass timber movement work, proponents say, is essential. “If you look 30 years down the road to 2050, we’re projected to have 2.3 billion new urban dwellers,” said Ruff. “That is a huge amount of construction. Every day that goes by that we don’t convert from mineral-based extractive construction techniques to carbon sequestering building systems, we tend to dig ourselves further in a hole.
“So,” he added, “the question is, how can we grow this fast enough to be a solution for climate change?”
Jim Robbins is a veteran journalist based in Helena, Montana. He has written for the New York Times, Conde Nast Traveler, and numerous other publications. His latest book is The Wonder of Birds: What They Tell Us About the World, Ourselves and a Better Future.
Slow progress on powering firm’s datacentres using renewables raises questions
Tue 9 Apr 2019 09.00 EDT
Last modified on Wed 10 Apr 2019 04.54 EDT
Amazon was accused of focusing its attention on winning business from the oil and gas industry
Amazon has been accused of abandoning a much-publicised goal of running its datacentres on 100% renewable energy – instead focusing its attention on winning business from the oil and gas industry.
According to a Greenpeace report released earlier this year, some of Amazon’s most important datacentres in the US state of Virginia, where the company has committed to building its second HQ, are powered by only 12% renewable energy. Across the company as a whole, Amazon reached 50% renewable usage in 2018, and has not issued any updates since.
This week, a report from the tech news site Gizmodo suggested one reason for the slowdown was Amazon’s increasing focus on bringing on board large oil and gas companies as Amazon Web Service customers.
The figures represent slow progress towards the goal, first announced in 2014, to power the entire company using renewables, and have led some to accuse Amazon of abandoning the goal entirely.
Alongside the organisation’s report, Greenpeace’s Elizabeth Jardim said: “Despite Amazon’s public commitment to renewable energy, the world’s largest cloud computing company is hoping no one will notice that it’s still powering its corner of the internet with dirty energy.
“Unless Amazon and other cloud giants in Virginia change course, our growing use of the internet could lead to more pipelines, more pollution and more problems for our climate.”
Gizmodo’s report cited Andrew Jassy, the AWS chief executive, who told an oil and gas conference in Houston last month: “A lot of the things that we have built and released recently have been very much informed by conversations with our oil and gas customers and partners.”
Gizmodo contrasted his statement with another, reported in December, from the AWS executive Peter DeSantis, who “told colleagues inside the company that renewable energy projects are too costly and don’t help it win business”.
Amazon’s renewables record is in stark contrast to some of its competitors, most notably Google, which reported success in reaching 100% renewables use in 2017. “Our engineers have spent years perfecting Google’s datacentres, making them 50% more energy-efficient than the industry average,” the company’s head of technical infrastructure, Urs Hölzle, said at the time.
“But we still need a lot of energy to process trillions of Google searches every year, play more than 400 hours of YouTube videos uploaded every minute and power the products and services that our users depend on. That’s why we began purchasing renewable energy – to reduce our carbon footprint and address climate change. But it also makes business sense.”
A year later, Apple declared its “retail stores, offices, datacentres and co-located facilities in 43 countries” were powered by 100% clean energy. Facebook has committed to do the same by 2020.
In a statement issued after this story was published, Amazon said: “Greenpeace has chosen to report inaccurate data about the energy consumption and renewable mix of AWS’s infrastructure and did not perform proper due-diligence by fact checking with AWS before publishing. Greenpeace’s estimates overstate both AWS’s current and projected energy usage.
“As of December 2018, Amazon and AWS have invested in 53 renewable energy projects (6 of which are in Virginia), totaling over 1,016 megawatts (MW) and expected to deliver over 3,075,636 megawatt hours (MWh) of energy annually.
“AWS remains firmly committed to achieving 100% renewable energy across our global network, achieving 50% renewable energy in 2018. We have a lot of exciting initiatives planned for 2019 as we work towards our goal and are nowhere near done.”
By Kate Linthicum
Apr 09, 2019
Ejido La Mesa, Mexico
As climate change threatens the habitats of migrating monarch butterflies, citizens and scientists in Mexico are taking a novel approach: planting new forests at higher altitudes.
As a boy, Francisco Ramirez Cruz loved hiking with his grandfather up into the mountains of central Mexico. While the old man grazed sheep or hunted for wild mushrooms, Ramirez would play amid the throngs of monarch butterflies that migrated 3,000 miles to this forest each autumn, turning the blue sky into a sea of orange.
Ramirez is 75 now, himself a great-grandfather, and each winter he still goes looking for butterflies. But these days, he might spend hours searching the forest without catching sight of a single one.
The world is losing monarch butterflies at a startling rate, as logging, herbicides and other human activities destroy natural habitats. But the biggest threat yet has only recently come into focus. Climate change, with its extreme storms, prolonged droughts and warming temperatures, is poised to eradicate the forest that serves as the butterfly’s winter refuge.
To help his beloved butterflies, Ramirez has partnered with scientists on a monumental experiment: They are trying to move an entire forest 1,000 feet up a mountain.
On one of the scientists’ early scouting trips to the region several years ago, locals suggested they meet Ramirez, a respected farmer with graying sideburns and a thin mustache who lives on a windswept hillside in Ejido La Mesa, a community that overlaps with the Monarch Butterfly Biosphere Reserve — a national park 70 miles west of Mexico City.
Known by the honorific “Don Pancho,” Ramirez is a former elected leader of La Mesa and is hailed as a hero for helping to bring electricity to the area in the late 1980s. Most importantly, he knows the forest intimately, thanks to the time he spent there, starting in the 1950s.
Ramirez has seen the effects of climate change firsthand — parched fields in the winter, violent thunderstorms in the summer — and felt a calling to protect the butterflies, whose annual arrival and departure have long helped the community mark the passage of time.
Each fall, when the butterflies arrive as if by magic from Canada and the eastern United States, gliding by the millions down over the rolling hills of La Mesa, locals stop what they are doing and look up to admire them. They do the same each spring, when the butterflies depart.
Francisco Ramirez Cruz with an empty water collector on his family's ranch in Ejido La Mesa, Mexico. Climate change has meant the wet season is shorter and more brutal than in previous years, with a drier season that is even more dry. Brian van der Brug / Los Angeles Times
“In the early days, we didn’t know where they came from,” said Ramirez, who talks the way he moves — slowly and deliberately. “But we have always been so happy to see them.”
He believes that protecting the butterfly and its habitat will also help protect his pueblo, which depends on timber from the forest and the tourists that flock to the region to see the monarchs.
He agrees with Cuauhtemoc Saenz-Romero, a forest geneticist who has hired Ramirez to help with the project, that it is necessary to create an ecosystem where the butterflies will be able to survive.
“It’s an idea that may sound radical,” said Saenz-Romero. “But by the end of the century it may be absolutely needed.”
The butterflies that winter here, known as eastern monarchs, seek sanctuary in the oyamel firs that tower in and around the reserve.
The trees, known as “sacred firs” because their conical shape calls to mind hands clasped in prayer, offer a dense canopy that acts as an umbrella for the butterflies that cluster by the thousands on their trunks and branches. The oyamel protects the butterflies from chilly winter rains and creates a microclimate cold enough to keep the butterflies in a state of hibernation but not so frosty as to kill them.
Cuauhtemoc Saenz-Romero, top, is a forest geneticist trying to move an entire forest 1,000 feet up a mountain to help preserve the habitat of monarch butterflies that gather each year in Mexico's Monarch Butterfly Biosphere Reserve, above. (Brian van der Brug / Los Angeles Times)
Scientists fear that climate change may kill off these firs altogether. A 2012 research paper coauthored by Saenz-Romero and published in the journal Forest Ecology and Management found that the area suitable for the oyamel is likely to diminish by 96% by 2090, and disappear completely within the reserve.
The region is warming at such an accelerated pace that the trees won’t be able to adapt, scientists say, and will need help migrating to areas where the climate is predicted to be suitable for them in future years.
Over the last several years, the team of researchers has overseen the relocation of about 1,000 fir saplings that were growing at lower altitudes up to higher — and cooler — elevations.
“We have to assist them,” said Saenz-Romero, a professor at the Michoacan University of St. Nicholas of Hidalgo. “They cannot do this themselves.”
Ramirez and the scientists have become fast friends. When he helped them search for the right location to plant the first round of saplings, he insisted that the researchers stay on his farm, which offers modest accommodations (the toilet is flushed with a bucket) but heart-stopping vistas of snow-covered peaks.
In the middle of Ramirez’s orchard of apple and plum trees, the scientists helped him construct a small greenhouse, where he tends to several dozen saplings that were taken from the forest and will eventually be replanted. On a recent blustery morning, he watered the trees with a hose while listening to the sentimental ranchera songs drifting from the house of a nearby neighbor. Then he slowly folded himself into the passenger seat of Saenz-Romero’s silver pickup truck, and they headed up the mountain to check on the trees that had already been planted.
They made an unlikely pair — the farmer in a straw cowboy hat and the city-dwelling scientist in a jaunty black beret — but on the 45-minute drive on a rutted dirt road, Ramirez and Saenz-Romero made easy conversation. They talked about a mudslide years earlier that had killed several people and taken down many fir trees, and they discussed the rain (there hadn’t been any recently).
At one point, Ramirez gestured with a roughened hand toward a vast pine tree at a bend in the road; he remembered it from his youth. Saenz-Romero slowed the car and they paused to admire it.
Up the mountain, they parked and hiked a few minutes to the first of two planting sites, a clearing that had been created by a forest fire three decades before. Rows of small trees were growing, each marked by a pink plastic spoon stuck in the dirt. In the 3 ½ years since the scientists hired locals to help plant these saplings, the trees have grown from 7 inches to a stately 4 feet. As Ramirez showed off the plants, he grinned proudly. “Beautiful,” he said.
Ramirez and the scientists are learning what the trees need to survive. They have found, for example, that trees are more likely to live if planted under the shade of nearby “nurse” plants. They hope to expand the project and establish the trees at even higher altitudes on other nearby mountains — seeding ecosystems now that monarchs could potentially use later if temperatures continue to rise.
So-called assisted migration is happening in other parts of the world, including Canada, where commercial forestry operations have begun replacing dead lodgepole pines with a species of larch that grows at lower and drier elevations. Plant moving is controversial among some scientists, who warn of unintended consequences when humans intervene in nature’s course, but it is increasingly seen as a needed response to a rapidly changing climate.
“There's no doubt that things are going to change,” said Chip Taylor, a retired ecology professor in Kansas and the director of Monarch Watch, which runs a butterfly tagging program. The experiment in Mexico, if successful, is a short-term solution, he said. But it’s an important one.
“What these measures do is give us time to address climate change,” he said. “If we don't do something eventually about CO2, eventually the new trees will be pushed off the mountain too.”
La Mesa and the surrounding areas haven’t contributed many of the greenhouse gases that are transforming the climate. In fact, in many ways the region feels stuck in a previous century, with dusty burros that trudge up the pocked two-lane highway, firewood piled on their backs, and shepherds reclining in green pastures, one eye on their flocks. Ramirez’s wife, Petra, whom he met half a century ago at a Day of the Dead celebration in a nearby town, still cooks her chicken mole in a pan over an open fire.
But La Mesa is clearly suffering the effects of an altered climate.
When Ramirez was growing up, it used to rain from June to November. Now the rains arrive in July and end in October. Not only is the wet season shorter, but it is also more brutal; the drier season is even more dry.
Over the last couple of decades, the area’s agriculture industry suffered, partly because of warming and partly because of competition brought about by the North American Free Trade Agreement. Family-run corn and potato farms, which used to blanket the hillsides, couldn’t compete with U.S. agribusiness. Ramirez stopped planting corn because it became cheaper to buy it. For many years, he lived in Mexico City and worked in construction.
As jobs dried up, many locals had to migrate, just like the monarchs do. That included two of Ramirez’s sons, who toiled for a time in the United States, and several other family members who still live there. In a small chapel his father built on the farm, Ramirez and his wife keep fresh flowers and a framed picture of Toribio Romo Gonzalez, a Catholic saint known as a protector of immigrants. The money sent back from north of the border has funded construction of opulent — if architecturally discordant — homes in La Mesa, their archways and turrets standing out amid the typical adobe and concrete shacks.
Ramirez has helped win support for the tree-moving project from community members, some of whom tend to be suspicious of the scientists, journalists and tourists who have converged on the region since the 1980s.
Although tourism has provided some jobs for La Mesa, the community sees fewer visitors than nearby towns that have done more to promote themselves as easy access points to the reserve. Some locals grumble about the reserve itself, which sets limits on how much wood residents can chop from the forest.
Scientists predict that it will be many years before the nascent tree-moving experiment has any effect on the population of butterflies in the region.
Monarchs are measured by the total area in which they are observed. In January, Mexican officials said butterflies had been spotted in a region 144% larger than last year. But many researchers have pointed out that hibernating colonies are less dense than usual, and have cautioned that any increase is probably due to favorable weather conditions in recent months, not conservation efforts. In the longer term, the news has been dire.
In the last two decades, the population of monarchs overwintering in Mexico has declined from an estimated 1 billion to fewer than 100 million. If they go extinct, it would mark the loss of one of the world’s few migratory insects. In the United States, conservationists have petitioned to have the monarch protected under the Endangered Species Act; a ruling is expected this year.
After checking on the trees, Ramirez went off to look for the butterflies. Following the advice of a friend who had heard where they had last been spotted, he trudged down a needle-covered path, leaning on a machete as a cane.
Firs loomed around him, ringed by pastel-colored flowers. He stopped to pluck a pink thistle and smiled when he tasted its petals. Eating wild plants reminded him of his youth, and the quesadillas his grandmother would cook with the mushrooms, potatoes and leafy herbs his grandfather found in this forest.
He spotted a few butterflies flitting in the sun, then a few more. He followed them. Then he saw it: a colony of tens of thousands of butterflies, some fluttering about and others resting in massive clumps on the firs, their bodies obscuring the branches and trunks.
There weren’t as many as he remembered from his childhood, but it was still a sight to behold. Ramirez eased himself down to sit on the soft forest floor and watched in silence. The only sound was the whistle of the wind and the gentle flapping of their wings.
BY ADELE PETERS
After the Paris climate agreement in late 2015, J.P. Morgan Chase CEO Jamie Dimon spoke publicly in support of the agreement, which calls for finance flows to be “consistent with a pathway toward low greenhouse gas emissions.” But despite his rhetoric, between 2016 and 2018 his bank ramped up funding for fossil fuels, pouring $196 billion into financing coal, Arctic oil and gas, fracking, tar sands, and other fossil fuel projects. If you bank at Chase, your money might have helped fund drilling in the Amazon rainforest.
In total, according to a new report from a group of environmental nonprofits, the 33 largest global banks collectively provided $1.9 trillion in financing for fossil fuels. Of that, $600 billion went to 100 companies that are aggressively expanding fossil fuel projects at a time when climate scientists say that the world needs to rapidly transition to renewable energy.
“That expansion figure is particularly worrying for us,” says Alison Kirsch, lead researcher for the climate and energy program at the Rainforest Action Network, one of the nonprofits behind the report. “Previous analysis has shown that the potential carbon emissions from fossil fuel reserves already in production would take us beyond two degrees of warming, let alone 1.5 degrees.”
Some large banks have started to restrict fossil investments. HSBC, for example, announced in 2018 that it would stop funding new Arctic drilling, tar sands, and coal plants–though then it made an exception for new coal plants in Bangladesh, Vietnam, and Indonesia. Other policies are similarly incomplete; some restrict funding specific projects but not fossil fuel companies in general.
Chase is not alone in its support of fossil fuels, though it provides 29% more financing than Wells Fargo, which is in second place. Citi and Bank of America round out the top four banks financing fossil fuels in the world. (When asked for comment, a Chase spokesperson pointed to the company's commitment to move its own electricity use to renewable energy, as well as its $200 billion commitment to “clean” financing that advances sustainability.)
On March 18, Wells Fargo and Goldman Sachs, both of which invest heavily in fossil fuels, filed motions in court to exclude shareholder resolutions that had asked them to reduce their carbon footprints in line with the Paris agreement. The SEC recently granted both banks permission to prevent shareholders from sharing these resolutions, claiming that the proposals “micromanage” the companies.
For consumers who want to take their money elsewhere, some banks, like Aspiration, offer fossil-fuel-free banking. Kirsch argues that consumer pressure–along with pressure from investors, some of whom have concerns about the long-term viability of oil and gas companies–could lead large banks to change their policies. “We’ve seen success with consumer pressure in the past,” she says. “We saw that with banks on coal.”
Exclusive: Environment Agency chief calls for use to be cut by a third
Damian Carrington Environment editor
Mon 18 Mar 2019 18.00 EDT
Last modified on Tue 19 Mar 2019 06.02 EDT
The UK’s population is expected to rise from 67 million to 75 million by 2050, increasing the demand for water.
England is set to run short of water within 25 years, the chief executive of the Environment Agency has warned.
The country is facing the “jaws of death”, Sir James Bevan said, at the point where water demand from the country’s rising population surpasses the falling supply resulting from climate change.
However, this could be avoided with ambitious action to cut people’s water use by a third and leakage from water company pipes by 50%, he says, along with big new reservoirs, more desalination plants and transfers of water across the country.
“Around 25 years from now, where those [demand and supply] lines cross is known by some as the ‘jaws of death’ – the point at which we will not have enough water to supply our needs, unless we take action to change things,” Bevan told the Guardian, before a speech on Tuesday at the Waterwise conference in London.
“We need water wastage to be as socially unacceptable as blowing smoke in the face of a baby or throwing your plastic bags into the sea,” he said.
In the speech, Bevan says: “Water companies all identify the same thing as their biggest operating risk: climate change.” By 2040, more than half of our summers are expected to be hotter than the 2003 heatwave, he says, leading to more water shortages and potentially 50-80% less water in some rivers in the summer.
The population of the UK is expected to rise from 67 million to 75 million in 2050, increasing the demand for water. But Bevan says the average person’s daily water use of 140 litres could be cut to 100 litres in 20 years by more efficient use in homes and gardens. Currently, about a third of water is lost to leaks or wastage.
The most controversial change needed to increase supply is building new mega-reservoirs, such as that proposed near Abingdon in Oxfordshire. “We have not built a new reservoir in the UK for decades, largely because clearing all the planning and legal hurdles necessary is so difficult and local opposition so fierce,” Bevan says. The government plans to streamline the planning process. “That will be controversial, but it’s the right thing to do,” says Bevan.
The ruins of Derwent Hall are exposed by low water levels in Ladybower reservoir in Sheffield, England. Photograph: Anthony Devlin/Getty Images
More water will also need to be transferred across the country to water-stressed areas, such as the south-east, Bevan says, via pipelines or canals. Just 4% of current supplies are transferred between individual water companies, but there are plans for 20 new transfer projects. More desalination plants, such as Thames Water’s Beckton plant, will also be needed to turn seawater into drinking water, he says.
“While there will be political challenges, there should be less difficulty over the economics,” Bevan says. “That’s because the investment needed to increase our resilience is modest compared with the cost of not doing it. While a severe drought would cost each household more than £100, the cost per household of the investment that would greatly reduce the risk is only £4 a year.”
Water companies are required by the regulator Ofwat to cut leakage by 15% by 2020, although some have incurred huge fines in the past for failing to meet targets.
Michael Roberts, the chief executive of Water UK, which represents the water companies, said: “As well as planning increasing investment, water companies have publicly committed to cut leakage by 50% by 2050.”
“We’re also working with government and regulators to find ways to make it easier for people to reduce their daily water use, and if we all work together on this we can make sure the country continues to get the water it needs,” said Roberts. “A twin-track approach is the right way to go, reducing demand for water at the same time as increasing supply.”
It is also vital that wildlife and natural habitats are protected from excessive water abstraction, said Tom Lancaster, at the Royal Society for the Protection of Birds (RSPB), which is part of the Blueprint for Water coalition of 18 NGOs.
“Government proposals to reform water abstraction and improve water management are necessary if we are to balance the needs of people and the natural environment,” he said. “This should start with government placing a duty on water companies to restore and enhance nature.”
Used cans are piling up at scrapyards because U.S. aluminum companies are turning fewer of them into new metal, another indication of the economic challenges facing recycling.
Arconic Inc. and other aluminum rollers are producing less sheet for beverage cans and more higher-margin, flat-rolled aluminum for automotive and industrial components. Prices for used aluminum cans in the U.S. have fallen about 30% since last summer. Old cans are less versatile than other scrap. The makers of airplane and car parts prefer not to use aluminum made from recycled cans. More new cans in the U.S. are made from imported aluminum.
“We’d prefer to purchase domestic can sheet, but as of right now there is not enough to supply the domestic market,” said Jamie Westfahl, senior director of global packaging procurement for Denver-based brewer Molson Coors Brewing Co.
Producing aluminum for cans isn’t as profitable as rolling sheet for car companies. Aluminum rolling mills are paid about $1 a pound above the market price for the raw-aluminum ingots they use to make auto-body sheet, compared with about 35 cents a pound for converting can sheet.
The challenging economics is a troubling sign for food and packaging companies that are facing pressure to embrace recycling. The glut of used cans shows how public calls for using more recyclable materials can fall short if companies decide it isn’t profitable enough to remake them into new products.
Other recycled materials are facing similar problems. Scrap paper and plastic prices have collapsed since China imposed higher standards on the purity of those products imported from the U.S. China implemented tariffs of 50% last year on aluminum scrap from the U.S. That has created a glut of shredded scrap from junked cars in the U.S. to mix with the growing stockpile of discarded cans.
Atlanta-based Novelis Inc. has shifted some production in recent years from cans to making more aluminum sheet for vehicle bodies. The company opened new lines for auto sheet at a plant in Oswego, N.Y., and is building a plant to make automotive aluminum in Guthrie, Ky.
“We’ve done it. Our competitors have done it,” Novelis Vice President Andy King said. The company also recently increased production from its remaining can-sheet lines as demand for cans improves.
Arconic is investing $100 million at one of its plants to shift production from can sheet to automotive and industrial aluminum. The company stopped making can sheet at the end of last year at the plant near Knoxville, Tenn., that accounted for 14% of the aluminum used in beverage-can bodies and was a major consumer of discarded beverage cans.
Alcoa Corp. is bucking the trend, keeping its rolling mill in southern Indiana committed to just making can sheet. The company has increased production about 20% in the past two years. While making metal for cans isn’t as profitable as producing aluminum for auto bodies, can sheet has become more profitable recently because falling prices for used cans have reduced producers’ scrap costs and widened their margins.
“It’s a good market to be in,” said Tim Reyes, president of Alcoa’s aluminum business.
Aluminum cans have been the most recycled packaging in the U.S. since they supplanted steel as the beverage container of choice in the 1970s. Aluminum can be repeatedly melted and rerolled into paper-thin sheets. About 70% of the aluminum in the 94 billion beverage cans made for the U.S. and Canada last year came from scrap, according to the Can Manufacturers Institute trade group.
But can-sheet production in the U.S. fell 10% between 2011 and 2018 to 1.8 million metric tons annually, according to industry groups. Market consulting firm Harbor Aluminum Intelligence Unit LLC expects annual domestic capacity to make can sheet to fall to 1.73 million metric tons by 2020, down 30% from 2010.
The hole left in the U.S. market is being filled by imports. Can-sheet imports have increased more than 200% since 2013, based on U.S. Census Bureau data. About 70% of imports last year came from China despite the 10% tariff the Trump administration levied on imported aluminum last March. The administration also has granted exemptions on 362,000 metric tons of imported can sheet, most of it from Saudi Arabia.
Can manufacturers Ball Corp. and Metal Container Corp., a unit of beer maker Anheuser-Busch InBev SA, have asked the Commerce Department to exempt about 64,000 metric tons of Chinese can sheet from the tariff. Their requests are pending.
Beverage companies say can-sheet manufacturers have raised prices to reflect the tariff and lower U.S. production. Kelly Clay, chief executive of Wyoming-based Admiral Beverage Corp., said his costs for cans from Crown Holdings Inc. and Ball have increased 15% since the tariff took effect. That obliged him to raise prices on the drinks he bottles and distributes for PepsiCo Inc. in seven Western states by about 15% as well, to $3.35 for 12 cans of soda.
“I don’t know anybody in this industry that is getting any of these tariff exemptions off their can price,” he said.
Jeremy P. Jacobs, E&E News reporter Greenwire: Wednesday, March 20, 2019
The Colorado River's seven states yesterday inked agreements for managing the river in the face of climate change and increasing droughts.
Bureau of Reclamation Commissioner Brenda Burman called the day "historic" as the states met her deadline for the drought contingency plan, or DCP.
The voluntary cutbacks mean Reclamation will not move ahead with the federally imposed restrictions on the river, which provides water to 40 million people and millions of acres of farmland.
"I am pleased to see their hard work is done," Burman said on a conference call with reporters.
The DCP has been years in the making and is intended to make the river's water supply more resilient through conservation and other measures. Despite a wet winter, the river has suffered through an unprecedented drought that has lasted nearly two decades.
"When you are looking at the possibility of a crisis on the Colorado River, we need to be able to move forward," said John Entsminger, general manager of the Southern Nevada Water Authority. "To make sure this river stays safe and sustainable for everyone."
The plan updates 2007 guidelines and outlines cutbacks that states and Mexico will take in order to maintain water levels on the river's two primary reservoirs, Lake Powell on the Utah-Arizona border and Lake Mead on the Nevada-Arizona border. They serve as important water sources and are buffers for the states in drought years.
It also provides incentives for further conservation and would run through 2026.
But the DCP and yesterday's signing ceremony of the seven basin states — Arizona, California, Colorado, Nevada, New Mexico, Utah and Wyoming — were not without controversy. The largest single user of Colorado River water, Southern California's Imperial Irrigation District, refused to sign on.
IID wants $200 million in federal funds for environmental mitigation at the Salton Sea, an emerging environmental and public health disaster due to a 2003 farm-to-city water transfer that has reduced inflows and caused it to recede. Its exposed lake bed emits toxic dust during the area's frequent windstorms, posing a health risk to nearby disadvantaged communities already suffering from elevated asthma rates (Greenwire, June 20, 2016).
The district was effectively written out of the plan when the Metropolitan Water District of Southern California, the country's largest water retailer, voted to cover its DCP water obligations if a shortage is declared.
IID board director James Hanks lambasted the signing ceremony yesterday, noting that five of his six grandchildren have asthma.
"As we gather here today on the shore of the Salton Sea, strewn with bleached bones, bird carcasses and a growing shoreline," Hanks said at a State Water Resources Control Board hearing, according to media reports, "and as Champagne is being prepared for debauched self-congratulation in Phoenix, remember this: The IID is the elephant in the room on the Colorado River as we move forward. And like the elephant, our memory and rage is long."
Peter Nelson, chairman of the Colorado River Board of California, said he regretted IID not being on board.
"In California, we would much rather move forward together as a full state," he said. "IID not joining in is a disappointment."
Burman also said Reclamation supports IID's efforts to secure funding for the Salton Sea through programs in last year's farm bill.
Entsminger, of Nevada, was more critical.
He said that "not 1 gallon less water will go into the Salton Sea as a result of the implementation of the DCP."
Entsminger's argument was echoed by some environmental groups during the DCP negotiation process.
But IID has disputed it. The district argues that there is an extreme drought scenario where Metropolitan would come to the Imperial Valley to purchase more water to meet its DCP obligations. That, the district contends, would further reduce inflows into the lake.
The district's concerns have also fueled debate about proposed legislation to authorize the DCP. IID and some environmental groups say it contains language that would exempt the DCP from environmental reviews (Greenwire, March 13).
The legislation is critical. Without Congress passing it, Arizona can't join the DCP.
Entsminger yesterday sought to refute those claims, saying it was "always our idea, from day one, to make this work within existing environmental compliance."
But former California Sen. Barbara Boxer (D) wasn't convinced.
Boxer worked on Salton Sea issues during her tenure in the Senate.
She told E&E News the language in the legislation reminds her of anti-environmental riders she encountered as chairwoman and ranking member of the Senate Environment and Public Works Committee.
"There are times when a waiver makes sense. In this case, it's a disaster," said Boxer, who retired in 2016 after more than 20 years in the Senate. "This is a power play to be able to routinely use a waiver when it shouldn't be used."
She added: "You are trying to make life better for people by making sure we don't have a drought, but if the air is filthy and there is no protection, we are going to have a lot of sick people."
Two congressional hearings are scheduled on the DCP next week.
A new analysis finds that the environmental cost of raising cattle is very, very high.
BY ADELE PETERS
The newest version of the Impossible Burger–the plant-based meat that uses food science to replicate the taste and feel of beef–has a carbon footprint 89% smaller than a burger made from a cow.
A new analysis found that the burger also uses 87% less water than beef, uses 96% less land, and cuts water contamination by 92%. Those numbers are improvements over the last iteration of the burger, partly because the company has become more efficient as it has grown, and partly because it switched from wheat to soy as a key ingredient; soy also yields more per acre. But the majority of the impact comes simply from the fact that the product isn’t made from an animal.
“The best, fastest, easiest way to make meat more sustainable is to avoid the cow,” says Rebekah Moses, senior manager of impact strategy at Impossible Foods. “By making the Impossible Burger directly from plants, we have the luxury of bypassing the most inefficient stage in the entire food system.” Cows are known for their greenhouse gas-producing burps–the largest source of methane emissions in agriculture–but they also require cattle feed that takes large amounts of land, water, and fertilizer to grow, and that often leads to deforestation. Cow manure is another major source of pollution.
The life-cycle analysis, which was verified by the sustainability consulting group Quantis, looked at each part of the plant-based burger’s production, from the water and energy used to produce heme, the ingredient that gives the burger its blood-like taste, to the resources used to grow other ingredients like soy and potatoes, and produce the packaging. The product uses 4% of the land needed to produce beef. “That’s a very, very conservative estimate on our part–most cattle globally require far more land than that estimate,” Moses says. “It’s completely inefficient, and it’s why beef is the leading cause of deforestation in the Amazon. If most of the land that’s used for cattle feed were to be left alone, without the gassy animals, to re-vegetate and actually store carbon in trees and grasslands, it’s not an exaggeration to say that we could set the clock back on climate change through food choice alone.”
For an individual, the company calculated, swapping Impossible “meat” for a pound of ground beef saves seven pounds of greenhouse gas emissions, 90 gallons of water, and 290 square feet of land. Still, while some consumers might be choosing plant-based meat for environmental reasons, the startup isn’t relying on sustainability to sell the product. “What we really wanted was to create a delicious product that can compete with beef on taste and craveability,” she says. “That’s the primary motivator for most people, and that’s who we want to empower by providing a more planet-friendly option. Sustainability attributes are, for most consumers, a ‘nice to have’ in food choice, rather than the driving force of purchasing.”
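Taken at face value, the company's per-pound figures scale directly with how much beef is swapped out. A minimal sketch of that arithmetic (the function name and the weekly amount are illustrative, not from the company's analysis):

```python
# Per-pound savings reported above: 7 lb CO2e, 90 gal water, 290 sq ft land.
SAVINGS_PER_LB = {"ghg_lb": 7, "water_gal": 90, "land_sqft": 290}

def annual_savings(lbs_per_week):
    """Scale the per-pound figures to a year of weekly swaps."""
    return {k: v * lbs_per_week * 52 for k, v in SAVINGS_PER_LB.items()}

# A quarter-pound burger swapped once a week for a year:
print(annual_savings(0.25))
```

On these reported numbers, even one quarter-pound swap per week adds up to roughly 91 pounds of greenhouse gas emissions avoided over a year.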
ABOUT THE AUTHOR
Adele Peters is a staff writer at Fast Company who focuses on solutions to some of the world's largest problems, from climate change to homelessness. Previously, she worked with GOOD, BioLite, and the Sustainable Products and Solutions program at UC Berkeley.
By Coral Davenport
March 20, 2019
WASHINGTON — The Obama administration violated federal law by failing to adequately take into account the climate change impact of leasing public land for oil and gas drilling in Wyoming, a federal judge ruled Tuesday.
But the decision by the United States District Court for the District of Columbia could also present a legal threat to President Trump’s agenda to quickly expand oil and gas drilling and coal mining across the nation’s public lands and waters. That’s because the decision amounts to a road map that could be used to challenge hundreds of Trump administration leases as well.
However, experts said that, while the decision could lead to legal delays for the drilling expansion envisioned by Mr. Trump by tangling them in litigation, it was unlikely to halt it entirely.
Tuesday’s decision by Judge Rudolph Contreras, which applied specifically to an Obama-era plan by the Interior Department’s Bureau of Land Management to lease several thousand acres of land for drilling in Wyoming, also concluded that the agency was legally required to consider the climate impact of all such lease sales for fossil fuel development.
“This is the first court ruling that specifically tears apart the Interior Department’s failure to take into account the climate change impact of drilling, on a national scale,” said Jeremy Nichols, the climate change and energy program director for WildEarth Guardians.
In his decision, Judge Contreras wrote that, under the National Environmental Policy Act of 1970, federal agencies are required to consider and quantify the effect of the possible planet-warming emissions associated with the fossil fuels to be extracted from the sales of such leases. Already, that law requires the federal government to consider the on-site environmental effects of oil and gas drilling, such as water pollution and the effects on plants and animals of road construction.
“What this decision says is, in evaluating the environmental consequences of the lease, an agency has to look not just at the consequences of the impacts immediately surrounding the lease but also the consequences down the road of burning the fuel once it’s extracted,” said Richard L. Revesz, an expert on environmental law at New York University. “That’s enormously important.”
The Bureau of Land Management protested that it “would be required to identify any past, present, or reasonably foreseeable greenhouse-emitting projects worldwide,” an “impossible” scope of analysis.
Judge Contreras wrote that the agency was correct in that the law “does not require the impossible.” But he wrote, “In short, BLM did not adequately quantify the climate change impacts of oil and gas leasing.”
In general, the Trump administration’s proposed lease sales for oil and gas drilling have included less climate impact analysis than the Obama-era ones that were the focus of Judge Contreras’s decision.
A spokeswoman for the Interior Department declined to respond to questions about the decision, saying the agency did not comment on ongoing litigation. A spokeswoman for the Western Energy Alliance, a coalition of fossil fuel companies that joined with the Interior Department in the case, called the decision “ripe for successful appeal.”
“The judge is asking BLM to take a wild guess at how many wells would be developed on these leases and analyze greenhouse gas impacts for wells that will never be developed,” said the spokeswoman, Kathleen Sgamma.
But other experts said that Judge Contreras’s requirement was not particularly onerous for companies. “It’s not a very time-consuming calculation,” said Mr. Revesz of New York University. “It’s based on models that already exist.”
“The court is saying, you have to go back and take a harder look at the climate impacts,” said Harry Weiss, an expert in natural resources law at the firm Ballard Spahr. “But they’re also saying if they do a heck of a job on figuring it out, then they can go back and drill.”
"The Trump administration is set to close on new loan guarantees for Southern Co.'s Plant Vogtle, the nation's lone nuclear construction project, sources have confirmed to E&E News.
The $3.7 billion in loan guarantees represent a new lifeline for the reactors, which are years behind schedule and billions above their original projected budget. Energy Secretary Rick Perry will formally sign off on the agreement when he visits Vogtle, in southeast Georgia, on Friday.
Perry's visit brings fanfare to the reactors, but the loan guarantees highlight nuclear's chief challenge, which is cost. To be clear, Vogtle is the last project standing after what was supposed to be an industry resurgence that was killed in large part by falling natural gas prices, flat electricity demand and rapidly expanding renewable energy."
Coral Davenport covers energy and environmental policy, with a focus on climate change, from the Washington bureau. She joined The Times in 2013 and previously worked at Congressional Quarterly, Politico and National Journal.
The big Chinese fine chemical company Zhejiang NHU invests €5.5 million in the biotech start-up CysBio to develop new and affordable biochemicals
Technical University of Denmark
Public Release: 22-Mar-2019
CysBio is making cost-effective and novel biochemicals via sustainable fermentation processes. Through advanced synthetic biology and metabolic engineering approaches, microorganisms are engineered to convert simple sugars into desired biochemical products. This makes it possible to produce expensive chemicals at very attractive cost levels and furthermore enables the production of novel or rare biochemicals.
The products will have applications in a variety of areas such as the food, feed, nutrition, pharma, cosmetics, and polymer industries.
"Our technologies enable CysBio to create some of the cheapest biochemicals available, as well as some very exciting new products with novel applications as functional chemical building blocks," says Alex Toftgaard Nielsen, CSO and Co-founder of CysBio and Professor at DTU Biosustain.
Based on scientific research from DTU
CysBio's technologies are patented and based on scientific research conducted at the Novo Nordisk Foundation Center for Biosustainability at the Technical University of Denmark. This research has made it possible to modify and engineer the bacteria's metabolism, thus, making them able to produce specific amino acids and sulfated biochemicals.
"The investment from NHU demonstrates how DTU's innovation eco-system has successfully assisted in translating academic research into a privately funded spin-out company with a strong technology platform," says Marianne Thellersen, Senior Vice President for Innovation and Entrepreneurship at DTU.
Big unexploited potential
The technology platform paves the way for producing compounds used for, e.g., new polymer materials with new functionalities such as conductivity and adhesiveness, products that are in high demand in the chemical industry. One of the other biochemicals that CysBio works with is well known from nature as an alternative to chemically produced pesticides, preventing the growth of bacteria and fungi on surfaces of various kinds. Thus, CysBio expects to produce greener alternatives that can replace the use of harsh chemicals in many different market segments.
“We aim at becoming a leading provider of functional biochemical monomers, and the partnership with a leading company like Zhejiang NHU will allow a fast market introduction of parts of this technology. Our strategy will clearly be to enable our technologies to be exploited quickly and effectively through R&D and commercial partnerships,” says Henrik Meyer, CEO and Co-founder of CysBio.
The Danish biotech company plans to have its first products on the market by the beginning of 2020.
Future perspectives for pharma
Some of the key targets for the proprietary sulfation technology will be methods for improving the solubility and bioavailability of some existing drugs and nutraceuticals, thereby increasing the efficacy of the drugs.
"This will help in solving the problem of developing new pharmaceuticals, so we also see great potential in collaborating closely with partners in the pharma industry," says Alex Toftgaard Nielsen.
Expands portfolio of products
For Zhejiang NHU, a stock-listed Chinese chemical company with global operations and subsidiaries in Europe, the seed investment in CysBio also opens up new opportunities in the market.
"We strongly believe in the future of biotechnology. We have great strategic interests in expanding our biotechnology-based product portfolio by partnering with biotechnology companies. We look forward to assisting CysBio in their growth and further product development," says Mr. Bai Fan Hu, Chairman of Zhejiang NHU.
Taylor & Francis Group
Public Release: 21-Mar-2019
Scientists should stop using the term 'statistically significant' in their research, urges this editorial in a special issue of The American Statistician published today.
The issue, Statistical Inference in the 21st Century: A World Beyond P<0.05, calls for an end to the practice of using a probability value (p-value) of less than 0.05 as strong evidence against a null hypothesis or a value greater than 0.05 as strong evidence favoring a null hypothesis. Instead, p-values should be reported as continuous quantities and described in language stating what the value means in the scientific context.
Containing 43 papers by statisticians from around the world, the special issue is expected to lead to a major rethinking of statistical inference by initiating a process that ultimately moves statistical science - and science itself - into a new age.
In the issue's editorial, Dr. Ronald Wasserstein, Executive Director of the ASA, Dr. Allen Schirm, retired from Mathematica Policy Research, and Professor Nicole Lazar of the University of Georgia said: "Based on our review of the articles in this special issue and the broader literature, we conclude that it is time to stop using the term 'statistically significant' entirely.
"No p-value can reveal the plausibility, presence, truth, or importance of an association or effect. Therefore, a label of statistical significance does not mean or imply that an association or effect is highly probable, real, true, or important. Nor does a label of statistical non-significance lead to the association or effect being improbable, absent, false, or unimportant.
"For the integrity of scientific publishing and research dissemination, therefore, whether a p-value passes any arbitrary threshold should not be considered at all when deciding which results to present or highlight."
Articles in the special issue suggest alternatives and complements to p-values, and highlight the need for widespread reform of editorial, educational and institutional practices [quotes below].
While there is no single solution to replacing the outsized role that statistical significance has come to play in science, solid principles for the use of statistics do exist, say the editorial's authors.
"The statistical community has not yet converged on a simple paradigm for the use of statistical inference in scientific research - and in fact it may never do so," they acknowledge. "A one-size-fits-all approach to statistical inference is an inappropriate expectation. Instead, we recommend scientists conducting statistical analysis of their results should adopt what we call the ATOM model: Accept uncertainty, be Thoughtful, be Open, be Modest."
This ASA special issue builds on the highly influential ASA Statement on P-Values and Statistical Significance which has had over 293,000 downloads and 1,700 citations, an average of over 10 per week since its release in 2016.
Need for change
"Considerable social change is needed in academic institutions, in journals, and among funding and regulatory agencies. We suggest partnering with science reform movements and reformers within disciplines, journals, funding agencies and regulators to promote and reward 'reproducible' science and diminish the impact of statistical significance on publication, funding and promotion." - Goodman
"Evaluation of manuscripts for publication should be 'results-blind'. That is, manuscripts should be assessed for suitability for publication based on the substantive importance of the research without regard to their reported results." - Locascio
"Everything should be published in some form if whatever we measured made sense before we obtained the data because it was connected in a potentially useful way to some research question. Journal editors should be proud of their exhaustive methods sections and base their decisions about the suitability of a study for publication on the quality of its materials and methods rather than on results and conclusions; the quality of the presentation of the latter should only be judged after it is determined that the study is valuable based on its materials and methods." - Amrhein et al.
"Reproduction of research should be encouraged by giving byline status to researchers who reproduce studies. We would like to see digital versions of papers dynamically updated to display 'Reproduced by...' below the original research authors' names or 'Not yet reproduced' until it is reproduced." - Hubbard and Carriquiry
"An important role for statistics in research is the summary and accumulation of information. If replications do not find the same results, this is not necessarily a crisis, but is part of a natural process by which science evolves. The goal of scientific methodology should be to direct this evolution toward ever more accurate descriptions of the world and how it works, not toward ever more publication of inferences, conclusions, or decisions."- Amrhein et al.
Alternatives and complements to p-values
"A number of factors should no longer be subordinate to 'p<0.05'. These include relevant prior evidence, plausibility of mechanism, study design and data quality, and the real-world costs and benefits that determine what effects are scientifically important. The scientific context of the study matters and this should guide its interpretation." - McShane et al.
"Words like 'significance' in conjunction with p-values and 'confidence' with interval estimates mislead users into overconfident claims. We propose researchers think of p-values as measuring the compatibility between hypotheses and data, and interpret interval estimates as 'compatibility intervals' rather than 'confidence intervals'." - Amrhein et al.
"Continuous p-values should only be used in conjunction with the 'false positive risk (FPR)', which answers the question: If you observe a 'significant' p-value after doing a single unbiased experiment, what is the probability that your result is a false positive? " - Colquhoun
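Colquhoun's false positive risk can be illustrated with the standard Bayes-style calculation from an assumed power and prior probability that a real effect exists. This is a simplified sketch (the "p less than alpha" formulation with invented inputs), not the exact method from his paper:

```python
def false_positive_risk(alpha, power, prior):
    """Probability that a 'significant' result is a false positive,
    given the significance threshold, the test's power, and the prior
    probability that a real effect exists."""
    sig_given_null = alpha * (1 - prior)   # null is true, yet p < alpha
    sig_given_real = power * prior         # real effect correctly detected
    return sig_given_null / (sig_given_null + sig_given_real)

# A 50:50 prior, 80% power, alpha = 0.05:
print(round(false_positive_risk(0.05, 0.8, 0.5), 3))   # 0.059
# A long-shot hypothesis (10% prior) makes the risk far larger:
print(round(false_positive_risk(0.05, 0.8, 0.1), 3))   # 0.36
```

The point the special issue makes is visible here: the same p-value threshold carries very different false-positive risk depending on the plausibility of the hypothesis being tested.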
New study published in Geology
Geological Society of America
Public Release: 22-Mar-2019
Boulder, Colo., USA: Glaciers that drain ice sheets such as Antarctica or Greenland often flow into the ocean, ending in near-vertical cliffs. As the glacier flows into the sea, chunks of the ice break off in calving events. Although much calving occurs when the ocean melts the front of the ice and the ice cliff above falls down, a new study presents another method of calving: slumping. And this process could break off much larger chunks of ice at a quicker rate.
The ice-cliff research was spurred by a helicopter ride over Greenland's Jakobshavn and Helheim glaciers. Helheim, on the eastern coast, ends abruptly in the ocean in near-vertical ice cliffs reaching 30 stories (100 meters) high. On the flight, scientists viewed large cracks (called crevasses) on top of the ice, marching toward the end of the glacier.
"Geologists have spent decades -- centuries -- worrying about slumps," says Richard Alley, co-author of the new paper in Geology. A slump occurs when a mass of rock or sediment loses some of its strength, breaks away from its neighboring land, and slides down a slope. Typically, slumps are marked by a steep scarp where the material broke away and a block of material that has moved downslope.
Alley says the research team noted that features on Helheim glacier are typical of what you might see in a slump-prone terrestrial landscape and they wondered if ice might suffer the same fate. "You've got a crevasse that serves as a head scarp and then you've got the stresses [within the ice] maximized down at the water level," he says.
To test if slumping occurs on ice cliffs, the team monitored Helheim glacier during a calving event, using real-aperture terrestrial radar interferometry. They measured speed, position, and motion of the calving ice. The researchers observed an ice-flow acceleration just before an initial slump, followed by a rotating, full ice-thickness calving of the glacier -- including the entire remaining ice-cliff, reaching both above and below the water line.
Removing the weight of the upper ice by slumping encourages the underlying ice to pop upward. "Because it's still attached at the back, it's going to rotate a little bit," says Alley. The rotation causes a crack to form at the bottom of the glacier as the ice flexes. In turn, the crack can weaken the ice, creating a large calving event -- all triggered by the initial slump on top of the ice cliff.
After observing the slump-triggered calving event, the team modeled when slumping events were most likely to occur on an ice cliff. The modeling looked at tensile, shear, and compressive failure for ice cliffs, and included characteristics of the ice. The scientists found that cliffs reaching more than 100 meters of ice above water were likely to have slumping occur.
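The roughly 100-meter threshold is consistent with a back-of-envelope estimate (not the authors' model): failure becomes plausible once the overburden stress near the waterline, roughly density times gravity times cliff height, approaches the tensile strength of ice, commonly taken to be on the order of 1 MPa.

```python
# Back-of-envelope check, with textbook values (not the paper's model):
RHO_ICE = 917.0            # density of glacial ice, kg/m^3
G = 9.81                   # gravitational acceleration, m/s^2
TENSILE_STRENGTH = 1.0e6   # order-of-magnitude ice strength, Pa

def critical_height():
    """Cliff height (m) at which overburden stress ~ rho*g*H
    reaches the assumed tensile strength of ice."""
    return TENSILE_STRENGTH / (RHO_ICE * G)

print(round(critical_height()))   # ~111 m
```

The result, just over 100 meters, is in line with the study's finding that cliffs with more than 100 meters of ice above water are prone to slumping, though the actual modeling also accounts for shear and compressive failure.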
Alley says regular calving events happen relatively slowly, such as when the ice front melts over time, undercutting the ice and weakening the cliff. "But that's not going to go really, really, really fast because you have to wait for the melting to undercut it," he says.
With slumping, the calving occurs without waiting for the melt. "We'll go slump... basal crevasse... boom," he says, noting that when the calving happens it will take the 100 meters of ice above the water -- and the 900 meters below the water -- very quickly.
And 1000 meters of ice calving at once isn't the limit. Alley says that in some places in Antarctica, the glacial ice bed can be 1500 to 2000 meters below sea level, creating a much taller cliff above water. He says the worry is that taller cliffs are even more susceptible to slumping. "The scary thing is that if pieces of west Antarctica start doing what Helheim is doing then over the next hundred years models indicate that we get rapid sea level rise at rates that surpass those predicted," says Alley.
Understanding the slump-break process has been a collaborative effort, Alley says, and more investigations are planned for the near future. "We want to understand what are the rules for [ice] breakage by this process and others," says Alley, adding that they hope to collect more observational data as well as refine their models to better understand the slump-break process. "There's still work to be done."
The research was supported by NASA, the National Science Foundation, and the New York University Abu Dhabi Research Institute.
Ice-cliff failure via retrogressive slumping
Byron Parizek (email@example.com), Knut Christianson, Richard B. Alley, Denis Voytenko, Irena Vaňková, Timothy H. Dixon, Ryan T. Walker, David M. Holland. URL: https://pubs.geoscienceworld.org/gsa/geology/article/569567/ice-cliff-failure-via-retrogressive-slumping.
Portland State University
Public Release: 22-Mar-2019
Portland experiences both extreme heat in the summer months and frequent nuisance flooding in the winter and spring, and that's only expected to worsen with climate change. A new Portland State University study found the potential for flooding and extreme heat is most acute in East Portland's low-income neighborhoods that have fewer green spaces and larger concentrations of less-educated residents.
The PSU research team -- geography graduate students Benjamin Fahy and Emma Brenneman, geography professor Heejun Chang and urban studies and planning professor Vivek Shandas -- mapped winter flood and summer heat hazard potential, then tested it against sociodemographic and physical variables at a neighborhood scale, including income, level of education, population density, green space and the amount of impervious surface area. This study focused on nuisance flooding, the kind of flooding that shuts down roads, overwhelms storm drains and seeps into basements.
Their findings were published in the journal International Journal of Disaster Risk Reduction.
"Not surprisingly, those poorer, low-lying areas on the eastside along Highway 205 are disproportionately exposed to floods and urban heat islands," said Heejun Chang, a geography professor in PSU's College of Liberal Arts and Sciences and director of the WISE research group or Water as an Integrated System and Environment. "Those are the potential target areas where the city needs to pay attention to."
Among the findings:
High-flood potential areas are located consistently in East Portland along Highway 205, while most low-flood potential areas are found in NW and SW Portland and near the NE-SE Portland transition along I-84.
Most areas with high-heat hazards are clustered in East Portland, as well as North Portland and along major roadways, while low-heat hazard clustering is found in the western hills, central NE and SE Portland.
The areas with the greatest combined hazard potential (both extreme heat and flooding) were clustered in East Portland, SE Portland and North Portland. Conversely, the city's western hills and central NE and SW neighborhoods - wealthier neighborhoods - showed the lowest risk.
The methods used by the team are meant to be replicated by any researcher, practitioner or policy maker interested in identifying which regions of their cities are most at risk and what demographic factors characterize the most vulnerable citizens.
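A minimal sketch of that kind of replication, with entirely hypothetical neighborhoods and index values: normalize each hazard layer to a common 0-1 scale, then sum the layers into a combined score per neighborhood.

```python
# Hypothetical data; the real study uses mapped flood and heat layers.
def normalize(values):
    """Min-max scale a list of values to the range [0, 1]."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

neighborhoods = ["A", "B", "C"]
flood = [0.8, 0.2, 0.5]     # invented flood-potential index
heat = [30.0, 24.0, 28.0]   # invented summer surface temperature, deg C

# Combined hazard: equal-weight sum of the normalized layers.
combined = [f + h for f, h in zip(normalize(flood), normalize(heat))]
ranked = sorted(zip(neighborhoods, combined), key=lambda x: -x[1])
print(ranked[0][0])   # neighborhood with the greatest combined hazard
```

In practice this step would be followed by testing the combined scores against sociodemographic variables such as income and education, as the PSU team did.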
Chang said planting trees is an easy and effective action to ease both urban heat and flooding.
"If you can better manage land, you can better manage water in the urban areas," he said. "During the summer, trees can provide shading and reduce the heat island effect. But during winter, they can intercept the rainwater, hold water in the soils longer, and release water gradually."
Chang said that in some areas where impervious surface areas are too high, green roofs might be a better alternative. He and his students are continuing to look at ways to expand green infrastructure to reduce heat and nuisance flooding potentials. Shandas has worked with Portland city officials to develop an online mapping tool to identify specific locations where expanding tree canopy can improve social and environmental conditions.
Chang said cities also need to better educate the public, as the places likely to experience the most severe impacts of nuisance flooding and extreme heat often lack access to information and the ability to prepare for hazards.
The PSU study is part of the larger National Science Foundation-funded Urban Resilience to the Extremes Sustainability Research Network, a project working to analyze the impacts of climate change on 10 cities in North and South America; identify risks and vulnerabilities associated with extreme weather events; and produce scenarios, transitions and plans for resilient urban infrastructure and systems. The other cities are Baltimore, Maryland; Hermosillo, Mexico; Mexico City, Mexico; Miami, Florida; New York, New York; Phoenix, Arizona; San Juan, Puerto Rico; Syracuse, New York; and Valdivia, Chile.
The study was also supported by PSU's Institute for Sustainable Solutions. Shandas is research director of ISS, and Chang is a faculty fellow.
University of Colorado at Boulder
Public Release: 17-Feb-2019
Cooking, cleaning and other routine household activities generate significant levels of volatile and particulate chemicals inside the average home, leading to indoor air quality levels on par with a polluted major city, University of Colorado Boulder researchers say.
What's more, airborne chemicals that originate inside a house don't stay there: Volatile organic compounds (VOCs) from products such as shampoo, perfume and cleaning solutions eventually escape outside and contribute to ozone and fine particle formation, making up an even greater source of global atmospheric air pollution than cars and trucks do.
The previously underexplored relationship between households and air quality drew focus today at the 2019 AAAS Annual Meeting in Washington, D.C., where researchers from CU Boulder's Cooperative Institute for Research in Environmental Sciences (CIRES) and the university's Department of Mechanical Engineering presented their recent findings during a panel discussion.
"Homes have never been considered an important source of outdoor air pollution and the moment is right to start exploring that," said Marina Vance, an assistant professor of mechanical engineering at CU Boulder. "We wanted to know: How do basic activities like cooking and cleaning change the chemistry of a house?"
In 2018, Vance co-led the collaborative HOMEChem field campaign, which used advanced sensors and cameras to monitor the indoor air quality of a 1,200-square-foot manufactured home on the University of Texas Austin campus. Over the course of a month, Vance and her colleagues conducted a variety of daily household activities, including cooking a full Thanksgiving dinner in the middle of the Texas summer.
While the HOMEChem experiment's results are still pending, Vance said that it's apparent that homes need to be well ventilated while cooking and cleaning, because even basic tasks like boiling water over a stovetop flame can contribute to high levels of gaseous air pollutants and suspended particulates, with negative health impacts.
To her team's surprise, the measured indoor concentrations were high enough that their sensitive instruments needed to be recalibrated almost immediately.
"Even the simple act of making toast raised particle levels far higher than expected," Vance said. "We had to go adjust many of the instruments."
Indoor and outdoor experts are collaborating to paint a more complete picture of air quality, said Joost de Gouw, a CIRES Visiting Professor. Last year, de Gouw and his colleagues published results in the journal Science showing that regulations on automobiles had pushed transportation-derived emissions down in recent decades while the relative importance of household chemical pollutants had only gone up.
"Many traditional sources like fossil fuel-burning vehicles have become much cleaner than they used to be," said de Gouw. "Ozone and fine particulates are monitored by the EPA, but data for airborne toxins like formaldehyde and benzene and compounds like alcohols and ketones that originate from the home are very sparse."
While de Gouw says it is too early in the research to make recommendations on policy or consumer behavior, he said it's encouraging that the scientific community is now thinking about the "esosphere," derived from the Greek word 'eso,' which translates to 'inner.'
"There was originally skepticism about whether or not these products actually contributed to air pollution in a meaningful way, but no longer," de Gouw said. "Moving forward, we need to re-focus research efforts on these sources and give them the same attention we have given to fossil fuels. The picture that we have in our heads about the atmosphere should now include a house."
Updated Feb 15; Posted Feb 15
By Ron Fonger
Thirteen Michigan water systems failed to meet federal standards for lead in drinking water in the last half of 2018, and seven of those systems had lead levels at least twice as high as the state will allow starting in 2025.
Data requested by MLive-The Flint Journal from the Michigan Department of Environmental Quality shows water systems above that action limit in the most recent Lead and Copper Rule testing are located throughout the state and are both large and small -- one serving fewer than 100 homes and others providing water to cities as large as Hamtramck and Benton Harbor.
A total of 27 water providers registered 90th percentile lead levels of at least 13 parts per billion, beyond the 12 ppb future threshold established last year by the state.
The 90th percentile is a formula used by the U.S. Environmental Protection Agency to measure the prevalence of lead in water sampled by the more than 1,200 public water systems in the state. Federal regulations set the lead limit at 15 ppb.
A water system with a 90th percentile for lead of 15 ppb, for example, would mean that 10 percent of high-risk homes tested had lead readings of 15 ppb or more -- the level at which the Centers for Disease Control and Prevention says action should be taken.
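That percentile arithmetic can be sketched in a few lines. This is a simplified, rank-based illustration, not the EPA's exact procedure, which varies with the number of samples a system must collect:

```python
def lcr_90th_percentile(samples_ppb):
    """Simplified sketch of a Lead and Copper Rule-style 90th percentile.

    Sort readings ascending and take the value at the 0.9 * n rank,
    averaging the two neighboring samples when that rank falls between
    positions.
    """
    ordered = sorted(samples_ppb)
    n = len(ordered)
    pos = 0.9 * n  # rank of the 90th-percentile sample
    if pos.is_integer():
        return ordered[int(pos) - 1]
    lower = ordered[int(pos) - 1]  # e.g. 4th highest of 5 samples
    upper = ordered[int(pos)]      # e.g. 5th highest of 5 samples
    return (lower + upper) / 2

# With only five samples, as a very small system might collect, a single
# stagnant-tap outlier is enough to pull the 90th percentile far upward:
print(lcr_90th_percentile([1, 2, 2, 3, 25]))  # 14.0
```

This also illustrates why small systems like Melrose-Chandler can swing from compliance to an exceedance on the strength of one unusual sample.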
The DEQ can require additional testing, changes in treatment for corrosion control or replacement of service lines that contain lead in response to readings that high.
In the Melrose-Chandler Water system in Charlevoix County, testing was repeated after one sample produced a very high lead level, said Walter Breidenstein, manager of the system that serves a single subdivision.
Because Melrose-Chandler serves just a few hundred people, it was only required to collect five samples from July until December 2018, and one sample -- collected from a rarely used basement sink -- was enough to put the system’s 90th percentile for lead at 25 ppb, Breidenstein said.
State records show that in the first six months of last year, the same system's 90th percentile was just 2 ppb.
“We think it’s just a fluke thing because we’ve never had the problem,” Breidenstein said. “We were all freaked out” until residents learned that stagnant water from a single tap may have caused the jump. He said subsequent tests in the same home and elsewhere in the system have since come back clean.
Seventy-eight water systems had higher lead levels than the city of Flint, which registered a 90th percentile reading of 6 ppb and which is in the midst of a massive public works project to remove all lead service lines in the city.
Lead can enter drinking water when service lines that contain lead corrode, especially where the water has high acidity or low mineral content that corrodes pipes and fixtures, according to the EPA. The most common problem in homes is the result of brass or chrome-plated brass faucets and fixtures with lead solder, from which significant amounts of lead can enter into the water, the agency says.
Another small water system registered the highest LCR test result in the state -- 48 ppb -- but a representative of the Hills of Walloon Association said the result was an anomaly caused by a sample from a single, rarely used faucet in one home.
Fewer than 100 homes draw from the Hills of Walloon water system, according to state records, and the water -- drawn from deep wells -- registered a 90th percentile of just 6 ppb during the first six months of last year.
“The system was tested again in September” and was in compliance, said Thomas Saeli, president of the association. “Our water system doesn’t have lead (pipe) in it. It had an improper test (that) was done."
Other systems above 15 ppb of lead in the last half of last year are Maple Knoll in Eaton County (44), Lakeview Chalet Condominiums in Oakland County (43), Hermansville Housing Commission in Menominee County (29), the city of Hamtramck in Wayne County (28), the village of Lawrence in Van Buren County (24), Benton Harbor in Berrien County (22), Sims-Whitney Utilities Authority in Arenac County (20), Gun River Estates West in Allegan County (19), Country Living Adult Foster Care in Hillsdale County (16), Kellogg Biological Station in Kalamazoo County (16) and the city of Parchment in Kalamazoo County (16).
MLive-The Flint Journal could not reach a representative of the DEQ for comment on the results or representatives of some water systems that surpassed the federal action limit for lead.
In October, residents of Benton Harbor and Hamtramck were notified about the discovery of high levels of lead in those municipal water systems, and Benton Harbor announced it would supply bottled water to homes that tested above 15 ppb.
Late last year, a health advisory was issued for lead in the city of Parchment’s water, and its 90th percentile was 16 ppb -- above the action level.
Officials there have said a protective coating inside pipes was scoured when the city purged its municipal water supply in July to remove traces of toxic per- and polyfluoroalkyl substances, or PFAS. Parchment connected to the city of Kalamazoo’s water system in August, but lead has continued to leach into tap water from old lead service lines still in use.
Lead notices would also have been required in the village of Lawrence in Van Buren County, which registered a 90th percentile of 24 ppb. The village -- with a population of 1,045 -- supplies water from three wells through two water towers, according to its website.
In the Upper Peninsula’s Menominee County, the Hermansville Housing Commission last year provided notice of its lead levels to residents in 24 apartments and has been evaluating potential actions to lower lead levels. The agency’s water was above the federal action level in both the first and second half of 2018 -- the only water system in the state to do so.
Exposure to lead can cause behavior problems and learning disabilities in young children and can also affect the health of adults, according to the CDC, which says no safe blood level has been identified and all sources of lead exposure for children should be controlled or eliminated.
Awareness of the dangers of lead in water has increased since the Flint water crisis.
Flint’s most recent LCR testing showed a 90th percentile of just 4 ppb.
North America absorbed two-thirds of the global cost of climate disasters over the last three years, Morgan Stanley says.
At $415 billion, the price of the disasters is equal to 0.66 percent of North America's GDP.
Morgan Stanley warns that near-term disruptions and long-term structural changes present risks to many sectors of the economy.
Tom DiChristopher | @tdichristopher
Published 6:27 AM ET Thu, 14 Feb 2019 Updated 8:30 AM ET Thu, 14 Feb 2019
A Cal Fire firefighter monitors a burning home as the Camp Fire moves through the area on November 9, 2018 in Magalia, California.
Climate-related disasters have cost the world $650 billion over the last three years, and North America is shouldering most of the burden, according to a new report from Morgan Stanley.
While governments and corporations are taking steps to mitigate the impacts of climate change, Morgan Stanley says private enterprises need to strongly consider preparing for a world gripped by more frequent and intense weather events, rising sea levels, changes to agriculture and the spread of infectious disease. Those outcomes will have a lopsided effect across industries, raising risks for some and creating opportunities for others.
"We expect the physical risks of climate change to become an increasingly important part of the investment debate for 2019," Morgan Stanley equity strategists Mark Savino, Jessica Alsford and Victoria Irving said in a research note Wednesday.
At $650 billion, the three-year price of climate disasters totals just over a quarter of a percent of global gross domestic product, the analysts say. The investment bank warns that the situation may only get worse, noting that damages associated with global warming could total $54 trillion by 2040, according to a UN panel composed of the world's top climate scientists.
The United States is bearing the brunt of climate change's toll on the economy. Morgan Stanley says climate-related disasters like hurricanes and wildfires have cost North America $415 billion, or two-thirds of the global total. That equals 0.66 percent of North American GDP.
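As a rough sanity check on those percentages, the implied GDP bases can be backed out of the figures above. The assumption here (mine, not the bank's) is that each region's three-year damages are compared against three years of GDP at roughly current levels:

```python
# Back-of-the-envelope check of the shares quoted above.
global_damages_bn = 650   # $ billions, worldwide, over three years
na_damages_bn = 415       # $ billions, North America
global_share = 0.0025     # "just over a quarter of a percent"
na_share = 0.0066         # 0.66 percent

# Implied GDP bases, in trillions of dollars:
print(round(global_damages_bn / global_share / 1000))  # 260
print(round(na_damages_bn / na_share / 1000))          # 63
```

Roughly $260 trillion and $63 trillion over three years are consistent with annual global GDP near $85 trillion and North American GDP of just over $20 trillion, so the quoted percentages hang together.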
Last week, the National Oceanic and Atmospheric Administration said 14 weather and climate disasters cost the nation $91 billion in 2018, Earth's fourth hottest year on record.
The assessment lands at a time when the U.S. is at a crossroads over climate change. Liberals on Capitol Hill are pushing a Green New Deal to overhaul the U.S. economy in just 10 years, while a select committee led by establishment House Democrats is pursuing a more modest approach to tackling global warming.
President Donald Trump continues to cast doubt on the consensus among climate scientists and U.S. government agencies that greenhouse gas emissions from human activity are warming the planet. His administration, backed by congressional Republicans, is seeking to boost fossil fuel production and push through a broad rollback of Obama-era policies aimed at lowering U.S. emissions.
After the U.S., Asia is most exposed to the cost of climate disasters, absorbing $180 billion in economic damages, equal to 0.24 percent of regional GDP, Morgan Stanley says.
Parts of the two regions — including the U.S. Gulf and East Coasts, China and the Philippines — are also at greatest risk of sea level rise and adverse weather events. Changes to agricultural conditions will also have a big impact on parts of North America and Asia, in addition to Europe and Central America. The spread of infectious disease is of greatest concern to Africa, Latin America and other developing regions, the bank says.
In the near term, Morgan Stanley sees climate change posing risk of negative disruptions to a dozen sectors, from agriculture to oil and gas production. Just four sectors — capital goods, home improvement retail, lodging and construction machinery — could reap benefits from those near-term disruptions.
Over the longer term, structural changes are seen negatively impacting nine sectors, including many of the industries that also face near-term hurdles. Real estate, leisure and consumer retail are also on the list. Other sectors, like auto manufacturing, biotechnology, health care and pharmaceuticals, insurance, mining and utilities, could find opportunity in structural change.
Drilling down, the four main vectors that Morgan Stanley identifies — sea-level rise, weather events, changes in agriculture and infectious disease — will impact sectors in different ways.
For example, rising sea levels are expected to hurt property values for coastal real estate and disrupt apparel supply chains in Asia. However, they could also boost spending on machinery to rebuild seaside infrastructure and spark higher sales of capital goods like commercial pumps and other water management products.
Pulses of melting linked to rainfall doubled in summer and tripled in winter, a new climate change study found. That's a problem for sea level rise.
By Bob Berwyn, InsideClimate News
Mar 8, 2019
The total precipitation over the Greenland Ice Sheet didn’t change over the study period, but more of it fell as rain. The scientists estimated that almost a third of the total runoff they observed was triggered by rainfall. Credit: Joe Raedle/Getty Images
When a frozen snowflake falls on the Greenland Ice Sheet, it lands with a whisper and stays frozen, sometimes for months.
But raindrops splat down, making little craters and melting some of the adjacent snow crystals. Multiplied across thousands of square miles, they can trigger widespread melting and runoff, which can lead to more sea level rise.
A new analysis of satellite and weather data shows that melting associated with rain in Greenland doubled in the summers and tripled in the winters from 1988 to 2012 as temperatures rose, scientists write in a study published Thursday in The Cryosphere, a journal of the European Geosciences Union.
The total precipitation over the ice sheet didn't change over the study period, but more of it fell as rain, the study found. The scientists estimated that almost a third of the total runoff measured was triggered by rainfall.
They also found that melting events triggered by rain lasted longer, lengthening from an average of two days to three in the summer, and from two days to five in the winter.
The findings could help explain some of the increase in Greenland's meltwater runoff and show how global warming feedback loops can intensify the impact. Recent research published in the journal Nature found meltwater runoff from the ice sheet had increased 50 percent since the start of the industrial era and is continuing to accelerate.
"The idea was to investigate the triggers of melting events, and it was a surprise that rainfall was a significant trigger," said Marilena Oltmanns, a climate researcher at the Geomar Helmholtz Centre for Ocean Research in Kiel, Germany, and the lead author of the new study. "I expected to find that high pressure systems in summer would be the biggest deal."
A Window Into What's Driving the Melting
Up to now, much of the research on Greenland Ice Sheet melting has focused on how surface temperatures in the summer season affect melting, said study co-author Marco Tedesco of Columbia University's Lamont-Doherty Earth Observatory.
But winter changes, including longer and more widespread melting and an upward shift in the rain/snow line, are important to help project how fast and by how much sea level will rise in the future.
"This is starting to open a window to what's driving the melting," Tedesco said. "The liquid water is carrying the energy from the surface deep into the ice sheet."
The effects of rain events can persist for months, he added. If there's more snow in winter, the snowpack is brighter at the start of the summer. "Then, you need more energy to start the melting. But if you have more rain and melting during the winter, you have relatively darker snow because the individual grains are bigger."
"It preconditions the ice to make it more vulnerable, pushing it closer to the melting threshold," he said.
Arctic Ice Melt Is Speeding Up, Studies Show
Greenland has been losing about 270 billion tons of ice per year. About 70 percent of the loss comes from direct surface melting of the ice sheet; the rest comes from icebergs breaking off into the ocean. Both contribute to sea level rise.
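The sea-level arithmetic behind figures like these is simple. A commonly used rule of thumb (an assumption of this sketch, not a number from the article) is that roughly 360 gigatons of melted ice raises global mean sea level by about 1 millimeter:

```python
# Rough conversion from Greenland's yearly ice loss to sea level rise.
annual_loss_gt = 270   # approximate yearly ice loss, gigatons
gt_per_mm = 360        # gigatons of ice per mm of global sea level
print(round(annual_loss_gt / gt_per_mm, 2))  # 0.75 mm/yr
```

At that rate, Greenland alone accounts for a substantial share of the roughly 3-4 mm of global sea level rise measured each year.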
Understanding how fast Greenland's ice, along with the rest of the Arctic, will melt will help coastal communities around the world prepare for sea level rise and the storm surges, coastal flooding and erosion that come with it.
Across the Arctic region, land ice melting increased about tenfold from the quarter century after 1971 to the most recent quarter century, a team of scientists found in a study published in December. Greenland expert Jason Box, the study's lead author, described the change as going from 16 times the flow of England's Thames River to 160 times its flow.
Another study, published in February, found that melting on just the Greenland Ice Sheet had quadrupled between 2002 and 2012, and that the greatest increase was in southwest Greenland—a region with few glaciers that is projected to experience more rainfall in the coming decades.
"It's important to better understand precipitation impacts on Greenland land ice, for climate change is bringing more moisture into the Arctic, and warming leads to an increase in rain at the expense of snow," Box said.
But he also said that short-term natural variability can have an impact. Since the end of the study period for the new research on rain and melting, Greenland has seen some relatively cold years. Still, the trends identified by the study are clear against natural long-term climate variations in the region, Tedesco said.
'A Little More Rain Can Have a Significant Impact'
The researchers combined satellite imagery with on-the-ground readings from 20 weather stations spread around the ice sheet. The satellite images are critical because they can distinguish between snow and liquid water.
Using the two datasets, the scientists examined 313 melting events triggered by rain.
With rain, areas of fresh snow can change into hardened crusts of ice, which are darker than the snow and absorb more of the sun's energy. The meltwater rivers that can form sometimes end up running through deep cracks all the way to the ground, and some research suggests that can speed up the flow of the ice toward the sea. Oltmanns said the rainfall trends detected in her new research might affect that dynamic.
Mike MacFerrin, a University of Colorado Boulder glaciologist who studies ice sheet meltwater feedbacks, said the study reinforces some of his own ongoing research showing that both rain and melting are increasing, and that the rate of melting is increasing 10 times faster than rainfall.
"These cyclones come in bringing rain and start melt events that persist long beyond the time of initial melting," he said. "A little less snow, a little more rain can have a significant impact on a given area, and an early spring melt event makes the summer melt season longer."
By MICHAEL MELIA
March 6, 2019
As climate change becomes a hotter topic in American classrooms in 2019, some politicians are pushing back against the scientific consensus that global warming is real and man-made.
HARTFORD, Conn. (AP) — A Connecticut lawmaker wants to strike climate change from state science standards. A Virginia legislator worries teachers are indoctrinating students with their personal views on global warming. And an Oklahoma state senator wants educators to be able to introduce alternative viewpoints without fear of losing their jobs.
As climate change becomes a hotter topic in American classrooms, politicians around the country are pushing back against the near-universal scientific consensus that global warming is real, dire and man-made.
Of the more than a dozen such measures proposed so far this year, some already have failed. But they have emerged this year in growing numbers, many of them inspired or directly encouraged by a pair of advocacy groups, the Discovery Institute and the Heartland Institute.
“You have to present two sides of the argument and allow the kids to deliberate,” said Republican state Sen. David Bullard of Oklahoma, a former high school geography teacher whose bill, based on model legislation from the Discovery Institute, ran into opposition from science teachers and went nowhere.
Scientists and science education organizations have blasted such proposals for sowing confusion and doubt on a topic of global urgency. They reject the notion that there are “two sides” to the issue.
“You can’t talk about two sides when the other side doesn’t have a foot in reality,” said University of Illinois climate scientist Donald Wuebbles.
Michael Mann, director of the Earth System Science Center at Penn State University, said these legislative proposals are dangerous, bad-faith efforts to undermine scientific findings that the fossil-fuel industry or fundamentalist religious groups don’t want to hear.
In the mainstream scientific community, there is little disagreement about the basics: greenhouse gases from the burning of coal, oil and gas are causing the world to warm in a dangerous manner. More than 90 percent of peer-reviewed studies, and the scientists who write them, say climate change is a human-caused problem.
A Nobel Prize-winning international panel of scientists has repeatedly published reports detailing the science behind climate change and how the world is likely to pass a level of warming that an international agreement calls dangerous. The U.S. government last year issued a detailed report saying that “climate-related threats to Americans’ physical, social and economic well-being are rising.”
The battle over global warming resembles the fight that began decades ago over the teaching of evolution, in which opponents led by conservative Christians have long called for schools to present what they consider both sides of the issue.
Some of those who reject mainstream climate science have cast the debate as a matter of academic freedom.
James Taylor, a senior fellow at Heartland, an Illinois-based group that dismisses climate change, said it is encouraging well-rounded classroom discussions on the topic. The group, which in 2017 sent thousands of science teachers copies of a book titled “Why Scientists Disagree About Global Warming,” is now taking its message directly to students. A reference book it is planning for publication this year will rebut arguments linking climate change to hurricanes, tornadoes and other extreme weather.
“We’re very concerned the global warming propaganda efforts have encouraged students to not engage in research and critical thinking,” Taylor said, referring to news reports and scientific warnings.
Neither Discovery nor Heartland discloses the identities of its donors.
Instruction on the topic varies widely from place to place, but climate change and how humans are altering the planet are core topics emphasized in the Next Generation Science Standards, developed by a group of states. Nineteen states and the District of Columbia have adopted the standards, and 21 others have embraced some of the material with modifications.
Still, a survey released in 2016 found that of public middle- and high-school science teachers who taught something about climate change, about a quarter gave equal time to perspectives that “raise doubt about the scientific consensus.”
By early February, the Oakland, California-based nonprofit National Center for Science Education flagged over a dozen bills this year as threats to the integrity of science education, more than the organization typically sees in an entire year.
Several of them — including proposals in Oklahoma, North Dakota and South Dakota — had language echoing model legislation of the Seattle-based Discovery Institute, which says teachers should not be prohibited from addressing strengths and weaknesses of concepts such as evolution and global warming.
Similar measures became law in Louisiana in 2008 and Tennessee in 2012. In states where they may not be feasible politically, Discovery has urged legislators to consider nonbinding resolutions in support of giving teachers latitude to “show support for critical thinking” on controversial topics. Lawmakers in Alabama and Indiana passed such resolutions in 2017.
Discovery officials did not respond to requests for comment.
Florida state Sen. Dennis Baxley is pressing legislation that would allow schools to teach alternatives to controversial theories.
“There is really no established science on most things, you’ll find,” the GOP legislator said.
Elsewhere, lawmakers in Connecticut and Iowa, which both adopted the Next Generation Science Standards, have proposed rolling them back. Connecticut state Rep. John Piscopo, a Republican who is a Heartland Institute member, said he wants to eliminate the section on climate change, calling it “totally one-sided.”
Other bills introduced this year in such states as Virginia, Arizona and Maine call for teachers to avoid political or ideological indoctrination of their students.
“If they’re teaching about a subject, such as climate change, and they present both sides, that’s fine. That’s as it should be. A teacher who presents a skewed extension of their political beliefs, that’s closer to indoctrinating. That’s not good to kids,” said Virginia state Rep. Dave LaRock, a Republican.
While there are many details about climate science hotly debated among scientists, it is well-established that global warming is real, human-caused and a problem, said scientist Chris Field, director of the Stanford Woods Institute for the Environment.
“When people say we ought to present two sides, they’re saying we ought to present a side that’s totally been disproven along with a side that has been fundamentally supported by the evidence,” Field said.
For Release: Friday, February 8, 2019
State Petitions EPA to Comply with Clean Air Act, Protect Public Health by Requiring Upwind States to Reduce Smog Pollution
Department of Environmental Conservation (DEC) Commissioner Basil Seggos today announced that New York State is seeking to compel the U.S. Environmental Protection Agency (EPA) to address the interstate transport of ozone pollution from sources in states upwind of New York. New York is responding to EPA's inaction on a petition submitted by DEC on March 12, 2018, which showed that large pollution sources in upwind states are affecting New York's ability to achieve federal air quality standards for smog. EPA has failed to act on the state's petition despite granting itself a six-month extension. The New York Attorney General's office intends to bring suit if EPA does not take required action within 60 days.
DEC Commissioner Seggos said, "EPA needs to do its job of reducing pollution. As the past summer's elevated ozone levels demonstrate, progress to reduce ozone has stalled despite the success of New York and other northeastern states in reducing emissions. Rather than calling to limit pollution from coal-fired power plants and industrial sources, EPA is allowing these companies to continue or even increase pollution. Our communities suffer as a result, with increased asthma attacks and other respiratory and cardiovascular illnesses to show for it."
"The Environmental Protection Agency has routinely failed to properly address the issue of interstate smog pollution in the Northeast, and in doing so, failed to address the serious health concerns that come with smog pollution," said Attorney General Letitia James. "My office is prepared to continue to take legal action against the EPA to force them to confront its legal responsibility to address the unhealthy air that millions of New Yorkers breathe every single day."
Reducing smog levels is vital to protecting the health of New Yorkers. Elevated levels of smog can cause coughing, throat irritation, lung tissue damage, and the aggravation of existing medical conditions, such as asthma, bronchitis, heart disease, and emphysema. Exposure to ozone is also linked to premature mortality. Children, the elderly, and those with existing lung diseases, such as asthma, are more vulnerable to ozone's harmful effects.
New York has some of the country's strictest air quality regulations to reduce the release of pollutants that cause smog, such as nitrogen oxides (NOx) and volatile organic compounds (VOCs). DEC regulates these emissions from power plants, factories, motor vehicles, and other sources and has some of the lowest emissions of NOx and VOCs in the nation. Major stationary sources in New York reduced annual NOx emissions by 43 percent between 2008 and 2014, and the state's power plants reduced ozone-season NOx emissions by 73 percent between 2008 and 2017.
Despite New York's actions, the pollution that contributes to smog, such as NOx emitted from power plants and other sources, can travel hundreds of miles across state borders. The federal Clean Air Act recognizes the regional nature of this pollution and that emission sources located in upwind states contribute to smog in downwind states. Because downwind states cannot address this smog on their own, section 126(b) of the Clean Air Act allows them to petition the EPA for a finding that an upwind source or group of sources emits pollutants in amounts that prevent downwind areas from attaining federal standards, in violation of the Act's "Good Neighbor" provision. If the EPA approves the petition, the upwind sources must comply with new emission limitations or shut down.
New York submitted its section 126(b) petition on March 12, 2018, and identified sources in Illinois, Indiana, Kentucky, Maryland, Michigan, Ohio, Pennsylvania, Virginia, and West Virginia that are contributing to clean air violations in New York. EPA received this petition on March 14, 2018, and was required by the Clean Air Act to act within 60 days but granted itself a six-month extension to November 9, 2018. To date, EPA has failed to act on New York's petition. The New York Attorney General's office has therefore announced its intention to file suit against EPA to force the federal agency to carry out its mandated duty.
By MICHAEL RUBINKAM
Pittsburgh’s beleaguered water authority will spend $50 million to replace lead service lines, give filters to low-income residents and take other steps to address the city’s lead crisis under a settlement approved Thursday by state utility regulators.
It comes a week after the Pennsylvania attorney general’s office filed criminal charges against the Pittsburgh Water and Sewer Authority, alleging it mishandled a lead pipe replacement program in 2016 and 2017 and put more than 150 households at elevated risk of lead poisoning. The authority, which had previously admitted civil liability in the case and was fined $2.4 million by state environmental regulators, is fighting the charges.
Clean-water advocates hailed Thursday’s settlement as a significant step toward reducing lead levels in the city’s drinking water, and as a model for other cities also struggling with lead.
“The people of Pittsburgh have been drinking lead-contaminated water for far too long. This settlement requires aggressive, affordable solutions to protect public health and hold officials accountable to the people they serve,” Dimple Chaudhary, an attorney for the Natural Resources Defense Council, said in a statement.
Chaudhary, who led a lawsuit over lead water pipes in Flint, Michigan, represented a coalition of Pittsburgh community, labor, religious and environmental groups in the water authority’s rate case before the Public Utility Commission.
The utility agreed to accelerate its existing lead line replacement program, promising to swap out 3,800 of its own water lines and 2,800 privately owned lines, at no charge to residents, this year and next. The money will come from grants and a loan from the Pennsylvania Infrastructure Investment Authority, or Pennvest, which funds sewer, storm water and drinking water projects throughout the state.
Other provisions require the water agency to focus its replacement program on neighborhoods at greatest risk of lead exposure, increase outreach to residents who refuse to get their water lines replaced, make more residents eligible for free filters and replacement cartridges, and restrict the use of partial lead line replacements, a practice that can lead to temporary spikes of lead in the water.
The utility estimates it owns about 10,000 lead service lines out of a total of 71,000. That doesn’t include privately owned lead service lines that take water from the curb to the house.
The authority is under a mandate to replace at least 7 percent of its lead service lines each year due to elevated lead levels in drinking water. Lead can cause brain damage and other lifelong injuries, especially in children.
Robert Weimar, the authority’s executive director, called it “an agreeable settlement that considers our most vulnerable customers, while also providing the investments needed to be the water, sewer, and storm water utility Pittsburgh expects and deserves.”
Customers’ bills are going up substantially. Regulators approved a 14 percent rate increase, meaning a typical residential customer will pay nearly $9 more per month. But low-income customers will get a bigger discount than they do now.
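As a back-of-envelope check (an illustrative calculation, not a figure from the settlement itself), the 14 percent rate increase and the roughly $9 monthly increase it produces together imply a typical pre-hike residential bill of about $64 a month:

```python
# Infer the typical pre-increase monthly water bill implied by the
# reported 14 percent rate hike and the ~$9/month increase it adds.
rate_increase = 0.14
monthly_increase = 9.0  # dollars: "nearly $9 more per month"

typical_bill_before = monthly_increase / rate_increase
typical_bill_after = typical_bill_before + monthly_increase
print(round(typical_bill_before, 2))  # 64.29
print(round(typical_bill_after, 2))   # 73.29
```

The exact bill will vary by usage; this only shows what the two reported numbers jointly imply.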
The rate hike will yield $21 million in new revenue for a water utility that has been trying to address decades of mismanagement, heavy debt, billing problems, chronic understaffing, elevated lead levels and a dilapidated system that loses 50 percent of the treated water it produces to broken pipes.
State lawmakers placed the authority — which an independent consultant once called a “failed organization atop a dangerous and crumbling structure” — under the control of the Public Utility Commission last year.
"A group of top hurricane experts, including several federal researchers at the National Oceanic and Atmospheric Administration, published striking new research Thursday suggesting that hurricanes in the Atlantic Ocean have grown considerably worse, and climate change is part of the reason why.
The study focused on rapid intensification, in which hurricanes may grow from a weak tropical storm or Category 1 status to Category 4 or 5 in a brief period. They found that the trend has been seen repeatedly in the Atlantic in recent years. It happened before Hurricane Harvey struck Texas and before Hurricane Michael pummeled the Gulf Coast with little warning last fall. Hurricane Michael, for example, transformed from a Category 1 into a raging Category 4 in the span of 24 hours.
The study, published in Nature Communications, describes its conclusion in blunt language, finding that the Atlantic already has seen “highly unusual” changes in rapid hurricane intensification, compared to what models would predict from natural swings in the climate. That led researchers to conclude that climate change played a significant role."
"WELLINGTON, New Zealand — An environmental disaster is unfolding in the Pacific after a large ship ran aground and began leaking oil next to a UNESCO World Heritage site in the Solomon Islands, Australian officials said Friday.
Footage taken this week shows little progress has been made in stopping the Solomon Trader ship from leaking oil since it ran aground Feb. 5, according to the Australian High Commission in the Solomon Islands.
Australian experts estimate more than 80 tons of oil has leaked into the sea and shoreline in the ecologically delicate area and that more than 660 tons of oil remains aboard the Hong Kong-flagged ship, which is continuing to leak."
By Julia Rosen
Mar 02, 2019 | 5:35 PM
Unused methane gas is burned off at a Qualco facility near Monroe, Wash. The potent greenhouse gas has been accumulating in the atmosphere at an accelerated rate, and scientists are trying to figure out why. (Mike Siegel / Seattle Times)
Scientists love a good mystery. But it’s more fun when the future of humanity isn’t at stake.
This enigma involves methane, a potent greenhouse gas. Twenty years ago the level of methane in the atmosphere stopped increasing, giving humanity a bit of a break when it came to slowing climate change. But the concentration started rising again in 2007 — and it’s been picking up the pace over the last four years, according to new research.
Scientists haven’t figured out the cause, but they say one thing is clear: This surge could imperil the Paris climate accord. That’s because many scenarios for meeting its goal of keeping global warming “well below 2 degrees Celsius” assumed that methane would be falling by now, buying time to tackle the long-term challenge of reducing carbon dioxide emissions.
“I don’t want to run around and cry wolf all the time, but it is something that is very, very worrying,” said Euan Nisbet, an Earth scientist at Royal Holloway, University of London, and lead author of a recent study reporting that the growth of atmospheric methane is accelerating.
Methane is produced when dead stuff breaks down without much oxygen around. In nature, it seeps out of waterlogged wetlands, peat bogs and sediments. Forest fires produce some too.
These days, however, human activities churn out about half of all methane emissions. Leaks from fossil fuel operations are a big source, as is agriculture — particularly raising cattle, which produce methane in their guts. Even the heaps of waste that rot in landfills produce the gas.
Agriculture, including growing rice in flooded fields, is one of the biggest sources of human-caused methane emissions.
The atmosphere contains far less methane than carbon dioxide, which is the primary driver of climate change. But methane is so good at trapping heat that one ton of the gas causes 32 times as much warming as one ton of CO2 over the course of a century.
Molecule for molecule, methane “packs a bigger punch,” said Debra Wunch, an atmospheric physicist at the University of Toronto.
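To see how that comparison is used in practice, here is a minimal sketch (the function name is illustrative, not from any cited study) that converts a methane release into CO2-equivalent tonnes using the 100-year factor of 32 mentioned above:

```python
# CO2-equivalent warming of a methane release, using the 100-year
# global warming potential (GWP) of 32 cited in the article.
GWP_CH4_100YR = 32

def co2_equivalent(tonnes_ch4, gwp=GWP_CH4_100YR):
    """Tonnes of CO2 causing the same century-scale warming as the given methane."""
    return tonnes_ch4 * gwp

print(co2_equivalent(1))        # 32
print(co2_equivalent(100_000))  # 3200000
```

On this accounting, a single large leak of 100,000 tonnes of methane carries the century-scale warming of about 3.2 million tonnes of CO2.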
For 10,000 years, the concentration of methane in Earth’s atmosphere hovered below 750 parts per billion, or ppb. It began rising in the 19th century and continued to climb until the mid-1990s. Along the way, it caused up to one-third of the warming the planet has experienced since the onset of the Industrial Revolution.
Scientists thought that methane levels might have reached a new equilibrium when they plateaued around 1,775 ppb, and that efforts to cut emissions could soon reverse the historical trend.
“The hope was that methane would be starting on its trajectory downwards now,” said Matt Rigby, an atmospheric scientist at the University of Bristol in England. “But we’ve seen quite the opposite: It’s been growing steadily for over a decade.”
That growth accelerated in 2014, pushing methane levels up beyond 1,850 ppb. Experts have no idea why.
“It’s just such a confusing picture,” Rigby said. “Everyone’s puzzled. We’re just puzzled.”
The concentration of methane in the atmosphere, indicated by green circles, is not consistent with a climate scenario that aggressively limits global warming (the solid green line labeled RCP 2.6). The other pathways lead to more warming. (Martin Manning / Victoria University of Wellington)
Scientists have come up with various explanations. Could it be growing emissions from fossil fuels or agriculture? An uptick in methane production in wetlands? Changes in the rate at which methane reacts with other chemicals in the atmosphere?
Nisbet and his team examined whether any of these hypotheses synced up with the changing chemical signature of methane in the atmosphere.
Some molecules of methane weigh more than others, because some atoms of carbon and hydrogen are heavier than others. And lately, the average weight of methane in the atmosphere has been getting lighter.
That seems to implicate biological sources such as wetlands and livestock, which tend to produce light methane. Daniel Jacob, an atmospheric chemist at Harvard who was not involved in Nisbet’s study, said that explanation squares with his own research. His results suggest most of the additional methane comes from the tropics, which are home to vast wetlands and a large proportion of the world’s cattle.
Estimates of emissions from coal mines and oil and gas wells suggest that fossil fuel contributions are rising too, but those sources usually release heavier molecules of methane, which would seem to conflict with the atmospheric observations.
Some researchers have proposed a way to resolve this discrepancy. Fires create an even heavier version of methane, and agricultural burning — particularly in developing countries — appears to have decreased over the last decade. A drop in this source of ultra-heavy methane would make atmospheric methane lighter on the whole, potentially masking an increase in emissions from fossil fuels.
Finally, reactions that break down methane eliminate more of the lighter molecules than the heavier ones. If that process has slowed down — causing methane to build up in the atmosphere — it would leave more light gas behind, possibly helping explain the overall trend.
Nisbet and his colleagues concluded they can’t rule out any of these explanations yet. “They might all be happening,” he said.
One possibility is conspicuously missing from the list. Scientists have long feared that thawing Arctic sediments and soils could release huge amounts of methane, but so far there’s no evidence of that, said Ed Dlugokencky, an atmospheric chemist at the National Oceanic and Atmospheric Administration who worked on the study, which will be published in the journal Global Biogeochemical Cycles.
Nisbet said he fears the rising methane levels could be a sign of a dangerous cycle: Climate change may cause wetlands to expand and allow the environment to support more livestock, leading to even more methane emissions.
“It clearly seems as if the warming is feeding the warming,” he said. “It’s almost as if the planet changed gears.”
Scientists measured methane emissions from the Bangweulu wetlands of Zambia using a research plane. Wetlands like these are the largest source of methane on Earth and may be behind its recent rise. (Pat Barker / University of Manchester)
If methane keeps increasing, the researchers say, it could seriously endanger efforts to keep the planet’s temperature in check. Slashing CO2 emissions enough to meet climate targets is a tall order even without this extra methane.
“The unexpected and sustained current rise in methane may so greatly overwhelm all progress from other reduction efforts that the Paris Agreement will fail,” Nisbet and his coauthors wrote.
It doesn’t help that scientists recently revised the global warming potential of methane upward by 14%.
Regardless of what’s behind the recent increase, scientists say there are ways to reduce methane concentrations. And the benefits will accrue quickly because methane has a shorter lifetime than CO2, lingering in the atmosphere for only about a decade.
Humans account for as much as 60% of methane emissions, and nearly half of that may come from the fossil fuel industry, Jacob said.
One priority is to plug leaks from oil and gas wells, he said. Methane is the primary ingredient in natural gas, so companies have a financial incentive to try to capture as much as possible.
Often, a few culprits bear most of the blame, “which is both scary and a good thing,” because they represent big opportunities, Wunch said. At the Barnett Shale in Texas, 2% of the facilities produce half of the field’s methane emissions. In Southern California, the Aliso Canyon leak released roughly 100,000 tons of methane in 2015 and 2016 — the equivalent of burning 1 billion gallons of gasoline.
Scientists also have ideas for reducing methane emissions from livestock. Some experiments show that changing the diet of cattle by adding fats or seaweed, for instance, can reduce the amount of methane animals expel. Capping landfills and using the methane they produce for electricity would help too.
Measures like these could have a big impact, and Wunch said they give her reason to be hopeful.
“We could actually reduce the amount of methane in the atmosphere on timescales that are relevant to the problem we are facing right now,” she said.
An analysis of water monitoring reports found unsafe levels of toxic substances near hundreds of coal ash sites, many of them in the Midwest and Southeast.
By James Bruggers
Mar 4, 2019
Coal ash from the now-retired Allen Fossil Plant in Memphis, Tennessee, is among the worst groundwater polluters, a new analysis shows.
At a power plant in Memphis, Tennessee, coal ash waste that built up over decades has been leaching arsenic and other toxic substances into the groundwater.
The contamination, ranked as a top problem in a new national assessment of water testing at coal ash sites, is in a shallow aquifer for now. But below that lies a second aquifer that provides drinking water to more than 650,000 people, and there are concerns that the contamination could make its way into the deeper water supply the city relies on.
"We have one of the purest drinking water sources in the whole country, and now we'll have arsenic and other coal ash compounds leaking into our water supply if things don't get cleaned up," said Scott Banbury, a Memphis resident and representative of the Sierra Club. "So getting rid of that ash is important."
The scale of the Memphis problem emerged from industry reports on groundwater testing near ponds and landfills that store coal-burning wastes from power plants. These reports, required by recent regulations, show that polluted groundwater is a widespread problem, with unsafe levels of toxic contaminants linked to more than nine out of every 10 coal-fired power plants with monitoring data, about 91 percent. The Environmental Integrity Project and other advocacy groups compiled and analyzed the data in a report released Monday.
The worst contamination, according to the analysis, was at the San Miguel Power Plant south of San Antonio, Texas, with 12 pollutants above safe levels in groundwater. In Memphis, the Tennessee Valley Authority's Allen Fossil Plant, which was shut down last year, is in the Top 10.
The report covers water testing at 265 existing and retired coal plants, comprising more than 550 individual coal ash ponds and dumps that have groundwater monitoring wells. That represents about three-fourths of the coal power plants in the country, according to the authors. Some plant owners were not required to make groundwater testing results public because they closed their ash dumps before the U.S. Environmental Protection Agency's first national rules on coal burning waste took effect in 2015, or because they were eligible for an exemption or extension of reporting deadlines.
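As a quick consistency check on the coverage figures (an illustrative calculation; the article does not state the national plant count directly), if the 265 plants with monitoring data are about three-fourths of the country's coal power plants, the implied national total is roughly 350:

```python
# Implied total number of U.S. coal power plants, given that the
# 265 plants with monitoring data are "about three-fourths" of them.
plants_with_data = 265
coverage_fraction = 0.75

implied_total = plants_with_data / coverage_fraction
print(round(implied_total))  # 353
```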
"It's everything we could get our hands on," said the report's lead author, Abel Russ, a senior attorney with the Environmental Integrity Project, which worked with Earthjustice on the report. The contamination is "widespread," Russ said, and in many places, "groundwater may be unusable for decades or hundreds of years."
In all, unsafe levels of contamination were reported in groundwater at sites in 39 states and Puerto Rico. Illinois has the most power plants with polluted ash storage sites, at 16, followed by Texas, Indiana, Kentucky, Michigan, North Carolina and Missouri, all with more than 10.
Trump EPA Is Considering Weakening Standards
The results come as President Donald Trump's EPA, led by former coal and energy industry lobbyist Andrew Wheeler, is weighing further changes to the Obama-era EPA rules with the goal of cutting utility compliance costs. Earlier, the Trump administration relaxed EPA coal ash rules in part by extending the deadline to stop using some coal ash ponds from next month to Oct. 31, 2020.
"The fact that we have now analyzed the data and found significant contamination at almost every coal ash site provides a strong record to oppose any future rollbacks," said Lisa Evans, a senior attorney at Earthjustice.
Earthjustice last year won a lawsuit against EPA that will require the closure and cleanup of some 100 additional coal ash ponds left out of the 2015 rule. EPA is appealing.
An EPA spokesman on Friday did not comment on the agency's plans for changing coal ash rules. On Monday, EPA said it was reviewing the environmental groups' report and would not comment on it yet.
For decades, wastes from burning coal went unregulated by the federal government, leaving a patchwork of inconsistent state rules. The Obama administration in 2015 put in place national rules, favoring dry storage in landfills over wet storage in ponds. Those rules require utilities to conduct groundwater monitoring at ponds and landfills, close leaking ash ponds, clean up polluted groundwater and disclose their studies and actions.
The ash contains contaminants like selenium, mercury, cadmium, and arsenic, which are associated with cancer and other serious health effects, according to the EPA.
Posting water quality monitoring reports is just the start of a cleanup process. Earthjustice reported in January that utilities had acknowledged pollution at unsafe levels at 70 power plants in 22 states. By last week, those numbers had climbed to 88 plants in 25 states, Evans said.
Some utilities are blaming the pollution on other sources, and future disputes are expected over utilities' cleanup plans, she said.
Trouble in Texas and Memphis
The new report ranked the 10 most contaminated sites, based on the extent to which pollution—including all potentially unsafe pollutants—exceeded safe levels.
Brian McGovern, a spokesman for the Texas Commission on Environmental Quality, declined to comment on how his agency was responding to reports of groundwater pollution from the San Miguel plant, the nation's worst case. However, he said Texas is working on a new statewide coal ash management rule.
At the Memphis plant, the utility reported arsenic levels 350 times higher than drinking water standards, according to the report. TVA's Allen Fossil Plant, which burned coal for six decades along the banks of McKellar Lake, near the Mississippi River, is about five miles southwest of downtown Memphis.
The contaminated groundwater is above the deeper aquifer that the city uses for drinking water, most of it separated by a layer of clay.
In an annual report required by the coal ash rules and posted on the TVA website Friday, the TVA acknowledged that an underground area near its coal ash pond has no clay barrier. The report follows a 2018 U.S. Geological Survey study that found the two aquifers were connected and placed the connection near the leaking ash ponds, said Amanda Garcia, an attorney with the Southern Environmental Law Center.
The site set off alarm bells while TVA was testing water wells for a new natural gas power plant on the property. Pollution from the upper aquifer moved toward the lower aquifer during the testing.
TVA stopped using those wells and the utility plans to start pumping out and treating the contaminated groundwater at the Allen plant later this year, said Scott Brooks, a TVA spokesman. It is also studying the best way to stop the leaking and close its ash pond at Allen and other coal ash sites across the utility's seven-state system, he said.
No decision has been made yet on whether the ash can be safely managed on the spot or will need to be hauled away, Brooks said, adding that the utility intends to base the decision on the best available science.
For now, he stressed, none of the pollution documented under the plant had been detected in the city's drinking water supply.
For Banbury, the Memphis Sierra Club leader, that pollution is too close for comfort.
"This was something that nobody ever thought was a concern," he said. "Now it's a huge concern."
James Bruggers covers the U.S. Southeast, part of ICN's National Environment Reporting Network. He came to InsideClimate News in May 2018 from Louisville's Courier Journal, where he covered energy and the environment for more than 18 years.
By Amy Mayer • Feb 21, 2019
Amy Mayer / Harvest Public Media
Originally published on February 21, 2019 8:19 am
At Hummel’s Nissan in Des Moines, Kevin Caldwell sells the all-electric Leaf. Driving one is basically the same as driving a typical gasoline or gas-electric hybrid car, he said, except for a few new features like the semi-autonomous hands-free option. And the fact that you plug it in rather than pumping gas into it.
About a quarter to a third of Caldwell’s Leaf customers are farmers, some of whom grow corn for ethanol.
“You typically get two types of customers,” he said. “The customers that are more environmentally conscious and they want to lower their carbon footprint versus the customer that is simply wanting to not have to worry about gas prices.”
More than one million electric cars had been sold in the U.S. by the end of 2018, and while states with stricter emissions standards, such as California and Oregon, lead the way, plenty can be seen on the road even in central Iowa, where corn has long been king.
About 40 percent of the corn grown in the United States gets turned into ethanol for gasoline, but the demand to reduce fossil-fuel emissions could bring pretty drastic changes to agriculture in the Corn Belt. Think acres of corn replaced by perennial crops or the return of small grains such as oats and wheat.
Harvest Public Media's Amy Mayer tells how the electric car trend could change farming.
Lee Tesdell is all-in for reducing his carbon footprint. He began installing solar panels on his farm in Polk County, Iowa, in 2013.
“And all the time I was thinking that I would like to drive an electric car,” he said, “so that my solar electrical production would be part of the fuel for my electric car.”
In December, he bought an all-electric Chevy Bolt for his commute.
Government-backed ethanol has been a boon to corn farmers, but Tesdell said it’s led to overproduction, which in turn has kept corn prices low the past several years.
“I would like to see corn acres reduced by quite a lot in Iowa and more diversification,” Tesdell said.
He’s doing that himself, with a variety of conservation practices as well as raising sheep and growing alfalfa. He also has dozens of acres he rents out to a farmer who plants corn and soybeans. They’ve discussed other possible crops, such as industrial hemp or cereal rye for cover crop seed, but they’re still in the research stage. It’s risky and can be hugely expensive to change a crop rotation.
Transitioning away from corn and into alternative crops could have multiple benefits for farmers and the environment.
First, from a business standpoint, farmers need to anticipate market changes, according to Trevor Russell, water program director at Friends of the Mississippi River, an environmental nonprofit.
“Let's not be left holding the ball on 40 percent of our corn crop that doesn't have a market and can't get sold anywhere,” Russell said. “That would be an economic catastrophe for farmers and for rural communities that rely on agricultural economies and for, really, the entire Corn Belt.”
And secondly, different crops like kernza wheat or the oilseed camelina could help clean up polluted waterways and nurture depleted soils.
But Russell said it’s going to take the same types of government policies and incentives that expanded ethanol to establish viable markets for other crops. Without those, farmers – and their creditors – won’t see new crops as potentially profitable and won’t be able to invest in the necessary changes.
Russell is actively forming partnerships with agricultural groups and said his advocacy for water quality has morphed into an agriculture policy job as he works to help develop these new markets. He said he feels a sense of urgency, too, because even though no one can predict the timeline for the transition to electric cars, it’s clearly underway.
“Retooling our cropping system to respond to that is going to take longer than the transition to electric vehicles will take,” Russell said, “So we need to start yesterday.”
John Eichberger of the Washington, D.C.-based Fuels Institute attended the 2019 Iowa Renewable Fuels Summit to discuss what he foresees in the transition to electric cars. He told ethanol-industry experts not to panic … yet.
“We have people thinking, ‘Well, it's gotta be electric,’ or ‘It's gotta be liquid,’” Eichberger said of the future of cars. “It's probably going to be a combination for a long time because we're not going to transition 100 percent.”
That’s something Rand Faaborg knows well. He’s a welder by trade and commutes 20 miles one-way past acres and acres of corn fields every day in a used 2012 Nissan Leaf. But he also raises hogs and cattle, and charges the car in his barn.
Rand Faaborg drives his Nissan Leaf 40 miles a day and charges it overnight.
“I have several trucks, cars, tractors. I still use a lot of gas,” said Faaborg, who joked that no one predicted he’d be the first one in the work parking lot with an electric vehicle. He’s proud of his rural Iowa roots; his father-in-law and daughter also farm.
But, he said, “electric is the way to go. And electric will overtake all the automobiles very shortly, I’m convinced of that.”
Faaborg is equally confident farmers will be able to adapt to an eventual reduced market for corn, and in the end, he predicted, everybody will win.
This story was reported, in part, with support from the Institute for Journalism and Natural Resources.
By Jody Harrison @jodeharrisonHT
A TIDAL TURBINE array in the north of Scotland set a new world record for generating power and exporting it into the national grid.
Simec Atlantis Energy's Meygen four-turbine set-up in the Pentland Firth has generated 12 gigawatt hours (GWh) of electricity since it was switched on last April - enough to power almost 9,000 homes.
This beats a previous record set by Seagen in Strangford Lough, an inlet in County Down, Northern Ireland.
Simec Atlantis Energy's Chief Executive Tim Cornelius broke the news to staff, saying: "Meygen has now exported more than 12 GWh of tidal energy to the grid in Scotland, surpassing the previous record held by SeaGen in Strangford Lough (11.6 GWh). Congratulations to all involved. Onwards."
The Pentland Firth, between Caithness and the uninhabited island of Stroma, has some of the fastest flowing waters in the UK.
Simec's turbines are driven by the powerful flowing tides which surge through between the North Sea and the Atlantic Ocean each day.
Meygen is the largest active tidal turbine project in Britain, and is currently in phase one of a 25-year development cycle.
The first turbine was installed in 2017, with the others completed last year.
Each of the massive machines rests on foundations on the sea bed weighing between 250 and 350 tonnes, and is attached to six ballast blocks weighing 1,200 tonnes.
Environmental monitoring equipment is installed to assess any interaction between the tidal turbines and the marine environment, including marine mammals, while cables come ashore through tunnels drilled in the bedrock.
It is estimated the seas around the UK could one day be capable of generating 20 per cent of all electricity needs.
Last year the think tank Offshore Renewable Energy (ORE) Catapult said that the tidal stream industry could generate £1.4 billion for the UK and support 22,600 jobs by 2040, focused mostly in Scotland.
At the same time, wave energy could contribute £4 billion to the UK economy and support 8,100 jobs by 2040.
The final phase of the Meygen project, which already has planning permission, would see an additional 49 turbines built in the Firth at an estimated cost of £420m and the creation of an entire industry in the north to support them.
Simec Atlantis Energy say that the project will be "transformational" for the tidal energy industry, providing the necessary scale to justify the establishment of turbine manufacturing facilities at Global Energy’s facility in Nigg Energy Park.
The record-breaking turbines are the latest in a series of tidal power developments in Scotland, once described by former First Minister Alex Salmond as the "Saudi Arabia" of tidal power.
Last year a green energy project in Shetland became the world’s first fully operational grid-connected ‘baseload’ tidal power station, using cutting-edge batteries from international tech firm Tesla to enable a predictable supply of renewable energy to be fed into the network.
Hannah Smith, Senior Policy Manager at Scottish Renewables, said that news coming out of the industry showed that the technology worked.
She said: "This milestone for the tidal energy industry once again demonstrates the untapped potential of this emerging sector.
"Scotland’s remarkable marine energy resource has placed us front and centre in developing an industry with global ambitions.
"Simec Atlantis Energy’s achievement with Meygen is remarkable, and paves the way for marine energy to become a mainstream part of Scotland’s energy mix while cutting carbon and delivering jobs and investment to our remote communities.
"To keep driving progress it’s critical that both Scottish and UK governments recognise the potential of these technologies and work with industry to fully commercialise this promising sector."
Federal judges have again rejected a legal challenge from California and conservation groups opposed to border wall construction along the southern border.
The 9th U.S. Circuit Court of Appeals today tossed arguments that the Trump administration overstepped when the Department of Homeland Security fast-tracked barrier projects near San Diego and Calexico, Calif., and waived certain environmental laws for construction.
A three-judge panel found it lacked jurisdiction over some of the challengers' claims but could review some others.
Ultimately, though, the court sided with the government and found that the border projects — wall prototypes and new fencing — were within DHS's authority under the Illegal Immigration Reform and Immigrant Responsibility Act of 1996.
Feb 11, 2019, 09:46pm
Could tomorrow's homes be a by-product of our toilet habits?
Australian researchers make bricks from human waste
If you’ve been reading this column for a while, you’ll know that I am fascinated with redefining ‘waste’. From urine-powered fuel cells to thermal insulation made from chicken feathers, I love finding scientists and engineers who see opportunities in those objects the rest of us discard. Now added to that list is a team at Australia’s RMIT University, led by Associate Professor Abbas Mohajerani. He’s an expert in construction, pavements and roading materials, and for the past 15 years, he’s been exploring some… surprising ingredients.
I first came across Mohajerani’s work back in 2016. He had just published a paper ($) in the journal Waste Management, in which he suggested incorporating small quantities of cigarette butts into fired clay bricks. It would, his team argued, help solve a global littering problem – 1.3 million tons of cigarette butt waste is estimated to be produced globally every year. Mohajerani had first proposed the idea of using cigarette butts in bricks in 2008, and worked with an ashtray manufacturer and a Melbourne-based recycling program to develop the manufacturing method. Discarded cigarette butts were collected and disinfected, before being mixed (in varying quantities) into samples of clay-like sand. These were dried and fired at lower temperatures than used for traditional bricks. Because the properties of the cigarette-bricks varied depending on their composition, the results were a bit of a mixed bag – some were too weak to be used in construction, while others had good compressive strength.
The following year, Mohajerani and his team encapsulated cigarette butts in bitumen ($) and added them to traditional asphalt mixes used to build roads. Some of these samples worked really well, satisfying “…the requirements for light, medium and heavy traffic conditions.” The team also say that the addition of butts would lower the road’s thermal conductivity, reducing the urban heat island effect. I didn’t see any direct measurements of thermal conductivity in the paper, though, so I’ll reserve judgment on that particular claim.
As far as I can tell, Mohajerani’s work on cigarette butts continues apace, but it’s the latest (open-access) paper from his team that really caught my eye. Published in the January issue of Buildings, it reports on the use of ‘biosolids’ in bricks.
Biosolids are officially the dried, disinfected leftovers of wastewater (also known as sewage) treatment. Unofficially, they are concentrated human poop that has been very, very carefully treated.
I’ve previously written all about the many processes wastewater goes through to get to this stage. Biosolids can be reused as fertilizer, or as I found at Crossness, they can even be turned into fuel to produce electricity, but often, biosolids are the ‘end point’ of wastewater treatment. As our populations grow, so too does the amount of poop we produce, and it’s beginning to cause problems. In Australia in 2017, 6% of biosolids produced – that’s just over 20,000 tonnes – were landfilled, stockpiled or discharged to the ocean. And the New York Times suggests that in the US, “…nearly a third of the 7 to 8 million tons of biosolids produced each year still end up in landfills.”
This was the context in which Abbas Mohajerani started exploring the potential for biosolids in construction projects. And just like with cigarette butts, he started with bricks.
You might not think of bricks as being all that common in modern construction practices, but approximately 1,500 billion of them are produced globally every year. That takes a lot of clay soil – I’d estimate enough to fill London’s Wembley Stadium 1,125 times*. So, in recent decades, there’s been a push to incorporate various waste products into bricks – everything from sawdust to rice husks – to reduce their environmental footprint. For their waste product, Mohajerani and his team went to two local wastewater treatment plants. There, they removed samples of biosolids from three existing stockpiles (excitingly named ETP22, WTP10 and WTP17–29), took them back to the lab, and mixed them with clay soil. Four mixes were used – 10, 15, 20 and 25% biosolids – and these were dried and fired in the same manner as traditional bricks. The team tested the physical, chemical and mechanical properties of their poop-bricks, and compared them to traditional, 100% clay bricks.
Some of the results were hugely promising – for example, all of their biosolid-bricks showed lower thermal conductivity than the control bricks, which suggests they could help provide a level of thermal insulation. The researchers also found that, thanks to the high organic material content in the poop-bricks, they could be produced with less energy. In one case, bricks containing 25% of the WTP17–29 biosolids required almost half as much energy to fire as the 0% biosolids bricks.
The organic content did have a downside, though. Because it tended to be burned off during the brick-firing process, the poop-bricks contained pores. This reduced their density and increased their shrinkage when compared to the control bricks. It also made them weaker, although poop-bricks containing 10% to 25% biosolids all showed compressive strengths above the level considered acceptable for most low-rise buildings (12.04 to 35.5 MPa, versus a 5 MPa minimum).
In their life-cycle analysis of the brick production processes ($), Mohajerani and his colleagues describe the incorporation of biosolids into bricks as “environmentally favorable”, saying that it “…significantly reduces all negative environmental impacts when compared to control bricks, with the exception of water depletion.” The distance between the biosolids stockpile and the brick manufacturing plant will also have an impact on the bricks’ ‘green’ potential – the closer they are, the better. “Otherwise,” said Mohajerani, speaking to the New York Times, “I don’t think it is likely on a large scale in the near future.”
So, while still far from coming to a house near you, I still think biosolid-bricks are a topic worth keeping an eye on. Another thing that caught my eye is some recent news from the University of Cape Town. There, engineers have been producing bricks made with human urine. These bricks don’t need to be fired in an incredibly hot oven, however – they form at room temperature via a natural process called microbial carbonate precipitation. First up, a particular strain of bacteria is added to loose sand, where it produces an enzyme called urease. This enzyme can break down urine, while also producing calcium carbonate that acts as a cement, solidifying the sand into bricks. The longer the bacteria is allowed to do its work, the stronger the resulting brick can be**.
So, next time you find yourself bored in the loo, think about the walls around you. One day, you might be helping to manufacture them.
* According to this site, it takes three cubic meters to produce 1000 bricks - this differs slightly from the numbers used in the paper, but I went with it. Anyway, scaling that up to our annual production of 1500 billion bricks, and we get to 4.5 x 10^12 liters of clay soil. The volume of Wembley Stadium has been estimated to be in the range of 4,000,000,000 liters. Divide one by the other, and you get 1125.
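The footnote's back-of-the-envelope numbers can be checked with a few lines of Python. The inputs (3 cubic metres of clay per 1,000 bricks, 1,500 billion bricks per year, a Wembley volume of roughly 4 billion litres) are the figures the footnote itself assumes, not independently verified values:

```python
# Check the Wembley Stadium estimate from the footnote.
# Assumed inputs (from the article, not verified independently):
clay_m3_per_1000_bricks = 3                   # ~3 m^3 of clay soil per 1,000 bricks
bricks_per_year = 1_500_000_000_000           # ~1,500 billion bricks annually
wembley_litres = 4_000_000_000                # Wembley volume estimate, ~4 x 10^9 L

clay_m3 = bricks_per_year / 1000 * clay_m3_per_1000_bricks
clay_litres = clay_m3 * 1000                  # 1 m^3 = 1,000 litres

print(f"Clay used per year: {clay_litres:.1e} litres")            # 4.5e+12 litres
print(f"Wembley Stadiums filled: {clay_litres / wembley_litres:.0f}")  # 1125
```

The arithmetic does come out to the 4.5 × 10^12 litres and 1,125 stadiums quoted above.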
** Some of the researchers involved in the pee-brick project published a related paper ($) in August 2018, but they used synthetic urine, their bacteria was Sporosarcina pasteurii, and their products were columns, rather than bricks. As a result, I can’t be sure that their process has remained exactly the same, as the latest output was a press release rather than a paper. But if I find out, I’ll let you know!
Peter Behr, E&E News reporter Energywire: Monday, February 25, 2019
If the goals of the "Green New Deal" are a political minefield, so, too, are the most likely strategies for reaching its target of very high national levels of renewable energy output.
A shelf of authoritative studies under the Department of Energy's sponsorship dating back to George W. Bush's presidency define how to take a big step in that direction. Their answer — build a network of long-distance, ultra-high-voltage transmission lines to widely share wind and solar power across the continent's time zones.
But the strategy has faced overpowering headwinds of not-in-my-backyard opposition from residents and not-through-my-state political pushback. It's also been rare for Congress to put aside partisan politics and pass major legislation facilitating transmission corridors.
"If you're going to do a 100 percent clean energy portfolio — that is really 70 to 80 percent of electric power from renewables — I don't know how you avoid huge transmission builds," said Richard Sedano, president of the Regulatory Assistance Project, a nonprofit, nonpartisan think tank advocating a clean energy future. "It's either that or overbuilding the system so much with surplus renewables and batteries" that consumers will be hammered.
"I don't see how you have a national clean energy standard without significant federally mandated or incented transmission build cutting across regions of the country," added Travis Kavulla, a former Montana utility commissioner and president of the National Association of Regulatory Utility Commissioners, now with the R Street Institute in Washington, D.C.
DOE's National Renewable Energy Laboratory (NREL) issued the Eastern Wind Integration and Transmission Study in 2010, with strategies to provide 20 percent of the electricity supply east of the Rocky Mountains from wind energy by 2024. It counseled: "The integration of 20 percent wind energy is technically feasible, but will require significant expansion of the transmission infrastructure and system operational changes."
The most detailed of the analyses is NREL's ongoing Interconnections Seam Study based on massive computer simulations of power flows. It outlined one scenario with three ultra-high-voltage direct-current lines spanning the Rocky Mountains to the Mississippi River, with other new lines moving western power eastward.
Because grid operators can control the direction of power flows on direct-current lines, surplus afternoon solar power from the Southwest could stream into Southeastern states at dusk. Other lines could ship unused wind energy from the Great Plains into major cities in the Great Lakes and East Coast regions, or the other way into California.
By linking time zones, the variability of wind and solar power at different times of day becomes a strength, not a weakness, explained project leader Aaron Bloom.
The scenario would require an estimated $70 billion for new power lines and nearly $700 billion for new power plants through 2038. A carbon tax on power plant CO2 emissions would add $163 billion in costs. But considering savings on fuel and operations as wind and solar expand, overall benefits are three times greater than the costs, NREL estimated.
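For readers keeping score, the scenario's cost components as quoted above can be tallied in a couple of lines. Grouping them into a single total is my own illustration, not a figure NREL reports:

```python
# Sum the NREL Interconnections Seam scenario costs quoted in the article
# (billions of US dollars, through 2038). The combined total is illustrative.
costs = {
    "new transmission lines": 70,
    "new power plants": 700,
    "carbon tax on plant CO2 emissions": 163,
}
total = sum(costs.values())
print(f"Quoted scenario costs: ${total} billion through 2038")  # $933 billion
```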
Coal plants nearly disappear by 2038 and wind and solar deliver 40 percent of the nation's electricity under the plan. Nuclear plant output stays at today's levels. Gas-powered generators increase their output slightly, providing essential quick ramping response to back up wind and solar power variations (Energywire, July 27, 2018).
An even more ambitious national grid has been proposed by Alexander "Sandy" MacDonald, former director of NOAA's Earth System Research Laboratory. MacDonald led a team that analyzed 50 billion pieces of weather data from across the country to demonstrate the value of a transmission "supergrid" built along interstate highways to move surplus wind and solar into distant urban centers.
The completed network would provide 38 percent of the nation's electricity output from wind and 17 percent from solar.
"Doing a robust electrical system isn't going to break us. It might raise the average electricity cost in the U.S. from 10 cents [per kilowatt-hour] to 11.5 cents and create 6 million permanent jobs," MacDonald said (Energywire, Dec. 19, 2017).
Clues from past fights
Outside these transmission studies is the reality of how Texas became the nation's biggest producer of wind power.
Transmission was a key factor in the addition of more than 20,000 megawatts of wind generation capacity in the Electric Reliability Council of Texas area, covering 90 percent of the state's households. Texas officials approved a plan in 2006 to build 2,400 miles of power lines to deliver wind generation to the state's population centers. That new grid is also delivering a growing output of solar power, an unexpected benefit, says Americans for a Clean Energy Grid.
But Texas' leap to wind power rested on a bipartisan political marriage of legislators. On one side were rural Texans in the wind-rich, thinly populated Panhandle region who were eager to earn royalties from wind farm developers. On the other were clean energy lawmakers representing metropolitan areas.
A consensus like that happened only once on the national level, when a Republican-controlled Congress passed the 2005 Energy Policy Act in reaction to the stunning Aug. 14, 2003, power failure that knocked out power to 50 million people between New York and Detroit.
The act was hailed as a monumental legislative undertaking, taking 213 recorded votes in the Senate and 373 in the House before the final compromise was hammered out and signed by Bush. It was supported by 90 percent of the House Republicans and 20 percent of the Democrats. It was saluted by supporters as the first comprehensive energy bill in 10 years — and that was more than 13 years ago.
The law set up a two-part process. First, the Energy Department would identify "National Interest Electric Transmission Corridors" where electric supply was threatened by overcrowded transmission lines. Congress also authorized the Federal Energy Regulatory Commission to step in and approve rights of way for newly proposed power lines within a corridor if a state government along the route withheld approval of a project for more than one year.
DOE drew two corridors running through the Mid-Atlantic and Southwestern states, prompting environmental lawsuits and protests that building more high-voltage lines would actually help coal-fired generation expand. Opponents said FERC's ability to intervene disappeared if a state rejected — not delayed — a project within the one-year time frame, and the 4th U.S. Circuit Court of Appeals agreed in 2009 by a 2-to-1 vote.
Then DOE's corridor designations were invalidated by the 9th U.S. Circuit Court of Appeals in 2011. By that time, a strong lobby was at work in Congress arguing for states' rights and pushing to block any new move for national transmission projects.
"If there is a national clean energy standard, that is necessarily a usurpation over state prerogatives of state resource mix," Kavulla said in an interview.
Similarly, a national plan now must overcome a deepening red state-blue state regional split over climate and clean energy policies.
New York provides an example of a state that is pinning its future on new wind and solar energy without a comparable commitment to moving the power, some analysts say.
In fiscal 2018, the New York State Energy Research and Development Authority pledged $360 million to attract $1 billion in private investment to fund 11 large-scale renewable energy projects in the state. By 2020, the energy authority expects to be funding 35 additional renewable projects. The state's plans include 800 MW of offshore wind.
The state has three transmission projects in the works to move renewable power from the Niagara Falls region to Albany and down to New York City. But the only mention of new projects in the 2019 budget of Gov. Andrew Cuomo (D) is a promise to transform the state's transmission grid "to a distributed smart grid network."
The Cuomo plan proposes to help keep power flowing when the sun is down and the wind is still by developing 1,500 MW of energy storage by 2025, a similar vision of how "Green New Deal" advocates look to battery backup for renewables.
Former Energy Secretary Ernest Moniz issued a report with energy historian Daniel Yergin this month arguing that the next round of large wind and solar power production over the next two decades requires more production of natural gas to fill in when those resources aren't available. Batteries don't have the capacity to do that in all circumstances, Moniz said in an interview.
"When we talk about [battery] storage, what we see being deployed today is generally storage for two, four, maybe six hours max. That's very important for handling the daily variations you have in solar and wind. But what about the week when the wind doesn't blow, for example, or in some cases the month?"
Raising renewable energy output over the next several decades will rest on current technologies, Moniz said. To go even further after that, carbon emissions from gas plants must eventually be captured, stored underground or absorbed in new chemical processes that haven't been invented yet, he added.
Releases and leaks of methane, the primary component of natural gas and a potent global warming agent, also must be controlled, Resources for the Future warns (Energywire, Feb. 8).
On the battery issue, Sen. Lisa Murkowski (R-Alaska), chairwoman of the Senate Energy and Natural Resources Committee, has also warned of U.S. dependence on imported minerals required for current lithium-based batteries used for utility storage and electric vehicles.
Simon Moores, managing director of Benchmark Mineral Intelligence, told the committee earlier this month that expected global increases in battery use will require an eightfold increase in lithium production worldwide and steep gains in output of cobalt, graphite and nickel. The United States imports 100 percent of its cobalt and 92 percent of its current lithium requirements, Moores said.
Researchers all over the country are striving to create technologies for better-performing, cheaper batteries with less vulnerable materials, but the chances for success are big question marks.
The "how to do it" political challenge for the "Green New Deal" is compounded by the most profound operating changes in the U.S. electric power sector since the era of Thomas Edison.
Jeffrey Taft, chief architect for grid transformation at the Pacific Northwest National Laboratory, has pointed out that the flow of electricity across the power grid and into Americans' homes and offices is getting inexorably faster and more complicated as rooftop "distributed" solar energy expands and "smart" appliances proliferate and alter supply and demand for power at minute levels.
"A future with potentially 30 per cent of the U.S. installed resource capacity coming from distributed resources and customer participation requires a different physical distribution system than exists today," Taft wrote in 2016.
Most of the current operators of the complex U.S. power networks grew up in a familiar system where operators had windows of five minutes or longer to react to a sudden outage on a big power line or an unexpected nuclear plant scram.
To manage the much more complex power flows of the future, with increasing amounts of fluctuating wind and solar power, computers must react to changes in grid conditions 30 to 60 times a second in some cases, much too fast for humans, Taft wrote.
Even a 30 percent renewable power output throughout the synchronized eastern grid would require instantaneous management of power flows from Montreal to Miami and as far west as New Mexico, DOE experts have noted.
Creating operating controls able to handle huge data volumes safe from hacking attacks, with embedded artificial intelligence and machine learning, is another huge challenge of innovation and investment, charted in studies by the Electric Power Research Institute.
Where will the investment come from?
Some "Green New Deal" advocates say the federal government must take the lead, as it did in fighting unemployment during the Great Depression and in fighting Germany and Japan in World War II.
Moniz argues that the formula for success in transforming the grid has to recognize the unique, essential roles that utilities and private-sector investors will play in remaking a power grid with thousands of utilities and multiple thousands of power plants and miles of power lines.
Putting the nation's innovation machine into overdrive in this cause requires both existing power companies and upstart inventors, he said.
"The large incumbents, that's where the customer bases are, where the supply chains have been built. They understand their public good responsibilities — things like universal service," Moniz said. They live with regulations created to protect the public interest.
"Disrupters come in with very, very different business models, but they are going to have to fit into that system of reliable service, as well," he said.
Sedano, a veteran former utility regulator, said: "'Social license' is a term we sometimes use when we're trying to sound very intelligent. It means whether the public is behind the government in what it is trying to do.
"We haven't even tried to get that from the public" in the debate over the climate threat, he said.
"So with the question of whether it's possible, of course it is," he said. "Is it likely? That's a better question."
FEBRUARY 14, 2019 / 3:29 PM
WASHINGTON (Reuters) - The U.S. Transportation Department on Thursday issued final rules requiring railroads to develop oil spill response plans and to disclose details of shipments to states and tribal governments after a series of high-profile incidents.
The department’s Pipeline and Hazardous Materials Safety Administration said the rules, first proposed in July 2016, would “improve oil spill response readiness and mitigate effects of rail accidents and incidents involving petroleum oil and high-hazard flammable trains.”
The new regulation “is necessary due to expansion in U.S. energy production having led to significant challenges for the country’s transportation system,” the agency added.
The new rules apply to High Hazard Flammable Trains transporting petroleum oil in a block of 20 or more loaded tank cars and trains that have a total of 35 loaded petroleum oil tank cars. They require railroads to establish geographic response zones and ensure that personnel and equipment are staged and prepared to respond in the event of an accident.
The new rules take effect in August and come after regulators reviewed more than a dozen oil car derailments from 2013 through 2016.
The rules partially address recommendations made by the National Transportation Safety Board after a 2013 crude-by-rail derailment killed 47 people in the town of Lac Megantic in Quebec and released 1.6 million gallons of crude oil.
“This new rule will make the transport of energy products by railroad safer,” said Transportation Secretary Elaine Chao in a statement.
With the natural gas fracking boom, plastics production is spreading in the Ohio River Valley. But at what cost to health and climate?
By James Bruggers
Feb 25, 2019
MONACA, Pennsylvania — Along the banks of the Ohio River here, thousands of workers are assembling the region's first ethane cracker plant. It's a conspicuous symbol of a petrochemical and plastics future looming across the Appalachian region.
More than 70 construction cranes tower over hundreds of acres where zinc was smelted for nearly a century. In a year or two, Shell Polymers, part of the global energy company Royal Dutch Shell, plans to turn what's called "wet gas" into plastic pellets that can be used to make a myriad of products, from bottles to car parts.
Two Asian companies could also announce any day that they plan to invest as much as $6 billion in a similar plant in Ohio. There's a third plastics plant proposed for West Virginia.
With little notice nationally, a new petrochemical and plastics manufacturing hub may be taking shape along 300 miles of the upper reaches of the Ohio River, from outside Pittsburgh southwest to Ohio, West Virginia and Kentucky. It would be fueled by a natural gas boom brought on by more than a decade of hydraulic fracturing, or fracking, a drilling process that has already dramatically altered the nation's energy landscape—and helped cripple coal.
But there's a climate price to be paid. Planet-warming greenhouse gas emissions from the Shell plant alone would more or less wipe out all the reductions in carbon dioxide that Pittsburgh, just 25 miles away, is planning to achieve by 2030. Drilling for natural gas leaks methane, a potent climate pollutant; and oil consumption for petrochemicals and plastics may account for half the global growth in petroleum demand between now and 2050.
Despite the climate and environmental risks, state and business leaders and the Trump administration are promoting plastics and petrochemical development as the next big thing, more than three decades after the region's steel industry collapsed and as Appalachian coal mining slumps.
"We have been digging our way out of a very deep hole for decades," said Jack Manning, president and executive director of the Beaver County Chamber of Commerce.
"When Shell came along with a $6-to-$7 billion investment ... we were in the right spot at the right time," he said.
Everyone wants jobs and economic growth, said Cat Lodge, who works with communities in the Ohio River Valley affected by the shale gas industry for the Environmental Integrity Project, a national environmental group. But not everyone wants them to be based on another form of polluting, fossil fuels, she said.
"While the rest of the world is dealing with global warming, Pennsylvania and Ohio and West Virginia are embracing developing plastics, and that just appalls me," Lodge said. "It's just not something I see as the future and unfortunately that seems to be the push to make that the future. And that's upsetting."
Lodge and her husband moved from Pittsburgh to the countryside 18 years ago in search of fresh air and open land. They have a small farm in a corner of rural western Pennsylvania, where winding roads trace the contours of Appalachian hills and a stark transition fueled by a shale gas boom is underway.
"We still love it, but little by little, and quickly over the last several years, we have become totally surrounded by the oil and gas industry," she said.
Rising Demand, but Also Pushback on Plastics
The natural gas that's pulled from deep underground in the Utica and Marcellus shale formations has done more than outcompete coal for electricity generation.
Drilling companies have also extracted a lot of natural gas liquids, particularly ethane, also called wet gas. It's used to produce ethylene, which then gets turned into plastics, providing an additional revenue stream for the oil and gas industry. It's the industry's latest play, and it comes at a time when industry analysts and the federal government say the demand for plastics is skyrocketing.
"These materials are hooked into just about every part of the economy, from housing to electronics to packaging," said Dave Witte, a senior vice president at IHS Markit, a global data and information service. "Today, the world needs six of these plants to be built every year to keep up with demand growth."
IHS Markit calls the Appalachian or upper Ohio River region "the Shale Crescent." Last year, it reported that the region's gas supplies could support as many as five large cracker plants, like the one Shell is building. The plants "crack" ethane molecules to make ethylene and polyethylene resin pellets and would be in close proximity to a number of manufacturers that use those products to make everything from paints to plastic bags.
IHS does see some headwinds, including an international backlash against plastics. It published a report last summer that found that worldwide pressure to reduce plastic use and increase recycling was one of the biggest potential disruptors for the plastics industry and was "putting future plastics resin demand and billions of dollars of industry investments at risk."
The oil and gas industry might find themselves with stranded assets, needing to abandon Ohio River valley communities, said Lisa Graves-Marcucci, a Pennsylvania-based organizer for the Environmental Integrity Project.
"Do they really care," she asked, "if they can make money for the first 10 years or 20 years of their operation, but then plastic goes away in the world? What happens to the communities that are left behind?"
She said she is also worried about such a major investment in oil and gas as the world grapples with the effects of climate change.
Visions of an Appalachian Plastics Hub
The idea for a plastics hub in Appalachia got a lift in December with a report to Congress from the U.S. Department of Energy. It described a proposal for the development of regional underground storage of ethane along or underneath the upper Ohio River.
Storage is needed to help provide a steady and reliable stream of ethane to ethane cracking plants, and it would be important for the development of a regional petrochemical complex in the upper Ohio River valley, the report concluded.
Storage is another growing part of the plastics pipeline as natural gas is turned into natural gas liquids and eventually into plastics.
A West Virginia business, Appalachia Development Group LLC, has proposed developing storage for ethane, possibly in mined salt or limestone caverns deep underground. It's in the second phase of an application process for $1.9 billion in loan guarantees from the Department of Energy for the project, according to the department.
"We have sites of interest in Pennsylvania, Ohio and West Virginia," said Jamie Altman, a representative of Appalachia Development Group. "We are aggressively pursuing private capital."
The Energy Department is thinking big, too.
Its report projects ethane production in the Appalachian basin would continue rapid growth through 2025 to a total of 640,000 barrels per day, more than 20 times greater than five years ago. By 2050, the agency said ethane production in the region is projected to reach 950,000 barrels per day.
China Energy signed an agreement with West Virginia in 2017 to potentially invest $84 billion in shale gas development and chemical manufacturing projects in the state. Late in January, West Virginia's development director, Mike Graney, told state senators that China Energy was looking at three undisclosed "energy and petrochemical" projects. An announcement could be made later this year, he said, though President Donald Trump's trade war with China was causing delays.
Other experts see a natural gas industry that's subject to booms and busts and question whether the region is headed down another unsustainable path, like coal.
"We are less optimistic than the industry that this will really boom out," said Cathy Kunkel, an energy analyst with Institute for Energy Economics and Financial Analysis, an environmental think tank that just published a report detailing how the natural gas industry in West Virginia hasn't lived up to earlier expectations for jobs and tax revenue.
There is a huge amount of international competition for plastic production, she said. "All of the major oil exporting countries in the Middle East are talking about making massive investments in petrochemicals over the next five years or so," she said. "That contains the risk that you will be exporting into a market that would be oversaturated with products."
IHS Markit, a global data and information service, published a report last summer that said worldwide pressure to reduce plastic use and increase recycling was one of the biggest potential disruptors for the plastics industry and was “putting future plastics resin demand and billions of dollars of industry investments at risk.” Credit: Rosemary Calvert via Getty Images
The Energy Department report also cited "security and supply diversity" as a benefit of developing a new plastics and petrochemicals hub in Appalachia. The bulk of U.S. plastics and petrochemical plants are currently along the Gulf Coast, where they face supply disruptions caused by hurricanes, it said.
Vivian Stockman, the interim director of the Ohio Valley Environmental Coalition based in West Virginia, called that a "hugely ironic" justification for an Appalachian plastics hub, since science is showing that global warming can intensify hurricanes.
Economic Benefits, with Health Concerns
The Shell plant was lured to Beaver County by Pennsylvania officials with some $1.65 billion in tax incentives. It's scheduled to open "early next decade," company spokesman Ray Fisher said. This year, as many as 6,000 construction workers will be working on it, and Shell says it plans 600 permanent jobs to run the plant.
It's in Potter Township, a community with fewer than 700 residents. Rebecca Matsco, who chairs the township commission that gave Shell the local zoning permits, said she sees the plastics plant as an industrial upgrade from a dirty zinc smelter that had stood on the property for about a century, and that Shell cleaned up.
"It had become a real environmental burden, and we do feel like Shell has been a real partner in lifting that burden," Matsco said.
Others, however, see the cracker plant as its own environmental burden—a new source of emissions that cause lung-damaging smog and heat the planet.
Gas processing plants like this MarkWest plant in Butler County, Pennsylvania, separate natural gas liquids from natural gas. Credit: James Bruggers
People in Pittsburgh were sad to see so much of the steel industry go, but they don't miss the dirty skies, said Graves-Marcucci, an Allegheny County resident. The economic resurgence that followed was centered around health care, academic institutions and cleaner industries, she said.
Pittsburgh has been brushing off its sooty steel city past and is now pledging to slash its carbon emissions. But the Shell cracker plant alone, just 25 miles away, would emit 2.25 million tons of carbon dioxide a year, effectively wiping out nearly all the gains in carbon reduction that Pittsburgh plans to achieve by 2030, said Grant Ervin, Pittsburgh's chief resilience officer.
The Shell plant will also emit as much smog-forming pollution as 36,000 cars driving 12,000 miles a year; that would equate to about a 25 percent increase in the number of cars in Beaver County, said James Fabisiak, an associate professor and director of the Center for Healthy Environments and Communities at the University of Pittsburgh.
The environmental and health threats will only increase with a plastics hub buildout, and no regulators are looking at those potential cumulative impacts, Graves-Marcucci said.
Two More Communities Could Get Cracker Plants
About 70 miles southeast of the Shell plant, another community waits for news about what could be the region's second major ethane cracker plant, in Belmont County, Ohio.
PTT Global Chemical, based in Thailand, and its Korean partner, Daelim Industrial Co., Ltd., could announce any day whether they intend to proceed with an ethane cracker plant after getting state permits in late December. That plant would be along a section of the Ohio River in Belmont County where hulking old manufacturing plants and shuttered businesses paint the very picture of the nation's Rust Belt.
Bellaire, Ohio, is a few miles from another proposed cracker plant. Belmont County officials are waiting to hear whether PTT Global Chemical, based in Thailand, and its Korean partner are going to invest $6 billion to build the facility. Credit: James Bruggers
"Do you know what the biggest export is from Belmont County? Our youth," said Larry Merry, an economic development officer with the Belmont County Port Authority, overlooking the Ohio River bottomlands where the cracker plant would be constructed on the cleared-away site of a former coal-fired power plant.
Merry, who has been working to secure the plastics plant, called the oil and gas industry "a great employer for us that's provided a lot of investment that's helped."
But it's not fully made up for losses in steel and coal, and this cracker plant "is about jobs and opportunities so people can make the most of their lives," he said.
He brushed aside any concerns about climate change or too much plastics. "How are we going to live and have products? Until you come up with a solution, don't expect the world to shut down," he said.
A spokesman for PTT American said he could not say when an investment decision will be made.
A third potential cracker plant is planned for Wood County, West Virginia, but it has been delayed because of unspecified "challenges" with its parent company, the Department of Energy report said.
"It just blows my mind that there could be three or four cracker plants, or even one," said Steve White, a western Pennsylvania builder. "That's some serious investment. It just shows you where everything is headed and how much development is coming."
White is also a pilot, and he said he has observed from the cabin of a Cessna 3,000 feet aloft the spread of oil wells, pipelines and processing plants across shale drilling zones in Pennsylvania, Ohio and West Virginia, slicing up farms and encroaching on homes, schools and businesses.
"We are just in the way," he said.
By Benjamin Storrow and Edward Klump, E&E News reporters · Energywire: Thursday, February 28, 2019
The fate of two coal-fired units at the San Juan Generating Station is a subject of debate in New Mexico. PNM Resources
President Trump has tried and failed to revive the coal industry.
Now, a series of coal-reliant states and communities are stepping up to try to save the plants and mines that have long powered their economies.
The moves come after a near-record number of coal retirements in 2018 and amid warning signs that several of the country's largest coal facilities may be in danger of closing (Climatewire, Jan. 2).
The trend is particularly evident in the western United States, where most utilities operate as regulated monopolies and decisions over power plants' fate are subject to the approval of state regulators.
In Montana, lawmakers are considering a bill that would allow a local utility to increase its ownership stake in the Colstrip power plant, in a bid to prolong the life of one of the West's largest coal facilities.
Wyoming legislators have approved a bill requiring utilities to market their coal plants to potential buyers before shutting them down (Climatewire, Feb. 15). And in New Mexico, a small city is turning to a New York-based holding company in an attempt to prevent closure of San Juan Generating Station, one of the state's largest coal plants.
"The decline in coal is still largely market-based because of low gas prices. But you've also seen this shift away from utilities for various reasons, like state-level policies and EPA policies like ELG [effluent limitations guidelines] or the coal ash rule," said Joe Aldina, director of coal analytics at S&P Global Platts. "In the absence of things getting done at the federal level, it makes sense for some states to move forward with policy."
'The divide that is happening'
Coal plants have long served as the backbone of the West's electric grid but are often the single largest source of carbon pollution in their respective states.
Colstrip alone is a roughly 2-gigawatt facility, capable of producing enough electricity to power 1.5 million homes from its perch on Montana's eastern prairie. It also posted carbon emissions of 14.4 million tons in 2016, or almost half of the 30.5 million tons emitted in all of Montana that year, according to federal figures.
That has prompted a rift in recent years between coal-reliant communities and their increasingly green consumers in neighboring states.
"It is the divide that is happening in our country," said Anne Hedges, deputy director of the Montana Environmental Information Center, which has long argued for moving away from coal to renewables. "It is the old-school people who don't believe in climate change, who like revenue, who believe the only costs are the subsidies to wind and solar, versus those who believe in climate change, those who believe in market forces and those who believe in reasonable utility bills."
Washington state utilities have long been a principal buyer of electricity generated at Colstrip.
Lawmakers in Olympia, the Washington capital, are now advancing a bill to completely decarbonize the state's electricity sector and pull the plug on all coal-generated power by 2025 (Climatewire, Feb. 12).
PacifiCorp, a power company serving six Western states, is also a Colstrip owner that serves part of Washington. But the Portland-based utility finds itself caught between climate-conscious Oregon, where lawmakers are considering a carbon cap-and-trade bill and already passed a law to speed the transition from coal, and Wyoming, where much of its power is generated.
The divide is also on display within New Mexico, where a push from newly elected Gov. Michelle Lujan Grisham (D) to bolster renewables and slash emissions has substantial support. But provisions that could affect legacy generation have met resistance from coal-reliant communities like Farmington, which is near the San Juan plant and the Four Corners power plant.
"I want children to be able to go to school here in New Mexico and graduate from college and want to stay in New Mexico," said Farmington Mayor Nate Duckett. "And right now, they're leaving in droves."
The city recently signed a letter of intent with Acme Equities LLC to allow the parties to work toward an agreement to keep the San Juan operating beyond a potential 2022 retirement date.
Closing the 847-megawatt plant could affect hundreds of workers at the plant and a nearby mine, though some could retire, move to other jobs or stay on for a time. City estimates project that the proposed deal could save 1,600 jobs, although that figure appears to include both direct and indirect positions. The city's efforts to seek a new plant owner stem from Farmington's current position as a partial owner.
Farmington's moves caught the plant's majority owner off guard.
Public Service Company of New Mexico (PNM), a regulated utility with plans to be coal-free by the end of 2031, issued a statement saying it was "surprised" by the announcement from Farmington and Acme. The utility said it would continue to work with other San Juan owners "in accordance with their agreements to plan for a shutdown of the coal plant in 2022, subject to necessary regulatory approvals."
Wyoming lawmakers were concerned PacifiCorp might take a similar stance regarding its four power plants in the state after the company released a study last year showing that most of its coal plants were uneconomical (Climatewire, Dec. 5).
The resulting bill would require the company to market a plant it attempts to retire early. If a buyer can be found, the utility would be required to buy back the power from its former plant. The power company would be able to earn a rate of return on the contract, provided it didn't raise costs for Wyoming ratepayers.
Critics nonetheless doubt the measure will avoid a spike in rates. It will ultimately be up to Wyoming regulators to decide whether the contract comes at an additional cost, they note.
Jason Shogren, an economics professor at the University of Wyoming, likened the state and local efforts to social programs like Medicare and Medicaid, state-sponsored efforts to alleviate fallout from market trends. He argued that Wyoming consumers are likely to pay more for energy to blunt the economic devastation wrought by coal's decline.
"These transitions — the Rust Belt was a transition, Detroit is a transition — don't offer a buffer because the market is trying to find the least-cost option," he said. "The question is, do you want to chip in the form of higher energy costs to help your friend and neighbor?"
It's a persistent concern. In New Mexico, the Sierra Club recently highlighted an analysis conducted for it by Synapse Energy Economics suggesting renewables, batteries and energy efficiency could be the least-expensive option for PNM to replace coal-fired power from San Juan.
"We have to keep up with global economics," said Camilla Feibelman, director of the Rio Grande Chapter of the Sierra Club. "We have to stop global climate change. And we have to help New Mexico transition into the renewable future."
Similar concerns have been raised in Montana, where lawmakers are debating a bill that would enable NorthWestern Energy to increase its ownership stake in Colstrip.
The power company is one of six with an ownership interest in Colstrip. Four of those owners serve out-of-state customers and now appear set for an early exit from the facility, as part of efforts to satisfy new or proposed laws in Oregon and Washington.
That leaves two utilities. One is Talen Energy, a merchant electricity provider and the plant's operator. The other is NorthWestern, a local utility with a 30 percent stake in one of the plant's four units.
The bill under consideration by Montana legislators would allow NorthWestern to buy its peers' shares for $1, but obligate Montana customers to pay for all future costs associated with the plant.
The state's Consumer Counsel has objected to that provision. In testimony to lawmakers, Jason Brown, an official in the counsel's office, warned that it could shift all the future risks associated with the plant to Montana ratepayers.
NorthWestern supports the bill, saying there is a growing need for reliable sources of power in the face of coal retirements across the West. Jo Dee Black, a spokeswoman for the utility, said it would only buy enough power to serve its reliability needs.
The bill faces an uncertain future. The proposal has divided Democrats and Republicans alike, and its fate before Gov. Steve Bullock, a Democrat who has supported efforts to help Colstrip in the past, is unclear. Bullock is now mulling a presidential run, and an attempt to save a large coal plant likely would not play well with Democratic primary voters, observers said.
A Bullock spokeswoman did not return multiple requests for comment yesterday.
Farmington's efforts in New Mexico to keep San Juan are similarly fraught.
State lawmakers are now considering the "Energy Transition Act," a bill aimed at greening the state's grid and supported by Lujan Grisham. For an investor-owned utility such as PNM, the renewable standard would climb to 50 percent by 2030 and 80 percent by 2040. The legislation also could enable funding for worker assistance and help shape potential replacement resources as well as bonds that could be issued in light of the planned closing of San Juan.
PNM has signaled it is supportive of the legislation.
"The Energy Transition Act takes PNM out of our comfort zone, but we also know the change New Mexico needs and our customers have asked for is far too important to let that prevent us from stepping up to this challenge," the company said in a statement.
But PNM Resources Inc., PNM's parent, appeared to downplay the plant rescue attempt in a conference call with investors yesterday. Chief Financial Officer Chuck Eldred said the company would "in no way accept" a power purchase agreement or any continued "involvement after 2022 for San Juan."
One financial analyst drew laughter by saying he had "the coyote" in his head when he heard Acme was the name of Farmington's partner. That was a reference to a brand that popped up in old cartoons.
"And for you younger folks, that's Wile E. Coyote," said CEO Pat Vincent-Collawn of PNM Resources. "So look it up."
Efforts to obtain comment from Acme this week about its San Juan plans weren't successful. The letter of intent includes a nonbinding appendix with possible closing conditions. One of those is having certain power purchase agreements. Another raises the idea of a tax holiday.
Ultimately, only Wyoming's efforts may prove successful in providing some assistance to coal. S.F. 159 has already passed the House and Senate, and Gov. Mark Gordon (R) has indicated support for the measure.
Lingering over all the rescue attempts is the cautionary tale of Navajo Generating Station.
In 2017, four utilities with a stake in the 2.2-GW plant announced they were closing the coal behemoth on Arizona's high desert at the end of 2019. Peabody Energy, the plant's coal supplier, mounted a furious campaign to save the facility. The Trump administration pledged support. Congress took up an unsuccessful bid to help the plant.
Today, a tribal energy company remains the plant's last hope for staving off closure (Climatewire, Feb. 7).
But even Peabody has grown pessimistic. Earlier this month, the company told investors it was planning to retire the mine serving the massive power plant.
A woman in Gaoshan, China, walks through the entrance to her farmhouse, which was damaged during three recent earthquakes in surrounding Rong County.
By Steven Lee Myers
March 8, 2019
GAOSHAN, China — The first earthquake struck this small farming village in Sichuan Province before dawn on Feb. 24. There were two more the next day.
Sichuan is naturally prone to earthquakes, including a major one in 2008 that killed nearly 70,000 people, but to the rattled villagers of Gaoshan, the cause of these tremors was human-made.
“The drilling,” Yu Zhenghua said as she tearfully surveyed her damaged home, still officially uninhabitable five days later.
The drilling Ms. Yu referred to was hydraulic fracturing, or fracking. The technology, which has revolutionized the production of natural gas and oil in the United States, has created a boom in China, too, and with it many of the controversies that have dogged the practice elsewhere.
In the hours after the quakes, thousands of residents converged outside the main government building in Rong County to protest widespread fracking in the rolling hills and valleys here now yellowing with the flowering of rapeseed.
A shale gas drilling station in Rong County. In the last decade, the China National Petroleum Corporation alone has invested $4 billion in fracking shale gas in the Sichuan Basin.
The protesters jostled with security guards along a sliding metal gate and dispersed only after officials announced they had suspended fracking operations of a regional subsidiary of China National Petroleum Corporation, the country’s largest oil and gas producer.
China, like the United States and other countries, has embraced the fracking revolution in hopes of weaning itself from its dependence on foreign energy sources. But the public fury that unexpectedly boiled over in Gaoshan underscores the social and environmental challenges the country must overcome — even in a tightly controlled political system.
“Sichuan is a major earthquake zone, so there is clearly a risk,” said Philip Andrews-Speed, a geologist with the Energy Studies Institute of the National University of Singapore. He added that the government should conduct a thorough and transparent study of causes of the temblors in order to reassure those who live nearby.
The three earthquakes killed two people and wounded 13. More than 20,000 homes in three villages suffered damage and nine collapsed completely, according to a statement by the county. About 1,600 people were displaced, forced to move in with relatives or to live temporarily in 470 blue tents distributed by the authorities.
The suspension of operations — which remains in effect — stilled 15 sites in the area affected by the quakes, pending a survey by officials from Sichuan Province, according to an official for the Rong County government, Huang Jing.
It has not affected fracking operations elsewhere in the region, a center of the fracking boom. China National Petroleum alone has invested $4 billion in fracking shale gas in the Sichuan Basin over the last decade, Xinhua, the state-run news agency, reported last November.
China National Petroleum declined to comment on the issue. The nation’s other major oil and gas producer, Sinopec, which also operates in the province, also declined to comment.
The website for the regional subsidiary of China National Petroleum, however, later shared a blog post suggesting that the suspension in Rong County was unnecessary. Compared to the loss of economic development, the post said, seismic activity caused by drilling for shale gas was “the lesser of two evils.”
For many residents of the area, that choice is far from clear.
Wu Shirong was in the shower when the second quake struck on Feb. 25. “This was the scariest one,” he said, though by magnitude the one that followed four and a half hours later was the strongest of the three, measuring 4.9 on the Richter scale, according to China’s geological service.
Cracks spread across the ceiling of his house, which was declared unsafe. He is now living in one of the tents with his in-laws in the driveway outside. “Where can I sleep if I don’t sleep here?” he said.
Yu Zhenghua shows visitors cracks that have appeared along the walls of her house. “My house was built only 12 years ago,” she said, “and now it is like this.” Credit: Gilles Sabrie for The New York Times
Ms. Yu’s house appeared more badly damaged. The retaining wall that holds up her property along a steep hillside buckled and seemed on the verge of collapse. Deep cracks gouged the stuccoed brick walls of the two-story house her son built with his earnings. Her son and daughter-in-law, like many Chinese, moved to a city in the southern region of Guangxi for work.
“My house was built only 12 years ago,” she said, “and now it is like this.”
The local authorities have promised to repair the damage. They have not acknowledged any link between the tremors and fracking, which involves injecting chemicals and sand at high pressure into wells drilled in shale formations to break up the rock and release gas and oil.
“The relationship between earthquakes and local industrial exploitation cannot be determined,” the county government wrote on its website.
In China, as in other countries, the link remains the subject of debate.
Supporters of the technology claim there is no direct connection, though studies have shown otherwise: fracking and related activities increase pressure underground, which can cause existing faults to slip.
Fracking has nonetheless vastly expanded the natural resources that can be recovered underground, making the technology irresistible to China, which is highly dependent on energy imports, as the United States once was.
Farmers in Gaoshan washing vegetables near the tent where they are living temporarily. About 1,600 people have been displaced by the earthquakes.
China sits on top of the largest technically recoverable reserves of shale gas in the world, according to the United States Energy Information Administration, and the government has set ambitious goals for expanding production in the years ahead.
There are many reasons. In addition to energy independence, the increased use of natural gas could help China meet its international commitments to reduce emissions that contribute to climate change. The transition to gas has already helped reduce pollution — at least in the northeast, where the authorities have phased out the use of coal to heat homes.
China’s hopes to replicate the American fracking boom, however, have hit significant stumbling blocks. Shale deposits tend to be deeper here — 3.5 kilometers in Rong County, or more than two miles down. That makes them more expensive to tap. The process also requires a lot of water, which is scarce in some regions.
Perhaps most importantly, China is much more densely populated, and many of its best shale deposits are in crowded places. Those include Sichuan, which has a population of more than 80 million. Since reserves were discovered there in 2009, scores of fracking sites have appeared — with virtually no public input, given the authoritarian nature of the government.
The 15 platforms in Rong County have 39 separate wells being drilled or already in operation. They have appeared within a 10-kilometer circle around the county’s main town, surrounded by fencing and filled with trucks and equipment.
Because of the heavy equipment involved, the roads to the sites are rutted and, when it rains, nearly impassable because of mud. Long black tubes extend across once scenic valleys and terraced fields. As in other places, Rong County residents say they noticed an increase in tremors and quakes after production began.
The latest appeared to inflame discontent that had long been simmering.
A video of the protest outside the government building was first published by Radio Free Asia. Local residents also voiced complaints on Weibo, a social media site like Twitter. “What on earth do you want from us?” one woman wrote. “Will you take the matter seriously only when there’s loss of life?”
Birthstrike is a movement of women who don’t want to bring children into a warming world.
BY ADELE PETERS
Shortly after Blythe Pepino decided that she wanted to have children, she realized that the idea of bringing kids into a world affected by climate change was making her uncomfortable. “It had only been a couple of years that I’d felt the desire to have kids because I’d met my partner, whom I’m deeply in love with,” she says. “I got to that point in my life where a lot of my friends were having kids and it suddenly seemed like a beautiful idea to me. And that happened to coexist with my becoming much more aware of the climate challenge.”
Pepino, a 29-year-old musician, started bringing up the idea with other women in environmental advocacy groups. “I said, ‘You’re around my age: What are you thinking about kids?'” she says. “I was able to ask that question to a few people, and I was really surprised that there were a lot of people who were saying, ‘I haven’t talked about this to anyone, but I’m really questioning it.'”
She started a Facebook group called #Birthstrike to make the idea public; within a few days, 90 women had joined. While some may be partly motivated by the fact that the choice limits carbon emissions–one recent study found that not having children is one of the most effective ways to limit your personal carbon footprint–the underlying motivation was wanting to avoid bringing a child into a world where they may suffer. “Our main focus is the fact that we’re too afraid really to bring a kid into that future,” Pepino says. After Alexandria Ocasio-Cortez recently suggested that some young Americans feel the same way, a survey found that 38% of 18- to 29-year-old Americans believe that a couple should consider the risks of climate change before deciding to have kids. “I can’t have a child unless I am seriously, seriously convinced that we are on a different path,” one member of Birthstrike, 22-year-old Alice Brown, says in a video about the group.
The group doesn’t suggest that others need to make the same choice, and members sign a declaration saying that they “stand in compassionate solidarity” with parents and don’t endorse population control as a solution for climate change. Just as some women in the middle of the 20th century might have decided to have children despite understanding the threat of nuclear war, Pepino says she understands why some people who fully understand the risks of climate change may still decide to have children. (The author of The Uninhabitable Earth, a recent book that outlines catastrophic climate risks in painful detail, is among those who recently decided to have children.)
Pepino is not optimistic that the world will make the necessary changes to address climate change in time, “but that doesn’t stop me from trying to make it happen,” she says. “I’m amazed at people like Greta Thunberg–she’s so young and she’s fighting so hard. And humans have made incredible turnabouts in the past.” Choosing not to have children also gives her more freedom to work as an activist. Still, she says, the scale of the challenge now is unprecedented, and the world hasn’t ever made this type of radical change in the past.
Being public with such an emotional decision is one way to underline the urgency of the need for that change. “This is my way of saying, come on, guys, I’m almost out of hope,” she says. “So let’s put everything into this, because I feel it so severely that I’m not having children, and neither are these other people.”
The bills would allow the state to sue protesters — and their supporters — and use the money for pipeline-related costs.
Elena Saavedra Buckley · March 6, 2019
Update: On Thursday, the South Dakota Legislature passed SB 189 and SB 190, the riot-boosting and funding bills in question. The votes were largely partisan, with Democrats making up most of the opposition. The bills are now headed to Gov. Kristi Noem’s desk to sign in the coming days. Due to the bills' emergency clauses, once signed, they will immediately become law.
Two bills allowing the state of South Dakota to prosecute pipeline demonstrators and their funders — and use money from damages to fund law enforcement and pipeline costs — moved to the Senate floor on Wednesday. Introduced by South Dakota Gov. Kristi Noem, R, on Monday, the bills would protect the 1,179-mile-long Keystone XL pipeline, a planned TransCanada project that would slice through the state carrying 830,000 barrels of crude oil a day. The Great Plains Tribal Chairman’s Association, which represents the leaders of 16 tribes in the region, has opposed the bills; none of the tribes were consulted on the legislation.
Anti-protest laws exist in other states — bills were passed in North Dakota after the 2016 #NoDAPL demonstrations — but the South Dakota package, introduced in the legislative session’s final days, casts a much larger net over who can be legally pursued by authorities. It creates financial punishments for “riot-boosting,” a new term defining the actions both of protesters who participate in “riots” as well as anyone who “does not personally participate in any riot but directs, advises, encourages, or solicits other persons participating in the riot to acts of force or violence.” The legislation would also establish a fund — with the acronym “PEACE” — to address extraordinary expenses for the state and its counties from the pipeline, including protests, with the funds collected from “riot-boosters.”
Conflict around the bills simmered during a joint appropriations committee meeting on March 6, centering on the undefined reach of a “riot” and what that word truly means. Under South Dakota law, a riot is a felony, with “riot” defined as three or more people executing or threatening force or violence without the authority of law. (Noem said she considered the #NoDAPL demonstrations to be riots.) The “riot-boosting” law extends the state’s authority to sue rioters and any number of their supporters, and allows it to charge anyone who “solicits or compensates any other person to commit an unlawful act or to be arrested” three times the damages sustained. It’s unclear whether “encouraging” a “riot” could cover acts like GoFundMe donations or posting bond for arrested demonstrators.
South Dakota Rep. Peri Pourier, D., holds up a poster with a poem as Faith Spotted Eagle testifies to the joint appropriations committee. Spotted Eagle claims that "these bills are an attempt to legislate by ambush."
Republican proponents insisted the bills were necessary to help counties and the state address fees from unlawful assembly, citing the millions North Dakota incurred in damages in the wake of the #NoDAPL demonstrations. Supporters also called the legislation urgent, noting the Keystone XL could begin construction as early as summer 2019, though others noted that the pipeline’s outstanding permit requests made that timeline unlikely.
Opponents insist that the laws would effectively snuff out constitutional protests such as those at Standing Rock.
The riot-boosting bill “has the capacity to make a criminal of any citizen, not just big donors or supporters,” said Crow Creek Chairman Lester Thompson, Jr. “In a world of social media, how do you determine who is a riot-booster or just a concerned citizen?”
Thompson was the only tribal chairman who spoke at the hearing; many tribal leaders from South Dakota were in Washington, D.C., testifying about federal appropriations for the Bureau of Indian Affairs and the Indian Health Service.
During a press conference, Gov. Noem said the bills were meant to target out-of-state agitators, alleging that George Soros, a philanthropist and common right-wing target, was one of them. But there is nothing stopping the laws from impacting tribal nations or citizens. Unlike the tribes, TransCanada had a seat at the table as the bills took form, Noem said. The company will also contribute to the PEACE fund.
“It seems like this Canadian company is calling the shots here,” said Gay Kingman, executive director of the Great Plains Tribal Chairman’s Association, at the hearing. “Transparency dictates that all stakeholders should have a seat at the table, and anything less is paternalism that tribes have experienced for years.”
Members of the South Dakota Legislature’s joint appropriations committee voted 14-4 to recommend passing the riot-boosting law and 15-3 to recommend passing the PEACE fund. Some members made last-minute attempts to amend the PEACE fund to allow tribes to submit claims for its funding, but the committee chairman, Republican Sen. John Wiik, dismissed them.
Sen. Reynold Nesiba, D, who voted no, criticized the lack of tribal involvement and expressed concern over the broad reaches of the state’s “riot” definition. “So many people write a check or give cash for people who are going to a protest, partly for educational experiences,” he said after the committee vote. “That notion of trying to hold somebody liable for the actions of another I find deeply problematic.”
Sen. Ryan Maher, R, whose district neighbors the Cheyenne River and Standing Rock reservations, also voted no. “I’m not going to support 189 today just because of how ‘riot’ can be construed,” he said.
Documents obtained by the ACLU of Montana showed that federal and local law enforcement held “anti-terrorism” training sessions and practiced “riot-control formations” in 2018 to prepare for predicted Keystone XL protests. At the Standing Rock demonstrations, extremist vocabulary came hand-in-hand with mass arrests and the use of water cannons and teargas grenades. Some journalists were charged with rioting at Standing Rock; the charges were later dropped.
The bills will be voted on in the Senate Thursday morning.
Elena Saavedra Buckley is an editorial fellow at High Country News.
ALBUQUERQUE, N.M. — New Mexico on Tuesday sued the U.S. Air Force over groundwater contamination at two bases, saying the federal government has a responsibility to clean up plumes of toxic chemicals left behind by past military firefighting activities.
The contamination — linked to a class of chemicals known as per- and polyfluoroalkyl substances, or PFAS — was detected last year in groundwater on and near Cannon and Holloman air bases.
Similar contamination has been found at dozens of military sites across the nation, and growing evidence that exposure can be dangerous has prompted the U.S. Environmental Protection Agency to consider setting a maximum level for the chemicals in drinking water nationwide. Currently only non-enforceable drinking water health advisories are in place.
by Liz Kimbrough on 5 March 2019
As thousands of hydroelectric dams are planned worldwide, including 147 in the Amazon, a new study finds that the true socio-environmental and cultural costs of dams are rarely evaluated before construction. Were such factors counted into the lifetime cost of the dams, many would not be built.
Dam repairs and removal at the end of a project’s life are rarely figured into upfront costs. Nor are the impacts of reduced river flow, lost fisheries and aquatic habitat connectivity, productive farmland drowned by reservoirs, and the displacement of riverine peoples.
Lack of transparency, and corruption between governments and dam construction companies, are at the heart of the problem and stand in the way of change. Researchers recommend that environmental impact assessments (EIAs) and social impact assessments (SIAs) be granted enough weight that a negative result can prevent a bad dam from being built.
EIAs and SIAs should be done by third parties serving citizens, not the dam company. Better governance surrounding dams needs to be organized and implemented. There needs to be increased transparency about the true financial, social, cultural and environmental costs of dams to the public. Maintaining river flows and fish migrations is also critical.
The Belo Monte dam in the Brazilian Amazon. One of the biggest in the world, it was plagued by government and corporate corruption, did extensive socio-environmental damage, and has failed dramatically in producing promised amounts of electricity. Image courtesy of Wasserkraftwerk.
An estimated 3,700 dams are either planned or under construction in developing countries around the world — including 147 slated for the Amazon basin region. At face value, dams seem like a good deal environmentally and socially. Invest heavily upfront and then tap into an ever-flowing, renewable power source.
However, in a review paper, recently published in Proceedings of the National Academy of Sciences of the United States of America (PNAS), researchers argue that the true costs of hydropower projects are often underestimated, with harmful financial, social, and environmental consequences resulting in a wellspring of hidden costs.
Worldwide, approximately 472 million people have been negatively affected by dam construction over the last century. Downstream, river-dependent communities may be displaced, be deprived of their food security and livelihoods, and suffer immeasurable cultural losses. Almost immediately after installation of the Tucuruí Dam in the Brazilian Amazon, for example, fish catch declined by 60 percent, while 100,000 downstream residents were impacted due to lost fisheries, flooded agricultural lands, or the loss of other resources.
The PNAS review paper, produced by researchers at Michigan State University, investigates the socioeconomic and environmental impacts surrounding dams in several major river basins and makes recommendations for moving towards sustainable and responsible hydropower development in the 21st century.
Planning for the end
In North America and Europe, more dams are being torn down than constructed. In the U.S. alone, over 60 dams per year have been removed since 2006. Repairing a small dam can cost more than three times as much as removing it, which is one reason dam removals are on the rise.
“The cost of removing a dam once its useful life is over is extremely high and should be taken into account when computing the total cost of a new hydro development. If the cost of removal had to be included, many dams wouldn’t be built,” said Emilio Moran, the study’s lead author and principal investigator of a research project supported by the São Paulo Research Foundation that was designed to study the socio-environmental consequences of the highly controversial Belo Monte hydropower mega-dam in the Brazilian Amazon.
The lifespan of a dam is limited. Those currently being built in Brazil, for instance, are intended to last just 30 years. This may be extended by upgrades, but eventually, aging construction materials and sediment accumulation will cause a dam to deteriorate, stop functioning at a level that is economically viable to maintain, or — in worst case scenarios — to fail.
A failed dam can result in tremendous loss of life and property. In 1976, the Teton Dam in Idaho, USA failed, resulting in eleven deaths and over $2 billion in damages. In 2018, 40 people were killed and 6,000 displaced by the collapse of the Xe-Pian Xe-Namnoy hydroelectric dam in the Mekong Valley of Laos, and though hundreds of people remain missing after that disaster, the construction of two large Mekong River dams is surging ahead as planned.
The Teton dam in the U.S. as it began to collapse. As dams age and extreme weather events due to climate change increase, more dam failures could occur. Image courtesy of WaterArchives.org, licensed under the Creative Commons Attribution-Share Alike 2.0 Generic license.
Accounting for environmental harm
Development, deforestation, and climate change can, and do, greatly influence a dam’s costs, productivity, and lifespan. Construction and deforestation increase the amount of sediment deposited into a river by orders of magnitude, shortening the life of a dam or requiring expensive intervention measures such as dredging.
Also, up to half of the rain in the Amazon biome is created by means of internal moisture recycling (the forest making its own rain). Significant increases in deforestation disrupt the water cycle, decreasing precipitation and water availability to rivers. In addition, the decreases in river flow due to worsening drought can greatly reduce the amount of power generated by a dam, causing it to fall below original energy estimates — decreasing dam productivity and profits.
Overall, the Amazon Basin is getting drier, decreasing the reliable water flow for its dams. As a result, climate change coupled with deforestation has helped cause the Belo Monte dam (completed in 2016) to produce, even in best case scenarios, only 4.46 of the 11.23 GW it was forecast to generate. The Jirau and Santo Antonio dams, also in the Amazon, are likewise projected to produce only a fraction of their originally projected 3 gigawatt (GW) capacities.
At the famed Hoover Dam on the Colorado River in the U.S. West, water managers are preparing for predicted energy shortages due to climate change-escalated drought by placing new turbines at lower elevation — an expensive add on. Lake Mead, Hoover Dam’s reservoir, has seen a 40 percent decline in its water level, lowering electricity production by 25 percent.
Large dams are often promoted to the public as an opportunity to boost the local economy through jobs, increased infrastructure and access to power. But this is not always the case. In the Brazilian Amazon, the Belo Monte, Santo Antonio, and Jirau hydropower projects caused local electricity prices to go up rather than down.
Promised jobs were given to non-locals and most of those disappeared within 5 years. In the case of the Belo Monte dam, construction companies handed over huge illegal payoffs disguised as campaign contributions to politicians in order to get them to approve the dam and deliver on lucrative building contracts. This has happened often elsewhere, despite evidence that the particular dams under consideration would prove problematic to local communities and the environment.
“There are multiple barriers to moving towards more sustainable hydropower,” Moran told Mongabay. “One is the powerful lobbies [operated by the] construction companies who are addicted to building large hydropower because it is so financially rewarding for them. [Each new project] allows the lobbies to grow via opportunities for corruption [typically with] large amounts [of money going] undetected. Another [problem] is the collusion of people in government with the companies,” Moran explained. That complicity allows government to claim it is doing something important — providing energy for the public — even as officials profit from dam approvals while simultaneously disempowering the people who oppose the projects.
A recent study using a database of 220 dam-related conflicts found that government and corporate use of repression, criminalization, violent targeting of activists, and assassinations were commonly associated with controversial dam projects. Frequently, the targets of violence are community organizers and indigenous leaders.
The world’s largest waterfall by volume, Inga Falls, lies within the Democratic Republic of Congo, a country where 91 percent of the populace has no electricity. A massive hydropower project, the Grand Inga, is planned for this site, but instead of the $80 billion project bringing power to the people, electricity will be exported to South Africa to support large mining companies. This approach to electricity distribution is equally common in the Amazon, where there is a documented synergy between new dams and new mining concessions.
Brazil’s Jirau dam on the Madeira River. Flooding in 2014, likely worsened by this dam and the Santo Antonio dam, helped create an international incident between Brazil, where the dams are both located, and Bolivia upstream, where the catastrophic flooding occurred, displacing 11,000 people. Image courtesy of the International Hydropower Association.
A way forward
Ideally, companies and communities should work together to implement innovative and alternative energy technologies that do not require disruptive dams or the resettlement of riverine communities. One solution is to move toward far more “micro-hydro” or “zero-head” projects, which use smaller instream turbine technologies and much smaller reservoirs.
“Large hydro often bypasses isolated communities. Small hydro, like instream technologies, is designed to be located near people who are off the grid,” Moran told mongabay.com. “We need more off the grid solutions, whether solar, wind, biomass, or hydro, and the grid should be diversified and not over-rely on one source for its energy. And it should plan for climate change, and thereby reduce vulnerability. [Government] ministries who lead on energy need to serve the citizens and not only respond to the needs of industry, large urban areas or mining demand.”
Importantly, when a mega-dam is in the planning stages, there needs to be government and corporate transparency, along with the implementation of rigorous criteria for holding large hydropower interests accountable before they can go forward. In such cases, the researchers recommend the following:
Allow environmental impact assessments (EIAs) and social impact assessments (SIAs) enough power to prevent a dangerous dam from being built.
Be sure EIAs and SIAs are conducted by third parties or firms who are serving the citizens and not employed or influenced by the dam company.
Organize and implement better governance surrounding dams.
Increase transparency so that true financial, social, cultural, and environmental costs of dams are communicated to the public.
Carry sustainability measures from the design phase through to the build phase, including designs that follow seasonal river flow and allow fish pass-through, maintaining aquatic connectivity.
Hydropower has accounted for the majority (71 percent) of renewable energy produced globally since 2016. Moving forward, hydropower could continue to hold an important place among other diversified alternative energy sources. But governments and companies must develop it without endangering the very ecosystems, great rivers, and people that host new dams. That, the recent research shows, is a cost too great for society and nature to bear.
Moran EF, Lopez MC, Moore N, et al (2018) Sustainable hydropower in the 21st century. Proceedings of the National Academy of Sciences 115:11891–11898. doi: 10.1073/PNAS.1809426115
Banner Image: The Tucurui dam spillway. Courtesy of International Rivers.
Public Release: 7-Mar-2019
University of Waterloo
Simple, inexpensive urban design interventions can increase well-being and social connections among city residents, finds a new case study from the Urban Realities Lab at the University of Waterloo.
Researchers found that green spaces and colourful, community-driven urban design elements were associated with higher levels of happiness, greater trust of strangers, and greater environmental stewardship than locations without those amenities.
"The urban design interventions we studied are relatively simple and low-cost, but show great potential to improve individuals' emotional and social lives," says Hanna Negami, lead author and PhD candidate in cognitive neuroscience. "Something as simple as adding greenery to a concrete lane or painting a rainbow crosswalk could help to enrich urban public spaces."
For the study, participants were taken on walking tours of Vancouver's West End neighbourhood and asked to complete a questionnaire via a smartphone application at six stops, including a pair of laneways (one green, one concrete), crosswalks (one painted rainbow, one standard zebra), and a pair of greenspaces (one wild community garden and one manicured greenspace).
The addition of greenspace and place-making initiatives can help promote social connections for citizens, and help to mitigate social isolation. Researchers hope that these findings will ultimately help improve the experiences of people living in cities.
"We know that the design of a city has direct, measurable, psychological impact on its citizens," says Colin Ellard, professor of psychology and director of the Urban Realities Lab. "We've been able to show how such impact can be measured and what it can tell us about good, psychologically sustainable design."
The study "Field analysis of psychological effects of urban design: a case study in Vancouver" appeared in Cities and Health.
The study was conducted in partnership with Happy City, a Vancouver-based design, planning and architecture consultancy. Co-authors are Robin Mazumder, a PhD candidate in cognitive neuroscience; Colin Ellard, professor of psychology at Waterloo; and Mitchell Reardon, an urbanist with Happy City.
The Khumbu Glacier near Mount Everest in Nepal is one of the longest in the world.
By Kai Schultz and Bhadra Sharma
Feb. 4, 2019
NEW DELHI — Rising temperatures in the Himalayas, home to most of the world’s tallest mountains, will melt at least one-third of the region’s glaciers by the end of the century even if the world’s most ambitious climate change targets are met, according to a report released Monday.
If those goals are not achieved, and global warming and greenhouse gas emissions continue at their current rates, the Himalayas could lose two-thirds of their glaciers by 2100, according to the report, the Hindu Kush Himalaya Assessment.
Under those more dire circumstances, the Himalayas could heat up by 8 degrees Fahrenheit (4.4 degrees Celsius) by century’s end, bringing radical disruptions to food and water supplies, and mass population displacement.
Glaciers in the Hindu Kush Himalayan Region, which spans over 2,000 miles of Asia, provide water resources to around a quarter of the world’s population.
“This is a climate crisis you have not heard of,” said Philippus Wester, a lead author of the report. “Impacts on people in the region, already one of the world’s most fragile and hazard-prone mountain regions, will range from worsened air pollution to an increase in extreme weather events.”
One of the most complete studies on mountain warming, the Hindu Kush Himalaya Assessment was put together over five years by 210 authors. The report includes input from more than 350 researchers and policymakers from 22 countries.
In October, a landmark report from the United Nations’ scientific panel on climate change found that if greenhouse gas emissions continued at the current rate, the atmosphere would warm by as much as 2.7 degrees Fahrenheit (1.5 degrees Celsius) above preindustrial levels by 2040.
Avoiding further damage from this rise would require transforming the world economy at a speed and scale that has “no documented historic precedent,” the report said.
In the Himalayas, warming under this scenario would probably be even higher, at 3.8 degrees Fahrenheit (2.1 degrees Celsius), the Hindu Kush Himalaya Assessment found. Across the world, glacier volumes are projected to decline up to 90 percent this century from decreased snowfall, increased snowline elevations and longer melt seasons.
The Hindu Kush Himalaya Assessment touches on the phenomenon of elevation-dependent warming. Though it is well known that temperature changes due to increased levels of greenhouse gases are amplified at higher latitudes, as in the Arctic, there is growing evidence that warming rates are also greater at higher elevations.
“Mountain people are really getting hit hard,” said David Molden, the director general of the International Center for Integrated Mountain Development, the research center near Kathmandu, Nepal’s capital, that led the study. “We have to do something now.”
Around South Asia, the impact of climate change has already intensified. Brutal heat waves are becoming unbearable, making people sicker and poorer, and diminishing the living standards of 800 million people.
Access to water is also a concern. Last spring, shortages were so severe in the Indian city of Shimla, in the Himalayas, that some residents asked tourists to stop visiting so that they would have enough water for themselves.
A government report released last year found that India was experiencing the worst water crisis in its history. About half of India’s population, around 600 million people, faced extreme water scarcities, the report found, with 200,000 people dying each year from inadequate access to safe water.
By 2030, the country’s demand for water is likely to be twice the available supply.
In neighboring Nepal, rising temperatures have already uprooted people. Snow cover is shrinking in mountain villages, and rain patterns are less predictable. Fertile land once used for growing vegetables has become barren.
“Water sources have dried up,” said Pasang Tshering Gurung, a farmer from the village of Samjong, which is about 13,000 feet above sea level.
A few years ago, all 18 families in Samjong moved to a village around 1,000 feet lower after their crops repeatedly failed.
But Mr. Gurung and his neighbors are still worried. Landslides linked to increased flooding continue to thunder down hillsides. The government has offered limited support for resettlement, he said.
And with little money to spare, Mr. Gurung is not sure where he would go next.
“We will be landless refugees,” he said. “How can we survive in the Himalayas without water?”
Kai Schultz reported from New Delhi, and Bhadra Sharma from Kathmandu, Nepal.
by Olufemi Taiwo and Holly Jean Buck for The Conversation
Washington DC (SPX) Feb 01, 2019
Captured carbon has a variety of industrial uses, including oil extraction and fire extinguisher manufacturing. U.S. Energy Department's National Energy Technology Laboratory
Environmental activists are teaming up with fresh faces in Congress to advocate for a Green New Deal, a bundle of policies that would fight climate change while creating new jobs and reducing inequality. Not all of the activists agree on what those policies ought to be.
Some 626 environmental groups, including Greenpeace, the Center for Biological Diversity and 350, recently laid out their vision in a letter they sent to U.S. lawmakers. They warned that they "vigorously oppose" several strategies, including the use of carbon capture and storage - a process that can trap excess carbon pollution that's already warming the Earth, and lock it away.
In our view, as a political philosopher who studies global justice and an environmental social scientist, this blanket opposition is an unfortunate mistake. Based on the need to remove carbon from the atmosphere, and the risks in relying on land sinks like forests and soils alone to take up the excess carbon, we believe that carbon capture and storage could be a powerful tool for making the climate safer and even rectifying historical climate injustices.
We think the U.S. and other rich countries should accelerate negative emissions research for two reasons.
First, they can afford it. Second, they have a historical responsibility as they burned a disproportionate amount of the carbon causing climate change today. Global warming is poised to hit the least-developed countries, including dozens that were colonized by these wealthier nations, the hardest.
Consider this: The entire African continent emits less carbon than the U.S., Russia or Japan.
Yet Africa is likely to experience climate change impacts sooner and more intensely than any other region. Some African regions are already experiencing warming increases at more than twice the global rate. Coastal and island nations like Bangladesh, Madagascar and the Marshall Islands face near or total destruction.
But the world's richest nations have been slow to endorse and support the necessary research, development and governance for negative emissions technologies.
Bad track record with coal
What explains the objections from climate justice advocates?
Since George W. Bush’s presidency, the U.S. has heavily funded experiments with carbon capture and storage meant to drastically reduce greenhouse gas emissions from new coal-fired power plants.
Those efforts have not paid off, partly because of economics. Natural gas and renewable energy have become cheaper and more popular than coal for generating electricity.
Only a handful of coal-fired power plants are under construction in the U.S., where closures are routine. The industry is in trouble everywhere, with few exceptions.
In addition, carbon capture with coal has a bad track record. The biggest U.S. experiment is the US$7.5 billion Kemper power plant in Mississippi. It ended in failure in 2017 when state power authorities ordered the plant operator to give up on this technology and rely on natural gas instead.
Carbon capture and storage, however, isn't just for fossil-fuel-burning power plants. It can work with industrial carbon dioxide sources, such as steel, cement and chemical plants and incinerators. Then, one of two things can happen. The carbon can be turned into new products, such as fuels, cement, soft drinks or even shoes.
Carbon can also be stored permanently if it is injected underground, where geologists believe it can stay put for centuries.
To date, a common use for captured carbon has been extracting oil out of old wells. Burning that petroleum, however, can make climate change worse.
Going carbon negative
This technology may also remove more carbon than gets emitted, as long as it is designed right.
One example is what's called bioenergy with carbon capture and storage, in which farm residues, or crops like trees and grasses grown for the purpose, are burned to generate electricity. The carbon is separated out and stored at the power plants where this happens.
If the supply chain is sustainable, with cultivation, harvesting and transport done in low-carbon or carbon-neutral ways, this process can produce what scientists call negative emissions, with more carbon removed than released. Another possibility involves directly capturing carbon from the air.
Scientists point out that bioenergy with carbon capture and storage could require vast amounts of land for growing biofuels to burn. And climate advocates are concerned that both approaches could pave the way for oil, gas and coal companies and big industries to simply continue with business as usual instead of phasing out fossil fuels.
Natural solutions
Every pathway to limiting global warming to 1.5 degrees Celsius in the most recent U.N. Intergovernmental Panel on Climate Change report projected the use of carbon removal approaches.
Planting more trees, composting and farming in ways that store carbon in soils and protecting wetlands can also reduce atmospheric carbon. We believe the natural solutions many environmentalists might prefer are crucial. But soaking up excess carbon through afforestation on a massive scale could encroach on farmland.
To be sure, not all environmentalists are writing off carbon capture and storage.
The Sierra Club, Environmental Defense Fund and Natural Resources Defense Council, along with many other big green organizations, did not sign the letter, which objected not just to carbon capture and storage but also to nuclear power, emissions trading and converting trash into energy through incineration.
Rather than leave carbon removal technologies out of the Green New Deal, we suggest that more environmentalists consider their potential for removing carbon that has already been emitted. We believe these approaches could potentially create jobs, foster economic development and reduce inequality on a global scale - as long as they are meaningfully accountable to people in the world's poorest nations.
Jan 31, 2019 12:12 pm
The sites portrayed in many beloved artworks have been completely transformed since they were committed to canvas, carved in stone, or captured on film. Modern cities have risen at locations that were once picturesque villages; intensive farming has helped transform near-deserts into verdant pastures and lush greenery into parched wasteland; and mining operations have lopped the peaks off mountains and carved canyons that would have taken millennia to form naturally.
As the scale and pace of environmental change continues to accelerate, and as human activity keeps fueling global warming, more and more artworks will appear irreconcilable with the places they depict. Herewith are nine famous works depicting landscapes that seem destined to be dramatically—and irreparably—altered in the near future.
Canaletto’s Venice, soon to be Atlantis
View of St. Mark's from the Punta della Dogana, 1740-1745
Pinacoteca di Brera
The scientific community’s consensus seems to be that at the current rate, Venice will be underwater by the end of this century. Italy’s extremely complicated and expensive flood prevention measures may be too little, too late. That will leave the Venetian cityscapes painted most iconically by Canaletto—whose name means “little canal”—as important records of the way things once were. Venice of the 22nd century, oddly, may end up more closely resembling Damien Hirst’s elaborate shipwreck sculptures, which he showed in the city during its 2017 biennale.
J.M.W. Turner’s Mer de Glace, now just a mer
J. M. W. Turner, Mer de Glace, in the Valley of Chamouni, Switzerland, 1803. Image via Wikimedia Commons.
The melting of the glaciers in the Alps has gotten so bad that every summer, concerned Swiss citizens climb into the mountains and wrap the remaining glaciers in blankets to keep them cool. But that can only help so much, as the Alps' glaciers are receding at record rates—and revealing the remains of missing mountaineers as they go.
In anticipation of a J.M.W. Turner and John Ruskin exhibition opening at the York Art Gallery in March, photographer Emma Stibbon was commissioned to photograph the same glacier that the 19th-century artists had rendered during their visits to the Alps. The resulting image is dramatically de-iced when compared to the sprawling glacier in Turner’s Mer de Glace, in the Valley of Chamouni, Switzerland from 1803.
Christo and Jeanne-Claude’s islands will go under
Surrounded Islands, Miami (Signed Poster), 1983
Surrounded Islands, Project for Biscayne Bay, Greater Miami, Collage in Two Parts
As sea levels rise, very flat coastal areas—especially those in hurricane-prone regions—are increasingly at risk, which means Miami is in big, big trouble. While that may not bother some Floridians, others are taking the problem seriously. Among the likely casualties of rising sea levels in and around Miami are the 11 islands in Biscayne Bay that were ringed in bright pink fabric more than 35 years ago by land artists Christo and Jeanne-Claude. This incredibly ambitious project, Surrounded Islands (1980–83), may unintentionally become a valuable record of a landscape soon to be submerged.
Frederic Edwin Church’s Amazon is drying up
Frederic Edwin Church, El Rio de Luz “The River of Light,” 1877. Image courtesy of the National Gallery of Art.
In the atmospheric painting now titled El Rio de Luz (The River of Light) (1877), but formerly known simply as The Amazon, Frederic Edwin Church rendered the iconic South American waterway in all its misty glory, with the sun struggling to pierce the humid air. But due to climate change and deforestation, the air in the Brazilian rainforest has become markedly less humid than it once was, and scientists believe it may be contributing to droughts in the region and a general drying up of the Amazon River. A satellite study of water vapor data by NASA suggested that the Amazon may not be able to maintain itself if its dry season gets much longer than it already has, and Church’s lush greenery framing the river could end up looking more like grassy fields.
100 views of Osaka before the floods
Utagawa Yoshitaki, from “100 Views of Naniwa,”ca. 1860. Image via Wikimedia Commons.
One of the major cities in Asia most at risk of rising sea levels is Osaka. In fact, its airport, which is built on a landfill in Osaka Bay, has already flooded. The city is the commercial linchpin of the enormous Keihanshin metropolitan region in Japan, which has a population of over 18 million, but if the global temperature rises more than 3 degrees Celsius by 2100—as the United Nations World Meteorological Organization has recently predicted—Osaka could be hobbled. It will be a far cry from the calm seaside city depicted by the ukiyo-e master Utagawa Yoshitaki in his sweeping 1860s woodblock print series “100 Views of Naniwa.”
Mexico City will go the way of the Aztecs
Roberto Cueva Del Río, Fundacion Tenochtitlan, 1986. Image via Wikimedia Commons.
Mexico City, the largest metropolitan area in the Western Hemisphere, is bone-dry—and it’s sinking. According to a 2017 report in the New York Times, some parts of the city center are collapsing at a rate of 9 inches per year as the sprawling metropolis, perpetually short on water, taps ever-deeper underground reserves, thereby weakening the lake bed on which it was built. Add to this the rising temperatures and worsening droughts that accompany climate change and you get a bleak picture of Mexico City’s future.
These prospects seem far away from the heroic images of modern Mexico painted by muralists Diego Rivera and Roberto Cueva Del Río. In the latter’s rendering of the foundings of Mexico City and the Aztec city-state Tenochtitlan, he tellingly stacks the two cities atop each other. This is accurate not only because Mexico City was built on the site that was once Tenochtitlan, but also because the city now sits on a fragile clay surface that is falling in on itself.
Albert Bierstadt’s dwindling glaciers
Albert Bierstadt, View of Glacier Park or Sunset on Peak, date unknown. Image via Wikimedia Commons.
Much like the glaciers Turner painted in the Alps, the majestic, ice-covered mountain that Albert Bierstadt depicted in View of Glacier Park or Sunset on Peak (date unknown) is almost certainly a lot more barren today. While Glacier National Park contained 150 glaciers in the late 19th century, it is down to just 26 today. The park may lose all of its namesake geological features as soon as 2050, according to research by the U.S. Geological Survey (USGS) and Portland State University. “It’s inevitable that we will lose them all over the next few decades,” Daniel Fagre, a USGS scientist, told The Guardian in 2017. In other words, photographs and paintings like Bierstadt’s dramatic, pink-tinged peak will soon be the only reminders that Glacier National Park once had any glaciers at all.
Global warming comes for Paul Gauguin’s islands
Mahana No Atua (Day of the God), 1894
The Art Institute of Chicago, Chicago
Paul Gauguin’s depictions of life in French Polynesia suggest a carefree tropical paradise, but of course, the reality of his presence there was much more problematic. Now, problems of an entirely different order threaten to trouble the cerulean blue waters of islands throughout the South Pacific, which are increasingly at risk of being wiped off the map by the combination of erosion and rising sea levels.
In a 2013 study of more than 1,200 French islands around the world, scientists at a Université Paris-Sud lab found that between 6 and 12 percent of those islands will disappear, depending on how much sea levels rise. In any case, the Polynesia of Gauguin’s paintings will be radically transformed—there’s even been talk of creating floating islands to replace some of the landmass lost to climate change.
Benjamin Sutton is Artsy’s News Editor.
At the Wally Shop in Brooklyn, your deliveries come with zero emissions, and all your packaging gets sent back for reuse.
BY ADELE PETERS
If you order grocery delivery from The Wally Shop, a startup in Brooklyn, a courier will pick up ingredients at the farmer’s market or at a local business like the Bushwick Food Cooperative. The delivery comes by bike, not the usual van that other companies use. But the biggest difference between your order and regular food delivery may be the lack of trash: All of the packaging is reusable, and on your next order, the courier will pick up the packages you’ve emptied.
“My goal was, how can we create a scalable solution that would make sense for us today and would dramatically help us cut down on waste?” says Tamara Lim, the company’s founder. Through her previous job managing the packaging and shipping category at Amazon, Lim was very aware of the issue of packaging waste; she also recognized the limits of recycling. As she talked with packaging vendors, she learned about the flaws in recycling infrastructure–particularly as countries like China cracked down on accepting unwanted American plastic. A better solution, she realized, would be truly circular, with packaging that could be returned and reused many times.
Online grocery delivery, which tends to involve a large amount of packaging, was a good fit for a different model. The new service is still convenient: Orders placed before 2 p.m. can be delivered the same day. Fruit and vegetables show up in organic cotton mesh bags. Bulk food, including dried pasta, grains, coffee, and seasoning, arrives in mason jars. Customers pay a deposit for the packaging, which they get back when the packaging is returned. “For someone who does shopping weekly, with orders roughly the same size, what will happen is you’ll pay a deposit and . . . when you return your packaging each week, [the deposit] rolls over to the next order,” says Lim. “So it’s almost like you pay for it once and you never have to pay for packaging again.”
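The rolling-deposit scheme Lim describes can be sketched in a few lines. This is an illustrative model only; the function, dollar amounts, and accounting rules here are assumptions for the sketch, not The Wally Shop's actual system.

```python
# Illustrative sketch of a rolling packaging deposit, as Lim describes it.
# The function and the $5.00 amount are hypothetical, not The Wally Shop's
# real accounting.

def order_charge(deposit_held, deposit_needed, returned_packaging):
    """Return (deposit charged for this order, deposit held afterward)."""
    if returned_packaging:
        # The returned packaging's deposit rolls over to cover the new packaging,
        # so a same-sized weekly order charges nothing new.
        charge = max(0.0, deposit_needed - deposit_held)
        return charge, deposit_needed
    # Nothing returned: the old deposit stays out and a new one is charged.
    return deposit_needed, deposit_held + deposit_needed

# First order: pay the full deposit. Each later weekly order of the same
# size, with last week's packaging returned, charges nothing.
charge1, held = order_charge(0.0, 5.00, False)
charge2, held = order_charge(held, 5.00, True)
```

Under these assumptions the customer pays the deposit once and, as Lim puts it, "never ha[s] to pay for packaging again" as long as each return matches the next order's size.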
The packaging goes back to the startup’s warehouse in Bushwick, where it’s cleaned and used for the next order. “If you think about a traditional grocery shop or any restaurant–anyone who uses single-use packaging–every single piece incurs a cost, and it’s really part of that individual order,” she says. “But what we’re doing essentially is we’re saying, let’s make packaging not a variable cost. Let’s actually view it more as an asset.”
It’s a philosophy that’s gaining more traction with businesses, including major brands, who will soon launch an experimental platform to sell everything from deodorant to ice cream in reusable packaging, driven by consumer appetite for different solutions to the problem of trash–particularly packaging that ends up in the ocean and in the stomachs of marine animals.
“When I was at Amazon, in the categories and numbers that I was seeing, consumers were choosing to buy more sustainable products,” says Lim. “These product lines are just growing at three times the rate of the normal product categories. And so I think that it’s exciting to see that companies and corporations are starting to act on and respond to what consumers have already been trying to tell us for a number of years now.”
The Wally Shop is still very small, having opened about three months ago. But it plans to soon expand to Manhattan and to cities beyond New York, with a continued focus on local, organic food. It also plans to expand product offerings, and may also expand to other types of businesses, such as restaurant or meal kit delivery. “We built this to not only be convenient and sustainable, but also to be scalable,” says Lim. “Because we understand that it’s only with scale that we’re going to have the amount of impact that we want to have in terms of preventing packaging.”
ABOUT THE AUTHOR
Adele Peters is a staff writer at Fast Company who focuses on solutions to some of the world's largest problems, from climate change to homelessness. Previously, she worked with GOOD, BioLite, and the Sustainable Products and Solutions program at UC Berkeley
BRAD SHANNON AND CAROLYN BICK/INVESTIGATEWEST January 31, 2019
TUKWILA — On a rainy Tuesday afternoon, Chad Peters is picking up oil from the Din Tai Fung restaurant at Southcenter Mall.
The SeQuential Pacific Biodiesel vacuum truck driver wears his long, salt-and-pepper hair in a ponytail to keep it out of the way as he wheels one of the restaurant’s two plastic oil collection vats out into the chilly, spitting rain.
“You should eat here. They’ve got good oil,” he says to SeQuential Marketing Manager Rachel Shaver. “That’s how I choose what restaurants to eat at — the quality of the oil. I can tell whether the oil is reused or not. This is good oil. Hasn’t been reused.”
Peters isn’t thinking much about what is going on an hour’s drive south at the Legislature in Olympia — but his bosses are.
Here’s why: The cooking oil he is collecting into his 1,500-gallon vacuum truck will get a second life as biodiesel, an alternative fuel made partly from vegetable oil or animal fats that has a distinctly lower carbon impact than traditional diesel made solely from petroleum, and that pollutes less when burned.
But here’s the catch: The biodiesel-to-be that is being sucked into Peters’ truck, like much of what’s collected in Washington, will help reduce climate-wrecking carbon emissions not in the Evergreen State but across the border in Oregon. There, lawmakers have required progressive reductions in the amount of fossil fuel allowed in gasoline and diesel, hitting a 10 percent drop by 2025.
In other words, in Oregon — as in California and British Columbia — the used cooking oil commands a hefty premium over its price in Washington, where there is no such requirement. You might say the used oil flows toward the money.
This year that may change, though. And it could be a good thing for Washington’s economy.
An hour to the other side of Olympia, a maze of industrial pipes feeding storage tanks, each the size of a small apartment house, looms over Grays Harbor on Washington’s central Pacific coast. This industrial facility stood out as the second-largest biodiesel producer in the nation last year.
The Renewable Energy Group plant employs 40 people in Hoquiam, a town hard-hit by the one-two punch of a withered timber industry and the opiate addiction crisis. Here, those 40 jobs really matter.
REG and Phillips 66 have announced plans to build a similar renewable fuel plant near Bellingham, in Whatcom County, and the companies are still calculating whether to move ahead later this year.
The answer may turn out to depend on what Washington lawmakers do about House Bill 1110, which won committee approval from majority Democrats on the House Environment and Energy Committee during just the second week of the 105-day legislative session. But some Republicans question whether the state should intervene in energy markets, suggesting it could cause consumers to pay more at the pump.
The REG biodiesel plant in Grays Harbor County is the second-largest producer of biodiesel in the United States. (Photo courtesy of Renewable Energy Group)
REG spokespersons say a decision to proceed with the Whatcom County plant is due later this year. It would turn out so-called renewable diesel, which it now produces from used food oils, animal-fat wastes and canola at a similar Louisiana plant and sends to West Coast markets like California.
Though REG and Phillips are basing their decision on current markets, the prospect of increased demand in Washington could nudge their decision toward a “yes” if the Washington legislation passes.
“We have put it out to the market that we are going in this direction,” says Kent Hartwig, an Iowa-based REG senior manager who testified for the bill in Olympia. “That is really based on the low-carbon markets that are established and the potential for the (clean fuel) standard in Washington.”
If Washington joins Oregon, California and British Columbia with a fuel-blend mandate, Hartwig said “it definitely reinforces our decision.”
The low-carbon fuels standard is also being considered in a Senate bill, SB 5412, sponsored by Sen. Rebecca Saldaña, D-Seattle. It got a hearing on Wednesday in Olympia, where Ian Hill, co-founder of SeQuential, told senators that Washington is losing out.
“Our industry is moving forward with great progress – leaps and bounds – in Oregon and California and other parts of the country,” Hill said. “Washington is missing out on this opportunity currently.”
Some 17 percent of California’s diesel is biodiesel and that figure is about 7 percent in Oregon, while in Washington it’s just one-half of 1 percent, Hill said in testimony Wednesday.
Environmentalists, biofuel producers and electric carmaker groups also backed the bill. Keeping the fuels in Washington would promote jobs and investments in this state, advocates say.
Some 43 percent of Washington’s greenhouse gas emissions come from the transportation sector, with a large share of that from road vehicles targeted by legislation, as Rep. Joe Fitzgibbon, D-Burien, lead sponsor of HB 1110, pointed out at a hearing on his bill earlier in the month.
The legislation would create a somewhat complicated system of tradable credits for cleaner fuels that can offset the full life-cycle greenhouse-gas pollution of dirtier fuels.
“It’s a tough problem to solve so the solution is going to be a little bit complicated,” Fitzgibbon said, adding that the system to be devised by the state Department of Ecology would be “technology neutral.” In other words, the system would leave it up to fuel makers which kinds of cleaner fuel credits they buy or create.
Fitzgibbon said that fuel makers and distributors whose product does not meet fuel-blend standards could buy credits from fuel makers whose products do — with credits calculated to reflect the life-cycle carbon content or intensity of the fuels. Those earning credits could include makers of biofuels, meaning biodiesel and renewable diesel producers as well as producers of renewable natural gas taken from landfills or farm digesters. Similarly, credits could be earned by utilities that generate electricity used for electric or plug-in hybrid cars, if they are able to show this use.
The credits and deficits system should help promote electrification of road car fleets, according to Fitzgibbon.
The oil industry has long opposed state efforts to require or mandate blend standards. But while it fended off action in Washington until this year, it lost political battles in California, Oregon and British Columbia.
Lobbyist Jessica Spiegel of the Sacramento-based Western States Petroleum Association testified against HB 1110, telling a committee in Olympia that consumers would pay more if a fuel standard or mandate is passed.
Spiegel did not make the same doomsday claims her industry made in California and Washington a few years ago that a fuel standard would boost fuel costs by more than a dollar per gallon.
But Spiegel cited a report late last year from industry consultant Stillwater Associates that claimed California’s 2011 standard has jacked up the price of gasoline by more than 13 cents per gallon. The California Air Resources Board offered a similar estimate of 13.5 cents for gasoline and about 16 cents more for diesel.
However, neither the Stillwater Report nor the CARB calculations reflect that fuel producers may not be passing on all the extra costs. The Stillwater report suggests that gasoline prices could end up 36 cents a gallon higher in California by the time that state reaches its goal in 2030 of cutting the carbon intensity of road fuels by 20 percent.
The implication is that Washington, which has the second-highest state gas tax at more than 49 cents per gallon, could see the same additional increase.
The industry also warns that fuel stocks needed to produce biofuel blends may not yet exist in sufficient quantities to let the state reach its biofuels goals on the gasoline side. Past efforts to convert cellulosic fiber from waste products for ethanol, for example, have not resulted in local production, Spiegel said in an interview.
Backers of biofuels say cost impacts at the fuel pump may be much lower for consumers, citing Oregon state statistics. And they say there are additional environmental and economic benefits if Washington-produced fuels are used closer to home.
Whatever state lawmakers decide, some parties in the oil industry may be getting reconciled to a lower-carbon future for its fuels.
BP — which is Phillips 66’s refining rival in Whatcom County — already opened its own renewable diesel plant adjacent to its Cherry Point refinery last year. The BP plant produces biodiesel from waste or biomass products such as used cooking oil and animal fats, and its process lets the company mix the less-carbon-emitting bio-mix into regular diesel without a change in performance.
The moves by BP, Phillips and REG suggest that availability of renewable fuel stocks is less than a deal-killer for the industry. REG’s Hoquiam plant, for example, imports canola oil from the Midwest. And state Department of Commerce spokespersons say fuel stocks are ample. The Union of Concerned Scientists has also put out a report showing Washington’s targets could be achieved.
State legislators are not so concerned either — at least not majority Democrats backing the Fitzgibbon bill. His proposal requires carbon intensity in road fuels be cut by 10 percent by 2028 and by 20 percent by 2035, and the measure would encourage investments in electric car and charging stations as offsets for fuel intensity.
HB 1110 received its first OK in committee last week with support from six of the seven Democrats on the environmental panel and no aye votes from the GOP. One member voted against the policy and several others were neutral.
Rep. Richard DeBolt, R-Chehalis, was among the neutral votes, expressing some support for the policy going forward. DeBolt, who is the GOP point person on carbon in the House, also said he didn’t think the transportation sector is moving quickly enough toward electrification.
Republicans have largely been seeking incentives rather than mandates to reach climate policy goals.
And clean-fuel critics like Todd Myers of the conservative think tank Washington Policy Center argue that a fuels mandate is not the most efficient or cost-effective way to cut carbon emissions from the road sector.
But a long list of interest groups has weighed in with support for the rule, including electric car sellers, environmentalists and biofuels makers like REG and Portland-based SeQuential, which produces some 30 million gallons of biodiesel a year after its merger last year with a Bakersfield, Calif., firm.
Like REG, SeQuential is a big player in collecting restaurant oil wastes, serving an estimated 30 percent of restaurants in Washington.
Hill, the SeQuential co-founder, said in a phone interview that the company collects used cooking oil from up and down the West Coast, and ships most of its product to consumers in Oregon, but also to California and British Columbia. Both California and British Columbia have policies similar to Oregon’s, which makes the respective markets lucrative. But Washington doesn’t have this policy, which is why SeQuential only ships a fraction of its biodiesel to consumers in the state, Hill said. It’s simply not that profitable.
“We will continue to operate in Washington state as a feedstock company, and we’ll continue to work hard to grow in Washington state, and be a part of the economy there, but as far as being able to supply low-carbon biodiesel fuel options in a state without a carbon value, it just won’t be feasible,” said Hill. “And that’s true for the entire industry.”
The same factors are why South Seattle-based General Biodiesel doesn’t sell to any consumers in the state, the company’s CEO and board Chairman Jeff Haas said.
Because of the policies in place in Oregon and California — General Biodiesel’s market — it’s still better financially for the company to pay for the shipping costs required to transport the fuel to those states than it is to try to sell to companies in Washington. Without a similar policy here, there is almost no demand from within the state, he said.
“If Washington creates a competitive environment for biofuels, I would love to sell every drop we produce in Washington,” Haas said in a telephone interview.
The fuel-blend mandate is one of several major climate proposals backed by Gov. Jay Inslee, a Democrat who has indicated that climate change policies would be a top priority for his campaign if he decides to seek his party’s nomination for president.
The House legislation is still a long way from becoming law. It heads now to the House Transportation Committee, after which it would have to go through the Rules Committee, be passed by the full House, and then approved by the Senate in a similar succession of hearings and votes. In the Senate, the legislation still awaits an initial committee approval.
By Daniel Moritz-Rabson On 1/24/19 at 3:21 PM
A significant majority of Americans are unwilling to contribute $10 each month to address climate change, an AP-NORC survey found.
While 57 percent of those surveyed would contribute $1 a month to combat global warming, that number drops significantly when the monetary contribution increases. Twenty-eight percent of respondents said they would pay $10 each month, 30 percent said they would pay $20 a month, and 16 percent said they would contribute $100 each month.
The survey found that individuals living in households with an annual income of at least $100,000 were more supportive of a monthly utility fee to address climate change. It also found that 72 percent of Democrats who said climate change is real attributed its existence primarily to anthropogenic causes, a figure far higher than the 33 percent of Republicans who believed in climate change and thought it was caused by humans.
Extreme weather events played a significant role in altering respondents' views on climate change, the AP-NORC report said. For those who increasingly believed climate science in the previous five years, 76 percent said that extreme weather events changed their perception.
Only 28 percent of Americans surveyed said they would pay $10 each month to combat climate change.
On Tuesday, the Yale Program on Climate Change Communication and the George Mason University Center for Climate Change Communication released a poll that found that 73 percent of Americans said global warming is taking place.
The research, which was conducted at the end of 2018, indicated record recognition of climate change.
The poll from Yale and George Mason showed a significant rise in the number of people who said climate change is "extremely," "very," or "somewhat" important to them, according to The New York Times. By the end of the year, 72 percent of individuals surveyed said climate change was important to them, nine percentage points higher than in March.
"The thing that is most encouraging in these polls is that they show the public has now become aware that climate change is here and now," Bob Inglis, executive director of RepublicEN, an organization encouraging conservatives to respond to climate change, told NBC News. "They understand it’s not decades away and it’s not in some other place. That is a huge change."
As prominent figures around the world, like Sir David Attenborough, raise concern about the pace of the global response to climate change, the Trump administration has regularly dismissed concerns and denied climate science.
On Tuesday, White House Press Secretary Sarah Sanders dismissed statements made Monday by Congresswoman Alexandria Ocasio-Cortez about climate change, which fact-checkers have said were hyperbolic.
"Look, I don’t think we’re going to listen to her on much of anything, particularly not on matters we’re gonna leave in the hands of a much, much higher authority, and certainly not listen to the freshman congresswoman on when the world may end," Sanders said on Fox News when asked about Ocasio-Cortez's comments.
Jan 24 2019, 7:00am
by Kaleigh Rogers
The number of environmental protection laws around the world has increased 38-fold since 1972, but a lack of sufficient enforcement has rendered many of them useless, a new United Nations report has found.
In 1972, the year of the first UN environmental agreement, only three countries had national environmental framework laws: Norway, Sweden, and the United States. By 2017, 176 nations had these laws. In addition, 150 countries enshrined environmental protection or the right to a healthy environment in their constitutions, and 164 countries had cabinet-level bodies responsible for environmental protection.
But the UN report found that few of these laws have been implemented and enforced effectively.
“It really is something that all countries share,” Carl Bruch, the director of International Programs at the Environmental Law Institute and one of the authors of the report, said in a phone interview. “We do have a lot of environmental laws that are on the books that could be so much more effective if they were actually fully implemented.”
The report broke down the shortcomings of environmental policies into four categories: institutions responsible for the laws, civic engagement, environmental rights, and justice for those who break the law. Not every country has a problem in each category, but every country has challenges in at least one sector that has reduced the effectiveness of its environmental laws, Bruch said.
With regards to institutions, a member state might, say, have a law, and an agency responsible for enforcing that law, but that agency doesn’t actually have the authority necessary to do so. This was a barrier uncovered in some Asian countries in the mid-2000s, according to the report, when an assessment found that many nations’ regulatory agencies had the responsibility of enforcing environmental laws, but “lacked clear or sufficiently comprehensive mechanisms to limit and require monitoring of pollution discharges, file criminal or civil cases, take emergency response actions (such as closing a facility), impose penalties, or order corrective measures,” the report said.
"What we really need to do is focus on implementing the laws that we already have.”
Other times, the institutions have the authority but still don’t act, an issue that could be remedied through greater civic engagement and the ability of citizens to hold agencies accountable. Take the ongoing lead poisoning crisis in Flint, Michigan, which is listed as one of the 35 case studies in the report. The UN report noted that local, state, and federal agencies all failed to properly enforce laws that could have caught the crisis earlier.
In contrast, Costa Rica has doubled its forest cover to more than 50 percent, and is on track to be climate neutral by 2021—bolstered by civic engagement and access to the courts, the report notes. Costa Rica’s constitution allows any individual to bring a suit to defend a constitutional right, which includes the right to “a healthy and ecologically balanced environment.” A 1994 ruling also allows citizens to sue on behalf of the public good, including on environmental issues.
On the justice front, sometimes a lack of proper training and education for judges can disrupt the systems in place to enforce environmental law. In Ecuador, for example, a non-government organization sued to prevent a pine tree plantation from being erected in a native grassland ecosystem. But the judge, unaware of Ecuador’s constitutional provisions that allow anybody to bring forward a suit in protection of the environment, dismissed the case and allowed the plantation to be built, the UN report noted.
“Due to the complexity and technical nature of many environmental matters, it is particularly important that judges be knowledgeable and competent regarding environmental law,” the report read.
Bruch said it’s time we focus on the structures around the laws to make sure they’re effective. He hopes this will be the first in a series of reports so people can track the progress—or regression—that governments make in shoring up their environmental laws.
“If we have laws in place and we still see the problems, whether it’s climate change or biodiversity loss, is it because the policies are not appropriate or is it because the policies aren’t being implemented and enforced?” Bruch said. “There’s often an instinct to ‘fix the laws,’ and what we really need to do is focus on implementing the laws that we already have.”
By Peter Coutu
Jan 24, 2019
Faced with the threat of more rainfall, increased flooding and sea levels rising at twice the global rate, a consultant's study warned that climate change's future impacts on Virginia's largest city could cost billions of dollars.
A new policy document lays out various adaptation options for Virginia Beach to consider, including a voluntary program to buy frequently flooded properties and a change in building codes and development standards. Initial estimates for citywide infrastructure plans range from $1.7 billion to $3.8 billion.
"It was an eye-opener," said Mayor Bobby Dyer. "And it shows a clear direction that we have to take."
The city hired Dewberry, an engineering consultant, in 2015 to conduct the $3.8 million study, which is still in progress. The first piece was rolled out last week. The firm uses two sea level rise estimates — 1½ feet in the short term (between 2035 and 2055) and 3 feet in the future (between 2065 and 2085).
And, according to its projections, the cost of doing nothing to combat the problem would be sky high.
If Virginia Beach takes no action, then a foot and a half of sea level rise would cost the city $50 million in yearly losses, the study projects. With 3 feet, that figure jumps to more than a quarter of a billion dollars — a year. The impending danger doesn't impact the entire city. Several areas — including Sandbridge, Pungo, the Oceanfront and around the Lynnhaven Inlet — face most of the future risk associated with sea level rise.
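The study's two "do nothing" scenarios can be put side by side with some quick arithmetic. A minimal sketch, using the annual-loss figures reported from the Dewberry study; the 20-year horizon is an illustrative assumption of ours, not a study parameter:

```python
# Back-of-the-envelope comparison of the "no action" annual-loss
# projections the article cites. The dollar figures come from the
# Dewberry study as reported; the 20-year horizon is assumed here
# purely for illustration.

SCENARIOS = {
    "1.5 ft rise (2035-2055)": 50_000_000,   # projected annual losses, USD
    "3 ft rise (2065-2085)": 250_000_000,    # "more than a quarter of a billion"
}

HORIZON_YEARS = 20  # assumed planning window, not from the study

for name, annual_loss in SCENARIOS.items():
    total = annual_loss * HORIZON_YEARS
    print(f"{name}: ${annual_loss / 1e6:.0f}M/yr -> "
          f"${total / 1e9:.1f}B over {HORIZON_YEARS} years")
```

Even under the milder scenario, inaction compounds into a billion-dollar exposure over a couple of decades, which is why the study frames the $1.7 billion to $3.8 billion infrastructure estimates as potentially cheaper than doing nothing.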
Because of this, there is a need for both long- and short-term plans, Dyer said.
"The genesis of the study was both in recognition of increased flooding and the need for a strategic plan to protect the city," read the executive summary of the city's working policy document. "The goal was to produce the needed information and strategies to enable the city to establish long-term resilience to sea level rise and associated recurrent flooding."
Councilwoman Barbara Henley, who for years has represented the rural, southern portion that includes some of the most vulnerable areas, said it was key to have solid data about the specific impacts throughout the city to help make decisions moving forward.
"The policies are something we can start right away," Henley said. "We don't have to wait until we have millions and billions of dollars."
She also said she was supportive of a voluntary buy-back program for frequently flooded homes, similar to how Open Space and Agriculture Preservation Programs have been run. And, she said, the city needs to protect vulnerable areas from more development while respecting property rights.
"The first thing we have to do is stop the bleeding," she said. "We have to stop building things where we know it's not a good idea."
Possible infrastructure-related projects that could help mitigate sea level rise citywide will cost billions of dollars and unfold over decades, according to current study estimates. Those figures are still being refined.
"It's going to be a huge cost," Dyer said.
He said the city will have to figure out how to strike a balance between its "overwhelming obligation" to get stormwater management under control, which he called his number one priority, and limiting the burden on taxpayers. He said Virginia Beach will need to look for innovative ways to raise more money for some of those costly flooding fixes.
Another problem: New infrastructure in Virginia Beach could adversely impact surrounding areas, both in Hampton Roads and across state lines — making collaboration crucial, officials said.
One component of an infrastructure plan — raising Muddy Creek Road and the bridge on it — could increase flooding in nearby Pungo, for example. This would all be weighed by city officials as the study continues.
Dyer said he's not sure when the next piece of the Dewberry study will be revealed or when it will wrap up. He said he's looking forward to an upcoming City Council retreat when all members will be able to hammer out a more specific strategy.
Resident Tim Worst, who ran for Henley's seat in November, said he welcomed the results of the study, especially after seeing the impacts of extreme weather, like Hurricane Matthew in 2016, and two major floods in the southern part of the city last summer.
"We've been waiting for that (study) like you've been waiting for Jesus' resurrection," he said. "I couldn't wait for that to finally come."
By ANTHONY ADRAGNA 01/08/2019 03:18 PM EST
House Energy and Commerce Committee Chairman Frank Pallone (D-N.J.) on Tuesday rejected the call for Democratic energy committee members to shun contributions from fossil fuel companies and other industries as too extreme.
"If you start going down that road, then nobody can contribute to you," Pallone told WNYC's "The Brian Lehrer Show" in an interview. “Ultimately you have to finance your campaign, and if you start saying that just because you’re on a committee, that nobody associated with any of the issues that the committee faces can contribute, I just think that’s the wrong way to go and too limiting.”
He added: "Where do you draw the line? Does that mean that somebody who works for the utility can’t contribute to me?"
The comments are sure to draw the ire of progressive groups and others like Rep. Alexandria Ocasio-Cortez (D-N.Y.), who have pushed Pallone and other House Democrats to reject contributions from fossil fuel companies as they formulate their strategy to combat climate change.
Pallone also promised his committee would examine the Green New Deal, but cast doubt on whether a core component of it — decarbonizing the U.S. within 10 years — is achievable.
"Some of the countries that have been a lot more progressive on this, like in Western Europe for example, they’re moving towards carbon free or carbon neutral but it’s going to take more than ten years," he said. "This is something that we should take a look at, but some of it may not be technologically or politically feasible.”
Pallone said his panel expected to hold its first hearing, which will address climate change, by the end of the month and he vowed to conduct aggressive oversight of the Trump administration’s actions on the issue.
Ambitious proposals for tackling climate change and transforming the economy are setting up one of the party's most crucial debates heading into 2020.
By ZACK COLMAN 01/12/2019 06:42 AM EST Updated 01/12/2019 05:03 PM EST
Democrats are rallying to turn the “Green New Deal” into a centerpiece of their Capitol Hill agenda and the party’s 2020 platform — as soon as they decide what exactly it is.
The term has become a potent brand name for a slate of ideas for transforming the economy and fighting climate change, championed by progressives like Rep. Alexandria Ocasio-Cortez (D-N.Y.) and embraced, at least cautiously, by potential presidential nominees including Sen. Elizabeth Warren (D-Mass.), Sen. Bernie Sanders (I-Vt.) and Beto O’Rourke.
But not all Democrats have signed onto the full agenda that Ocasio-Cortez and her activist allies have rolled into their Green New Deal platforms — which encompasses proposals such as a complete switch to clean energy by 2030, big tax increases on the wealthy, retrofits of every building in the U.S. and a federal guarantee of a well-paying job to everyone who wants one. And some leading Democrats on the Hill already are criticizing some of those planks as unrealistically ambitious and politically polarizing.
That means one of the Democrats’ most crucial debates of 2019 will be defining what their Green New Deal entails. Dozens of suggestions are already emerging, including smaller-bore, middle-of-the-road ideas such as cleaning up polluted sites or offering new tax breaks for electric cars.
Liberal activists contend that Democrats need bold ideas — both to tackle the urgency of the climate crisis and to drum up the voter excitement to oust President Donald Trump.
"I think it is a fantastic idea and I think it is the secret, or one of the secrets, to winning 2020," Celinda Lake, a Democratic pollster who has worked with progressives like Ocasio-Cortez, said of the Green New Deal. "It combines an issue that Democrats are way ahead on — the environment — and an issue they need to desperately get ahead on — the economy."
But Republicans say the leftward push on climate change and the economy benefits them by moving Democrats away from centrist policies.
"There are certainly places in America, like where I come from, where those ideas further isolate Democrats from political success," said North Dakota Republican Sen. Kevin Cramer.
Progressives and environmental groups are working to line up a suite of bills that could serve as an environmental policy road map for Democrats, breaking the agenda into pieces that they see defining the Green New Deal over time.
"I don't even think it’s possible to reach the full scale of what we’re talking about with one piece of legislation," said Varshini Prakash, co-founder of Sunrise Movement, which organized protests at Speaker Nancy Pelosi's office to push for aggressive green policies. "It’s not going to be necessarily one bill or one piece of legislation or one level of government that makes this possible."
Environmental organizations like Sunrise, 350.org and Data for Progress are working to bring “standard setting marker bills” to the floor and to draw out more details from the potential Democratic presidential candidates, said Julian NoiseCat, policy analyst at 350.org, which is already discussing legislation with lawmakers.
“The term Green New Deal entered into the public discourse before a lot of the white papers, and think pieces of how you would achieve that were really out in the mainstream,” NoiseCat said. “The same could be said of the original New Deal, of Medicare for All. The same was obviously true of [President Donald] Trump’s border wall.”
Many of the ideas that could comprise an eventual Green New Deal have been circulating for years, and while there's little chance that the Republican Senate or Trump will take up any environmental measures, the groups are hoping the ideas will germinate over the next two years and provide some ready-made policies Democrats can act on if they win the White House in 2020.
Liberal research group Data for Progress has identified 31 bills introduced last Congress that could be part of the Green New Deal, including Rep. Marc Veasey (D-Texas)'s bill, H.R. 2830, to eliminate methane leaks from natural gas infrastructure, as well as Rep. Tulsi Gabbard (D-Hawaii)'s measure, H.R. 3671, to electrify transportation and shift to renewable sources by 2035, end subsidies and exports of fossil fuels and permanently extend renewable energy incentives.
And it could even include Republican Sen. Jim Inhofe's (R-Okla.) plan, S. 822 (115), to expand grants for cleaning up and reusing brownfield sites, as well as various carbon pricing proposals. State-level policies in Washington state, Rhode Island, Hawaii and Maine may also be considered.
Data for Progress expects lawmakers to float bills with national targets for generating electricity from renewable sources, infrastructure packages that include climate change provisions and new tax credits for electric vehicles and renewable energy. Some measures might hitch a ride on spending bills or must-pass items, while others will be seen as “marquee bills" that try to advance the Green New Deal's most ambitious elements, like job guarantees or nationwide building upgrades.
The most prominent bill from last Congress came from Green New Deal supporters Sens. Jeff Merkley (D-Ore.), Cory Booker (D-N.J.) and Sanders, which called for getting 100 percent of U.S. electricity from renewable sources by 2050. All three are considering presidential runs.
But Sunrise's Prakash said the statements from the likely presidential candidates have "been real fuzzy," mostly consisting of support for the general concept or idea of a Green New Deal. Her group will be meeting with staff for Merkley, Booker, Sanders and Warren to draw them out on what Prakash sees as the three core principles: ending emissions-generating energy by 2030, guaranteeing good-paying jobs for everyone and providing economic and racial justice for all.
The green activists are hoping their policies will play a prominent role in the 2020 contest, when they want Democrats to rally around the progressive cause.
“Policy details are going to matter and be very important,” said Sean McElwee, co-founder of Data for Progress. “But the actual meta politics question is how do we make sure, in a roughly two-year period, ... Democrats create an agenda? How do we make sure that the Green New Deal and the environment take up a substantial share of that floor time when we have a bunch of competing interests?”
The newly formed think tank New Consensus will be tasked with doing some of that legislative legwork. The policy operations are being run by Rhiana Gunn-Wright, who was policy director for Abdul El-Sayed, the progressive activist and physician who last year lost the Michigan Democratic primary to now-Gov. Gretchen Whitmer. The group plans to meet with various constituencies to discuss elements of the Green New Deal and help shape it into a platform.
Historians say there are some parallels between what activists are doing with the Green New Deal and how President Franklin D. Roosevelt’s New Deal was formed. FDR merely mentioned giving Americans a “new deal” on the campaign trail and came into office working on “mostly good intentions and to be active ... without much in the way of specifics,” Robert McElvaine, a history professor at Millsaps College and Great Depression expert, said in an email.
Many of the policies that became the New Deal were bouncing around Democratic circles for decades, said David M. Kennedy, a history professor emeritus at Stanford University. Correspondence between FDR and lawmakers in the 1920s showed them planting the seedlings that eventually grew into Social Security, unemployment insurance, farm relief and the Tennessee Valley Authority.
“The level of concept, of high-altitude level definition of what the landscape looks like, the Green New Deal has some generic resemblance to the way Democrats — and believe it or not there were some progressive Republicans — were thinking,” Kennedy said.
The green groups say they are wary of bills that would dilute the aggressive agenda at the center of the Green New Deal — a real possibility given the somewhat skeptical reception some top House Democrats have given it. Energy and Commerce Chairman Frank Pallone (D-N.J.), who is planning to make climate change the first item on his agenda, said in a radio interview on Tuesday that the goal to "decarbonize" the U.S. economy by 2030 is unrealistic.
"Some of the countries that have been a lot more progressive on this, like in Western Europe for example, they’re moving towards carbon-free or carbon-neutral, but it’s going to take more than 10 years," he told WNYC. "This is something that we should take a look at, but some of it may not be technologically or politically feasible.”
That type of lukewarm support makes Green New Deal advocates nervous.
"There’s a million and one ways that this could get watered down," Prakash said. "I could totally see a lot of Democrats not pushing for the scale of ambition that we need. The fossil fuel industry hasn't even gotten all of its firepower behind fighting a Green New Deal."
One of Congress' most conservative Democrats, House Agriculture Chairman Collin Peterson of Minnesota, expressed even more skepticism about the push to advance a national strategy to fight climate change. The issue does not appear on a list of priorities for House Agriculture Democrats this year, according to a committee document obtained by POLITICO.
“What is our goal? Planting all those trees? I’m actually cutting down the forest,” Peterson told reporters recently, shortly after he began logging his own land to spur regrowth.
Still, some industry voices are already seeking to lay claim to a slice of a Green New Deal. Western Energy Alliance, an oil and gas industry group, noted in a press release this week that switching from coal to natural gas in power generation has driven carbon emissions 14 percent lower since 2005, citing figures from the U.S. Energy Information Administration.
“When it comes to climate change, do we care about actual results, or do we just care about virtue signaling?” Kathleen Sgamma, the organization’s president, said in a statement that slammed Green New Deal proponents for criticizing natural gas.
Catherine Boudreau contributed to this report.
Washington Gov. Jay Inslee, who is weighing a 2020 run, is rethinking the carbon tax approach to saving the planet. He believes his party should, too.
Governor Jay Inslee speaks at the Battle Born Progress Progressive Summit in Las Vegas on Jan. 12, 2019. John Locher / AP
Jan. 13, 2019, 8:13 AM EST
By Benjy Sarlin
WASHINGTON — Gov. Jay Inslee of Washington staked the future of his environmental policy on something activists had advocated for decades: a first-of-its-kind statewide fee on carbon pollution.
But in one of the greenest states in the country, in a historic midterm year for Democrats and amid a spate of new reports warning of climate catastrophe, his efforts to put a price on carbon failed badly.
Undaunted, Inslee is looking to carry the lessons learned from a long career of incremental wins and heartbreaking losses on climate policy to the national stage as a possible presidential contender.
"I learned one of the key talents is persistence," he told NBC News in an interview. "Climate change is not going away, and neither are we."
Could climate change separate Gov. Jay Inslee from the pack of Democratic presidential candidates?
As a generation of young activists, led by new voices like Rep. Alexandria Ocasio-Cortez, D-N.Y., rise to the forefront, they may want to pull up a seat next to the 67-year-old governor and hear his stories. More than a decade before this year's rallies for a Green New Deal — a plan to spend big on a rapid transition to renewable energy — Inslee, in speeches, op-eds and a book, was calling for a "moonshot" federal project modeled on the Apollo space program to slash emissions.
As a member of Congress, he helped advance a landmark cap-and-trade bill, which failed in the Senate, and he loaded President Barack Obama's stimulus with $90 billion in green initiatives, which passed. As governor, he's led a multipronged effort to boost electric vehicles and support research into clean technology.
His potential entry into the wide-open 2020 Democratic primary contest with a climate-focused campaign comes amid an intense debate over how to marry environmental sustainability with political sustainability, a question he's grappled with like few others. He believes the fate of the world depends on getting the answer right.
"That's what's at stake here," Inslee said. "A fundamental continuation of life and civilization as we've become accustomed to."
The fight for a carbon fee
Over a period of several years, Inslee and activists tried repeatedly to enact a fee on carbon emissions via legislation and ballot initiatives, an idea long backed by economists as a way to nudge consumers toward clean energy.
In interviews, participants in the efforts described a series of well-intentioned plans brought down by opposition from industry groups that spent tens of millions of dollars, infighting among activists and skepticism among voters.
The initial attempt began outside the Inslee administration in 2016, when a group called Carbon WA led a campaign for a ballot measure, Initiative 732, that would have taxed carbon emissions and used the revenue to cut taxes elsewhere.
Supporters emphasized its political appeal. There is at least some support for the concept of a revenue-neutral carbon tax in conservative circles, since it doesn't entail an overall tax hike, and Carbon WA earned backing from three Republican state senators.
But the proposal also failed to garner support on the left, where activists favored using the additional revenue on investments to cut further emissions and to finance related "green jobs" in low-income communities. The national Sierra Club opposed 732 even as it acknowledged dissension among its members over the decision.
Inslee opposed it, too, in part over concerns that the revenue projections were off.
With the environmental community split, the measure garnered just 40 percent of the vote and lost every county outside of Seattle.
"Politically, both approaches are challenging," said Yoram Bauman, an economist who led the Carbon WA campaign and now works on energy policy in Utah, referring to courting Republicans with new tax cuts and uniting the left by putting the money into its priorities. "It's very difficult to tell a story about how the Democrats take over the world. But it's also difficult to tell a story about how you get enough Republicans and Democrats on board with some kind of centrist climate policy."
In 2018, Inslee and his allies looked to improve on the idea with a new plan to enact a carbon fee and use it to fund a variety of clean energy projects. They built a coalition of labor unions, environmentalists, racial justice groups, faith organizations and businesses to promote it.
The investment side was especially important, Inslee said, because his administration estimated about 90 percent of his proposal's carbon reductions came from its various green programs, rather than changes in consumer behavior due to higher energy prices. But he still saw the carbon fee as a "signal to inspire businesses and consumers to move to a less carbon intensive product" that could provide a foundation for future policy.
Despite Inslee's support, a bill to enact his plan stalled in the narrowly Democratic-controlled state Senate. That left a direct appeal to voters as a last resort. Another ballot measure, Initiative 1631, which would have created a carbon fee, was born.
Inslee backs down
With a larger coalition, however, came a larger response from affected industries. Led by oil companies, opponents spent a state-record $31 million to defeat Initiative 1631 last year, about double the amount spent by supporters, who had their own backup from billionaires Michael Bloomberg and Bill Gates.
The "no" campaign pointed to higher energy costs for consumers. But in a move that could be a preview of future fights on the national level, the opponents' message focused heavily on splitting the left with accusations that 1631 exempted too many polluters and would not fund effective programs.
"Voters rejected I-1631 because they understood it was a flawed initiative that would have significant economic impacts on the state's economy while doing very little to meet carbon reduction goals in the state," said Catherine Reheis-Boyd, president of the Western States Petroleum Association, which helped spearhead opposition to the plan.
A supporter of Initiative 1631 holds a sign referencing the Nisqually Indian Tribe on Oct. 17, 2018, during a rally supporting I-1631, a November ballot measure in Washington state that would charge a fee on carbon emissions from fossil fuels. Ted S. Warren / AP file
The attacks were effective. Even as Democrats made gains in the legislature with environmentalist candidates, the initiative failed and improved only modestly on its 2016 predecessor with 43 percent of the vote.
Inslee has since backed off a carbon fee for now at the state level, instead focusing on a package of renewable energy investments financed by other means. And he's grown skeptical as to whether Democrats should pursue a similar policy nationally if it distracts from other green priorities.
"You can get enormous benefit without, perhaps, a carbon-pricing system," he said. "That should not totally take it out of consideration, but there's many, many ways to skin this cat."
State Sen. Reuven Carlyle, a Democrat who has quarterbacked Inslee's climate agenda in the legislature, said he still believes lawmakers need to put a price on carbon someday. But with voters unconvinced, it's better to bring emissions down elsewhere, Carlyle said.
"We need to go back to the drawing board and respect the will of the voters," he said.
Democrats, listen up
In many ways, the emerging national picture on climate policy looks similar to what Inslee has faced.
Heading into the presidential campaign, there’s a burst of grassroots energy around the Green New Deal, but it faces competition from similarly ambitious Democratic proposals on health care, education, taxes and more.
Inslee hasn't ignored those items (he just proposed a new public health care option in his state), but he has a message Democratic voters might not hear from the party's presidential candidates: If you're going to tackle climate change, the rest may have to wait.
"When you want college education for your kids, when you want better health care, when you want net neutrality, when you want all of those things, but your house is on fire and it's burning down, you've got to put the fire out first and get your family out of the house," he said.
"That's the type of prioritization we have to make if we are going to succeed in rescuing our country from this existential threat," he added.
Diver Kim Thomas holds a "Yes on 1631" sign as she dives in a large aquarium display at the Seattle Aquarium during an event to announce the endorsement of Initiative 1631 by the aquarium and the Woodland Park Zoo on Oct. 25, 2018, in Seattle. Ted S. Warren / AP
With an uncertain policy path forward, national Democrats are facing debates much like the ones environmentalists faced in Washington state.
Young activists in groups like the Sunrise Movement are focused on a massive jobs program to help transition the economy away from fossil fuels. But some Democrats are also pursuing a revenue-neutral carbon tax with GOP support. Before he left the Senate this month, Jeff Flake of Arizona signed onto a carbon tax with Sen. Chris Coons, D-Del.
Having just faced a voter backlash of his own, though, Inslee has grown wary of a national plan that leans strongly on a carbon tax to slash emissions.
"To actually get carbon savings, you need to jack up the price so high that it becomes politically untenable," he said.
The better option, in his eyes, is to look to taxes on the wealthy to fund a Green New Deal. Reversing portions of the Trump tax cuts could provide an easier source of financing, he said.
But if that doesn't fly either, then it's on to the next plan and the one after that. If there's one message the battle-scarred governor has for young voters and Democratic presidential hopefuls, it's not to give up when things go sour.
"As Churchill said, victory is the only option, because without victory there is no survival," Inslee said.
If successful, it could reduce landfill use and save the city millions. There are a few obstacles to work through first, though.
By Lisa M. Collins
Nov. 9, 2018
Composting has such potential. It can reduce the garbage sent to landfills and save money at the same time. San Francisco claims to have reduced landfill usage by 80 percent, and Seoul, South Korea, a city of 10 million, claims that it saves $600,000 daily by charging residents and businesses fees for discarded food scraps.
But for New York City, where food scraps account for an estimated one-third of all garbage, composting is hardly making rapid or dramatic progress.
In 2015, Mayor Bill de Blasio introduced the “Zero Waste” initiative, aiming for a 90 percent reduction in landfill use by 2030. A cornerstone of the plan was a robust compost program, where organic matter would be placed in brown bins provided by the city, picked up by the Sanitation Department, and then sold or delivered to places that turn the food into compost for gardening or convert it to energy. It is the largest compost program in the country, with brown bins for 3.5 million residents across the five boroughs, said Sanitation Commissioner Kathryn Garcia.
But the program picked up only 43,000 tons of food scraps last year.
That’s about five percent of the city’s total food waste sent to landfills. For those following the Zero Waste target: We only have 85 percent more to go.
The brown bin compost program, which started as a small pilot program on Staten Island in 2013, was expected to expand citywide by the end of this year. But the pickup service in some of the 24 neighborhoods where it is offered has been reduced and expansion plans have been delayed.
This leaves many New Yorkers wondering whether a composting program across the city will work. Here is an explanation of where things stand.
How does composting save money?
The less we export to landfills, the more money we save.
The city will spend $411 million in 2019 to export about 2.5 million tons of residential, school and governmental trash to landfills located as far away as South Carolina. In 2014 the city spent $300 million. The export cost is expected to increase to $421 million by 2021.
“At this rate, we will be spending half a billion dollars,” said Antonio Reynoso, chairman of the City Council’s sanitation committee.
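The export figures imply a rough per-ton cost, which is the number that makes diverting food scraps financially interesting. A minimal sketch from the article's 2019 budget figures; the division is ours, not the city's accounting:

```python
# Implied per-ton landfill export cost, from the article's figures:
# $411 million budgeted in 2019 to export about 2.5 million tons.

export_budget_2019 = 411_000_000   # USD
tons_exported = 2_500_000          # residential, school, government trash

cost_per_ton = export_budget_2019 / tons_exported
print(f"~${cost_per_ton:.0f} per ton exported to landfill")
```

Every ton of food scraps composted instead of exported avoids roughly that marginal disposal cost, which is the financial case behind the brown-bin program.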
Is composting lucrative for the city?
Not yet. The compost program cost the city $15.7 million this year, and unlike recycling (which costs less to process than landfill waste, according to Mr. Reynoso), so far it doesn’t bring in much money. Last year, the city earned $58,000 from selling compost, according to the Sanitation Department. So there’s room for growth.
Who currently gets the brown bins?
Buildings with nine or fewer units in the community districts where there is curbside service automatically receive brown bins, along with information on what can go in them (yes to meat and bones and coffee grounds and food-soiled paper; no to cat litter, diapers and plastic bottles). Buildings with more than nine units must apply for the program.
Is the compost program in jeopardy?
It’s certainly not a raging success. At this moment, the Sanitation Department is not on track to expand the program on time and has cut brown-bin pickup service from twice a week in some areas to once weekly, on recycling days. Service and pickup schedules have been experimental as the Sanitation Department tested behaviors, types of garbage trucks and routes, a Department spokeswoman said.
What was the problem with composting?
Low participation in the neighborhoods that took part in the pilot program led to inefficiencies and high costs, Ms. Garcia said. “We love composting,” said Kristin Brady, of Carroll Gardens, Brooklyn, who uses the service every week. “But most of the people we know don’t compost because of cleaning the honestly somewhat gross outdoor brown bin.” A Department spokeswoman said that residents put only about 10 percent of their food scraps in the brown bins, throwing the rest in the garbage. Thus, garbage trucks with special compost compartments were running around with little to carry.
Why is participation so low?
Mr. Reynoso, who represents parts of Brooklyn, said he thinks the problem is a lack of advertising and education, and the fact that the program is voluntary. His efforts to increase the compost advertising budget have been unsuccessful, he said.
“Survey 10 people in New York City, and you would be hard-pressed to find a single person who knows how recycling works and how to make it work right, and what it means to the city financially,” Mr. Reynoso said. “In my building, we received the brown bins, and some fliers. I guarantee I’m the only person in my building who knows how to use them.”
What are the major hurdles?
“The biggest challenge is asking New Yorkers to do something different,” Ms. Garcia said. She told a story about how the department was handing out brown bins and an older man said that he didn’t want one.
“But we were handing out compost at the same time, and he definitely wanted the compost,” Ms. Garcia said. “We said, ‘We really need your banana peels in order to make this in the future.’” He took the brown bin.
Is there any good news?
New Yorkers are throwing out less trash. In 2017, the Sanitation Department collected 2.5 million tons of garbage destined for landfill, down from 2.8 million tons in 2005, even as the population grew.
While our residential recycling rate is quite low at 17 percent, New Yorkers are good recyclers of corrugated cardboard, for example (79 percent).
What about waste from businesses?
Businesses in New York City must pay to haul away their trash (an estimated 11 million tons of it every year). In 2017, large food service establishments and arenas were required to separate their food waste or face fines. In August of this year, the New York City Council passed an ordinance to require large restaurants and hotels and large food manufacturers to separate out their food waste. Fines will begin in February.
Just this week, the city announced a new plan that will require all private haulers picking up commercial waste to provide recycling and organics collection. Businesses will have an incentive to participate: they will pay lower rates for food scrap and recycling pickups than for garbage, a Sanitation Department spokeswoman said.
Will composting come to high-rise apartment buildings?
It’s a work in progress. The Department of Sanitation says that 2,000 high rises throughout the five boroughs currently have brown bin service. An effort is underway to sign up more high rises in Manhattan and the South Bronx.
Council Member Ben Kallos represents the Upper East Side of Manhattan. The 168,000 residents in his district, the second largest in the city, mostly live in high rises. Mr. Kallos has proposed a measure that would require the mayor’s Zero Waste initiative to include concrete targets and progress updates. The measure failed, and the effort to bring residential composting to his district has been frustrating, he said.
“We’ve worked with a number of residents and buildings to get composting,” Mr. Kallos said. “But I’ve yet to hear of any successes. I’ve never seen any brown bins in my district and I’d be surprised if there are any.”
(A spokeswoman for the Department of Sanitation said that curbside service is available in all of Manhattan, including the Upper East Side, where 33 high-rise buildings have signed up for it.)
Is there a future for composting in New York City?
Experts are cautiously optimistic. Ms. Garcia said the city’s compost program is a priority, and the city remains dedicated to its Zero Waste goal. Ms. Garcia pointed out that residential compost collection is increasing. In 2017, the city collected 13,000 tons. In 2018, that amount grew to 43,000 tons (31,000 from brown bin pickups and another 12,000 from fall leaves, Greenmarket pickups and the Christmas tree recycling program).
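The 2018 compost figures quoted above can be checked the same back-of-the-envelope way (again, the arithmetic, not the article):

```python
# The city's 2018 compost haul, broken out by source (tons).
brown_bins = 31_000
leaves_and_other = 12_000  # fall leaves, Greenmarket pickups, Christmas trees
total_2017 = 13_000

total_2018 = brown_bins + leaves_and_other
print(total_2018)                         # prints 43000, matching the city's figure
print(round(total_2018 / total_2017, 1))  # prints 3.3 -- more than triple in one year
```

The two components do sum to the 43,000 tons the city reported, a jump of more than threefold over 2017.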
“We’ve seen a lot of growth,” Ms. Garcia said, crediting in large part the work of nonprofits like the NYC Compost Project (nyc.gov/compostproject) and GrowNYC, which provide food scrap drop-off sites at subway stops and at green markets.
Ambitious scheme also aims to fully decarbonise country’s economy shortly after
Tue 13 Nov 2018 10.13 EST Last modified on Tue 13 Nov 2018 11.46 EST
El Bonillo solar plant, Albacete. The Spanish government aims to install at least 3,000MW of wind and solar power capacity yearly in the next decade. Photograph: Pablo Blazquez Dominguez/Getty Images
Spain has launched an ambitious plan to switch its electricity system entirely to renewable sources by 2050 and completely decarbonise its economy soon after.
By mid-century greenhouse gas emissions would be slashed by 90% from 1990 levels under Spain’s draft climate change and energy transition law.
To do this, the country’s social democratic government is committing to installing at least 3,000MW of wind and solar power capacity every year over the next decade.
New licences for fossil fuel drilling, hydrocarbon exploitation and fracking wells will be banned, and a fifth of the state budget will be reserved for measures to mitigate climate change. This share will ratchet upwards from 2025.
Christiana Figueres, a former executive secretary of the UN’s framework convention on climate change (UNFCCC), hailed the draft Spanish law as “an excellent example of the Paris agreement”. She added: “It sets a long-term goal, provides incentives on scaling up emissions technologies and cares about a good transition for the workforce.”
Under the plan, “just transition” contracts will be drawn up, similar to the £220m package announced in October that will shut most Spanish coalmines in return for a suite of early retirement schemes, re-skilling in clean energy jobs and environmental restoration. These deals will be partly financed by auction returns from the sale of emissions rights.
The government has already scrapped a controversial “sun tax” that halted Spain’s booming renewables sector earlier this decade, and the new law will also mandate a 35% electricity share for green energy by 2030.
James Watson, chief executive of the SolarPower Europe trade association, said the law was “a wake-up call to the rest of the world”.
Energy efficiency will also be improved by 35% within 11 years, and government and public sector authorities will be able to lease only buildings that have almost zero energy consumption.
Laurence Tubiana, chief executive of the European Climate Foundation, and former French climate envoy who helped draft the Paris accord, described the agreement as groundbreaking and inspirational. “By planning on going carbon neutral, Spain shows that the battle against climate change is deadly serious, that they are ready to step up and plan to reap the rewards of decarbonisation,” she said.
However, the government’s hold on power is fragile. With just a quarter of parliamentary seats, it will depend on the more leftwing Podemos and liberal Ciudadanos parties to pass the climate plan.
No dates were included in the legislation for phaseouts of coal or nuclear energy, and a ban on new cars with petrol or diesel engines was delayed until 2040.
Rising temperatures are creating new habitats for ticks.
By Marlene Cimons Nexus Media November 12, 2018
Lyme disease is transmitted to humans by black-legged ticks, pictured above. Rising temperatures are providing new habitats for these arachnids.
German physician Alfred Buchwald had no clue that the chronic skin inflammation he described in 1883 was the first recorded case of a serious tick-carrying disease, one that would take hold in a small Connecticut town almost a century later and go on to afflict people across the United States.
Today we know a lot more than Buchwald did about Lyme disease — that it is caused by a bacterium, Borrelia burgdorferi, that it is transmitted to humans by black-legged ticks, and that it can cause untold misery for those infected. U.S. scientists first recognized the disease in the 1970s in Old Lyme, Connecticut — hence the name. The condition starts with fever, headache, fatigue and a characteristic bullseye rash. Untreated, it can spread to the joints, the heart and the nervous system, producing long-lasting, debilitating symptoms. Early use of antibiotics is crucial.
About 300,000 Americans are diagnosed annually with Lyme, with cases concentrated in the Northeast and upper Midwest, according to the Centers for Disease Control and Prevention. The incidence of the disease has doubled in the United States since 1991, according to the EPA. And it’s about to get much worse, thanks to climate change.
“Warmer temperatures are making cold places suitable habitats for ticks, so new places are having Lyme disease cases, and endemic areas are having more cases than the average,” said Edson Severnini, assistant professor of economics and public policy at Carnegie Mellon University’s Heinz College and co-author of a new study that predicts the incidence of Lyme disease will rise around 21 percent by mid-century.
Climate change already has amplified the range of invasive insects that devour crops, destroy homes, and spread disease. “Tick-borne diseases are an important public health concern and the incidence of these infections is increasing in the United States and worldwide,” said Igor Dumic, the study’s lead author and a researcher at the Mayo Clinic College of Medicine and Science who has treated numerous Lyme patients. “Lyme disease is a classic example of the link between environmental factors and the occurrence and spread of disease.”
Ticks spend most of their lives in environments where temperature and humidity directly affect their survival. For this reason, the EPA uses Lyme disease as an indicator of climate change. Higher temperatures spur ticks to venture farther in search of hosts, such as deer, which are more plentiful after warmer winters. “The Lyme disease vector tick needs deer to complete its life cycle, so this means that more ticks will be completing their life cycle, and consequently the tick population will increase,” Severnini said. “Also, as temperature rises, people may engage in more outdoor activities, increasing exposure to ticks.”
The research, which appears in the Canadian Journal of Infectious Diseases and Medical Microbiology, examined the relationship between weather conditions and Lyme disease in 15 U.S. states. These states, located primarily in the Northeast and Upper Midwest, account for 95 percent of all reported U.S. cases. The scientists used epidemiological data from the CDC and meteorological data from the National Oceanic and Atmospheric Administration.
Warmer winters mean more deer, which host ticks.
Assuming the temperature will increase by 2 degrees Celsius by mid-century — the forecast of the U.S. National Climate Assessment — the United States will see 8.6 more cases of Lyme disease per 100,000 people annually. That is bad news, but governments can take steps to keep the disease in check, Severnini said.
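To get a feel for the scale, the per-capita rate above can be translated into a rough national total. A minimal sketch, with the caveat that the U.S. population figure is our assumption for illustration, not a number from the study:

```python
# Rough illustration of "8.6 more cases per 100,000 people annually" nationwide.
extra_rate = 8.6 / 100_000     # additional annual cases per person, from the study
us_population = 330_000_000    # assumed round figure, for illustration only

extra_cases = extra_rate * us_population
print(round(extra_cases))  # prints 28380
```

Under that assumption, the projection works out to on the order of 28,000 additional Lyme cases every year by mid-century.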
“We need to educate people about how to look for ticks after going to wooded areas where ticks are abundant,” he said. “Secondly, people and clinicians should be aware that just because ticks are not present in certain areas it doesn’t mean that people aren’t traveling to areas where ticks are present. For example, a resident of Arizona, where Lyme disease is rare, can acquire it while camping in Wisconsin and get symptoms upon returning to [his or her] home state of Arizona.”
It’s not just education, Severnini said. “We can also invest in the development of a Lyme vaccine, and use insecticides and acaricides to decrease the tick population,” he added. “Finally, we can prevent severe global warming.”
By Avery Miles | January 28, 2019
After a rainy summer and an even wetter fall, New Yorkers are becoming familiar with showery forecasts. With droughts and wildfires in the West and Southwest, rain might seem like a welcome reprieve—but not in New York City, where the rain washes pet waste, plastic utensils and other street grime into an aging sewer system that often can’t handle it.
Every year the city releases 100 billion liters of untreated sewage and stormwater runoff into its waterways. Whenever it rains heavily, the surge of stormwater pushes the sewers to maximum capacity, causing wastewater to flow from the 100-year-old system into nearby rivers.
The city is working on various ways to improve water quality but those efforts might not be enough. Recently, researchers at Queens College found that pollution from the overflow might actually be contributing to greenhouse gases in nearby marshes.
“The take home message [was] carbon additions increased both carbon dioxide and methane production in wetland soils,” said lead study author, Dr. Brian Brigham.
Typically, wetlands serve as carbon sinks, areas that absorb carbon dioxide from the atmosphere. They trap and store the carbon through a process called biological carbon sequestration: wetland vegetation and soil accumulate organic matter (material containing carbon), which decomposes only slowly, with some carbon returned to the atmosphere through natural respiration. This is all part of the carbon cycle, the set of processes by which natural systems absorb and emit carbon. It’s estimated that wetlands store about 35 percent of all land-based carbon.
But the U.S. has lost more than half of its original wetland area to agricultural and urban development. When wetlands are drained or otherwise disturbed, that stored carbon can be released back into the atmosphere. And every time heavy rain carries human pollutants into surrounding waterways, wetlands lose some of their ability to contain carbon, allowing more greenhouse gases to enter the atmosphere.
To show this, Brigham and his team collected mud, soil and microbes from three different marshes along the Hudson River and, in the lab, added certain types of carbon and nitrogen commonly found in sewage-polluted environments to simulate what would happen if sewage entered the wetlands. They found that the added carbon increased carbon dioxide production rates 1.4 to 2 times in the treated soils compared with the controls. The added carbon also increased methane production rates at all three sites, to a degree that depended on their salinity.
Salty environments inhibit methane production. The less salinity a marsh has—like the brackish waters of the Piermont and Iona Island Marshes—the more methane is produced. Extra carbon caused significantly greater methane production in Piermont and Iona Island compared to the saltier waters of Staten Island’s Saw Mill Creek Marsh. Every influx of stormwater carrying human pollutants from the sewers brings along excess carbon, which fuels microbial respiration, producing more methane.
The researchers concluded that inorganic nitrogen was not a major driver in carbon dioxide and methane production rates.
“The most important thing about this study is that it points to the fact that there are a lot of underappreciated impacts from water pollution,” said Dan Shapley, New York Riverkeeper’s Water Quality Program Director, who said it could shake the ground for managing greenhouse gases and preventing climate change.
Other environmental scientists have also been studying this area and have found similar results.
Siobhan Fennessy is a professor of biology and environmental studies at Kenyon College and she thought the Queens College researchers’ results were generally consistent with how carbon cycles function in wetlands. One difference, she noted, between what happens in natural sites compared to the lab study is that they added acetate as a carbon source. Acetate is a chemical compound that Brigham’s team used because it’s likely to exist in sewage-polluted environments.
“So the huge response they got in the lab study may be more,” Fennessy said, “than what they would’ve gotten if they had used the same sort of carbon sources that are in the CSO [combined sewer overflow].”
Climate change is leading to more intense rainstorms and as a result there’s a peak load problem. The NYC Panel on Climate Change expects up to an 11 percent increase in precipitation by the 2050s.
This excess amount of water is predicted to stress the city’s sewer system even further, causing wastewater that is normally treated by treatment plants to flow into nearby rivers and wetlands.
Under a 2012 Consent Order, the city committed to spend a total of $4.2 billion to mitigate combined sewer overflows. It promised to invest in improvements like wastewater treatment plant upgrades, storage tanks and sewer separation as well as allot $1.5 billion to green infrastructure projects like rain gardens and green roofs.
The city’s green infrastructure addresses 90 percent of rain events for the year but it’s not designed for very large storms, said John McLaughlin, the DEP’s managing director of the Office of Ecosystem Services, Green Infrastructure and Research.
“It’s all very storm-specific,” he said. If it rains three inches in four hours, the rain gardens and green roofs probably won’t be able to handle that volume of water, but they probably can take on three inches over the course of a day. If the rainfall is spaced out enough and the subsurface infiltration is great enough, green infrastructure is equipped to handle stormwater catchment, he explained.
Some still think the city could be a lot more ambitious about how it reduces CSOs. Korin Tangtrakul, a soil and water conservation stormwater technician with the New York SWIM Coalition, believes a more robust and effective green infrastructure plan would help reduce the overflow.
“To think that sewage is causing more greenhouse gas emissions in wetlands—it’s a pretty terrifying prospect,” she said.
As the city’s population increases, the amount of sanitary waste pouring into the system also skyrockets, potentially adding to the greenhouse gas emissions. According to the Mayor’s Office of Climate Policy and Programs, the city is leading in the fight against climate change with a commitment to reduce these emissions 80 percent by 2050.
But Dr. Greg O’Mullan believes the problem is a lack of understanding of how CSOs and greenhouse gas emissions relate to one another. An environmental microbiologist who worked on the study with Brigham, O’Mullan has been studying bacteria indicators in the city’s surrounding waterways for the past 15 years.
“The CSOs are a design feature that has very negative consequences,” said O’Mullan. When city planners discuss CSOs, he said they should look beyond just the health factors like reducing pathogens and infection risk.
“We don’t account for what’s released into the environment and turned into greenhouse gases,” he said, because carbon can be delivered in one form but there are alterations that happen in the environment, which turn it into carbon dioxide and methane.
“New York City has reduced sewer overflows by 80 percent and the Harbor is cleaner today than it has been since the Civil War,” according to a spokesperson for the mayor’s office, and the city “continues to invest billions of dollars to reduce overflows and clean up all of the waterways.”
But Tangtrakul said the study’s findings further demonstrate the need to take action to reduce combined sewer overflows. New Yorkers should start conserving water during rainstorms to help reduce the amount of sewage contributing to a CSO, she said: that means waiting to do laundry or dishes until after the storm passes.
“If every resident in New York did that,” she said, “we would significantly reduce how much sewage ended up in the waterways.”
Nick Sobczyk, E&E News reporter
E&E Daily: Thursday, January 10, 2019
Climate activists protesting on Capitol Hill. @sunrisemvmt/Twitter
Progressive organizers today presented lawmakers with a formal wish list for "Green New Deal" legislation, asking for a sweeping plan to address climate change and move to 100 percent renewable energy.
In a letter to lawmakers on Capitol Hill, more than 600 advocacy groups, including Friends of the Earth, Food & Water Watch, the Center for Biological Diversity and the Sunrise Movement, laid down a set of principles, fleshing out the "Green New Deal" proposal floated by Rep. Alexandria Ocasio-Cortez (D-N.Y.).
At the top of the list is an end to all fossil fuel leasing and subsidies and a transition to all renewables "by 2035 or earlier."
The letter is the most detailed look yet at what progressives mean when they say "Green New Deal." To this point, conservatives and energy industry groups have criticized the platform for being too vague and overly ambitious given current technology.
But the letter is still largely goals, rather than policies, and some of its more specific proposals won't sit well with even some Democrats.
For instance, the groups write that any climate legislation should reinstate the 40-year ban on crude oil exports, which ended during the Obama administration.
They also said they would oppose any bill to promote "corporate schemes," including carbon capture and storage, emissions trading, and nuclear energy, all seen in some parts of the climate policy world as viable, if incomplete, solutions to global warming.
Other ideas are unsurprising, such as an effort to update the U.S. electric grid and promote battery storage as part of the effort to get to 100 percent renewables.
"In addition, Congress must bring the outdated regulation of electricity into the twenty-first century, encouraging public and community ownership over power infrastructure and electricity choice, as well as permitting distributed energy sources, including rooftop and community solar programs to supply the grid," they wrote.
Progressives also want an economic plan to ensure a just transition for fossil fuel workers and statutory requirements and timelines for EPA greenhouse gas regulations under the Clean Air Act.
For now, the closest thing to the progressive plan is the "Climate Solutions Act," introduced this week by California Democratic Rep. Ted Lieu (Greenwire, Jan. 9).
And while that won't pass the Senate or be signed by President Trump, progressives are looking to influence Democratic debates in the House headed into the 2020 presidential race.
"As the world teeters on the brink of climate catastrophe, we're calling on Congress to take large-scale action," Bill Snape, senior counsel at the Center for Biological Diversity, said in a statement. "Americans want a livable future for their children, and that requires keeping fossil fuels in the ground while greening the economy on a wartime footing."
Re: Legislation to Address the Urgent Threat of Climate Change
January 10, 2019
On behalf of our millions of members and supporters, we are writing today to urge you to consider the following principles as the 116th Congress debates climate change legislation and momentum around the country builds for a Green New Deal. As the Intergovernmental Panel on Climate Change recently warned, if we are to keep global warming below 1.5°C, we must act aggressively and quickly. At a minimum, reaching that target requires visionary and affirmative legislative action in the following areas:
Halt all fossil fuel leasing, phase out all fossil fuel extraction, and end fossil fuel and other dirty energy subsidies.
The science is clear that fossil fuels must be kept in the ground. Pursuing new fossil fuel projects at this moment in history is folly. Most immediately, the federal government must stop selling off or leasing publicly owned lands, water, and mineral rights for development to fossil fuel producers. The government must also stop approving fossil fuel power plants and infrastructure projects. We must reverse recent legislation that ended the 40-year ban on the export of crude oil, end the export of all other fossil fuels, and overhaul relevant statutes that govern fossil fuel extraction in order to pursue a managed decline of fossil fuel production. Further, the federal government must immediately end the massive, irrational subsidies and other financial support that fossil fuel, and other dirty energy companies (such as nuclear, waste incineration and biomass energy) continue to receive both domestically and overseas.
Transition power generation to 100% renewable energy.
As the United States shifts away from fossil fuels, we must simultaneously ramp up energy efficiency and transition to clean, renewable energy to power the nation’s economy, where, in addition to excluding fossil fuels, any definition of renewable energy must also exclude all combustion-based power generation, nuclear, biomass energy, large-scale hydro and waste-to-energy technologies. To achieve this, the United States must shift to 100 percent renewable power generation by 2035 or earlier. This shift will necessitate upgrading our electricity grid to be smart, efficient, and decentralized, with the ability to incorporate battery storage and distributed energy systems that are democratically governed. In addition, Congress must bring the outdated regulation of electricity into the twenty-first century, encouraging public and community ownership over power infrastructure and electricity choice, as well as permitting distributed energy sources, including rooftop and community solar programs to supply the grid.
Expand public transportation and phase out fossil fuel vehicles.
As the transition away from fossil fuels occurs, our transportation system must also undergo 100 percent decarbonization. To accomplish a fossil-fuel-free reality, Congress must require and fund greater investment in renewable-energy-powered public transportation that serves the people who need it most. The United States must also phase out the sale of automobiles and trucks with internal fossil fuel combustion engines as quickly as possible and phase out all existing fossil fuel mobile sources by 2040 or earlier. Federal credits for electric vehicles must be expanded.
Harness the full power of the Clean Air Act.
The Clean Air Act provides powerful tools that have proven successful in protecting the air we breathe and reducing greenhouse pollution. It can also serve as an important backstop to ensure climate targets are met. Congress should harness the full power of the statute by setting strict deadlines and providing adequate funding for EPA to carry out all its duties under all applicable sections of the Act, including implementing greenhouse pollution reduction requirements for cars, trucks, aircraft, ships, smokestacks and other sources, as well as a science-based national pollution cap. The Act has successfully reduced many air pollutants and can do the same for greenhouse pollution.
Ensure a Just Transition led by impacted communities and workers.
In effectuating this energy transformation, it is critical to prioritize support for communities who have historically been harmed first and most by the dirty energy economy and workers in the energy sector and related industries. We support a comprehensive economic plan to drive job growth and invest in a new green economy that is designed, built and governed by communities and workers. Building new energy, waste, transportation and housing infrastructure, designed to serve climate resilience and human needs; retrofitting millions of buildings to conserve energy and other resources; and, actively restoring natural ecosystems to protect communities from climate change, are but a few ways to build a sustainable, low carbon economy where no one is left behind during this change.
Uphold Indigenous Rights
The United Nations Declaration on the Rights of Indigenous Peoples (UNDRIP) must be upheld and implemented, along with treaties, instruments and decisions of international law that recognize that Indigenous Peoples have the right to give or withhold “free, prior and informed consent” to legislation and development of their lands, territories and/or natural resources, cultural properties and heritage, and other interests, and to receive remedies of losses and damages of property taken without consent.
Further, we will vigorously oppose any legislation that: (1) rolls back existing environmental, health, and other protections, (2) protects fossil fuel and other dirty energy polluters from liability, or (3) promotes corporate schemes that place profits over community burdens and benefits, including market-based mechanisms and technology options such as carbon and emissions trading and offsets, carbon capture and storage, nuclear power, waste-to-energy and biomass energy. Fossil fuel companies should pay their fair share for damages caused by climate change, rather than shifting those costs to taxpayers.
We look forward to working with you to address the gravest environmental crisis humanity has ever faced, to protect all present and future generations around the world, while centering the rights of those communities and workers most impacted.
January 25, 2019
From Environment & Energy Report
By Stephen Lee
The Interior Department is soon expected to release a proposal that would sweep away possible criminal prosecutions for oil and gas companies, wind and solar energy firms, ranchers, farmers, and developers who harm migratory birds.
If eventually finalized, as expected, the proposal would undo decades of established practice. For at least the last 50 years, the Migratory Bird Treaty Act has been interpreted as forbidding the killing, injuring, or capture of any covered bird species or their eggs or nests, even if done by accident.
That strict liability approach would come to an end under the expected changes: Only intentional migratory bird killings would count as violations of the statute.
That could mean the roughly 1,000 species covered under the statute would essentially lose their protections, according to Sarah Greenberger, senior vice president of conservation policy at the National Audubon Society.
“When you build a wind tower, your purpose is not to kill a bird,” Greenberger said. “Your intent is to create energy. When you build an oil waste pit, your intent is to get rid of oil waste, not to trap birds.”
The statute helped the federal government secure a $100 million settlement from Exxon Mobil Corp. for the 1989 Exxon Valdez oil spill, according to Greenberger. She said that settlement wouldn’t have been possible under the expected changes from the Trump administration.
The oil and gas industry backs the move.
“The U.S. oil and natural gas industry supports protection of migratory birds, and it is clear that the Migratory Bird Treaty Act should not be used for overzealous enforcement of criminal penalties on those engaging in otherwise lawful activities,” the American Petroleum Institute told Bloomberg Environment in a statement.
In December 2017, the Interior Department said in a memo that bird deaths prohibited by the Migratory Bird Treaty Act should apply “only to affirmative actions that have as their purpose the taking or killing of migratory birds, their nests, or their eggs.”
As previously formulated, the Migratory Bird Treaty Act “hangs the sword of Damocles over a host of otherwise lawful and productive actions, threatening up to six months in jail and a $15,000 penalty for each and every bird injured or killed,” the memo said.
Action Expected Post-Shutdown
The rulemaking is significant because the memo could be undone by a future administration. Once a new approach is formally issued as a regulation, it could only be reversed by another rulemaking, which would likely encounter opposition and court challenges.
Bird species that could be affected include the western meadowlark and great blue heron, which are vulnerable to oil waste pits; the brown pelican, common loon, and tufted puffin, which have been killed or injured in large numbers by oil spills in recent decades; and the red-tailed hawk and great horned owl, which are at high risk of electrocution from transmission lines, according to the National Audubon Society.
The proposal hasn’t been published yet, but the Trump administration has given signs that it’s coming. Following the December 2017 memo, Interior listed a proposal to “define the scope” of the law in its fall 2018 regulatory agenda. At the time, the department said it would publish a notice of proposed rulemaking in November.
Since then, environmental groups that oppose the proposal say they have been expecting it any time and now think it will be released shortly after the government shutdown ends.
“It could be days after the shutdown ends,” Brett Hartl, government affairs director for the Center for Biological Diversity, said. “We just don’t know.”
The Interior Department didn’t respond to a query about its plans.
‘Example of Astute Governance’
Interior’s approach is “an example of astute governance that provides certainty for responsible owners and operators of oil and natural gas facilities,” API said in its statement. “The opinion reinforces the original intent and text of this law and provides assurance to many others who engage in everyday and commercial activities like farming, mining, and operating wind turbines and solar facilities.”
Mike Speerschneider, senior director of permitting policy and environmental affairs at the American Wind Energy Association, said, “We haven’t seen any specifics, but would be interested to review a proposal from the administration that clarifies MBTA liability risk.”
“Whatever the proposal, our industry will continue to take actions, such as following the U.S. Fish and Wildlife Service Wind Energy Guidelines and focusing on ongoing research and conservation efforts, to minimize our already small impacts to birds,” Speerschneider said.
Attorneys general from California, Illinois, Maryland, Massachusetts, New Jersey, New Mexico, New York, and Oregon filed a lawsuit last May to challenge Interior’s 2017 memo.
The National Audubon Society, joined by three other groups, and the Natural Resources Defense Council, joined by one other group, filed lawsuits May 24 in the U.S. District Court for the Southern District of New York arguing that the policy violates federal laws.
"Detection of a toxic chemical in a northeastern Wisconsin wastewater treatment plant’s sludge has prompted a halt to application of the material on nearby farms and raised broader concerns about how public sewer systems across the state may be spreading the chemical across the landscape.
The contaminated sludge in Marinette also highlights unease and confusion in local communities over the absence of enforceable federal or Wisconsin environmental standards for the chemicals — often referred to by the acronym PFAS — despite at least two decades of research linking them to serious health problems.
Marinette has the worst PFAS contamination of drinking water that has been detected in the state. Private wells serving dozens of homes in the neighboring town of Peshtigo are affected, many with PFAS levels exceeding a federal health advisory. Tyco Fire Products, the local company blamed for the pollution, has installed water treatment systems and distributed bottled water in dozens of homes."
A surface elevation table is used to measure subsidence in a tropical swamp in Indonesia.
By Paul Voosen, Jan. 30, 2019, 12:10 PM
For coastal communities, the sea level rise propelled by melting ice and warming oceans is bad enough. But people living on the soft, compressible sediments of river deltas have another factor to contend with: sinking land. Scientists have traditionally inferred the sinking from tide gauge readings or measured it directly at GPS stations. But a team of scientists now says these methods significantly underestimate subsidence at many deltas and low-lying coastlines worldwide.
In recent years, scientists at Tulane University in New Orleans, Louisiana, have shown that in the Mississippi River delta, fluffy, young sediments within a few meters of the surface are compacting rapidly. They estimate the effect more than doubles the region's rate of sea-level rise to a total of 13 millimeters a year. Tide gauges and GPS stations miss that subsidence because they are anchored to deeper layers, which are less susceptible to compaction.
The same mechanism is likely at play in many low-lying coastal areas worldwide, which host some 10% of the global population, the team argues in a paper published this week in Ocean Science. "Tide gauges are not measuring what we need," says Torbjörn Törnqvist, a geologist at Tulane and co-author on the study. "We need to really rethink how we monitor these areas."
Satellites are the main tools for monitoring the absolute changes in ocean height, which reflect the biggest drivers of sea level rise: melting ice and the expansion of warming water. But for people and ecosystems, the relative impact of rising or falling land is just as important. Some regions are still rebounding thousands of years after ancient ice sheets melted, lifting a colossal weight off Earth's elastic mantle. Many more are subsiding. "It's something we've been overlooking too long in sea level projections," says Aimée Slangen, a climate scientist at the Royal Netherlands Institute for Sea Research in Yerseke and a lead author of the sea level chapter of the next United Nations climate report.
Louisiana, for example, is sinking fast. Although compaction is the primary culprit, the extraction of groundwater, oil, and gas also plays a role. Sediment washed down the Mississippi River once compensated for the subsidence, but levees and other engineered structures now shunt it out into the Gulf of Mexico. To monitor the sediment loss, the state over the past few decades has deployed a network of some 400 simple wetland-monitoring instruments, called surface elevation tables.
The table, a metal arm that juts out parallel to the swamp's surface, is anchored to a pole driven deep below. Twice a year, a series of pins are lowered from the table until they just touch the marsh surface—giving a regular measure of how fast the surface is sinking relative to deeper layers. Five years ago, when Törnqvist's group began to use this network to divine the source of Louisiana's subsidence, researchers realized the problem is not just sediment loss. Shallow soils, deposited in earlier centuries when the river ran free, are simply compressing. "Tide gauges were not capturing that," says Molly Keogh, the Tulane graduate student who led the new work.
The new paper lays out why. The region's 131 tide gauges measure the tide in comparison with benchmarks anchored in deep sediments, often tens of meters down—"as close as we get to bedrock in Louisiana," Keogh says. The region's 10 GPS stations with known benchmarks were also anchored, on average, 14 meters deep in the mud. To both devices, the zone of the most compaction—a source of half the sea level rise—was invisible. The scenario could be true in other delta regions that also rely on tide gauges, casting doubt on estimates of regional sea level rise, says Mark Schuerch, a physical geographer at the University of Lincoln in the United Kingdom. "It's quite innovative and quite exciting—or scary, really."
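The gap the researchers describe can be sketched with back-of-envelope arithmetic. The numbers below are illustrative assumptions chosen only to match the article's Mississippi delta figures (a total of about 13 millimeters a year, with shallow compaction supplying roughly half), not measurements from the paper:

```python
# Sketch: why a tide gauge anchored below the compaction zone understates
# relative sea-level rise at the marsh surface. All rates are assumed,
# illustrative values in mm per year.

absolute_rise = 3.0        # satellite-measured rise in ocean height
deep_subsidence = 3.5      # sinking of the deep layers the gauge is anchored to
shallow_compaction = 6.5   # compaction of young sediments above the anchor

# A gauge benchmarked tens of meters down records only the first two terms:
gauge_relative_rise = absolute_rise + deep_subsidence

# The marsh surface itself also rides down on the compacting shallow soils:
true_relative_rise = gauge_relative_rise + shallow_compaction

print(f"gauge-derived rise: {gauge_relative_rise:.1f} mm/yr")
print(f"true relative rise: {true_relative_rise:.1f} mm/yr")
```

With these assumed values the gauge sees about half of the roughly 13 mm/yr the surface actually experiences, which is the "invisible" half of sea-level rise described above.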
Figuring out the anchor depths of tide gauges elsewhere in the world will be a herculean task, warns Philip Woodworth, former director of the Permanent Service for Mean Sea Level in Liverpool, U.K., who reviewed the paper. Tide gauge records do not typically include the depth of their benchmarks; that knowledge, if it exists, is buried in country bureaucracies.
Moreover, the rate of shallow compaction probably varies greatly from wetland to wetland. In some marshes, plants compensate for compaction by capturing new sediment with their roots. And in some regions, such as Bangladesh, compaction occurs more uniformly across shallow and deep layers, says Céline Grall, a marine geologist at Columbia University's Lamont-Doherty Earth Observatory in Palisades, New York. "These assumptions are not true there."
Deploying elevation tables in deltas around the world could resolve those uncertainties, creating a global database, Törnqvist says. The tool is simple, cheap, and effective, and has already been used in more than 30 countries. For an area the size of coastal Louisiana, only 40 would be needed to keep track of subsidence—and determine how fast seas are truly rising. The millions of people living in the world's deltas need to know the answer, Grall says. "That's a legacy we should work on."
Public Release: 24-Jan-2019
Joint report: Technologies of the Fourth Industrial Revolution show huge potential and could lead to "dematerialization", better product tracking, take-back and recycling, and products sold as services
United Nations University
Seven UN entities have come together, supported by the World Economic Forum and the World Business Council for Sustainable Development to address e-waste
A new joint report shows that the world now discards approximately 50 million tonnes of electronic and electrical waste (e-waste) per year, greater in weight than all of the commercial airliners ever made, or enough Eiffel Towers to fill Manhattan. Only 20% is formally recycled. If nothing changes, the United Nations University predicts e-waste could nearly triple to 120 million tonnes by 2050.
In terms of material value, this presents an opportunity worth over 62.5 billion dollars per year, more than the GDP of most countries and three times the output of the world's silver mines. A tonne of e-waste contains 100 times more gold than a tonne of gold ore.
The joint report calls for a new vision for electronics based on the circular economy and the need for collaboration with major brands, small and medium-sized enterprises (SMEs), academia, trade unions, civil society and associations in a deliberative process to change the system.
The report points to the importance of new technologies; although e-waste is growing, technologies from the internet of things to cloud computing show huge potential and could lead to "dematerialization" and better product tracking, take-back and recycling.
Major global brands, governments and other organizations support the initiative with commitments and projects to address e-waste and build a circular economy.
Davos, Switzerland, 24 January 2019 -- Seven UN entities have come together, supported by the World Economic Forum and the World Business Council for Sustainable Development (WBCSD), to call for an overhaul of the current electronics system, with the aim of supporting international efforts to address e-waste challenges.
The report calls for a systematic collaboration with major brands, small and medium-sized enterprises (SMEs), academia, trade unions, civil society and associations in a deliberative process to reorient the system and reduce the waste of resources each year with a value greater than the GDP of most countries.
Each year, approximately 50 million tonnes of electronic and electrical waste (e-waste) are discarded, more than the weight of all commercial airliners ever made. In terms of material value, this is worth 62.5 billion dollars, more than the GDP of most countries.
Less than 20% of this is recycled formally. Informally, millions of people worldwide (over 600,000 in China alone) work to dispose of e-waste, much of it done in working conditions harmful to both health and the environment.
The report, "A New Circular Vision for Electronics - Time for a Global Reboot," launched in Davos on 24 January, says technologies such as cloud computing and the internet of things (IoT) support gradual "dematerialization" of the electronics industry.
Meanwhile, to capture the global value of materials in the e-waste and create global circular value chains, the report also points to the use of new technology to create service business models, better product tracking and manufacturer or retailer take-back programs.
The report notes that material efficiency, recycling infrastructure and scaling up the volume and quality of recycled materials to meet the needs of electronics supply chains will all be essential for future production.
And if the electronics sector is supported with the right policy mix and managed in the right way, it could lead to the creation of millions of decent jobs worldwide.
The joint report calls for collaboration with multinationals, SMEs, entrepreneurs, academia, trade unions, civil society and associations to create a circular economy for electronics where waste is designed out, the environmental impact is reduced and decent work is created for millions.
The new report supports the work of the E-waste Coalition, which includes:
International Labour Organization (ILO);
International Telecommunication Union (ITU);
United Nations Environment Programme (UN Environment);
United Nations Industrial Development Organization (UNIDO);
United Nations Institute for Training and Research (UNITAR);
United Nations University (UNU), and
Secretariats of the Basel and Stockholm conventions.
The Coalition is supported by the World Business Council for Sustainable Development (WBCSD) and the World Economic Forum and coordinated by the Secretariat of the Environment Management Group (EMG).
Considerable work is being done on the ground. For example, in order to grasp the opportunity of the circular economy, today the Nigerian Government, the Global Environment Facility and UN Environment announce a 2 million dollar investment to kick off the formal e-waste recycling industry in Nigeria. The new investment will leverage over 13 million dollars in additional financing from the private sector.
According to the International Labour Organization, up to 100,000 people in Nigeria work in the informal e-waste sector. This investment will help to create a system that formalizes these workers, giving them safe and decent employment while capturing the latent value in Nigeria's 500,000 tonnes of e-waste.
UNIDO collaborates with a large number of organizations on e-waste projects, including UNU, ILO, ITU, and WHO, as well as various other partners, such as Dell and the International Solid Waste Association (ISWA). In the Latin American and Caribbean region, a UNIDO e-waste project, co-funded by GEF, seeks to support sustainable economic and social growth in 13 countries. From upgrading e-waste recycling facilities, to helping to establish national e-waste management strategies, the initiative adopts a circular economy approach, whilst enhancing regional cooperation.
Another Platform for Accelerating the Circular Economy (PACE) report launched today by the World Economic Forum, with support from Accenture Strategy, outlines a future in which Fourth Industrial Revolution technologies provide a tool to achieve a circular economy efficiently and effectively, and where all physical materials are accompanied by a digital dataset (like a passport or fingerprint for materials), creating an 'internet of materials.' PACE is a collaboration mechanism and project accelerator hosted by the World Economic Forum which brings together 50 leaders from business, government and international organizations to collaborate in moving towards the circular economy.
United Nations University (UNU)
"UNU's and its world-wide partners' research and advocacy of sustainable e-waste practices have substantially contributed to placing the issue of electronic waste on the political agenda. But current efforts are insufficient to address this fast-growing problem. We need to develop innovative policies. We need to establish and monitor targets so we can measure whether our policies have any impact. We need new multi-stakeholder alliances, because reducing e-waste will require the cooperation of many actors, including the private sector. We hope that the E-Waste Coalition and this report will instigate the important innovation required." David Malone, Rector, UNU & UN Under-Secretary General
International Telecommunication Union (ITU)
"ITU has been raising awareness and guiding efforts to reduce and rethink e-waste since 2011. So I am delighted to see that a movement to promote a circular economy for electronics is now gaining ground. Together, with newly created partnerships such as the United Nations E-waste Coalition, we can transform waste into wealth, and deliver development benefits to all." Houlin Zhao, Secretary-General, ITU
United Nations Environment Programme (UN Environment)
"A circular economy brings with it tremendous environmental and economic benefits for us all. UN Environment is proud to support this innovative partnership with the Government of Nigeria and the Global Environment Facility and support the country's efforts to kick start a circular electronics system. Our planet's survival will depend on how well we retain the value of products within the system by extending their life." Joyce Msuya, Acting Executive Director, UN Environment
World Business Council for Sustainable Development (WBCSD)
"Global e-waste is the fastest growing waste stream and presents societal and environmental risk. This summary clearly lays out why we must act at scale, now, and collaborate between business, international organizations, governments and NGOs. WBCSD is committed, through Factor10 and the Alliance to End Plastic Waste, to achieving a world where waste has no place." Peter Bakker, President and CEO, World Business Council for Sustainable Development
International Labour Organization (ILO)
"Thousands of tonnes of e-waste are disposed of by the world's poorest workers in the worst of conditions, putting their health and lives at risk. We need better e-waste strategies and green standards as well as closer collaboration between governments, employers and unions to make the circular economy work for both people and planet." Guy Ryder, Director-General, ILO
United Nations Industrial Development Organization (UNIDO)
"E-waste is a growing global challenge that poses a serious threat to the environment and human health worldwide. To minimize this threat, UNIDO works with various UN agencies and other partners on a range of e-waste projects, all of which are underpinned by a circular economy approach. This UN coalition is taking steps towards a cleaner, more sustainable and safer future, and in doing so, demonstrating how such cooperation can lead to transformational results where the total is greater than the sum of its parts." Stephan Sicars, Director, Department of Environment, UNIDO
United Nations Institute for Training and Research (UNITAR)
"Working together with the UN coalition on E-waste presents a new paradigm shift and a new dispensation with tremendous opportunities to support countries march towards a cleaner and more sustainable way of managing e-waste. UNITAR recognizes urgent needs in training and capacity building in the e-waste management value chain based on the national training needs assessment, and we are pleased to support this partnership and countries through our programmes." Nikhil Seth, Executive Director, UNITAR
Secretariat of the Basel, Rotterdam and Stockholm Conventions
"The global e-waste problem is one of the most pressing environmental issues we face, with significant threats to human health, especially for women, children, and other vulnerable groups such as the poor, often working in the informal recycling sector in developing countries without proper protective equipment and without following established norms. That is why 187 Parties to the Basel Convention have worked together to develop technical guidelines to ensure the environmentally sound management of e-waste, including concerning recycling and recovery of materials from mobile phones and computing equipment, and transboundary movements of waste and its proper disposal." Rolph Payet, Executive Secretary, Secretariat of the Basel, Rotterdam and Stockholm Conventions
World Economic Forum
"The circular economy offers incredible benefits, but it does require us to be less transactional with our resources, stewarding them through the economy rather than throwing them out after one use. The Fourth Industrial Revolution offers us the ability to rethink resource flows, while also providing better services to consumers. It's time to unlock that innovation potential. The innovative capacity of the electronics sector and the value inside the materials in electronics makes it the best place to start." Dominic Waughray, Head of the Centre for Global Public Goods, Member of the Managing Board, World Economic Forum
Joint report, "A New Circular Vision for Electronics - Time for a Global Reboot"
Rapid innovation and lowering costs have dramatically increased access to electronic products and digital technology, with many benefits. This has led to an increase in the use of electronic devices and equipment. The unintended consequence of this is a ballooning of electronic and electrical waste: e-waste.
It is difficult to gauge how many electrical goods are produced annually, but devices connected to the internet alone now outnumber humans. By 2020, their number is projected to reach between 25 billion and 50 billion, reflecting plummeting costs and rising demand.
E-waste is now the fastest-growing waste stream in the world. Some forms of it have been growing exponentially.
The UN has called it a tsunami of e-waste. It is estimated this waste stream reached 50 million tonnes in 2018.
This figure is expected to double if nothing changes.
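As a rough, hypothetical check on these projections (50 million tonnes in 2018, possibly reaching 120 million tonnes by 2050), the implied compound annual growth rate is modest, and a doubling over the next few decades follows from it. This assumes simple exponential growth, which the report's projections may not:

```python
import math

# Report figures: roughly 50 Mt of e-waste in 2018, projected to reach
# up to 120 Mt by 2050. Back out the implied compound annual growth rate.
start_mt, end_mt = 50.0, 120.0
years = 2050 - 2018  # 32 years

cagr = (end_mt / start_mt) ** (1 / years) - 1
print(f"implied growth rate: {cagr:.1%} per year")

# At that rate the waste stream doubles in ln(2) / ln(1 + cagr) years,
# consistent with the expectation that the figure could double.
doubling_years = math.log(2) / math.log(1 + cagr)
print(f"doubling time: {doubling_years:.0f} years")
```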
Globally, society only deals with 20% of e-waste appropriately and there is little data on what happens to the rest, which for the most part ends up in landfill, or is disposed of by informal workers in poor conditions.
Yet e-waste is worth at least $62.5 billion annually, which is more than the gross domestic product (GDP) of most countries. In fact, if e-waste were a single nation, its GDP would be on a par with Kenya's; 123 countries have a smaller GDP than the value of the global pile of electronic waste.
In the right hands, however, it could be worth considerably more.
Changes in technology such as cloud computing and the internet of things (IoT) could hold the potential to "dematerialize" the electronics industry. The rise of service business models and better product tracking and take-back could lead to global circular value chains. Material efficiency, recycling infrastructure and scaling up the volume and quality of recycled materials to meet the needs of electronics supply chains will all be essential. If the sector is supported with the right policy mix and managed in the right way, it could lead to the creation of millions of decent jobs worldwide.
A new vision for the production and consumption of electronic and electrical goods is needed. It is easy for e-waste to be framed as a post-consumer problem, but the issue encompasses the lifecycle of the devices everyone uses. Designers, manufacturers, investors, traders, miners, raw material producers, consumers, policy-makers and others have a crucial role to play in reducing waste, retaining value within the system, extending the economic and physical life of an item, as well as its ability to be repaired, recycled and reused. The possibilities are endless.
This is an inflection point in history and represents an unparalleled opportunity for global businesses, policymakers and workers worldwide. Those who can rethink the value chain for electronic goods and prioritize dematerialization and closed loop systems (which means reducing reliance on primary resources), could have an incredible advantage. Innovative products and services do not have to mean more e-waste; they can mean a lot less.
The prevailing "take, make and dispose" model has consequences for society, a negative impact on health and contributes to climate change. It is time for a system update. We need a system that functions properly, one in which the circular economy replaces the linear.
In the short term, electronic waste remains a largely unused, yet growing, valuable resource. Nearly all of it could be recycled. Urban mining, where resources are extracted from complex waste streams, can now be more economically viable than extracting metal ores from the ground, and it is generally less energy-intensive. E-waste can be toxic, is not biodegradable and accumulates in the environment, in the soil, air, water and living things. It can also have an adverse impact on health. Children and women are particularly vulnerable to the health risks of e-waste exposure.
It is time to reconsider e-waste, re-evaluate the electronics industry and reboot the system for the benefit of industry, consumers, workers, human health and the environment. The incredible opportunities here are also aligned with the globe's "just transition" to environmental sustainability and with shaping a future that works for all in the circular economy.
The report in full is available at http://www3.weforum.org/docs/WEF_A_New_Circular_Vision_for_Electronics.pdf
2019 World Economic Forum Annual Meeting
Tue, Jan 22 - Fri, Jan 25, 2019
Public Release: 24-Jan-2019
Society for Research in Child Development
The social disruption that results from natural disasters often interrupts children's schooling. However, we know little about how children's learning is affected in the years after a disaster. A new study looked at changes in children's academic performance after major bushfires in Australia. The study concluded that children in regions affected significantly by bushfires demonstrated poorer academic outcomes in some subjects than children in regions that were less severely affected by the fires.
The findings come from researchers at the University of Melbourne, Smouldering Stump (a charity to support children affected by natural disasters), Swinburne University of Technology, and the University of New South Wales. They are published in Child Development, a journal of the Society for Research in Child Development.
"Our study is the first in Australia to track the academic performance of elementary-school children affected by a natural disaster over a four-year period," explains Lisa Gibbs, director of the Jack Brockhoff Child Health and Wellbeing Program at the University of Melbourne, who led the study. "The findings highlight the extended nature of academic impact and identify important opportunities for intervention in the education system so children can achieve their full potential."
The study looked at 24,642 children who attended primary schools in Victoria, Australia, that were affected by the Black Saturday bushfires in Australia in February 2009. Researchers compared students in schools that had high or medium impact from the fire (based on the effects of the fire on lives and property) to peers in schools with low or no impact. They examined students' academic scores on tests of reading, writing, spelling, numeracy, and grammar two and four years after the fires (when students were in grades 3 and 5). The tests were standardized assessments given as part of the National Assessment Program to evaluate literacy and numeracy skills across time through the school curriculum.
The researchers also took into account the different family circumstances of the children, such as level of parents' education, language, cultural and health factors, and whether they came from single- or two-parent families, as well as the potential influence of schools.
The study found that students' expected gains from 3rd to 5th grade in reading and numeracy were reduced in schools that had higher levels of impact from the fire. There were no significant impacts of exposure on trends in academic scores for the writing, spelling, and grammar parts of the academic assessment, and no gender differences in any of the scores.
The authors note that cognitive skills related to types of learning may be affected by early experiences of trauma. After a disaster, ongoing disruptions in the home, school, and community may also affect learning opportunities. The study didn't include students who moved to different schools between grades 3 and 5, but given that relocation was more common among families affected by loss of property, these children are also likely to be at risk of impacts on learning.
"Our study extends the evidence base by examining the period up to four years after a disaster and identifying a subject-specific decline in academic achievement associated with the level of impact of the fire," explains Jane Nursey, senior clinical consultant at Phoenix Australia-Centre for Posttraumatic Mental Health at the University of Melbourne, who coauthored the study.
"Given the apparent delayed impact of the fires, it will be important for future studies on the impacts of disasters on children to extend beyond three years, and to consider academic and cognitive impacts alongside factors related to health and social and emotional well-being," Nursey continues. "In this way, we can be more confident of capturing the longer-term impact of disasters on children's academic performance, impacts that might not be apparent the first few years after an event, and we can ensure that interventions target appropriate areas to help children succeed at school and in life."
Such interventions might include extended social and emotional support for students, as well as additional academic support to address the developmental factors that likely influence academic achievement, especially those that relate to reading and numeracy, the authors note.
The study was funded by the Teachers Health Foundation, the National Health and Medical Research Council, and the Jack Brockhoff Foundation. Data were provided by the Department of Education and Training Victoria.
Summarized from Child Development, Delayed Disaster Impacts on Academic Performance of Primary School Children by Gibbs, L (University of Melbourne), Nursey, J (University of Melbourne), Cook, J (Smouldering Stump), Ireton, G (University of Melbourne), Alkemade, N (University of Melbourne), Roberts, M (Department of Education and Training Victoria), Gallagher, HC (Swinburne University of Technology, University of Melbourne), Bryant, R (University of New South Wales), Block K (University of Melbourne), Molyneaux, R (University of Melbourne), and Forbes, D (University of Melbourne). Copyright 2019 The Society for Research in Child Development, Inc. All rights reserved.
Public Release: 25-Jan-2019
Rapidly receding glaciers on Baffin Island reveal long-covered Arctic landscapes
University of Colorado at Boulder
Glacial retreat in the Canadian Arctic has uncovered landscapes that haven't been ice-free in more than 40,000 years and the region may be experiencing its warmest century in 115,000 years, new University of Colorado Boulder research finds.
The study, published today in the journal Nature Communications, uses radiocarbon dating to determine the ages of plants collected at the edges of 30 ice caps on Baffin Island, west of Greenland. The island has experienced significant summertime warming in recent decades.
"The Arctic is currently warming two to three times faster than the rest of the globe, so naturally, glaciers and ice caps are going to react faster," said Simon Pendleton, lead author and a doctoral researcher in CU Boulder's Institute of Arctic and Alpine Research (INSTAAR).
Baffin is the world's fifth largest island, dominated by deeply incised fjords separated by high-elevation, low-relief plateaus. The thin, cold plateau ice acts as a kind of natural cold storage, preserving ancient moss and lichens in their original growth position for millennia.
"We travel to the retreating ice margins, sample newly exposed plants preserved on these ancient landscapes and carbon date the plants to get a sense of when the ice last advanced over that location," Pendleton said. "Because dead plants are efficiently removed from the landscape, the radiocarbon age of rooted plants defines the last time summers were as warm, on average, as those of the past century."
In August, the researchers collected 48 plant samples from 30 different Baffin ice caps, encompassing a range of elevations and exposures. They also sampled quartz from each site in order to further establish the age and ice cover history of the landscape.
Once the samples were processed and radiocarbon dated back in labs at the Institute of Arctic and Alpine Research (INSTAAR) at CU Boulder and the University of California Irvine, the researchers found that these ancient plants at all 30 ice caps have likely been continuously covered by ice for at least the past 40,000 years.
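The "at least 40,000 years" figure reflects a practical limit of radiocarbon dating: after that long, almost no carbon-14 remains to measure. A minimal sketch of the underlying decay arithmetic, using the conventional Libby mean life; the study's actual age determinations and calibration are considerably more involved than this:

```python
import math

# Conventional radiocarbon ages use the Libby mean life of 8,033 years:
# age = -8033 * ln(F), where F is the fraction of modern 14C remaining.
LIBBY_MEAN_LIFE = 8033.0  # years

def radiocarbon_age(fraction_modern):
    """Conventional radiocarbon age from the measured fraction of modern 14C."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

def fraction_remaining(age_years):
    """Fraction of the original 14C left after a given age."""
    return math.exp(-age_years / LIBBY_MEAN_LIFE)

# After 40,000 years, well under 1% of the original 14C remains, which is
# close to the practical detection limit of the method; older samples can
# only be reported as minimum ages.
f_40k = fraction_remaining(40_000)
print(f"14C remaining after 40,000 yr: {f_40k:.2%}")
```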
"Unlike biology, which has spent the past three billion years developing schemes to avoid being impacted by climate change, glaciers have no strategy for survival," said Gifford Miller, senior author of the research and a professor of geological sciences at CU Boulder. "They're well behaved, responding directly to summer temperature. If summers warm, they immediately recede; if summers cool, they advance. This makes them one of the most reliable proxies for changes in summer temperature."
When compared against temperature data reconstructed from Baffin and Greenland ice cores, the findings suggest that modern temperatures represent the warmest century for the region in 115,000 years and that Baffin could be completely ice-free within the next few centuries.
"You'd normally expect to see different plant ages in different topographical conditions," Pendleton said. "A high elevation location might hold onto its ice longer, for example. But the magnitude of warming is so high that everything is melting everywhere now."
"We haven't seen anything as pronounced as this before," Pendleton said.
Additional co-authors of the study include Scott Lehman, Sarah Crump and Robert Anderson of CU Boulder; Nathaniel Lifton of Purdue University; and John Southon of the University of California Irvine. The National Science Foundation provided funding for the research.
Maxine Joselow, E&E News reporter · Climatewire: Thursday, January 3, 2019
Mekkonen Kassa drives an electric taxi in the District of Columbia. He plans to switch to a gas-powered car because there aren't enough charging stations.
In the long row of bright-red taxis queued up at Washington's Union Station on a recent morning, a cab sporting a bright-green "e" stood out.
That's "e" for electric.
District of Columbia officials and environmentalists have celebrated the city's electric cab fleet as a weapon against climate change. But a largely immigrant corps of taxi drivers who are struggling to compete against ride-hailing services like Uber and Lyft say the program has stuck them with a hefty tab for environmental progress.
"We are suffering because of this car," said Tsegaye Mamo, a cab driver from Ethiopia. "This car is not good for business."
The Department of For-Hire Vehicles, which encouraged cabbies to join the electric taxi effort in 2016, failed to prepare them for the rough road ahead — notably, an acute shortage of charging infrastructure, E&E News found in interviews with 11 electric-cab drivers.
The drivers — most of them immigrants — say they didn't understand the difficulties and expense of owning and operating an electric cab. While the Department of For-Hire Vehicles dangled $10,000 grants to help cover the cost of new electric vehicles and lure drivers, many cabbies said they either didn't receive the grant or are still paying off loans for the car.
"I don't know why they forced us to buy this car," said Mesfia Tesfaye, who came to the United States from Ethiopia nine years ago. "It is very hard."
David Do, the department's interim director, maintains that the drivers should have done their homework.
"I understand it's difficult. I feel for them," Do said in an interview. "[But] the grant guidelines were clear from the beginning that, 'Hey, if you do accept this program, you have to be in this vehicle for at least three years.'"
Drivers say they didn't realize that charging an electric vehicle sucked up so much time — which in the cab business is money.
While filling a gasoline tank takes just minutes, charging an electric car takes around half an hour. And drivers say they need to charge two or three times a day.
"Now is my second time charging today," said Asmare Aemiro. "I just went in the morning and now again."
An electric taxi owned by Nuru Shafi had to be towed last year when it ran out of juice.
It was 1 p.m.
"With gas, you fill up once, you can work all day," he said. "So which one is better?"
Alan Dowdell, vice president of business development at ChargePoint, the country's largest network of EV charging stations, said he thinks the plug-in stops offer cabbies an ideal lunch break.
"You know, there are natural breaks that a taxi driver has to take to eat lunch and stuff," Dowdell said. "You could replace 100 miles of charge pretty easily during a 30-minute lunch break."
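Dowdell's figure is easy to sanity-check with back-of-the-envelope arithmetic. A minimal sketch, assuming a compact-EV efficiency of roughly 4 miles per kilowatt-hour and a constant charging rate with no taper (both assumptions for illustration, not figures from this story):

```python
# Rough sanity check of the "100 miles of charge in a 30-minute break" claim.
# Assumed (not from the article): ~4 miles/kWh efficiency, typical of a
# compact EV, and a constant charging rate with no taper near full.

MILES_PER_KWH = 4.0  # assumed efficiency


def power_needed_kw(miles, minutes, miles_per_kwh=MILES_PER_KWH):
    """Average charger power (kW) needed to add `miles` of range in `minutes`."""
    energy_kwh = miles / miles_per_kwh      # energy behind those miles
    return energy_kwh / (minutes / 60.0)    # kWh per hour = kW


print(power_needed_kw(100, 30))  # 50.0
```

Under those assumptions, adding 100 miles of range in half an hour requires about a 50-kilowatt charger, which is roughly the standard DC fast-charging rate in the United States today — so the claim is plausible only at a fast charger, not at an ordinary Level 2 plug.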
But on a recent winter afternoon, just one driver was eating lunch while his battery charged. The rest killed time by looking at their phones or napping as earning opportunities passed them by.
Most District of Columbia cabbies said they earn $80 to $120 a day. The average D.C. taxi driver makes $38,872, according to Salary.com. By contrast, the director of the Department of For-Hire Vehicles makes $177,295 a year, city records show.
Many drivers said they feel pressure to maximize their earnings by working longer hours. Sometimes, they feel compelled to pick up passengers even when their battery is low.
That's a risky move. Five drivers said they've had to call a tow truck after their battery died on a trip.
"The last time I went to Dulles Airport, I had to get towed," said Nuru Shafi, referring to the northern Virginia airport that's about 45 minutes away from downtown Washington.
"The batteries ran out. I could not find a charging station. Finally, I called the tow truck. It was so cold. I was crying," Shafi said. "That was a nightmare for me."
Few charging options
There's just one charging station in the District of Columbia that's available to cabbies at a reasonable price.
Most Americans who own electric vehicles charge them at home, but that's an expensive proposition. Installation of a home charging station can cost $500 to $700, while labor and parts can cost $1,200 to $2,000, according to HomeAdvisor, a digital site for home repair services.
D.C. has 103 charging stations, but most are on hotel property, in private businesses or in parking garages. That means they're reserved for patrons or off-limits to cab drivers unless they pay a parking fee.
'We hear what those drivers are saying'
Companies that supply electric vehicle equipment are looking to invest in more urban charging stations.
In the District of Columbia, Potomac Electric Power Co. (Pepco) recently filed a $15 million request with the D.C. Public Service Commission to expand charging infrastructure in the region.
The request includes a proposal to install two chargers for taxis and ride-hailing services.
"We recognize that Washington, D.C., has the opportunity to be the leader in electric taxicab service in the country," said Rob Stewart, manager of smart grid and technology for the Exelon Corp. subsidiary.
"We hear what those drivers are saying, and we want to help them out."
Pepco hopes the commission will approve its request in time for installation this spring or summer, Stewart added.
Noah Garcia, a transportation policy analyst at the Natural Resources Defense Council, said cities around the country must commit to building out more charging infrastructure, with an eye toward equity and accessibility.
"If we're going to promote and sustain an EV market that's broader and more diverse," he said, "we're going to have to think more comprehensively about the needs of people who don't have access to charging in their homes."
Compounding the problem, EVgo recently closed down a charging station outside a Giant supermarket where cabbies used to congregate.
Jonathan Levy, EVgo's vice president of strategic initiatives, declined to comment on the record about why the charging station was closed. But Do said he heard EVgo received too many complaints about overuse by cabbies.
Union Station is now the only remaining option. The Department of For-Hire Vehicles subsidizes the two fast chargers in the garage there, and all 11 cabbies interviewed for this story said they primarily charge there.
"It's only here now," said Aemiro, gesturing at the Union Station garage. "Before, when we bought these cars, we could charge at two or three places. But now everywhere is closed."
Another cabbie who requested anonymity said he wished he had known about the lack of charging options before going green.
"There are not enough. That's a big issue for taxis," he said. "So I don't think D.C. is really ready for electric taxis, because the infrastructure is not there."
Grants fall short
When the Department of For-Hire Vehicles rolled out the electric taxi program in 2016, it solicited applications for $5,000 and $10,000 grants to help cover the cost of a new vehicle, which has a sticker price of about $30,000.
The department received 56 applications and doled out 45 grants, spokesman Neville Waters said.
But the total number of electric-taxi drivers in the District is 127, he said. That means 82 drivers are operating without grants, either because their applications were denied or because they didn't apply in time.
"They promised, but they did not give," said Mekkonen Kassa, a driver from Eritrea. "They lied. They told us we'd get grants of $10,000. They did not give them to us."
Lemmesa Hunde, another driver from Ethiopia, said it's unclear why his application for a grant was denied.
"They told me the government was going to give us $10,000," Hunde said. "They denied me. I don't know why. It is a problem how some people got it and some people didn't."
Another cabbie who asked to remain anonymous said he received a $5,000 grant, but he nonetheless struggles to support his wife and two children. His wife works in a supermarket.
"It's not enough. Believe me, it's not enough," he said. "This is a $30,000 car. If I knew this would happen, I would never put $30,000 in this car."
Do, the department's interim director, said: "It's a competitive process. With any grant system, not everybody is awarded a grant. But it's more than that, you know? There's limited funds."
He added: "If I could give everybody money to get an electric vehicle, that's something that I want to do. But we're a government agency. We don't have that much money overall. So, you know, we have to do a competitive process, and we have to seek out the best proposals."
Leadership in limbo
David Do, interim director of the D.C. Department of For-Hire Vehicles. Maxine Joselow/E&E News
Do took over as For-Hire Vehicles' interim director in November, taking the helm from Ernest Chrappah, who led the department for more than two years.
Chrappah earned an award for his work on the electric taxi program as well as a promotion from Mayor Muriel Bowser (D). And at a September 2017 conference in Austin, Texas, the International Association of Transportation Regulators gave him its top award.
In a news release, the association credited Chrappah with "creating the first-ever electric taxi program in DC ... that increased awareness of climate change, generated fuel cost savings, and reduced [carbon dioxide] emissions."
At an October 2017 meeting of the D.C. For-Hire Vehicle Advisory Council, Chrappah proclaimed the electric taxi program a runaway success.
"No other jurisdiction within striking distance has made this type of commitment to the for-hire vehicle industry," he said, according to a transcript.
But Shafi accused Chrappah of misleading the drivers.
"It's because we are foreigners. That's the reason Mr. Chrappah, he doesn't respond," Shafi said. "He treats us like foreigners, like second-rate citizens. But I'm an American citizen right now."
Chrappah was recently promoted to interim director of the D.C. Department of Consumer and Regulatory Affairs. The department declined to make him available for an interview.
"Director Chrappah has been out of the office and will not be able to respond by your deadline," spokesman Timothy Wilson said in an email.
Asked how he would respond to allegations that the Department of For-Hire Vehicles misled drivers, Do said he thinks the cabbies should have done their research on the different types of EVs.
The vast majority of drivers opted to buy the Nissan Leaf, which costs around $30,000 and has a range of roughly 100 miles. But, he said, they could have chosen the Chevrolet Bolt, which costs about $36,000 but has a range of roughly 200 miles.
Newer Tesla Inc. models boast ranges of more than 300 miles, although their prices stretch from $49,000 for the Model 3 to $94,000 for the Model S.
"You know, there's a myriad of options," said Do, who drives a Tesla. "And sometimes, one vehicle might be cheaper than the other. But there are costs and benefits to that."
He added, "These rules and guidelines came before my time, and it's my job to adhere to what was laid out three years ago."
Group pleads for help
A group calling itself the D.C. Electric Taxi Cab Operators Coalition recently sent a letter to Bowser and D.C.'s representative in Congress, Del. Eleanor Holmes Norton (D), accusing the District of "unfulfilled promises."
The group wants permission to transfer from electric vehicles to gasoline-powered vehicles, which is prohibited under current rules.
Kassa is one of the cabbies who have had enough.
"I just decided to quit, to sell my car, because this is not the right car to own," he said. "An electric car for a taxi is not right."
A man who identified himself as the coalition chairman declined to be interviewed for this story.
"I don't want to speak about this headache," he said before joining the long red ribbon of cabs outside Union Station, indistinguishable from the other cars save for a small green emblem.
David Ferris, E&E News reporter · Energywire: Thursday, January 3, 2019
Volkswagen AG's Electrify America is upending the electric vehicle charging business under a U.S. settlement that requires VW to spend $2 billion over a decade on EV infrastructure. Claudine Hellmuth/E&E News (illustration); Electrify America (chargers); Brands of the World (Volkswagen logo)
Three years ago, Volkswagen AG was in the depths of scandal because it lied about the emissions of its diesel cars. As 2019 begins, it has emerged as the most disruptive player in the business of selling a different kind of fuel: the electrons that move electric cars.
This is a direct result of its diesel cheating. Ordered by U.S. and California officials to build a $2 billion profit-seeking business in electric vehicle fueling stations, it has done so with aplomb. The subsidiary it created, Electrify America LLC, is now the richest and most talked-about company in the making of roadside chargers. Its influence will be profound if millions of Americans switch from gasoline to electric drivetrains.
Electrify America and its money are like a gravitational force. Almost overnight, the Reston, Va.-based VW unit has turned its suppliers into market leaders. It has upended the business plans of longer-established charging companies, and set high and expensive new standards for crucial parts of the EV ecosystem.
The VW subsidiary says it is making charging faster and more accessible, just like regulators asked. But its rivals see something else: a ruthless competitor that is magnanimous in public but can be underhanded in private.
Those firms, with far smaller budgets, say that Electrify America's flood of money is warping the market in ways that might slow down the adoption of EVs and make the fuel more expensive. They say they don't have the resources to compete with a corporate giant that is compelled to spend tens of millions of dollars per quarter. They say Electrify America is using its heft to steal business out from under them, and wonder how much of its charging network is for the common good and how much is tailor-made for VW's future stable of electric cars.
"With their big wallets and big promises, they have actually gone to our customers and pulled our hardware out of the ground and replaced it with theirs," said Michael Farkas, the CEO of Blink Charging Co., a competitor, making an allegation that Electrify America denies. "Doing that is not the most honorable thing, and it doesn't help the marketplace."
At the LA Auto Show in November, leaders of the auto industry leaned forward in their chairs to hear a presentation by Electrify America's CEO, Giovanni Palazzo. He didn't mention the pollution scandal that led to his presence there. He scarcely uttered the word Volkswagen.
"Let me tell you that I hope you got, and the message came through today, what Electrify America is willing to do," he said. "We see we are contributing to a better future — greener, more sustainable, more conscious."
Electrify America, with its parentage and large purse, has become EV-charging royalty with powerful allies. It has the blessing of the Sierra Club and many utilities, which welcome the investment in charging networks that could unseat gasoline, increase demand for electricity and reduce the threat of climate change.
It even gets kind words from its most rigorous regulator, the California Air Resources Board (CARB), which uncovered Volkswagen's diesel cheating in the first place.
"I think from the beginning, Electrify America has done in good faith what it said they were going to do, which is spend a lot of money, more money than anyone has ever spent, in all the years we've been trying to get infrastructure developed, to build a network of public charging stations that will enable an electricity market," said Mary Nichols, the chairwoman of CARB, in an interview at the LA Auto Show. "They deserve our thanks and our support for what they're doing, even if it took catching them in a violation to get this commitment done."
In private, Electrify America's competitors speculate about political undercurrents driving California regulators' light handling of Electrify America lately.
The VW consent decree, which President Obama's Justice Department celebrated as a major achievement, is now being enforced by the Trump administration. In theory, some suggest, a White House that champions fossil fuels and takes a dim view of electric cars could agree to renegotiate the terms that dictate VW's $2 billion of spending. And that could wipe out the windfall of dollars to build out EV infrastructure.
The legal settlement that created Electrify America also resolved most of VW's troubles in the United States. But abroad, the diesel scandal and the corporate behavior that enabled it continue to burn.
Volkswagen is fighting lawsuits around the globe, and the company and its current and former executives face multiple civil and criminal investigations in Europe and in its home country of Germany. Volkswagen's management boards, which oversee its executives, still deny they knew about the sophisticated, worldwide effort to cheat on diesel emissions, and most of those board members remain in place. It is unclear whether Volkswagen's top-down, win-at-all-costs culture has really changed.
Since the scandal, a related and equally astonishing turnaround has happened at Volkswagen: The company has abandoned diesel and gasoline and bet the farm on electric cars.
At the LA Auto Show, Volkswagen Group of America CEO Scott Keogh announced the company would spend $38 billion worldwide on electric drivetrains in the next four years, far more than any other global automaker. By 2025, it plans to have 50 fully electric models. By 2026, it expects to launch its last gas-powered car.
A huge hybrid
Mary Nichols, chairwoman of the California Air Resources Board, answers questions in June 2016 about a nearly $16 billion consent decree that settled Volkswagen's consumer lawsuits and government allegations after the auto giant's diesel emissions scandal. Eric Risberg/Associated Press
The experts can't recall anything quite like Electrify America, a multibillion-dollar combination of regulatory zeal and corporate ambition.
The seed was planted by VW's systematic deceit. For seven years, starting in 2009, VW marketed its "clean diesel" vehicles as a zippy alternative to hybrid-electric cars like the Toyota Prius. It achieved that performance by embedding a "defeat device" that helped the cars pass emissions tests in the lab. On the road, however, they would produce up to 40 times the allowed limit of nitrogen oxides, which contribute to smog and harm human health. The fraud was uncovered by an investigation by CARB, the California air regulator, in 2015.
The legal dominoes fell through a federal courtroom in San Francisco. Squads of lawyers representing car owners, auto dealers, the Justice Department, EPA and CARB sought to recover damages from the automaker. A special master — a certain former FBI chief named Robert Mueller — oversaw the negotiations. And in late 2016, the parties signed a consent decree that assigned VW almost $16 billion in penalties, most of which was to buy back the polluting cars or compensate for the pollution they'd spewed.
One part of the decree was unusual.
Called Appendix C, it apportioned $2 billion to "Investments in ZEV Market Support." Volkswagen was required to spend that amount over a decade to facilitate zero-emission vehicles. Of that, $800 million would be spent in California, and $1.2 billion in the rest of the country.
Put simply, the consent decree directed the world's largest automaker to create a business around EV charging — spending $500 million of its own money every 2 ½ years, ending in 2027.
Corporate wrongdoers aren't often asked to inject large amounts of money into a market adjacent to their own, like forcing a car company to invest in fueling.
"It's pretty unusual for the regulator to force entry," said Craig Falls, an antitrust attorney and partner at the firm Dechert LLP. "Often they force you to leave."
California has carried out this kind of settlement before on a smaller scale. In 2012, it penalized NRG Energy Inc. for its role in the state's 2001 energy crisis with a requirement to seed $100 million in an EV charging business. That resulted in the company EVgo, which is now independent after NRG spun it off two years ago. (In a twist, EVgo is one of the most vocal opponents of Electrify America's tactics.)
But the VW settlement — larger by a factor of 20 — turned the money tap to a flood.
As a point of comparison, the $500 million that VW is compelled to spend every 30 months is almost equal to the $532 million that ChargePoint Inc., one of the industry's giants, has raised in its decade of existence.
VW can't spend however it wants. Strings are attached, especially in California, where CARB retains a veto and oversight has been more muscular than at EPA. Charging investments have to be "brand-neutral," usable by every kind of electric car and not specifically designed to help Volkswagen. CARB sought not scattershot spending, but "transformative outcomes." It required that 35 percent go to infrastructure in poor and disadvantaged communities, while another hefty chunk goes toward advertising to raise EV awareness. Lastly, according to CARB's instructions, VW's spending has to "not interfere with or undermine established and emerging businesses in the market place."
CARB attached other conditions designed to make charging work better — provisos that are now creating headaches for Electrify America's competitors.
Today's EV drivers usually pay for electrons through a company membership; CARB requested that Electrify America offer credit card payment, like gas stations do. Charging stations can take hours to charge a vehicle; CARB asked for faster chargers. In California, some underused stations have been abandoned; CARB asked VW to provide a full decade of maintenance, with live operators standing by.
These changes make fueling an EV more pleasant. But they are also expensive to implement and are hard to afford for builders not backed by VW's deep pockets. CARB is making no apologies.
"We know what we get with Electrify America," said Analisa Bevan, an assistant division chief at CARB who oversees the investment. "We get chargers that consumers can use. It is one of many networks within the state, but knowing that we've got that consistent investment over the 10 years, and we know that it's going to be maintained over those 10 years, because that's required in the consent decree, that it will be open access to all users, all of those elements are critical to demonstrating what we want to see in California."
The employees of VW's Electrify America pose by a bank of its chargers and, in a nod to its "brand-neutral" directive, around a Chevrolet Bolt. Electrify America
With German efficiency, VW did what it was told.
In February 2017, five months after a judge signed a partial consent decree directing $2 billion in EV spending, Volkswagen announced its new subsidiary, Electrify America. It would be led by Mark McNabb, a former chief operating officer of Volkswagen Group of America and veteran executive of other automakers, including Nissan, where he had helped roll out the electric Leaf.
Electrify America, he told Reuters, would be "a very fast and furious project." (Last April, McNabb was abruptly replaced by Palazzo, another VW executive.)
Giovanni Palazzo. Volkswagen Group of America Inc.
Electrify America rented an office in Reston, in the orbit of Washington, D.C., and hired an experienced team at top salaries. "I have BMW. We have Toyota. We have people from different EV infrastructure companies. EVGo, Greenlots. We have government officials. We have people who came from the Department of Energy," McNabb told CARB at a hearing in July 2017.
To negotiate with the utilities that provide the electricity — almost 200 of them — it created a three-person crew of experts. To lobby in Washington, it hired Matthew Nelson, who was chief of staff of the U.S. Department of Energy's Office of Energy Efficiency and Renewable Energy under President Obama, and Patricia Readinger, a former legislative aide at the U.S. Department of Transportation. It also brought on Sophie Shulman, a former deputy chief of staff at Obama's Domestic Policy Council, as its manager of partnerships.
It would be, in the words of Palazzo, "one of the largest, most technologically advanced and customer-friendly charging networks in the U.S." And with its money, corporate backing and its tight deadlines, it rose to the status of major industry player overnight.
The company inked deals at a scale that was the envy of competitors who had been in the business for years.
It landed a contract to build at 100 Walmarts. It signed on with major real estate investment trusts such as AvalonBay Communities Inc., which is installing 80 stations at apartment buildings it owns. Overall, more than 80 percent of Electrify America's stations are coming through partnerships with big companies, said Brendan Jones, chief operating officer of Electrify America.
Electrify America's money was a magnet for metropolitan areas that wanted to win its investments. "States and metros have been clamoring, come here, come here, come here," said Max Baumhefner, an attorney at the Natural Resources Defense Council. In the end, Electrify America named 17 metros as its focus points across the country, from Los Angeles to D.C. to Boston to Houston.
For struggling makers of EV-fueling equipment, Electrify America became a windfall of dazzling proportions.
It hired four suppliers to build its charging stations, immediately becoming one of last year's biggest buyers of hardware. One was BTCPower, a small Southern California firm.
"We're going from a job shop to a scaled manufacturer," said Terry O'Day, the chief strategy officer at Innogy SE, a German firm that bought BTCPower in July because of its new influx of business. "It is the large Electrify America orders that have propelled that." The jolt of business from Electrify America enabled BTCPower to build a factory in the Philippines to supply the U.S. market.
Another is SemaConnect, a Maryland installer that was a minor player until Electrify America tapped it as the master contractor for about 1,500 stations across the country.
"This award has required us to beef up internally, to staff up. Every week it seems we're bringing on new employees to support the different phases of the Electrify America award," said Josh Cohen, the company's director of policy and utility programs.
Electrify America built all of this in about 18 months.
"The amount they were able to do in just one year was herculean," said Quincy Holloway, a former Electrify America project development manager who left the company in mid-2018. "A lot of companies who weren't operating with that capital thought they were operating at a disadvantage. But none of those companies had the Air Resources Board breathing down their necks."
The hardest part has been figuring out where to put the fueling stations. "I cannot say it's a mess, because it's not nice," said Palazzo to the crowd at the LA Auto Show in his heavy Italian accent. "So let's say it's a challenge."
It might seem an easy thing to place a plug. But instead, Electrify America and its competitors are often in hand-to-hand combat over a few coveted locations — raising costs and maybe slowing, rather than accelerating, the growth of charging networks.
Driving the controversy is Electrify America's urgent deadlines. Essentially, it has promised regulators to build not one but two major EV networks by June.
The company, like its rivals, can only make money if it focuses on where the EVs actually are, or soon will be. That means early-adopter cities and major interstates.
For the first of those, in its 17 big metros, Electrify America is making a web of lower-powered chargers, called Level 2. Electrify America has budgeted $85 million to build 650 of them. Unlike gas stations, which have their own real estate, chargers usually occupy a few parking slots at a parking lot. In addition to a willing parking lot owner, Electrify America needs good lighting and a convenient location, and a local grid strong enough to accommodate the electricity demand. And getting permits takes time.
The obstacles are also daunting on interstate highways, where Electrify America is building a $265 million network of fast-charging stations to rival Tesla's.
Interstates may seem endless, but parking lots with the necessary electrical connections are few. Electrify America is trying to build 300 fast-charging stations in 39 states at intervals of about 70 miles by summer — far faster than anyone has tried to build before.
As a result, when Electrify America picks a site, its competitors are often already there.
At one of its newest locations at an outlet mall in the Northern California city of Livermore, Electrify America's fast-charging plaza joins three fast-charging plazas, one belonging to Tesla and two others belonging to EVgo.
"If you put one where there's already chargers, you're not expanding the usefulness," said Jonathan Levy, the vice president of strategic initiatives at EVgo. "Instead, you're cannibalizing the business of someone else and not making a service available for new drivers."
As Electrify America prepared to unleash its second, $200 million cycle of funding in California, its leading competitors — EVgo and ChargePoint — accused it of a "land grab" (Energywire, Nov. 5, 2018).
EVgo, which is the leading builder of EV fast-fueling stations, told CARB that Electrify America's gold-plated contract for site hosts — free installation, maintenance for 10 years and generous rent — has raised costs for all installers. Because of Electrify America's largesse, it said, parking lot owners now expect rent payments.
"One key area has been a marked shift in how, for example, grocery stores have switched from seeing the value of charging as an amenity to increase foot traffic into their stores to now expecting hundreds of dollars of rent per charger for [a] month," the company's director of market development, Sara Rafalson, said in a letter to CARB.
EVgo also accused Electrify America of paying site hosts to break its contracts under a highway charger rollout funded by a sister agency of CARB, the California Energy Commission.
Jones, the chief operating officer of Electrify America, disputes that claim. "There's no example where we've ripped and replaced someone else's charger. We don't do it as policy," he said, allowing the possibility that contractors had made such offers. "Once it gets to my desk, and I find out about it, it's disengagement."
For all the controversy that Electrify America has created, it's only getting started.
For its second phase in California, it wants to install 3,400 new charging stations and spend the bulk of its funds — up to $115 million — on fast charging in cities, where the action is these days. It will spend up to $30 million on new highway routes. It also plans incursions into fresh corners of the EV-fueling world, including homes (up to $12 million), buses and shuttles ($6 million), rural areas ($2 million), automated vehicles ($4 million), and renewable generation alongside its charging stations ($5 million).
CARB, the California regulator, approved Electrify America's plan.
It's all hard to swallow for Electrify America's competitors, who continue to claim their well-heeled rival is swiping customers from under their noses.
Marc Aprea, a contractor arguing ChargePoint's case before CARB a few weeks ago, put it this way. "When you are the world's largest auto manufacturer," he said wearily, "your world looks very different than it does at a startup company."
The secret advantage
Electrify America chargers at San Francisco Premium Outlets. Electrify America
While Electrify America is mandated by regulators to serve all vehicles, some competitors have grown uneasy with the ways in which Electrify America could jigger its network to give an unfair advantage to its parent, Volkswagen.
One involves the sort of chargers that Electrify America is building.
Today, most fast chargers in the United States deliver 50 kilowatts of electrical power. One of the first announcements that Electrify America made was of far more powerful stations, between 150 and 350 kW.
No EV on the road today can accept that much power. Even the Supercharger network of Tesla Inc., which makes the fastest-charging EVs, maxes out at 135 kW. Electrify America says it is "future-proofing" its stations, and it has a point: an EV can fill its battery almost seven times faster at a 350-kW charger than at a 50-kW one. Once cars are built to handle this much juice, an electric car might fill up as quickly as Americans expect at a gas station.
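The seven-fold figure is straightforward arithmetic: the time to transfer a fixed amount of energy scales inversely with charging power. A quick sketch (the 75 kWh pack size is an illustrative assumption, and real sessions taper as the battery fills, so these are best-case numbers):

```python
# Rough charging-time comparison implied by the article's power figures.
battery_kwh = 75.0                  # illustrative pack size (assumption)
for power_kw in (50, 150, 350):
    hours = battery_kwh / power_kw  # ignores taper, losses, battery limits
    print(f"{power_kw} kW: {hours * 60:.0f} minutes")
# The ratio between the extremes is 350 / 50 = 7, the article's "seven times faster."
```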
This upgrade comes at great expense to Electrify America. With no such high-powered chargers on the market, it commissioned its suppliers to engineer them, including new liquid-cooled cables to manage their enormous heat.
Why is super-fast charging so important to Electrify America?
"If anyone tried to create a financial model for DC [direct current] fast charging, it doesn't make sense today from a financial perspective," said Farkas, the CEO of rival network Blink. "It doesn't make sense for anyone except someone who's selling a car."
Of the EVs that established automakers are developing to accept these super-fast power levels, about half are being built by VW and its brands Audi and Porsche.
One of the words that Electrify America often uses to describe itself is "premium." A critical part of its business model, one to distinguish it from rivals, is that it will partner with automakers and offer drivers special access to its super-fast network. The first such partnership, announced in September, was with Audi. Drivers of the new Audi e-tron, the first of Volkswagen's new wave of EVs, will get 1,000 free kilowatt-hours of charging on Electrify America's network.
Electrify America says that negotiations with Audi — a sister division of the same company — were as hard-nosed as they come and that other partnerships with outside automakers are coming soon.
"The Audi deal was simply the hardest deal ever," said Jones, laughing at the thought of it. "It was very much a negotiation between a vendor and an OEM [original equipment manufacturer] and the pretenses of having a connection to the company did not give us any favor in any way, shape or form."
Jones says that Electrify America holds itself at arm's length from the parent that created and funds it. Communication between the two happens mostly through Palazzo, the CEO, he said, though others like Jones occasionally find themselves in the room.
Questions persist about the extent to which Volkswagen is building a network to benefit itself. Doubts also exist about whether VW's statements can be trusted, given that the company's most senior leaders still deny that they knew of the yearslong, sophisticated effort to deceive regulators before the diesel scandal broke.
One worry belongs to General Motors Co., which is developing its own fleet of EVs.
In a letter to CARB, it suggested the bountiful data produced by Electrify America's chargers — which no other automaker has — could give VW an edge in the race toward electric transportation. "We would like to again emphasize the importance of ensuring there is a strict data firewall between Electrify America and VW that protects all automaker and consumer data that is shared with Electrify America," it wrote. (Palazzo has promised CARB it will not share the data.)
Electrify America has catalyzed all of this change and controversy in less than two years, with just the first $500 million required by regulators. It will keep spending through 2027, with another $1.5 billion to go.
If Electrify America can figure out how to make money with thousands of electric chargers in the United States — and the 32 new stations it is building unasked in Canada — executives talk of expanding to Europe. Meanwhile, Volkswagen the automaker is leading a $38 billion global transition to electric cars.
Should it succeed at both, it's possible that a diesel-tainted past will transform Volkswagen into an auto company unlike any the world has ever seen, a major carmaker crossed with a Chevron, a colossus that builds the EVs of the 21st century while also selling the fuel to make them go.
David covers big trends in energy technology and innovation for Energywire. A seasoned reporter who has followed the industry for six years, he often writes about energy storage, solar power, microgrids, smart thermostats, and the growing role of data, as well as the clash between established businesses and up-and-comers. David joined E&E News in 2014. Earlier, he was the editor of an energy news site and a contributor on energy topics to publications including Popular Science, Sierra and The New York Times. He graduated cum laude from the University of California, San Diego, and holds a master's degree in journalism from Northwestern University.
Public Release: 31-Jan-2019
University of Exeter
Microplastics have been found in the guts of every marine mammal examined in a new study of animals washed up on Britain's shores.
Researchers from the University of Exeter and Plymouth Marine Laboratory (PML) examined 50 animals from 10 species of dolphins, seals and whales - and found microplastics (less than 5mm) in them all.
Most of the particles (84%) were synthetic fibres - which can come from sources including clothes, fishing nets and toothbrushes - while the rest were fragments, whose possible sources include food packaging and plastic bottles.
"It's shocking - but not surprising - that every animal had ingested microplastics," said lead author Sarah Nelms, of the University of Exeter and PML.
"The number of particles in each animal was relatively low (average of 5.5 particles per animal), suggesting they eventually pass through the digestive system, or are regurgitated.
"We don't yet know what effects the microplastics, or the chemicals on and in them, might have on marine mammals.
"More research is needed to better understand the potential impacts on animal health."
Though the animals in the study died of a variety of causes, those that died due to infectious diseases had a slightly higher number of particles than those that died of injuries or other causes.
"We can't draw any firm conclusions on the potential biological significance of this observation," said Professor Brendan Godley, of the Centre for Ecology and Conservation on the University of Exeter's Penryn Campus in Cornwall.
"We are at the very early stages of understanding this ubiquitous pollutant.
"We now have a benchmark that future studies can be compared with.
"Marine mammals are ideal sentinels of our impacts on the marine environment, as they are generally long lived and many feed high up in the food chain. Our findings are not good news."
Dr Penelope Lindeque, Head of the Marine Plastics research group at Plymouth Marine Laboratory, said: "It is disconcerting that we have found microplastic in the gut of every single animal we have investigated in this study.
"Indeed, from our work over the years we have found microplastic in nearly all the species of marine animals we have looked at; from tiny zooplankton at the base of the marine food web to fish larvae, turtles and now dolphins, seals and whales.
"We don't yet know the effects of these particles on marine mammals. Their small size means they may easily be expelled, but while microplastics are unlikely to be the main threat to these species, we are still concerned by the impact of the bacteria, viruses and contaminants carried on the plastic.
"This study provides more evidence that we all need to help reduce the amount of plastic waste released to our seas and maintain clean, healthy and productive oceans for future generations."
In total, 26 species of marine mammal are known to inhabit or pass through British waters.
The species in this study were: Atlantic white-sided dolphin, bottlenose dolphin, common dolphin, grey seal, harbour porpoise, harbour seal, pygmy sperm whale, Risso's dolphin, striped dolphin and white-beaked dolphin.
The study, supported by Greenpeace Research Laboratories, used samples provided by the Scottish Marine Animal Stranding Scheme (SMASS), Cornwall Wildlife Trust's Marine Strandings Network and ZSL's (Zoological Society of London) Cetacean Strandings Investigation Programme (CSIP).
The paper, published in the journal Scientific Reports, is entitled: "Microplastics in marine mammals stranded around the British coast: ubiquitous but transitory?"
Public Release: 31-Jan-2019
The Department of Energy has awarded four Virginia Tech researchers a $1.8 million grant to reduce the stress renewable energy sources put on the U.S. power grid.
The Virginia Tech Center for Power Electronics Systems (CPES) and the Power and Energy Center (PEC) will partner with Siemens to tackle this challenge.
With the dramatic increase in levels of carbon dioxide and other greenhouse gases, renewable energy sources, such as concentrating solar power and photovoltaics, have been implemented to reduce these emissions. According to the Solar Energy Industries Association, the electric power industry sector is the highest producer of greenhouse gas emissions in the United States.
More than a third of greenhouse gases released into the atmosphere are a result of the burning of fossil fuels for electricity usage. Concentrating solar power and photovoltaics are emissions-free alternatives that can feed electricity directly into the U.S. power grid.
"In the grand scheme, we need more energy. The energy consumption is increasing. The way to handle that in a smarter way is renewable energy sources, which continue to increase," explained Rolando Burgos, associate professor in the Bradley Department of Electrical and Computer Engineering. "About 12 percent of power generation in the U.S. is wind and solar. Ten years ago this number was 1 percent. So, this has been a very rapid growth and it is projected to continue at a similar pace."
However, increased use of these renewable energy alternatives presents a new challenge to the grid. For the power grid to function properly, the power generated has to match the power consumed, known as the load. Solar and wind output varies rapidly throughout the day, creating fast imbalances between generation and load. A load that exceeds generation drags down the operating voltage and frequency, while a load that falls short pushes them up. Disturbing the grid in such a way can cause the whole grid to collapse, producing blackouts.
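The generation-to-load coupling described above can be illustrated with the textbook swing equation, in which frequency drifts in proportion to the power imbalance. This is a simplified standard model, not anything from the Virginia Tech project, and the inertia constant and imbalance are assumed values:

```python
# Toy swing-equation model: df/dt = f0 * (P_gen - P_load) / (2 * H * S)
f0 = 60.0                  # nominal grid frequency, Hz
H = 5.0                    # system inertia constant, seconds (assumed)
S = 1.0                    # system base power, per unit
p_gen, p_load = 1.0, 1.05  # load exceeds generation by 5% (assumed)
dt, f = 0.1, f0
for _ in range(10):        # simulate one second of sustained imbalance
    f += dt * f0 * (p_gen - p_load) / (2 * H * S)
print(f"{f:.2f} Hz")       # frequency sags below the 60 Hz nominal
```

Fast converters of the kind the grant targets counter exactly this drift by injecting or absorbing power faster than peaking generators can respond.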
Burgos, along with three other co-investigators, Dushan Boroyevich, Chen-Ching Liu, and Virgilio A. Centeno, all electrical and computer engineering professors, will work together for the next three years to develop a modular power converter based on highly efficient wide-bandgap semiconductors to lessen the stress renewable energy sources put on the power grid.
Historically, industries have relied on small generation units, called peaking generators, to keep the generation-to-load consumption in power grids balanced. Due to the increase in renewable energy sources, these units alone are no longer fast enough to respond to the imbalances produced.
Combined heat and power systems are now used to provide services to the grid instead. "The Department of Energy has demonstrated that in small to medium industrial sites with significant electricity and heat generation needs, the use of combined heat and power systems has the potential to increase local energy efficiency to 70-80 percent, while also reducing operational costs," said Burgos. "This is important as combined heat and power systems plants become attractive with a relatively short return of investment period."
According to the Department of Energy, combined heat and power systems are more cost-effective for grid operators and utilities. They improve the resiliency of the U.S. electric grid, reduce the strain on the existing grid infrastructure, and improve overall power quality. While these combined heat and power systems are currently being used in large industrial facilities, the Department of Energy is looking to enable the development of flexible combined heat and power systems for small-to-midsize facilities.
This is where the Virginia Tech research team comes in.
The Center for Power Electronics Systems will develop a modular multilevel converter that will allow for the selling of power, control over the amount of power used, regulation of the voltage and frequency, and a faster response time to grid disturbances. The converter will be more cost-efficient, reliable, and energy efficient, helping to mitigate the stress created by the growth in renewable energy.
Researchers will also add a new grid-stability-monitoring capability to the power converter, allowing the flexible combined heat and power systems to operate safely within the physical limits of the electrical grid.
Additionally, "The Power and Energy Center will work closely with Siemens to integrate the flexible heat and power systems into a microgrid environment, using Siemens' digital control platform to manage the operation of the microgrid that will be able to dispatch the flexible combined heat and power systems' grid-support functions as needed, and fully autonomously," said Burgos.
"This is why we were selected," said Burgos. "Here [at Virginia Tech], we have a strong electronics group and a strong power systems group, so now we can work together, thanks to the College of Engineering that has been instrumental in supporting the collaboration of these two groups over the past several years."
A high volume and lucrative black market business
Public Release: 30-Jan-2019
The University of Hong Kong
Hong Kong's illegal wildlife trade is contributing to a global extinction crisis. Every year millions of live animals, plants and their derivatives are illegally trafficked into and through Hong Kong, by transnational companies and organised crime syndicates.
There is an urgent need for the government to enhance its current enforcement strategy against wildlife smuggling. Over the last decade, the diversity of endangered species imported into Hong Kong has increased by 57%. At the same time, the estimated value of the trade has increased by 1,600%. Since 2013, seizures of illegal ivory, pangolin scales and rhino horn have been made by Hong Kong authorities, potentially equating to the deaths of 3,000 elephants, 96,000 pangolins and 51 rhinoceros.
Hong Kong's illegal wildlife trade is increasing in volume, underestimated in value and contributing to the global extinction crisis.
Some members of the Hong Kong Wildlife Trade Working Group (HKWTWG) have joined forces to publish a study focusing on the type and volume of seizures relating to illegal wildlife trade in Hong Kong over the last five years. The findings, documented in the 200-page report Trading in Extinction: The Dark Side of Hong Kong's Wildlife Trade, illustrate the city's central role in global wildlife trafficking and the extent and nature of the associated criminality. The report clearly identifies how future policy and enforcement could be improved to provide the urgently required long-term sustainability.
Associate Professor Amanda Whitfort of the Faculty of Law, one of the authors of the report said: "Wildlife crime in Hong Kong remains under-policed and under-investigated. Wildlife smuggling is not regarded as organised and serious crime, under Hong Kong law. Failure to include wildlife smuggling as a crime under the Organised and Serious Crime ordinance, Cap 455, hampers authorities' powers to effectively prosecute those behind the networks and syndicates that take advantage of Hong Kong's position as a major trading port."
"Our research indicates Hong Kong has become a hub for organised wildlife smugglers, with consequences for the international reputation of our city as well as international biodiversity," said Lisa Genasci, CEO of ADMCF, adding that "Extinction of elephants, rhino, pangolin and many other species in our lifetime is on the horizon, unless the illegal trade is stopped."
ABOUT THE HONG KONG WILDLIFE TRADE WORKING GROUP (HKWTWG)
Established in 2015, the Hong Kong Wildlife Trade Working Group is a loose coalition of Non-Government Organisations, academics, legal professionals and experts in Hong Kong, with a specific interest in the wildlife trade. The report Trading in Extinction: The Dark Side of Hong Kong Wildlife Trade is a collaborative effort of some of its members including: ADM Capital Foundation (ADMCF), Animals Asia, Bloom Association (HK), The University of Hong Kong (HKU), Civic Exchange, Hong Kong Shark Foundation (HKSF), Kadoorie Farm and Botanic Garden (KFBG), The Society for the Prevention of Cruelty to Animals (SPCA), Teng Hoi Conservation Organisation, University of St. Andrews, WildAid and WWF-Hong Kong.
Public Release: 31-Jan-2019
American Geophysical Union
WASHINGTON--Planes flying over rain or snow can intensify the precipitation by as much as 10-fold, according to a new study.
The rain- and snow-bursts are not caused by emissions from the aircraft but are the peculiar consequence of the aircrafts' wings passing though clouds of supercooled water droplets in cloud layers above a layer of active rain or snow.
Under the right conditions, this effect can boost rain and snow storms over airports, where many planes intersect the cloud layer on approach and descent.
"The interesting thing about this feature is that it is caused by aircraft, but it is not caused by pollution," said Dimitri Moisseev, a researcher at the University of Helsinki and the Finnish Meteorological Institute and the lead author of the new study in AGU's Journal of Geophysical Research: Atmospheres. "Even if there would be absolutely ecological airplanes, which don't have any combustion, no fuel or anything, it would still happen."
Although the bands of enhanced precipitation are artificially created, the physical process jump-started by the passage of planes can occur naturally, which makes them useful laboratories for studying the formation of precipitation, according to Moisseev. Observing them may help meteorologists "nowcast" natural rain and snow conditions 2 to 6 hours into the future, which is essential for airport operations.
"When you, like myself, look at the radar data every day there is always something interesting going on," Moisseev said. "Surprisingly enough, there's always new things that we cannot explain still."
Moisseev discovered curious streamers of heightened precipitation in scans from the campus radar antenna at the University of Helsinki Kumpula. The unnaturally straight patches of intense precipitation appeared against a background of lighter rain or snow and seemed to bend toward the nearby Helsinki-Vantaa airport.
Their shapes looked intriguingly like the inverse of cloud formations known as fallstreaks, hole-punch or canal clouds, phenomena that can occur when aircraft fly through clouds of water droplets that are colder than 0 degrees Celsius (32 degrees Fahrenheit) but haven't frozen.
The new study demonstrates that a similar phenomenon can enhance or elicit rain or snowfall from cloud layers underlying these supercooled cloud layers.
Both tiny water droplets and ice crystals form clouds. Pure water can stay liquid down to -40 degrees Celsius (-40 degrees Fahrenheit) without dust particles or other suitable surfaces present to seed crystallization into ice. So water droplets that condense into clouds can be much colder than the typical freezing point of 0 degrees Celsius (32 degrees Fahrenheit). Such supercooled liquid clouds are common in low- to mid-level cloud layers.
Air pressure changes from passing aircraft can trigger these supercooled water droplets to freeze into ice crystals. Air expands abruptly in the wake of wing and propeller tips, causing a dramatic local drop in pressure and temperature. Inside a cloud of water droplets that is already supercooled between -15 to -20 degrees Celsius (5 to -4 degrees Fahrenheit), the passing aircraft can drop the temperature below -40 degrees Celsius (-40 degrees Fahrenheit) and instigate the formation of ice crystals.
The new ice crystals help freeze more supercooled water droplets, setting off a chain reaction of crystal formation in a widening circle around the path of the aircraft. When the crystals fall, they create holes or streaks of clear air in the cloud, sometimes opening a window of blue sky if the cloud layer is thin. In most cases, the ice crystals evaporate before they reach the ground.
Meteorologists have known that passing aircraft can freeze water droplets into ice crystals and previous work had suggested that the process could enhance rain and snow in underlying clouds, but the effect had not been captured in detail.
Andrew Heymsfield, a senior scientist at the National Center for Atmospheric Research's Mesoscale and Microscale Meteorology Laboratory in Boulder, Colorado, and a researcher unaffiliated with the new study, had noted the potential for inadvertent seeding of supercooled clouds over airports in a previous paper about the formation and spread of aircraft-induced holes in clouds. He observed similar arcs of heightened snowfall in radar data collected near Denver, Colorado's former Stapleton Airport in 1992.
"We know that planes can trigger precipitation. The authors of this study have a lot of cases, with wonderful data from ground-based instruments--radar, lidar--good information about particle size and concentration, and radiosonde data to show the likely temperature for formation," Heymsfield said. "They succeeded in documenting the phenomenon."
To find out if the streamers of heightened precipitation could be caused by aircraft, Moisseev and his colleagues reviewed 11 years of the University of Helsinki's weather radar data and found 17 days with repeat cases of the characteristic linear streamers between December 2008 and January 2018.
They examined flightpaths near the airport to see whether the streamers could be caused by passing aircraft. Flightpaths archived to 2011 confirmed aircraft passed within 2-10 kilometers (1-6 miles) of the intense precipitation streamers in most of the cases observed.
"The intensified precipitation basically follows the track of an airplane above the cloud," Moisseev said. "It could extend over hundreds of kilometers, but the cross-section would be maybe 100 meters. So it's a very narrow, long feature."
The additional ice crystals raise the rate at which crystals collide to form larger snowflakes, intensifying snowfall, according to the authors.
This could happen if an airplane flies directly through a precipitating cloud, but the authors suspect something more complicated is going on, because their data locates the starting height of rain and snow enhancement far above the layer that is already precipitating.
The new study concludes the airplane-generated ice crystals most likely fall from a supercooled upper cloud layer into a lower layer that is actively raining or snowing, begetting more rain or snow from the lower cloud layer.
Satellite data support this scenario, showing a top layer of clouds composed of supercooled droplets or a mix of ice and water, poised at about the right temperature to turn to ice crystals under the influence of aircraft. This upper, supercooled layer floats at the typical approach altitude of planes flying into the Helsinki-Vantaa airport.
Rain and snow artificially enhanced by air traffic offer useful clues about natural precipitation and the factors affecting how efficiently it forms, according to Moisseev. The streamers are accidental experiments that allow the researchers to observe the effect along the path of the aircraft, and just outside it, and to ask what kinds of microphysical processes are taking place.
Public Release: 30-Jan-2019
University of British Columbia
Carbon dioxide emissions from fuel burnt by fishing boats are 30 per cent higher than previously reported, researchers with the Sea Around Us initiative at the University of British Columbia and the Sea Around Us - Indian Ocean at the University of Western Australia have found.
In a study published in Marine Policy, the scientists show that 207-million tonnes of CO2 were released into the atmosphere by marine fishing vessels in 2016 alone. This is almost the same amount of CO2 emitted by 51 coal-fired power plants in the same timeframe.
"The marine fishing industry relies heavily on the use of fossil fuel and its role in global greenhouse gas emissions has been largely ignored from a policy or management perspective," said Krista Greer, lead author of the study and a researcher with the Sea Around Us at UBC. "Until now, the most comprehensive study of carbon dioxide emissions from fishing suggested that in 2011, fisheries released 112-million tonnes of CO2 per year from the combustion of fuel during fishing."
The previous data suggested fisheries contributed only 0.29 per cent of global CO2 emissions, while the new study indicates that their contribution is almost twice that amount. The higher values are largely due to the UBC-UWA researchers considering regional differences in fuel use based on fishing effort, as well as the fuel used to catch 30-million tonnes of fish that went unreported in 2016.
Greer and her colleagues used the Sea Around Us' global catch and fishing effort database to calculate the amount of carbon dioxide emitted by each boat operating in each country's different fishing sectors and the amount of CO2 emitted per tonne of each fish those boats catch, also known as emissions intensity.
"We found that global emissions intensity for 2016, on average, was 1.88 tonnes of carbon dioxide, compared to 1.5 tonnes in 1950. This is despite the fact that marine catches have been declining since the mid-1990s," said Greer. "The emissions intensity of small-scale, artisanal and subsistence fleets has increased the most over the time period in terms of magnitude, but the industrial sector continues to be the greatest contributor to overall emissions."
In their analysis, the researchers also found that the emissions intensity started to grow in the 1980s.
"Small-scale fisheries caught up with the industrial sector because artisanal and subsistence fishers began installing gasoline-powered engines on their boats," said Dirk Zeller, co-author of the study and leader of the Sea Around Us - Indian Ocean at the University of Western Australia. "This means that there's a need to think of emission reduction strategies, such as switching to small diesel-powered engines in small-scale fisheries."
According to Zeller, industrial fisheries also need to do their part by reducing their fishing effort, which is currently three to four times what it should be in order to be sustainable. This would not only allow for a reduction in CO2 emissions by industrial fleets but would foster the recovery of declining fish populations.
Massachusetts is leading the charge in dual-use solar installations, making it possible to grow some crops and pasture animals while generating clean energy.
By Sarah Shemkus
Posted on: January 22, 2019
The solar panels in the fields at the University of Massachusetts Crop Research and Education Center don’t look like what most of us have come to expect. Instead of hunkering close to the earth, they’re mounted seven feet off the ground, with ample room for farmers or cows to wander underneath. Panels are separated by two- and three-foot gaps, instead of clustering tightly together. Light streams through these spaces and, underneath, rows of leafy kale and Brussels sprouts replace the typical bare earth or grass.
This unusual arrangement is one of the first examples of a dual-use solar installation—sometimes called agrivoltaics. It’s a photovoltaic array that’s raised far enough off the ground and spaced in such a way that some crops can still grow around and beneath the panels. The goal is to help farmers diversify their income through renewable energy generation, while keeping land in agricultural use and reducing greenhouse gas emissions.
“This would seem like a great thing—you get to farm and use the same exact space to generate money from solar production,” said Brad Mitchell, deputy executive director of the Massachusetts Farm Bureau Federation. “But it’s still in the early stages.”
The idea of producing solar energy and growing crops on the same land has been around for a while. Isolated demonstration and research installations are in place or planned in Arizona, Japan, and France. In recent years, however, the concept has become more attractive, as the price of photovoltaic panels has plummeted, interest in renewable energy has risen, and financial pressures on small farmers have grown. And because solar arrays often displace agriculture, causing tension between the two land uses, agrivoltaics is being seen as a potential win-win.
Massachusetts is at the forefront of the push. The state’s ambitious renewable energy goals—current targets call for 3,600 megawatts of wind and solar capacity by the end of 2020, doubling the state’s current output of 1,800 megawatts—have created a surge of interest in developing solar projects, but the state’s high population density means that available land is scarce.
Furthermore, many local food advocates argue that an inadequate portion of the food consumed in Massachusetts is grown there. The short growing season along with high costs for labor and land can make farming in Massachusetts a financially precarious proposition. Some advocates say that dual-use solar installations have the potential to ease a number of these problems at once.
Traditionally, when solar developers turn to farmland for their projects, the property is leased or sold to the developer, the topsoil is stripped, and the panels are mounted on concrete footings embedded in the land. While the shift boosts renewable energy generation, it weakens the local food supply. Some counties have even started prohibiting large-scale solar developments on agricultural property as a way to preserve the land.
Dual-use developments are particularly suited to Massachusetts’ needs, and the state is seizing the opportunity. The UMass installation, a partnership between private solar company Hyperion Systems and the university, is home to a unique research project aimed at calculating exactly how different crops fare when grown beneath raised panels. And the state’s new solar incentive program, known as Solar Massachusetts Renewable Target (SMART), offers extra compensation for dual-use projects.
“To our knowledge, no place else is doing what we’re doing,” said Michael Lehan, an advisor to Hyperion Systems.
Agrivoltaics’ Origin Story
The story of dual-use projects in Massachusetts dates to 2008, when construction company owner Dave Marley installed a solar array on the roof of his headquarters in Amherst and quickly decided he wanted to generate even more energy. As he started considering farmland as a location, Marley became determined to find a way to avoid interrupting the land’s agricultural use.
“He kept emphasizing, ‘I want to keep the land alive and well. I don’t want to cover up the land,’” said Gerald Palano, renewable energy coordinator for the Massachusetts Department of Agricultural Resources.
In 2009, Marley connected with researchers at UMass, and in 2010 his vision became a reality with the construction of a 70-panel array at a research farm in South Deerfield, Massachusetts. The following year, Marley formed Hyperion Systems to pursue this new approach to renewable energy. Marley died in 2013, and his son James has taken over the business.
Today, the dual-use installation Dave Marley envisioned remains in place, advancing his goals. The lower ends of the panels are raised seven feet off the ground and rise to 15 feet in the air. They’re spaced far enough apart to allow sunlight to pass through to the field below and can be shifted horizontally to adjust the gap. The panels are supported by vertical poles embedded 10 feet into the ground. No concrete is used, so the damage to the soil is limited and completely reversible.
“What farmers really care about is the land,” Lehan said. “And there is minimum soil disruption.”
Since the array came online in 2011, Stephen Herbert, professor of agronomy at UMass, has been studying the impact of the panels on crop growth. His results have been encouraging.
When he cultivated grass and other forage plants to support grazing cows, the land under the panels yielded more than 90 percent as much volume as land that received direct sun. For beef or dairy farmers, agrivoltaic arrays are a “no-brainer,” Herbert said, between bites of a fresh Brussels sprout he plucked from a stalk beneath a panel.
Early results suggest that, when grown under the panels, vegetables such as peppers, broccoli, and Swiss chard can produce about 60 percent of the volume they would in full sun. At the same time, a dual-use system offers about half the power-generation capacity per acre of a conventional installation, and the costs are higher.
However, while these systems offer less energy generation and lower crop productivity than solar panels or agriculture alone, the combination generally pays off.
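One way to make the tradeoff concrete is the land equivalent ratio (LER), a standard agrivoltaics metric that the article does not name but that captures its point: using the article's rough figures of about 60 percent crop yield and about half the per-acre generation of a conventional array, a minimal sketch looks like this.

```python
# Land equivalent ratio (LER): how much land two separate monoculture
# plots (one farmed, one covered in conventional solar) would need to
# match one acre of dual-use output. Fractions below are the article's
# rough numbers, not measured results.

def land_equivalent_ratio(crop_fraction: float, energy_fraction: float) -> float:
    """Sum of dual-use output relative to each single-use baseline."""
    return crop_fraction + energy_fraction

ler = land_equivalent_ratio(crop_fraction=0.60, energy_fraction=0.50)

# An LER above 1.0 means one dual-use acre out-produces the same acre
# split between dedicated farming and dedicated solar.
print(f"LER ~ {ler:.1f}")  # LER ~ 1.1
```

On these assumed figures the combined acre delivers roughly 10 percent more total output than splitting the land, which is consistent with the article's claim that the combination generally pays off.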
“Absolutely,” Lehan, Palano, and James Marley said, nearly in unison.
Hyperion estimates that its dual-use installations pay for themselves in about eight years, under average conditions. State and federal grants can shorten that timeline.
To help accelerate the adoption of this new approach, Massachusetts is putting some money on the line. In November, the state launched the SMART program, which pays solar owners a fixed base rate for every kilowatt-hour of energy they generate. The amount they earn is then deducted from the cost of the electricity they draw from the grid when the panels aren’t producing enough power. If an owner produces more energy than they use, they can apply those credits to future bills.
The underside of a dual-use solar array. The hardware used to mount the panels makes it easy to reposition each panel to maximize the light reaching the crops underneath.
Base rates for these solar installations range from 15 cents to 39 cents per kilowatt-hour, depending on the size and location of the development. Projects that combine solar panels with farming earn an extra 6 cents per kilowatt-hour. In practical terms, a 1-megawatt system of fixed-position panels on agricultural land, producing around 1.2 million kilowatt-hours of energy in a year, would earn an extra $72,000 toward its electric bill.
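The arithmetic behind that figure is simple to check. All numbers below come from the article; the system itself is its hypothetical example, not a real installation.

```python
# Back-of-the-envelope check of the SMART dual-use adder example:
# a 1 MW fixed-tilt array producing ~1.2 million kWh per year,
# with the 6-cent-per-kWh dual-use bonus on top of the base rate.

ANNUAL_OUTPUT_KWH = 1_200_000  # article's estimate for a 1 MW system
DUAL_USE_ADDER = 0.06          # extra $/kWh for combining solar with farming

extra_credit = ANNUAL_OUTPUT_KWH * DUAL_USE_ADDER
print(f"${extra_credit:,.0f}")  # $72,000
```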
In the first six weeks of SMART, five applications proposed dual-use projects, and another two developers submitted pre-determination requests, an earlier step in the process. Several more developers have inquired about potential developments, Palano said. The proposed projects range from 249 kilowatts to 1.6 megawatts.
“We think the level of interest is there from large-scale developers and others but the concept is new, so they are needing to invest more time to better understand,” he said. “We’re happy to see the interest so far.”
Not every agricultural area will benefit from agrivoltaics. The added costs and effort might not make sense in a region that already has plenty of open, non-agricultural space to host solar arrays, for example.
In places like Massachusetts, however, Palano said the technology is only going to get better and more helpful. He’s already seeing interest in panels that move to follow the sun, maximizing their energy generation. He also expects growing interest in storage, essentially large batteries that can collect power and save it for use when the sun isn’t shining. The future may even include translucent panels that would let more light through to plants growing below, he added.
“We’re saying, ‘let’s see if we can get this to the next level,’” said Palano. “We’re looking forward to the innovation.”
In the rush to go green, exercise equipment powered by you may attract the climate conscious.
By Eric Roston, January 22, 2019
SportsArt treadmill. (Source: SportsArt)
As scientists seek more ways to harness nature’s power to produce renewable energy, there’s one energy source burned naturally every day that isn’t being harnessed: calories.
SportsArt, a 42-year-old Taiwanese athletic equipment maker, is trying to change that by selling exercise treadmills, ellipticals and cycles that turn workouts into electricity, feeding it back into the building through an electrical socket. The company recently showcased the third generation of its treadmill to attendees of the International Consumer Electronics Show in Las Vegas.
“Think of a hamster wheel,” said Ruben Mejia, chief technology officer of SportsArt. “You’re the hamster and the treadmill is the wheel. As soon as you start turning that wheel, we’ve got a generator inside that starts producing power.”
Physical exertion is powered by combustion reactions—small cellular ones. As SportsArt writes in its patent filings, “it is a pity that the energy is not utilized.”
There is a problem of scale, however. The treadmill’s maximum output is 200 watts. The average American uses about 28,000 watt-hours a day. A maximum workout, generating 200 watts for a full hour, would save 2.4 cents, assuming an electricity cost of $0.12 a kilowatt-hour, plus the power that would have been used by a motorized machine.
The company’s bikes and elliptical trainers can generate up to 250 watts. On the treadmill, a 147-pound person running roughly 8-minute, 20-second miles would put out only about 24 watt-hours every 30 minutes, enough to run a Wi-Fi router for about four hours. A 176-pound person lightly jogging for 20 minutes could power a 60-watt lightbulb long enough to light the room while they’re working out.
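These figures are easy to sanity-check. The $0.12-per-kilowatt-hour rate, the 200-watt peak, and the 24-unit running figure are the article's; the roughly 6-watt Wi-Fi router draw is an assumption introduced here to test the "four hours of Wi-Fi" claim, treating the running figure as 24 watt-hours.

```python
# Rough arithmetic behind the treadmill's savings and output figures.

PRICE_PER_KWH = 0.12       # $ per kilowatt-hour, from the article

# A maximum workout: 200 W sustained for one hour = 0.2 kWh.
peak_hour_kwh = 200 / 1000
savings_cents = peak_hour_kwh * PRICE_PER_KWH * 100
print(round(savings_cents, 1))  # 2.4 cents

# A 30-minute run producing ~24 Wh, powering an assumed ~6 W router.
run_wh = 24
router_w = 6   # assumption, not from the article
print(run_wh / router_w)        # 4.0 hours of Wi-Fi
```

Under that assumed router draw, the 24-watt-hour figure lines up with the article's four-hours-of-Wi-Fi comparison.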
Factoring in the avoided electricity use, SportsArt’s “Eco-Powr” equipment could, with continual use, save almost $900 a year compared with other brands’ treadmills, according to Mejia. Units cost about $10,000 each, and are sold to gyms, assisted-living centers, universities and beyond. Consumer models are in the works.
Given the small size of the benefit, and a price that’s five times more than a traditional treadmill, why would a gym buy one of these? Being even a little green is increasingly a selling point all by itself, or so the thinking goes.
“There’s a lot of gyms that are going green and they’re going green in a variety of ways, whether it’s like zero waste or being a net zero property,” Mejia said.
Paul Crane owns Eco-Gym, a “sustainable gym” in Brighton, England, that uses SportsArt equipment. In the past, the facility reduced fees based in part on how much power members generate while working out. He said members “definitely feel motivated and committed to improving their own health and that of the planet.” Other clients include boutique gyms that can charge more for amenities like power-generating equipment, where it’s less about saving energy and more about making a statement.
Getting to the gym is difficult enough for busy, working people. Being able to measure one’s own power output may be the added mental incentive, or trigger, people need to get moving, even if it’s “just giving people a sense that they are burning energy and seeing some results,” said Dan Ariely, a psychology and behavioral economics professor at Duke University and an author.
Conventional treadmills have motors that put the belt in motion as soon as the workout begins. That costs electricity, as does the electronic workout display. The SportsArt treadmill has no motor. It’s powered initially by gravity. The workout begins when a brake on the belt is released. The unit is set at a 4-degree angle relative to the floor, just enough for the belt—which is really a mat composed of horizontal slats rolling on ball bearings to reduce friction—to slip backwards under the weight of the runner or walker. As feet pound forward, the belt spins rollers that capture the motion and convert it into electricity.
A micro-inverter, a device that regulates the flow of current, translates it into the form of electricity powering the house or building, and shoots it right back into the electrical socket. The modest additional power flows to whatever needs it first, nearby electronics that share the same outlet, or deeper into the building. For now at least, the current can’t flow through the breaker box, out of the structure, and onto the grid.
Since the machines generate the most energy during more intense workouts, the amount of calories burned doesn’t necessarily translate into power. Mejia said that while you can burn a lot of calories taking a three-hour stroll at three miles per hour, “you’re not going to be producing a lot of electricity.”