MAKING OUR GREEN ECONOMY COME ALIVE AND THRIVE™

NEWS

ENVIRONMENTAL NEWS

Remember to also click on and check out the "Green Local 175 News" for environmental and green news within a 175-mile radius of Utica-Rome. Organizations and companies can send their press releases to info@greenlocal175.com. Express your viewpoint on a particular issue by sending an e-mail to our opinion page, "Urge to be Heard." Lastly, for more environmental information, listen to our Green Local 175 Radio & Internet Show.

The Gulf of St. Lawrence is Losing Oxygen Faster than Anywhere Else

Posted: Sep 30 2018, 6:44am CDT | by Hira Bashir, Updated: Sep 30 2018, 6:49am CDT, in News | Latest Science News

Researchers explain how climate change is causing a dramatic oxygen decline in a large region of ocean.

Global warming has significantly altered Earth’s atmosphere. While oceans in general have long suffered from low oxygen levels, the Gulf of St. Lawrence has been warming and losing oxygen faster than almost anywhere else in the global ocean. In a recent study, researchers investigated the causes of this rapid deoxygenation.

The Gulf of St. Lawrence is a large body of water that drains North America's Great Lakes and harbors an incredibly diverse marine ecosystem. Researchers suggest that the unprecedented drop in the Gulf's oxygen levels is driven by changes in two of the ocean's most powerful currents: the Gulf Stream and the Labrador Current. Those changes have allowed lower-oxygen water to invade Canada's Gulf of St. Lawrence.

“The oxygen decline in this region was already reported, but what was not explored before was the underlying cause,” said lead researcher Mariona Claret from University of Washington, who did the work while at Canada's McGill University. “Observations in the very inner Gulf of St. Lawrence show a dramatic oxygen decline, which is reaching hypoxic conditions, meaning it can't fully support marine life.”

Canada's fisheries agency has been tracking temperature and oxygen levels in the St. Lawrence region for decades, but the connection between the two had never been considered.

To find out what caused the oxygen loss in the region and what role climate played, researchers used output from NOAA's Geophysical Fluid Dynamics Laboratory model. The model output, combined with historical observations, showed that over the past century the Gulf Stream has shifted northward and the Labrador Current has weakened as carbon dioxide levels have risen. As carbon dioxide increases, the Gulf Stream's warm, salty, oxygen-poor water pushes into deeper parts of the Gulf of St. Lawrence. Because warm water holds less dissolved oxygen, the result is further oxygen loss. This shift in large-scale ocean circulation is driving both the warming and the deoxygenation of the Gulf of St. Lawrence.

"We relate a change in oxygen on the coast to a change in large-scale currents in the open ocean," said Claret. "Being able to potentially link the coastal changes with the Atlantic Meridional Overturning Current is pretty exciting.

In the Heart of the Corn Belt, an Uphill Battle for Clean Water

Runoff from farms and feedlots has badly polluted Iowa’s waterways, more than half of which do not meet federal quality standards. Now, an unlikely coalition is calling for stricter controls to clean up the drinking water sources for millions of the state’s residents.

BY MARK SCHAPIRO • SEPTEMBER 25, 2018

“Health trumps politics,” said Iowa State Senator David Johnson before taking the stage at a raucous rally in Des Moines last winter to support strengthening the state’s water quality. In the marble rotunda of the state capitol, he rose to denounce the nitrogen and phosphates that have been flowing in ever-increasing quantities into Iowa’s public water supplies — and was cheered by the small crowd of family farmers, concerned mothers, and his new political allies, the legislature’s drastically outnumbered Democrats. Johnson had been one of the longest-serving Republicans in Iowa until he left the party to become an independent in 2016 after defying it repeatedly on one of the most divisive issues in Iowa — the integrity of the state’s water.

Iowa’s nitrogen load has been accelerating despite more than $100 million spent by the federal and state governments to rein it in. Since 1999, the concentration of nitrogen in the state’s major waterways has increased by almost 50 percent, according to a study from the University of Iowa published last spring in PLOS One. The battle over Iowa’s water had long been framed as one between rural and urban interests, until Johnson, whose district is one of the most thinly populated and heavily farmed in the state, came along.

In 2002, Senator Johnson co-authored one of the state’s key water statutes, the “Master Matrix,” which was supposed to establish health and safety guidelines for CAFOs and industrial farms in Iowa. By 2012 he was seeing how the relatively lax controls he had authored were being exploited, leaving his constituents vulnerable to the health consequences of escalating pollution from agricultural runoff. Nitrate pollution increased from about 200 million pounds in 2002 to more than a billion pounds in 2016. “I helped write this law in 2002,” he says, “and it’s terribly outdated with the current conditions that we’re seeing right now.”

Ninety-two percent of the nitrogen and 80 percent of phosphates in Iowa’s waters come from farms and animal feedlots.

Johnson describes himself as a moderate and a strong opponent of abortion. He is tall, loose-limbed, and acerbic. His concern over the health impacts of the state’s increasingly polluted water supplies has turned him into an unlikely leader of the fight now underway over the accelerating stresses on water in Iowa, where large industrial farms abut key waterways.

More than 750, or 58 percent, of the state’s rivers and streams do not meet federal water quality standards and are designated by the Iowa Department of Natural Resources (DNR) as too contaminated for swimming or consuming fish caught there — making a state once renowned for its lattice of waterways into a mess of inaccessible creeks, streams and lakes. Another 23 percent fall into a category of being “potentially impaired,” which the state defines as “waters in need of further investigation.” Ninety-two percent of the nitrogen and 80 percent of phosphates in the state’s waterways, says the DNR, come from farms and animal feedlots.

More than a billion pounds of nitrogen entered Iowa's waterways in 2016, largely from fertilizer runoff. USDA/NRCS

All the water running through Johnson’s district — in the Floyd, Little Sioux, and Maple rivers — ultimately flows west into the Missouri River, which winds its way south to the Mississippi River, which then dumps the toxic legacy of Iowa farms into the Gulf of Mexico. It’s the same for rivers like the Des Moines and the Raccoon, which wind their way through Iowa and end at the Mississippi. For the last decade, Iowa has been the first or second biggest contributor to the toxic dead zone in the Gulf, an area about the size of New Jersey where few living creatures can survive because the pollution has deprived the water of sufficient oxygen.

Climate change is exacerbating the toxic stew: Hotter temperatures are turning river waters into what David Cwiertny, director of the Center for Health Effects of Environmental Contamination (CHEEC), at the University of Iowa, in Iowa City, describes as “a breakfast buffet for micro-bacteria and algae.”

But long before the waters of northern and western Iowa reach the Mississippi, they pass through the more densely populated towns and cities of Iowa and leave their legacy in the water that millions of Iowans rely on to drink.

On that January day, Johnson was joined in the capital by an unlikely ally, Bill Stowe, CEO and general manager of the Des Moines Water Works (DMWW), a public utility responsible for the safety of the city’s drinking water. What Johnson sees flowing off the fields across Iowa, Stowe sees in the water he’s supposed to ensure is safe for the more than 500,000 residents of the greater Des Moines area, situated at the confluence of the Raccoon and Des Moines Rivers.

“Iowa has become the sacrifice state for industrial agriculture,” says Bill Stowe, a Des Moines water official.

The two could not be less similar in appearance and demeanor. Urbane and charismatic, with a long mane of gray hair, Stowe looks more like the keyboard player in a classic rock band than the state capital’s chief water engineer. Stowe famously filed suit against two upriver counties in 2015 demanding compensation for the costs of filtering Des Moines water to meet minimum drinking water quality standards. The price tag for installing the city’s filtration facility in 1991 was $4.1 million, and it costs $7,000 a day to operate. The utility has raised its water rates between 4 and 10 percent every year over the past decade to keep pace with the rising filtration costs, and in 2017 it announced that it would have to either double the capacity of its nitrogen-filtering tanks at a cost of $15 million or develop alternative groundwater sources. The water utility has had to spend as much as 9 percent of its operating budget on nitrate treatments.

Stowe has survived Republican efforts to defund his agency, and has joined forces with Johnson in a so far unsuccessful effort to pass legislation that would limit the agricultural and livestock industry’s ability to contaminate rivers used for drinking water.

“Iowa,” says Stowe, “has become the sacrifice state for industrial agriculture. As a native Iowan and somebody whose family has been around here for generations, we are the sacrifice, just like West Virginia has become the sacrifice state for coal.”

A sequence of studies of women in Iowa over the age of 55 found that sustained exposure to nitrogen levels of just 5 milligrams per liter — half the federally determined “safe rate” — may contribute to increased risks of bladder, ovarian, and thyroid cancer. According to Peter Weyer, former director of CHEEC and one of the nation’s foremost experts on nitrates in drinking water, more than half of the 42,000 Iowa women monitored by the center over a 30-year period had relied on the same water source for more than 20 years, giving extra credence to the researchers’ conclusion that it was nitrogen in the water, and not other potential sources, that was the main contributor to the elevated cancer rates. Another study on birth defects that centered on 10 locales around the country, including Iowa, found evidence suggesting newborn children face increased risk of conditions such as spina bifida and cleft palate if their mothers have sustained exposure to drinking water with half the federal limit for nitrogen.

Across rural Iowa, tens of thousands of residents get their drinking water from private wells, sources that are specifically exempted from the federal Safe Drinking Water Act and therefore unregulated for nitrogen or other contaminants, according to Craig Cox, director of the nonprofit Environmental Working Group’s Midwest office in Ames, Iowa. “It’s one thing to spread nitrate removal costs over half a million people [in Des Moines],” he said, “but it’s totally different if it’s a couple of thousand people.”

Every morning when Stowe goes to work, he confronts the challenge posed by Iowa’s upstream agriculture: which river to choose for his city’s water supply. He only has two options: either the Raccoon, which flows past a string of factory farms from the west, or the Des Moines, flowing to the city past farms to the northwest. Each morning, an intake valve delivers a water sample from each, which is run through the water works’ laboratory to test for contaminants.

One day, the Raccoon might exceed the EPA standard of 10 parts per million of nitrogen. Try the Des Moines? On that same day, the Des Moines might not be an easy substitute because rates of phosphorus are up, which raises the prospect of toxic algae proliferating and producing poisonous microcystins. There’s no way to filter algae; the filters get clogged. And if you have too much algae in the water you might have to shut down the public water supply altogether — which is what happened in the city of Toledo, Ohio, over three successive days in 2014. The incident in Toledo haunts Midwestern water officials. If the Des Moines is showing too much phosphate, as often occurs, Stowe will choose the Raccoon despite the elevated nitrogen levels, a decision that means firing up the city’s nitrogen filtration system that day and adding another $7,000 to the ratepayers’ cumulative bill.
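As a rough illustration of the daily source-selection logic described above, here is a minimal sketch; the thresholds, function name, and structure are hypothetical assumptions for illustration, not the utility's actual procedure:

```python
# Hypothetical sketch of the daily choice between the Raccoon and Des Moines rivers.
# Thresholds and structure are illustrative assumptions, not DMWW's real rules.
NITRATE_LIMIT_PPM = 10.0        # EPA drinking-water nitrate standard cited in the article
PHOSPHATE_ALGAE_RISK_PPM = 0.3  # placeholder threshold for "phosphorus is up"

def choose_source(raccoon_nitrate_ppm, des_moines_nitrate_ppm, des_moines_phosphate_ppm):
    """Return (river_to_use, must_run_nitrate_filter) for the day's intake."""
    if des_moines_phosphate_ppm >= PHOSPHATE_ALGAE_RISK_PPM:
        # Elevated phosphorus raises the risk of toxic algae, which cannot be filtered out,
        # so take the Raccoon even if that means running the $7,000-a-day nitrate system.
        return "Raccoon", raccoon_nitrate_ppm > NITRATE_LIMIT_PPM
    if raccoon_nitrate_ppm > NITRATE_LIMIT_PPM:
        return "Des Moines", des_moines_nitrate_ppm > NITRATE_LIMIT_PPM
    return "Raccoon", False

print(choose_source(12.4, 8.1, 0.5))  # -> ('Raccoon', True): algae risk forces the choice
```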

Stowe’s lawsuit, originally filed in 2015, was intended to hold upstream agricultural counties liable for the mounting clean-up costs, but a state judge dismissed the suit on jurisdictional grounds. The judge ruled that a city water district has no legal standing to sue a county, and threw the matter to the state legislature. The legislature, according to Stowe, has done little to address the problem. Instead, Republican lawmakers have tried — thus far unsuccessfully — to defund the Des Moines Water Works and replace it with a body directly accountable to them. Senator Johnson further enraged his former GOP allies earlier this year when he fiercely opposed their low-budget approach to water monitoring: “We have done nothing but spit in the river that looks like chocolate milk,” he told Iowa Public Radio in denouncing the Republican water plan.

The EPA has identified phosphate and nitrogen farm runoff as “the single greatest challenge to our nation’s water quality.”

Since 2008, state funding has been cut for water monitoring, as well as for conservation incentives designed to discourage overuse of fertilizers. Initiatives to promote the planting of cover crops to absorb excess nitrogen and the building of buffers between nitrogen-saturated fields and streams also lost funding. The Aldo Leopold Center, which is at the cutting edge of researching more regenerative agricultural practices — including strategies for reducing nitrogen-based fertilizer runoff — had its funding withdrawn in 2017, and its future has been in limbo ever since. The budget for state environmental health positions was cut in half between 2009 and 2017, from $22 million to $11 million.

“I am pro-life,” Johnson said. “But what’s also very important to me is what happens between my concept of life at conception and what’s at the end of that. We need to fill that space with good things — excellence in education, a clean environment. That’s number one.” In 2016, he left the party shortly after the nomination of Donald Trump for president — which tipped the scales for Johnson. He became the lone independent in the Iowa legislature. In June, he dropped out of his race for reelection after facing major obstacles in raising money to run in the solidly Republican conservative district.

In 2016, in the waning days of the Obama administration, the EPA identified phosphate and nitrogen farm runoff in U.S. water supplies as a serious threat to the public’s health, identifying the two contaminants as “the single greatest challenge to our nation’s water quality.” The agency called for identifying those responsible and began devising strategies to be pursued by the federal government to slow down the rates of contamination. Instead, the Trump administration cut the EPA’s water monitoring budget by almost a third and eviscerated water quality enforcement.

Meanwhile, millions of residents in Iowa and along the Mississippi River corridor remain, as in the case of Des Moines, on the hook for millions of dollars in filtration costs, and exposed to the harmful impacts on their health.

Water quality is turning out to be a major issue in the upcoming midterm election in Iowa — most notably in the state’s race for governor between the incumbent, Republican Kim Reynolds, and Democrat Fred Hubbell, who has made water quality a key part of his campaign platform. In August, Johnson endorsed the Democrat, a first in his long career in state politics.

Travel for reporting this article was made possible by an award from The Fledgling Fund and Invoking the Pause.

Mark Schapiro is an award-winning environmental journalist and author of the recently published Seeds of Resistance: The Fight to Save Our Food Supply. He is also a lecturer at the University of California at Berkeley Graduate School of Journalism.

High CO2 levels cause plants to thicken their leaves, could worsen climate change effects

UNIVERSITY OF WASHINGTON

Plant scientists have observed that when levels of carbon dioxide in the atmosphere rise, most plants do something unusual: They thicken their leaves.

And since human activity is raising atmospheric carbon dioxide levels, thick-leafed plants appear to be in our future.

But the consequences of this physiological response go far beyond heftier leaves on many plants. Two University of Washington scientists have discovered that plants with thicker leaves may exacerbate the effects of climate change because they would be less efficient in sequestering atmospheric carbon, a fact that climate change models to date have not taken into account.

In a paper published Oct. 1 in the journal Global Biogeochemical Cycles, the researchers report that, when they incorporated this information into global climate models under the high atmospheric carbon dioxide levels expected later this century, the global "carbon sink" contributed by plants was less productive -- leaving about 5.8 extra petagrams, or 6.39 billion tons, of carbon in the atmosphere per year. Those levels are similar to the amount of carbon released into the atmosphere each year due to human-generated fossil fuel emissions -- 8 petagrams, or 8.8 billion tons.
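For readers checking the figures, the petagram-to-ton conversion works out as follows (a minimal sketch assuming the article's "tons" are U.S. short tons; the function name is ours):

```python
# Illustrative check of the petagram-to-ton figures quoted above.
# 1 petagram = 1e15 grams = 1e9 metric tons; 1 metric ton is about 1.10231 U.S. short tons.
SHORT_TONS_PER_METRIC_TON = 1.10231

def petagrams_to_billion_short_tons(petagrams):
    """Convert petagrams of carbon to billions of U.S. short tons."""
    return petagrams * SHORT_TONS_PER_METRIC_TON

print(round(petagrams_to_billion_short_tons(5.8), 2))  # 6.39, matching the study's figure
print(round(petagrams_to_billion_short_tons(8.0), 2))  # 8.82, close to the quoted 8.8
```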

"Plants are flexible and respond to different environmental conditions," said senior author Abigail Swann, a UW assistant professor of atmospheric sciences and biology. "But until now, no one had tried to quantify how this type of response to climate change will alter the impact that plants have on our planet."

In addition to a weakening plant carbon sink, the simulations run by Swann and Marlies Kovenock, a UW doctoral student in biology, indicated that global temperatures could rise an extra 0.3 to 1.4 degrees Celsius beyond what has already been projected to occur by scientists studying climate change.

"If this single trait -- leaf thickness -- in high carbon dioxide levels has such a significant impact on the course of future climate change, we believe that global climate models should take other aspects of plant physiology and plant behavior into account when trying to forecast what the climate will look like later this century," said Kovenock, who is lead author on the paper.

Scientists don't know why plants thicken their leaves when carbon dioxide levels rise in the atmosphere. But the response has been documented across many different types of plant species, such as woody trees; staple crops like wheat, rice and potatoes; and other plants that undergo C3 carbon fixation, the form of photosynthesis that accounts for about 95 percent of photosynthetic activity on Earth.

Leaves can thicken by as much as a third, which changes the ratio of surface area to mass in the leaf and alters plant activities like photosynthesis, gas exchange, evaporative cooling and sugar storage. Plants are crucial modulators of their environment -- without them, Earth's atmosphere wouldn't contain the oxygen that we breathe -- and Kovenock and Swann believed that this critical and predictable leaf-thickening response was an ideal starting point to try to understand how widespread changes to plant physiology will affect Earth's climate.

"Plant biologists have gathered large amounts of data about the leaf-thickening response to high carbon dioxide levels, including atmospheric carbon dioxide levels that we will see later this century," said Kovenock. "We decided to incorporate the known physiological effects of leaf thickening into climate models to find out what effect, if any, this would have on a global scale."

A 2009 paper by researchers in Europe and Australia collected and catalogued data from years of experiments on how plant leaves change in response to different environmental conditions. Kovenock and Swann incorporated the collated data on carbon dioxide responses into Earth-system models that are widely used in modeling the effect of diverse factors on global climate patterns.

The concentration of carbon dioxide in the atmosphere today hovers around 410 parts per million. Within a century, it may rise as high as 900 ppm. The carbon dioxide level that Kovenock and Swann simulated with thickened leaves was just 710 ppm. They also discovered the effects were worse in specific global regions. Parts of Eurasia and the Amazon basin, for example, showed a higher minimum increase in temperature. In these regions, thicker leaves may hamper evaporative cooling by plants or cloud formation, said Kovenock.

Swann and Kovenock hope that this study shows that it is necessary to consider plant responses to climate change in projections of future climate. There are many other changes in plant physiology and behavior under climate change that researchers could model next.

"We now know that even seemingly small alterations in plants such as this can have a global impact on climate, but we need more data on plant responses to simulate how plants will change with high accuracy," said Swann. "People are not the only organisms that can influence climate."

Tests Showed Children Were Exposed to Lead. The Official Response: Challenge the Tests

For at least two decades, the New York City Housing Authority routinely disputed tests that revealed lead in its apartments. Private landlords almost never do this.

Nov. 18, 2018

Mikaila Bonaparte has spent her entire life under the roof of the New York City Housing Authority, the oldest and largest public housing system in the country, where as a toddler she nibbled on paint chips that flaked to the floor. In the summer of 2016, when she was not quite 3 years old, a test by her doctor showed she had lead in her blood at levels rarely seen in modern New York.

A retest two days later revealed an even higher level, one more commonly found in factory or construction workers and, in some cases, enough to cause irreversible brain damage.

Within two weeks, a city health inspector visited the two Brooklyn public housing apartments where Mikaila spent her time — her mother’s in the Tompkins Houses; her grandmother’s in the Gowanus Houses — to look for the source of the lead exposure, records show. The inspector, wielding a hand-held device that can detect lead through multiple layers of paint, found the dangerous heavy metal in both homes. The Health Department ordered the Housing Authority to fix the problems.

The discovery spurred the Housing Authority to action: It challenged the results.

Rather than remove or cover the lead, the Housing Authority dispatched its own inspector who used a different test, documents show. The agency insisted that however Mikaila was poisoned, there was no lead in her apartments.

Entrusted as the landlord to 400,000 people, the Housing Authority has struggled for years to fulfill its mission amid a strangled budget and almost endemic political neglect. Last week, a judge suggested strongly that the federal government should take over the agency after an investigation found evidence of deep mismanagement, including that the Housing Authority failed to perform lead inspections and then falsely claimed it had. Six top executives lost their jobs amid the federal investigation; a complaint was filed in June.

But the authority did not just ignore the required lead inspections, The New York Times found.

For at least two decades, almost every time a child in its apartments tested positive for high lead levels, Nycha launched a counteroffensive, city records show. From 2010 through July of this year, the agency challenged 95 percent of the orders it received from the Health Department to remove lead detected in Nycha apartments.

Private landlords almost never contest a finding of lead; they did so in only 4 percent of the 5,000 orders they received over the same period, records show.

Nycha’s strategy often worked. The Health Department backed down in 158 of 211 cases in public housing after the authority challenged its finding, the data shows. A Health Department spokesman said that it rescinded its orders because it became convinced that its initial test was a false positive.

“I’m not sure how useful it is to spend all the time and resources going back and forth with testing when maybe we could spend the time and resources making sure the exposure is controlled,” said David Jacobs, who ran the lead poisoning prevention program at the United States Department of Housing and Urban Development from 1995 to 2004.

It is emblematic of disarray in the Housing Authority’s lead policy that stretches back decades, an examination by The Times found.

The Times interviewed more than 100 current and former top city and federal housing officials, maintenance workers, building managers, lead contractors, health experts and public housing residents and reviewed thousands of pages of documents and court records. Taken together, they reveal an agency that assumed lead was no longer a threat, despite not really knowing where it was.

After suing lead paint companies in 1989, the city spent years arguing in court that its public housing buildings were riddled with lead. But as the case wound down, the Housing Authority adopted the opposite position, routinely contesting findings of lead. By 2004, the authority decided that only 92 of its 325 developments contained lead and clung to that position.

It was apparently wrong.

The Gowanus Houses in Brooklyn are not on a list of buildings that the Housing Authority believes have lead. But lead has been found in its apartments at least twice, The Times found.

Inspection failures

Soon after he took office in 2014, Mayor Bill de Blasio, a former regional director for HUD, signaled his support for the public housing system, modeling himself after former Mayor Fiorello H. La Guardia, who oversaw the creation of the authority in the 1930s. Mr. de Blasio repeatedly visited the red brick developments in his first year, holding eight news conferences, and he put billions into capital projects, including new lighting and roofs.

Of all the problems afflicting Nycha, lead was thought to be somewhere toward the bottom of the list, former officials said.

Lead paint — which becomes dangerous when it peels into flakes or is ground into dust that people can ingest — was once a pernicious threat in homes, schools and factories all over the country. The substance affects children differently but can stunt growth and cause permanent cognitive and behavioral problems in developing brains.

It has been banned in New York City since 1960 and subject to a federal ban in 1978. Since then, cases of lead poisoning have dropped precipitously in the city and nationwide.

So the Housing Authority just stopped looking for it.

Lead came up in discussions at Gracie Mansion after the metal was found in the water supply in Flint, Mich., in 2015. The consensus among New York officials was that they did not have to worry about a lead problem in the Housing Authority, according to a person with direct knowledge of the conversations.

Word came in late 2015 to City Hall that the Housing Authority was the subject of a sprawling federal investigation that included lead paint. When the inquiry became public the following March, the de Blasio administration played down the problem, even as it began to learn of inspection failures, several former officials said. Administration officials were dismissive of stories about lead exposure that had appeared in The Daily News.

Not long afterward, public housing residents received letters from the authority requesting access for inspections. The authority did not want to create “a panic” among residents, two people with direct knowledge of the conversations said, so the letters said nothing about yearslong lapses in mandatory lead paint checks.

Some members of the authority’s own board were also not informed. “Maybe they just didn’t want us to know,” said Beatrice Byrd, a resident of the Red Hook Houses in Brooklyn and a board member at the time.

Anthony Marshall, who works for a private lead inspection contractor, waited outside an apartment in the Red Hook Houses in Brooklyn in September.

Nycha, like all public housing that receives federal funds, must abide by myriad rules, and since 2000, one of those has been to look around older apartments every year to check for possible lead paint hazards, like peeling or flaking paint.

Once those are spotted, inspectors are supposed to conduct more sophisticated tests to determine whether lead is actually present.

In theory, the authority met that requirement by including those checks in its general apartment inspections, which happened annually until 2012. In reality, The Times found, looking for potential lead hazards was rarely part of the routine even before 2012, according to interviews with maintenance workers, residents and officials.

“We’re maintenance. We’re doing the inspections. We’re not mainly checking for lead,” said Tyree Haslip, a retired building superintendent who worked in the Queensbridge Houses. “The only thing we may mark down is if the paint is peeling off the walls or off the ceiling.”

That did not mean someone came to test or fix the potential hazard. Lead abatement teams worked mostly in buildings that Nycha believed to contain lead and usually only after a resident moved out, its workers said.

In the summer of 2012, the authority stopped making its annual maintenance rounds entirely, in response to a federal rule change.

The decision to stop those apartment inspections came under former Mayor Michael R. Bloomberg. The authority was eager to direct its maintenance workers to conduct repairs rather than perform so many inspections, two former officials said, to clear a ballooning backlog of open work orders, often called tickets.

“There was such pressure to get tickets completed,” said Paul D’Ambrosi, a former paint inspector who retired in 2012.

Officials are still unsure when lead inspections were last done, The Times found.

Olivia Lapeyrolerie, a spokeswoman for Mr. de Blasio, said in response to questions from The Times that City Hall could not find evidence that inspections even before 2012 were in compliance with local and federal laws.

“At this point, we have no confidence in Nycha’s annual inspections that took place before our administration began,” Ms. Lapeyrolerie said in a statement.

The inspection failures began coming to light inside the authority in 2015, according to a report by the city Department of Investigation. By the time the de Blasio administration began quietly starting to check for lead paint hazards in 2016 and 2017, it had been years since anyone had done so.

The neglect showed.

In a two-month stretch at the end of 2017, contractors hired by the city visited 8,300 apartments and found potential lead paint hazards — peeling or flaking paint, or dust — in 80 percent of them, according to records produced as part of a lawsuit in state court.

The suit, filed by a tenant group, is one of several the authority currently faces over lead, including a state court suit by Mikaila’s mother, Shari Broomes, and a separate action in federal court by the families of dozens of children who have tested positive for lead in recent years.

“You all have Band-Aids that you put on everything in the projects, but you all could not put a Band-Aid on something that was harmful to my daughter?” Ms. Broomes said. “It didn’t come from me, it came from my place of dwelling, and I can’t help the fact that we live here.”

This spring, the judge in the tenants’ case ordered new inspections, saying there was a “credibility issue” with testing overseen by the Housing Authority.

A new round of visual checks this year found hazards such as peeling paint in 92 percent of the apartments that were checked.

Federal authorities have said the city Housing Authority failed to conduct lead paint inspections and then falsely claimed it had. The revelations have hurt Mayor Bill de Blasio’s reputation as a champion for public housing.

Not that long ago, the city was busy trying to convince a court that lead was a widespread hazard throughout its public housing. In 1989, city lawyers sued companies that made lead paint, accusing them of knowingly selling a poisonous product, much like the successful lawsuits against tobacco companies.

The lawsuit failed. Nycha’s own design specifications showed that except for two developments — Williamsburg in Brooklyn and the Harlem River Houses in Manhattan — the authority had never used the specific type of lead at issue in the suit. Another type of lead was more commonly used on building components, but that was not part of the lawsuit.

From 1998 to 2004, Nycha inspectors testing a sampling of apartments concluded there was lead in less than a third of its buildings. The inventory became a kind of bible: The apartments in buildings that were not on the list, such as the Gowanus and Tompkins Houses, where Mikaila lives, were assumed not to have lead.

But Nycha could have known something was most likely wrong with the list by watching its youngest tenants.

New York State law requires children to be screened for lead exposure even before they can walk, and annually up to age 6 if they are at particular risk. If the amount of lead in a child’s blood hits a certain threshold, it sets off a mechanism: The doctor contacts the city Health Department, which sends an inspector to test the child’s home for lead paint.

In 2015, 171 children in New York City public housing tested positive for elevated lead, down from 517 in 2010.

The Housing Authority was ordered by the Health Department inspector to remove lead in a child’s apartment an average of two dozen times a year from 2010 through 2017, records show.

But city and court records show the authority rebutted the Health Department’s findings as a matter of routine.

The Health Department would do a test using an X-ray fluorescence device called an X.R.F. analyzer, which looks like a ray gun and can measure lead through layers of paint. Nycha would follow up by digging out samples in the apartment and sending them to a lab, called a paint-chip test.

In a 1999 affidavit, Brian Clarke, then the coordinator of the Housing Authority’s lead detection and abatement unit, disparaged the paint-chip technique. “A false negative can result,” he said.

But the paint-chip test eventually became the Housing Authority’s preferred method to challenge the Health Department’s tests.

“By the course of business, when we issue violations, Nycha does their own check,” Michel Meulens, a Health Department inspector, testified in October 2017 in a trial involving a child who tested positive for lead in 2003. The case ended in a settlement.

Mr. Clarke, who declined to comment for this article, would eventually rise to the upper echelons of the authority, as a senior vice president for operations. He was one of several top executives to be forced out late last year over his handling of the lead paint scandal. Shola Olatoye, the chairwoman of Nycha, was also among the executives who were ousted. She declined a request for comment.

The goal in challenging the Health Department’s findings, much like it was for the paint companies years before, was to shield the city from lawsuits by showing that the high lead levels in these children came from somewhere other than the home where they lived and played, officials said. The stakes are high: In January, a jury ordered Nycha to pay $57 million to the family of Dakota Jade Taylor, a child with high levels of lead in her blood. The sum is being negotiated.

The authority believed its approach was valid because the Health Department so often rescinded its orders, Stanley Brezenoff, the interim Nycha chairman, said recently in an interview.

The Housing Authority cannot say precisely when it began challenging the city’s own findings of lead. Staffers recalled that the practice dates at least to the late 1990s, Jasmine Blake, an authority spokeswoman, wrote in an emailed statement.

It continued until September when, after inquiries from The Times, the de Blasio administration reversed course.

“We are now in a posture of not contesting,” Mr. Brezenoff said. “Whatever the merits of a particular case, or whatever is involved, we’re accepting whatever the finding of the Health Department is.”

It was a lesson private landlords learned years ago. “There’s a concern and a fear on the part of the owners about liability. They just do it,” said Frank Ricci, director of government affairs for the Rent Stabilization Association, which represents residential building owners. “The landlord calls a certified contractor to come and correct the condition wherever D.O.H. has designated they’ve found lead.”

Warring between government agencies has bewildered families.

Deborah Morrison, 51, a substitute teacher and resident of the Gowanus Houses, recalled when her son, Saheed, tested positive for lead in 2010. Now a soft-spoken 11-year-old, Saheed excels in designing cartoon characters on his phone but needs special help in school.

Saheed Morrison, 11, near a wall where lead was detected in his home at the Gowanus Houses in Brooklyn. Health inspectors found lead at a second location in the apartment, but the Housing Authority disputed the findings.

Ms. Morrison said Housing Authority workers used gypsum board to cover a portion of her bedroom wall, close to where Saheed had slept in a crib for the first years of his life.

She did not realize until told by a reporter that the authority had successfully disputed a Health Department finding of lead paint in a second location, on a hallway pipe. No work was done there, according to city records.

“See, now you got me, because I didn’t even know there was two,” she said.

Lead paint had been found during renovations in the mid-1990s in the Gowanus Houses, Barry Stabile, a former Housing Authority employee involved in the work, said in an interview. But based on its sampling from 1998 to 2004, the Housing Authority did not include the Gowanus Houses on its list of complexes assumed to have lead paint. That did not change even after Saheed tested positive, and the authority worked on his home.

So when federal regulators visited the Gowanus Houses in 2015 on a routine inspection — when Saheed was 8 and Mikaila barely 2 — they did not treat the peeling paint they saw almost everywhere as a health hazard, according to HUD.

Neither did the Housing Authority. For both agencies, the deteriorating paint was just a maintenance problem.

This year, Mr. de Blasio promised to spend $80 million for testing next year to figure out, once and for all, where the lead paint is. The city will be inspecting apartments built before 1978, approximately 140,000 units out of 176,000 that the Housing Authority maintains, and the inspectors will be relying mainly on X.R.F. analyzers for the hunt.

In the late summer of 2016, as the city scrambled to reinspect apartments for lead paint hazards, Mikaila’s blood lead level hit 37 micrograms per deciliter, nearly eight times the amount that prompts Health Department action.

After the Housing Authority told the Health Department that the lead could not have come from its apartments, Mikaila’s family said she was still not herself, by turns lethargic and hyperactive. Occasionally, said her grandmother, Ordeen Broomes, she wailed with discomfort. A third blood test in late September 2016 showed she still had very high levels of lead.

So the Health Department returned to both apartments and again found lead, according to city records, this time in dust on the floor. At this point, the Housing Authority relented. Workers came with a bucket of cleanser and a special vacuum to suck up the dust.

But no one looked for the source of the lead-riddled dust, according to city records reviewed by The Times. The Housing Authority declined to comment on Mikaila’s case, citing the pending litigation.

Mikaila, now 5 and a kindergartner, has not required any special attention at school, her mother said. Still, said Max Costa, a professor and chairman of environmental medicine at New York University School of Medicine, her experience is “going to totally affect her life, and there’s no way you can reverse it.” The family’s observations are consistent with those effects, Mr. Costa said.

Ms. Broomes, who works for the Parks Department, wants to get her family out of public housing. But it is a struggle.

On a recent evening, she sat at her dining room table holding her head in her hands. A cockroach fell from a kitchen cabinet. Another climbed the wall.

About a year after Mikaila tested positive for lead, maintenance workers painted, patched over a large hole in the wall and laid new tiles on top of her crumbling linoleum floor, Ms. Broomes said. Problems persisted, she said, but saving money for a private apartment or a house was difficult.

As she spoke, Mikaila, sitting beside her, arched her eyebrows at the thought of a house.

“I want stairs for my room,” Mikaila said. “I want stairs so I can go up the stairs so I can go to my room. I want to get a back garden and I want to plant some seeds.”

"U.N. Environment Chief Quits Over Travel Expense Report"

"UNITED NATIONS - U.N. environment chief Erik Solheim said on Tuesday he had resigned after receiving a final audit of his official travel.

A draft of the report found he traveled for 529 out of the 668 days audited, spending $488,518 with no regard for the rules, Britain’s The Guardian newspaper reported in September.

U.N. Secretary-General Antonio Guterres had accepted Solheim’s resignation, which would be effective from Thursday, U.N. spokesman Stephane Dujarric told reporters.

Regulators close Maine’s shrimp fishery for next 3 years

By DAVID SHARP

November 16, 2018

PORTLAND, Maine (AP) — Regulators voted Friday to close the Gulf of Maine winter shrimp season for another three years, raising fears that the fishery decimated by rising water temperatures may never bounce back.

The Atlantic States Marine Fisheries Commission has been taking a year-to-year approach to determining whether to allow a winter season, but the panel decided to shut it down for 2019, 2020 and 2021 after receiving a dismal report on the depleted fishery.

The fishery has been shut down since 2013 in Maine, New Hampshire and Massachusetts.

“The stock has shown very little signs of recovery. It’s considered a depleted resource,” said Tina Berger, spokeswoman for the agency.

Fishermen, the bulk of them from Maine, used to catch millions of pounds of the shrimp every winter.

But the warming ocean and predation have decimated the shrimp fishery. The shrimp are especially sensitive to changes in water temperature, Berger said.

Maine Marine Resources Commissioner Patrick Keliher supported another one-year moratorium on the fishery but voted against closing it down for three years.

He offered successful motions to create a team to adjust the management strategy to account for climate changes and made a motion to evaluate the effectiveness of the summer shrimp survey, both in hopes of having the best information for future decision-making.

“Climate change is driving the decline in this fishery. He’s trying his best to put forward ideas for change and improvement in the science and in the management, to provide the best opportunity for a fishery in the future,” said Jeff Nichols, spokesman for the Maine Department of Marine Resources.

The shrimp, called Maine shrimp or northern shrimp, are small and pink, and prized by New Englanders. They have been mostly unavailable to U.S. consumers since the shutdown, though they are also harvested by Canadian fishermen.

Antarctic scientists begin hunt for sky’s ‘detergent’

20 NOVEMBER 2018

Ice records pre-industrial levels of a chemical that scrubs the atmosphere of greenhouse gases.

Nicky Phillips

Antarctic ice records historic levels of a chemical that removes greenhouse gases from the atmosphere. Credit: Mario Tama/Getty

To understand how the sky cleanses itself, a team of Australian and US researchers is heading to Antarctica to track down the atmosphere’s main detergent. By drilling deep into polar ice, the scientists hope to determine how the sky’s capacity to scrub away some ozone-depleting chemicals and potent greenhouse gases has changed since the Industrial Revolution — information that could help to improve global-warming projections.

The first members of the project travelled to Law Dome, their drilling site in East Antarctica, this week. There, they hope to capture the first historical data on concentrations of the dominant atmospheric detergent, the hydroxyl radical. This highly reactive molecule, made of an oxygen atom bonded to a hydrogen atom, breaks down about 40 gases in the air. They include methane and hydrofluorocarbons, but not the most prevalent greenhouse gas — carbon dioxide.

Although studies [1] of other atmospheric gases have been used to infer the abundance of hydroxyl over the past four decades, atmospheric chemists still refer to the chemical as ‘the great unknown’.

“We have been more or less in the dark when it comes to how hydroxyl has evolved from pre-industrial times to present day,” says Apostolos Voulgarakis, an environmental scientist at Imperial College London. “This new research endeavour can provide unprecedented information on hydroxyl variations in the deeper past, which is exciting.”

Over two and a half months, the team will drill at least two ice cores — three if time allows — down to depths of about 230 metres. They will then melt the cores to extract bubbles of air that were trapped as the ice froze. The samples will represent the atmosphere back to about 1880, before emissions of greenhouse gases from human activity started to increase.

Hydroxyl radicals form naturally in the atmosphere in a reaction involving ultraviolet rays, ozone and water vapour. But because the radicals last only about a second before they react with other gases and break them down, the team will instead measure, as a proxy, the tiny fraction of carbon monoxide that contains the carbon-14 isotope.

Carbon-14 in carbon monoxide is produced in the atmosphere by cosmic rays at a known rate, and is almost entirely removed by hydroxyl. Because of this, scientists can use the trend in its abundance to infer the trend of the radical, says David Etheridge, an atmospheric chemist at the Commonwealth Scientific and Industrial Research Organisation (CSIRO) in Aspendale, Australia, and a co-leader of the drilling project.
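The inference rests on a simple steady-state argument (a simplified sketch of the reasoning, ignoring transport and minor sinks, not the project's actual retrieval method): if cosmic rays produce carbon-14-bearing carbon monoxide at a roughly known rate P, and hydroxyl removes it with an effective rate constant k, then

\[
\frac{d[^{14}\mathrm{CO}]}{dt} \;=\; P \;-\; k\,[\mathrm{OH}]\,[^{14}\mathrm{CO}] \;\approx\; 0
\quad\Longrightarrow\quad
[\mathrm{OH}] \;\approx\; \frac{P}{k\,[^{14}\mathrm{CO}]},
\]

so a higher concentration of carbon-14 monoxide trapped in the ice implies lower hydroxyl at the time the air was sealed in, and vice versa.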

But measuring levels of carbon monoxide that contain carbon-14 is tricky, because there are only a few kilograms of it in the atmosphere, says Etheridge. “And we’re trying to measure a bit of that over the last 150 years in the Antarctic ice.”

There is also a risk that the ice cores will become contaminated with external sources of carbon-14 from cosmic rays. This high-energy radiation cannot penetrate the ice, but the moment the cores are removed, they are at risk of exposure. This would interfere with the signal the team is trying to measure, says co-leader Vasilii Petrenko, an ice-core scientist at the University of Rochester in New York. To avoid that risk, the researchers will melt the ice and extract the air on-site.

Organizing the equipment to do this and transport it to a remote ice sheet has been a huge logistical challenge, says team member Peter Neff, an ice-core scientist at the University of Washington in Seattle.

Tractors pulled giant sleds loaded with equipment to the Law Dome drilling site, which is more than 130 kilometres from the nearest research station. And it will take the team 36 days to melt the ice they need to get enough air samples. “It's a marathon, not a sprint,” says Neff.

The project is co-funded by the Australian Antarctic Division and the US National Science Foundation.

Once the researchers return from Antarctica, the team will assess the levels of carbon-14 in carbon monoxide by converting the gas into carbon dioxide and then into graphite, from which the isotope can be measured. The scientists can then use the information to infer how hydroxyl levels in the Southern Hemisphere have changed over time.

Up to now, information on historical trends in hydroxyl levels has come solely from atmospheric models; these simulations suggest [2] that concentrations remained fairly stable from 1850 until the 1970s, when they started to rise. The increase was mainly because of a boost in atmospheric warming at the time, says Voulgarakis.

The data collected from Law Dome will help to determine whether the atmospheric models have captured this trend correctly, says Matt Woodhouse, a climate modeller at CSIRO, who will use the information to improve Australia’s global chemistry-climate model, called ACCESS. “Our ability to resolve hydroxyl won’t revolutionize climate models, but it’ll increase our confidence in them.”

And accurate pictures of hydroxyl’s historical and current atmospheric concentrations are essential for developing better projections of its future levels, says Voulgarakis. This will then enable more-accurate projections of the future abundance of gases that affect climate — such as methane, ozone in the lowest layer of the atmosphere, and aerosols — that hydroxyl scrubs from the sky, he says. This would make it easier to determine the gases’ potential contribution to global warming.

References

1. Krol, M., Jan van Leeuwen, P. & Lelieveld, J. J. Geophys. Res. Atmos. 103, 10697–10711 (1998).

2. Voulgarakis, A. et al. Atmos. Chem. Phys. 13, 2563–2587 (2013).

There’s a fight brewing in D.C. over the future of the green movement

By Zoya Teirstein on Nov 20, 2018

Something weird is happening around climate change right now — and it’s not just that rising average temperatures are throwing our entire planet out of whack. Typically an issue politicians on both sides of the aisle avoid, climate has been a topic of heated conversation on the Hill ever since the Democrats took the House on Nov. 6. What gives?

The reinvigorated dialogue around climate is due, at least in part, to a group called the Sunrise Movement. Representative-elect Alexandria Ocasio-Cortez joined 150 Sunrise protestors in House Minority Leader Nancy Pelosi’s office last week for a sit-in to demand an economy-wide plan to address climate change. The activists and a small number of progressive Congressional Democrats (most of them newly elected), are pushing for something called a Green New Deal — kind of like the 1930s version but for green jobs. (Sunrise Movement cofounder Varshini Prakash was a member of the 2018 Grist 50.)

But if you think the plan went over well with everyone who understands climate change, you’d be mistaken. Many politicians on both sides of the aisle prefer a market-driven approach that could hypothetically garner bipartisan support. The activists argue that neither political party, especially not the Republicans, has come to the table with the kind of solution necessary to avert climate catastrophe. To that end, on Tuesday, Sunrise Movement members staked out Congressional representatives, like Democrats Barbara Lee of California and Jan Schakowsky of Illinois, to ask them for their support on a Green New Deal.

The protests shine a spotlight on the rebirth of two very different approaches to climate change solutions: sticking with compromise tactics, such as a carbon tax that can appeal to people on either side of the political spectrum, versus a balls-out, last-ditch effort to create a green America. Proponents of each think they have the more realistic approach. As we hurtle closer to 2 degrees Celsius of warming, the split between these two groups is widening into a chasm.

One of the people rankled by the activists’ efforts to strong-arm Pelosi is Representative Carlos Curbelo of Florida, the Republican who co-founded a bipartisan climate change caucus in the House of Representatives two years ago (which earned him a spot on our 2017 Grist 50 list). This past Election Day, Curbelo lost his seat to a Democrat, Representative-elect Debbie Mucarsel-Powell. The Sunrise demonstrations still didn’t sit well with Curbelo, who called the protestors’ actions “truly deplorable.” In response, the young activists called him a phony.

There’s reason to think that Curbelo really believes his vision for reining in emissions is the right one. This summer, he introduced the Market Choice Act — a carbon tax that went approximately nowhere, but, as Curbelo said, laid the groundwork for similar taxes in the future. He was one of only a handful of candidates, blue or red, who ran midterm ads that mentioned his position on climate change. And he wasn’t shy about bringing up climate change on the Hill over and over again, even while the rest of his Republican colleagues ignored the issue and condemned solutions.

But Curbelo’s political legacy isn’t all green. He voted in favor of President Trump’s tax plan that opened up parts of the Arctic Refuge for oil exploration, took money from energy companies in his bid for reelection, and recently caught flak for calling people who made the link between hurricanes and climate change “alarmists.”

Curbelo says he plans on continuing his climate-related work after he steps down in January — and he’s still got his eye on a carbon tax. But Sunrise activists aren’t giving up either. Serious climate legislation won’t get through the Republican-controlled Senate for a long time. In the meantime, the Democratic Party has a choice: stick with its old, bipartisan approach (albeit now with fewer Republican moderates to reach across the aisle to), or break off from the middle like a piece of Arctic ice.

We might not have to wait long to see which road Democrats take. Capitalizing on the zeitgeist, Senator Bernie Sanders announced on Monday that he’ll host a town hall dedicated to climate solutions next month. The 90-minute event is meant to galvanize support for fundamental changes in America’s energy policy — exactly the kind of solution for which Sunrise and Ocasio-Cortez are gunning. If this keeps up, veteran politicians may soon be forced to confront an approach that has been, until now, safely sequestered on the sidelines.

Generation Climate: Can Young Evangelicals Change the Climate Debate?

For students at this top evangelical college, loving God means protecting creation. That includes dealing with the human sources of climate change.

BY MEERA SUBRAMANIAN, INSIDECLIMATE NEWS

NOV 21, 2018

WHEATON, Illinois — Diego Hernandez wasn't thinking much about climate change until last summer, when he was traveling with his family along the Gulf Coast in his home state of Texas, where his ancestors—cowboys and politicians, he said—reach back to the 1600s. His mother suggested they take the "scenic route" for that summer drive, Diego said, his fingers making air-quotes because there was nothing "scenic" about it. All he saw were oil refineries.

"At that moment," said 19-year-old Diego, who considers himself a libertarian, "the switch kind of flipped for me." Why are we putting refineries in this beautiful place? he thought. The impacts from Hurricane Harvey, which had hit Houston the previous August and had affected some of Diego's relatives, were also still lingering in his mind.

"I used to be like, oh, there's oil, go start drilling, you know, because of course it's all about the money, right?" he said, his voice tinged with sarcasm. But after that family outing, he began to ask questions—"What is it doing to our environment? How is it going to affect us in the next 10 to 50 years?"—and since then he's had climate change on his mind.

Diego is a clean-shaven, lifelong Christian wearing a cyan blue button-down and polished cowboy boots, and a sophomore at Wheaton College in Wheaton, Illinois, which has been called the Harvard of Christian schools. The entrance sign, framed by a glowing bed of zinnias in full bloom, pronounces the school's motto: "For Christ and His Kingdom." But while Diego has all the credentials of a true political conservative—president of Wheaton's Young Americans for Freedom chapter, a cabinet member of the College Republicans—he also finds himself genuinely baffled by the right's stance against acting on climate change.

While many evangelicals are preoccupied with the long-term state of human souls and the protection of the unborn, Diego and the other students I met at Wheaton are also considering other eternal implications and a broader definition of pro-life. They are concerned about the lifespan of climate pollutants that will last in the atmosphere for thousands of years, and about the lives of the poor and weak who are being disproportionately harmed by the effects of those greenhouse gases. While Diego was just shy of eligible voting age in the 2016 presidential election, he's old enough to vote now. He and other young evangelicals thought hard this year about the politicians on offer, the issues they stand for, and who deserved their votes.

What's an Evangelical to Do?

Evangelical Protestants—one in four American adults—are a political powerhouse. They are the single largest religious group in the nation, and they are nearly twice as likely to be Republican as Democrat. And while Baby Boomers are currently the strongest political voting bloc, that's only because the older you are, the more likely you are to vote.

The current crop of younger people—from Gen X to Millennials to the newly minted adults I met at Wheaton—are poised to dominate the eligible-voter body politic. They would definitively tip the voting scales—should they become engaged. There are signs they might be doing just that. From the Parkland school shooting victims to Millennial political candidates, the youth of America are speaking up. And, significantly, they accept the scientific consensus on climate change at a much higher rate than their elders.

This is true even of young evangelicals, as the existence of the Young Evangelicals for Climate Action (YECA) attests. YECA is a ministry of the Evangelical Environmental Network that aims to mobilize students, influence religious leaders and pressure lawmakers into passing legislation to address climate change. I met Diego at a climate change discussion event on campus that was organized by Chelsey Geisz, a Wheaton junior and a YECA climate leadership fellow.

From Colorado Springs, Colorado, Chelsey, 20, always loved nature, she told me as we sat together in a gazebo in Adams Park, near campus. She'd taken a few classes on sustainability at Wheaton, and last year spent time working at Eighth Day Farm in Holland, Michigan, where Christian volunteers have turned the dirt once trapped below strip mall pavement into garden plots to grow vegetables for the hungry. These experiences meant she was primed when she heard about YECA.

Lindsay Mouw, program coordinator for Young Evangelicals for Climate Action, speaks with students at a training session in Indiana. The group works to mobilize other students and influence religious leaders to pressure lawmakers on climate action. Credit: Tori Goebel

Though non-partisan, YECA is targeting conservatives, since that's where the facts of climate change have failed to lead to action. According to the organization, they've engaged more than 10,000 young evangelicals so far. Along with Chelsey, there are another half-dozen fellows at other schools across the country, helping to build the grassroots movement. The fellowship includes a summer training session that covers the science of climate change, as well as the socio-cultural and religious aspects of the issue. As a YECA fellow, Chelsey organizes campus events such as the session I attended in September, and she serves as Wheaton's executive vice president of campus sustainability, a new position that YECA helped develop.

When it comes to climate change and the environment, "I think there's some misunderstanding about what our faith compels us to do," said Chelsey Geisz.

It can be tough to be an evangelical who cares about climate change, Chelsey said, "because the environmental activists don't trust you and the evangelicals hate you." Or they could hate you; she was quick to point out that the evangelicals she knows personally are generally tolerant of her views. "I'm not encountering anyone at Wheaton, even among my most conservative friends, who disagree with climate change," she told me. She's having a harder time with her father, though, who is troubled by her YECA work. He holds a Harvard law degree, works at a company that invests in resource-rich properties, and associates Chelsey's transformation into a "climate activist" with a liberal agenda he finds suspect. "For a man who has such well-reasoned opinions, I just feel like there's so much emotion for him that it's not about the science at all," she said.

As for liberals themselves, Chelsey said, some of them do treat evangelicals like her with some suspicion. After all, aren't evangelicals the ones who elected anti-environment Trump?

"I think there's some misunderstanding about what our faith compels us to do," she said as the sun set behind her, creating a halo around the edges of her auburn hair.

Praising Natural Systems

Sean Lyon is a recent Wheaton graduate who was also a YECA fellow while he was in school. He feels that he was born to love the natural world; his first word as an infant was "bird," after all, and flying creatures remain a passion he can't quite explain. While in school, he created his own interdisciplinary major of biology and business and spent significant time in Tanzania working with ECHO East Africa, a faith-based sustainable agriculture organization. He still lives in the town of Wheaton, easy commuting distance to Chicago, where he's volunteering at the Field Museum of Natural History.

"If you focus too much on only a personal relationship being the core tenet of your faith, then it means that you're more easily able to marginalize topics like human suffering, which in some cases is spurred by climate change. We are embodied creatures in this planet, so let's live like we are," said Sean Lyon. Credit: Meera Subramanian

Sean, 23, grew up in upstate New York, among "classic North American white evangelicals," where climate was not a concern and politics were conservative. But his love of the natural world shifted his perspective. He saw heaven on earth, and something worth saving, in every wingbeat he witnessed.

"Every ecosystem carries His creativity in it," Sean said, "and every species is a mark of His design." He had a thick brass bangle encircling his wrist, and blue eyes behind clear Lucite-rimmed glasses. Sean drew an analogy to his sister and grandparents, who are all artists. "So how would I treat the art that they created? If I love them, then I'm going to treat their art well. I'm not going to deface it. I'm not going to ignore it. I'm going to really honor it. And so when I see my God as having created everything that I'm interacting with, I want to honor it because that's a way that I can show my love for this Creator."

But God didn't just create singular works, Sean said; he created systems, natural systems that every living being relies on. He hoped that all Christians—no, he corrected himself, all faiths—would unite to protect those systems.

Climate science isn't questioned at Wheaton College the way it often is in the wider evangelical community. The school is a brick-and-mortar rebuttal to the myth that science and religion must be at odds with each other. When Wheaton students step into their state-of-the-art science building, for instance, they are greeted with signs stating that a "sound Biblical theology gives us a proper basis for scientific inquiry," and a display featuring locally excavated Perry the Mastodon, which carbon dating shows to be more than 13,000 years old.

The school is not alone in intertwining commitments to love God and protect the earth, often referred to as "creation care." The Cape Town Commitment, a global agreement between evangelical leaders from nearly 200 countries, includes acknowledgement of climate change and how it will hurt the world's poor (and it is required reading for Wheaton freshmen). Katharine Hayhoe, an atmospheric scientist at Texas Tech University and an evangelical, has been an outspoken advocate for climate action. And in addition to YECA, there are numerous groups active in this arena, including the Evangelical Climate Initiative, Climate Caretakers, Care of Creation and A Rocha.

Wheaton College is a brick-and-mortar rebuttal to the myth that science and religion must be at odds with each other. Credit: Meera Subramanian

In late 2015, the National Association of Evangelicals (NAE)—the biggest umbrella group of evangelicals in the country, representing 43 million Americans—issued a statement accepting climate change, acknowledging the human contribution to it and encouraging action. YECA's advocacy helped bring that statement, called "Loving the Least of These," into being. In it, NAE argues that Christians should be compelled to care about climate change as a matter of social justice, casting those without the resources to adapt to failed farming or dry wells or rising seas as the modern-day equivalents of the widows and orphans of Jesus's day.

When Chelsey reads the Bible, she hears this gospel of social justice, too.

"Instead of talking about climate change," she said of her work as a YECA fellow, "I talk about environmental justice. There's definitely a guilty complex, especially among the white evangelical community, about how complicit we've been, and apathetic. People really want to redeem that."

Chelsey's framing reveals that she is steeped in a liberal arts ethos friendly to intersectionality, the idea that humanity's ills, which disproportionately affect the most vulnerable, cannot be conquered until root causes are addressed. This perspective is shaping academic dialogue in both secular and faith-based schools.

But does fighting climate change detract from evangelism? Here there's a rift within the evangelical community. Should the emphasis be on saving souls or saving God's creation? And are the two really at odds?

A sign in a Wheaton College science building reminds students about lab safety by asking: What Would Jesus Do (in chemistry lab)? Credit: Meera Subramanian

"That's the Billy Graham evangelicalism," Chelsey said of the personal salvation perspective, referencing Wheaton's most famous alumnus. "It's your faith between you and Jesus." But the problem with that approach, she said, is that it doesn't force Christians to deal with larger systems of injustice. "The evangelical community is really limited when it comes to talking about systemic and structural sin rather than individual sin. Most of us have never heard about systemic racism and climate change in church," she said. Even as evangelical organizations embrace the need for action, the message isn't coming across from the pulpit. "These things never come up because they're apparently not gospel issues," Chelsey said, "But at Wheaton, we think they are."

For Sean, there's not one speck of conflict between his love of God and the gospel and his fierce desire to see action on climate change. They're complementary, he said.

"If you focus too much on only a personal relationship being the core tenet of your faith, then it means that you're more easily able to marginalize topics like human suffering, which in some cases is spurred by climate change," he said. "We are embodied creatures in this planet, so let's live like we are."

Could his concern for the climate be a threat to his faith? I asked him.

"Actually, I see more of a threat in the idea that we can divorce our lives on this earth and the lives of other people and the lives of other creatures from our life of faith," Sean said. Better to revel in God's love. "How much deeper and how much more beautiful is a way of loving Him that involves my whole being and the whole world around me rather than just simply the status of my soul?"

Abortion was the entry point into American politics for many evangelicals, after the Supreme Court affirmed abortion rights in Roe v. Wade in 1973. Before that, evangelicals were generally unconcerned about abortion rights, which had the uncontroversial support of Republicans; they were also generally disengaged from voting. Today, the single-issue anti-abortion preoccupation of many evangelicals, now considered a given by many political leaders, confounds some of the young evangelicals I met at Wheaton.

"If we say we're pro-life, we have to care for people who are experiencing incredible environmental degradation and so directly affected by climate change," Chelsey said. "If we're pro-life, that's a bigger issue to me than abortion."

Sean agreed. "So many people are now saying, okay, if you're going to be pro-life you have to be pro all-of-life, lifelong pro-life, which has primarily come up in the immigration debate. If you're pro-life, how can you be separating children from their parents?"

"I used to be like, oh, there's oil, go start drilling, you know, because of course it's all about the money, right?" Diego Hernandez said, his voice tinged with sarcasm. But "what is it doing to our environment? How is it going to affect us in the next 10 to 50 years?" Credit: Meera Subramanian

Diego sees it a little differently. "Abortion is definitely a deal-breaker for me," he said, even though he's not generally a one-issue voter. He echoed Sean and Chelsey to some degree, agreeing that "being pro-life doesn't just mean being pro-life to the baby at birth. It also means the life of the mother and the life of the baby after birth." But when he watched the 2016 presidential debates, he found himself agreeing with some of Hillary Clinton's points ... until he was appalled by what he saw as her "gung-ho" support of abortion rights. He decided he could just not get behind someone with those views.

Young evangelicals wrestle with these difficult choices in the voting booth, confronted with either/or candidates, unsure who will best represent their hopes for life on earth, all life, all of God's creation. Right now, Christians who oppose abortion rights typically have only one party to get behind. And it's that party, represented in the White House, that is aggressively rolling back climate protections, from pulling out of the Paris climate accord to promoting coal.

Future Powerhouse at the Polls?

Diego, Chelsey and Sean are the future. This younger generation has grown up with the realities of climate change and political polarization since they were swinging on monkey bars, and they aren't hesitating to break rank with evangelical Baby Boomers on the issue. They remain faithful and politically conservative for the most part, but they are more concerned about a climate that they will have to live with much longer than those boomers heading into retirement. The shift aligns with a recent Pew poll that found that among Republicans, young adults were far less likely than their elders to support reliance on fossil fuels.

"Every one of the people who I've talked to who's come to my events and engaged in climate issues from a Christian perspective said, 'My parents don't agree with me,'" Sean told me.

Wheaton College students have an active role model in former U.S. Rep. Bob Inglis, a Republican who has been leading a conservative movement for clean energy and action on climate change. Credit: Meera Subramanian

But even with this clear shift toward accepting climate science among young Americans, the quandary for young evangelicals in the voting booth remains.

Sean, who said he couldn't in good conscience vote for either party, opted for Jill Stein in 2016.

Chelsey, as a busy freshman in 2016, followed in her father's footsteps and voted for Trump. Her father had been singularly focused on getting a Republican on the Supreme Court. Now, she hangs her head about the decision.

Diego, about to vote in his first election, grew up in a struggling, hard-working family in San Antonio. His father showed him how to mow lawns when he was six, he said. His mother would pick up her raggedy old Bible and tell Diego, "This is what you should base all of your beliefs and all your values on. It shouldn't be what you hear from someone on TV or C-SPAN or NPR."

Lindsay Mouw and other YECA members brought their message of climate action in God’s name to Washington, D.C., during the People's Climate March.

Surveys show that the way people view climate change is determined more by political affiliation, along with race and ethnicity, than by religious affiliation. So while 81 percent of white evangelicals voted for Donald Trump, it's important to remember that about a quarter of the country's evangelicals are not white, and it is among minority groups that the evangelical community is growing. And on the issue of climate change, Diego's Latino background makes him part of the American demographic that is most concerned about climate change. He wonders whether his mother deliberately pushed for that "scenic route" to wake him up a little.

What are the choices for these faithful young? With church membership in decline and the Republican party in flux, how vocal these young people are could shape the future of the climate debate. If the Christian right wants to hold onto the next generation, getting right with the planet might prove as important as getting right with God.

Many concerned about the environment rally for more evangelicals to understand climate change and embrace leadership positions on the issue. "It would be a milestone if you managed to take influential evangelists—preachers—to adopt the idea of global warming, and to preach it," Nobel Prize-winning economist Daniel Kahneman told the host of Hidden Brain, an NPR science show. "That would change things. It's not going to happen by presenting more evidence, that is clear."

And in the book The Creation: An Appeal to Save Life on Earth, renowned biologist E.O. Wilson wrote a long letter with the salutation, "Dear Pastor." It is an urgent, heartfelt plea. "We need your help. The Creation—living Nature—is in deep trouble. Scientists estimate that ... half the species of plants and animals on Earth could be either gone or at least fated for early extinction by the end of the century. A full quarter will drop to this level during the next half century as a result of climate change alone."

These new sermons and stories are unlikely to come from older pastors and preachers, most of whom have become representatives of the Republican Party platform that doesn't want to even acknowledge that climate change is an issue to discuss, let alone embark on the massive undertaking necessary to begin to solve it. But for the young, who will live with the catastrophic predictions that worsen with each new iteration of the UN climate report, there are new stories emerging. They are conversion stories of a new sort, springing from dirt once buried under Midwestern parking lots and held aloft on the wings of Sean's beloved birds. Preachers and politicians seeking to keep the young religious right in their midst may need to leap past the quagmire of a questionable climate change debate and get right to the root of finding solutions for the generations that will be living into the long tomorrow of a warming planet.

The face of the Green New Deal (she's not who you think)

Maxine Joselow, E&E News reporter Greenwire: Wednesday, November 21, 2018

Sunrise Movement co-founder Varshini Prakash protests in House Minority Leader Nancy Pelosi’s office earlier this month. @sunrisemvmt/Twitter

She's only in her 20s, and she's already the face of an ambitious push for Congress to tackle climate change.

Thinking of Rep.-elect Alexandria Ocasio-Cortez (D-N.Y.), a rising star in the progressive wing of the Democratic Party at just 29 years old?

Think again. Her name is Varshini Prakash, and at 25 years old, she serves as the co-founder and leader of the Sunrise Movement, a youth-driven grassroots effort to hold elected leaders accountable for a warming planet.

"It's fundamentally about the future that our generation wants to grow up in," Prakash said in a recent interview. "We see this fight as being about our right to access good jobs and a livable future."

The Sunrise Movement has made waves with its backing of the "Green New Deal," a progressive plan to wean the United States off fossil fuels, boost renewables and build a "smart" grid.

The group has lofty demands: It wants to revive a select committee on global warming in the House, which could produce draft Green New Deal legislation by 2020. It also wants incoming Democratic lawmakers to sign a pledge that they won't accept fossil fuel money.

To build consensus around these ideas, the group staged protests this month in the offices of House Minority Leader Nancy Pelosi (D-Calif.) and Democratic Rep. Frank Pallone of New Jersey (Greenwire, Nov. 16). Activists visited dozens of additional Democratic offices across the country yesterday, from New Hampshire to California.

But not everyone's on board. Disagreement is festering within the Democratic Party about the best way to address climate change in the next Congress, let alone to pass climate legislation — no small feat with Republicans controlling the White House and the Senate.

Prakash was born and raised outside of Boston by parents who originally hail from India. She attended the University of Massachusetts, Amherst, where she led the UMass Fossil Fuel Divestment Campaign, which declared victory after a two-week peaceful protest. She supported the creation of the Sunrise Movement this spring and now serves as lead spokeswoman.

E&E News recently spoke with Prakash about making big demands, staving off the "climate crisis" and battling for the "heart and soul of the Democratic Party."

Why did you decide to storm Pelosi's office?

During the week preceding this action, there were massive fires happening in California. Forty-eight people had perished. Entire towns had been leveled. And the U.N. report on climate had come out, saying we have just 12 years to radically transform our economy and society at scale to stop the climate crisis.

So a lot of us were really frustrated after the Democrats took back the House, when we heard Nancy Pelosi endorsing this bipartisan marketplace of ideas and saying she was going to revive this committee on climate change but not making other promises. It felt like this was grossly inadequate to address the climate crisis at hand. So we chose to do an action targeting Nancy Pelosi in order to say that we need our leaders to step up significantly on climate.

What was your experience of the protest like?

It was inspiring; it was emotional; it was moving. There were over 200 young leaders there, ranging from 17 to 35. It was so moving to see what young people are willing to do today to actually take action to stop the climate crisis.

Oh, and Alexandria Ocasio-Cortez came to our action. It was really meaningful for the young people who were there to see someone who had walked the halls of power encouraging us and lending her support.

Do you support Pelosi's bid to become House speaker again?

Well, we're still figuring that out a little bit. Our line right now is that we really want Nancy Pelosi to back a resolution that was put forward by Alexandria Ocasio-Cortez to create a new committee called the House Select Committee for a Green New Deal.

What do you mean exactly by a "Green New Deal"?

Not everybody agrees on what it means. But to me and to Sunrise, when we talk about a Green New Deal, we're talking about the massive transformation of our society and our economy that we will need to stop the climate crisis and act in accordance with what science and justice demand over the next 10 years.

That looks like a rapid wartime economic mobilization that gets us to a 100 percent renewable energy economy. It looks like ending fossil fuel subsidies. It looks like supporting the creation of tens of millions of good-paying jobs for Americans. And it looks like massive reinvestment in communities on the front lines of the climate crisis: communities of color and low-income communities. It would also include energy efficiency and retrofitting of businesses and homes.

It sounds like it includes a lot.

The list just goes on and on. It's a huge, massive undertaking. And a lot of this stuff sounds scary and technical and overwhelming. But the basic value underpinning a Green New Deal is that all people have a right to breathe clean air and drink clean water and live free from chaos, harm and violence. That is fundamentally what the Green New Deal is about, and that is fundamentally what we believe all people should have a right to on this planet.

Is the term supposed to remind people of Franklin D. Roosevelt and the New Deal?

It does harken back to that. I would say it has some significant upgrades. It inherently includes equity and justice at its core.

Would you like to see a Green New Deal folded into a broader infrastructure bill, or would that be too watered down?

I think that would be a great first step. If a massive infrastructure bill didn't include a climate component, I think that would be truly negligent. But also I think we can do more.

How do you envision the Select Committee for a Green New Deal functioning?

One of the issues with the committee that Pelosi has proposed is that it would be pretty toothless. It would have no actual funding. It would have no power to draft legislation. But the resolution that Ocasio-Cortez has put forward would give this committee the power to actually draft legislation and build more of the political consensus around what the real solutions to the crisis are.

So the committee would work on the first major climate legislation since the Waxman-Markey bill in 2009.

Right. Here's the thing: The last time that Democrats even made a major move to pass climate policy legislation was in 2009. That was almost a decade ago. I think I was 14 years old at the time. And we have seen nothing significant — especially nothing at the actual scale and scope of the crisis — since then.

What do you make of Rep. Frank Pallone's opposition to the committee?

Frankly, we don't trust Pallone's leadership at this time because he's taken over $100,000 from fossil fuel PACs [political action committees]. It feels hard to trust someone who takes money from the industry and who doesn't seem to be supporting a committee that would actually address the issue at scale.

In addition to supporting the committee, you're also asking lawmakers to refuse donations from the fossil fuel industry. How politically feasible are your demands, given that Republicans control the Senate and the White House?

With the committee, we're not planning on trying to ramrod any legislation through or anything like that, especially considering that Trump's the president and Republicans have the Senate. But we do think that it's imperative that we lay the groundwork now, given the timeline that we're working on.

You've mentioned Ocasio-Cortez as a champion of your issues. Which other new Democratic members are you excited about?

Sunrise actually endorsed 30 candidates this election cycle, and 19 of them won. All of them have taken the No Fossil Fuel Money Pledge, and all of them support aggressively working to stop the climate crisis. We're especially excited about incoming first-years like Deb Haaland [of New Mexico], Ilhan Omar [of Minnesota] and Ayanna Pressley [of Massachusetts].

We are really excited to engage with Ocasio-Cortez and others in this battle, which is really the battle for the heart and soul of the Democratic Party. Sunrise sees itself as a part of that, as pushing the party to first and foremost become a party of the people, not just of wealthy interests, and pushing the party to actually protect the interests of all working people — black, brown and white — for generations to come.

Are there any Republicans you think your group or your members could work with?

Potentially, yeah. We haven't had any come through the woodwork right now. And I think we'd need them to really back a Green New Deal or back a transition at the scale that we need. We haven't seen that yet. But I am definitely not putting it out of the picture.

Where are you from originally?

I'm from Boston. My family is from India, but I'm from Boston.

How old are you?

I'm 25.

Do some people take you less seriously because you're so young?

[Laughs] Definitely. I mean, they all do, yeah. Being young, being a woman, being a woman of color. All of those things add up. But it doesn't really stop me because fundamentally we need to do everything in our power to change this world. And that's just where I'm at.

This interview has been edited and condensed for clarity.

Scientists Find Large Amounts of Methane Being Released From Icelandic Glacier

NOVEMBER 21, 2018

Biogeochemist Peter Wynn of Lancaster University collects a sample of meltwater from the Sólheimajökull glacier in Iceland. CREDIT: DR HUGH TUFFEN

Scientists have discovered that Iceland’s Sólheimajökull glacier, which covers the active volcano Katla, is releasing up to 41 tons of methane every day through its meltwater during the summer months — equal to the methane produced by more than 136,000 belching cows.
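
The cow comparison is easy to sanity-check with back-of-envelope arithmetic. The short sketch below is illustrative only; the per-animal emission rate of roughly 0.3 kg of methane per day (about 110 kg per year) is a commonly cited ballpark and is not taken from the study itself.

# Back-of-envelope check of the cow comparison above.
# Assumption (not from the study): one cow belches roughly 0.3 kg of methane per day.
GLACIER_METHANE_TONNES_PER_DAY = 41   # summer figure reported for the glacier
COW_METHANE_KG_PER_DAY = 0.3          # assumed per-animal emission rate

equivalent_cows = GLACIER_METHANE_TONNES_PER_DAY * 1000 / COW_METHANE_KG_PER_DAY
print(f"~{equivalent_cows:,.0f} cow-equivalents per day")  # ~137,000, in line with the article's figure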

The research, published this week in the journal Scientific Reports, is the first major examination of whether melting glaciers, particularly those sitting on or next to active volcanoes, could become a significant source of methane as the climate warms.

“This is a huge amount of methane lost from the glacial meltwater stream into the atmosphere,” Peter Wynn, a glacial biogeochemist at Lancaster University and co-author of the study, said in a statement. “It greatly exceeds average methane loss from non-glacial rivers to the atmosphere reported in the scientific literature. It rivals some of the world’s most methane-producing wetlands and represents more than twenty times the known methane emissions of all Europe’s other volcanoes put together.”

Wynn and his colleagues collected meltwater samples from the front of the Sólheimajökull glacier and compared their methane concentrations with samples collected from nearby rivers and sediment. They used a mass spectrometer to determine that the methane from Sólheimajökull was being produced by microbial activity in the bed of the glacier, sped up by the heat of the glacier’s underlying volcano.

The scientists say their findings reveal a previously unknown source of methane emissions that could further disrupt the global climate system. “Both Iceland and Antarctica have many ice-covered, active volcanoes and geothermal systems,” Rebecca Burns, a geochemist and lead author of the study, said in a statement. “The recent Intergovernmental Panel on Climate Change (IPCC) report highlights that current trajectories indicate global warming is likely to reach 1.5 C between 2030 and 2052, with greatest perceived climate sensitivity at higher latitudes. If methane produced under these ice caps has a means of escaping as the ice thins, there is the chance we may see short term increases in the release of methane from ice masses into the future.”

Nanoparticles for improving smart-window energy efficiency

BY STEVE KOPPES |SEPTEMBER 7, 2018

U.S. buildings leak an estimated 30 percent of their energy through inefficient windows, costing consumers an estimated $42 billion annually.

But that could begin to change if efforts by the U.S. Department of Energy’s (DOE) Argonne National Laboratory are successful in commercializing a patented new process for synthesizing vanadium dioxide nanoparticles that makes manufacturing energy-efficient ​“smart windows” economical.

“There’s a need to develop a continuous process to rapidly manufacture such nanoparticles in an economical way and to bring it to the market quickly,” said Jie Li, an Argonne chemical engineer. Li and his Argonne colleagues received a U.S. patent for the process in May, and the technology is available for licensing.

Thermochromic smart windows work automatically to pass infrared energy in winter to keep buildings warm and block infrared energy in the summer to keep them cooler. The nanoparticle-based vanadium dioxide films have approximately twice the solar modulation values for high and low temperatures as thin films, Li said. Solar modulation is the amount of solar energy that the vanadium dioxide material can control at low and high temperatures. Further, the material acts with switch-like rapidity, transitioning from blocking infrared to passing it through in micro- or nano-seconds.

Thermochromic technology has attracted industrial interest for decades but has been a niche product because of its high cost and limited performance, said Ralph Muehleisen, who heads Argonne’s building science program. The problem has been that the best performing material for smart windows is vanadium dioxide in nanoparticle form. Until now, however, no one knew how to inexpensively produce the nanoparticle form in enough volume to support commercialization.

Continuous flow processing is a technology increasingly used in Europe that improves process and energy efficiency, as well as material performance. It also eliminates the need for potentially hazardous high temperature and pressure conditions, which require designing costly safety measures into the traditional batch manufacturing process.

The batch production process requires two or three days and involves manual insertion and extraction of the raw materials. The Argonne method, however, is a continuous-flow process involving high temperatures that reacts in a fraction of that time, taking only minutes. It also yields nanoparticles of more uniform size, which enhances the material’s energy efficiency. Output can be increased simply by networking multiple microreactors—the tabletop-sized devices in which the materials are synthesized.

“The use of nanoparticles increases performance and the continuous flow process we’ve invented reduces the cost of manufacturing them, so this is finally a technology that makes sense for window manufacturers to consider,” Muehleisen said. ​“Perhaps more importantly, though, the manufacturing process itself has applicability to all kinds of other materials requiring nanoparticle fabrication.”

Conventional thermochromic films incorporate ordinary vanadium dioxide material that responds at a much higher temperature than films made with element-doped nanoparticles. The conventional windows must reach 154 degrees Fahrenheit before they begin to block infrared heat. Windows containing vanadium dioxide nanoparticles that include tungsten achieve this critical transition temperature at 77 degrees Fahrenheit.
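
To make that switching behaviour concrete, here is a minimal illustrative sketch (not Argonne code) that encodes the two transition temperatures quoted above as a simple on/off threshold. Real vanadium dioxide films switch over a narrow temperature band rather than at a single hard cutoff, so this is a simplification.

# Toy model of thermochromic switching, using the transition temperatures quoted above.
# Simplification: real films switch over a narrow band, not at a single threshold.
TRANSITION_TEMP_F = {
    "conventional VO2 film": 154,
    "tungsten-doped VO2 nanoparticles": 77,
}

def blocks_infrared(material: str, window_temp_f: float) -> bool:
    """Return True once the window is warm enough to switch into its IR-blocking state."""
    return window_temp_f >= TRANSITION_TEMP_F[material]

for material in TRANSITION_TEMP_F:
    print(material, "-> blocks IR on a 95 F afternoon:", blocks_infrared(material, 95))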

The nanoparticle-containing windows do not require tinting to enhance their energy efficiency, unlike their conventional counterparts. Further, Muehleisen estimates that Argonne’s continuous-flow processing technique could make manufacturing the nanoparticles at least five times less expensive than conventional methods.

The smart window market was valued at $1 billion in 2014 and is expected to reach nearly $3 billion by 2021, according to NanoMarkets, LC. Existing energy-saving window technologies are inefficient, expensive, or both. Thermochromic windows could capture two-thirds of that market by 2020, according to Lux Research, an advisory firm that conducts independent studies on emerging technologies.

To further develop the vanadium dioxide thermochromic technology, Li and Muehleisen seek to reduce the particle size from 100 nanometers to 15 or 20 nanometers. At the smaller sizes, a line of 3,000 to 4,000 of the nanoparticles could fit across the diameter of a human hair. The smaller particles offer two advantages over their larger counterparts. They scatter less light, making window film more transparent; and they would modulate infrared heat better, making them more energy efficient, Li said.
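
That hair-width comparison checks out with simple arithmetic. The sketch below assumes a human hair roughly 60 micrometers across; actual hair varies from about 50 to 100 micrometers, so the count is only approximate.

# Back-of-envelope check of the "3,000 to 4,000 particles across a hair" comparison.
# Assumption: a human hair about 60 micrometers (60,000 nanometers) in diameter.
HAIR_DIAMETER_NM = 60_000

for particle_nm in (15, 20):
    count = HAIR_DIAMETER_NM // particle_nm
    print(f"{particle_nm} nm particles: ~{count:,} across one hair")  # ~4,000 and ~3,000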

The vanadium dioxide nanoparticles may also lend themselves to defense applications. The material could potentially scramble infrared beams used to measure room vibrations as a surveillance technique. With further research and development, the material might also offer protection for aircraft and other vehicles against laser weapons.

A team of material scientists, process engineers, energy scientists and building scientists from Argonne and commercialization specialists at the University of Chicago worked together to develop the vanadium dioxide nanoparticle technology under the auspices of the laboratory’s relatively young building program.

“I am looking to expand the buildings program to work more closely with our discovery science groups,” Muehleisen said. ​“There is a need to better understand some fundamental physics and chemistry of materials that have applications in building design. We need to really look outside the box to get to the next generation of materials that are adaptive/dynamic, better performing, lower cost to make and have lower environmental impacts.”

Argonne has made it a core mission to develop advanced manufacturing technologies, such as materials with advanced properties and manufacturing processes that are more energy efficient.

“Continued flow synthesis of VO2 nanoparticles or nanorods by using a microreactor” received U.S. Patent No. 9,975,804 B2 on May 22, 2018. The inventors include Jie Li, Yugang Sun, Ralph Muehleisen, Xiaojie Yan, Leah Guzowski, Samuel Dull and Ioannina Castano.

Parties interested in learning more about the technology, including possibly collaborating with Argonne on further development or licensing the technology, should contact partners@​anl.​gov.

The research was supported by Argonne’s Laboratory Directed Research and Development funding. The Center for Nanoscale Materials and the Advanced Photon Source, two DOE Office of Science User Facilities, were used to help develop this technology.

Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science.

The U.S. Department of Energy’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit the Office of Science website.

N.J. is first state to regulate toxic PFNAs in drinking water

by Frank Kummer, Posted: September 4, 2018

New Jersey has become the first state to regulate its drinking water for a man-made, toxic chemical compound once used in making nonstick cookware and now linked to a variety of health problems.

A new Department of Environmental Protection rule will cap the amount of compounds known as PFNAs, short for perfluorononanoic acid. For years, the state has been concerned about the level of PFNAs detected in water samples and has studied how the compounds were making their way into water. The state has even found some of the compounds in fish from recreational waterways and has begun issuing consumption advisories.

PFNAs are part of a large group of chemical compounds known as PFAS, short for per- and polyfluoroalkyl substances. The compounds were also used to make firefighting foam, stain-resistant clothing, and food packaging. They have been linked to low infant birth weights, effects on the immune system, cancer, and hormone disruption. PFAS can accumulate in the body and remain for long periods.

There are no federal standards for the compounds. Environmental Protection Agency officials under the Trump administration sought to block the release in June of a federal study on the same class of chemicals, which has contaminated water supplies near military bases and other areas, worrying it would cause a "public relations nightmare." Since then, the EPA has held a series of public forums on the compounds, including one in Horsham that drew hundreds of residents.

The New Jersey rule amends the state's Safe Drinking Water Act regulations to set a maximum contaminant level of 13 parts per trillion of PFNAs starting in 2019. It reflects Gov. Murphy's far more aggressive environmental policies, in contrast with the Christie administration, which declined to take up the issue. Environmental groups have long sought such regulation.
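
For a sense of scale, 13 parts per trillion is a vanishingly small concentration. The quick conversion below assumes a liter of water has a mass of one kilogram.

# What 13 parts per trillion means in one liter of drinking water.
# Assumption: 1 liter of water has a mass of 1 kg (1,000 g).
MCL_PARTS_PER_TRILLION = 13

grams_per_liter = MCL_PARTS_PER_TRILLION * 1e-12 * 1000   # fraction of 1,000 g
print(f"{grams_per_liter * 1e9:.0f} nanograms of PFNA per liter")  # 13 ng/L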

"Today, the state has met the challenge to protect people from exposure to PFNAs, one of the most toxic perfluorinated compounds known," said Tracy Carluccio, deputy director of the Delaware Riverkeeper Network.

PFNAs were first detected in the Delaware River watershed in Gloucester County in 2010, according to the Delaware Riverkeeper Network. The compound was found in a groundwater well in Paulsboro near the Solvay plastics manufacturing plant. The Paulsboro groundwater showed a concentration of 96 parts per trillion; higher levels were later found. The borough filed notice that it would sue Solvay, which led to the installation of a water treatment system to remove the compound.

Five municipalities in the state shut down contaminated wells because of the chemicals.

PFNAs are no longer used in manufacturing.

Turning waste into power: the plastic to fuel projects

By Scarlett Evans

Plastic to fuel projects are beginning to gain traction in the energy industry. Rising awareness of the environmental damage caused by single-use plastics, combined with persistently low recycling rates, is pushing researchers toward alternative disposal methods for our mounting plastic output.

Such projects use the chemical energy stored in the material's hydrocarbon structure to create fuel, a method praised for its economic and environmental benefits, though one that has for the most part remained at the developmental stage.

Now, these schemes are beginning to take off. Here Future Power looks at some of the most innovative examples.

Why turn plastic to fuel?

Estimates show that less than 5% of the plastic manufactured each year is recycled, with production of the material set to increase by 3.8% every year until 2030, adding to the 6.3 billion tonnes churned out since production began 60 years ago. Much of it ends up in our oceans, disrupting marine ecosystems, and researchers predict the material would take a minimum of 450 years to biodegrade, if it ever does.

The solution of plastics-to-fuel holds promise in not only curbing such pervasive pollution but also providing a significant economic benefit to regions. The American Chemistry Council estimates plastic-to-fuel facilities in the US alone would create nearly 39,000 jobs and almost $9bn in economic output, making the global market potential of such an industry huge.

Plastic-derived fuels also burn cleaner than traditional sources because of their low sulphur content, a notable advantage given that the majority of developing nations use sulphur-heavy diesel.

Plastic to hydrogen

Most recently, researchers from Swansea University have discovered a means of converting plastic waste into hydrogen fuel, which they say could one day be used to power cars.

The team added a photocatalyst, a material that absorbs sunlight and transforms it into chemical energy, to plastic products in a process called ‘photoreforming’. The plastic and catalyst combination was then left in an alkaline solution exposed to sunlight, breaking down the material and producing bubbles of hydrogen gas in the process.

The new method would be cheaper than current recycling options, as any plastic can be used without needing to be cleaned. According to The Balance Small Business, it currently costs around $4,000 to recycle a tonne of plastic bags, which often leads to plastic waste being burned or sent to landfill to avoid the expense.

Dr Moritz Kuehnel, from the university’s chemistry department, said: “There’s a lot of plastic used every year – billions of tonnes – and only a fraction of it is being recycled. We are trying to find a use for what is not being recycled.

“The beauty of this process is that it’s not very picky. It can degrade all sorts of waste.”

The team is now looking to scale up from the milligram quantities of plastic used so far and apply photoreforming to more sizeable pieces.

While it may be years before this plastic-to-fuel process can be rolled out on an industrial level, its development would work well in tandem with the advent of hydrogen vehicles. Hydrogen cars remain a rarity on our roads, though a number of companies have big plans in the pipeline. For instance, Toyota has stated its aim to sell one million electric and fuel cell cars worldwide by 2030, while it is also due to launch a fuel cell-powered bus in 2020.

Plastic to diesel

Chemists from the University of California, Irvine (UCI), in collaboration with researchers from the Shanghai Institute of Organic Chemistry, have devised a plastic recycling method that allows them to dissolve the bonds of polyethylene plastic to create petroleum and other fuel products.

While untreated polyethylene can be broken down, it requires either a significant amount of heat or reactive, toxic chemicals, and results in the atomic bonds breaking in an unusable way. By contrast, the process developed by the researchers uses far less heat and allows the final product to be transformed into a new fuel source.

The team, led by UC Irvine chemist Zhibin Guan, used a type of hydrocarbon molecule known as alkanes, which are typically used to produce polymers, though they were here harnessed to break down polymers. In a gradual process of removing and adding bonds between the carbon and hydrogen atoms within the material, the team were able to restructure the polyethylene into a liquid fuel that can be used in cars or other industrial purposes.

The catalysts used are also compatible with various types of polyolefin additives, meaning plastic waste such as bottles, bags and film can all be converted into chemical feedstocks without the need for any pre-treatment.

Plastic to crude oil

In 2016, Illinois Sustainable Technology Center researchers B.K. Sharma and Kishore Rajagopalan, in collaboration with the US Department of Agriculture, successfully converted plastic bags into fuel.

The team used high-density polyethylene bags sourced from local retailers and fed them into a pyrolysis unit, creating plastic crude oil (PCO) in the process. They then distilled the PCO to make gasoline and two types of diesel. Following the addition of antioxidants, the resulting materials proved superior to conventional diesel fuels in terms of lubricity and derived cetane number, which demonstrates ignition quality.

Plastic to ultra-low sulphur diesel

US firm Plastic2Oil turns waste plastic into fuel, using the discarded material as feedstock to create an ‘ultra-low sulphur diesel’ that contains 15ppm or lower sulphur content.

Currently, ultra-low sulphur diesel is primarily produced from petroleum, though Plastic2Oil provides a viable alternative with its plastic-derived fuel. The firm’s processor accepts unwashed and unsorted plastic, generating around one gallon of fuel from 8.3 pounds of the material. The processor uses its own off-gases as fuel (approximately 10%-12% of process output), meaning minimal energy is required to run the machine. The fuel produced can also be refined and separated without the cost of a distillation tower.
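
Taking the quoted yield at face value, a rough scaling exercise (assuming the one-gallon-per-8.3-pounds figure holds at larger volumes, which is an assumption rather than a company claim) gives a sense of the output per tonne of plastic feedstock.

# Rough scaling of Plastic2Oil's quoted yield: one gallon of fuel per 8.3 lb of plastic.
# Assumption: the yield scales linearly to larger feedstock volumes.
POUNDS_PER_GALLON_OF_FUEL = 8.3
POUNDS_PER_METRIC_TONNE = 2204.6

gallons_per_tonne = POUNDS_PER_METRIC_TONNE / POUNDS_PER_GALLON_OF_FUEL
print(f"~{gallons_per_tonne:.0f} gallons of ultra-low sulphur diesel per tonne of plastic")  # ~266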

Are there any negatives?

The waste-to-fuel industry has met some opposition from environmental organisations: protests halted a planned waste-to-fuel facility in Lancashire last year, and complaints from environmentalists prompted investigations into such sites in Canberra, Australia.

Larry O’Loughlin, executive director of the ACT Conservation Council, spoke to The Guardian about the environmental threat he says plastic-to-fuel sites pose. He rejected the notion that the industry is a form of recycling, as the plastics may only be used once before being turned into fuel. He also argued that widespread adoption of the method could slow efforts to find fuel alternatives, saying: “At a time of reducing carbon emissions, they are introducing another fossil fuel. The ACT is trying to move to zero emissions by 2050. How are we going to do that by setting up a refinery here?”

https://www.theguardian.com/sustainable-business/2017/feb/20/campaigners-reject-plastics-to-fuel-projects-but-are-they-right

H&M Foundation opens 'Garment-to-Garment' recycling plant in circular fashion drive

7 September 2018, source edie newsroom

Fashion retailer H&M's charitable arm, the H&M Foundation, has this week opened a hydrothermal textile recycling plant as the company strives to become "truly circular" by 2030.

The technology aims to overcome the challenge of recycling blended textiles, which are the most widely used fabrics globally.

The opening of the new pre-industrial sized facility in Hong Kong marks the first time that H&M’s hydrothermal recycling technology has been put into practice at scale. The innovative recycling method involves using heat, water and a blend of biodegradable chemicals to separate cotton and polyester from mixed fabrics. Once the fibres are separated, they can be sorted for reuse in new garments, including jeans.

The H&M Foundation claims that this method, which it calls “Garment-to-Garment recycling”, prevents the potential for chemical pollution finding its way into the environment while minimising carbon emissions and costs. While the plant will initially be used by H&M only, the retailer has pledged to licence the technology, so it can be used by other fashion manufacturers.

The Foundation’s innovation lead Erik Bang said the opening of the plant, which was co-funded by The Hong Kong Research Institute of Textiles and Apparel (HKRITA), marks a “significant step towards a new fashion industry that operates within the planetary boundaries”.

“As we scale up and make this technology freely available to the industry, we will reduce the dependence on limited natural resources to dress a growing global population,” Bang said.

Alongside the Garment-to-Garment plant, the H&M Foundation is showcasing a miniaturised version of the recycling technology at a pop-up H&M store in Hong Kong in a bid to educate customers about the importance of recycling. Customers are being encouraged to bring their unwanted or end-of-life clothing to the temporary store, where they will have the chance to see the technology first-hand.

“Seeing is believing, and when customers see with their own eyes what a valuable resource garments at the end of life can be, they can also believe in recycling and recognise the difference their actions can make,” Bang added.

Cradle-to-cradle clothes

The H&M Foundation and HKRITA predict they will collectively invest more than £5.2m (€5.8m) in sustainable fashion initiatives over the next four years, with 50% of this funding earmarked for research into textile recycling. The remainder of the investment will be spent on projects which promote inclusion and diversity in the fashion industry.

The move comes shortly after H&M announced it would be one of the brands leading a new Ellen MacArthur Foundation initiative that aims to help drive a circular fashion industry, along with Nike, GAP and Burberry.

The brands, joined by HSBC and Stella McCartney, have pledged through the Make Fashion Circular project to create business models which will keep garments in use, utilise materials which are renewable and find ways of recycling old clothes into new products.

It is thought that these moves could help the global fashion industry to capture $460bn currently lost due to the underutilisation of clothes each year, as well as $100bn from clothing that could be used but is currently lost to landfill and incineration.

Within its own operations, H&M said in 2016 that it was more than a quarter of the way towards its goal of becoming “truly circular” by 2030. The company is currently one of the world’s biggest users of recycled polyester and in January unveiled a new sportswear collection for women that is made predominantly from post-consumer recycled (PCR) polyester.

Cover the U.S. in 89 Percent Trees—Or Go Solar

By Jen A. Miller | Published 5:00 a.m., September 7, 2018

How many fields of switchgrass and forests of trees would be needed to offset the energy produced by burning coal? A lot, it turns out.

While demand for energy isn’t dropping, alarm over the consequences of burning fossil fuels to meet that demand is growing louder. Proposed ways to cancel out the carbon dumped into our atmosphere include carbon capture and storage and bio-sequestration, which use engineered systems and plants, respectively, to take in and store carbon emissions. Another route is to use solar photovoltaics to convert sunlight directly into electricity and sequester only the carbon emissions from the production of the solar cells.

Zero-emission energy has been offered as a way to offset the carbon dioxide production while still maintaining coal’s electricity generation. That’s done through carbon capture and storage in saline aquifers, or by using both enhanced oil recovery and bio-sequestration through planting trees and other plants to suck up and store carbon.

In a new study published in Scientific Reports (DOI: 10.1038/s41598-018-31505-3), a Nature publication, Michigan Technological University researchers analyzed how much land would be required to offset greenhouse gases created by traditional coal-fired plants or coal-fired plants with carbon sequestration and then neutralizing the remaining carbon pollution with bio-sequestration. Then they compared these routes to how much bio-sequestration would be required to offset greenhouse gases produced when making solar panels.

For the first time, researchers have shown that there is no comparison. It’s not even close.

In fact, coal-fired power plants require 13 times more land to be carbon neutral than the manufacturing of solar panels. To offset coal's emissions with bio-sequestration, we'd have to cover a minimum of 62 percent of U.S. land with optimal crops, or 89 percent of the U.S. with average forests.

“We know that climate change is a reality, but we don’t want to live like cavemen,” says Joshua Pearce, professor of materials science and engineering and electrical engineering at Michigan Tech. “We need a method to make carbon neutral electricity. It just makes no sense whatsoever to use coal when you have solar available, especially with this data.”

Coal-fired power plants require 13 times more land to be carbon neutral than the manufacturing of solar panels. Coal without carbon capture and sequestration (CCS) totals 377 metric tons of carbon dioxide equivalent; coal with saline aquifer CCS totals 117 metric tons of carbon dioxide equivalent; solar photovoltaics (PV) totals 9 metric tons of carbon dioxide equivalent.
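
A quick ratio check on the figures in that comparison follows; it assumes all three totals are reported on the same per-unit-of-electricity basis, which is an assumption here since the units are not spelled out in the summary above.

# Ratio check on the life-cycle greenhouse gas figures quoted above.
# Assumption: the three totals are on a common per-unit-of-electricity basis.
co2e_totals = {
    "coal, no CCS": 377,
    "coal with saline-aquifer CCS": 117,
    "solar PV": 9,
}

for technology, total in co2e_totals.items():
    ratio = total / co2e_totals["solar PV"]
    print(f"{technology}: {ratio:.0f}x the solar PV total")  # ~42x, 13x, 1x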

Researchers drew these conclusions from over 100 different data sources to compare energy, greenhouse gas emissions and land transformation needed to carbon neutralize each type of energy technology.

They claim a one-gigawatt coal-fired plant would require a new forest larger than the state of Maryland to neutralize all of its carbon emissions.
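
The Maryland comparison is plausible on a back-of-envelope basis. The sketch below is an illustration, not the study's calculation, and every input is an assumed ballpark figure: a 1-gigawatt coal plant running at a 75 percent capacity factor, emitting about one tonne of CO2 per megawatt-hour, offset by temperate forest sequestering roughly 2.5 tonnes of CO2 per hectare per year.

# Plausibility check of the "forest larger than Maryland" claim.
# All inputs are illustrative assumptions, not figures from the study.
CAPACITY_GW = 1.0
CAPACITY_FACTOR = 0.75                  # fraction of the year at full output
EMISSIONS_TONNES_PER_MWH = 1.0          # typical ballpark for coal generation
SEQUESTRATION_TONNES_PER_HA_YEAR = 2.5  # assumed temperate-forest uptake

annual_mwh = CAPACITY_GW * 1000 * CAPACITY_FACTOR * 8760
annual_emissions_tonnes = annual_mwh * EMISSIONS_TONNES_PER_MWH
forest_km2 = annual_emissions_tonnes / SEQUESTRATION_TONNES_PER_HA_YEAR / 100  # 100 ha per km^2

print(f"~{forest_km2:,.0f} km^2 of forest")  # ~26,000 km^2; Maryland covers roughly 32,000 km^2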

Researchers also found that applying the best-case bio-sequestration to all the greenhouse gases produced by coal-fired power plants would mean using 62 percent of the nation's arable land for that process, or 89 percent of all U.S. land with average forest cover.

In comparison, solar cells require 13 times less land than coal to become carbon neutral, and five times less than the best-case coal scenario.

“If your goal is to make electricity without introducing any carbon into the atmosphere, you should absolutely not do a coal plant,” Pearce says. Not only is it unrealistic to capture all the carbon dioxide coal plants release, but burning coal also puts sulfur dioxide, nitrogen oxides and particulates into the air, creating air pollution that is already estimated to cause 52,000 premature deaths annually.

Pearce says that he and his team were generous to coal-fired power plants in how they calculated the efficiency of carbon capture and storage when scaled up. They also did not consider newer ways of making solar farms even more efficient, such as using higher-efficiency black silicon solar cells, placing mirrors between rows of panels so the light falling between them can also be absorbed, or planting crops between rows (agrivoltaics) to make better use of the land.

Pearce says future research should focus on improving the efficiency of solar panels and solar farms, not on carbon capture at fossil fuel-powered plants in an attempt to make them zero-emission; the data show that route is not a realistic way to protect our changing climate.

Michigan Technological University is a public research university, home to more than 7,000 students from 54 countries. Founded in 1885, the University offers more than 120 undergraduate and graduate degree programs in science and technology, engineering, forestry, business and economics, health professions, humanities, mathematics, and social sciences. Our campus in Michigan’s Upper Peninsula overlooks the Keweenaw Waterway and is just a few miles from Lake Superior.

Flood Policy Standoff Tests Democrats' Promise of Climate Action

By Christopher Flavelle and David Schultz

November 29, 2018

Democrats resist cuts to subsidies for flood insurance program

Experts say those subsidies keep people living in risky areas

A Congressional deadlock over flood insurance highlights the difficulty of enacting the type of reforms urged last week in a U.S. government report on climate change -- even for Democrats, who embraced the report’s findings.

The heavily indebted National Flood Insurance Program, which provides subsidized coverage for homes in flood-prone areas, is scheduled to expire at midnight on Friday after months of debate over long-term changes. Both the House and Senate passed a one-week extension on Thursday night. That extension must now be signed by President Donald Trump.

Chief among the sticking points is a Republican-led effort to cut the subsidies, something that experts say would be consistent with the message in the National Climate Assessment released on Nov. 23. In that document, a consensus report by U.S. government agencies, scientists warned that coastal communities must adapt more quickly to accelerating flooding and sea-level rise -- including restricting development in at-risk areas, and in many cases leaving them altogether.

“Democrats talk a good game when it comes to the urgency of climate change, but then they turn right around and vote for the National Flood Insurance Program, a program that subsidizes building in known flood plains,” Republican Senator Mike Lee of Utah, a leading advocate for cutting the subsidies, said by email.

“If Democrats are really so concerned about economic damage from climate change, why are they subsidizing mansions in flood zones?” Lee said.

Democrats defended their support for below-rate flood insurance.

“We subsidize crop insurance. We certainly subsidize the oil companies...The concept of protection [for homeowners] doesn’t rankle me,” Senator Sheldon Whitehouse of Rhode Island, a state that depends heavily on flood insurance, said in an interview Wednesday. “It’s really unfair to strip them from coverage.”

Senator Bob Menendez of New Jersey, which has some of the nation’s most flood-exposed communities, said more realistic insurance rates aren’t the only way to prepare vulnerable areas for climate change. He said he wants Congress to provide more funding so officials can buy homes from people who want to leave.

“We won’t say, ‘We’ll take your home,’ but you have the incentives in place,” Menendez, a Democrat, said in an interview Wednesday.

Not all Republicans support cuts. The mostly GOP delegation from Louisiana, a state more reliant on flood insurance than any other, has also been adamant in resisting the cuts.

“There are plenty of folks in both parties that have trouble reforming the NFIP in a way that’s consistent with other positions they hold dear, whether that’s climate-change impacts or fiscal conservatism,” Rob Moore, a senior policy analyst at the Natural Resources Defense Council, said by email. “For some members of Congress, when it comes to NFIP reforms, cheap insurance trumps everything else.”

Still, R.J. Lehmann, director of insurance policy for the R Street Institute, which advocates for market-based solutions to climate change, said that Democrats who say they support the report’s findings face a particular challenge when it comes to flood insurance.

“If you say that your goal is affordability, that’s your primary goal, without concern for risk, then you’re ignoring” what’s in the climate report, Lehmann said. “I just don’t think there’s any other way about it.”

Representative Earl Blumenauer, an Oregon Democrat who’s cosponsored bills with Republicans to change the program, supports cutting the subsidies, saying the country needs “sensible reform.”

Katherine Greig, an author of the National Climate Assessment’s chapter on adaptation, said it’s appropriate to worry about the impact of higher flood insurance premiums on the most vulnerable homeowners. She said the best solution is to raise rates over time, coupled with means-tested vouchers to protect people with low incomes.

“We should communicate risk with a price signal,” said Greig, a senior fellow at the University of Pennsylvania’s Wharton Risk Center. “I think a glide path is the only way.”

The person who led the report’s adaptation chapter, Robert Lempert, said the best approach would be a mix of higher rates, more funding for coastal infrastructure, and money for buyouts.

“Price signals are probably an important part of any effective policy portfolio,” said Lempert, a principal researcher at the RAND Corp. “The perfect societal solution would combine price signals with subsidies to help people harden where they can, and then ways to relocate where you need to.”

Asked why it’s proving so hard for Congress to agree to those reforms, Lempert said, “Change is hard.”

Putting this material on roofs can help clean up smoggy air

With air quality hovering around dangerous levels in much of the world, the manufacturing company 3M has developed a way for the roofs of buildings to play a role in reducing pollution.

For the past several weeks, the air quality in California has made headlines. Raging wildfires in the northern and southern parts of the state created clouds of ash and dangerously high toxicity levels in the atmosphere. The fires created a state of emergency, but in many parts of the country, that poor air quality is a daily issue, not one created by natural disasters. Over the summer, southern California violated federal smog standards for 87 days in a row. Around 41% of people in the U.S. live with regular exposure to poor air quality.

Cleaning up the air will require reducing dependency on fossil fuels and the pollution they create. While industries like manufacturing and transportation work on these wide-scale changes, 3M has developed something of a stopgap: roofing granules that suck smog from the atmosphere.

“Roofing granules have been a part of our business since the 1930s,” says Gayle Schueller, 3M’s chief sustainability officer. “This is not a new business to be in, and it’s not a new material for us.” Traditionally, granules are used in construction to coat rooftops and provide an extra layer of protection from UV rays, which helps buildings remain cool and less dependent on air conditioning. They also make roofs more fire resistant. Around 10 years ago, 3M developed “cool roofing” granules that reflect sunlight and help buildings comply with roofing standards like the 2014 Los Angeles ordinance mandating new residential projects be built with additional rooftop insulation to keep them cool.

Instead of reflecting the sun, though, 3M’s new smog-reducing granules use it. The photocatalytic coating on these granules, designed for asphalt roofing, is activated by the sun’s UV rays. That generates radicals that bind with the chemical compounds in smoggy air, and transform them into water-soluble ions that eventually wash away.

While 3M conducted its own internal testing of the granules, it also sent samples for external validation to Lawrence Berkeley National Lab, which evaluated how effectively the granules absorbed different gases and pollutants. The lab found that an average-sized roof coated in the granules removes about as much pollution from the air as three trees could. One company that sources from 3M, Malarkey Roofing, has pledged to incorporate the smog-reducing granules into all of its shingles. So far, Malarkey shingles have pulled as much smog from the air as the equivalent of 100,000 trees.

“When we innovate, we start with an understanding of where there is a problem, and we identified the issue of smog in cities,” Schueller says. Around 10% of 3M’s 90,000-person staff are scientists, she adds, and they collaborated on both developing the photocatalytic coating and testing its environmental impacts.

The runoff from the ions, they found, does not contribute significantly to water quality issues; the impact is minimal, Schueller says, and the smog would have made its way into the water from the air anyway had it not first been absorbed by the roofing granules.

This dynamic underscores the need for these granules to be seen as a temporary fix, not a solution to air quality issues. The smog-absorbing granules pull pollution out of the air–which is certainly helpful for people breathing in that air–but they don’t ultimately remove it from the ecosystem. Truly cleaning up the air and the environment in cities will require addressing the root causes of that pollution. Until that happens, innovations like these granules can help ease conditions for people in the short term.

Rye Brook, NY passes law banning plastic bags

Gabriel Rom, Rockland/Westchester Journal News Published 3:52 p.m. ET Nov. 19, 2018 | Updated 5:28 p.m. ET Nov. 19, 2018

To be environmentally responsible, First Village Coffee in Ossining uses biodegradable straws and cups, and sells reusable metal straws.

RYE BROOK - The village continues to pass policies restricting disposable items with a new ban on plastic bags at all retail locations.

The law, passed by the village Board of Trustees on Nov. 8, also establishes a 10-cent surcharge for paper bags in an effort to encourage customers to use reusable bags.

"We're trying as hard as we can to encourage people to bring their own bags which is really what the end goal of this law is all about," said Mayor Paul Rosenberg.

Stores that violate the law will receive a warning and then may be subject to fines up to $250.

"We have a record of doing very environmentally-friendly things," Rosenberg said. "As stewards of the environment, we feel that these are the right things to do."

A separate law restricting the use of plastic straws was also passed Nov. 8. That policy still allows food establishments to carry plastic straws, but requires them to be kept behind the counter and offered only on request.

While the laws passed unanimously, some village residents had reservations.

"In some cases, these changes create inconvenience for customers," said village resident Elaine Hertz in a letter to Rye Brook officials. "For those who like paper, fine — for those who prefer plastic, that should be OK. We should have the choice."

As Rye Brook moves forward with the restrictions, other municipal bans on plastic bags have been met by more organized opposition.

A 2013 effort in the town of Mamaroneck to impose a fee on plastic and paper bags stalled when industry group Food Industry Alliance of New York threatened to sue the town.

Since then, tides have changed at the county and state level, emboldening smaller municipalities.

In April, the county Board of Legislators' Democrats introduced a law that would ban plastic bags from store checkout lines and require stores to charge at least 10 cents each for paper bags. The legislation is still pending.

Meanwhile, at the state level earlier this year, Gov. Andrew Cuomo proposed a law that would ban plastic "carryout" bags. His bill would prohibit grocery, convenience and other stores from using plastic bags. That proposal is also still pending.

But in Rye Brook, Rosenberg said liability was not a concern.

In terms of supermarkets, Balducci's already provides paper bags to customers. Because of a zoning quirk, the ACME Supermarket at the Washington Park Plaza is within Port Chester village limits while the rest of the shopping mall lies within Rye Brook.

"We didn't have any push-back from any industry groups," Rosenberg said.

Plastic microfibers found for first time in wild animals' stool, from S. A. fur seals

MORRIS ANIMAL FOUNDATION

PUBLIC RELEASE: 9-NOV-2018

For the first time, plastic microfibers have been discovered in wild animals' stool, from South American fur seals. The findings were made by a team of Morris Animal Foundation-funded researchers at the University of Georgia, who suggest examining scat from pinnipeds can be an efficient way to monitor environmental levels of microfibers and microplastics in the environment. Their study was published in the Marine Pollution Bulletin.

"It's no secret that plastic pollution is one of the major threats to marine ecosystems, but we're learning now just how widespread that problem is," said Dr. Mauricio Seguel, a research fellow at the University of Georgia. "These samples are invisible to the naked eye. We want to understand factors that are driving their distribution and what this means for animals in the Southern Hemisphere."

The team examined the scat of 51 female South American fur seals on the remote Guafo Island, in southwestern Chile, from December 2015 to March 2016. Each sample's organic material was dissolved in a solution in a lab, leaving only the microscopic plastic particles to be analyzed. Researchers then found that 67 percent of the samples showed a remarkable abundance of microfibers, which until now had only been reported in animals fed in captivity.

Microplastics are plastic fragments smaller than 5 millimeters. Microfibers are the least studied form of microplastic. They are small hairs of plastic, less than 1 millimeter in size, from materials such as polyester or nylon, and they can end up in the ocean through wastewater after washing, no matter how thorough the treatment. They also can absorb a wide array of pollutants.

The researchers believe the microfibers arrived at Guafo Island through changing ocean currents, before being consumed by plankton and continuing up the food chain through fish and, finally, to the seals. There isn't enough evidence to determine if microfibers have any adverse effects on mammals, but some studies have indicated morphological changes in fish.

Scat analysis, the team noted, could be a good tool to monitor the exposure of marine mammals to plastics as it's effective and non-invasive, poses no danger to either the researcher or the animal, and it's easy to identify both fur seals and their feces. Dr. Seguel says his colleagues are conducting similar, follow-up tests in other parts of South America.

"It's not too late to act to heal our oceans, but one of the first steps is determining how much we have damaged the ecosystem through our activities, like producing and disposing of plastic," said Dr. Kelly Diehl, Morris Animal Foundation Interim Vice President of Scientific Programs. "Studies like these will help us learn those answers so we can begin to make better decisions for the health of marine life."

Morris Animal Foundation has funded other fur seal studies at Guafo Island with Dr. Seguel's team. One found that factors contributing to a die-off of South American fur seal pups included mites, pneumonia and changing sea surface temperatures. In another, researchers discovered hookworms feed at a constant rate on their seal pup hosts before they produce eggs and die, a strategy that often kills the pups as well.

About Morris Animal Foundation

Morris Animal Foundation's mission is to bridge science and resources to advance the health of animals. Founded by a veterinarian in 1948, we fund and conduct critical health studies for the benefit of all animals. Learn more at morrisanimalfoundation.org.

Dam problems, win-win solutions

UNIVERSITY OF MAINE

PUBLIC RELEASE: 5-NOV-2018

Orono, Maine -- Decisions about whether to build, remove or modify dams involve complex trade-offs that are often accompanied by social and political conflict. A group of researchers from the natural and social sciences, engineering, arts and humanities has joined forces to show how, where and when it may be possible to achieve a more efficient balance among these trade-offs. Their work is featured in a paper published in the Proceedings of the National Academy of Sciences (PNAS).

What's the dam problem?

In some parts of the world, there are proposals to build thousands of massive new dams for hydroelectricity, flood control and irrigation. In other regions, such as the U.S., there is a growing movement to restore rivers by removing dams that are obsolete, pose safety risks or have large negative impacts on ecosystems. In both instances, difficult trade-offs and divergent stakeholder preferences can greatly complicate decision-making processes.

For example, conservation groups and resource agencies seeking to restore sea-run fish often favor the removal of dams that prevent these species from reaching their spawning grounds. But other stakeholders may value the diverse services that dams can provide, including water supply, hydroelectricity and reservoir-related recreation.

"This is exactly the kind of problem where you need an interdisciplinary team with the right mix of expertise to help quantify trade-offs and identify promising solutions from multiple perspectives," says Sam Roy, lead author from the University of Maine.

Maximizing economic and ecological benefits

The research team collected a database of over 7,500 dams in New England as a "model system" to search for decisions that provide efficient outcomes for multiple criteria valued by stakeholders. These criteria include habitat availability for migratory fishes, hydroelectric power production, water storage, drinking water supply, water quality, recreational use, dam breach risks, waterfront property impacts and decision costs.

Using an economic concept known as the production possibility frontier, combined with a scenario-ranking technique, the researchers identified potential dam decisions that maximize the combined ecological and economic benefits, for individual watersheds as well as the entire New England region. Given the large size of the database (the largest of its kind in the world), together with the enormous number of potential solutions, a machine-learning approach was used to simulate the many trade-offs and find solutions that maximized total benefits.
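To make the production possibility frontier idea concrete, the sketch below (Python, with invented two-criterion scores; it is not the study's model, data, or ranking technique) filters a set of candidate dam-decision portfolios down to the non-dominated ones, i.e. those for which no other candidate is at least as good on every criterion and strictly better on at least one:

    from typing import List, Tuple

    # Hypothetical portfolios of dam decisions, each scored on two criteria to
    # maximize: (migratory fish habitat gained, hydropower capacity retained).
    candidates: List[Tuple[float, float]] = [
        (10, 95), (40, 90), (55, 70), (30, 60), (80, 40), (85, 10),
    ]

    def dominates(a: Tuple[float, float], b: Tuple[float, float]) -> bool:
        """a dominates b if it is at least as good everywhere and better somewhere."""
        return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

    # The efficient frontier: candidates not dominated by any other candidate.
    frontier = [c for c in candidates if not any(dominates(o, c) for o in candidates)]
    print(sorted(frontier))
    # [(10, 95), (40, 90), (55, 70), (80, 40), (85, 10)] -- only (30, 60) is dominated

Real decisions involve many more criteria and an enormous space of candidate portfolios, which is why the study leans on machine learning rather than exhaustive enumeration.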

The team's approach can be used to identify many different kinds of decisions that result in efficient outcomes given resource and technological constraints, including ones that remove or modify specific dams to produce the greatest increase in fish habitat for a small reduction in hydropower, or the greatest improvement in dam breach safety for a small reduction in drinking water supply.

"We also find that it is possible to improve the trade-offs between certain criteria by coordinating multiple dam decisions at larger spatial scales," says Roy. "This means that there are many opportunities to find win-win solutions that can simultaneously improve dam infrastructure, freshwater ecosystems and decision costs by selectively removing, modifying or even constructing specific dams in a river basin."

Interdisciplinary research that connects the dots

Roy, a postdoctoral fellow at UMaine's Senator George J. Mitchell Center for Sustainability Solutions, worked with colleagues from the University of New Hampshire, the University of Rhode Island and the Rhode Island School of Design.

"One of the strengths of our interdisciplinary approach is that we can examine many different trade-offs using an integrated, quantitative framework," says co-author Emi Uchida, an environmental economist at the University of Rhode Island. The team also collaborates with diverse stakeholders (e.g., tribal communities, government agencies, conservation organizations) to strengthen the scientific basis of decision-making.

The authors cite the multi-stakeholder Penobscot River Restoration Project as a highly successful example where coordinating dam removal and alteration through broad stakeholder engagement dramatically reduced conflict, efficiently allocated resources, and aligned with pre-existing constraints of dam ownership and regulation.

Says Roy, "Our model can help identify specific decisions that gain the support of a broader stakeholder audience by providing desirable infrastructure and ecosystem trade-offs. This may encourage funders and practitioners to make these decisions a reality. Based on our research, there may be many more future decisions that repeat the success of the Penobscot River Restoration Project."

Support for the Future of Dams Project is provided by the National Science Foundation's Research Infrastructure Improvement award 1539071.

Samuel G. Roy, Postdoctoral Fellow at the University of Maine Senator George J. Mitchell Center for Sustainability Solutions, can be reached at 207-581-3286 and samuel.g.roy@maine.edu. Web: https://www.researchgate.net/profile/Samuel_Roy2.

Making wind farms more efficient

PENN STATE

PUBLIC RELEASE: 9-NOV-2018

ERIE, Pa. -- With energy demands rising, researchers at Penn State Behrend and the University of Tabriz, Iran, have completed an algorithm -- or approach -- to design more efficient wind farms, helping to generate more revenue for builders and more renewable energy for their customers.

"Wind energy is on the rise, and not just in the US," said Mohammad Rasouli, assistant professor of electrical engineering at Penn State Erie, the Behrend College. "The efficiency of solar panels is less than 25 percent, and is still a subject of current research. Wind turbines, on the other hand, are much more efficient and convert over 45 percent of the wind energy to electricity."

Though wind turbines are efficient, wind farm layouts can reduce this efficiency if not properly designed. Builders do not always put turbines in the places with the highest wind speeds, where they will generate the most power, said Rasouli. Turbine spacing is also important -- because turbines create drag that lowers wind speed, the first turbines to catch the wind will generate more power than those that come after.

To build more efficient wind farms, designers must take into account wind speed and turbine spacing, as well as land size, geography, number of turbines, amount of vegetation, meteorological conditions, building costs, and other considerations, according to the researchers.

Balancing all of these factors to find an optimum layout is difficult, even with the assistance of mathematical models.

"This is a multi-objective approach," said Rasouli. "We have a function and we want to optimize it while taking into account various constraints."

The researchers focused on one approach, called biogeography-based optimization (BBO). Created in 2008 and inspired by nature, BBO is based on how animals naturally distribute themselves to make the best use of their environment based on their needs. By creating a mathematical model from animal behavior, researchers can then calculate the optimal distribution of objects in other scenarios, such as turbines on a wind farm.

"Analytical methods require a lot of computation," said Rasouli. "This BBO method minimizes computation and gives better results, finding the optimum solution at less computational cost."

Other researchers used simplified versions of the BBO approach in 2017 and 2018 to calculate more efficient wind farm layouts, but these simplified versions did not take into account all factors affecting the optimum layout.

The researchers from Penn State and the University of Tabriz completed the approach by incorporating additional variables, including real market data, the roughness of the surface -- which affects how much power is in the wind -- and how much wind each turbine receives.

The research team also improved the BBO approach by incorporating a more realistic model for calculating wakes -- areas of slower wind speed created after the wind blows past a turbine, similar to the wake behind a boat -- and by testing how sensitive the model was to other factors such as interest rates, financial incentives and differences in energy production costs. They report their results online in the Journal of Cleaner Production; the paper will appear in the November issue.

"This is a more realistic optimization approach compared to some of the simplifying methods that are out there," said Rasouli. "This would be better to customers, to manufacturers, and to grid-style, larger-size wind farms."

By incorporating more data, such as updated meteorological records and manufacturer information, the researchers will be able to use the BBO approach to optimize wind farm layouts in many different locations, helping wind farm designers across the world make better use of their land and generate more energy to meet future energy demands from consumers.

"There is an end time for fossil fuels," said Rasouli. "With this and upcoming methods or better optimization approaches, we can make better use of wind energy."

Soheil Pouraltafi-kheljan and Amirreza Azimi, graduate students, and Behnam Mohammadi-ivatloo, associate professor and faculty of electrical and computer engineering, all at the University of Tabriz, also worked on the project.

Tomorrow's population will be larger and heavier, and will eat more

Feeding 9 billion people will require more food than has been projected

NORWEGIAN UNIVERSITY OF SCIENCE AND TECHNOLOGY

PUBLIC RELEASE: 9-NOV-2018

"It will be harder to feed 9 billion people in 2050 than it would be today," says Gibran Vita, a PhD candidate at the Norwegian University of Science and Technology 's Industrial Ecology Programme.

According to WWF, the world's greatest environmental problem is the destruction of wildlife and plant habitat. A large part of the devastation is due to the demands of an ever-growing human population. On the other hand, "Zero Hunger" is the second UN Sustainable Development Goal and its challenge is to meet a global growing food demand.

The world's population could level off at around nine billion in the coming decades, compared to just over 7.6 billion now.

But an average person in the future will require more food than today. Changes in eating habits, attitudes towards food waste, increases in height and body mass, and demographic transitions are some of the reasons.

People are changing

Professor Daniel B. Müller and colleagues Felipe Vásquez and Vita analysed changes in the populations of 186 countries between 1975 and 2014.

"We studied the effects of two phenomena. One is that people on average have become taller and heavier. The second is that the average population is getting older," said Vita.

The first phenomenon contributes to increased food demand. The second counteracts the former one.

An average adult in 2014 was 14 per cent heavier, about 1.3 per cent taller, 6.2 per cent older, and needed 6.1 per cent more energy than in 1975. Researchers expect this trend to continue for most countries.

"An average global adult consumed 2465 kilocalories per day in 1975. In 2014, the average adult consumed 2615 kilocalories," says Vita.

Globally, human consumption increased by 129 per cent during this time span. Population growth was responsible for 116 per cent, while increased weight and height accounted for 15 per cent. Older people need a little less food, but an ageing population results in only two per cent less consumption.

"The additional 13 per cent corresponds to the needs of 286 million people," Vsquez says.

This in turn corresponds approximately to the food needs of Indonesia and Scandinavia combined.
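The decomposition behind these percentages is easy to check (a simple reading of the reported figures, not the paper's attribution method):

    # Reported drivers of the 129% rise in global food energy consumption, 1975-2014,
    # expressed in percentage points.
    population_growth = 116   # more people
    body_size = 15            # heavier and taller adults
    ageing = -2               # an older population needs slightly less energy

    print(population_growth + body_size + ageing)   # 129, the reported total increase

    # The net effect of body size and ageing, beyond population growth alone:
    print(body_size + ageing)                       # 13 percentage points, which the
                                                    # authors equate to ~286 million people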

Major differences

Considerable variations exist between countries. Weight gain per person from 1975 to 2014 ranged from 6 to 33 per cent, and the increased energy requirement ranged from 0.9 to 16 per cent.

An average person from Tonga weighs 93 kilos. An average Vietnamese weighs 52 kilos. This means that Tongans need about 800 more kilocalories each day - or about four bowls of oatmeal.

Some countries are changing quickly. On Saint Lucia in the Caribbean, the average weight rose from 62 kilos in 1975 to 82 kilos 40 years later.

The lowest and highest changes are found in Asia and Africa, reflecting the disparities between the countries of these continents.

Not previously calculated

"Previous studies haven't taken the increased demands of larger individuals and aged societies into account when calculating the future food needs of a growing population," said Vsquez.

Most studies estimate that an average adult's food needs remain constant over time and fairly similar across nations. But that's not how it is.

"These assumptions can lead to errors in assessing how much food we'll actually need to meet future demand," Vsquez says.

This study provides relevant information for the UN's Food and Agriculture Organization (FAO), which is a leader in the struggle to ensure food security for all.

Vásquez and Vita say that we have to look at more than just the number of people in an area to understand the mechanisms behind their consumption. This requires a multidisciplinary approach that considers both social and physiological factors.

This study's analysis involved bio-demography, a hybrid of biology and demography. The researchers adapted a model for dynamic systems that is often used in industrial ecology to study resource stocks and flows.

Reference: Food Security for an Aging and Heavier Population. Felipe Vásquez, Gibran Vita and Daniel B. Müller. Sustainability 2018, 10(10), 3683; https://doi.org/10.3390/su10103683

NEW KITCHEN APPLIANCE CAN TRANSFORM HOUSEHOLD WASTE INTO ENERGY

By Kate Nicholson | 29 October 2018 |

A new domestic waste disposal appliance that turns waste into energy to heat homes produces lower CO2 emissions than more traditional waste collection methods, according to a new report by Ricardo Energy and Environment.

The independent report carried out by the environmental consultancy has found that HERU (Home Energy Resource Unit), which uses low temperature pyrolysis to transform everyday waste (such as nappies, plastics and food) into energy, has the potential to save each household up to 1,200kg of CO2 per year.

The report states that HERU has the potential to save 8.8 per cent of the total UK carbon output if installed across all 27m UK households.
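The 8.8 per cent figure is consistent with the per-household saving under a plausible assumption about total UK emissions (roughly 370 Mt CO2 a year, a number not stated in the article); a quick back-of-the-envelope check:

    # Back-of-the-envelope check of the HERU figures quoted above (illustrative only).
    saving_per_household_kg = 1_200     # reported potential saving per household per year
    households = 27_000_000             # reported number of UK households
    total_saving_mt = saving_per_household_kg * households / 1e9   # kg -> megatonnes

    uk_annual_co2_mt = 370              # assumed UK annual CO2 output, not from the article
    print(f"{total_saving_mt:.1f} Mt CO2, "
          f"~{total_saving_mt / uk_annual_co2_mt:.1%} of the assumed UK total")
    # -> 32.4 Mt CO2, ~8.8% of the assumed UK total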

Dubbed ‘The Next Dyson’ by the BBC, the device is the world’s first hybrid boiler; capable of running on oil or gas as well as household waste, it can switch to traditional energy sources to heat the house when there is no waste for it to process.

When compared to traditional waste collections, the HERU had 68 per cent less global warming impact than co-mingled collections and 32 per cent less than kerbside collections, according to the report.

Furthermore, when powered by solar energy, the HERU (which runs off a standard 32-amp cooker plug) saw an impressive 554 per cent decrease in global warming impact compared to kerbside collections and 733 per cent less than co-mingled collections.

The process uses pipe technology, which enables a highly efficient, low temperature pyrolysis process to take place. The small amounts of emissions are cleaned before the CO2 is released into the air and the teaspoon of ash (all that is left of the household debris) is released into the sewer. This process ensures there is no risk of harm to human health or the environment.

Yet, there are exceptions as to what the HERU can be fed. Glass and metal must be recycled using normal council collections, as the HERU cannot convert them to ash and the items will remain perfectly intact. The device is marketed as an end-of-life solution for the complex packaging we see up and down the high streets and supermarkets, such as coffee cups and crisp packets, though one could argue that increasing the recyclability of such packaging would prove more sustainable in the long-term.

Discussing the release of the report, Nik Spencer, Founder and Inventor of the HERU, said, “We opened up the HERU to independent scrutiny as part of our continual process of refinement and improvement. However, the results are truly staggering. Climate change and global warming is something that is and will continue to affect us all, and solutions such as the HERU clearly provide viable technology to start addressing these global problems, particularly when used with renewable technologies such as solar.

“The opportunities for businesses are also profound, with the potential for forward thinking companies to adopt the HERU first and help lead the way in changing our relationship with the resources around us.”

Throughout the remaining months of 2018, the HERU will undergo technical trials at UK sites, with the aim of moving to the mass market in early 2019. Accelerated by HERU’s partnership with the UK’s Manufacturing Technology Centre, the trials are also supported by BAXI, Worcestershire County Council, Wychavon District Council and Rugby Borough Council.

Web Site

https://www.myheru.com

CERTIFICATION FOR UK COMPANY CONVERTING WASTE PLASTICS TO HYDROGEN

By Rob Cole | 9 November 2018 |

A UK technology company pioneering hydrogen production from waste plastic and used tyres has received independent technical assurance certification for its waste to power and waste to hydrogen technology processes.

PowerHouse Energy (PHE) Group has developed a process technology, known as DMG, to convert waste plastic, end-of-life tyres and other wastes into EcoSynthesis gas, which can be used to create valuable products such as chemical precursors, hydrogen, electricity and other industrial products. The DMG process can generate in excess of one tonne of road-fuel quality hydrogen and more than 28 megawatt hours of exportable electricity per day.

The independent review of PHE’s technology was carried out by DNV GL, a global quality assurance and risk management company. DNV GL has issued a ‘Statement of Feasibility’ to the DMG technology, meaning its Technology Assessment found no prohibitive obstacles to the development of the technology.

As part of the certification process, PHE undertook an extensive engineering, safety and risk management programme over several months. The programme involved a robust and rigorous review of the engineering design, test data, process modelling and the equipment engineering design required for the commercial application of the DMG technology.

The DNV GL ‘Statement of Feasibility’ states that ‘the PowerHouse Energy Group’s DMG waste-to-energy technology can convert 25 tonnes per day of feedstock comprising high calorific value waste materials.’

With regard to the outputs of the DMG technology, it adds: ‘The produced energy-rich syngas can be combusted to produce power for distributed electrical generation. The DMG technology allows for integrating a process for the co-production of high purity hydrogen (one tonne per day) from a proportion of the syngas in addition to generating power.’

The Statement of Feasibility also confirms that the DMG technology eliminates waste with high levels of energy recovery, produces electricity for distribution and has the ability to co-produce high-purity hydrogen alongside electrical power.

PHE has identified seven sites, including one at Ellesmere Port in Cheshire, where application-specific engineering activities are taking place, with each site comprising different specific waste streams that will be processed by a DMG technology ‘research demonstrator’.

Commenting on the news, David Ryan, PHE’s Engineering Director, said: “Gaining this Statement of Feasibility provides us with a key foundation in the engineering and risk mitigation programme, giving us great confidence in the scale up and roll out of the technology. Furthermore, it provides our partners in the waste management industry with a key element of the independent technical assurance needed to finalise site application-specific commercial agreements to utilise the DMG technology to reduce the volumes of waste plastics sent to landfill.

“Our expectations are that, with our planned engineering and risk management in place, the commercial operation of our DMG technology will exceed one tonne of hydrogen production and generate in excess of two megawatts of electricity per DMG unit, and we should achieve full certification against the DNV GL Technology Qualification process at our first site”.

Keith Allaun, the company’s CEO, added: “We sought this Technology Assessment of our engineering design by one of the world’s most highly respected evaluators of new technologies. DNV GL confirms both our technology design and the rigour of our engineering approach. Our team has worked relentlessly over the last 18 months to get the DMG technology to its existing commercialisation phase.

“This independent assessment of our proprietary DMG Technology adds further credibility to the considerable scope that exists for its commercial application, in the many sectors [involving] efficient and responsible use of non-recyclable and waste plastics, end-of-life tyres, and the creation of clean energy.”

Carbon emissions will start to dictate stock prices

UNIVERSITY OF WATERLOO

PUBLIC RELEASE: 13-NOV-2018

Companies that fail to curb their carbon output may eventually face the consequences of asset devaluation and stock price depreciation, according to a new study out of the University of Waterloo.

The researchers further determined that the failure of companies within the emission-intensive sector to take carbon reduction actions could start negatively impacting the general stock market in as little as 10 years' time.

"Over the long-term, companies from the carbon-intensive sectors that fail to take proper recognizable emission abatements may be expected to experience fundamental devaluations in their stocks when the climate change risk gets priced correctly by the market," said lead author Mingyu Fang, a PhD candidate in Waterloo's Department of Statistics & Actuarial Science. "More specifically for the traditional energy sector, such devaluation will likely start from their oil reserves being stranded by stricter environmental regulations as part of a sustainable, global effort to mitigate the effects caused by climate change.

"Those companies may find that large portions of the reserves are at risk of being unexploitable for potential economic gains."

Climate change impacts investment portfolios through two channels. Directly, it elevates weather-related physical risk to real properties and infrastructure assets, which extends to increased market risk in equity holdings with material business exposure in climate-sensitive regions. Indirectly, it triggers stricter environmental regulations and higher emission costs as part of a global effort at emission control, which can induce downturns in carbon-intensive industries in which a portfolio may have material positions.

This indirect impact of climate change on investments will effectively be transformed into a political risk affecting particular asset classes, often referred to as the investment carbon risk.

As part of the study, which grew out of Fang's PhD thesis as well as a research project funded by the Society of Actuaries under the theme of 'Managing Climate and Carbon Risk in Investment Portfolios', the researchers undertook an inter-temporal analysis of stock returns. They examined 36 publicly traded large emitters and related sector indices from Europe and North America around the ratification of major climate protocols. Only nine of the 36 samples were found to display recognizable carbon pricing. The historical performance of the emission-heavy sectors, such as energy, utilities and materials, was also compared against that of other industries. The carbon-intensive sectors consistently ranked at the bottom of the list across the metrics used and underperformed the market indices for both Europe and North America.

"It is in the best interest of companies in the financial, insurance, and pension industries to price this carbon risk correctly in their asset allocations," said Tony Wirjanto, a professor jointly appointed in Waterloo's School of Accounting & Finance and Department of Statistics & Actuarial Science, and Fang's PhD thesis supervisor. "Companies have to take climate change into consideration to build an optimal and sustainable portfolio in the long run under the climate change risk."

The study, "Sustainable portfolio management under climate change" by Fang, Wirjanto and Ken Seng Tan, another of Fang's PhD thesis supervisors, was published recently in the Journal of Sustainable Finance & Investment.

New US study reveals natural solutions can reduce global warming

US forests, wetlands, and agricultural lands could absorb a fifth of greenhouse gas pollution -- equivalent to emissions from all US vehicles

NATURE CONSERVANCY

PUBLIC RELEASE: 14-NOV-2018

Restoring the United States' lands and coastal wetlands could have a much bigger role in reducing global warming than previously thought, according to the most comprehensive national assessment to date of how greenhouse gas emissions can be reduced and stored in forests, farmland, grasslands, and wetlands.

The peer-reviewed study in Science Advances from The Nature Conservancy and 21 institutional partners found that nature's contribution could equal 21% of the nation's current net annual emissions, by adjusting 21 natural management practices to increase carbon storage and avoid greenhouse emissions. The study is the first to include the climate benefits of coastal wetlands and grasslands in a comprehensive mix along with forests and agriculture.

In October, the Intergovernmental Panel on Climate Change special report called for immediate global action to limit warming to 1.5°C (approximately 3°F) to avoid the most damaging climate change impacts. This new study highlights how, and which, natural solutions in the United States offer the most promise to help keep temperatures below that 3°F goal.

Joe Fargione, Director of Science for The Nature Conservancy, was the study's lead author: "One of America's greatest assets is its land. Through changes in management, along with protecting and restoring natural lands, we demonstrated we could reduce carbon pollution and filter water, enhance fish and wildlife habitat, and have better soil health to grow our food -- all at the same time. Nature offers us a simple, cost-effective way to help fight global warming. In combination with transitioning to zero carbon energy production, natural climate solutions can help protect our climate for future generations."

Lynn Scarlett, Chief External Affairs Officer for The Nature Conservancy and former Deputy Secretary of the Department of the Interior, spoke to practical elements of the study's findings: "An ounce of prevention is worth a pound of cure, so we should reduce carbon pollution where we can. But we also need to put natural solutions to work as a tool to insulate ourselves from global warming. This study provides good news that making investments in nature will make a big difference, while offering the potential for new revenue to farmers, ranchers, foresters, and coastal communities at the same time."

Of the 21 natural solutions analyzed, increased reforestation (the planting of trees) emerged as the largest means to achieve greater carbon storage, equivalent to eliminating the emissions of 65 million passenger cars. Other high-performing forest solutions include allowing longer periods between timber harvest to increase carbon storage; increasing controlled burns and strategic thinning in forests to reduce the risk of megafire; and avoided loss of forests from urban sprawl.

The study identified a maximum of 156 million acres that could be reforested, 304 million acres where forest harvest rotations could be extended, and at least 42 million additional acres of forests that would benefit from fire risk reduction treatments. In addition, almost a million acres of forest are being converted to non-forest habitat a year, largely due to suburban and exurban expansion, which could be addressed through better land use planning. The study also finds that urban reforestation can add important carbon storage benefits.

"Planting trees and improving the health of existing forests will be a deciding factor in whether we are able to get ahead of the climate curve," said Jad Daley, CEO of American Forests. "This breakthrough analysis clarifies the highest impact actions for keeping our forests as a growing and resilient carbon sink and the potential scale of climate benefit."

Grasslands are underappreciated for their carbon storage opportunity. Grassland is being lost at a rate of over one million acres per year. When grassland is converted to cropland, about 28% of the carbon in the top meter of soil is released to the atmosphere. This trend could be reversed by re-enrolling 13 million acres of marginal cropland in conservation programs and restoring them to provide habitat and storage of carbon in the soil.

Existing croplands have an important role to play. Farmers can optimize their nutrient application, saving money and avoiding emissions of nitrous oxide, a potent greenhouse gas. Farmers can also plant cover crops, which pull carbon out of the atmosphere and return it to the soil during times of the year when fields would normally be bare.

"Farmers are some of our best land stewards, and Danone North America is partnering with farms across the country to find climate solutions through our soil health initiative. Improved nutrient management, cover crops, and crop rotations are examples of practices that can help reduce GHG emissions and over time improve a farm's bottom-line. Farmers and the food industry depend on a predictable climate, so it's important to work together to reduce the risks of climate change." said Chris Adamo, Vice President Federal & Industry Affairs at Danone North America.

Natural solutions can be found under water as well. An estimated 27% of tidal wetlands have been disconnected from the ocean, increasing the release of methane. Reconnecting tidal wetlands to the ocean virtually eliminates those methane emissions, and also restores fish habitat important for coastal communities.

"Shellfish growers make a living on the water and have witnessed salt marshes losing productivity due to freshwater inundation. Not only does this damage important waters and increase emissions, but it also harms their ability to make a living growing oysters, clams, mussels and other species that support many coastal communities and other important stakeholders. By restoring salt marshes, we can help shellfish farmers, wholesalers, retailers and restaurants and the climate all at the same time," said Davis Herron, Director, Retail & Restaurant Division, Lobster Place, spokesman for the Shellfish Growers Climate Coalition.

Natural climate solutions not only offer strong benefits for personal enjoyment, healthier water, air, wildlife, and soil; many are also quite affordable. As states and the federal government evaluate rules and markets for greenhouse gas emissions, these low-cost reductions from natural solutions offer the United States a powerful tool to address a warming planet.

Half of the world's annual precipitation falls in just 12 days, new study finds

Climate change likely to make global precipitation more uneven

PUBLIC RELEASE: 15-NOV-2018

Currently, half of the world's measured precipitation that falls in a year falls in just 12 days, according to a new analysis of data collected at weather stations across the globe.

By century's end, climate models project that this lopsided distribution of rain and snow is likely to become even more skewed, with half of annual precipitation falling in 11 days.

These results are published in Geophysical Research Letters, a journal of the American Geophysical Union.

Previous studies have shown that we can expect both an increase in extreme weather events and a smaller increase in average annual precipitation in the future as the climate warms, but researchers are still exploring the relationship between those two trends.

"This study shows how those two pieces fit together," said Angeline Pendergrass, a scientist at the National Center for Atmospheric Research (NCAR) and the lead author of the new study. "What we found is that the expected increases happen when it's already the wettest -- the rainiest days get rainier."

The findings, which suggest that flooding and the damage associated with it could also increase, have implications for water managers, urban planners, and emergency responders. The research results are also a concern for agriculture, which is more productive when rainfall is spread more evenly over the growing season.

The research was supported by the U.S. Department of Energy and the National Science Foundation, which is NCAR's sponsor.

What it means to be extreme

Scientists who study extreme precipitation -- and how such events may change in the future -- have used a variety of metrics to define what qualifies as "extreme." Pendergrass noticed that in some cases the definitions were so broad that extreme precipitation events actually included the bulk of all precipitation.

In those instances, "extreme precipitation" and "average precipitation" became essentially the same thing, making it difficult for scientists to understand from existing studies how the two would change independently as the climate warms.

Other research teams have also been grappling with this problem. For example, a recent paper tried to quantify the unevenness of precipitation by adapting the Gini coefficient, a statistical tool often used to quantify income inequality, to instead look at the distribution of rainfall.

Pendergrass wanted to find something even simpler and more intuitive that could be easily understood by both the public and other scientists. In the end, she chose to quantify the number of days it would take for half of a year's precipitation to fall. The results surprised her.

"I would have guessed the number would be larger -- perhaps a month," she said. "But when we looked at the median, or midpoint, from all the available observation stations, the number was just 12 days."

For the analysis, Pendergrass worked with Reto Knutti, of the Institute for Atmospheric and Climate Science in Zurich, Switzerland. They used data from 185 ground stations for the 16 years from 1999 through 2014, a period when measurements could be validated against data from the Tropical Rainfall Measuring Mission (TRMM) satellite. While the stations were dispersed globally, the majority were in North America, Eurasia, and Australia.

To look forward, the scientists used simulations from 36 of the world's leading climate models that had data for daily precipitation. Then they pinpointed what the climate model projections for the last 16 years of this century would translate to for the individual observation stations.

They found that total annual precipitation at the observation stations increased slightly in the model runs, but the additional precipitation did not fall evenly. Instead, half of the extra rain and snow fell over just six days.

This contributed to total precipitation also falling more unevenly than it does today, with half of a year's total precipitation falling in just 11 days by 2100, compared to 12 in the current climate.

"While climate models generally project just a small increase in rain in general, we find this increase comes as a handful of events with much more rain and, therefore, could result in more negative impacts, including flooding," Pendergrass said. "We need to take this into account when we think about how to prepare for the future."

The University Corporation for Atmospheric Research manages the National Center for Atmospheric Research under sponsorship by the National Science Foundation. Any opinions, findings and conclusions or recommendations expressed in this material do not necessarily reflect the views of the National Science Foundation.

About the article

Title: The uneven nature of daily precipitation and its change

Authors: Angeline G. Pendergrass and Reto Knutti

Journal: Geophysical Research Letters, DOI: 10.1029/2018GL080298

More Information and Graphic:

https://news.ucar.edu/132637/half-worlds-annual-precipitation-falls-just-12-days-new-study-finds

Study: Impact of mercury-controlling policies shrinks with every five-year delay

Toxin will accumulate in the environment, particularly in remote regions, as countries delay implementing emissions controls.

MASSACHUSETTS INSTITUTE OF TECHNOLOGY

PUBLIC RELEASE: 1-NOV-2018

Mercury is an incredibly stubborn toxin. Once it is emitted from the smokestacks of coal-fired power plants, among other sources, the gas can drift through the atmosphere for up to a year before settling into oceans and lakes. It can then accumulate in fish as toxic methylmercury, and eventually harm the people who consume the fish.

What's more, mercury that was previously emitted can actually re-enter the atmosphere through evaporation. These "legacy emissions" can drift and be deposited elsewhere, setting off a cycle in which a growing pool of toxic mercury can circulate and contaminate the environment for decades or even centuries.

A new MIT study finds that the longer countries wait to reduce mercury emissions, the more legacy emissions will accumulate in the environment, and the less effective any emissions-reducing policies will be when they are eventually implemented.

In a paper published in the journal Environmental Science and Technology, researchers have found that, for every five years that countries delay in cutting mercury emissions, the impact of any policy measures will be reduced by 14 percent on average. In other words, for every five years that countries wait to reduce mercury emissions, they will have to implement policies that are 14 percent more stringent in order to meet the same reduction goals.

The researchers also found that remote regions are likely to suffer most from any delay in mercury controls. Mercury contamination in these regions will only increase, mostly from the buildup of legacy emissions that have traveled there and continue to cycle through and contaminate their environments.

"The overall message is that we need to take action quickly," says study author Noelle Selin, associate professor in MIT's Institute for Data Systems and Society and Department of Earth, Atmospheric, and Planetary Sciences. "We will be dealing with mercury for a long time, but we could be dealing with a lot more of it the longer we delay controls."

Global delay

The Minamata Convention, an international treaty with 101 parties including the United States, went into effect in August 2017. The treaty represents a global commitment to protect human health and the environment by reducing emissions of mercury from anthropogenic sources. The treaty requires that countries control emissions from specific sources, such as coal-fired power plants, which account for about a quarter of the world's mercury emissions. Other sources addressed by the treaty include mercury used in artisanal and small-scale gold mining, nonferrous metals production, and cement production.

In drafting and evaluating their emissions-reducing plans, policymakers typically use models to simulate the amount of mercury that would remain in the atmosphere if certain measures were taken to reduce emissions at their source. But Selin says many of these models either do not account for legacy emissions or they assume that these emissions are constant from year to year. These measures also do not take effect immediately -- the treaty urges that countries take action as soon as possible, but its requirements for controlling existing sources such as coal-fired power plants allow for up to a 10-year delay.

"What many models usually don't take into account is that anthropogenic emissions are feeding future legacy emissions," Selin says. "So today's anthropogenic emissions are tomorrow's legacy emissions."

The researchers suspected that, if countries hold off on implementing their emissions control plans, this could result in the growth of not just primary emissions from smokestacks, but also legacy emissions that made it back into the atmosphere a second time.

"In real life, when countries say, 'we want to reduce emissions,' it usually takes many years before they actually do," says Hlne Angot, the study's first author and a former postdoc at MIT. "We wanted to ask, what are the consequences of delaying action when you take legacy emissions into account."

The legacy of waiting

The group used a combination of two models: GEOS-Chem, a global atmospheric model developed at MIT that simulates the transport of chemicals in the atmosphere around the world; and a biogeochemical cycle model that simulates the way mercury circulates in compartments representing global atmosphere, soil, and water.

With this modeling combination, the researchers estimated the amount of legacy emissions that would be produced in any region of the world, given various emissions-reducing policy timelines. They assumed a scenario in which countries would adopt a policy to reduce global mercury emissions by 50 percent compared to 2010 levels. They then simulated the amount of mercury that would be deposited in lakes and oceans, both from primary and legacy emissions, if such a policy were delayed every five years, from 2020 to 2050.
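The coupled models themselves are far more elaborate than can be shown here, but the legacy-emissions bookkeeping can be illustrated with a toy two-box sketch. Everything below (pool sizes, rate constants, the 50 percent cut) is invented purely for illustration and is not the study's GEOS-Chem or biogeochemical configuration:

```python
# Toy two-box mercury cycle: atmosphere <-> surface reservoir (soil + ocean).
# Illustrates how delayed emission cuts let "legacy" mercury accumulate and
# keep re-entering the atmosphere. All numbers are assumptions for illustration.
import numpy as np

def run_scenario(policy_start_year, years=np.arange(2010, 2101)):
    primary_baseline = 2000.0   # assumed anthropogenic emissions, tonnes/yr
    k_deposit = 1.0             # atmospheric pool deposits within about a year
    k_reemit = 0.01             # fraction of the surface pool re-emitted each year
    atmosphere, surface = 0.0, 5000.0   # assumed initial pools, tonnes
    deposition = []
    for year in years:
        cut = 0.5 if year >= policy_start_year else 1.0   # 50% cut once policy starts
        reemitted = k_reemit * surface                     # "legacy" emissions
        atmosphere += primary_baseline * cut + reemitted
        deposited = k_deposit * atmosphere
        atmosphere -= deposited
        surface += deposited - reemitted
        deposition.append(deposited)
    return np.array(deposition)

baseline = run_scenario(policy_start_year=2020)
for delay in (5, 10, 15):
    delayed = run_scenario(policy_start_year=2020 + delay)
    extra = delayed[-30:].mean() / baseline[-30:].mean() - 1.0
    print(f"{delay}-year delay -> ~{extra:.1%} more late-century deposition in this toy model")
```

Even in this crude sketch, every year of delayed cuts enlarges the surface pool, which then feeds back into the atmosphere for decades.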

In sum, they found that if countries were to delay by five, 10, or 15 years, any policy they would implement would have 14, 28, or 42 percent less of an impact, respectively, than if that same policy were put in place immediately.

"The longer we wait, the longer it will take to get to safe levels of contamination," Angot says.

Remote consequences

Based on their simulations, the researchers compared four regions located at various distances from anthropogenic sources: remote areas of eastern Maine; Ahmedabad, one of the largest cities in India, located near two coal-fired power plants; Shanghai, China's biggest city, which has elevated atmospheric mercury concentrations; and an area of the Southern Pacific known for its tuna fisheries.

They found that, proportionally, delays in mercury action had higher consequences in the regions that were farthest away from any anthropogenic source of mercury, such as eastern Maine -- an area that is home to several Native American tribes whose livelihoods and culture depend in part on the local fish catches.

Selin and Angot have been collaborating with members of these tribes, in a partnership that was established by MIT's Center for Environmental Health Sciences.

"These communities are trying to go back to a more traditional way of life, and they want to eat more fish, but they're contaminated," Angot says. "So they asked us, 'When can we safely eat as much fish as we want? When can we assume that mercury concentrations will be low enough so we can eat fish regularly?'"

To answer these questions, the team modeled the amount of fish contamination in eastern Maine that could arise from a buildup of legacy emissions if mercury-reducing policies are delayed. The researchers used a simple lake model, adapted and applied at MIT in collaboration with colleagues at Michigan Technological University, that simulates the way mercury circulates through a column that represents layers of the atmosphere, a lake, and the sediment beneath. The model also simulates the way mercury converts into methylmercury, its more toxic form that can bioaccumulate in fish.

"In general, we found that the longer we wait to decrease global emissions, the longer it will take to get to safe methylmercury concentrations in fish," Angot says. "Basically, if you are far away [from any anthropogenic source of mercury], you rely on everyone else. All countries have to decrease emissions if you want to see a decrease in contamination in a very remote place. So that's why we need global action."

This research was supported, in part, by the National Institute of Environmental Health Sciences, through a core grant to MIT's Center for Environmental Health Sciences, and by the NIEHS Superfund Basic Research Program.

PAPER: "Global and local impacts of delayed mercury mitigation efforts." https://pubs.acs.org/doi/10.1021/acs.est.8b04542

ARCHIVE: Health benefits will offset cost of China's climate policy http://news.mit.edu/2018/study-health-benefits-will-offset-cost-china-climate-policy-0423

ARCHIVE: Noelle Selin: Tracing toxins around the world http://news.mit.edu/2017/faculty-profile-noelle-selin-1023

ARCHIVE: Modeling the unequal benefits of U.S. environmental policy http://news.mit.edu/2017/modeling-unequal-benefits-us-environmental-policy-0215

ARCHIVE: Global reductions in mercury emissions should lead to billions in economic benefits for U.S. http://news.mit.edu/2015/reductions-mercury-billions-economic-benefits-1228

Reducing methane emissions can play a key role in reducing ozone worldwide

EUROPEAN COMMISSION JOINT RESEARCH CENTRE

Methane is a climate pollutant that leads to the production of ozone with serious health and environmental impacts

60% of all methane emissions originate from the energy, waste and agriculture sectors

Targeting these three sectors with methane reduction policies can lead to significant reductions in overall methane emissions

Methane (CH4) is the main ingredient in natural gas. It is the second most important greenhouse gas (GHG) after carbon dioxide (CO2), and it also leads to the formation of another GHG - ozone.

Ozone has harmful effects for people, ecosystems and agricultural productivity. It is a so-called "short-lived climate forcer". This term refers to pollutants that remain in the atmosphere for a much shorter period of time than CO2 but have a much greater potential to warm the atmosphere.

The life span of short-lived climate pollutants is usually less than 15 years, unlike CO2 which stays in the atmosphere for about 100 years.

Methane emissions are increasing

Since the pre-industrial era, methane concentrations have more than doubled. After a period of stagnation, they have been rising again over the last decade.

"Worldwide, methane emissions increased by 17% between 1990-2012, compared to a 53% increase in CO2 emissions. The methane emissions of the EU28 and the contributions of methane to the overall EU GHG emissions declined substantially in the 1990s, but in the last 15 years the rate of decline has been much less", explains JRC researcher Rita Van Dingenen.

If nothing is done about reducing methane emissions worldwide, they could cause between 40,000 and 90,000 more premature deaths globally by 2050, compared to the present situation.

Solutions within our reach

A new JRC study shows that man-made methane emissions are on an unsustainable path, but that there are cheap and even profitable options to reduce emissions in a relatively short time frame.

"About 60% of the global methane emissions originate from agriculture, landfills and wastewater, and the production and transport of fossil fuels. Targeting these three sectors can bring a significant reduction in the overall methane emissions and ozone concentrations globally", said Rita who presented the JRC report today at the WHO global conference on air pollution and health.

The JRC report shows that there is a substantial global mitigation potential in these three sectors. In particular, important emissions reductions can be obtained by:

Pursuing efforts to lower energy consumption, substitute fossil fuels, upgrade old gas and oil production, and gas distribution infrastructure to reduce unintended leakage.

Enforcing maximum waste separation and treatment, and not using landfill for biodegradable waste. The global abatement potential in the solid waste landfill sector is estimated at approximately 61% of baseline emissions by 2030, of which 12% could be achieved at relatively low or zero cost.

Improving the sanitary standards in developing countries and implementing western standards for wastewater sanitation.

Following FAO recommendations to improve animal health and efficiency of milk and meat production. Ruminant enteric fermentation - an important source of CH4 - can also be reduced, for instance, through adjustment of animals' diets and vaccination.

Changing dietary habits by reducing meat and dairy consumption, which would also bring additional health benefits.

Scientists also note that there are big differences in the methane emissions from the waste and fossil fuel production sectors between developed and developing countries. Investing in efforts to align developing regions with the Sustainable Development Goals (SDGs) can unlock a huge potential for emissions reduction.

A global challenge

The EU is spearheading global efforts to fight climate change and reduce GHG emissions. The bloc is set to meet its 2020 target to reduce GHG emissions by at least 20% compared to 1990 - and has raised this target to at least 40% by 2030. The 2030 target is the basis of the EU's commitment to the 2015 Paris Agreement, and the legislative framework for implementing it has already been adopted.

In its Declaration on the Review of Methane Emissions, the European Commission also stated its intention to review methane emissions in the context of assessing options to further reduce ozone concentrations in the EU, and to promote methane reductions internationally.

However, Europe's contribution to global CH4 emissions is currently only about 6%. Reducing methane emissions in Europe only is not enough to make a difference. Global cooperation to reduce methane emissions is essential - not only for the climate but also to prevent air pollution.

International climate agreements are thus an important means to reduce CH4 emissions. Reaching the emission reduction targets included in the Paris Agreement would reduce global CH4 emissions substantially. This would mean that the exposure of the global and European populations to ozone would remain at the 2010 levels.

"EPA Plans Soil Removal At Lead-Tainted Indiana Complex"

"EAST CHICAGO, Ind. — The U.S. Environmental Protection Agency is moving ahead with plans for a 2-foot-deep removal of lead- and arsenic-contaminated soil at the site of a northwestern Indiana public housing that’s been evacuated and demolished over health concerns.

The agency estimates the cleanup project at East Chicago’s West Calumet Housing Complex will cost about $26 million. It would involve removing more than 160,000 cubic yards of soil contaminated over decades by a lead-products factory and replacing it with clean soil and seed or sod.

More than 1,000 people, including about 700 children, were forced from the housing complex after 2016 tests found high lead levels in blood samples from some children and some yards with lead levels over 70 times the U.S. safety standard."

U.S. plans new limits on heavy-duty truck emissions

David Shepardson, November 12, 2018 / 11:55 AM

WASHINGTON (Reuters) - The U.S. Environmental Protection Agency will announce plans to propose new rules to significantly decrease emissions of smog-forming nitrogen oxide from diesel-powered heavy-duty trucks, an agency official said.

The U.S. Environmental Protection Agency (EPA) sign is seen on the podium at EPA headquarters in Washington, U.S., July 11, 2018. REUTERS/Ting Shen

Industry groups and state environmental officials have urged the EPA to set new nationwide rules as the state of California has been moving forward with plans to set new state emissions limits. California also wants nationwide rules, in part because more than half of all trucks delivering goods in the state are registered in other states.

The EPA said in a statement it had scheduled a formal announcement on Tuesday with industry executives and state environmental officials regarding its “Cleaner Trucks Initiative,” but did not immediately disclose details. The effort to impose a new regulatory limit by the EPA comes as the Trump administration has generally touted its efforts to eliminate regulations. But the effort on nitrogen oxide (NOx) is backed by industry, which wants to avoid a patchwork of federal and state standards, the official said.

The official asked not to be identified because the announcement was still pending.

In December 2016, responding to petitions to impose new standards, the Obama-led EPA acknowledged “a need for additional NOx reductions from on-highway heavy-duty engines, particularly in areas of the country with elevated levels of air pollution” and said it planned to propose new rules that could begin in the 2024 model year.

Local and state air quality and other agencies, including New York City, New Hampshire, Rhode Island, Los Angeles and Washington state, had petitioned for the rules.

Another administration official said Monday the new proposed emissions rules may not be written and announced until 2020.

Nitrogen oxide emissions are linked to significant health impacts and can exacerbate asthma attacks, the EPA has said.

The current heavy-duty truck rules for NOx were adopted in 2000 and took effect over the following decade.

In the aftermath of Volkswagen AG’s (VOWG_p.DE) light-duty diesel emissions scandal, in which the German automaker admitted to secretly using software to evade emissions rules, the EPA has taken steps to ensure that diesel cars and SUVs are meeting emissions requirements in on-road use.

The new NOx heavy-duty truck rules may also include new tests or other regulatory steps to ensure that vehicles and their engines are complying during real-world driving, the official said.

Reporting by David Shepardson; Editing by Steve Orlofsky and Tom Brown

Are artisanal foodie brands ruining a California national park?

Herds of cows provide meat and dairy for influential purveyors, but environmentalists say they despoil the landscape

Laura Fraser in Pt Reyes Station, California

Fri 9 Nov 2018 06.00 EST Last modified on Fri 9 Nov 2018 19.08 EST

An hour north of San Francisco lie two-dozen dairy and meat farms that have produced some of the most beloved artisanal brands in northern California – along with a farm-fresh, locally sourced foodie ethos that has become globally influential.

All the dairies in Point Reyes are organic, and the beef is grass-fed. They are models of sustainable farming, providing the raw ingredients for cheesemaker Cowgirl Creamery, the Straus Family Creamery, and Marin Sun Farms meats, to name a few.

Yet some national environmentalists are taking a stand against these ranchers, who have farmed for generations on grasslands that are now part of Point Reyes National Seashore, claiming they are despoiling a landscape visited by 2.5 million people every year and should be ejected. Cows, the environmentalists argue, do not belong in a national park.

“Visitors come to Point Reyes seeking a wild part of the coastline of California, in order to view wildlife, walk on sand beaches, and tour dramatic ocean cliffs unhindered by private property and development,” said Erik Molvar, executive director of Western Watersheds Project. “They do not come here to see herds of cattle on overgrazed weed plots.”

His Idaho-based group is one of three that sued the National Park Service in 2016, claiming that Point Reyes cows were causing environmental damage, interfering with recreation and harming the herds of tule elk that roam the landscape.

Now a local Democratic congressman, Jared Huffman, has teamed up with an unlikely Republican ally from Utah, Rob Bishop, to introduce legislation to protect the ranchers, whose leases are expiring. Huffman has proposed extending them for 20 years. While Huffman’s environmental record is pristine, Bishop’s decidedly is not; he led efforts to eliminate Utah’s Bears Ears and Grand Staircase-Escalante National Monuments, opening them to the fossil fuel industry.

“Multi-generational ranching in a small portion of the seashore is part of the culture and the landscape and the character that was always meant to be protected,” Huffman said.

Sue Conley, one of the founders of Cowgirl Creamery, agreed. “The ranches have contributed significantly to the sustainable food scene” in the area, she said. “It’s a great model to have working farms in a national seashore, connecting consumers with farmers. There’s a consciousness that comes from being around nature and farming that’s really important to urban life.”

Lying along the San Andreas fault, Point Reyes National Seashore is a foggy 111-square-mile peninsula of wilderness, grasslands, rocky windswept beaches – and dairy and cattle ranches. Cows and hikers have coexisted uneasily at Point Reyes since 1962, when President John F Kennedy designated it a national seashore. To create the park, the government bought the ranches, which had been there since the Gold Rush, leasing them back long-term to families who had been working them for generations. It was a lifeline for the small producers, who faced threats including commercial agricultural enterprises and urban sprawl.

The argument over cattle in Point Reyes actually started with oysters, which had been farmed in an estuary in the national park since the 1930s. Like the ranchers, a local company also had a long-term lease – and when it expired, chef Alice Waters, Senator Dianne Feinstein and author Michael Pollan sided with the oysterers in their bid to stay put. Nevertheless, in 2012, the then–US interior secretary, Ken Salazar, decided that the oyster lease, which was in an area designated as wilderness, would not be renewed.

Environmentalists started to wonder: if oysters could be evicted, why not cows?

Last week Bob McClure surveyed a rare sunny sky and two tule elk bulls in the distance on his grazing land. “We’ve been led to believe that we would always stay here,” said McClure, a fourth-generation farmer whose family gave McClure Beach to the national seashore before the second world war. His 1,400-acre dairy is perched above a hiking trail and hills that tumble to a seaside lagoon. “We’ve worked cooperatively with environmentalists for 50 years,” he said. “We get that this is public ground, and we’re stewards of it.”

Both sides argue over the original intent of Congress in setting up this unusual park with historic ranches within its boundaries. “This wasn’t supposed to be Yellowstone or Yosemite,” said environmental writer John Hart, who has written histories of the area. The original plan was for the seashore to be developed as a sprawling beach recreation area, crisscrossed with roads, boardwalks and beach businesses.

Regional environmental and agricultural groups have mainly sided with the ranchers, whose organic, sustainable products are mainstays of the vibrant local food scene. They say that efforts to rid national parks of cattle may make sense elsewhere in the arid west, but Point Reyes has been grasslands for centuries owing to grazing by tule elk.

“The ranchers are making improvements that actually help protect and restore the lands,” said Linda Novy, president of the Marin Conservation League, which supports Huffman’s bill. “By grazing, the cattle maintain the rare coastal prairie ecosystem.” Without the cattle, the grasslands that characterize the hills of Point Reyes would quickly turn into dense coyote brush.

David Lewis, county director of the UC Cooperative Extension in Marin County, estimates that the Point Reyes ranches contribute as much as 20% of the county’s $110m in annual agricultural production. Given the industries that support agriculture – feed companies, veterinary services, a grass-fed beef butchery – the overall economic output of the ranches may be three times that amount. If the ranches closed, Lewis says, “You’d be losing about $60m a year in production.” The ranchers also contribute more than 5,000 jobs in the region, on and off the farms.

Albert Straus, founder and CEO of Straus Family Creamery, an organic local dairy that supplies milk to Cowgirl Creamery and others, said the ranchers are the linchpin of the local agricultural economy. Without them, the viability of the small farms in the county is at risk, he said, because the county needs a critical mass of ranches to get essential services, such as feed deliveries. “Eliminating or reducing the farms in the park would have a huge impact on the few farms that are left” in the area, he said. “It would devastate our dwindling schools and community.”

Cheeses are being made at Cowgirl Creamery inside the Tomales Bay Foods company in Point Reyes. Photograph: Talia Herman for the Guardian

For the national environmental groups, though, these commercial concerns are secondary to protecting wilderness. They raise eyebrows at the fact that Huffman reached across the aisle to a Republican like Bishop to introduce the bill. “It’s shocking that Representative Huffman would team up with anti-public lands zealot Rob Bishop to undermine the public planning process already under way at Point Reyes, hand these public lands over to ranch and dairy operations that have already been paid to get off the park, and turn a blind eye to the ecological damage this land use is doing to this special place,” said Chance Cutrano, director of Strategy at Resource Renewal Institute.

Whether or not the legislation passes, the National Park Service is continuing its own long-term planning process. While options for ridding the park of ranchers are on the table, its initial proposal to the public is to give the existing ranchers 20-year leases.

As his cows munched on grass, McClure, the rancher, suggested a solution that sounded both old-world and downright practical.

“We need to collaborate,” he said. “We all need to sit down at the Point Reyes fire station and hammer these things out.”

This story has been published in collaboration with Pacific Standard and the Missoulian.

City tests confirm some Chicago homes with water meters have lead in tap water

City Hall under Mayor Rahm Emanuel has denied for years that Chicago has a widespread problem with lead in its tap water.

John Byrne and Michael Hawthorne

Chicago Tribune

City testing of Chicago homes with water meters found nearly 1 in 5 sampled had brain-damaging lead in their tap water, but Mayor Rahm Emanuel’s water commissioner acknowledged Thursday that the city continued installing new meters after learning about the alarming results in June.

Disclosure of the previously secret study of 296 metered homes comes after more than five years of denials by Emanuel and his aides that the nation’s third-largest city has a widespread lead problem, even as the scandal in Flint, Mich., drew national attention to the hazards and other research in Chicago consistently found the toxic metal in drinking water.

The Emanuel administration’s sudden reversal, outlined at a hastily organized City Hall news conference, adds Chicago to a growing list of cities that are distributing water filters to homes with lead service lines, which in Chicago were required by the city’s plumbing code until Congress banned the practice in 1986.

Randy Conner, the city’s water commissioner, and Julie Morita, the health commissioner, said all 165,000 Chicago homes with water meters are eligible for city-provided water filters. Money collected through water bills will cover the cost of $60 kits that include a pitcher and six replacement filters, Conner said.

“It was just determined that this was the appropriate way of action between myself, Morita and the scientists,” Conner said when asked why the city took so long to address the well-documented health risks.

The Chicago Tribune first reported in 2013 that the city water department and the U.S. Environmental Protection Agency had found high levels of lead in Chicago tap water after lead service lines had been disturbed by street work or plumbing repairs, including the installation of water meters.

Emanuel dramatically expanded that type of work after taking office in 2011. His administration has borrowed more than $481 million for water conservation projects, including the installation of household meters and new water mains citywide. The city has steadily raised water rates to pay back the 20-year loans.

None of the money has been earmarked to replace lead service lines.

In April, a Tribune analysis revealed that lead was found in water samples drawn from nearly 70 percent of the 2,797 homes that returned free testing kits provided by the city during the past two years. The toxic metal turned up in samples collected throughout the city, the newspaper found. Tap water in 3 of every 10 homes tested had lead concentrations above 5 parts per billion, the maximum allowed in bottled water by the U.S. Food and Drug Administration.

As recently as September, mayoral aides and water department officials continued to insist that it is up to individual homeowners to protect themselves from mostly invisible particles leaching out of lead pipes the city required by law for decades. On Wednesday, Emanuel himself declared Chicago’s drinking water is safe while opposing plans introduced in the City Council to finance the replacement of lead service lines. He accused the measure’s authors of treating homeowners “as an ATM machine” by proposing to pay for the project with a 1 percent tax on sales of Chicago homes worth more than $750,000.

“I believe in science in forming good policy decisions,” Emanuel said.

A day later, Conner and Morita announced the city would begin distributing water filters shortly before the water commissioner was scheduled to appear at a City Council budget hearing, a setting during which aldermen could slam the Emanuel administration for not doing more about the lead problem.

Revealing the plan to distribute water filters provided Conner with a response to blunt the criticism. Yet there is no guarantee any other action will be taken beyond conducting another study.

“What we’re committed to doing is taking a look at this thing holistically, and understanding what this is going to take to tackle this issue, from the feasibility, the framework and a funding perspective,” Conner said about a $750,000 contract with the global engineering firm CDM Smith, which is required to submit a new review of the municipal water system before Emanuel leaves office in the spring.

Lead is unsafe at any level, according to the EPA and the Centers for Disease Control and Prevention. Emanuel and his aides this week continued their technically true but misleading defense that Chicago drinking water is safe because it meets federal standards.

Water utilities are considered to be in compliance with federal water quality regulations as long as 90 percent of the homes tested have lead levels below 15 ppb, a 1991 standard the EPA acknowledges is based not on the dangers of lead but on the agency's judgment that the limit could be met with corrosion-inhibiting chemicals.
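For readers unfamiliar with how that 90 percent test is usually summarized, here is a rough sketch using a simple nearest-rank 90th percentile; the sample values are invented and the regulation specifies its own exact calculation procedure:

```python
# Rough illustration of the compliance test described above: a water system is
# treated as meeting the action level if its 90th-percentile sample is below
# 15 ppb. Sample values are invented; the rule defines its own exact procedure.
lead_ppb = [1.2, 0.0, 3.4, 2.1, 16.8, 4.5, 0.9, 7.2, 2.8, 5.0]  # tap samples, ppb

def ninetieth_percentile(samples):
    ordered = sorted(samples)
    rank = max(1, round(0.9 * len(ordered)))   # simple nearest-rank approximation
    return ordered[rank - 1]

p90 = ninetieth_percentile(lead_ppb)
status = "within the 15 ppb action level" if p90 < 15 else "exceeds the 15 ppb action level"
print(f"90th-percentile lead: {p90} ppb -> {status}")
```

Note how, under this kind of test, up to one home in ten can show very high lead and the system still counts as compliant.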

Chicago conducts this type of testing in just 50 homes every three years — the minimum required — and typically doesn’t find anything wrong. Most of the Chicago homes tested for regulatory purposes during the past decade were owned by water department employees or retirees living on the Far Northwest and Far Southwest sides.

Aldermen who have been calling for the removal of lead service lines held their own City Hall news conference Thursday. They criticized the Emanuel administration for not immediately publicizing the results of the city’s study of metered homes and for failing to stop installing meters after learning the work could be putting Chicagoans at risk.

“It’s dangerous, it’s irresponsible and it’s unacceptable,” said Ald. Chris Taliaferro, 29th.

“I think the lack of transparency and the communication is what’s lacking here,” said Ald. Gilbert Villegas, 36th, one of the sponsors of the proposed transfer tax Emanuel opposes. “They need to do a better job. Let’s get ahead of this thing.”

Paul Vallas, one of the candidates to replace Emanuel as mayor, has urged Illinois Attorney General Lisa Madigan to investigate.

“Since June, I have been calling on the city to take more aggressive action to address our lead in the drinking water problem, but the Emanuel administration has dismissed me as a panic peddler,” Vallas said in a statement that accused the mayor and his aides of “an unbelievable level of cynicism” in their public statements on the issue.

Morita, the city health commissioner, noted that the number of Chicago children with elevated levels of lead in their blood has steadily declined citywide for years. “First and foremost, there is no public health crisis,” she said.

Most lead exposure comes from ingesting dust in homes built before 1978 with lead-based paint. A 2015 Tribune investigation found that while the rate of childhood lead poisoning has declined citywide, more than a fifth of the children tested in some of the poorest parts of Chicago still had levels of the toxic metal in their blood that exceeded CDC guidelines.

The city later announced it would begin testing water in the homes of poisoned children for the first time.

jebyrne@chicagotribune.com

mhawthorne@chicagotribune.com

"Fast-Rising Demand for Air Conditioning Is Adding to Global Warming"

"With window units set to more than triple by 2050, home air conditioning is on pace to add half a degree Celsius to global warming this century, a new report says."

"Increasing demand for home air conditioning driven by global warming, population growth and rising incomes in developing countries could increase the planet's temperatures an additional half a degree Celsius by the end of the century, according to a new report by the Rocky Mountain Institute.

The demand is growing so fast that a "radical change" in home-cooling technology will be necessary to neutralize its impact, writes RMI, an energy innovation and sustainability organization.

The problem with air-conditioning comes from two sources: the amount of energy used, much of which is still powered by carbon-emitting coal, oil and gas generation, and the leaking of hydrofluorocarbon (HFC) coolants, which are short-lived climate pollutants many times more potent than carbon dioxide."

More Evidence Points to China as Source of Ozone-Depleting Gas

By Chris Buckley

Nov. 3, 2018

BEIJING — An environmental group says it has new evidence showing that China is behind the resurgence of a banned industrial gas that not only destroys the planet’s protective ozone layer but also contributes to global warming.

The gas, trichlorofluoromethane, or CFC-11, is supposed to be phased out worldwide under the Montreal Protocol, the global agreement to protect the ozone layer. In May, however, scientists published research showing that CFC-11 levels in the atmosphere had begun falling more slowly. Their findings suggested significant new emissions of the gas, most likely from East Asia.

Evidence then uncovered by The New York Times and the Environmental Investigation Agency pointed to rogue factories in China as a likely major source.

Now, the E.I.A. has prepared a report that it says bolsters the finding that Chinese factories are behind the return of CFC-11.

Independent laboratory tests “clearly confirm the use of CFC-11 in three enterprises” in China, the agency said in the report. It plans to submit the work this week in Quito, Ecuador, where delegates from nearly 200 countries are attending a Montreal Protocol meeting on the status of efforts to repair the ozone layer.

Avipsa Mahapatra, head of the climate change campaign at the E.I.A., said the Chinese authorities should make thorough regulatory changes that make underground CFC-11 production impossible. “Simply clamping down a few enterprises without systemic changes could mean that similar illegal enterprises pop up in other regions,” she said in an email.

But definitive answers and solutions to the problem of CFC-11 appear to be some way off.

Chinese officials have said they have already acted vigorously to close rogue chemical makers. They have also asserted that the CFC-11 emissions in question are too large to be solely from those operations.

Tracing the source of a banned chemical

Scientists have said that they need more time and data to pin down the causes of the CFC-11 resurgence.

“When it comes to definitive answers, I think we have to first emphasize that this mystery has yet to be solved,” said Keith Weller, a spokesman for the United Nations Environment Program, which helps organize the ozone layer talks.

The CFC-11 mystery has wide implications. The ozone layer has been healing, but the return of a banned substance is an alarming breach in one of the world’s most effective environmental pacts and could slow the layer’s recovery.

CFC-11 is also a potent greenhouse gas. If it is leaking directly into the atmosphere from factories, even more gas may be held in the products made in those factories — for example, in insulation foam — and may enter the atmosphere when those products are eventually destroyed.

Scientists discovered decades ago that CFC-11 and other manufactured chemicals used as refrigerants and aerosols and in the production of insulating foams were destroying the ozone layer, which shields humans, crops and animals from the most damaging solar rays.

In 1987, countries agreed on the Montreal Protocol to phase out such gases, steadily replacing them with ever safer substitutes. The protocol has been praised as a model environmental initiative.

The Chinese government has said it will investigate and stamp out any illicit production of CFC-11, and Chinese industrial associations have vowed that their businesses will not use the chemical.

Officials announced last month that the police had broken up an illegal CFC-11 plant in Henan, a rural heartland province, and found more than 30 metric tons of the chemical on the site, according to an official report.

A spokesman for the Chinese Ministry of Ecology and Environment, Liu Youbin, told a news conference on Wednesday that inspectors had checked 1,172 businesses over recent months and found evidence of CFC-11 in only 10.

“If it was just those small, illegal roaming producers, the volume could not be that much,” Chen Liang, an official with the ministry who oversees international cooperation, including in ozone layer policy, said in an interview.

Mr. Weller also said the estimated new emissions of CFC-11, on the order of roughly 13,000 metric tons per year, appeared to be too great to come from illegal production alone.

Yet Mr. Chen also said there were daunting barriers to regulating China’s vast numbers of chemical and foam-making businesses. By his count, there were about 3,000 businesses in the foam sector. But the numbers of scattered, under-the-radar plants could be much higher.

Furthermore, Mr. Chen said, local inspectors across China can lack the equipment to quickly measure levels of CFC-11 or the chemicals used to make it. Fly-by-night chemical producers were hard to uncover and punish, he added.

“After they finish up production, everyone leaves,” Mr. Chen said. “This is very difficult to attack.”

CFC-11 was also once widely used as a refrigerant for fridges and air-conditioners, and Mr. Chen said that the pickup in CFC-11 might come from leakage when appliances were improperly scrapped. But an expert panel that advises Montreal Protocol member governments has assembled numbers that put into doubt the assertion that leakage can explain the persistent CFC-11, at least on its own.

The panel noted that it would take the destruction of 13 million large fridges a year to account for 13,000 tons of CFC-11 released in the atmosphere. China, where fridges tend to be smaller, disposes of about 1.5 million of them a year, the panel said.
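The panel's arithmetic is simple enough to spell out. The per-fridge figure below is an assumption implied by the numbers quoted, not a value taken from the report:

```python
# If a large fridge holds roughly 1 kg of CFC-11 in its insulation foam and
# refrigerant (an assumed figure implied by the panel's comparison), then
# accounting for 13,000 tonnes of releases per year would require scrapping:
cfc11_tonnes_per_year = 13_000
kg_per_fridge = 1.0   # assumption for illustration
fridges_per_year = cfc11_tonnes_per_year * 1000 / kg_per_fridge
print(f"about {fridges_per_year / 1e6:.0f} million fridges per year")   # ~13 million
```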

The delegates meeting in Ecuador will receive a new United Nations assessment of the health of the ozone layer that confirms the resurgence of CFC-11 emissions, and some are likely to press China for more answers. But scientists have said it will take longer before they can confidently track down the source or sources of the pollution.

In September, a team of scientists published research confirming that China has been emitting unaccountably high volumes of carbon tetrachloride, an ozone-destroying chemical with uses that include the production of CFC-11. But Mark Lunt, a researcher at the University of Edinburgh who was a co-author of the study, stressed it was too early to identify the precise sources of the carbon tetrachloride and say if there was a link with making CFC-11.

The team of scientists would soon submit a research paper on CFC-11, Mr. Lunt said. Other efforts to study the resurgence of CFC-11 are also afoot.

“There’s a huge variety of questions we need to answer,” Paul A. Newman, chief scientist for earth sciences at NASA’s Goddard Space Flight Center, said in a telephone interview. “I just think there’s a real lack of information at this point.”

Henry Fountain contributed reporting from New York.

Chris Buckley covers China, where he has lived for more than 20 years after growing up in Australia. Before joining The Times in 2012, he was a correspondent for Reuters. @ChuBailiang

Bitcoin can push global warming above 2 degrees C in a couple decades

PUBLIC RELEASE: 29-OCT-2018

It alone could produce enough emissions to raise global temperatures by 2 degrees C as soon as 2033

UNIVERSITY OF HAWAII AT MANOA

A new study published in the peer-reviewed journal Nature Climate Change finds that if Bitcoin is adopted at rates similar to those at which other technologies have been incorporated, it alone could produce enough emissions to raise global temperatures by 2C as soon as 2033.

"Bitcoin is a cryptocurrency with heavy hardware requirements, and this obviously translates into large electricity demands," said Randi Rollins, a master's student at the University of Hawaii at Manoa and coauthor of the paper.

Purchasing with bitcoins and several other cryptocurrencies, which are forms of currency that exist digitally through encryption, requires large amounts of electricity. Bitcoin purchases create transactions that are recorded and processed by a group of individuals referred to as miners. Miners group every Bitcoin transaction made during a specific timeframe into a block. Blocks are then added to the chain, which is the public ledger. The verification process by miners, who compete to decipher a computationally demanding proof-of-work in exchange for bitcoins, requires large amounts of electricity.
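As a loose illustration of why that verification work is so electricity-hungry, here is a toy proof-of-work search. It is not Bitcoin's actual double-SHA-256 protocol, and the difficulty is kept tiny so the snippet finishes quickly:

```python
# Toy proof-of-work: search for a nonce whose hash of (block data + nonce)
# falls below a target. Bitcoin's real difficulty is astronomically higher,
# which is what drives the network's electricity demand.
import hashlib

def mine(block_data: str, difficulty_bits: int = 16) -> int:
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while int(hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest(), 16) >= target:
        nonce += 1
    return nonce

print("found nonce:", mine("example transactions for one block"))
```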

The electricity requirements of Bitcoin have created considerable difficulties, and extensive online discussion, about where to put the facilities, or mining rigs, that compute Bitcoin's proof-of-work. A somewhat less discussed issue is the environmental impact of producing all that electricity.

A team of UH Manoa researchers analyzed information such as the power efficiency of computers used by Bitcoin mining, the geographic location of the miners who likely computed the Bitcoin, and the CO2 emissions of producing electricity in those countries. Based on the data, the researchers estimated that the use of bitcoins in the year 2017 emitted 69 million metric tons of CO2.
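In back-of-the-envelope form, that estimate multiplies electricity use by the carbon intensity of the grids supplying it. The inputs below are invented placeholders, not the study's measured hardware efficiencies or per-country grid data:

```python
# Schematic version of the estimate: emissions = hashes x energy per hash x grid
# carbon intensity. All three inputs below are assumptions for illustration only.
network_hashrate = 25e18        # hashes per second (assumed)
joules_per_hash = 1e-10         # assumed fleet-average miner efficiency
kg_co2_per_kwh = 0.5            # assumed average grid carbon intensity

seconds_per_year = 3600 * 24 * 365
annual_kwh = network_hashrate * joules_per_hash * seconds_per_year / 3.6e6
annual_mt_co2 = annual_kwh * kg_co2_per_kwh / 1e9   # kg -> million tonnes
print(f"~{annual_kwh / 1e9:.0f} TWh/yr and ~{annual_mt_co2:.0f} Mt CO2/yr in this rough sketch")
```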

Researchers also studied how other technologies have been adopted by society, and created scenarios to estimate the cumulative emissions of Bitcoin should it grow at the rate that other technologies have been incorporated.

The team found that if Bitcoin is incorporated, even at the slowest rate at which other technologies have been incorporated, its cumulative emissions will be enough to warm the planet above 2C in just 22 years. If incorporated at the average rate of other technologies, it is closer to 16 years.

"Currently, the emissions from transportation, housing and food are considered the main contributors to ongoing climate change. This research illustrates that Bitcoin should be added to this list," said Katie Taladay, a UH Manoa master's student and coauthor of the paper.

"We cannot predict the future of Bitcoin, but if implemented at a rate even close to the slowest pace at which other technologies have been incorporated, it will spell very bad news for climate change and the people and species impacted by it," said Camilo Mora, associate professor of Geography in the College of Social Sciences at UH Manoa and lead author of the study.

"With the ever-growing devastation created by hazardous climate conditions, humanity is coming to terms with the fact that climate change is as real and personal as it can be," added Mora. "Clearly, any further development of cryptocurrencies should critically aim to reduce electricity demand, if the potentially devastating consequences of 2C of global warming are to be avoided."

Study reconciles persistent gap in natural gas methane emissions measurements

PUBLIC RELEASE: 29-OCT-2018

The Colorado State University-led study resulted from a large, multi-institutional field campaign

A new study offers answers to questions that have puzzled policymakers, researchers and regulatory agencies through decades of inquiry and evolving science: How much total methane, a greenhouse gas, is being emitted from natural gas operations across the U.S.? And why have different estimation methods, applied in various U.S. oil and gas basins, seemed to disagree?

The Colorado State University-led study, published Oct. 29 in Proceedings of the National Academy of Sciences, resulted from a large, multi-institutional field campaign called the Basin Methane Reconciliation Study. The researchers found that episodic releases of methane that occur mostly during daytime-only maintenance operations, at a few facilities on any given day, may explain why total emissions accountings have not agreed in past analyses.

With invaluable assistance from industry partners, the researchers have significantly advanced basin-level emission quantification methods and shed new light on important emissions processes.

"Our study is the first of its kind, in its scope and approach," said Dan Zimmerle, senior author of the PNAS study, and a senior research associate at the CSU Energy Institute. "It utilized concurrent ground and aircraft measurements and on-site operations data, and as a result reduces uncertainties of previous studies."

The Basin Methane Reconciliation Study included scientists from CSU, Colorado School of Mines, University of Colorado Boulder, the National Oceanic and Atmospheric Administration, and the National Renewable Energy Laboratory. Other scientific partners were University of Wyoming, Aerodyne, AECOM, Scientific Aviation and GHD. The field campaign took place in 2015 in the Fayetteville shale gas play of Arkansas' Arkoma Basin.

The campaign involved more than 60 researchers making coordinated facility- and device-level measurements of key natural gas emissions sources. The campaign also included a series of aircraft flyovers to collect measurements during the same period when researchers were taking measurements on the ground. The flights took place when meteorological conditions allowed accurate regional emissions estimates.

The research team set out to investigate the persistent gap between two widely used methods of estimating methane emissions from natural gas operations. "Bottom-up" estimates, such as those used in the EPA Inventory of U.S. Greenhouse Gas Emissions and Sinks, are developed by measuring emissions from a representative sample of devices, scaled up by the number of devices or emission events. In contrast, "top-down" measurements can be performed at a regional scale, such as flying an aircraft upwind and downwind of a study area to derive total emissions from methane entering or leaving a basin.

In the past, most aircraft-based, basin-scale emissions estimates have been statistically higher than estimates based on bottom-up accounting.
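The contrast between the two approaches can be sketched schematically. Every number below is invented for illustration and does not come from the Fayetteville campaign:

```python
# Bottom-up: per-device emission factors scaled by device counts.
emission_factor_kg_per_hr = {"well_pad": 0.8, "gathering_station": 6.0, "pipeline_km": 0.05}
device_count = {"well_pad": 5000, "gathering_station": 90, "pipeline_km": 8000}
bottom_up = sum(emission_factor_kg_per_hr[k] * device_count[k] for k in device_count)

# Top-down: aircraft mass balance, downwind methane flux minus upwind background.
downwind_flux = 11000.0   # kg/hr, assumed from downwind transects
upwind_flux = 4500.0      # kg/hr, assumed background entering the basin
top_down = downwind_flux - upwind_flux

print(f"bottom-up: {bottom_up:.0f} kg/hr vs top-down: {top_down:.0f} kg/hr")
# A midday flight that coincides with episodic maintenance venting would push
# the top-down figure above an annual-average bottom-up inventory, which is the
# kind of gap the study set out to explain.
```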

"The key to our efforts was having everyone out in the same field, at the same time," said Gabrielle Petron, a research scientist from CIRES at CU Boulder and NOAA who was the principal investigator for the top-down measurement team. "By comparing the merits and pitfalls of multiple methods of measurement, we were able to paint a much more comprehensive, and we believe accurate, picture of the methane emissions landscape for natural gas infrastructure."

The PNAS paper utilized a comprehensive set of results published in lead-up papers evaluating emissions from natural gas facilities, including well pads, gathering stations, gathering pipelines, and transmission and distribution sectors. The teams made simultaneous measurements using multiple methods at well pads and compressor stations - the largest emissions sources identified in the basin - which highlighted the strengths and weaknesses of various on-site and downwind methods. Natural gas distribution systems were only a small fraction of total natural gas emissions. The team also estimated methane emissions from biogenic sources, such as agriculture and landfills.

The entirety of those research efforts culminated in the CSU-led PNAS paper that synthesized all the data taken by the research teams. This capstone paper compared a bottom-up estimate that accounted for the location and timing of emissions, with a top-down estimate developed via aircraft measurement - an analysis that had never been done before. One of the study's key insights was the importance of understanding the timing of measurements and daytime maintenance activities, which likely explains the persistent gap between previous top-down and bottom-up estimates.

Routine maintenance activities occur in the daytime and can cause short periods of high emissions in the middle of the day. One of these activities is called "manual liquids unloading," which removes liquids buildup at a natural gas well in order to restore gas production flows. The unloading process can temporarily divert the flow of natural gas from the well to an atmospheric vent.

These activities typically happen during the day, around the same time a research aircraft would be conducting measurements. This diurnal variability of methane emissions, the researchers concluded, may help explain why estimates from aircraft measurements have previously been higher than estimates based on annual bottom-up inventories.

In this study, for the first time, the researchers showed that the east-to-west variation of methane emissions across the basin, derived from aircraft measurements, was reproduced by the high-resolution, bottom-up emissions model developed by the CSU team.

Industry partnerships and public/private funding

Working with industry technical experts gave the study team invaluable full site access for bottom-up measurements and hourly and spatially resolved operational data. The data improved the understanding of the magnitude and timing of emissions, particularly for episodic events like maintenance operations.

"What we have found is that we have two good methods of measurements. If you want to compare them, you have to account for timing and location of emissions," said PNAS lead author Tim Vaughn, a CSU research scientist.

Applicability of these results to other production basins is not known, the researchers say, without understanding the timing of emissions in those other basins.

Funding for the PNAS study was provided by the Department of Energy Research Partnership to Secure Energy for America/National Energy Technology Laboratory contract No. 12122-95/DE-AC26-07NT42677 to the Colorado School of Mines. Cost share for the project was provided by Colorado Energy Research Collaboratory, the National Oceanic and Atmospheric Administration, Southwestern Energy, XTO Energy, a subsidiary of ExxonMobil, Chevron, Equinor and the American Gas Association, many of whom also provided operational data and/or site access. Additional data and/or site access was also provided by CenterPoint, Enable Midstream Partners, Kinder Morgan, and BHP Billiton.

Most Americans underestimate minorities' environmental concerns -- even minorities

PUBLIC RELEASE: 29-OCT-2018

CORNELL UNIVERSITY

ITHACA, N.Y. - A new study shows most Americans underestimate just how concerned minorities and lower-income people are about environmental threats, including members of those groups.

These misperceptions contradict significant research that shows racial and ethnic minorities and the poor are consistently among the most worried about environmental challenges, said co-author Jonathon Schuldt, associate professor of communication at Cornell University.

"What really surprised us was just how paradoxical the results were," he said. "We found a very consistent pattern that if the American public thought a group was very low in concern, in fact that same group was reporting high levels of concern."

The study, "Diverse Segments of the U.S. Public Underestimate the Environmental Concerns of Minority and Low-Income Americans," was published in the Proceedings of the National Academy of Sciences. It also found most Americans associate the term "environmentalist" most closely with whites and the well-educated.

Schuldt and his co-authors attribute the findings to stereotypes in American culture. For example, there's a misperception that people with lower incomes have more pressing needs and don't have the luxury to worry about environmental threats. However, poorer people and people of color consistently report the opposite, perhaps because they are typically the hardest hit by environmental challenges.

The researchers conducted an online survey of a nationally representative sample of 1,200 Americans about their levels of concern for the environment, whether they identified as an environmentalist, and the age, socioeconomic class and race they associated with the term "environmentalist."

The findings could have practical implications for environmental advocacy and policy. If policymakers, scholars and practitioners endorse similar views, these misperceptions may influence which groups' perspectives get prioritized and may contribute to the historical marginalization of minority and lower-income populations.

Co-authors on the paper were Adam Pearson of Pomona College, Rainer Romero-Canyas of the Environmental Defense Fund and Columbia University, Matthew Ballew of Yale University and Dylan Larson-Konar of the Environmental Defense Fund and the University of Florida, Gainesville. The research was funded in part by a grant from Cornell's Atkinson Center for a Sustainable Future and the Environmental Defense Fund.

Alterations to seabed raise fears for future

Ocean acidification caused by high levels of human-made CO2 is dissolving the seafloor

PUBLIC RELEASE: 29-OCT-2018

MCGILL UNIVERSITY

Normally the deep sea bottom is a chalky white. It's composed, to a large extent, of the mineral calcite (CaCO3) formed from the skeletons and shells of many planktonic organisms and corals. The seafloor plays a crucial role in controlling the degree of ocean acidification. The dissolution of calcite neutralizes the acidity of the CO2, and in the process prevents seawater from becoming too acidic. But these days, at least in certain hotspots such as the northern Atlantic and the Southern Ocean, the ocean's chalky bed is becoming more of a murky brown. As a result of human activities, the level of CO2 in the water is so high, and the water is so acidic, that the calcite is simply being dissolved.

The McGill-led research team who published their results this week in a study in PNAS believe that what they are seeing today is only a foretaste of the way that the ocean floor will most likely be affected in future.

Long-lasting repercussions

"Because it takes decades or even centuries for CO2 to drop down to the bottom of the ocean, almost all the CO2 created through human activity is still at the surface. But in the future, it will invade the deep-ocean, spread above the ocean floor and cause even more calcite particles at the seafloor to dissolve," says lead author Olivier Sulpis who is working on his PhD in McGill's Dept. of Earth and Planetary Sciences. "The rate at which CO2 is currently being emitted into the atmosphere is exceptionally high in Earth's history, faster than at any period since at least the extinction of the dinosaurs. And at a much faster rate than the natural mechanisms in the ocean can deal with, so it raises worries about the levels of ocean acidification in future."

In future work, the researchers plan to look at how this deep ocean bed dissolution is likely to evolve over the coming centuries, under various potential future CO2 emission scenarios. They believe that it is critical for scientists and policy makers to develop accurate estimates of how marine ecosystems will be affected, over the long-term, by acidification caused by humans.

How the work was done

Because it is difficult and expensive to obtain measurements in the deep-sea, the researchers created a set of seafloor-like microenvironments in the laboratory, reproducing abyssal bottom currents, seawater temperature and chemistry as well as sediment compositions. These experiments helped them to understand what controls the dissolution of calcite in marine sediments and allowed them to quantify precisely its dissolution rate as a function of various environmental variables. By comparing pre-industrial and modern seafloor dissolution rates, they were able to extract the anthropogenic fraction of the total dissolution rates.
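The "anthropogenic fraction" bookkeeping is simple once the two rates are in hand. The rates below are invented placeholders, not values from the experiments:

```python
# Anthropogenic share of seafloor calcite dissolution, in its simplest form.
preindustrial_rate = 0.3   # assumed dissolution rate, g CaCO3 per m2 per year
modern_rate = 0.5          # assumed present-day rate at the same location
anthropogenic_fraction = (modern_rate - preindustrial_rate) / modern_rate
print(f"anthropogenic share of dissolution: {anthropogenic_fraction:.0%}")   # 40% in this example
```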

The speed estimates for ocean-bottom currents came from a high-resolution ocean model developed by University of Michigan physical oceanographer Brian Arbic and a former postdoctoral fellow in his laboratory, David Trossman, who is now a research associate at the University of Texas-Austin.

"When David and I developed these simulations, applications to the dissolution of geological material at the bottom of the oceans were far from our minds. It just goes to show you that scientific research can sometimes take unexpected detours and pay unexpected dividends," said Arbic, an associate professor in the University of Michigan Department of Earth and Environmental Sciences.

Trossman adds: "Just as climate change isn't just about polar bears, ocean acidification isn't just about coral reefs. Our study shows that the effects of human activities have become evident all the way down to the seafloor in many regions, and the resulting increased acidification in these regions may impact our ability to understand Earth's climate history."

"This study shows that human activities are dissolving the geological record at the bottom of the ocean," says Arbic. "This is important because the geological record provides evidence for natural and anthropogenic changes."

Waste plastic as a fuel? Neste, ReNew and Licella launch collaboration

Michael Holder

Thursday, August 30, 2018 - 3:30am

Square bales of wrapped plastic bottles ready for the melting process.

A group of chemicals, recycling and technology companies have announced a joint effort to explore using mixed plastic waste as a raw material for fuels, chemicals and new plastics.

The collaboration, announced Aug. 16, will see oil refining and chemicals specialist Neste team up with U.K. recycling firm ReNew ELP and Australian technology developer Licella to develop new industrial and commercial uses for mixed plastic waste.

In addition to studying the feasibility and sustainability of using liquefied waste plastic as a refinery raw material, the three companies also seek regulatory acceptance for new chemical recycling processes.

The collaboration forms part of Neste's ambition to introduce liquefied waste plastic as a future alternative raw material to fossil fuel refining. The renewable diesel manufacturer is targeting the processing of more than 1 million tonnes of waste plastic each year for this purpose by 2030.

Matti Lehmus, executive VP of Neste's oil products business, said he believed the collaboration could accelerate commercialization of waste plastic-based products. "Neste has a strong legacy in refining, as well as raw material and pretreatment research, but we still need development of technologies, value chains and supporting legislation for plastic waste-based products to become a reality at industrial scale," he said.

Separately, Neste is also working with retailer IKEA to develop durable and recyclable plastics made from bio-based materials such as waste fats and oil, with a view to launching commercial scale production for the first time later this year.

U.K. recycler ReNew ELP, meanwhile, is constructing a new chemical recycling plant in Teesside designed to recycle end-of-life plastic into a raw material for a range of petrochemical products.

The plant is being developed by Licella, and although it is not included in the wider collaboration project with Neste, it will "nevertheless contribute to a common goal of enabling more efficient waste plastic utilisation in the future," the firms explained.

Len Humphreys, CEO of Licella Holdings, said he was "excited" by both the new ReNew facility and the wider collaboration involving Neste as important means of meeting the "significant challenge of end-of-life plastic."

"The collaboration with Neste and ReNew ELP will help to create markets for recycled carbon fuels and chemicals at a critical time as Europe pushes towards a circular economy," he said.

Swedish school installs saltwater battery, highlights environmental benefits

Saltwater batteries can be fully recycled, have received a “cradle to cradle” certificate, and contain no lead or lithium. For some, these benefits outweigh the drawbacks: saltwater batteries are considerably bigger and heavier than lithium-ion batteries and have a lower discharge current.

Hansjörg Weisskopf (CEO BlueSky Energy), Maria Gardfjell (head commissioner of Uppsala), Marie Utter (political secretary Uppsala Municipality).

Austria-based saltwater battery storage company BlueSky Energy has announced a new project in Sweden. The company will install a 24 kWh saltwater battery alongside a 100 kW solar PV system at a school building in Uppsala.

Uppsala's Foundation for Collaboration between universities, businesses, and the public sector advised the local municipality to opt for a saltwater storage system, citing environmental and safety benefits.

“Uppsala Municipality has become a test bed for new technologies thus creating new knowledge and expertise which can be spread. In Uppsala, there is a unique collaboration between the public sector, private sector, and universities which allows these kinds of exciting new energy solutions to take root,” said Marlene Burwick (Social Democratic Party), Municipal Chairman for the Board of Directors and Commissioner of STUNS Board.

As the school building was completely rebuilt, the municipality sought to install a PV plus storage system to increase self-consumption and improve the school’s carbon footprint.

The current system will cover about 10-12% of the building's consumption through self-consumed solar power. A BlueSky spokesperson told pv magazine that the system was sized as an initial trial of the concept; additional roof space is available, and the current set-up can be scaled up to increase self-consumption.

Moreover, the system aims to increase the share of renewable energy in the public building's energy mix; feeding into the grid is not planned. The school's 800 pupils will also be taught about climate change, the environment and renewable energy, using the PV plus storage system as an educational resource.

Saltwater battery storage systems can be an alternative to the more widely used lithium-ion storage systems. The company cites environmental and safety benefits as the main advantages of its product.

Lithium mining can have adverse effects on the surrounding environment, and so far there are only limited options to fully recycle lithium from existing batteries. The material's structure, which enables it to store energy, is altered over time, and recycling processes that restore this structure, rather than destroying it, are still in the research phase.

Due to saltwater batteries' lower performance in terms of weight and power density, their application is limited to stationary storage systems; they cannot be used in EVs, for example. Nonetheless, they have another advantage over lithium-ion systems: they cannot explode in a fire and are therefore less hazardous to firefighters and building occupants in such an event.

BlueSky highlights that it uses Aquion’s saltwater batteries, which reportedly are the only battery storage systems to receive a “cradle to cradle” certificate. For many, this trumps other potential advantages lithium-ion storage systems have.

“It is thanks to the Municipality's high-set climate goals and its ambition to be completely nontoxic in its school environments that we are now testing this battery, whose main content is saltwater, at Tiunda school. Uppsala welcomes the opportunity to share its experiences regarding smart climate and environmental solutions with other municipalities in Sweden and internationally,” said Maria Gardfjell (Environmental Party), Commissioner responsible for Environmental and Climate Issues.

LOOKING AHEAD AT SOLAR PANEL RECYCLING IN ILLINOIS SEMINAR & VIDEO

September 12, 2018 | Laura B. | Illinois

Presented by Nancy Holm, Assistant Director, and Jennifer Martin, Environmental Program Development Specialist, both from the Illinois Sustainable Technology Center, a division of the Prairie Research Institute at the University of Illinois at Urbana-Champaign. Presentation slides are available here.

The new Illinois Future Energy Jobs Act, which went into effect on June 1, 2017, requires an updated Renewable Portfolio Standard (RPS) in Illinois. The new RPS directs that solar energy use be expanded across the state, increasing Illinois' solar capacity from about 75 MW to about 2,700 MW by 2030. With these new requirements for increased solar installations, there will be a critical need to examine the disposal, recycling, or repurposing of used and broken solar panels, both to protect the environment and to explore the economic benefits of recycling or repurposing solar panels in the state.

As the solar energy industry continues to grow in the state as well as nationally and globally, there is a looming waste-management issue as these panels reach the end of their life. The average lifespan of a panel is 30 years. Given this, the International Renewable Energy Agency estimated that there will be a surge in solar panel disposal in the early 2030s, and that by 2050 there will be 60 to 78 million cumulative tons of solar panel waste globally. Some countries, particularly in Europe, have established recycling networks. However, in the U.S. we are just beginning to develop a network of recyclers and, as yet, there is not a strong recycling infrastructure in place in Illinois. Used or broken solar panels should be properly recycled to prevent toxic compounds from leaching into the environment and to keep valuable, finite resources out of landfills.

The Illinois Sustainable Technology Center is working with the Illinois Environmental Protection Agency, Solar Energy Industry Association, Illinois Solar Energy Association, and other entities to create awareness around these issues, as well as establish the necessary policy and standards required to build a network of solar panel recyclers in Illinois. Details on solar panel waste and recycling will be discussed in this presentation.

Video:

https://youtu.be/dLb7y2BkMnk

National Teachers Group Confronts Climate Denial: Keep the Politics Out of Science Class

The association urged science teachers at all levels to emphasize that 'no scientific controversy exists regarding the basic facts of climate change.'

By Phil McKenna

SEP 13, 2018

"If there are controversies or other issues to deal with, we want them to have a good solid foundation in evidence-based knowledge to carry out that conversation," said David Evans, executive director of the National Science Teachers Association. Credit: Peter Macdiarmid/Getty Images

In response to what it sees as increasing efforts to undermine the teaching of climate science, the nation's largest science teachers association took the unusual step Thursday of issuing a formal position statement in support of climate science education.

In its position statement, the National Science Teachers Association (NSTA) calls on science teachers from kindergarten through high school to emphasize to students that "no scientific controversy exists regarding the basic facts of climate change."

"Given the solid scientific foundation on which climate change science rests, any controversies regarding climate change and human-caused contributions to climate change that are based on social, economic, or political arguments—rather than scientific arguments—should not be part of a science curriculum," it says.

It also urges teachers to "reject pressures to eliminate or de-emphasize" climate science in their curriculum. And it urges school administrators to provide science teachers with professional development opportunities to strengthen their understanding of climate science.

"Now, more than ever, we really feel that educators need the support of a national organization, of their educational colleagues and their scientist colleagues, because they have encountered a lot of resistance," David Evans, the executive director of NSTA, said.

"In climate science, as in other areas, we really emphasize the importance that students learn the science in science class, and if there are controversies or other issues to deal with, we want them to have a good solid foundation in evidence-based knowledge to carry out that conversation," he said.

Judy Braus, executive director of the North American Association for Environmental Education, said her organization fully supports the NSTA position statement. "We feel that it's important to address the misinformation that's out there about climate" change, she said.

Only Evolution Draws This Kind of Response

NSTA has issued position statements in the past on topics such as safety, gender equity and the responsible use of animals in the classroom, but this is only the second focused on teaching subject matter that is controversial not because of the science itself but for societal or political reasons.

"Over the last five years, the two issues that have had the most controversy with them have been evolution on a continuing basis and climate change, and there has been more controversy around climate change," Evans said.

Teachers and school boards have been under pressure from organizations that oppose climate policies, including some that have promoted misinformation and argued for climate change to be removed from state science curricula. Last year, the Heartland Institute, a conservative advocacy organization with close ties to the fossil fuel industry, mailed approximately 300,000 copies of its publication "Why Scientists Disagree About Global Warming" to middle school, high school and college science teachers around the country.

Evans said Thursday's position statement was not a direct response to the Heartland mailings but was precipitated by attacks on climate science curriculum that have been building since the National Research Council recommended climate science be included in K–12 science education in 2012.

Pressure to Change State Science Standards

Battles have erupted in recent years in states including Texas, Louisiana and Idaho, over the role climate science should play in new state science standards.

Glenn Branch, deputy director of the National Center for Science Education, a nonprofit that defends the integrity of science education against ideological interference, said the position statement comes at a key time: Arizona is now devising new science standards and arguing over climate change. The draft standards have not yet been approved by the state Board of Education, but he said "the latest revision deletes a whole slew of high school level standards on climate change."

Branch, who was not involved in developing NSTA's position statement, said the document should help classroom teachers who may feel political or societal pressure to eliminate climate science instruction.

"A teacher who is being pressured by a parent or an administrator can say 'look, I'm a professional, I'm trained for this, both before I became a teacher and through continuing education, I have responsibilities to my profession, and my professional organization, the NSTA says this is what I should be doing,'" Branch said. "I think that will be empowering for many teachers."

Nebraska researchers to lead largest drone-based study of storms

UNIVERSITY OF NEBRASKA-LINCOLN

PUBLIC RELEASE: 14-SEP-2018

The University of Nebraska-Lincoln will lead the most ambitious drone-based investigation of severe storms and tornadoes ever conducted.

More than 50 scientists and students from four universities will participate in the TORUS (Targeted Observation by Radars and UAS of Supercells) study, deploying a broad suite of cutting-edge instrumentation into the US Great Plains during the 2019 and 2020 storm seasons. The project will exceed $2.5 million, with the National Science Foundation awarding a three-year, $2.4 million grant and the National Oceanic and Atmospheric Administration providing additional financial support.

It is the largest-ever study of its kind, based upon the geographical area covered and the number of drones and other assets deployed, according to Nebraska's Adam Houston.

The project will involve four unmanned aircraft systems, or drones; a NOAA P3 manned aircraft; eight mesonet trucks equipped with meteorological instruments; three mobile radar systems; a mobile LIDAR system; and three balloon-borne sensor launchers. The University of Colorado Boulder, Texas Tech University and the University of Oklahoma, along with the National Severe Storms Laboratory, also are participating.

"We are flying more aircraft at the same time," said Houston, associate professor of Earth and atmospheric sciences and one of seven principal investigators. "We've only flown one drone in the past, now we're going to fly four. We can fly in more parts of the storm at the same time, get more data and answer a more extensive set of questions."

The TORUS team is authorized to operate over approximately 367,000 square miles -- virtually all of the Central Plains from North Dakota to Texas, Iowa to Wyoming and Colorado. The aim is to collect high-resolution data from within the storm to improve forecasting for tornadoes and severe storms, Houston said.

The research goal is to use the collected data to improve the conceptual model of supercell thunderstorms, the parent storms of the most destructive tornadoes, by exposing how small-scale structures within these storms might lead to tornado formation. These structures are hypothesized to be nearly invisible to all but the most precise research-grade instruments. By revealing the hidden composition of severe storms and relating it to known characteristics of the regularly observed larger-scale environment, the TORUS project could improve supercell and tornado forecasts.

"There are fundamental problems with tornado warnings," Houston said. "The false alarm rate is high -- 75 percent of the time we don't get a tornado. Yet if you reduce the false alarm rate, you also reduce the rate of detection. We need to improve that gap to save lives. We can do that if we can improve our understanding of small-scale structures and small-scale processes that lead up to storms.

"We want to know how the storm influences the environment, and vice versa, in the seconds, minutes and hours leading up to the storm and afterwards."

The Central Plains, aka "Tornado Alley," serves as a great laboratory to better understand severe storms, he said.

"Every place in the United States is vulnerable to super cell thunderstorms. What we learn in this laboratory called the Central Plains is applicable to everywhere -- it's geographically agnostic."

Research preparations began Sept. 1. The TORUS team will begin fieldwork May 13, locating where they see a promising weather pattern developing. Their fieldwork will continue until June 16, with a goal of staying out as long as possible.

It is dangerous work, Houston said.

"There's no getting around it. We're putting ourselves in the path of these serious storms."

Hurricane raises questions about rebuilding along North Carolina's coast

Anna Mehler Paperny

RODANTHE, N.C. (Reuters) - When Florence was raging last Friday on North Carolina’s Outer Banks, the hurricane tore a 40-foot (12-meter) chunk from a fishing pier that juts into the ocean at the state’s most popular tourist destination.

The privately owned Rodanthe pier has already undergone half a million dollars in renovations over the past seven years, and the owners started a new round of repairs this week.

“The maintenance and upkeep on a wooden fishing pier is tremendous,” said co-owner Terry Plumblee. “We get the brunt of the rough water here.”

Scientists have warned that such rebuilding efforts are futile as sea levels rise and storms chew away the coastline, but protests from developers and the tourism industry have led North Carolina to pass laws that disregard the predictions.

The Outer Banks, a string of narrow barrier islands where Rodanthe is situated, may have been spared the worst of Florence, which flooded roads, smashed homes and killed at least 36 people across the eastern seaboard.

Still, the storm showed North Carolinians on this long spindly finger of land that ignoring the forces of nature to cling to their homes and the coast’s $2.4 billion economy may not be sustainable.

Some have called for halting oceanside development altogether.

“We need to actually begin an organized retreat from the rising seas,” said Duke University geologist Orrin Pilkey.

In a government study published in 2010, scientists warned that sea levels could rise 39 inches by 2100. (bit.ly/2xAqn6y)

Higher sea level will cause more flooding and render some communities uninhabitable, as well as affect the ocean vegetation, jeopardize the dune systems that help stabilize the barrier islands, and cause more intense erosion when storms like Florence make landfall, scientists said.

Developers said the study was too theoretical to dictate policy.

Some argue policymakers do not need a 90-year projection to know something needs to change.

“When we have a hurricane, that shows everybody where their vulnerabilities are today, forget 100 years from now, but right now,” said Rob Young, a geologist at Western Carolina University who co-authored the study by the state’s Coastal Resources Commission (CRC).

Young said he would like to see development move back from the ocean’s edge and laments that homeowners and developers rebuild almost any structure damaged or destroyed by a bad storm.

But the idea of retreating is a tough sell for the people who live there and have invested in property.

“You’re asking us to say, ‘Hey, 4,000 or 5,000 people on little Hatteras Island, it’s time for you to pack up and move,’ and that’s not a reasonable expectation,” said Bobby Outten, manager for Dare County on the Outer Banks.

Opponents of using the CRC study to set policy said that most Outer Banks homeowners recognize the risks.

“If you’re buying on the coast, anyone that buys in an area surrounded by water, you’re always taking a risk that you’re going to have storm damage,” said Willo Kelly, who has worked in real estate for more than a decade.

Even though she acknowledges that sea levels are rising, Kelly is also among those who opposed making state policy decisions, including anything affecting home insurance or property values, based on the study’s dire 90-year forecast of sea-level rise.

Kelly supported a 2012 state law that banned North Carolina from using the 90-year prediction on rising sea levels to influence coastal development policy.

The CRC released a second report in 2015 predicting sea level rise over a 30-year period, instead of 90 years. The new report was praised by developers as being more realistic and said sea levels would rise 1.9 to 10.6 inches. (bit.ly/2xyGDVr)

The 2012 law was welcomed by the development community and panned by scientists whose warnings, they felt, were going unheeded. Members of the legislature who sponsored the bill did not return requests for comment.

After this year’s storm demolished the sandy protective berms that stand between the water and the main coastal road, the state sent backhoes to rebuild them and officials to assess damage to bridges and roads.

“There’s also a sense of denial,” said Gavin Smith, director of the University of North Carolina’s Coastal Resilience Center, adding that with rising seas and more intense storms it will be more costly and more difficult to replace infrastructure.

Rodanthe Pier was lucky this time, sustaining only moderate damage, said Clive Thompson, 58, who works at the pier. In the past, nor'easters have ripped its end from the ground and torn pilings from the sand.

The beach was not so lucky. The ocean ate away about 50 meters of what used to be dry sand above the high-tide line, he said.

“It’s a waste of man hours, time and money, having to do this over and over,” he said. “One day I hope people understand the power of water. ... It don’t play.”

California First State To Bar Restaurants From Handing Out Straws

"A new law bars California restaurants from automatically giving out single-use plastic straws with drinks, starting next year.

The bill, which only applies to full-service restaurants, was signed by California Gov. Jerry Brown on Thursday. Customers who want a plastic straw will still be able to request one from the restaurant under the law.

“Plastic has helped advance innovation in our society, but our infatuation with single-use convenience has led to disastrous consequences,” Brown wrote in a signing message, according to the Los Angeles Times.

Oklahoma: "Insurer Says Oil Companies Caused Quakes, Wants Money Back"

"An insurance company is suing oil companies in Oklahoma, saying their disposal operations caused the state's largest earthquake.

Steadfast Insurance Co. paid about $325,000 to the Pawnee Nation for damage it sustained in a magnitude 5.8 earthquake in September 2016. It's suing to recover the money.

Steadfast, a subsidiary of Zurich Holding Company of America Inc., filed suit in U.S. District Court in Tulsa last month against EnerVest Operating LLC and six other oil companies that operate in the area where the quake was centered. The tribe has also sued some of the same companies."

The world’s first hydrogen-powered train makes (almost) complete sense

By Akshat Rathi | September 20, 2018

In a world worried about climate change, hydrogen is a magical fuel. It has all the features of natural gas: it’s a fuel that’s easy to access, transport, and burn. And unlike natural gas, when you burn hydrogen, you only get water—no carbon dioxide. It’s the reason why, as the world continues to fail to hit climate goals, it turns back to hydrogen time and again.

The latest example is the world’s first noise-free, zero-emissions, hydrogen-powered train running about 100 km (about 60 miles) between Cuxhaven and Buxtehude in Germany. Built by the French multinational rail company Alstom with funds from the German government, the train can run at a maximum speed of 140 km per hour (about 90 miles per hour) for about 1,000 km on a full tank of hydrogen.

Currently, most train routes in the world run on diesel, which produces not just carbon dioxide but a lot of particulate pollution too. You might ask: instead of burning diesel, why not electrify these routes and power them with wind or solar? After all, we already have some trains that run efficiently on electricity.

And it does indeed make sense to do that in many cases. But there are edge cases, such as that of Cuxhaven and Buxtehude, a route that runs from the North Sea to just outside Hamburg, where building the necessary electric infrastructure is prohibitively expensive. That may be because the location is too far from an electric grid or because the line will run too infrequently to recoup the investment needed. Alstom estimates that electrifying a railway line costs around €1.2 million ($1.4 million) per km.
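
For a rough sense of the sums involved, the sketch below multiplies Alstom's quoted estimate of about €1.2 million per kilometre by the roughly 100 km length of the Cuxhaven-Buxtehude route mentioned earlier. It is a back-of-the-envelope figure using only the numbers quoted above, not a project estimate.

```python
# Back-of-the-envelope estimate using only the figures quoted in the article.
COST_PER_KM_EUR = 1.2e6   # Alstom's rough estimate for electrifying a railway line
ROUTE_LENGTH_KM = 100     # approximate Cuxhaven-Buxtehude distance

electrification_cost = COST_PER_KM_EUR * ROUTE_LENGTH_KM
print(f"Estimated electrification cost: EUR {electrification_cost/1e6:.0f} million")
# -> Estimated electrification cost: EUR 120 million
```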

One other option to run zero-emission trains would be to power them on batteries instead of using electricity from the grid. Here, however, physics becomes a problem. Batteries are bulky. They may be efficient at transporting four or six people in a car, but moving hundreds in a train is much harder. The heavier the train becomes, the more batteries it needs to move; the more batteries we add, the heavier the train becomes.

That may change if batteries are able to pack more energy for the same weight. Until then, the next-best option is hydrogen-powered fuel cells.

There’s just one problem: hydrogen doesn’t have the ready supply chain of diesel. Over the past century, we’ve built out a large infrastructure to produce and transport diesel, and we’ve designed ever-better diesel engines. To use hydrogen at the same or lower cost, we will need to build a similar network.

Currently, the cheapest way to produce hydrogen is from natural gas. In large reactors, natural gas is exposed to high-temperature steam to produce hydrogen and carbon dioxide. Using this hydrogen to run the train defeats the purpose somewhat. The new train itself may be zero emissions at the point of use because it only produces water and no particulate pollution. But the hydrogen supply chain doesn’t entirely eliminate greenhouse gases.

The hope is that, as the use of hydrogen becomes widespread, the technology to produce hydrogen from water (essentially reversing the reaction occurring inside the train) by using renewable electricity coming from wind, solar, or nuclear power will become cheaper. Alstom has an order to build 14 more such trains across Germany. Other countries, including France, Denmark, Norway, Italy, Canada, the Netherlands, and the UK are all looking into hydrogen trains.

Just like the steam engine kickstarted the age of coal, the German train could kickstart the age of hydrogen.

North Carolina Flooding Soaks Federal Taxpayers All Over Again

By Christopher Flavelle

September 21, 2018, 4:00 AM EDT

One home in Nags Head received insurance payouts 28 times

Billions for rebuilding homes, but little to help owners move

When Hurricane Florence reached Belhaven, North Carolina, last weekend, the flooding that followed was unlike anything Ricky Credle, the mayor of the tiny coastal town, had ever experienced.

“I’ve been here 45 years,” Credle said in a phone interview. “I’ve never seen this much water.”

For federal taxpayers, however, the flooding in Belhaven was all too familiar.

Since 1978, 120 homes in the town of 1,600 people have cost federal taxpayers $13.4 million in National Flood Insurance Program payouts, according to data the Natural Resources Defense Council obtained from the Federal Emergency Management Agency; 36 received more money than their total value. Five of those homes have gotten federal flood insurance payments ten times or more, including one that received money 15 times.

The result: homes that flood keep getting rebuilt with public money, only to flood again. That leaves people living in vulnerable areas who might rather receive money to move away -- and the flood insurance program, already more than $20 billion in debt, going deeper underwater.

“We spend all this money to rebuild these homes, and we spend very little money helping people get out of these homes -- even when that's what they want,” said Rob Moore, a senior policy analyst at the Natural Resources Defense Council. “Efforts to help people move somewhere safer are seen as a last option, instead of a first option.”

Encouraging people to keep rebuilding in vulnerable places reflects the design of the flood insurance program. Not only does it subsidize people’s premiums, it imposes no limit on the number of times a homeowner can make a claim. At the same time, federal programs designed to pay people to move out of flood-damaged homes often take years to result in offers, by which time many people have repaired their homes and moved back in.

There are more than 1,100 so-called severe repetitive loss properties across North Carolina, the NRDC data shows. The federal government has paid out $163.9 million in flood insurance for those homes, which is almost 60 percent of their combined total value. More than 400 have gotten more in federal insurance claims than the home is worth.

One home, in Nags Head, received flood insurance payments 28 times before it was torn down.

Severe repetitive loss properties account for just two percent of all flood insurance policies, but as much as one-third of all claims, according to R.J. Lehmann, director of insurance policy for the R Street Institute, a Washington think tank.

The result is that the program relies ever more heavily on taxpayers. Last year, the Congressional Budget Office calculated that annual costs exceed premiums by about one-third -- and that was before Hurricanes Harvey, Irma and Maria. In October, Congress forgave $16 billion in debt that the program owed the U.S. Treasury.

“The NFIP was not designed to retain funding to cover claims for truly extreme events,” the Congressional Research Service wrote in a report released this year.

Belhaven demonstrates that trend as well as any place. For all the money the federal government has spent in the aftermath of previous flooding there, it has spent very little on reducing the threat from the next flood. Just one-third of the town’s severe repetitive loss properties have gotten money for what the government calls “mitigation” -- a category that includes flood-proofing or raising the home on stilts.

The most effective type of mitigation is tearing down the house, and paying its owner to move elsewhere. But that step is also the most rare. The NRDC’s data show that of the 120 homes, none have been torn down through a buyout program.

Credle, the Belhaven mayor, said moving some residents to safer locations should be considered as part of the response to Florence. He said it didn’t matter if that meant his town shrinks, along with its property tax revenue.

“You always want the tax base to be good,” Credle said. “But you also want people to be safe.”

Washington, a North Carolina town at the mouth of the Pamlico River that was also flooded by Florence, has 114 severe repetitive loss properties. Together, those 114 homes have received $12.6 million in federal insurance payments since 1978. Yet at most 28 of those homes have received federal money to reduce their flood exposure, and only one has been torn down. One of those homes, with a total value of $102,000, has gotten federal flood insurance payments 11 times.

In New Bern, a town 45 minutes south of Washington, 29 severe repetitive loss homes have received $2.7 million in federal flood insurance payments. The federal government has acted to protect just six against flood damage. None of those homes have been bought out and demolished.

As climate change gets worse, federal policies have effectively reduced the cost of living in flood-prone areas, increasing the number of homes that continue to flood at taxpayer expense.

By subsidizing flood insurance and allowing an unlimited number of claims, the insurance program serves to increase other types of federal disaster costs, Lehmann said. “If people weren't living there, then there would be less disaster assistance necessary,” he said.

FEMA didn’t immediately respond to a request for comment.

Congressional Action

Congress worried about the perverse incentives of the flood insurance program long before Florence. In 2012, Congress passed legislation phasing out the subsidies offered to homeowners, increasing premiums in places like coastal North Carolina. Homeowners objected, and Congress rolled back those changes in 2014.

Then, last year, the House passed a bill that would limit the amount of money a home could get from the flood insurance program to three times the value of that home.

The bill didn’t clear the Senate.

“People that live in this cycle of flooding and rebuilding may never even be offered assistance to move somewhere safer,” the NRDC’s Moore said. “If they have flood insurance, they’re always offered the option to rebuild in the same location.”

Florence: Only about 10 percent on hard-hit Carolina coast have flood insurance

Updated Sep 17, 5:51 PM; Posted Sep 17, 5:37 PM

By The Washington Post

As Americans in North and South Carolina make it out of the Florence floodwaters, they face another daunting task: figuring out whether they can afford to rebuild.

Few have flood insurance in the areas with the worst destruction. Home insurance does not typically cover flooding, a fact many realize the hard way. People have to purchase a separate flood insurance policy at least a month in advance of a major storm to be eligible for reimbursement.

Only about one in 10 homes have flood insurance in the counties hit by Florence, according to a Washington Post analysis comparing the number of policies in National Flood Insurance Program data with the number of housing units in counties hit by the storm. Milliman, an actuarial firm, found similar results.
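
For readers curious about the arithmetic behind that comparison, the sketch below divides policies in force by housing units to get a county-level take-up rate. The county names and counts here are hypothetical placeholders, not figures from the Post's analysis or from NFIP data.

```python
# Hypothetical county data; the counts are stand-ins, not figures from the
# Washington Post analysis or NFIP records.
counties = {
    "Example County A": {"policies": 4_200, "housing_units": 42_000},
    "Example County B": {"policies": 9_800, "housing_units": 70_000},
}

for name, data in counties.items():
    take_up = data["policies"] / data["housing_units"]
    print(f"{name}: {take_up:.1%} of housing units carry flood insurance")
# -> Example County A: 10.0% of housing units carry flood insurance
# -> Example County B: 14.0% of housing units carry flood insurance
```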

In Craven County, North Carolina, which is home to New Bern, a city that has dominated headlines for severe flooding and hundreds of rescues, 9.9 percent have flood insurance. In Wilmington, which is located in New Hanover County and has also seen devastating flooding, slightly more -- 14.2 percent -- have flood insurance, according to The Washington Post analysis. Florence has caused historic flooding around New Bern and Wilmington, two parts of the southeast coast with some of the lowest take-up rates for flood insurance.

The federal government requires flood insurance in parts of the country that are designated as "special flood hazard areas," but many homeowners and renters still do not get it because it is too expensive or they do not believe they are truly at risk. The government does little to enforce the requirement, leaving it mainly up to banks to mandate flood insurance as a condition of getting a home mortgage.

Some wrongly believe they do not need insurance because they can rely on Federal Emergency Management Agency (FEMA) grants, but those cover only up to $33,000 in damages. The typical payout is a few thousand dollars. Flood insurance, in contrast, covers up to $250,000 for the home and another $100,000 for possessions.

Shawn Spencer in Wilmington, North Carolina, spent much of Saturday raking debris out of the ditches in front of his home, hoping that would keep his property from flooding. Like many, he was not sure whether he had flood insurance for his home and the bike shop he runs as a small business.

"I'm not 100 percent certain if I've got it," said Spencer, 47, who has lived in Wilmington his entire life and cannot remember flooding this bad. "The government is forever moving the flood plain."

A key issue with Hurricane Florence and Hurricane Harvey last year is that some of the worst flooding occurred inland, sometimes in areas that were not FEMA-designated flood hazard zones. In some coastal communities, such as the Outer Banks, flood insurance take up rates are upward of 60 percent, but the rate of people with flood insurance drops off sharply just a few miles in from the coastline.

"When people hear they aren't in a FEMA flood zone, they often think there's no risk their area will flood," said John Rollins, an actuary at Milliman who specializes in insurance for catastrophe events.

As catastrophic flooding continues to occur beyond the coastline, FEMA is trying to warn Americans that anywhere it rains can flood.

"Just one inch of floodwater in a home can cause $25,000 worth of damage," David Maurstad, FEMA's deputy associate administrator for insurance and mitigation, said on CNBC on Monday. "People think their homeowner policy may cover them from flooding, and it doesn't."

Low-income families are particularly at risk. They are even less likely to have flood insurance, and they lack a financial cushion to pay for hotel rooms to ride out the storm, let alone funds to rebuild. Many low-income families will turn to FEMA for aid money, but the average FEMA grant last year in the wake of Hurricane Harvey was only $4,300. That is not enough to replace a trailer home, and it is far below the average flood insurance claim of $115,000.

The federal government's other main program to help people who have severely damaged homes from flooding is a Small Business Administration loan, but low-income people often fail to qualify for the low-interest SBA loans because they do not have good credit, says Carolyn Kousky, director of policy research at the Wharton Risk Center.

"FEMA grants are a few thousand dollars. They won't get your home back to pre-disaster conditions," Kousky said. "When people see headlines about Congress passing $15 billion in hurricane aid, they think it's going to households, but most of it is going to infrastructure and local governments."

Recycling, Once Embraced by Businesses and Environmentalists, Now Under Siege

The U.S. recycling industry is breaking down as prices for scrap paper and plastic have collapsed.

Workers at Cal-Waste Recovery Systems pre-sort raw recycling. The company has been struggling to sell its mixed-paper recycling to its usual customer, China.

The U.S. recycling industry is breaking down.

Prices for scrap paper and plastic have collapsed, leading local officials across the country to charge residents more to collect recyclables and send some to landfills. Used newspapers, cardboard boxes and plastic bottles are piling up at plants that can’t make a profit processing them for export or domestic markets.

“Recycling as we know it isn’t working,” said James Warner, chief executive of the Solid Waste Management Authority in Lancaster County, Pa. “There’s always been ups and downs in the market, but this is the biggest disruption that I can recall.”

U.S. recycling programs took off in the 1990s as calls to bury less trash in landfills coincided with China’s demand for materials such as corrugated cardboard to feed its economic boom. Shipping lines eagerly filled containers that had brought manufactured goods to the U.S. with paper, scrap metal and plastic bottles for the return trip to China.

As cities aggressively expanded recycling programs to keep more discarded household items out of landfills, the purity of U.S. scrap deteriorated as more trash infiltrated the recyclables. Discarded food, liquid-soaked paper and other contaminants recently accounted for as much as 20% of the material shipped to China, according to Waste Management Inc.’s estimates, double from five years ago.

The tedious and sometimes dangerous work of separating out that detritus at processing plants in China prompted officials there to slash the contaminants limit this year to 0.5%. China early this month suspended all imports of U.S. recycled materials until June 4, regardless of the quality. The recycling industry interpreted the move as part of the growing rift between the U.S. and China over trade policies and tariffs.

The changes have effectively cut off exports from the U.S., the world’s largest generator of scrap paper and plastic. Collectors, processors and the municipal governments that hire them are reconsidering what they will accept to recycle and how much homeowners will pay for that service. Many trash haulers and city agencies that paid for curbside collection by selling scrap said they are now losing money on almost every ton they handle.

The upended economics are likely to permanently change the U.S. recycling business, said William Moore, president of Moore & Associates, a recycled-paper consultancy in Atlanta.

Cal-Waste Recovery Systems plans to invest more than $6 million on new sorting equipment to produce cleaner bales of recyclables.

“It’s going to take domestic demand to replace what China was buying,” he said. “It’s not going to be a quick turnaround. It’s going to be a long-term issue.”

The waste-management authority in Lancaster County this spring more than doubled the charge per ton that residential trash collectors must pay to deposit recyclables at its transfer station, starting June 1. The higher cost is expected to be passed on to residents through a 3% increase in the fees that haulers charge households for trash collection and disposal.

The additional transfer-station proceeds will help offset a $40-a-ton fee that the authority will start paying this summer to a company to process the county’s recyclables. Before China raised its quality standards at the beginning of this year, that company was paying Lancaster County $4 for every ton of recyclables.
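
To see why fees are rising, it helps to work through the per-ton economics implied by the figures above: the authority went from being paid $4 a ton for recyclables to paying $40 a ton to have them processed, a swing of $44 a ton. The sketch below runs that arithmetic for a hypothetical annual tonnage; the tonnage is an illustrative assumption, not a number from the article.

```python
# Per-ton swing implied by the article's figures; annual tonnage is a
# hypothetical assumption for illustration only.
OLD_REVENUE_PER_TON = 4.0     # processor used to pay the county $4 per ton
NEW_FEE_PER_TON = 40.0        # county will now pay the processor $40 per ton
ASSUMED_ANNUAL_TONS = 50_000  # hypothetical recyclables tonnage

swing_per_ton = OLD_REVENUE_PER_TON + NEW_FEE_PER_TON
annual_impact = swing_per_ton * ASSUMED_ANNUAL_TONS
print(f"Swing: ${swing_per_ton:.0f}/ton, or ${annual_impact:,.0f}/year "
      f"on {ASSUMED_ANNUAL_TONS:,} tons")
# -> Swing: $44/ton, or $2,200,000/year on 50,000 tons
```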

Mr. Warner may limit the recyclable items collected from Lancaster County’s 500,000 residents to those that have retained some value, such as cans and corrugated cardboard. He said mixed plastic isn’t worth processing.

“You might as well put it in the trash from the get-go,” he said.

Environmentalists are hoping landfills are only a stopgap fix for the glut of recyclables while the industry finds new markets and reduces contaminants.

“Stuff is definitely getting thrown away in landfills. Nobody is happy about it,” said Dylan de Thomas, vice president of industry collaboration for the Recycling Partnership in Virginia. “There are very few landfill owners that don’t operate recycling facilities, too. They’d much rather be paid for those materials.”

Pacific Rim Recycling in Benicia, Calif., slowed operations at its plant early this year to meet China’s new standard. But company President Steve Moore said the more intensive sorting process takes too long to process scrap profitably. Pacific Rim idled its processing plant in February and furloughed 40 of its 45 employees.

“The cost is impossible. We can’t make money at it,” Steve Moore said. “We quit accepting stuff.”

China stopped taking shipments of U.S. mixed paper and mixed plastic in January. Steve Moore said mixed-paper shipments to other Asian countries now fetch $5 a ton, down from as much as $150 last year. Other buyers such as Vietnam and India have been flooded with scrap paper and plastic that would have been sold to China in years past.

Dave Vaccarezza, president of Cal-Waste Recovery Systems near Sacramento, Calif., intends to invest more than $6 million in new sorting equipment to produce cleaner bales of recyclables.

“It’s going to cost the rate payer to recycle,” he said. “They’re going to demand we make our best effort to use those cans and bottles they put out.”

Cal-Waste Recovery Systems workers sift through recycled trash.

Sacramento County, which collects trash and recyclables from 151,000 homes, used to earn $1.2 million a year selling scrap to Waste Management and another processor. Now, the county is paying what will amount to about $1 million a year, or roughly $35 a ton, to defray the processors' costs. Waste Management paid the county $250,000 to break the revenue-sharing contract and negotiate those terms.

County waste management director Doug Sloan expects those costs to keep climbing. “We’ve been put on notice that we need to do our part,” he said. The county hasn’t yet raised residential fees.

Some recyclers said residents and municipalities need to give up the “single-stream” approach of lumping used paper and cardboard together with glass, cans and plastic in one collection truck. Single-stream collections took hold in the waste-hauling industry about 20 years ago and continue to be widely used. Collecting paper separately would make curbside recycling service more expensive but cut down on contamination.

“We’re our own worst enemies,” said Michael Barry, president of Mid America Recycling, a processing-plant operator in Des Moines, Iowa, of single-stream recycling. “It’s almost impossible to get the paper away from the containers.”

Even relatively pure loads of paper have become tough to sell, Mr. Barry said, noting the domestic market for paper is saturated as well. He stockpiled paper bales at Mid America’s warehouse, hoping prices would improve. They didn’t. He has trucked 1,000 tons of paper to a landfill in recent weeks.

“We had to purge,” he said. “There’s no demand for it.”

World's first electrified road for charging vehicles opens in Sweden

Stretch of road outside Stockholm transfers energy from two tracks of rail in the road, recharging the batteries of electric cars and trucks

Supported by The Skoll Foundation

Daniel Boffey in Brussels

Thu 12 Apr 2018 11.55 EDT First published on Thu 12 Apr 2018 08.40 EDT

The world’s first electrified road that recharges the batteries of cars and trucks driving on it has been opened in Sweden.

About 2km (1.2 miles) of electric rail has been embedded in a public road near Stockholm, but the government’s roads agency has already drafted a national map for future expansion.

Sweden’s target of achieving independence from fossil fuel by 2030 requires a 70% reduction in the transport sector.

The technology behind the electrification of the road linking Stockholm Arlanda airport to a logistics site outside the capital city aims to solve the thorny problems of keeping electric vehicles charged, and the manufacture of their batteries affordable.

Energy is transferred from two tracks of rail in the road via a movable arm attached to the bottom of a vehicle. The design is not dissimilar to that of a Scalextric track, although should the vehicle overtake, the arm is automatically disconnected.

The electrified road is divided into 50m sections, with an individual section powered only when a vehicle is above it.

When a vehicle stops, the current is disconnected. The system is able to calculate the vehicle’s energy consumption, which enables electricity costs to be debited per vehicle and user.
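
As a rough illustration of how such per-vehicle billing could work, the sketch below sums the energy metered by individual road sections for each vehicle and applies a flat tariff. It is a simplified model, not eRoadArlanda's actual billing system; the tariff, vehicle IDs and meter readings are invented for the example.

```python
from collections import defaultdict

# Simplified illustration of per-vehicle billing on a sectioned electric road.
# Not eRoadArlanda's actual system; tariff and meter readings are made up.
TARIFF_EUR_PER_KWH = 0.25  # hypothetical electricity price

# (vehicle_id, kWh delivered by one 50 m section while that vehicle was over it)
section_readings = [
    ("truck-17", 0.6),
    ("truck-17", 0.7),
    ("car-42", 0.1),
    ("truck-17", 0.5),
    ("car-42", 0.2),
]

# Sum the metered energy per vehicle, then bill at the flat tariff.
energy_by_vehicle = defaultdict(float)
for vehicle_id, kwh in section_readings:
    energy_by_vehicle[vehicle_id] += kwh

for vehicle_id, kwh in energy_by_vehicle.items():
    print(f"{vehicle_id}: {kwh:.1f} kWh -> EUR {kwh * TARIFF_EUR_PER_KWH:.2f}")
# -> truck-17: 1.8 kWh -> EUR 0.45
# -> car-42: 0.3 kWh -> EUR 0.08
```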

The “dynamic charging” – as opposed to the use of roadside charging posts – means the vehicle’s batteries can be smaller, along with their manufacturing costs.

A former diesel-fuelled truck owned by the logistics firm, PostNord, is the first to use the road.

Hans Säll, chief executive of the eRoadArlanda consortium behind the project, said both current vehicles and roadways could be adapted to take advantage of the technology.

In Sweden there are roughly half a million kilometres of roadway, of which 20,000km are highways, Säll said.

“If we electrify 20,000km of highways that will definitely be enough,” he added. “The distance between two highways is never more than 45km and electric cars can already travel that distance without needing to be recharged. Some believe it would be enough to electrify 5,000km.”

Building the eRoadArlanda: the government's roads agency has already drafted a national map for future expansion. Photograph: Joakim Kröger/eRoadArlanda

At a cost of €1m per kilometre, the cost of electrification is said to be 50 times lower than that required to construct an urban tram line.
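
Taking the figures quoted in this article at face value, the national build-out cost is simple arithmetic: roughly €1m per kilometre multiplied by either the 20,000 km of highways Säll mentions or the 5,000 km some consider sufficient. The short sketch below does nothing more than that multiplication.

```python
# Quick arithmetic using only the figures quoted in the article.
COST_PER_KM_EUR = 1e6  # roughly 1 million euros per kilometre of electrified road
scenarios_km = {"all highways": 20_000, "minimal network": 5_000}

for label, km in scenarios_km.items():
    total = km * COST_PER_KM_EUR
    print(f"Electrifying {km:,} km ({label}): ~EUR {total/1e9:.0f} billion")
# -> Electrifying 20,000 km (all highways): ~EUR 20 billion
# -> Electrifying 5,000 km (minimal network): ~EUR 5 billion
```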

Säll said: “There is no electricity on the surface. There are two tracks, just like an outlet in the wall. Five or six centimetres down is where the electricity is. But if you flood the road with salt water then we have found that the electricity level at the surface is just one volt. You could walk on it barefoot.”

National grids are increasingly moving away from coal and oil, and battery storage is seen as crucial to changing the source of the energy used in transportation.

The Swedish government, represented by a minister at the formal inauguration of the electrified road on Wednesday, is in talks with Berlin about a future network. In 2016, a 2km stretch of motorway in Sweden was adapted with similar technology but through overhead power lines at lorry level, making it unusable for electric cars.

Lawsuits filed against L.A. County, lenders over green energy program

By Andrew Khouri

APR 12, 2018 | 1:45 PM

Homeowner Reginald Nemore says he can't afford his PACE loan for solar panels and is worried he will lose his house. (Wally Skalij / Los Angeles Times)

Attorneys representing homeowners filed lawsuits Thursday against Los Angeles County, alleging a county program that funds solar panels and other energy-efficient home improvements is a "plague" that's ruined the finances of many borrowers by saddling them with loans they cannot afford.

The twin suits seeking class-action status were filed in Los Angeles County Superior Court against the county, as well as Renovate America and Renew Financial, the county's private lender partners for the Property Assessed Clean Energy program.

The complaints say borrowers are now at risk of losing their homes because the loans are liens on a house, lacked adequate consumer protections, and were marketed and sold by unscrupulous contractors that were not properly monitored.

"Many PACE participants are living hand-to-mouth to hold onto their homes, fearful of what is yet to come," the nearly identical suits say. They note that many of those in trouble have low incomes, are elderly or don't speak English as their first language.

L.A. County representatives did not respond to a request for comment.

In a statement, Renew Financial said it couldn't comment on pending litigation, but said it's worked with government authorities to improve consumer protections, including new state laws that took effect this year.

Renovate America said it similarly supported the enhanced protections and finds "no merit in the allegations in the complaints."

Specifically, the lawsuits allege the county and lenders have committed financial elder abuse, while the lenders charged inflated interest rates and broke a county contract that said they were to provide "best in class" protections against predatory lending and special safeguards for seniors.

While the lenders have said they checked borrowers for previous bankruptcies or missed mortgage payments prior to approval, they did not ask for their incomes until recently, basing approvals largely on home equity.

The lawsuits said the county knew, or should have known, its program would harm vulnerable homeowners and has looked the other way as problems piled up.

The suits seek class-action status for borrowers who took out loans between March 2015 and March 2018 and who had excessive debt-to-income ratios or were left with very little residual income after making their loan payments. The lawsuits said the class size is unknown, but argued PACE has been a "disaster for thousands of vulnerable homeowners."

The suits, brought by Irell & Manella, Public Counsel and Bet Tzedek, ask that loans for borrowers in the class be canceled and that homeowners be returned any money paid.

"We can't keep up with the number of complaints about this program," said Jennifer H. Sperling, an attorney with Bet Tzedek. "This is a systemic problem."

The class time frame was chosen because as of April 1, under a new state law, PACE lenders must ask for a borrower's income and make a "good faith determination" of a borrower's ability to repay.

Another new bill mandated that PACE providers have a phone conversation with all homeowners before they take out the loan to ensure they understand the terms. Renew Financial has said it's always had such calls and Renovate America said it started doing so before the law passed.

The bills, which included other reforms as well, were signed by Gov. Jerry Brown last year following repeated complaints over unscrupulous contractors who market and sign people up for the loans on tablet computers and smartphones.

"If they had let me know from Day One this is what [you are] going to get into ... there is no way I would have signed," said Reginald Nemore, a 58-year-old former bus driver who had to retire after a back injury.

He took out a Renovate America loan for solar panels and attic insulation in 2016. Nemore said before a contractor handed him a smartphone to sign, the individual didn't explain to him exactly how much he would be paying. He said he was told he'd qualify for a $7,000 government check for going green, but found out it isn't available to him.

Nemore said he wasn't told he could lose his house if he didn't pay and only found out the true cost when paperwork arrived in the mail after the loan was finalized. He now owes roughly $240 a month for 25 years, even though he said he and his wife, who suffers from multiple sclerosis, sometimes only have $50 or less in their checking account each month.

Nemore's attorney said he and his wife only take in around $2,475 from Social Security disability payments.

"I don't want to be uprooted," Nemore said. "But I don't know if it is going to be up to me to have that choice."

First started in 2008, PACE programs are established by government authorities, which allow the privately financed loans to be repaid as line items on property tax bills. In addition to solar panels and energy-efficient air conditioners, the loans in California can be used for items such as low-flow toilets that save water.

In Southern California, Los Angeles, Riverside, San Bernardino and San Diego counties have approved PACE programs and contracted with private lenders to fund and largely operate the efforts.

If PACE loans go unpaid, a homeowner can be foreclosed upon. However, Renovate America and Renew Financial — which partner with Los Angeles County and issue loans under the Hero and California First programs, respectively — told The Times last year they hadn't foreclosed on anyone for nonpayment of PACE loans.

L.A. County said it had set up reserve funds to cover missed borrower payments for a time, making a quick foreclosure unlikely for those who've missed PACE payments.

Program proponents, including the lenders, have praised the programs as a tool for saving energy and combating global warming. And they say most customers have come away happy.

A recent study from the Lawrence Berkeley National Laboratory found PACE programs boosted the deployment of residential solar photovoltaic systems in California cities with the program by 7% to 12% between 2010 and 2015.

But consumer groups say contractors who serve as de-facto mortgage brokers too often misrepresent how the financing works, sticking people with loans they can neither understand nor afford. And they note mortgage servicers will often cover the PACE bills for a time and might be the ones foreclosing upon a delinquent homeowner.

In the lawsuits filed Thursday, attorneys allege on behalf of their clients that Renovate America and Renew Financial failed to screen and monitor their network of contractors and encouraged predatory lending and aggressive marketing.

Often, the suit says, contractors served as a homeowner's "primary" source of information about the loans. Lenders told contractors they didn't need to determine if customers could afford the loan and rubber stamped payment to contractors, without regard to whether they followed certain guidelines, the lawsuits allege.

Renovate America has said it's tried to weed out bad contractors, kicking them out when they don't follow the rules. In October, the company announced its chief executive was stepping down and it retained a law firm to "conduct and make public a third-party review of practices and procedures."

In its statement Thursday, the company said a different outside review of its loans for the Western Riverside Council of Governments found it followed the applicable consumer protections that the agency set up more than 99% of the time.

The Climate and the Cross: evangelical Christians debate climate change

The latest Guardian documentary asks whether a group of Christian Americans might offer salvation for the country’s attitude towards climate change

Charlie Phillips

Fri 13 Apr 2018 07.00 EDT Last modified on Fri 13 Apr 2018 10.11 EDT

A new Guardian documentary, The Climate and the Cross, explores a battle among US evangelicals over whether climate change is real, and whether protecting the Earth is the work of God and therefore to be welcomed, or whether the problem does not exist at all.

Evangelicals have traditionally been the bedrock of conservative politics in the US, including on climate change. But a heated debate is taking place across the country, with some Christians protesting in the name of protecting the Earth. One group has built a chapel in the way of a pipeline, and a radical pastor has encouraged his congregation to put themselves in the way of diggers. Meanwhile, a firm supporter of Donald Trump crisscrosses the country promoting solar power.

But the traditional resistance remains: a climate scientist who denies that the world is warming, and a preacher in Florida who sees the flooding he experienced as a positive sign of divine presence.

The Climate and the Cross

The film-makers, Chloe White and Will Davies, make documentaries, animated content, educational, promotional and fundraising films. Their previous documentary for the Guardian, Quiet Videos, was about the makers of ASMR videos that are said to give viewers “head orgasms”. It toured international film festivals to much acclaim.

Video:

https://youtu.be/qsUlVk__2sM

Running out of Lake Erie options, Ohio looks to get tougher on farmers

By Tom Henry | Blade Staff Writer | Published on April 13, 2018 | Updated 6:20 p.m.

Nearly four years after Toledo’s 2014 water crisis, the Maumee River and other western Lake Erie tributaries are still so besieged by algae-forming phosphorus that the Kasich administration now agrees it has no recourse but to impose tough, new regulations on farmers.

Ohio Environmental Protection Agency Director Craig Butler during an interview with The Blade left no doubt what’s being contemplated would be a fundamental shift — one the state’s powerful agricultural industry was hoping to avoid — away from the administration’s longstanding emphasis on voluntary incentives.

Such incentives are designed to entice farmers to be better stewards of their land through the usual suite of buffer strips, cover crops, drainage-control structures, and windbreaks. They certainly will continue to be encouraged, because they do reduce nutrient-enriched agricultural runoff to some degree, he said.

But voluntary incentives are not yielding nearly enough results and are not being embraced widely enough for Ohio to achieve the goal it made with Michigan and Ontario to reduce phosphorus levels 40 percent over 2008 levels by 2025, Mr. Butler said.

“We don't see the trend line moving big enough or fast enough. It's time for us to consider the next step,” Mr. Butler said.

Joe Cornely, Ohio Farm Bureau Federation spokesman, said he’s disappointed by the recent developments.

“This is a massive shift. It comes as a bit of a surprise, given the Kasich administration's track record of trying to be thoughtful and deliberate in terms of regulating any industry, including agriculture,” he said. “Where is the wisdom in simply slapping on one more set of regulations? The fact there is no bill introduced yet probably tells you something.”

The non-binding agreement Ohio made with Michigan and Ontario two years ago also includes an interim goal of 20 percent less phosphorus by 2020.

The Ohio EPA is quietly courting legislators, lobbyists, and public officials to throw their support behind a yet-to-be-introduced bill that Mr. Butler claims would provide him with a powerful enforcement tool.

One key part of it: Expanding the state’s definition of “agricultural pollution” to include farm runoff.

That would direct the state agriculture department to establish rules for streams classified as “watersheds in distress” because of fertilizer that escapes from fields.

It would also make it easier for the Ohio EPA to move in with requirements for strict nutrient-management plans in those areas. Those plans would, among other things, limit the spread of manure and commercial fertilizer. Applications would not be banned but would be much more regulated based on soil tests, Mr. Butler said.

“It is critical to change that definition,” he said.

Another goal of the bill is to put a statewide cap on phosphorus discharges from all sewage-treatment plants across Ohio at 1.0 part per million or less.

Toledo’s permit, which expires July 31, 2019, allows the city to discharge no more than 1.0 ppm. But many other plants, especially smaller ones, don’t have to adhere to that amount.

The goal of reducing phosphorus levels by 40 percent was promoted by area scientists on a task force created a few years ago by the U.S.-Canada International Joint Commission, a 109-year-old government agency that helps the two nations resolve boundary water issues.

The year 2008 was selected as the benchmark because it was seen as one of the more typical years for agricultural runoff over the past decade. Even with a 40 percent reduction, though, scientists believe the region will continue to have some algae, just not as often or as much.

Ohio Rep. Steve Arndt (R., Port Clinton) is being courted as the lead sponsor of the bill Mr. Butler wants.

But Mr. Arndt said he has some concerns about it.

He said the Ohio EPA first approached him about sponsoring legislation a year ago, but that the part about distressed watersheds didn’t come up until last month. He said he’s now working on the fourth version of the bill.

He agrees with Mr. Butler, though, that Ohio is likely on the verge of getting tougher with farmers because it has few options left.

“We are not going to get there [to the 40 percent reduction] with any of the present strategies in place,” Mr. Arndt said.

The Ohio EPA’s 2017 Lake Erie tributary report showed high concentrations of phosphorus and other nutrients; 2017 was the fourth-worst year for algae since 2002.

The report is a compendium of data collected by the U.S. Geological Survey, Heidelberg University, the Ohio Department of Natural Resources, and the Ohio EPA.

By May 1 of last year, the Maumee River had 860 metric tons of phosphorus recorded at Heidelberg University’s Waterville station, a figure the river is not supposed to surpass for an entire year if it is going to achieve a 40 percent reduction. It ended up recording 1,800 metric tons for the year.
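As a rough check on how those figures fit together, here is a minimal back-of-the-envelope sketch in Python. It assumes, as the article states, that 860 metric tons is the annual Maumee target corresponding to a 40 percent cut; the implied baseline it prints is derived from that assumption, not a number reported by the Ohio EPA.

# Back-of-the-envelope check of the Maumee River phosphorus figures cited above.
# Assumes 860 metric tons is the annual target representing a 40 percent
# reduction from the baseline load.
TARGET_TONS = 860          # annual target the river is not supposed to surpass
REDUCTION = 0.40           # 40 percent reduction goal
RECORDED_2017 = 1800       # metric tons actually recorded for the year

implied_baseline = TARGET_TONS / (1 - REDUCTION)   # about 1,433 metric tons
overshoot = RECORDED_2017 / TARGET_TONS            # about 2.1 times the target

print(f"Implied baseline load: {implied_baseline:.0f} metric tons per year")
print(f"2017 load vs. target: {overshoot:.1f}x the annual target")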

Mr. Butler said 860 metric tons should be the Maumee’s “new normal,” but it’s not. The figures have remained consistently high since the early 2000s, except during drier-than-usual springs such as 2012.

“This is all about the amount of phosphorus coming down the Maumee,” he said. “The numbers are pretty telling.”

Mr. Butler said the new strategy is unrelated to the state’s recent agreement to designate western Lake Erie’s open water as impaired. The impairment requirements will be dealt with later, once agency officials have a better idea what they’re supposed to do to comply with them, he said.

“You can’t confuse it with the impairment designation,” Mr. Butler said.

He said almost 90 percent of the region’s phosphorus pollution now is believed to come from nonpoint sources, with agriculture by far being the largest of those.

A mass balance report coming out soon is expected to help the Ohio EPA better identify where most of the phosphorus is coming from, Mr. Butler said.

Karl Gebhardt, Ohio Lake Erie Commission executive director and Ohio EPA deputy director for water resources, said some of the ideas for the Lake Erie watershed have been inspired by some of the success the agency has had in limiting manure-spreading near Grand Lake St. Marys since a massive algal bloom there in 2010.

Data near Grand Lake St. Marys “shows water going into it is much better now than it was before when it was a watershed in distress,” he said.

US judge blocks weed-killer warning label in California

SAN FRANCISCO (AP) — A U.S. judge blocked California from requiring that the popular weed-killer Roundup carry a label stating that it is known to cause cancer, saying the warning is misleading because almost all regulators have concluded there is no evidence that the product’s main ingredient is a carcinogen.

U.S. District Judge William Shubb in Sacramento issued a preliminary injunction on Monday in a lawsuit challenging the state’s decision last year to list glyphosate as a chemical known to cause cancer.

The listing triggered the warning label requirement for Roundup that was set to go into effect in July.

Glyphosate is not restricted by the U.S. Environmental Protection Agency and has been widely used since 1974 to kill weeds while leaving crops and other plants alive.

The International Agency for Research on Cancer, based in Lyon, France, has classified the chemical as a “probable human carcinogen.” That prompted the California Office of Environmental Health Hazard Assessment to add glyphosate to its cancer list.

Shubb said a “reasonable consumer would not understand that a substance is ‘known to cause cancer’ where only one health organization had found that the substance in question causes cancer.”

“On the evidence before the court, the required warning for glyphosate does not appear to be factually accurate and uncontroversial because it conveys the message that glyphosate’s carcinogenicity is an undisputed fact,” he said.

Sam Delson, a spokesman for the state’s office of environmental health hazard assessment, noted that the judge did not block the state from putting glyphosate on its cancer list, but only from requiring the warning label.

“We are pleased that the listing of glyphosate remains in effect, and we believe our actions were lawful,” he said.

He said the office had not decided yet whether to appeal the ruling on Monday.

The ruling came in a lawsuit filed by the national wheat and corn growers associations, state agriculture and business organizations in Iowa, Missouri, North Dakota and South Dakota, and a regional group representing herbicide sellers in California, Arizona and Hawaii. The plaintiffs also include St. Louis-based Monsanto Co., which makes Roundup.

“Every regulatory body in the world that has reviewed glyphosate has found it safe for use, and no available product matches glyphosate with a comparable health and environmental safety profile,” Chandler Goule, CEO of the National Association of Wheat Growers, said in a statement.

The lawsuit will continue with Shubb’s injunction in place.

A brief history of plastic, design’s favorite material

More and more Americans reckon with their wild overuse of plastic. Here’s how we got so dependent on the stuff in the first place.

BY ERIC BECKMAN

From its early beginnings during and after World War II, the commercial industry for polymers–long-chain synthetic molecules for which “plastics” is a common misnomer–has grown rapidly. In 2015, over 320 million tons of polymers, excluding fibers, were manufactured across the globe.

Until the last five years, polymer product designers have typically not considered what will happen after the end of their product’s initial lifetime. This is beginning to change, and this issue will require increasing focus in the years ahead.

“Plastic” has become a somewhat misguided way to describe polymers. Typically derived from petroleum or natural gas, these are long-chain molecules with hundreds to thousands of links in each chain. Long chains convey important physical properties, such as strength and toughness, that short molecules simply cannot match.

“Plastic” is actually a shortened form of “thermoplastic,” a term that describes polymeric materials that can be shaped and reshaped using heat.

The modern polymer industry was effectively created by Wallace Carothers at DuPont in the 1930s. His painstaking work on polyamides led to the commercialization of nylon, as a wartime shortage of silk forced women to look elsewhere for stockings.

When other materials became scarce during World War II, researchers looked to synthetic polymers to fill the gaps. For example, the supply of natural rubber for vehicle tires was cut off by the Japanese conquest of Southeast Asia, leading to a synthetic polymer equivalent.

Curiosity-driven breakthroughs in chemistry led to further development of synthetic polymers, including the now widely used polypropylene and high-density polyethylene. Some polymers, such as Teflon, were stumbled upon by accident.

Eventually, the combination of need, scientific advances, and serendipity led to the full suite of polymers that you can now readily recognize as “plastics.” These polymers were rapidly commercialized, thanks to a desire to reduce products’ weight and to provide inexpensive alternatives to natural materials like cellulose or cotton.

TYPES OF PLASTIC

The production of synthetic polymers globally is dominated by the polyolefins–polyethylene and polypropylene.

Polyethylene comes in two types: “high density” and “low density.” On the molecular scale, high-density polyethylene looks like a comb with regularly spaced, short teeth. The low-density version, on the other hand, looks like a comb with irregularly spaced teeth of random length–somewhat like a river and its tributaries if seen from high above. Although they’re both polyethylene, the differences in shape make these materials behave differently when molded into films or other products.

Polyolefins are dominant for a few reasons. First, they can be produced using relatively inexpensive natural gas. Second, they’re the lightest synthetic polymers produced at large scale; their density is so low that they float. Third, polyolefins resist damage by water, air, grease, cleaning solvents–all things that these polymers could encounter when in use. Finally, they’re easy to shape into products, while robust enough that packaging made from them won’t deform in a delivery truck sitting in the sun all day.

However, these materials have serious downsides. They degrade painfully slowly, meaning that polyolefins will survive in the environment for decades to centuries. Meanwhile, wave and wind action mechanically abrades them, creating microparticles that can be ingested by fish and animals, making their way up the food chain toward us.

Recycling polyolefins is not as straightforward as one would like owing to collection and cleaning issues. Oxygen and heat cause chain damage during reprocessing, while food and other materials contaminate the polyolefin. Continuing advances in chemistry have created new grades of polyolefins with enhanced strength and durability, but these cannot always mix with other grades during recycling. What’s more, polyolefins are often combined with other materials in multilayer packaging. While these multilayer constructs work well, they are impossible to recycle.

Polymers are sometimes criticized for being produced from increasingly scarce petroleum and natural gas. However, the fraction of either natural gas or petroleum used to produce polymers is very low; less than 5% of either oil or natural gas produced each year is employed to generate plastics. Further, ethylene can be produced from sugarcane ethanol, as is done commercially by Braskem in Brazil.

HOW PLASTIC IS USED

Depending upon the region, packaging consumes 35% to 45% of the synthetic polymer produced in total, where the polyolefins dominate. Polyethylene terephthalate, a polyester, dominates the market for beverage bottles and textile fibers.

Building and construction consumes another 20% of the total polymers produced, where PVC pipe and its chemical cousins dominate. PVC pipes are lightweight, can be glued rather than soldered or welded, and greatly resist the damaging effects of chlorine in water. Unfortunately, the chlorine atoms that give PVC this advantage also make it very difficult to recycle–most is discarded at the end of its life.

Polyurethanes, an entire family of related polymers, are widely used in foam insulation for homes and appliances, as well as in architectural coatings.

The automotive sector uses increasing amounts of thermoplastics, primarily to reduce weight and thereby meet stricter fuel-efficiency standards. The European Union estimates that 16% of the weight of an average automobile is plastic components, most notably for interior parts and components.

Over 70 million tons of thermoplastics per year are used in textiles, mostly clothing and carpeting. More than 90% of synthetic fibers, largely polyethylene terephthalate, are produced in Asia. The growth in synthetic fiber use in clothing has come at the expense of natural fibers like cotton and wool, which require significant amounts of farmland to be produced. The synthetic fiber industry has seen dramatic growth for clothing and carpeting, thanks to interest in special properties like stretch, moisture-wicking, and breathability.

As in the case of packaging, textiles are not commonly recycled. The average U.S. citizen generates over 90 pounds of textile waste each year. According to Greenpeace, the average person in 2016 bought 60% more items of clothing every year than the average person did 15 years earlier, and keeps the clothes for a shorter period of time.

This post originally appeared on The Conversation. Eric Beckman is professor of chem/petroleum engineering at the University of Pittsburgh.

7 years later, new study shows East Chicago kids exposed to more lead because of flawed government report

Lead contamination forces families from their homes in East Chicago

1,000 East Chicago residents were ordered to leave a housing complex in East Chicago tainted with lead and arsenic and find new lodging.

Michael Hawthorne, Contact Reporter

After federal officials assessed potential health hazards in East Chicago seven years ago, they declared young children could play safely in neighborhoods built on or near former industrial sites contaminated with brain-damaging lead.

Now the same government agency confirms it woefully misled parents and city officials.

Kids living in two of the contaminated neighborhoods actually were nearly three times more likely to suffer lead poisoning during the past decade than if they lived in other parts of the heavily industrialized northwest Indiana city, according to a report unveiled last week by an arm of the U.S. Centers for Disease Control and Prevention.

Written in dry, bureaucratic language, the mea culpa is the latest acknowledgement that federal and state officials repeatedly failed to protect residents in the low-income, predominantly Hispanic and African-American city, despite more than three decades of warnings about toxic pollution left by the USS Lead smelter and other abandoned factories.

The new report comes as the U.S. Environmental Protection Agency oversees the excavation of contaminated soil from dozens of East Chicago yards, a project delayed in part because the CDC branch — the Agency for Toxic Substances and Disease Registry — advised the EPA in 2011 that “breathing the air, drinking tap water or playing in soil in neighborhoods near the USS Lead Site is not expected to harm people’s health.”

Many current and former residents are angry the hazards weren’t identified and removed years ago. Some had no idea they were living on toxic land until East Chicago’s mayor ordered the 2016 evacuation of the former West Calumet Housing Complex, built on the grounds of another lead smelter.

“Nobody ever told us about the lead,” said Akeeshea Daniels, a lifelong city resident who moved into the taxpayer-funded complex a month after her now-14-year-old son was born. “It’s been two years since they forced us to move, and we still haven’t gotten our questions answered.”

With the help of public interest lawyers working on behalf of former tenants, Daniels said she discovered medical records that showed her son had elevated levels of lead in his blood when he was a toddler, a possible explanation for the attention deficit hyperactivity disorder he was diagnosed with a few years later.

The EPA found lead levels in Daniels’ apartment as high as 32,000 parts per million — 80 times the federal safety limit for areas where children play.

Lead is unsafe to consume at any level, according to the EPA and CDC. Ingesting tiny concentrations can permanently damage the developing brains of children and contribute to heart disease, kidney failure and other health problems later in life. A recent peer-reviewed study estimated more than 400,000 deaths a year in the U.S. are linked to lead exposure — or 18 percent of all deaths.

According to the new report by the toxic substances registry, more than 27 percent of West Calumet children ages 5 and younger who were tested between 2005 and 2015 had lead levels at or above 5 micrograms per deciliter of blood, the CDC’s current threshold for medical intervention and home inspections. Roughly the same percentage of kids were poisoned in the residential neighborhood immediately east of the former housing complex.

Photo: Two plastic swans help hold up a sign that reads: "Do not play in the dirt or around the mulch" at a housing unit at the West Calumet Housing Complex in East Chicago in 2016. (Antonio Perez/Chicago Tribune)

By contrast, 11.5 percent of young kids in the rest of East Chicago and 7.8 percent of children statewide had lead levels at or above the CDC threshold in the same period, according to the report.
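To put those percentages side by side, the short Python sketch below uses the rounded figures quoted above; the ratios it prints are rough approximations and need not match the report's own comparisons, which may group children differently, and the implied soil limit simply divides the 32,000 ppm reading by the "80 times" multiple cited earlier.

# Rough comparison using the rounded figures from the article, not the ATSDR report.
west_calumet = 27.0   # % of young kids at or above 5 micrograms per deciliter
rest_of_city = 11.5   # % in the rest of East Chicago
statewide = 7.8       # % across Indiana

print(f"West Calumet vs. rest of city: {west_calumet / rest_of_city:.1f}x")  # ~2.3x
print(f"West Calumet vs. statewide:    {west_calumet / statewide:.1f}x")     # ~3.5x

# The 32,000 ppm soil reading was described as 80 times the federal limit
# for areas where children play, which implies a limit of about:
print(f"Implied play-area soil limit: {32000 / 80:.0f} ppm")                 # 400 ppm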

By comparison, in Flint, Mich., site of the infamous public health disaster in which actions by state-appointed officials triggered alarming levels of lead in drinking water, about 4.9 percent of children tested had high lead levels.

Federal health officials based their flawed 2011 assessment of East Chicago on lead poisoning rates throughout the city rather than data from specific neighborhoods targeted by the EPA’s Superfund program for highly contaminated industrial sites.

Relying on the wrong data led to erroneous or misleading conclusions such as “declining blood lead levels in small children appear to confirm that they are no longer exposed to lead from any source.”

Since the EPA tends to prioritize cleanups the same way battlefield medics assess the wounded — concentrating on immediate or obvious risks first — agency officials have said the 2011 report factored heavily into their decision to focus initially on a gradual approach at the East Chicago site.

The toxic substances registry is required by law to assess risks to public health at every Superfund site. Its new report pins the blame for its East Chicago snafu on Indiana state officials who provided data used in the 2011 report.

A spokesperson for the health agency promised a response but did not answer emailed questions from the Tribune. In a statement, the EPA said it has removed contaminated soil from more than 400 yards so far and “will continue to partner and coordinate with ATSDR to protect human health and the environment.”

Local, state and federal officials had plenty of reasons to act years ago.

They always knew the public housing complex had been built on the site of a former lead smelter. In 1994, the toxic substances registry recommended extensive soil testing and blood screening, and four years later the agency reported unusually high rates of childhood lead poisoning in the area.

Yet a 2016 Tribune investigation found officials were slow to propose a cleanup or inform residents about potential hazards. Only a single soil sample from the public housing complex had been collected and analyzed before the EPA declared the development and surrounding neighborhoods a Superfund site in 2009, the newspaper found.

Since then, lead poisoning rates have been declining in neighborhoods within the Superfund site boundaries, as well as in all of East Chicago, the rest of Indiana and the nation as a whole, according to the new report from the federal health agency. But the percentage of kids tested throughout Indiana also has declined dramatically in recent years, suggesting an untold number of poisoned children aren’t accounted for in official statistics.

“These results reflect the harms that stem from environmental injustice,” said Debbie Chizewer, a lawyer with the Environmental Advocacy Center at Northwestern University's Bluhm Legal Center, who has been working with East Chicago residents to ensure they have safe housing. “Government agencies at all levels all failed this community.”

Full Report:

https://www.atsdr.cdc.gov/HAC/pha/USSmelterandLeadRefinery/US_Smelter_Lead_Refinery_HC_2018-508.pdf

TURNING COFFEE WASTE INTO COFFEE CUPS

21 AUGUST 2018

A Macquarie PhD student believes he’s come up with a way to turn coffee waste into biodegradable plastic coffee cups.

He’s developed a method to turn coffee grounds into lactic acid, which can then be used to produce biodegradable plastics, and is now refining the process as he finishes his PhD.

“Australians consume six billion cups of coffee every year, and the coffee grounds used to make these coffees are used only once and then discarded,” says researcher Dominik Kopp.

“In Sydney alone, over 920 cafes and coffee shops produce nearly 3,000 tonnes of waste coffee grounds every year.

“Ninety-three per cent of this waste ends up in landfill, where it produces greenhouse gases that contribute to global warming.”

However, 50 per cent of coffee grounds are made up of sugars, which are ideal candidates to convert into valuable bio-based chemicals, or chemicals derived from plant- or animal-based feedstocks rather than crude oil.
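A quick, purely illustrative calculation (Python) shows the scale implied by those figures; it just applies the stated percentages to Sydney's reported 3,000 tonnes of grounds, so the totals are rough estimates rather than measured values.

# Illustrative arithmetic based on the figures quoted above.
sydney_grounds_tonnes = 3000   # waste coffee grounds produced per year in Sydney
landfill_fraction = 0.93       # share reported to end up in landfill
sugar_fraction = 0.50          # share of grounds made up of sugars

landfilled = sydney_grounds_tonnes * landfill_fraction   # about 2,790 tonnes
sugars = sydney_grounds_tonnes * sugar_fraction          # about 1,500 tonnes

print(f"Grounds sent to landfill each year: ~{landfilled:.0f} tonnes")
print(f"Sugars potentially available as feedstock: ~{sugars:.0f} tonnes")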

“Our group is looking for new ways to convert biowaste—whether that be agricultural, garden, paper or commercial food waste—into valuable raw materials that can be used to produce high-value compounds in more environmentally-friendly ways,” says Associate Professor Anwar Sunna, Dominik’s supervisor and head of the Sunna Lab which is using the rapidly growing field of synthetic biology to address biotechnology and biomedical challenges.

Dominik sourced coffee grounds from one of the coffee shops on Macquarie’s campus and took them back to the lab.

“We assembled a synthetic pathway to convert the most abundant sugar in the coffee grounds, mannose, into lactic acid,” he says.

“Lactic acid can be used in the production of biodegradable plastics, offering a more sustainable and environmentally-friendly alternative to fossil fuel-derived plastics.

“You could use such plastics to make anything from plastic coffee cups to yoghurt containers to compost bags to sutures in medicine.”

Their method was inspired by a metabolic pathway that is thought to exist in an evolutionarily ancient organism, which lives in hot and extremely acidic environments.

Dominik was awarded the INOFEA Early Career Award for Applied Biocatalysis or Nanobiotechnology for the poster he presented on his research at the 18th European Congress on Biotechnology last month.

His next step will be to further refine his conversion pathway, and improve the yield of lactic acid.

“I think my project is one of many interesting approaches on how to use synthetic biology in a responsible manner for the development of a more sustainable and greener industry that doesn’t rely on crude oil,” says Dominik.

“The simple idea that we are converting waste into a valuable and sustainable product is extremely exciting!”

Kopp, D., Care, A., Bergquist, P.L., Willows, R. and Sunna, A. Cell-free synthetic pathway for the conversion of spent coffee grounds into lactic acid. Poster presented at: 18th European Congress on Biotechnology; 2018 July 1-4; Geneva, Switzerland.

Links to interesting Graphics Below :

https://images.fastcompany.net/image/upload/w_596,c_limit,q_auto:best,f_auto/wp-cms/uploads/2018/08/i-1-90217212-a-brief-history-of-plastic-designand8217s-favorite-material.jpg

https://images.fastcompany.net/image/upload/w_596,c_limit,q_auto:best,f_auto/wp-cms/uploads/2018/08/i-2-90217212-a-brief-history-of-plastic-designand8217s-favorite-material.jpg

https://images.fastcompany.net/image/upload/w_596,c_limit,q_auto:best,f_auto/wp-cms/uploads/2018/08/i-4-90217212-a-brief-history-of-plastic-designand8217s-favorite-material.jpg

Biorenewable, biodegradable plastic alternative synthesized by CSU chemists

PUBLIC RELEASE: 22-JUN-2018

The team describes synthesis of a polymer called bacterial poly(3-hydroxybutyrate), or P3HB

COLORADO STATE UNIVERSITY

Colorado State University polymer chemists have taken another step toward a future of high-performance, biorenewable, biodegradable plastics.

Publishing in Nature Communications, the team led by Professor of Chemistry Eugene Chen describes chemical synthesis of a polymer called bacterial poly(3-hydroxybutyrate) - or P3HB. The compound shows early promise as a substitute for petroleum plastics in major industrial uses.

P3HB is a biomaterial, typically produced by bacteria, algae and other microorganisms, and is used in some biomedical applications. Its high production costs and limited volumes render the material impractical in more widespread commodity applications, however.

The team, which includes the paper's first author and research scientist Xiaoyan Tang, used a starting material called succinate, an ester form of succinic acid. This acid is produced via fermentation of glucose and is first on the U.S. Department of Energy's list of top 12 biomass-derived compounds best positioned to replace petroleum-derived chemicals.

The researchers' new chemical synthesis route produces P3HB that's similar in performance to bacterial P3HB, but their route is faster and offers potential for larger-scale, cost-effective production for commodity plastic applications. This new route is enabled by a class of powerful new catalysts they have designed and synthesized. They have filed a provisional patent through CSU Ventures for the new technology.

--------------------------

‘Infinitely’ recyclable polymer shows practical properties of plastics

April 2018

By Anne Manning

Eugene Chen’s lab at Colorado State University has developed a completely recyclable polymer. Credit: Bill Cotton/Colorado State University

The world fell in love with plastics because they’re cheap, convenient, lightweight and long-lasting. For these same reasons, plastics are now trashing the Earth.

Colorado State University chemists have announced in the journal Science another major step toward waste-free, sustainable materials that could one day compete with conventional plastics. Led by Eugene Chen, professor in the Department of Chemistry, they have discovered a polymer with many of the same characteristics we enjoy in plastics, such as light weight, heat resistance, strength and durability. But the new polymer, unlike typical petroleum plastics, can be converted back to its original small-molecule state for complete chemical recyclability. This can be accomplished without the use of toxic chemicals or intensive lab procedures.

Polymers are a broad class of materials characterized by long chains of chemically bonded, repeating molecular units called monomers. Synthetic polymers today include plastics, as well as fibers, ceramics, rubbers, coatings, and many other commercial products.

The work builds on a previous generation of a chemically recyclable polymer Chen’s lab first demonstrated in 2015. Making the old version required extremely cold conditions that would have limited its industrial potential. The previous polymer also had low heat resistance and molecular weight, and, while plastic-like, was relatively soft.

But the fundamental knowledge gained from that study was invaluable, Chen said. It led to a design principle for developing future-generation polymers that not only are chemically recyclable, but also exhibit robust practical properties.

The new, much-improved polymer structure resolves the issues of the first-generation material. The monomer can be conveniently polymerized under environmentally friendly, industrially realistic conditions: solvent-free, at room temperature, with just a few minutes of reaction time and only a trace amount of catalyst. The resulting material has a high molecular weight, thermal stability and crystallinity, and mechanical properties that perform very much like a plastic. Most importantly, the polymer can be recycled back to its original, monomeric state under mild lab conditions, using a catalyst. Without need for further purification, the monomer can be re-polymerized, thus establishing what Chen calls a circular materials life cycle.

This piece of innovative chemistry has Chen and his colleagues excited for a future in which new, green plastics, rather than surviving in landfills and oceans for millions of years, can be simply placed in a reactor and, in chemical parlance, de-polymerized to recover their value – not possible for today’s petroleum plastics. Back at its chemical starting point, the material could be used over and over again – completely redefining what it means to “recycle.”

“The polymers can be chemically recycled and reused, in principle, infinitely,” Chen said.

Chen stresses that the new polymer technology has only been demonstrated at the academic lab scale. There is still much work to be done to perfect the patent-pending monomer and polymer production processes he and colleagues have invented.

With the help of a seed grant from CSU Ventures, the chemists are optimizing their monomer synthesis process and developing new, even more cost-effective routes to such polymers. They’re also working on scalability issues on their monomer-polymer-monomer recycling setup, while further researching new chemical structures for even better recyclable materials.

“It would be our dream to see this chemically recyclable polymer technology materialize in the marketplace,” Chen said.

The paper’s first author is CSU research scientist Jian-Bo Zhu. Co-authors are graduate students Eli Watson and Jing Tang.

Full Report:

https://www.nature.com/articles/s41467-018-04734-3#Abs1

New study finds US oil & gas methane emissions 60 percent higher than estimated

PUBLIC RELEASE: 21-JUN-2018

High emissions findings undercut the case that gas offers substantial climate advantage over coal

UNIVERSITY OF COLORADO AT BOULDER

The U.S. oil and gas industry emits 13 million metric tons of the potent greenhouse gas methane from its operations each year, 60 percent more than estimated by the U.S. Environmental Protection Agency, according to a new study published today in the journal Science.

Significantly, researchers found most of the emissions came from leaks, equipment malfunctions and other "abnormal" operating conditions. The climate impact of these leaks in 2015 was roughly the same as the climate impact of carbon dioxide emissions from all U.S. coal-fired power plants operating in 2015, they found.

"This study provides the best estimate to date on the climate impact of oil and gas activity in the United States," said co-author Jeff Peischl, a CIRES scientist working in NOAA's Chemical Sciences Division in Boulder, Colorado. "It's the culmination of 10 years of studies by scientists across the country, many of which were spearheaded by CIRES and NOAA."

The new paper assessed measurements made at more than 400 well pads in six oil and gas production basins and scores of midstream facilities; measurements from valves, tanks and other equipment; and aerial surveys covering large swaths of the U.S. oil and gas infrastructure. The research was organized by the Environmental Defense Fund and drew on science experts from 16 research institutions including the University of Colorado Boulder and the University of Texas Austin.

Methane, the main ingredient of natural gas, is a potent greenhouse gas that has more than 80 times the warming impact of carbon dioxide over the first 20 years after its release. The new study estimates total US emissions at 2.3 percent of production, enough to erode the potential climate benefit of switching from coal to natural gas over the past 20 years. The methane lost to leakage is worth an estimated $2 billion, according to the Environmental Defense Fund, enough to heat 10 million homes in the U.S.
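To make the relationships among those numbers concrete, here is a minimal Python sketch; the implied prior EPA estimate and the 20-year CO2-equivalent figure are derived from the 60 percent and "more than 80 times" multiples quoted in the article, not taken directly from the study.

# Derived figures based on the multiples quoted above.
measured_mt = 13.0        # million metric tons of methane per year (new study)
excess_over_epa = 0.60    # study estimate is 60 percent higher than EPA's
gwp_20yr = 80             # methane's 20-year warming multiple vs. CO2 (lower bound)

implied_epa_estimate = measured_mt / (1 + excess_over_epa)   # about 8.1 million t
co2_equivalent = measured_mt * gwp_20yr                      # at least ~1,040 million t CO2e

print(f"Implied prior EPA estimate: ~{implied_epa_estimate:.1f} million metric tons/yr")
print(f"20-year CO2 equivalent: at least ~{co2_equivalent:.0f} million metric tons/yr")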

The assessment does suggest that repairing leaks and addressing other conditions that result in the accidental release of salable methane could be effective. "Natural gas emissions can, in fact, be significantly reduced if properly monitored," said co-author Colm Sweeney, an atmospheric scientist in NOAA's Global Monitoring Division. "Identifying the biggest leakers could substantially reduce emissions that we have measured."

New data confirm increased frequency of extreme weather events

Public Release: 21-Mar-2018

European national science academies urge further action on climate change adaptation

European Academies' Science Advisory Council, Leopoldina - Nationale Akademie der Wissenschaften

New data show that extreme weather events have become more frequent over the past 36 years, with a significant uptick in floods and other hydrological events compared even with five years ago, according to a new publication, "Extreme weather events in Europe: Preparing for climate change adaptation: an update on EASAC's 2013 study" by the European Academies' Science Advisory Council (EASAC), a body made up of 27 national science academies in the European Union, Norway, and Switzerland. Given the increase in the frequency of extreme weather events, EASAC calls for stronger attention to climate change adaptation across the European Union: leaders and policy-makers must improve the adaptability of Europe's infrastructure and social systems to a changing climate.

Globally, according to the new data, the number of floods and other hydrological events has quadrupled since 1980 and has doubled since 2004, highlighting the urgency of adaptation to climate change. Climatological events, such as extreme temperatures, droughts, and forest fires, have more than doubled since 1980. Meteorological events, such as storms, have doubled since 1980 (see Figure 2.1 in the 2013 report and Figure 1 in the 2018 updated publication).

These extreme weather events carry substantial economic costs. In the updated data (Figure 2 in the 2018 updated publication), thunderstorm losses in North America have doubled - from under US$10 billion in 1980 to almost $20 billion in 2015. On a more positive note, river flood losses in Europe show a near-static trend (despite their increased frequency), indicating that protection measures that have been implemented may have stemmed flood losses.

Professor Michael Norton, EASAC's Environment Programme Director states, "Our 2013 Extreme Weather Events report - which was based on the findings of the Norwegian Academy of Science and Letters and the Norwegian Meteorological Institute - has been updated and the latest data supports our original conclusions: there has been and continues to be a significant increase in the frequency of extreme weather events, making climate proofing all the more urgent. Adaptation and mitigation must remain the cornerstones of tackling climate change. This update is most timely since the European Commission is due to release its evaluation of its climate strategy this year."

Is a contemporary shutdown of the Gulf Stream (AMOC) possible?

The update also reviews evidence on key drivers of extreme events. A major point of debate remains whether the Gulf Stream, or Atlantic Meridional Overturning Circulation (AMOC), will just decline or could 'switch off' entirely, with substantial implications for Northwest Europe's climate. Recent monitoring does suggest a significant weakening, but debate continues over whether the Gulf Stream may "switch off" as a result of the increased flows of fresh water from northern latitude rainfall and melting of the Greenland icecap. EASAC notes the importance of continuing to use emerging oceanographic monitoring data to provide a more reliable forecast of impacts of global warming on the AMOC. The update also notes recent evidence suggesting an association between the rapid rate of Arctic warming and extreme cold events further south (including in Europe and the Eastern USA) due to a weakened and meandering jet stream.

Federal report on climate change: High-tide flooding could become routine by 2100

Updated 1:57 PM; Posted 1:57 PM, March 28, 2018 US News & World Report

By The Washington Post

High-tide flooding, which can wash water over roads and inundate homes and businesses, used to happen only once in a great while in coastal areas. But its frequency has rapidly increased in recent years due to sea-level rise, not just during storms but increasingly on sunny days, too.

Years ago, the late Margaret Davidson, a coastal programs director at the National Oceanic and Atmospheric Administration, warned it wouldn't be long until such flooding becomes routine. "Today's flood will become tomorrow's high tide," she said.

A new NOAA report contains startling projections that affirm Davidson's warning. Due to sea-level rise, high-tide flooding is already accelerating along many parts of the U.S. coastline, including the Mid-Atlantic.

By 2100, the report says, "high tide flooding will occur 'every other day' (182 days/year) or more often" even under an "intermediate low scenario" in coastal areas along the East Coast and Gulf of Mexico. This scenario works under the assumption that greenhouse gas emissions - which warm the climate and speed up sea-level rise - are curbed.

For a more aggressive "intermediate" scenario, in which greenhouse gas emissions carry on at today's pace, high tide flooding is forecast to occur 365 days per year.

In February, scientists published a separate study concluding that they had already detected an acceleration in sea-level rise and that it is likely to speed up more in the coming decades as ice sheets disintegrate and ocean waters expand.

The implications of the steepening sea-level rise on high-tide flooding are already apparent on the ground.

William Sweet, the lead author of the NOAA report, has witnessed an uptick in flooding firsthand. "I moved to Annapolis (Maryland) last year," he said in an interview. "Since I moved there, I've seen six or seven days when waters hit storefronts. What we're finding is along the East Coast, and the Mid-Atlantic in particular, already the flooding is accelerating."

In just 15 years, the incidence of high-tide flooding in the Mid-Atlantic has doubled, from an average of three days per year in 2000 to six in 2015, according to the NOAA report.

"That's a very important outcome," Sweet said. "What it means is that the change is not a gradual linear change, but measured in leaps and bounds. By the time you realize there is a flooding issue, impacts will become chronic rather quickly."

A similar acceleration in coastal flooding has been seen in other locations along the East Coast, including Florida, the Carolinas and in the Northeast.

Practically every time tides are intensified by the lunar cycle in South Florida, media reports show inundation from "sunny day flooding," which seldom occurred mere decades ago.

This past winter, Boston observed its first and third highest tides in recorded history as nor'easters battered southern New England in January and March.

"The record-breaking event of January 2018 would not have broken the record had it not been for relative sea-level rise that carried the tide and storm surge above the level reached in 1978," Andrew Kemp, an assistant professor of earth and ocean sciences at Tufts University, told the Boston Globe.

The prospect of high-tide flooding occurring every day or even every other day late this century is difficult to fathom.

Michael Lowry, a visiting scientist at the University Corporation for Atmospheric Research, expressed shock on Twitter after seeing these projections. "It's hard to overstate the significance of this," he said. "That isn't even the intermediate high, high, or extreme scenarios that bring us 365 (days per year) high-tide flooding in my lifetime. It's crazy."

Even by 2050, just over 30 years from now, the report projects high-tide flooding will occur between 50 and 250 days per year along the East Coast, depending on the emissions scenario.

"This is such a huge deal," tweeted Alan Gerard, a division chief at the National Severe Storms Laboratory. "And even the increase by 2040. That's not that far away."

Astrid Caldas, a senior climate scientist at the Union of Concerned Scientists who tracks the impacts of sea-level rise, is extremely worried about the projected flooding increases.

"(B)y mid-century, the frequency of this type of 'minor' flooding would become so disruptive that business as usual would be practically impossible without significant adaptation measures," Caldas said. "Without planning for flooding with measures such as protecting, elevating, accommodating the water, or even moving stuff out of the way, the impacts on the cities, their economy, and their residents would be immense."

She added: "Just imagine seeing streets (and property) flooded every other day. That gives a completely new meaning to the term 'nuisance flooding.' Or actually, it completely obliterates the concept, as flooding would become much more than a nuisance, but a rather serious problem requiring significant resources and innovative policies."

NOAA's Sweet said informing decision-makers and planners about the escalating flood risk was impetus for the report. "(Floodwaters) getting up to storefronts in Annapolis becomes the new tide (by late this century)," he said. "When you project out that far, the assumption is that society will adapt. That's the real reason we're doing this: to inform decision-making so the best science is available to plan accordingly."

In the D.C. region, significant infrastructure is at risk from rising seas, including assets around the National Mall, the Southwest Waterfront, Old Town Alexandria in Virginia, and Annapolis.

To respond to this risk, an interagency team known as the District of Columbia Silver Jackets formed in 2014. It includes members from regional and federal agencies, such as the D.C. government, U.S. Army Corps of Engineers, National Park Service and National Weather Service, as well as academia.

The team convened a flood summit in 2016, has developed an online flood inundation mapping tool, and conducted a study for vulnerable neighborhoods along Watts Branch, a tributary of the Anacostia River.

"While our team has made great strides in identifying and helping to mitigate flood risk in the region, there is still so much work to be done, including taking a more comprehensive, integrated and regional approach to flood risk management," the Silver Jackets team said in a statement. "The statistics in this report are staggering, and really hit home on the fact that time is of the essence as we plan for the future and take steps now to ensure our nation's resources and treasures are not impacted by flooding."

Caldas stressed global action is needed in addition to local measures to lower the risk of high-tide flooding. "(A)dhering to the commitments of the Paris (climate) Agreement is exceptionally important, because emissions for the next couple of decades, up to mid-century, will define how much sea level rise we will see in the end of the century - and how much tidal flooding we will see: every other day or every day," she said.

Washington bans firefighting chemicals that may cause cancer

UPDATED: Tue., March 27, 2018, 5:39 p.m.

Photo: Water that may be contaminated with chemicals used in firefighting foam on Fairchild Air Force Base is flushed from a fire hydrant in Airway Heights last December. Washington is banning those chemicals for civilian fire departments, although it has no authority to extend the ban to military bases.

By Jim Camden

jimc@spokesman.com

(509) 879-7461

OLYMPIA – Firefighting foam with a chemical thought to cause cancer and other health problems will be banned in two years for local fire departments and districts in Washington.

A new law signed Tuesday bans the group of chemicals that are contaminating some wells in Airway Heights and other water sources near military bases, although it won’t directly affect that contamination.

Perfluorinated or polyfluorinated compounds, or PFAS, are a key ingredient in some foams used to extinguish fuel fires, and also are commonly applied to firefighters’ protective gear. They last a long time, are almost indestructible under most natural situations and travel easily through the soil to get into underground water supplies.

They may also be responsible for a high rate of cancers in firefighters, lawmakers were told in hearings for the bill signed by Gov. Jay Inslee on Tuesday. Washington may be the first state to impose such a ban, which takes effect in 2020.

In Airway Heights, the contamination is linked to firefighters practicing with the foam on Fairchild Air Force Base. The state can’t tell the Defense Department not to buy foam with PFAS to put out aviation fuel fires on its bases, but the foam can’t be used for training on bases or at airports. The bill is directed at local fire departments and fire districts, some of which already are finding substitutes or altering training to reduce the use of the foam except in real emergencies.

Water Scarcity Looms in London’s Future

March 28, 2018

The city’s water demand is expected to exceed supply within the next decade.

London draws a majority of its water from the Thames and Lea rivers.

In recent years, London has grappled with water shortages and sewage overflows as the city’s population steadily increases. Water demand is expected to exceed supply within the next decade, and severe water shortages could affect Britain’s capital by 2040.

London’s water stress is influenced by several factors. The metropolitan area receives roughly 600mm of rain each year, less than cities such as Istanbul, Turkey, and Sydney, Australia. The relatively low rainfall, coupled with the growing likelihood of hotter, drier summers, is putting pressure on the city’s water supply. At the same time, London’s population is expanding by 100,000 people each year.

London’s key water sources–the Thames and Lea rivers–are also facing pollution problems. Plastic and other debris infiltrate the waterways and clog the city’s sewer system. In addition, the rivers are sometimes deluged with wastewater when heavy rainfall overwhelms the sewage system’s treatment capacity. In order for London to avoid extreme water stress, advocates say, the city needs to cut consumption and make river preservation a priority.

“Pressure on our rivers and water supplies is only going to rise. We need to act now to ensure we have enough water for people and nature in years to come.” –Rose O’Neill, freshwater programme manager at WWF-UK, in reference to the need for careful water stewardship in London.

By The Numbers

13.4 million Possible population of London by 2050, according to the London Assembly Environment Committee. The population is currently nearing 9 million.

20 percent Amount that demand for water could outstrip London’s supply by 2040.

65 percent Proportion of London’s water that is pumped from rivers. The other 35 percent is drawn from aquifers.

£20 million ($26 million) Amount that Thames Water, London’s main water provider, was charged in March 2017 for discharging sewage into the River Thames. The leaks occurred over a 2-year period and amounted to 1.4bn litres of sewage.

70 percent Proportion of flounder in the River Thames that have bits of plastic in their guts, according to a 2015 study.

25,000 tonnes Amount of debris that Thames Water removes from its sewage system every year. Recently, fatbergs–congealed masses of cooking fat, diapers, and wipes–have been obstructing sewer pipes in London. The fatbergs increase the likelihood of raw sewage flowing into streets.

According to a 2016 study by Water UK, there is a 20 percent chance that London residents will need to queue for water at standpipes during a summer drought in the next 25 years. The researchers ran models based on climate change, population growth, and future environmental protections, and found that UK summers will likely become increasingly hot and dry. The most extreme models predicted that summer droughts could result in economic costs of up to £1.3 billion per day throughout England and Wales.

One recommendation for improving London’s water outlook is to utilize a “twin track” approach to water management. This method would focus on preserving the city’s current water supply while also implementing policies to lower demand. New infrastructure is needed as well, especially to help improve the city’s aging sewage system.

If London cannot incentivize residents to reduce water use, then extra storage capacity will likely be needed. Thames Water sold 25 reservoirs after it was privatized in the late 1980s. The utility insists that the reservoirs were “service” rather than “storage” reservoirs, but the move has drawn scrutiny, especially as London’s water stress grows. In any case, the company is now faced with buying new land in order to increase water storage capacity–an expensive endeavor in modern-day London.

After 7 years, Madison compost program trashed

CHRIS RICKERT crickert@madison.com Jun 12, 2018

The city's organics collection program will end next week. Popular with participants, the program suffered from too much contaminated waste and a lack of a biodigester to collect the gas it produced.

Madison’s organics-collection pilot program will end next week, a victim of people putting too many non-compostables into their compostables carts and a city that couldn’t afford its own biodigester, the city’s recycling coordinator announced Monday.

Bryan Johnson emphasized that collecting food scraps and possibly other compostable materials citywide continues to be a long-term goal, and ending the current seven-year-long program will give officials time to study programs elsewhere that have successfully been able to process waste to create biogas for sale to utilities.

“We are committed to making this work,” he said, but “what we’ve been getting out of the bins really is just incompatible with the processing options.”

There are currently about 1,100 households and 40 businesses in the program.

While the load of food scraps, soiled paper and other material delivered to a Middleton digester in March had fewer plastic bags, metals and other noncompostables than the prior load, “the contamination within the material created a very labor-intensive and slow process, which also requires thousands of gallons of water,” he said.

That contamination included lumber scraps and two seemingly full bottles of liquor, he said, and the extra water and labor needed to separate debris and process the compostables meant a $200-per-ton fee from the digester’s operator, GL Dairy Biogas, which is “far too expensive.” By contrast, the county landfill charges a dumping fee of $50 a ton.

Johnson said that when the program began in 2011 with about 500 participants, the plan was for the city to open its own biodigester in 2016, but “this did not happen.”

Constructing its own biodigester would allow the city to tailor it to the city’s waste, Johnson said.

But at a cost of $30 million to $35 million, it would also be “way beyond the scope of this city,” said Mayor Paul Soglin.

Ultimately, Soglin said, he’d like to see the city focus on collecting waste from larger producers — such as cafeterias, grocery stores and large employers — and then find a way to work with the Madison Metropolitan Sewerage District to process it.

The city reached out to MMSD in early 2016 looking for ways to bring its curbside collection to the district’s South Side Madison site, which doesn’t currently have the facilities to handle such waste, said process and research engineer Matt Seib.

“In addition to studying the potential for energy conversion, his research team has been looking at opportunities to create valuable industrial products from a waste stream composed of organic material,” MMSD communications manager Jennifer Sereno said in an email.

Those products could include fertilizer or organic molecules for industrial applications, Seib said, but there are no specific plans for what MMSD could do with Madison’s organics or how soon it could be doing it.

“We remain open to potential future collaboration in this area,” Sereno said.

Johnson said sending Madison’s organics to MMSD “is not a near-term solution considering the way everything is currently performed.” As part of his research, he said he plans to learn more about a private curbside composting service for commercial customers in Milwaukee County.

Jaclyn Lawton, chairwoman of the city’s Solid Waste Advisory Committee, which oversaw the pilot program, said focusing on large producers could make sense.

“Large producers are also more able to comply with the materials to be composted,” she said. “Households, even those volunteering for the pilot, did not have uncontaminated materials as a group.”

Soglin said other factors to consider in large-scale composting are how consistent the compostable material is and how much utilities are willing to pay for the gas it produces.

“The cost of collecting individual household substrate compared to the benefit is astronomical,” he said.

The rules on what could be put in the curbside carts changed as the city changed processors in its pilot program. At one point, pet waste and diapers were accepted as compostable, and some participants have not adjusted what they put into the compost stream as the rules have changed, Johnson said.

Ending the pilot and starting fresh would allow the city to set strict limits in any new program, he said.

Households that participate in the program will get a letter this week announcing its end, and Streets Division crews will collect participants’ black refuse carts as soon as they can after final collections next week, Johnson said.

Nothing should be put in the carts between the final scheduled collection and when the carts themselves are collected next week.

MADISON, Wis. (AP) — Madison is ending its compost collection program because residents were putting too many non-compostable items in their carts and the city can’t afford its own biodigester.

Bryan Johnson, the city’s recycling coordinator, told the Wisconsin State Journal that ending the program will give officials time to study other options for collecting food scraps and other compostable materials.

The program began in 2011 with about 500 participants, he said. It currently has about 1,100 households and 40 businesses involved. Participants will receive a letter about the program's end, and crews will collect refuse carts after the final collection next week, Johnson said.

“We are committed to making this work,” Johnson said. “(But) what we’ve been getting out of the bins really is just incompatible with the processing options.”

Separating non-compostable materials, such as plastic bags and metals, is a labor-intensive and slow process that requires additional water, Johnson said. The digester’s operator, GL Dairy Biogas, charges a $200-per-ton fee to separate debris from compostable material.

The county landfill has a $50-per-ton dumping fee.

Initial program plans had called for the city building its own biodigester in 2016, but the project never happened, Johnson said.

Constructing a biodigester would cost up to $35 million, which is beyond the city’s ability, said Mayor Paul Soglin.

Soglin said he hopes the city can find ways to work with larger producers before integrating the process into the Madison Metropolitan Sewerage District.

The district’s process and research engineer, Matt Seib, is studying ways to create industrial products from organic material waste, MMSD communications manager Jennifer Sereno said in an email. Products could include fertilizer or organic molecules for industrial applications, she said.

----------------------------------------

Another Viewpoint:

Madison’s composting pilot program struggles with contamination

BY CAMERON BREN JANUARY 25, 2018

Part of Colin Thompson’s job at Batch Bakehouse is hauling garbage and recycling to the curb. For Thompson, this involves an extra layer of sorting.

Each week, Thompson also puts out eight bins full of the bakery’s food waste. “It reduces waste on the whole and increases the overall sustainability of our business and lessens our environmental impact,” Thompson says. “If local gardeners can end up using it, it builds the community.”

Batch is part of a group of 40 businesses and 1,100 residents participating in a pilot composting project that began in 2011. Although the project doesn’t take much effort, it requires vigilance.

Every once in a while, Thompson catches someone tossing compost into a garbage bag. Or, worse, throwing plastic into one of the composting bins. “I usually catch it but I can see the potential for slip-ups like that,” Thompson says.

Those slip-ups are putting a wrench in Madison’s ambitions to join San Francisco, Seattle, Portland and other cities that have a curbside composting collection. Contamination in the pilot project has left the city with no place to send its composting waste, says Madison recycling coordinator Bryan Johnson.

“The most persistent contamination has been plastic, like plastic bags, plastic bottles, and other plastic containers like food service boxes or salad containers,” Johnson says. “On the surface of it, you would think it would be easy to only put food scraps or compostables into a container, but everyone makes mistakes from time to time and a plastic bottle or a plastic utensil winds up in the wrong container.”

Johnson says a small amount of plastic can contaminate a lot of composting. “Organics are routinely shredded prior to composting, so the plastic bottles or plastic bags would also be shredded and spread throughout the material being processed,” Johnson says. “There are screens to get some of the debris out, but tiny flecks of plastic, metal, or glass are very difficult to remove with just screens.”

Initially, the city sent composting to a digester at UW-Oshkosh. After that site stopped accepting the material, the city turned to digesters owned by Gundersen-Lutheran in Middleton and Blue Ribbon Organics in Caledonia — both have since stopped doing business with the city due to persistent contamination.

The companies have to be picky because they’re trying to sell a product to home gardeners. “Marketable compost does not have room for error when it comes to plastic or other debris in the product,” Johnson says.

Although the city’s composting currently has nowhere to go, Johnson is asking residents in the pilot to keep participating. “We are hanging on to the organics until we find a place to go with them,” he says. But, he adds, “In instances where the pile is too foul to keep holding, we have no choice but to send it to the landfill.”

The city could create its own composting site or biodigester, which would be more forgiving since it would not be trying to sell the end product to home gardeners. Instead, its composting could be used for agriculture or construction, which allows for higher levels of contamination.

For years, the city has proposed building its own biodigester, but the project keeps getting postponed because of high costs and tight budgets. A biodigester would cost $12-$18 million to build and require more trucks and workers to collect and process the composting. However, the project would slowly pay for itself, by reducing the amount of waste the city pays to dump at the landfill.

Ald. Marsha Rummel, Common Council president, says several alders support the project, but ultimately the mayor sets the budget. “Recently we were faced with a new police station, fire station and other projects so the biodigester got pushed back by the mayor,” Rummel says. “The council supports the pilot and I believe alders see the benefits of a biodigester.”

The city is looking for possible partners — such as the Madison Metropolitan Sewerage District — to help pay for it, Johnson says.

“Exploring partnerships is a way to help share the costs, but it is still difficult to predict when the economics of the situation will align to allow for the continued investment,” he says. “However, when we can, we will.”

Despite the challenges, Johnson sees curbside composting in Madison as inevitable. “The food we toss out is a valuable resource when we manage and process it correctly. I have no doubts we will be there.”

Thompson is hoping the city figures it out soon.

“I’d like to see it expanded and brought into my neighborhood,” he says. “Even on an individual basis it cuts down on a lot of waste.”

Exclusive: U.S. Army forms plan to test 40,000 homes for lead following Reuters report

AUGUST 27, 2018 / 2:21 PM

Joshua Schneyer, Andrea Januta

NEW YORK (Reuters) - The U.S. Army has drafted a plan to test for toxic lead hazards in 40,000 homes on its bases, military documents show, in a sweeping response to a Reuters report that found children at risk of lead poisoning in military housing.

The inspection program, if implemented, would begin quickly and prioritize thousands of Army post homes occupied by small children, who are most vulnerable to lead exposure. Ingesting the heavy metal can stunt brain development and cause lifelong health impacts.

The lead inspections would cost up to $386 million and target pre-1978 homes to identify deteriorating lead-based paint and leaded dust, water or soil, according to the military documents.

A draft Army Execution Order says the program’s mission is to mitigate all identified lead hazards in Army post homes in the United States. In homes where dangers are detected, the Army would offer soldiers’ families “temporary or permanent relocation” to housing safe from lead hazards, it says.

The Army's mobilization comes after Reuters published an investigation on August 16 describing lead paint poisoning hazards in privatized military base homes. It documented at least 1,050 small children who tested high for lead at base clinics in recent years. Their results often weren't being reported to state health authorities as required, Reuters found.

Behind the numbers were injured families, including that of a decorated Army colonel, J. Cale Brown, whose son JC was poisoned by lead while living at Fort Benning, in Georgia.

The article drew a quick response from lawmakers, with eight U.S. senators demanding action to protect military families living in base housing.

The Army’s planned response is laid out in military documents, including the draft Execution Order, minutes from a private meeting attended by top Army brass, and other materials.

One priority, detailed by Under Secretary of the Army Ryan McCarthy in an August 22 meeting, is for the military’s response to counter any sense “that we … are not taking care of children of Soldiers and are not taking appropriate action quickly enough,” meeting minutes say. “The Army will remain focused on the actions to assess, inspect, and mitigate risks to Soldiers and Families,” the minutes say, citing McCarthy and Vice Chief of Staff General James C. McConville.

Army spokeswoman Colonel Kathleen Turner acknowledged plans are being formulated but said no decisions have been made. “Out of an abundance of caution, we are going above and beyond current requirements to ensure the safety of our soldiers and their families who work and live on all of our installations,” Turner said in a statement. “We are currently evaluating all options to address these concerns.”

Old lead-based paint becomes a poisoning hazard when it deteriorates, and poor maintenance of military base homes can leave legions at risk. About 30 percent of service families – including some 100,000 small children – live in U.S. military housing owned and operated by private companies in business with the military.

There are nearly 100,000 homes on U.S. Army bases, and the lead inspections are expected to focus on the approximately 40,000 built before a 1978 U.S. ban on the sale of lead paint.

The plans depart from guidance that appeared on the Army Public Health Center's website as recently as last week, which "discouraged" lead-based paint inspections in Army homes. The website has since been updated and omits that language.

Under the plans, the documents show, the Army would:

- Inspect all pre-1978 Army family housing units nationwide, including visual lead-based paint assessments by certified personnel, swipe-testing for toxic lead paint dust, and testing of tap water. Some homes will also receive soil testing. This phase alone, described as “near term actions,” will cost between $328 million and $386 million, the Army’s Installation Services director estimated.

- Temporarily or permanently relocate families when hazards are found. “If a Family or Soldier are concerned with potential negative impacts from lead; the U.S. Army will offer them a chance to relocate to a new residence,” the documents say. “We must do everything we can to maintain that trust.”

- Conduct town hall meetings on Army posts to address residents’ lead concerns. The Army intends to do so with “empathy,” the meeting minutes say. “Tone is key and can be just as important as the actions we take.”

The documents leave some questions unanswered. They don’t say how long it would take to inspect all 40,000 homes. Also unclear is whether the Army has funds immediately available for the program, or would need Congressional authorization to set them aside.

The Army would ensure that the private contractors who operate base housing “are meeting their obligations” to maintain base homes, the documents say, and would require them to show compliance with lead safety standards through independent audits.

The documents do not discuss whether private housing contractors would bear any of the costs of the lead inspections, or how any repairs would be funded.

In most cases, Army post homes are now majority-owned by private real estate companies. Under their 50-year agreements with the Army, corporate landlords operating military housing agreed to control lead, asbestos, mold, and other toxic risks present in some homes, particularly historic ones.

FAMILIES, SENATORS PRESS FOR ANSWERS

The Army plans come as base commanders and housing contractors face a wave of complaints about potential home lead hazards, and a rush of military families seeking lead tests for their children.

Last week, the hospital at Fort Benning, where Reuters reported that at least 31 small children had tested high for lead exposure in recent years, began offering “walk-in” lead testing. Some concerned families are already being relocated; in other homes, maintenance workers are using painter’s tape to mark peeling paint spots that residents, using store-bought testing kits, found to contain lead.

Lead poisoning is preventable, and its prevalence in the United States has declined sharply in recent decades. Still, a 2016 Reuters investigation documented thousands of remaining exposure hotspots, mostly in civilian neighborhoods.

Last week, eight senators, including Republican Johnny Isakson of Georgia and Democrat Claire McCaskill of Missouri, pushed amendments to legislation to examine and address the military's handling of lead exposure risks.

In coming weeks, Army officials plan to meet with lawmakers to address their concerns, the military documents show.

Reuters Investigative Report:

https://www.reuters.com/investigates/special-report/usa-military-housing/

Arizona clean-energy ballot measure can go on November ballot, judge rules

Ryan Randazzo, Arizona Republic Published 6:12 p.m. MT Aug. 27, 2018

A Maricopa County Superior Court judge on Monday allowed a clean-energy initiative to go on the November ballot despite a challenge from opponents who argued the initiative did not gather enough legal signatures.

An opposition group backed by the parent company of Arizona Public Service Co. vowed to appeal the ruling to the Arizona Supreme Court.

The Clean Energy for a Healthy Arizona measure, now called Proposition 127, would require electric companies such as APS and Tucson Electric Power Co. to get half their power from renewable sources like solar and wind by 2030.

"Proposition 127 will give voters a chance to finally utilize Arizona’s greatest asset — its sunshine," said DJ Quinlan, spokesman for the campaign.

The existing state standard, set by the Arizona Corporation Commission, requires those companies to get 15 percent of their power from renewables by 2025, and the utilities are on track to meet that. The goal increases annually and is 8 percent this year.

APS spending millions to defeat measure

The parent company of APS, Pinnacle West Capital Corp., has contributed $11 million to a political action committee combating the initiative.

Arizona Public Service Co. says a state constitutional amendment on renewable energy pushed by Clean Energy for a Healthy Arizona would dangerously overstimulate solar- and wind-power development. (Photo: Mike McPheeters)

“Proposition 127 would fundamentally alter the Arizona Constitution and implement costly new regulations to raise electric rates for Arizona families and businesses," said a response from the opposition group Monday.

"Before we proceed any further down this path, it is only prudent to be certain the initiative has met the bare standards necessary to be on the ballot.”

APS officials say that the easiest way to comply with the standard, should voters approve it, will be to build many new solar power plants, and the cost of those plants will double the average household utility bill.

The company also says that a lot of solar power on the grid will create a glut of power in mild-weather months when the solar plants generate power but customers don't use air conditioning.

That glut would force the company to curtail output at its coal and nuclear plants, which would then close ahead of schedule, APS said.

Measure backers: APS 'lying' about ballot proposal

Proponents of Prop. 127 doubt those APS claims.

“APS has been lying about the validity of our signatures for months,” Quinlan said. “If we can’t trust them on this, why should voters trust any of their other profit-motivated attacks against clean energy?”

An opposition group funded by APS called Arizonans for Affordable Electricity has repeatedly challenged the signature-gathering efforts of the ballot measure, and found that some of the people registered to collect signatures had felony convictions.

Those signature gatherers accounted for only a small share of the hundreds of thousands of signatures collected, but the opposition group advertised the felony records and even conducted robocalls warning voters that the people collecting signatures might pose a threat.

The clean-energy committee on July 5 submitted more than 480,000 signatures. It needed only about 226,000 to qualify for the ballot.

A sample review by the county recorders found more than 72 percent of the signatures to be valid, or about 328,908, more than enough to qualify for the ballot.

The plaintiffs in the case, led by state Rep. Vince Leach, R-Tucson, subpoenaed 1,180 of the signature collectors to appear at trial the week of Aug. 20, and more than 900 showed up.

However, Judge Daniel Kiley threw out the signatures gathered by 235 of the signature gatherers who did not show up to court, and threw out more from some who showed up but left early, or did not return to court as directed later in the week.

Those actions did not eliminate enough signatures to prevent the measure from making the ballot.

Popular French environment minister quits in blow to Macron

Laurence Frost, Geert De Clercq

PARIS (Reuters) - French Environment Minister Nicolas Hulot resigned on Tuesday in frustration over sluggish progress on climate goals and nuclear energy policy, dealing a major blow to President Emmanuel Macron’s already tarnished green credentials.

Hulot, a former TV presenter and green activist who consistently scored high in opinion polls, quit during a live radio interview following what he called an “accumulation of disappointments”.

“I don’t want to lie to myself any more, or create the illusion that we’re facing up to these challenges,” Hulot said on France Inter. “I have therefore decided to leave the government.”

Hulot was among Macron’s first ministerial appointments following his May 2017 election victory. His inclusion helped to sustain a green image France had earned 18 months earlier by brokering the Paris Agreement to combat global greenhouse gas emissions.

But the centrist president has watered down a series of campaign pledges on the environment, including a commitment to cut the share of nuclear power in French electricity to 50 percent by 2025 and boost renewable energy.

Those policy shifts have been a repeated source of frustration for Hulot. Since a post-election honeymoon period, they have been accompanied by a sharp slide in Macron’s ratings, which touched new lows after his bodyguard was filmed assaulting demonstrators last month.

Hulot said he had not informed Macron before announcing his resignation.

“This may not be the right protocol, but if I had warned them they might have talked me out of it yet again,” Hulot said. His cabinet portfolio also included energy.

Government spokesman Benjamin Griveaux said the cabinet “regretted” his resignation, but also described it as “a blow from which we’ll recover”.

“I don’t understand why he is stepping down when we had many successes in the first year that are to his credit,” Griveaux told BFM Television. “He didn’t win all his battles but that’s the way it goes for ministers.”

Greenpeace France director Jean-Francois Julliard said that while Macron had “made some fine speeches” and stood up to U.S. President Donald Trump on climate change, he had “never turned these words to concrete action” at home.

Shares in power utility EDF, which is on the hook for the cost of decommissioning older nuclear plants, surged 2.7 percent in early trading before settling back to 14.26 euros at 0901 GMT, still up 1 percent on Monday’s close.

Hulot had also voiced disappointment after Macron wavered on a commitment to ban the weedkiller glyphosate, sold under Monsanto’s Roundup trademark, and failed to prevent Total’s La Mede refinery producing biofuel from imported palm oil linked to deforestation.

His announcement came a day after the government promised to relax hunting laws, in a move widely seen as an attempt to bolster Macron’s appeal in rural areas.

Prime Minister Edouard Philippe said he would discuss the “composition of the government” with Macron following Hulot’s departure, leaving open the possibility of other changes.

Macron’s office sought to attribute the resignation to the “frustration, even exhaustion” of a ministerial novice suddenly confronted with the slow-moving machinery of government.

“Political and administrative timeframes aren’t necessarily as one hopes, coming from activism,” an Elysee official said, insisting that Hulot’s deep commitment to environmentalism was evidenced by the government’s policy track record.

In his radio interview, however, Hulot emphasized the inadequacy of “mini-steps” on climate change by France and other nations, voicing hope that his exit might “provoke deep introspection in our society about the reality of the world”.

His doubts about remaining in government had grown over the summer as devastating droughts were met with a tepid political response, he said.

Alain Juppe, a conservative former prime minister and presidential contender, said he was impressed by Hulot’s “high-mindedness and by the nobility of his act”.

“Beyond the inevitable political buzz, I hope this decision encourages us all to think and to change,” he said on Twitter.

Reporting by Laurence Frost and Geert De Clercq; Additional reporting by Yann Le Guernigou, Richard Lough and Michel Rose; Editing by Richard Lough, John Stonestreet and Peter Graff

German e-cars still hampered by lack of charging stations

Sales of electric cars in Europe have just crossed the 1 million mark, as more consumers are persuaded to opt for green mobility. Can the infrastructure to support the transition from fuel-powered vehicles keep up?

Manufacturers of fully electric vehicles (EVs) and plug-in hybrids are delighted; sales of the more climate-friendly cars and vans across Europe have grown 42 percent in the first half of the year.

Industry analysts EV Volumes report that a million EVs are now plying the continent's roads, and that Germany is set to overtake Norway — which until now has led the way in the transition to electric cars — in total sales for the year.

Until now, early adopters of battery-powered cars have found it relatively easy to keep their vehicles charged, as long as they don't travel too far. After all, most charging is done at home, while state subsidies have led to public charging stations springing up at shopping malls and city center car parks.

Although Germany has enough charging points for the current number of EVs on the roads, many industry analysts question whether governments, the private sector, and even apartment block owners will be able to cope with future demand for charging infrastructure.

"If the number (of EVs) increases significantly, problems will arise," warned transport expert and blogger Martin Randelhoff.

He told DW that while workplaces and those living in detached houses can install new charging infrastructure relatively easily, those in multi-story apartment blocks, where most Germans live, will struggle to manage the changeover.

Finding a parking spot on Berlin's crowded inner-city streets can be a nightmare. Building charging stations for most of those cars remains unrealistic.

Even for those apartments that do have underground car parks, the cost of installing charging points for most or every parking space is likely to be a heavy burden for many residents' associations.

Although government subsidies have been generous until now, many European countries continue to face public sector cuts in the wake of the 2008 financial crisis. So despite ambitious targets to ban the most polluting vehicles in the medium term, consumers cannot rely on receiving similar cash incentives as EVs become mainstream.

Two saving graces include the financial backing by many corporations to build charging stations at their headquarters, and a European Union directive that will ensure that all newly constructed buildings will automatically be EV-ready.

"(Under EU rules), the cost of the charging points in all new commercial and residential buildings, as well as all buildings undergoing major renovations is now expected to be included in the overall budget for such developments," Liana Cipcigan, Co-Director of the Electric Vehicle Centre of Excellence at Cardiff University in Wales told DW.

The demand for public charging stations is set to become more urgent as more drivers opt for electric vehicles, and many crowded cities will struggle to provide those stations in the most convenient spots.

"The question is ... whether they are in the right places, and can be supported with the right charging rates to suit driver requirements and behaviors," said John Massey, managing director of the British green energy firm Grey Cells Energy.

Even bigger challenges lie ahead for national power grids, for example in Britain, whose electricity infrastructure may struggle to cope with increased demand from EVs without huge investment.

"An unavoidable fact is that we can't have millions of electric cars all plugging in and charging ... all at the same time (especially during the early evening peak), without having to spend billions on distribution network upgrades," Massey told DW.

He predicts that as more drivers transition to EVs, the demand for electricity will spread more evenly during the day, as more charging is done at work. Premium pricing for home charging at peak times, meanwhile, will also help nudge consumer behavior.

Public EV charging stations present their own challenges: to service tens of thousands of drivers each day, they will need EVs equipped with fast-charging batteries.

Vehicles from high-end brands like Tesla and Porsche are built with ultra-fast-charging battery technology, but most electric vehicles can't currently handle the maximum speeds available from fast and ultra-fast charging points.

Last year, Germany installed the second largest number of public charging stations in the world after France — it already has some 13,500 charging points, according to figures from the utility lobby group BDEW. That compares to some 14,300 petrol (gas) stations, with 85,000 pumps.

But it'll be years before EV charging points are as efficient as their fossil fuel equivalents, and therefore as profitable.

Even the planned fast-charging EV service station, being built by Sortimo on the A8 autobahn between Stuttgart and Munich, will only enable 4,000 charging operations daily, even with its impressive 144 charging points.

Assuming an 18-hour (rather than 24-hour) operating day, that works out to only about 1.5 cars charged by each electric "pump" per hour. Compare that to a typical motorway petrol station, which can serve 12-15 vehicles every few minutes.
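
That throughput figure can be checked with a quick back-of-the-envelope calculation; here is a minimal Python sketch using only numbers quoted in this article (the 18-hour operating day is the article's own assumption):

# Rough throughput check for the planned Sortimo fast-charging site (figures from the article)
daily_charges = 4000       # planned charging operations per day
charging_points = 144      # number of charging points
operating_hours = 18       # assumed operating day

cars_per_point_per_hour = daily_charges / (charging_points * operating_hours)
print(round(cars_per_point_per_hour, 1))  # prints 1.5 -- about 1.5 cars per "pump" per hour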

"There are significant challenges for batteries to accommodate fast charging but we can see progress coming from brands like BMW, VW and Audi," Cipcigan told DW, exalting how Germany's auto-makers are leading the way in developing a common ultra-fast charging standard.

German retail giant Aldi has inaugurated the first of 28 ultra-fast charging stations along the busy A5 highway this year. The hope is to lure more customers into its shops. But it's uncertain if clever marketing alone will make the investment pay off in the end.

So although some charging stations are capable of delivering 160 kilometers (100 miles) of range in 10 minutes, most EV owners won't benefit from the advanced technology for now.

In other words, there's no alternative but for tens of thousands of charging stations around Europe to continue to sit unused, Massey told DW.

"It almost certainly needs to be a case of the infrastructure leading the vehicles, so that consumers get used to seeing fast chargers wherever they normally stop, and they don't worry about being able to access a charger."

Colorado regulators green-light Xcel’s plan boosting renewables, cutting coal

Critics say plan to retire coal-fired plants early will cost ratepayers millions

Project coordinator Tim Farmer walks past cooling towers for Comanche 3, which will add 750 megawatts of power. Critics cite the plant’s cost and its use of coal, and they question whether, with energy demand dropping, it is even needed.

By JUDITH KOHLER | jkohler@denverpost.com |

PUBLISHED: August 27, 2018 at 9:01 pm | UPDATED: August 27, 2018 at 9:03 pm

A plan by Xcel Energy Colorado to boost the share of power it gets from wind and solar and retire a third of its coal generation was green-lighted Monday by state regulators.

The Colorado Public Utilities Commission voted 2-1 in support of what Xcel calls the Colorado Energy Plan, which the company says will cut carbon dioxide emissions by nearly 60 percent, increase renewable energy sources to 55 percent of its mix by 2026 and save customers about $213 million.

As part of the plan, Xcel, Colorado’s largest electric utility, will phase out its Comanche 1 and 2 coal-fired plants in Pueblo about a decade earlier than the original target date of 2035. Xcel says the plan will invest $2.5 billion in eight counties and save customers about $213 million, thanks to the declining costs of renewable energy.

The commission is expected to issue a written decision approving the plan in the first week of September.

Utilities commissioners, staff and supporters said the low costs of wind and solar and the improving technology of batteries that store energy from renewable sources present “a rare opportunity” to advance an energy program that will reduce greenhouse gas emissions while promoting a clean-energy economy.

“The Colorado Energy Plan Portfolio is a transformative plan that delivers on our vision of long-term, low-cost clean renewable energy for our customers, stimulating economic development in rural Colorado, and substantially reducing our carbon emissions,” Alice Jackson, Xcel Energy Colorado president, said in a written statement. “We are excited to move forward.”

Xcel’s plan, filed in June, will significantly boost power from renewable energy sources and phase out 660 megawatts of coal power by retiring the two coal-fired units in Pueblo. The utility will add about 1,100 megawatts of wind, 700 megawatts of solar, 275 megawatts of battery storage and 380 megawatts from existing natural gas sources.

One megawatt provides power to 1,100 Colorado homes.

An alternate plan, which the Colorado Office of Consumer Counsel supported, would have slightly reduced the solar and battery storage capacity. Commissioner Wendy Moser, who supported the alternate, said shutting down the coal plants earlier than planned isn’t “going to be free.”

“Customers will pay higher rates” to cover the costs, said Moser, who does agree with closing the coal plants.

Moser and Commissioner Frances Koncilja also noted that jobs will be lost when the coal plants close. Xcel has said it will try to find other jobs for displaced workers.

The utilities commission staff and others, including a coalition led by the Independence Institute, a libertarian think tank, questioned Xcel’s estimates of savings for customers. Commission adviser Bob Bergman said the staff believes the $213 million projected savings is “likely overstated.”

“Most savings won’t occur until after 2034,” Bergman said.

Amy Oliver Cooke, the Independence Institute’s executive vice president, said the ratepayers’ coalition analysis projected no cost savings until 2046 and up to $500 million in rate increases to pay for Xcel’s plan. She said because wind and solar prices keep dropping, it would make better economic sense to wait to build new facilities until prices drop even further.

In 2016, Xcel sought bids from energy companies as it was developing its electric resource plan. Xcel received more than 400 bids, many of those at historically low prices for wind and solar energy. Bergman called the number of bids and prices unprecedented. He said it’s important to take advantage of the low prices, existing tax credits and environmental benefits.

Western Resource Advocates, a Boulder-based conservation group, said Xcel’s plan will dramatically cut carbon pollution, create hundreds of jobs and invest in Colorado’s rural economy.

“Colorado’s bold decision to invest in clean energy and a healthier future for the next generation shows what the public – and the marketplace – already know, that conservation and clean energy go hand in hand with a growing, healthy economy,” said Jon Goldin-Dubois, president of Western Resource Advocates.

Numerous children have been poisoned by lead in homes approved by D.C. housing inspectors

By Terrence McCoy, Reporter

August 15

Chanelle Mattocks remembers everything about that night in 2014, when lead poisoned her son.

She was giving Alonzo, then 3, a bath in a tub that her landlord had just painted to pass a housing inspection. She turned to find a washcloth, and when she swiveled back, she found the boy with bits of peeling paint in his mouth. She tried to get it out, but it was too late.

The lead tests came back positive: Alonzo had more than double what the government defines as “elevated,” and he hasn’t been the same since.

Between March 2013 and March 2018, at least 41 families discovered that their homes, subsidized by a housing voucher and approved by city inspectors, contained lead contaminants, according to a tabulation requested by The Washington Post through the Freedom of Information Act.

The District Department of Energy and Environment, which performed the count and the testing, said it inspected about half of the homes because a child living at the property, or visiting it often, had tested positive for elevated levels of lead; the other homes were investigated following a tip about possible lead hazards. The agency said that the list wasn’t exhaustive and that there may be more.

The findings again highlight key weaknesses in federal guidelines established by the U.S. Department of Housing and Urban Development, which the District and other cities follow. Many rental properties supported by housing vouchers in the city receive inspections under these standards. But they require only visual inspections for peeling paint and don’t mandate lead testing, unlike states such as Maryland and Rhode Island.

“You cannot detect with any certainty that a house does not contain toxic lead dust without doing a dust test, period,” said Ruth Ann Norton, president of Baltimore’s Green & Healthy Homes Initiative and one of the nation’s foremost experts on lead-poisoning prevention.

Since 2013, the District has subsidized and inspected more than 18,900 properties, all while it tries to meet a crisis in homelessness and affordable housing. In the first seven months of 2018, the D.C. Department of Human Services placed 367 homeless families — nearly three times as many as it did in 2013, according to city statistics.

Rick White, a spokesman for the District Housing Authority, which performs many of the inspections for subsidized properties, said that most of the voucher properties in the tabulation were overseen by the agency. After hazardous lead was found in the homes, some families moved out when their landlords did not abate the contamination. Other landlords cleared the properties of lead hazards and provided documentation to city authorities, and the families stayed. It is the landlords’ responsibility, he said, to ensure that the homes are free of hazardous lead.

“I do not want you, or your newspaper, mistakenly believing or inaccurately reporting that DCHA is not fully meeting its legal obligations,” he said, adding that the city is also reviewing how cities that have made strides in lead remediation, such as Baltimore, conduct their lead inspections. “Rest assured that if federal laws or regulations are amended, then we will adjust our operating practices accordingly. . . . In all cases, DCHA immediately takes appropriate actions against any private property owner where a DCHA inspector identifies peeling paint.”

The fix for peeling paint, however, often includes another coat of paint. But superficial and cosmetic fixes, according to housing advocates, lawyers and tenants, do little to address more significant and underlying issues, such as plumbing problems or leaking roofs, that can cause paint to crack and peel again. And that’s when lead paint, effectively banned in 1978, becomes dangerous.

“Sometimes families chose housing that may not be great because they feel like they don’t have any other options,” said Kathy Zeisel of the Children’s Law Center. “They may believe the coat of paint has resolved the issue, but by the end of the month, the paint is peeling all over again, and the water is coming through the walls.”

It was a problem for Donna Black. She moved into a house on Rittenhouse Street in Northwest Washington with her housing voucher in 2013, while she was pregnant. When she first saw the home, she didn’t feel good about it but didn’t want to seem “choosy.” Plus, the inspectors had said it was okay, so she assumed it was safe.

“That was very false,” she said.

The roof started leaking. The paint started peeling. She gave birth. She named the baby Damion.

A year later, his blood carried twice the amount of lead the government calls elevated, although most advocates and scientists say any trace of lead in a child’s system can lead to diminished cognitive function.

Four years after that, Black is homeless, living in a Holiday Inn Express with Damion, whose needs her life revolves around. “My son is not a normal 3-year-old,” she said.

A lot of days, she’s filled with anger.

“We’re very upset with the city,” she said. “The city is the number one reason why this has happened to my son. . . . They let our family move in there, and it was fixed up to the point where it could look like it was okay, but it really wasn’t.”

Mattocks, too, has trouble understanding how to raise a child who is different from her seven other children. Alonzo, now 7, is always behind in his schooling, and she worries about what sort of life he will have. “I’m worried that, as an African American male, they’re already having so many issues with police brutality and being discriminated against that I’m fearful . . . that this will be another barrier that he’ll have to try to get through.”

Mattocks and Black filed lawsuits against the housing authority and their landlords in District Superior Court in 2016, but the housing authority was dismissed from the cases after arguing that it wasn’t liable, although that decision is being appealed. “There really should be stricter standards to protect the children,” said Alan Mensh, the attorney representing the two.

Scott Muchow, the landlord for Mattocks’s property, declined to address specific questions about Alonzo’s lead poisoning. “In late 2016, I received notice of a lawsuit for lead paint related issues at the property from Ms. Mattocks, but during discovery, Ms. Mattocks chose to voluntarily dismiss the case,” Muchow said in a statement.

The lawsuit against Black’s landlord, Jerome Lindsey, who could not be reached for comment, is pending.

Mattocks and Black said they were less interested in money than a sense of justice. They moved into homes that were supposed to be safe but turned out to be anything but, and now they’re raising children whose needs exceed their means. And no one, they say, wants to take responsibility.

“So who do we hold responsible?” Mattocks said. “We have to hold the city accountable, and the landlords accountable, we have to hold all of these people accountable . . . so that the children we call our future, we take care of these children. . . . But how do we do that if we don’t hold them accountable?”

Terrence McCoy covers poverty, inequality and social justice in urban and rural America. He is the recipient of numerous awards, including the 2016 George Polk Award for stories that showed how companies in an obscure industry made millions of dollars from exploitative deals with the poor and disabled. He has served in the Peace Corps in Cambodia.

St. Louis suburb suspends curbside recycling collection

By Katie Pyzyk (@_PyintheSky) | Published Aug. 21, 2018

Officials in the St. Louis, Missouri suburb of Kirkwood have decided to suspend the city's curbside recycling collection program, starting on October 22.

Leaders made the decision because they have been unable to find a company to process mixed recyclables after their current processor, Resource Management, notified the city that it would no longer accept single-stream materials.

The situation is being blamed on unfavorable market conditions brought on by China's regulatory measures for tighter contamination standards, according to the St. Louis Post-Dispatch.

Kirkwood leaders expect that other St. Louis-area municipalities will be forced to follow suit and suspend their curbside recycling programs because of unfavorable market conditions. In another suburb, Hazelwood, Republic Services operated one of its two major processing facilities for the region's recyclables. Paper makes up about one-third of the recycling stream at that facility, and Republic indicates the paper market has all but dried up since China enacted its stricter standards.

A statement on Kirkwood's website says employees are actively seeking alternatives that would allow the curbside program to continue, but so far none have materialized. "The City at this time has been unable to find another company in the region that can process our mixed recyclables," the statement says.

Prior to the China-induced market trouble, Kirkwood was receiving $15 per ton from Resource Management for the single-stream materials, but the hauler recently began charging the city $35 per ton for material dropped off at its facility. That change is projected to cost the city an additional $13,000 per year, though leaders say the curbside program cancellation isn't simply a matter of finances.
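
For a sense of scale, a rough, unofficial calculation suggests how much material sits behind that estimate: the swing from earning $15 per ton to paying $35 per ton is $50 per ton, so a $13,000 annual hit implies on the order of 260 tons of curbside recyclables a year, assuming the full swing drives the city's figure.

# Hypothetical back-of-the-envelope check of Kirkwood's projected cost (not a city figure)
old_revenue_per_ton = 15      # what the city used to receive, per the article
new_charge_per_ton = 35       # what the processor now charges, per the article
extra_cost_per_year = 13000   # the city's projected additional annual cost

swing_per_ton = old_revenue_per_ton + new_charge_per_ton      # $50 per ton
implied_tons_per_year = extra_cost_per_year / swing_per_ton
print(round(implied_tons_per_year))  # prints 260 -- implied tons of recyclables per year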

“We have to have a place that will accept the collected mixed recyclables,” said Russ Hawes, Kirkwood’s chief administrative officer, in a statement. “This decision was not based on costs.”

If the city cannot find a processor, it will retrofit the depository where residents used to drop off separated recyclables prior to the curbside program launch in 2011. Residents will be able to bring source-separated paper, glass, aluminum, tin and some plastics.

The city will launch an educational campaign next month to inform residents about these changes and the importance of reducing contamination. The existing curbside recycling carts will be relabeled and used for traditional trash pick-up. Although the city says it is trying to salvage its curbside recycling program, the measures it's taking to move in another direction may signal a point of no return.

“We have to prepare a plan of action if we cannot find a facility to take our mixed recyclables," said Kirkwood Mayor Tim Griffin in a statement. “We are doing everything we can to save our curbside recycling program, in light of the sudden and drastic recycling market changes.”

Democratic National Committee Backtracks On Its Ban Of Fossil Fuel Donations

The move comes just two months after the party adopted a resolution to prohibit oil, gas and coal company contributions.

By Alexander C. Kaufman

Tom Perez, chair of the Democratic National Committee, introduced the new resolution to “support fossil fuel workers.”

The Democratic National Committee passed a resolution Friday afternoon that activists say effectively reverses a ban on fossil fuel company donations.

The resolution introduced by DNC Chair Tom Perez states that the party “support[s] fossil fuel workers” and will accept donations from “employers’ political action committees.” It was approved by a 30-2 vote just two months after the committee adopted another resolution prohibiting donations from fossil fuel companies by a unanimous vote.

The new resolution nods to “forward-looking employers” that are “powering America’s all-of-the-above energy economy and moving us towards a future fueled by clean and low-emissions energy technology, from renewables to carbon capture and storage to advanced nuclear technology.”

“I am furious that the DNC would effectively undo a resolution passed just two months ago just as the movement to ban fossil fuel corporate PAC money is growing (and Democrats are winning),” said R.L. Miller, president of the super PAC Climate Hawks Vote, who co-sponsored the original resolution.

DNC spokeswoman Xochitl Hinojosa said in an email that the new resolution was “not a reversal,” noting in a statement after the vote that “any review of our current donations reflects” the Democrats’ “commitment” to turn away the fossil fuel industry. The DNC has not accepted any fossil fuel donations since adopting the ban.

The key difference could be whether the new resolution applies only to campaigns ― in which case, it may not annul the original resolution but would “repudiate the spirit” of the earlier one, according to Jerald Lentini, deputy director of the Democratic fundraising group It Starts Today.

“Smart Democrats are very good at splitting hairs and nitpicking,” Miller said. “It’s trying to manufacture distinctions out of whole cloth.”

Party activists ― including Christine Pelosi, the main author of the first resolution and House Minority Leader Nancy Pelosi’s daughter ― hoped the DNC would consider a second proposal this month to stop accepting contributions over $200 from individuals who work for the fossil fuel industry. That, the thinking went, would limit the influence of high-paid executives while remaining open to the working class that Democrats aim to champion. The original resolution, which Perez voted for in June, barred the DNC from accepting contributions from corporate PACs tied to oil, gas and coal companies. But it allowed for the DNC to continue accepting individual donations from workers in those industries.

Instead, the party appears to be backtracking.

The DNC’s new resolution “reaffirms its unwavering and unconditional commitment to the workers, unions and forward-looking employers that power the American economy,” according to the text.

As such, it states that the DNC “will continue to welcome the longstanding and generous contributions of workers, including those in energy and related industries, who organize and donate to Democratic candidates individually or through their unions’ or employers’ political action committees.”

The resolution, proposed as historic wildfires are scorching California, makes no mention of climate change.

Hinojosa said the resolution came in response to “concerns from Labor” that the original fossil fuel donations ban “was an attack on workers.” Just 4.4 percent of workers in the mining sector ― including coal, oil and gas ― are union members, according to the Union Membership and Coverage Database. The International Brotherhood of Electrical Workers, however, remains a strong supporter of building pipelines and donated more than $305,000 to the DNC this year.

To put a fine point on it: This proposal isn't to let union members keep donating to the DNC. It's to let fossil fuel executives keep donating and selling influence among Democrats. Certain unions (including some building trades) see their interests as aligned with those of executives.

The strength of the fossil fuel donations ban seemed in question almost immediately after it passed. The DNC refused to announce the resolution, declining to comment to HuffPost for a story that made the vote public.

At the Texas Democratic Party’s convention two weeks later, a state party official opposed a state-level proposal to ban fossil fuel donations and oppose new gas extraction, arguing that the DNC’s own resolution was not set in stone.

A.J. Durrani, a retired engineer and manager at the oil giant Shell who recently joined the national party committee, said the DNC did not include the earlier vote in the minutes from its last executive committee meeting.

“There was no mention in it,” Durrani said by phone in June. As far as he was concerned, he said, “As of right now, the DNC has not voted.”

Durrani did not immediately respond to a request for comment on Friday.

Texas Democrats ultimately voted down their proposed resolution.

In February 2017, DNC rank-and-file rejected another Pelosi resolution to forbid “registered, federal corporate lobbyists” from serving as “DNC chair-appointed, at-large members” and to reinstate former President Barack Obama’s ban on corporate PAC donations.

Obama halted contributions from PACs and lobbyists in 2008 after winning the party’s presidential nomination. But then-DNC Chair Debbie Wasserman Schultz loosened the restrictions in July 2015 before completely rolling back the ban in February 2016, nine months before that year’s presidential election.

The energy and natural resource sectors, including fossil fuel producers and mining companies, gave $2.6 million to the DNC in 2016, according to data collected by the nonpartisan Center for Responsive Politics. That’s less than 5 percent of the $56.1 million that the finance and real estate sectors ― the DNC’s largest corporate donors ― contributed that year.

Oil and gas companies spent a record $7.6 million on Democratic races in 2016. That’s a pittance compared to the $53.7 million in direct donations to Republicans, who received 88 percent of the industry’s contributions during that election cycle. Republicans have taken in 89 percent of the industry’s donations so far in 2018. That figure rises to 95 percent of the coal sector’s largesse this year.

Coal miner Michael Nelson shakes hands with President Donald Trump as Trump prepares to sign Resolution 38, which nullifies the coal industry’s “stream protection rule.”

But even as fossil fuel companies entrench with Republicans and the Trump administration continues deregulating drilling, mining and emissions, Democrats remain slow to mount a serious challenge to the industry most responsible for anthropogenic global warming.

On the state level, Washington Gov. Jay Inslee (D) failed to pass the nation’s first carbon tax even as his party enjoys complete control over the deep-blue state. New York Gov. Andrew Cuomo (D) has refused to swear off fossil fuel donations, halt new fracked gas infrastructure in the state, or support a widely hailed climate bill despite an aggressive primary challenge from progressive Cynthia Nixon and intense pressure from environmentalists.

On the national level, most bills from Democrats who purport to be Congress’ biggest climate hawks amount to half measures, either exempting major polluters such as the meat industry, directing carbon tax revenues that could be used to mitigate the effects of climate change to lowering taxes, or waiting until 2050 to end fossil fuel use.

There are some bright spots. Last September, Rep. Tulsi Gabbard (D-Hawaii) introduced the Off Fossil Fuels for a Better Future Act ― considered one of the most progressive climate bills yet proposed ― which calls for ending oil, gas and coal use by 2035, cutting all subsidies to drilling, mining and refining companies, and funding programs to help workers transition into new industries.

Alexandria Ocasio-Cortez, the likely next representative for New York’s 14th Congressional District, has vowed to push for a “Green New Deal,” a federal plan to spur “the investment of trillions of dollars and the creation of millions of high-wage jobs.” Other progressive candidates are now rallying around the Green New Deal concept and calling for nascent proposals for a jobs guarantee program to be married to renewable energy targets. That, proponents say, may be the key to wooing unions away from supporting lucrative fossil fuel projects.

NASA launching Advanced Laser to measure Earth's changing ice

Washington DC (SPX) Aug 23, 2018

NASA's Ice, Cloud and land Elevation Satellite-2 (ICESat-2) spacecraft arrives at the Astrotech Space Operations facility at Vandenberg Air Force Base in California ahead of its scheduled launch on Sept. 15, 2018.

Next month, NASA will launch into space the most advanced laser instrument of its kind, beginning a mission to measure - in unprecedented detail - changes in the heights of Earth's polar ice.

NASA's Ice, Cloud and land Elevation Satellite-2 (ICESat-2) will measure the average annual elevation change of land ice covering Greenland and Antarctica to within the width of a pencil, capturing 60,000 measurements every second.

"The new observational technologies of ICESat-2 - a top recommendation of the scientific community in NASA's first Earth science decadal survey - will advance our knowledge of how the ice sheets of Greenland and Antarctica contribute to sea level rise," said Michael Freilich, director of the Earth Science Division in NASA's Science Mission Directorate.

ICESat-2 will extend and improve upon NASA's 15-year record of monitoring the change in polar ice heights, which started in 2003 with the first ICESat mission and continued in 2009 with NASA's Operation IceBridge, an airborne research campaign that kept track of the accelerating rate of change.

A Technological Leap

ICESat-2 represents a major technological leap in our ability to measure changes in ice height. Its Advanced Topographic Laser Altimeter System (ATLAS) measures height by timing how long it takes individual light photons to travel from the spacecraft to Earth and back.

"ATLAS required us to develop new technologies to get the measurements needed by scientists to advance the research," said Doug McLennan, ICESat-2 project manager at NASA's Goddard Space Flight Center.

"That meant we had to engineer a satellite instrument that not only will collect incredibly precise data, but also will collect more than 250 times as many height measurements as its predecessor."

ATLAS will fire 10,000 times each second, sending hundreds of trillions of photons to the ground in six beams of green light. The roundtrip of individual laser photons from ICESat-2 to Earth's surface and back is timed to the billionth of a second to precisely measure elevation.

With so many photons returning from multiple beams, ICESat-2 will get a much more detailed view of the ice surface than its predecessor, ICESat. In fact, if the two satellites were flown over a football field, ICESat would take only two measurements - one in each end zone - whereas ICESat-2 would collect 130 measurements between each end zone.
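
Those figures hang together arithmetically. As a minimal sketch, assuming a ground-track speed of roughly 7 kilometers per second (an assumption; the article does not give the satellite's speed), the quoted pulse rate reproduces both the 60,000 measurements per second and the football-field comparison:

# Illustrative consistency check of ICESat-2's along-track measurement density
pulse_rate_hz = 10000        # laser shots per second, from the article
beams = 6                    # beams of green light, from the article
ground_speed_m_s = 7000      # assumed ground-track speed, ~7 km/s

measurements_per_second = pulse_rate_hz * beams               # 60,000, matching the article
spacing_m = ground_speed_m_s / pulse_rate_hz                  # ~0.7 m between shots along each beam
field_length_m = 91.4                                         # football field, end zone to end zone
print(measurements_per_second, round(field_length_m / spacing_m))  # prints 60000 131 -- i.e. roughly 130 measurements, as described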

As it circles Earth from pole to pole, ICESat-2 will measure ice heights along the same path in the polar regions four times a year, providing seasonal and annual monitoring of ice elevation changes.

Tracking Ice Melt

Hundreds of billions of tons of land ice melt or flow into the oceans annually, contributing to sea level rise worldwide. In recent years, contributions of melt from the ice sheets of Greenland and Antarctica alone have raised global sea level by more than a millimeter a year, accounting for approximately one-third of observed sea level rise, and the rate is increasing.

ICESat-2 data documenting the ongoing height change of ice sheets will help researchers narrow the range of uncertainty in forecasts of future sea level rise and connect those changes to climate drivers.

ICESat-2 also will make the most precise polar-wide measurements to date of sea ice freeboard, which is the height of sea ice above the adjacent sea surface. This measurement is used to determine the thickness and volume of sea ice.
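
The freeboard-to-thickness step rests on hydrostatic equilibrium: floating ice displaces its own weight in seawater, so a small freeboard above the surface implies a much thicker slab below it. The Python sketch below illustrates the relationship with nominal densities and no snow load; it is a simplified illustration, not the mission's retrieval algorithm.

RHO_WATER = 1025.0  # kg/m^3, typical seawater density
RHO_ICE = 915.0     # kg/m^3, typical sea-ice density

def ice_thickness_from_freeboard(freeboard_m):
    """Archimedes' principle: rho_ice * h = rho_water * (h - freeboard), solved for thickness h."""
    return RHO_WATER * freeboard_m / (RHO_WATER - RHO_ICE)

# A 0.30 m freeboard implies roughly 2.8 m of total ice thickness under these assumptions.
print(round(ice_thickness_from_freeboard(0.30), 1))  # 2.8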

Satellites routinely measure the area covered by sea ice and have observed an Arctic sea ice area decline of about 40 percent since 1980, but precise, region-wide sea ice thickness measurements will improve our understanding of the drivers of sea ice retreat and loss.

Although floating sea ice doesn't change sea level when it melts, its loss has different consequences. The bright Arctic ice cap reflects the Sun's heat back into space. When that ice melts away, the dark water below absorbs that heat. This alters wind and ocean circulation patterns, potentially affecting Earth's global weather and climate.

Beyond the poles, ICESat-2 will measure the height of ocean and land surfaces, including forests. ATLAS is designed to measure both the tops of trees and the ground below, which - combined with existing datasets on forest extent - will help researchers estimate the amount of carbon stored in the world's forests. Researchers also will investigate the height data collected on ocean waves, reservoir levels, and urban areas.

Potential data users have been working with ICESat-2 scientists to connect the mission science to societal needs. For example, ICESat-2 measurements of snow and river heights could help local governments plan for floods and droughts.

Forest height maps, showing tree density and structure, could improve computer models that firefighters use to forecast wildfire behavior. Sea ice thickness measurements could be integrated into forecasts the U.S. Navy issues for navigation and sea ice conditions.

"Because ICESat-2 will provide measurements of unprecedented precision with global coverage, it will yield not only new insight into the polar regions, but also unanticipated findings across the globe," said Thorsten Markus, an ICESat-2 project scientist at Goddard. "The capacity and opportunity for true exploration is immense

Environmental groups fight back against corporate lawsuits

BISMARCK, N.D. (AP) — Twenty environmental and civil liberties groups are fighting back against lawsuits they believe are aimed at limiting free speech and silencing critics.

The “Protect the Protest” task force, announced Tuesday, targets what are known as strategic lawsuits against public participation, or SLAPPs, which use legal action and the threat of financial risk to deter people and groups from speaking out against something they oppose.

“We know from our own experience that this legal bullying tactic will work if it’s not shut down,” said Katie Redford, co-founder and director of EarthRights International.

The effort will include billboard advertisements, training sessions for journalists and nonprofits, panel discussions and rallies outside the corporate offices of companies the groups believe use such lawsuits.

Rallies are planned next week in San Francisco, New York City and Dallas. Dallas is the base for Energy Transfer Partners, the company that built the Dakota Access oil pipeline and sued Greenpeace, Earth First and BankTrack for up to $1 billion for allegedly working to undermine the $3.8 billion project to move North Dakota oil to a shipping point in Illinois.

Greenpeace and the Center for Constitutional Rights, which also is involved in helping defend against that lawsuit, are among the Protect the Protest participants.

Spokeswomen for ETP did not immediately respond to a request for comment Tuesday.

The company’s lawsuit filed a year ago alleges the environmental groups disseminated false and misleading information about the project and interfered with its construction. ETP maintains that the groups’ actions interfered with its business, facilitated crimes and acts of terrorism, incited violence, targeted financial institutions that backed the project, and violated defamation and racketeering laws. The groups maintained the lawsuit was an attack on free speech.

U.S. District Judge Billy Roy Wilson this summer dismissed both BankTrack and Earth First as defendants. He said ETP failed to make a case that Earth First is an entity that can be sued, and that BankTrack’s actions in imploring banks not to fund the pipeline did not amount to radical ecoterrorism.

EarthRights International helped defend BankTrack, assistance that Redford said exemplifies the type of collective effort the task force will bring.

Wilson also ordered ETP to clarify its claims against Greenpeace, and has given that group until Sept. 4 to file its response to ETP’s amended complaint.

Greenpeace USA Executive Director Annie Leonard on Tuesday said a $300 million lawsuit filed against the group by the Canadian timber industry over its forest protection advocacy is another example of the type of lawsuits the task force hopes to battle.

“A healthy democracy is a precondition for a healthy environment, and we can’t have a healthy democracy without informed, engaged public dissent,” she said.

Algae Bloom in Lake Superior Raises Worries on Climate Change, Tourism

"In 19 years of piloting his boat around Lake Superior, Jody Estain had never observed the water change as it has this summer. The lake has been unusually balmy and cloudy, with thick mats of algae blanketing the shoreline.

“I have never seen it that warm,” said Mr. Estain, a former Coast Guard member who guides fishing, cave and kayak tours year-round. “Everybody was talking about it.”

But it was not just recreational observers along the shores of the lake who noticed the changes with concern. Lake Superior, the largest of the Great Lakes with more than 2,700 miles of shoreline, is the latest body of water to come under increased scrutiny by scientists after the appearance this summer of the largest mass of green, oozing algae ever detected on the lake."

Portland tidal energy startup entering rough investment waters

Buoyed by a recent $6 million infusion, Ocean Renewable Power Co. hopes to raise $12 million this year and $18 million later, but venture funds have been favoring wind and solar projects.

By Peter McGuire, Staff Writer

A tugboat maneuvers Ocean Renewable Power Co.'s RivGen Power System into place on the Kvichak River in Alaska. Over a two-year trial, the system provided about a third of the power for a remote village nearby. Courtesy of Ocean Renewable Power Co.

Portland-based alternative energy startup Ocean Renewable Power Co. is stepping into rough financial waters as it tries to secure investments worth tens of millions of dollars to advance its unique marine power projects.

The company wants to raise $12 million in private investment by the end of the year to set up improved versions of its RivGen and TidGen underwater turbines, said founder and CEO Christopher Sauer. Ocean Power, which recently got a $6 million infusion, has developed a sustainable technology that allows underwater turbines to generate electricity from flowing rivers and shifting tides.

An initial funding round will be followed by a push for another $18 million intended to start production for the commercial market and reach profitability in the next four years.

“We are getting closer and closer to commercialization. We have a clear path how to get there,” Sauer said.

But the company is fighting investment headwinds. Despite the promise of tidal energy, big investors are shying away from marine projects in favor of wind turbines and solar panels, cheaper proven technologies with relatively low risk and high yields.

“It is a tough world out there,” said Angus McCrone, chief editor of Bloomberg New Energy Finance. “Most of the big funds want to invest in wind and solar projects, not tech developments.”

Recent memories of well-funded, but ultimately failed, wave energy companies like Scotland-based Pelamis Wave Power make venture funds wary of fronting new marine power attempts, McCrone said.

“It doesn’t mean it is impossible to raise money, but it is more difficult than it was 10 years ago,” he said.

Globally, marine power receives the least investment by far compared with other renewable energies, excluding large hydro dams.

The marine energy sector attracted just $200 million in total new investment worldwide in 2017, a 14 percent drop from the year before. In contrast, investors sank $161 billion into solar and $107 billion into wind last year, according to the 2018 annual renewable energy investment report by the United Nations Environment Program and Bloomberg New Energy Finance.

“From an investor perspective, tidal has been very much a sideshow and is of lower importance,” said Edward Guinness, lead manager for the alternative energy fund at Guinness Atkinson Asset Management, a firm with offices in Los Angeles and London.

The fund manages about $400 million in energy investments, with about $30 million in alternative energy, Guinness said.

“The thing about tidal is that it is a very nice idea, it is very compelling,” he said. “But it has not been providing cost reductions that other technologies have and it is unlikely to do so.”

Marine power ventures tend to be expensive, risky and unproven – prone to escalating costs and damage from saltwater and heavy seas, Guinness said.

“We are not really looking to invest in early-stage tech companies,” he said. “We are looking for reasonably mature investment opportunities.”

Sauer is clear-eyed about the realities of lining up money to move ORPC ahead, but believes the company has the right plan.

It intends to install optimized versions of its turbines on the Kvichak River in Alaska and the St. Lawrence River near Quebec City in Canada – models intended to demonstrate the possibility of power generation in remote communities.

“We think once those systems are up and operating that it is a good break point to raise the rest of the money we need to be cash flow-positive by 2022,” Sauer said.

The 14-year-old company has received millions of dollars in federal and state funding and private investment on its path to making commercially viable hydrokinetic power systems. In 2012, ORPC became the first company in the U.S. to deliver electricity to the power grid from a tidal turbine, in Cobscook Bay in easternmost Maine.

Since that experiment, the company has branched out. In 2014, it tested a river version of its turbine on the Kvichak River, and over a two-year trial proved it could provide about a third of the power for Igiugig, a remote village nearby.

In June, the Igiugig Village Council and ORPC were awarded $2.3 million from the Department of Energy for further tests of the river turbine. The company received a separate $650,000 award for research and development and has pulled in about $3 million from private investors this year, Sauer said.

A focus on remote communities in North America may give ORPC the base it needs to make the leap into commercial markets, Sauer said. Energy costs are typically very high in isolated villages and townships that depend on generators and fuel. Transportable underwater turbines are a cheaper and cleaner alternative and one that may provide local employment opportunities.

“The early adoption market for us are these remote communities because our systems today can save them money,” Sauer said.

If more communities buy into the product, it could help ORPC overcome the production cost barrier and provide a springboard into European and South American markets.

“Once these are operating and have benchmark performance, we are then in a sales mode,” he said.

The company also plans a new tidal project in Eastport with the prospect of generating 5 megawatts.

Raising money for marine power technology has always been tough, Sauer said. Most of the private investment his company has received has come from independent family funds focused on “impact” investing – putting money into companies that match investors’ ethical goals.

Recently, however, parties that have been on the sidelines, such as private equity firms and renewable energy funds, have shown more interest in marine power innovations, Sauer said.

“We are making progress,” he said. “I am hopeful that in our next ‘raise’ here we will have all types of investors involved.”

Peter McGuire can be contacted at 791-6365 or at:

pmcguire@pressherald.com

‘Toilet to tap’ water nearly matches bottled H2O in taste test, California university researchers discover

David Downey

PUBLISHED: August 17, 2018 at 8:49 am | UPDATED: August 17, 2018 at 1:19 pm

UC Riverside researcher Daniel Harmon with drinking water samples at the Riverside campus on Tuesday, Aug. 14, 2018. Harmon recently conducted a taste study on the differences among recycled water, tap water and commercial bottled water.

Saddled with the “toilet to tap” label, recycled water still has a bit of an image problem. But in a blind taste test, UC Riverside researchers found that people prefer its flavor over tap water and that they like it as much as bottled water.

Intuitively, that may sound crazy. But it makes sense, suggests UCR’s Daniel Harmon, lead author of the study analyzing the taste test, published recently in the journal Appetite.

“Bottled water and recycled water go through more or less identical purification processes,” Harmon said. Both, experts said, are subjected to reverse osmosis, which removes most contaminants.

The study is encouraging, water officials say, because it comes at a time when Southern California is having to rely increasingly on recycled water, and not just for turf and crop irrigation. As the planet warms, droughts become more severe and water supplies shrink. It also comes as state officials are expanding the ways agencies can filter recycled water and add it to drinking supplies. UCR’s research may help set the stage for one day piping it directly into drinking-water systems.

UC Riverside researcher and study lead author Daniel Harmon holds up clear plastic cups of drinking-water samples at the Riverside campus on Tuesday, Aug. 14, to demonstrate how his team conducted an experiment to see whether participants — college students between the ages of 18 and 28 — could tell the difference between recycled, tap and commercial bottled water in a blind taste test. (Photo by Watchara Phomicinda, The Press-Enterprise/SCNG)

“It’s inevitable that we’re going to have to use this resource more and more,” said Harmon, a doctoral candidate in developmental psychology at UCR.

Kevin Pearson, a spokesman for Eastern Municipal Water District, which supplies drinking water to more than 800,000 people in Riverside County, termed the results encouraging.

“This goes to show that people are willing to use this as a water source,” Pearson said.

Harmon’s team brought in 143 UCR students, ranging in age from 18 to 28, one at a time.

“We wanted to figure out whether people could tell the difference between recycled water, tap water and commercial bottled water,” Harmon said. “They were presented with three clear cups labeled A, B and C. They were completely blind to the source of any water.”

After tasting the water, participants rated samples on a scale of one to five — one indicating strong dislike and five strong liking. Harmon said bottled water received the highest score at 3.79, but recycled water nearly matched it at 3.77. The groundwater-based tap water sample scored 3.45.

“We were surprised that the groundwater was less liked,” he said.

Harmon said researchers also evaluated personalities and analyzed whether that factored into preferences. Their conclusion? It did.

They found that people who are open to new experiences tended to like the three samples the same. But people who are more nervous or anxious preferred bottled water, Harmon said.

“What we learned is, purity and freshness is king in water preference,” he said.

Harmon said the taste test was conducted in 2015, but the study was published this year. Researchers are considering a follow-up study, he said, though he would not hint at what it might entail.

The implications are huge. The state is moving toward more extensive reuse of the waste water that flows through our sewer lines. It’s already an important part of our supply.

Southern California consumes 3.5 million acre-feet of water annually. Of that, 460,000 acre-feet come from recycled waste water, said Deven Upadhyay, Metropolitan Water District assistant general manager and chief operating officer.

“It’s been on an uptick for years,” he said.

A common measuring unit in the water world, an acre-foot is about 325,000 gallons or what would cover an acre one foot deep. It’s what three Southern California families use during the course of a year.
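
Those figures are easy to check with back-of-the-envelope arithmetic; the Python snippet below uses standard conversion factors, plus the consumption numbers quoted above for the percentage.

SQ_FT_PER_ACRE = 43_560           # one acre in square feet
GALLONS_PER_CUBIC_FOOT = 7.48052  # U.S. gallons in a cubic foot

acre_foot_gallons = SQ_FT_PER_ACRE * 1 * GALLONS_PER_CUBIC_FOOT  # one foot of water over one acre
print(round(acre_foot_gallons))  # ~325,851 gallons, i.e. "about 325,000"

# Recycled water's share of Southern California's annual consumption:
print(round(460_000 / 3_500_000 * 100))  # ~13 percent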

Upadhyay said some recycled water irrigates parks, golf courses and sport fields. Some is used by industry. And some is used for drinking, showering and washing dishes.

Orange County residents are already doing the latter — on a large scale.

Orange County Water District, in partnership with the Orange County Sanitation District, just celebrated its 10th anniversary of operating the world’s largest recycled-water plant. It generates 100 million gallons of fresh water daily from waste water, said Denis Bilodeau, district president. And his agency is expanding the plant.

Orange County Water District officials were so confident in the purity of their recycled waste water that they handed out bottles of it in Hollywood in March 2017. (Photo courtesy of Orange County Water District)

The purified sewer water represents 30 percent of all water produced by the water district for residents of northern and central Orange County, Bilodeau said.

It’s injected into the ground.

“It goes through several hundred feet of sand and soil,” where it is subjected to additional filtration, he said. “And eventually it is taken up through drinking water wells.”

A similar plant is coming to Los Angeles County. Upadhyay said Metropolitan is partnering with the Sanitation Districts of Los Angeles County on a demonstration project that will be completed in 2019 and serve as a precursor to a large-scale operation.

Taking the process a step further, the San Diego area is preparing to add recycled water to a reservoir. The State Water Resources Control Board cleared the way for that through a roll-out of regulations in March.

By 2023, the board anticipates unveiling rules that would set the stage for piping recycled water directly into drinking systems.

George Tchobanoglous, a retired UC Davis professor of civil and environmental engineering who recently served on a statewide expert water panel, said it may be 10 years before that level of recycling arrives.

“I think that’s a ways off,” he said.

Tchobanoglous said some issues likely will have to be resolved and officials will have to secure the public’s confidence. It’s unfortunate, he said, that a number of years ago someone popularized the phrase “toilet to tap.”

But, he said, “It is inevitable.”

He said direct piping of recycled water into homes is done in several water-starved places around the globe now. One such example is South Africa.

One can also look to the stars to find an example. At the International Space Station, Bilodeau said, every drop is captured. “Even the perspiration is recycled.”

“This isn’t a science fiction thing,” he said. “It already occurs.”

US Government Security Agencies play tug of war over pipeline protection

Blake Sobczak, Sam Mintz and Peter Behr, E&E News reporters

Energywire: Thursday, August 23, 2018

A natural gas well pad operated by Cabot Oil and Gas Corp. in northern Pennsylvania is pictured. Blake Sobczak/E&E News

Natural gas pipeline companies are being pulled in three different directions as federal agencies mull how to handle new security threats to an increasingly vital resource.

Should the U.S. government bail out competitors to natural gas to ease the power grid's reliance on the fuel, as called for by a leaked plan from the Department of Energy?

Should policymakers preserve the status quo, counting on voluntary cooperation from the sector and a slim staff of specialists to gain a window into pipeline security, as the Department of Homeland Security favors?

Or should U.S. lawmakers consider beefing up gas security oversight and moving it out of DHS's hands, an idea raised in the halls of the Federal Energy Regulatory Commission?

Shared among all three agencies — and the energy firms lobbying them — is a sense that cyberthreats to the gas pipeline networks are only set to rise as companies digitize operations and hackers backed by foreign intelligence services grow more intrusive.

"We now are dealing with nation states," said Dave McCurdy, CEO of the American Gas Association (AGA), at a July 31 cybersecurity conference in New York City. "The government isn't necessarily organized for this 21st-century paradigm ... you've got some challenges with the federal agencies, if you're in industry."

The path chosen will inevitably reverberate in the bulk power grid, which in recent years has grown to rely on natural gas more than any other fuel source for generating electricity.

"There's more concern about what is the impact of what would happen if there is an interruption in the gas supply," McCurdy said.

The gas industry has pushed back against proposals to require baseline security standards for large pipelines and related infrastructure.

"The threat evolves too quickly for a regulation or mandate to be the most effective method of maintaining the highest level of safety," McCurdy said. Beyond opposing mandatory security standards, the gas industry is also opposing any independent assessments of whether its cyber and physical defenses adequately protect its networks.

The DOE plan

To Department of Energy strategists, grid security concerns already justify intervention through the Federal Power Act and a 1950s-era defense statute to prop up alternatives to natural gas.

In a draft policy memo leaked in June, DOE claimed that pipelines' distributed nature, coupled with menacing online threats to their digital control systems, makes them harder to secure from attack.

DOE has floated propping up economically ailing coal and nuclear plants to pre-empt a future in which the country's power grid relies overwhelmingly on "just-in-time" supplies from gas pipelines. DOE has not yet disclosed how many coal and nuclear plants it would support, how they would be chosen, and what subsidies would cost.

Because coal and nuclear plants have on-site fuel, the thinking goes, they would be more resilient in the face of cyber or physical attacks.

Energy Secretary Rick Perry has played up this argument, laying the rhetorical groundwork for a policy that has pre-emptively drawn fierce opposition from environmentalists and some quarters of the energy industry.

"Wind and solar are interruptible, and so [are] gas pipelines. The only forms that are not interruptible are coal and nuclear — because they've got fuel on-site," Perry said at a conference in Texas earlier this month (Energywire, Aug. 6).

Perry's critics say his rationale is a policy proposal in search of a security problem. A major attack on U.S. grid infrastructure is as likely to focus on high-voltage transmission systems or local power distribution utilities as on pipelines. If adversaries take down part of the grid, power flow from all generation is halted, experts note.

Nuclear plants are the only "unique" generators from a security standpoint, and are likely to have the most stringent cybersecurity and physical defenses imposed by the Nuclear Regulatory Commission, said Dewan Chowdury, founder and CEO of the cybersecurity company MalCrawler and a consultant to major gas and electric utilities.

The DHS plan

The Transportation Security Administration, better known for its role guarding the nation's airports, is charged with ensuring that vital gas pipelines are adequately protected against various threats.

E&E News reported last year that the DHS agency has assigned six full-time staffers to oversee the more than 300,000 miles of gas transmission lines crisscrossing the nation (Energywire, May 23, 2017). A TSA spokesperson confirmed that the number of full-time employees working on pipeline security remains the same.

"TSA has exercised security responsibilities over pipelines since 2002 and continues to exercise those responsibilities while working in conjunction with industry partners," the agency said in a statement.

The office relies on voluntary cooperation with large pipeline companies and industry groups like the AGA to gain a window into the sector's security practices and defenses.

Earlier this year, TSA published updates to nonbinding pipeline security guidelines that urge companies to lock down their corporate and operational networks from hackers.

But compliance with the guidelines is not enforced, and agency officials have said there is no specific timeline for pipeline firms to complete "enhanced cybersecurity measures" for their most critical facilities.

An E&E News report in 2017 documented TSA's lack of cybersecurity staffing and the absence of any systematic review of gas pipeline cyberdefenses, either by TSA, FERC or the industry itself.

An update of that reporting showed that the lack of oversight and accountability on cyber vulnerabilities has not significantly changed.

Meanwhile, homeland security officials have warned of Russia-linked hackers probing U.S. critical infrastructure networks across the country (Energywire, March 16).

While no cyberattack is known to have disrupted the flow of gas or electricity anywhere in the United States, hackers have interrupted third-party billing and document-sharing services used by large gas and power utilities. Several years ago, hackers thought to be linked to the Chinese government also launched a series of cyber intrusions into gas pipelines' corporate networks, according to law enforcement officials and DHS briefings.

"I do think you need to have this different conversation both with TSA, with DOE, with [FERC], with NERC and with the gas industry about how you get that door protected," said Steven Naumann, vice president of transmission and NERC policy at utility giant Exelon Corp. "I know it's an old adage, but if you're a burglar, you're going to go to the easy target. And if you're spending all this money, all this time, all this effort on protecting the cyber health of the electric system, and then you can attack the gas system, water system, railroad system ... then we're so vulnerable."

Gas industry representatives contend that the nature of the commodity — slowly moving through pipelines, rather than the near-instantaneous path of electrons — intrinsically helps protect gas pipelines from certain threats.

Jennifer O'Shea, AGA vice president for communications, noted that the dispersed, redundant design of the U.S. pipeline system heads off risks from single points of failure carrying major consequences.

She said the industry "actively partners with multiple federal and law enforcement agencies, exchanging threat intelligence with the government through the Downstream Natural Gas Information Sharing and Analysis Center (DNG-ISAC)."

Still, the extent of the gas industry's voluntary compliance with TSA standards isn't clear. E&E News asked AGA and the Interstate Natural Gas Association of America (INGAA) whether the industry has put in place any comprehensive review procedures of how well interstate pipelines are complying with TSA's voluntary guidelines. The query asked how many interstate pipeline companies had received TSA cyber reviews this year, and what the industry's expectations were for the timetable and scope of TSA future reviews of critical pipelines. Neither organization answered these questions.

The FERC plan

FERC has signaled that the independent agency wants to take on a larger role in ensuring the security of the pipelines that feed America's growing fleet of gas-fired power generators, although officials have been careful not to step on the toes of other entities.

"There should be no aspect of our nation's energy infrastructure that is left unprotected in a cyber sense, by whatever means we need to do that," said Republican FERC Chairman Kevin McIntyre at a June press conference.

He said he had not formulated a personal view on which government agency should be assigned oversight, or whether there should be mandatory standards, but did suggest that FERC could take on more authority in the future.

"I am not speaking for a formal FERC initiative or anything like that, but I wouldn't be terribly surprised to see if we were to move in that direction at some appropriate time; we're just not there now," he said.

His statements were noticed by others, including fellow Republican Commissioner Neil Chatterjee, who had suggested in a joint op-ed with Democratic Commissioner Richard Glick earlier this year that pipeline cybersecurity oversight be moved from the TSA to DOE.

"[McIntyre] wasn't ready to commit to commission action, but he did indicate he could see us taking action in the area," Chatterjee said in a recent interview.

Chatterjee, who called TSA's pipeline oversight office "clearly undermanned," noted that FERC is responsible for grid reliability, which could be endangered by cyberthreats to pipelines.

"FERC has a very serious role to play in this, but by suggesting the [oversight move to] DOE and not at FERC, this isn't just some jurisdictional grab," he said.

The PJM Interconnection, an eastern U.S. grid operator, has called on FERC to insist that all gas pipelines provide more information about operations and vulnerabilities to the power generators that depend on them. "In PJM's view, confidential information sharing should be both uniform and mandatory when the information is identified as needed to enhance the reliability" of grid and gas systems, said the grid operator.

"PJM urges the commission to drive further coordination through the exercise of its authority over both natural gas pipelines and the electric industry," the organization said (Energywire, April 17).

Now the gas industry must wait to see to what extent FERC will intervene, possibly by requiring independent assessments of pipelines' vulnerability to cyber or physical attacks, or natural disasters, that could cut off fuel supply to essential gas-fired power generators.

FERC Chief of Staff Anthony Pugliese highlighted the possibility of intervention earlier this month when he told an industry audience that FERC's staff was working with DOE, the Department of Defense and the National Security Council "to identify the plants that we think would be absolutely critical to ensuring that not only our military bases, but things like hospitals and other critical infrastructure are able to be maintained, regardless of what natural or man-made disaster might occur."

Pugliese singled out pipelines as a target for state-sponsored cyberattacks. "More and more, you have adversarial countries ... who see pipelines, for example, as an area of great opportunity; let's put it that way."

The INGAA challenged Pugliese's statements about pipeline vulnerability, with INGAA spokeswoman Cathy Landry pointing to a fact sheet on the industry's preparation.

"It appears that Mr. Pugliese could use a refresher on some basic facts and our industry's commitment to this very serious issue," Landry said in a statement.

INGAA also released a statement from its president, Don Santa, who said the DOE plan "represents a solution to a problem that does not exist. If the Energy Department acts, consumers will be saddled with as much as $11.8 billion to pay for the uneconomic coal and nuclear plants.

"That might be justifiable if these facilities increased the reliability of the grid. But they

don't".

U.S. Wind Power Is ‘Going All Out’ with Bigger Tech, Falling Prices, Reports Show

Three new government reports detail how the wind industry is expanding — offshore and onshore — and the role corporations, technology and tax credits are playing.

BY DAN GEARINO, INSIDECLIMATE NEWS

AUG 23, 2018

New technology is allowing for bigger, more efficient wind turbines that make wind energy possible even in areas with lower wind speeds.

Wind power capacity has tripled across the United States in just the last decade as prices have plunged and the technology has become more muscular, the federal government's energy labs report.

Three new reports released Thursday on the state of U.S. wind power show how the industry is expanding onshore with bigger, more powerful turbines that make wind energy possible even in areas with lower wind speeds.

Offshore, the reports describe a wind industry poised for a market breakthrough.

"Right now it's going full bore," said Mark Bolinger, a research scientist at the Lawrence Berkeley National Laboratory and co-author of one of the new reports. "The industry is really going all out."

Some of the key findings:

The country's wind energy capacity has tripled since 2008, reaching 88,973 megawatts by the end of 2017. Wind contributed 6.3 percent of the nation's electricity supply last year.

The average price of wind power sales agreements is now about 2 cents per kilowatt-hour, down from a high of 9 cents in 2009 and low enough to be competitive with natural gas in some areas.

State renewable energy requirements were once the leading driver of demand for new wind farms, but they accounted for just 23 percent of new project capacity last year, as corporate customers like Google and General Motors, among others, demand more clean energy.

Offshore wind is going from almost nothing, with just five wind turbines and 30 megawatts of capacity off Rhode Island, to 1,906 megawatts that developers have announced plans to complete by 2023.

"The short story is wind is doing well in the markets, has been doing well, and looks like it will continue to do well," said Michael Webber, deputy director of the energy institute at the University of Texas at Austin, who was not involved with the reports.

"It's despite a lot of these policy shifts that have happened under the Trump administration," he said, referring to proposed rules aimed at boosting fossil fuels. "It's as if the markets have spoken, and they've chosen wind."

Texas is the country's wind energy leader by a long stretch, with 22,599 megawatts of capacity, including 2,035 megawatts newly installed last year, which also led the nation.

Oklahoma was second for total capacity and new capacity last year, with 7,495 megawatts and 851 megawatts, respectively. That's likely to slow, though, after Gov. Mary Fallin signed legislation in April to end a key state tax incentive for wind power three years early.

Nationally, an important driver for onshore wind power expansion has been the federal Production Tax Credit, which is also phasing out. The ticking clock for incentives has led to a surge in projects heading into 2020, the final year when developers can get the full value of the credit.

Onshore Wind Boosted By Bigger Turbines

Turbines at land-based wind farms are getting larger and more efficient, according to the Lawrence Berkeley Labs report. This has pushed average turbine capacity to 2.32 megawatts, up 8 percent from the prior year and up more than 200 percent since the late-1990s.

Rotors are getting larger, blades are getting longer, and turbines are getting taller, with more projects exceeding 500 feet—the level at which federal aviation regulators need to issue a special permit.

Increases in size and other design improvements are allowing turbines to operate at levels closer to their full capacity. Turbines built between 2014 and 2016 had an average "capacity factor" of 42 percent, compared to an average capacity factor of 31.5 percent for projects built from 2004 to 2011. (For a capacity factor of 100 percent, a power plant would be running at full capacity around the clock.)
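
Capacity factor is simply the energy a plant actually delivers divided by what it would deliver running at full nameplate output around the clock. The short Python illustration below uses hypothetical numbers chosen to match the 42 percent figure above; it is not drawn from the report itself.

def capacity_factor(energy_delivered_mwh, nameplate_mw, hours=8760.0):
    """Fraction of the theoretical maximum output a plant actually delivers over a period."""
    return energy_delivered_mwh / (nameplate_mw * hours)

# Hypothetical example: a 2.32 MW turbine delivering about 8,535 MWh over a year
# runs at roughly 42 percent of its nameplate capacity.
print(round(capacity_factor(8_535.0, 2.32) * 100, 1))  # ~42.0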

The U.S. industry has a track record of market cycles that are tied to qualifying for federal tax credits. This is what led to a still-standing record for capacity additions in 2012, followed by a drop-off in 2013.

Current tax credits are likely to lead to another banner year in 2020 and 2021. "After that, it's less certain," Bolinger said.

But the annual ups and downs do not provide a complete picture, because wind farms take years to develop. For instance, looking at 2017 in isolation, new project completions were slightly down for the second year, yet it was the third consecutive year of 7,000 or more megawatts of new capacity, the first time that had happened in the U.S. market.

Offshore Wind Energy on the Rise

The U.S. offshore wind market is gearing up for major growth, with the tiny Block Island Wind Farm in Rhode Island about to get a lot of company, the new report from the National Renewable Energy Laboratory suggests.

Thirteen states have offshore wind projects in some phase of development, or areas that are likely to be open for leasing: California, Connecticut, Delaware, Hawaii, Maine, Maryland, Massachusetts, New Jersey, New York, North Carolina, Ohio, Rhode Island and Virginia.

In Massachusetts, the Vineyard Wind project, expected to be started next year and completed in 2021, stands to become the country's largest offshore wind farm for at least a few years at 800 megawatts.

Massachusetts is one of several states in the Northeast that have passed laws to encourage offshore wind, setting off a competition as states try to attract the companies that will serve the industry.

Offshore wind is an attractive option there, in part because it would be close to major population centers in a way that would be difficult for onshore wind or solar projects in densely populated areas.

But even with projects planned, the United States is far behind the offshore wind leaders. The United Kingdom and Germany have 5,824 and 4,667 megawatts, respectively, followed by China, with 1,823 megawatts.

Smartphones may be used to better predict the weather

Data could be harnessed to forecast flash floods and other natural disasters, Tel Aviv University researchers say

PUBLIC RELEASE: 23-AUG-2018

Flash floods occur with little warning. Earlier this year, a flash flood that struck Ellicott City, MD, demolished the main street, swept away parked cars, pummeled buildings and left one man dead.

A recent Tel Aviv University study suggests that weather patterns that lead to flash floods may one day be tracked and anticipated by our smartphones.

"The sensors in our smartphones are constantly monitoring our environment, including gravity, the earth's magnetic field, atmospheric pressure, light levels, humidity, temperatures, sound levels and more," said Prof. Colin Price of TAU's Porter School of the Environment and Earth Sciences, who led the research. "Vital atmospheric data exists today on some 3 to 4 billion smartphones worldwide. This data can improve our ability to accurately forecast the weather and other natural disasters that are taking so many lives every year."

Prof. Price collaborated with TAU master's student Ron Maor and TAU doctoral student Hofit Shachaf for the study, which was published in the Journal of Atmospheric and Solar-Terrestrial Physics.

Smartphones measure raw data, such as atmospheric pressure, temperatures and humidity, to assess atmospheric conditions. To understand how the smartphone sensors work, the researchers placed four smartphones around TAU's expansive campus under controlled conditions and analyzed the data to detect phenomena such as "atmospheric tides," which are similar to ocean tides. They also analyzed data from a UK-based app called WeatherSignal.

"By 2020, there will be more than six billion smartphones in the world," Prof. Price said. "Compare this with the paltry 10,000 official weather stations that exist today. The amount of information we could be using to predict weather patterns, especially those that offer little to no warning, is staggering.

"In Africa, for example, there are millions of phones but only very basic meteorological infrastructures. Analyzing data from or 10 phones may be of little use, but analyzing data on millions of phones would be a game changer. Smartphones are getting cheaper, with better quality and more availability to people around the world."

The same smartphones may be used to provide real-time weather alerts through a feedback loop, Prof. Price said. The public can provide atmospheric data to the "cloud" via a smartphone application. This data would then be processed into real-time forecasts and returned to the users with a forecast or a warning to those in danger zones.
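
The study does not publish an algorithm for that loop, but its general shape is easy to picture: phones report pressure readings, a server computes a pressure trend for each area, and areas where pressure is falling unusually fast receive a warning. The toy Python sketch below follows that outline; every name, threshold and data point is hypothetical.

def pressure_trend_hpa_per_hour(readings):
    """readings: list of (hours_ago, pressure_hpa) tuples reported by phones in one area."""
    if len(readings) < 2:
        return 0.0
    oldest = max(readings, key=lambda r: r[0])
    newest = min(readings, key=lambda r: r[0])
    elapsed_hours = (oldest[0] - newest[0]) or 1e-9  # guard against identical timestamps
    return (newest[1] - oldest[1]) / elapsed_hours

def flood_watch_areas(area_readings, drop_threshold=-1.5):
    """Flag areas whose crowd-reported pressure is falling faster than the threshold (hPa per hour)."""
    return [area for area, readings in area_readings.items()
            if pressure_trend_hpa_per_hour(readings) < drop_threshold]

# Hypothetical crowd data: area "B" shows a sharp three-hour pressure drop.
data = {"A": [(3.0, 1013.2), (0.0, 1012.8)],
        "B": [(3.0, 1009.0), (0.0, 1002.5)]}
print(flood_watch_areas(data))  # ['B']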

The study may lead to better monitoring and predictions of hard-to-predict flash floods. "We're observing a global increase in intense rainfall events and downpours, and some of these cause flash floods," Prof. Price said. "The frequency of these intense floods is increasing. We can't prevent these storms from happening, but soon we may be able to use the public's smartphone data to generate better forecasts and give these forecasts back to the public in real time via their phones."

Interesting Graph:

https://insideclimatenews.org/sites/default/files/styles/colorbox_full/public/wind-capacity-chart-eere-529px.png?itok=1qlMCdWK

This study was first made available online in April 2018. It will be published in the print edition of Journal of Atmospheric and Solar-Terrestrial Physics in September 2018.

Tiny fern holds big environmental promise

PUBLIC RELEASE: 11-JUL-2018

CORNELL UNIVERSITY

ITHACA, N.Y. - A tiny fern - with each leaf the size of a gnat - may have a global impact, helping to draw down atmospheric carbon dioxide, fix nitrogen in agriculture and shoo pesky insects from crops. The fern's full genome has been sequenced by a Cornell University and Boyce Thompson Institute (BTI) scientist and his colleagues around the world, as reported in the journal Nature Plants.

Azolla filiculoides is a water fern often found fertilizing rice paddies in Asia, but its ancestry goes much further back.

"Fifteen million years ago, Earth was a much warmer place. Azolla, this fast-growing bloom that once covered the Arctic Circle, pulled in 10 trillion tons of carbon dioxide from our planet's atmosphere, and scientists think it played a key role in transitioning Earth from a hot house to the cool place it is today," said Fay-Wei Li, a plant evolutionary biologist at BTI, adjunct assistant professor of biology at Cornell and the lead author of the work, "Fern Genomes Elucidate Land Plant Evolution and Cyanobacterial Symbioses."

Li and senior author Kathleen M. Pryer of Duke University led a group of more than 40 scientists from around the world in sequencing the complete genome. In the process, the group identified a fern-specific gene shown to provide insect resistance.

"In general, insects don't like ferns, and scientists wondered why," said Li, who explained that one of the fern's genes likely transferred from a bacterium. "It's a naturally modified gene, and now that we've found it, it could have huge implications for agriculture."

Nitrogen fixation is the process by which atmospheric nitrogen is converted into a form plants can use as a nutrient. Plants cannot fix nitrogen by themselves, Li said, but the genome reveals a symbiotic relationship with cyanobacteria, a blue-green phylum of bacteria that obtain their energy through photosynthesis and produce oxygen. Special cavities in the Azolla leaf host cyanobacteria that fix nitrogen, while the plant provides sugary fuel for the cyanobacteria.

"With this first genomic data from ferns, science can gain vital intelligence for understanding plant genes," said Li. "We can now research its properties as a sustainable fertilizer and perhaps gather carbon dioxide from the atmosphere."

Ferns are notorious for having large genomes, some as large as 148 gigabases, or the equivalent of 148 billion base pairs of DNA sequences. On average, fern genomes are 12 gigabases - one reason why scientists had not sequenced one until now. The Azolla genome is 0.75 gigabases.

Funding came from the National Science Foundation, the German Research Foundation and the Beijing Genomics Institute. About $22,000 was provided through crowdfunding at Experiment.com.

OFFSHORE WIND: Nascent American industry could fully arrive under Trump

Saqib Rahim, E&E News reporter Energywire: Thursday, July 12, 2018

Interior Secretary Ryan Zinke is seen speaking at an offshore wind conference in New Jersey in April. @SecretaryZinke/Twitter

In April, Interior Secretary Ryan Zinke came to an offshore wind conference in Princeton, N.J., to give remarks on "energy dominance."

It was a charged moment to be giving the speech. Three months prior, the Trump administration had proposed to open 90 percent of federal waters to oil and gas leasing. Zinke had offered to exempt Florida, prompting an outcry from liberal Northeast states like New Jersey that wanted to move away from fossil fuels.

What Zinke said, to many raised eyebrows in the audience, was that offshore wind, just like oil and gas, fit into the "energy dominance" framework. He said if a state chose wind, it had a friend in the White House.

One audience member said the tone was "almost apologetic."

"Of the current energy portfolio, probably wind has the greatest opportunity for growth," said Zinke. "Let's make American energy great. Let's make sure we make wind energy great."

One could easily have imagined a different tone. Zinke, after all, worked for an administration that has cast doubt on climate science, rolled back Obama-era regulations on carbon pollution, and worked to subsidize coal and nuclear plants.

Offshore wind, after more than a decade of development under Presidents George W. Bush and Obama, could hardly have been a juicier target. The young industry hadn't yet found a footing in the U.S., and it would need heavy subsidies to get started. For the Northeast states advancing it, most of which had sued over Trump's climate policies, climate wasn't a side issue; it was the point.

But instead of following the same pattern of conflict and lawsuits, offshore wind is on the brink of arrival. A year and a half into the Trump administration, with upheaval all around the U.S. energy world, offshore wind is benefiting from an oddly cooperative dynamic between states and the federal government. The projects envisioned under Obama are moving toward fruition, and if trends hold, the U.S.'s first utility-scale offshore wind project could be under construction by the end of Trump's first term and operational in 2021.

In Maryland and Massachusetts, regulators have approved financing for two installations of 368 and 800 megawatts, respectively. New Jersey and New York are laying the policy groundwork for 5,900 MW more. California, Connecticut, Delaware, Rhode Island and Virginia are exploring projects and policies of their own.

It's happened not despite, but thanks to, the help of the U.S. government, offshore wind advocates admit.

"Yeah, I can't explain it myself," said Nathanael Greene, director of renewable energy policy at the Natural Resources Defense Council. "I think as a whole, this administration has a lot of people that are probably not actually anti-renewable as much as 'all-of-the-above' advocates ... [t]hey don't care about the environmental aspect of it. That's not what's driving them. But this is [potentially] a big American industry."

"I think the administration rightly sees offshore wind as a power source that can help us achieve energy independence and security ... and also dominance in the global economy," said Stephanie McClellan, director of the Special Initiative on Offshore Wind at the University of Delaware. "I don't want to say it's not a surprise, but it's fitting."

Offshore wind fits 'energy dominance'

On Nov. 8, 2016, the day Trump was elected, many friends of offshore wind had their doubts.

During the Obama administration, Catherine Bowes, program director for offshore wind at the National Wildlife Federation, had spent years advocating to open up U.S. waters to the industry. With large-scale wind farms going up off European shores, the Obama administration leased more than a million acres of federal waters as potentially eligible for the same.

Supporters of the industry expected the next step to occur under a Hillary Clinton presidency: proposing, and ultimately building, real projects. Instead, the president-elect was Donald Trump, who had sued to block a wind project visible from a golf course he owned in Scotland. "I want to see the ocean, I do not want to see windmills," he said in 2006.

"Oh, my god, this is game over under the federal administration now," Bowes remembered thinking that night. "We've worked so hard to get to this place, and now we need them."

If Trump had wanted to crush offshore wind, he would have had plenty of options.

Wind turbines do not end up in federal waters trivially; they result from a plenitude of federal and state approvals. It can take years just to set up the conditions for an offshore wind developer to lease an area — and, once it has, five to seven years before there are windmills in the water.

The Bureau of Ocean Energy Management, which is in the Interior Department, is the lead office in charge of the federal permitting. But throughout the permitting process, many other offices get to touch the ball, including EPA, the Navy and Coast Guard, the Army Corps of Engineers, the Federal Aviation Administration, NOAA, and the Fish and Wildlife Service.

And before building a project, a developer must get federal approval for a host of highly regulated activities, such as measuring wind and water conditions at the site. When the company's ready to build, its plan goes through a full environmental review under the National Environmental Policy Act.

As some in the offshore wind business observe, it would have been easy for the Trump administration to barricade the regulatory process, whether by denying permits or just stalling. That might have "killed the industry in the cradle" by casting doubt over the process, said one lawyer with offshore wind clients.

Instead, it appears the Trump administration has been attracted to offshore wind not for the reason states like it — climate — but because it fits the administration's brand of "energy dominance."

"The assumption was that they weren't for it. My alternative posture back to you would be that I think that they looked at it through the lens of American energy," said Jon Hrobsky, a policy director at Brownstein Hyatt Farber Schreck and former Interior official under the George W. Bush administration. "If you're only willing to look at offshore wind through the lens of climate, then I understand [that assumption], but that's not how this administration appears to be approaching it."

The proof is in the regulatory pudding. In March, the companies behind Bay State Wind, a 400-800 MW installation proposed for Massachusetts waters, said the project was the first-ever offshore wind project to receive "FAST 41" status — a special designation meant to accelerate the federal permitting process. Bay State is a joint project of Denmark-based Ørsted, the largest developer of offshore wind in the world, and U.S. utility Eversource Energy.

And in June, an Interior royalty committee suggested that Interior direct its wind leasing program to deploy at least 20 gigawatts of offshore wind — beyond what's currently on deck — by 2024 (Energywire, June 7).

Inspired by Denmark

It is unclear whether the Trump administration had this position from the start or whether it evolved into it. Either way, it seems that Denmark, which has 1.3 GW of offshore wind, played a part.

According to Zinke's public schedules, in October of last year, he met with the Danish ambassador to the U.S., Lars Gert Lose, and Dong Energy, a Danish company that once focused on oil and gas but has become the largest developer of offshore wind in the world. (Dong, which has since changed its name to Ørsted A/S, has projects in development in Massachusetts, New Jersey and Virginia.) Later that month, Zinke met with Denmark's energy minister, Lars Christian Lilleholt.

In January, Zinke released a plan to open up 90 percent of federal waters to oil and gas leasing. Coastal states revolted, saying Florida had been unfairly exempted and demanding the same exemption. In New York, Gov. Andrew Cuomo (D) said if oil explorers came to its coast, he'd lead a flotilla of boats to block them (Energywire, May 7).

With the controversy brewing, Zinke's counselor for energy policy, Vincent DeVito, traveled to Denmark for a two-day "fact-finding" tour focused on offshore wind. "We exchanged best practices for leasing and flexible permitting," DeVito said in an email to E&E News.

Less than three months later, Zinke was in New Jersey touting offshore wind, not for climate reasons but as a nascent industry that could produce jobs and infrastructure in America. He followed it up with an April op-ed in the Boston Globe that said offshore wind "will play an increasing role in sustaining American energy dominance," and he elaborated further in a June interview with the Washington Examiner.

"When the president said energy dominance, it was made without reference to a type of energy," he said. "It was making sure as a country we are American energy first and that includes offshore wind. There is enormous opportunity, especially off the East Coast, for wind. I am very bullish."

Many offshore wind advocates admit to being pleasantly baffled by this bullishness. But they have ways to make sense of it.

Some note that offshore wind hits the alpha-male notes of Trumpian "dominance," with its need for manufacturing, steel, concrete, ports and power lines.

Others observe that while offshore wind requires subsidies, it's the states, not the federal government, that actually have to produce most of the money.

Dan Kish of the Institute for Energy Research, a think tank with ties to the Koch family, is no advocate for the industry. Still, he said, if states want to raise their own citizens' electricity bills, the Trump administration might not want to get mixed up in it.

"If [states are] choosing to spend more on electricity deliberately as a result of policies, I don't think the administration is necessarily going to stand in their way," he said. "Sometimes you let the child touch the hot stove after repeatedly attempting to get his attention."

Proponents fire back that costs are falling, thanks to projects in Europe. Last year, Dong won an auction in Germany with a project it claimed would require zero subsidy.

Developments like these have encouraged Northeast states that the industry's cost promises are credible; Massachusetts' legislation requires project costs to fall over time (Energywire, Dec. 15, 2016).

The sun rises over Block Island, the United States' first offshore wind farm. Deepwater Wind

Another potential draw for Trump: the connection to Big Oil. As many observed, offshore wind opens up business opportunities for companies that can build infrastructure in deep water. Putting up a turbine means drilling into the ocean floor, laying down a massive foundation and managing logistics on the rolling waters the whole time — core competencies for companies that work in the U.S. Gulf of Mexico.

"There's a lot of big corporations that are interested in offshore wind," said Joe Martens, director of the New York Offshore Wind Alliance. "The same businesspeople that they are logically sympathetic to are in the offshore wind space."

And in some places, it's already making the federal government money. In December 2016, the month after Trump's election, Statoil ASA (which has since changed its name to Equinor ASA) paid the federal government more than $42 million to lease waters off the Long Island coast. The company estimates the site could accommodate a gigawatt of offshore wind.

Royal Dutch Shell PLC, which is a partner in developing a project off the Dutch coast, has expressed interest in New York, as well.

If offshore wind is indeed emerging as a major business interest, there's a telltale sign in Washington: lobbying. Last year, Republican Rep. Andy Harris of Maryland tried to slot language into a budget bill to cut off agency funding for projects proposed to go up within 24 nautical miles of Maryland.

That prompted an opposition letter from the National Ocean Industries Association and the American Wind Energy Association — the lead groups representing the offshore oil and wind industries. The language was cut from the final budget bill.

But as veterans of the offshore wind world know, it's too early to declare success. A single tweet could always throw the industry into disarray. And in the coming years, the industry will be dependent on the federal government in two key areas: opening up more water for development and approving the projects already proposed.

BOEM is currently managing 12 active commercial offshore wind leases. At a congressional hearing last month, James Bennett, chief of BOEM's renewable energy office, said the program can currently fulfill its "primary objectives" — managing these leases and creating new leases — despite proposed budget cuts for BOEM. Down the road, he said, "Some of those activities will not happen as quickly if we don’t have the resources to pursue them."

In a worst-case scenario, one offshore wind consultant said, the administration could take years to evaluate a project's construction proposal, only to reject it.

"Shhh," said the consultant. "Don't jinx it."

Can ‘Vaccines’ for Crops Help Cut Pesticide Use and Boost Yields?

Researchers are using gene editing to develop biodegradable vaccines that protect crops from pathogens. As the world looks to feed more and more people, this and other emerging technologies hold promise for producing more food without using chemical pesticides.

BY RICHARD CONNIFF • APRIL 19, 2018

When European researchers recently announced a new technique that could potentially replace chemical pesticides with a natural “vaccine” for crops, it sounded too good to be true. Too good partly because agriculture is complicated, and novel technologies that sound brilliant in the laboratory often fail to deliver in the field. And too good because agriculture’s “Green Revolution” faith in fertilizers, fungicides, herbicides, and other agribusiness inputs has proved largely unshakable up to now, regardless of the effects on public health or the environment.

And yet emerging technologies — including plant vaccines, gene editing (as distinct from genetic engineering), and manipulation of plant-microbe partnerships — hold out the tantalizing promise that it may yet be possible to farm more sustainably and boost yields at the same time. In truth, there’s little choice this time but to deliver on that promise, because of the need to feed 10 billion people later in this century and to do so on a limited land area and in spite of unpredictable climate change.

The idea of vaccines to protect plants from pathogens has kicked around in agribusiness for decades without much result. But it has moved closer to reality in recent years as research has demonstrated that it’s possible to tweak a natural genetic messenger to target specific crop pathogens. Double-stranded RNA (or dsRNA) is a key activator of the immune system in plants and many other organisms. One current line of research uses genetic engineering to modify this natural defense system — resulting, for instance, in a new cotton cultivar with dsRNA targeting plant-sucking insects that have become a pest in China.

Organisms produced by gene editing contain nothing that could not in theory have resulted from natural breeding.

Genetic engineering, however, has faced technological and regulatory limits that reduce its ability to respond quickly to an emerging pathogen, and it can also be unpopular with the public, especially in food crops. A study published last month in Plant Biotechnology Journal instead proposes a method to take a pathogen from a crop, sequence it in the lab, and quickly produce dsRNA with a long nucleotide sequence specifically targeting that pathogen. Bacterial fermentation chambers could then rapidly produce enough of that targeted dsRNA to spray on crops. Senior author Manfred Heinlein, a plant molecular biologist at the French National Center for Scientific Research, estimates that the process could move from identification of a new pathogen to commercial application of a targeted spray in six months or less.

In contrast to chemical pesticides, Heinlein and his co-authors write, “dsRNA agents are biocompatible and biodegradable compounds that are part of nature and occur ubiquitously inside and outside organisms as well as in food.” Because the targeted dsRNA matches a long nucleotide sequence in the pathogen, it remains effective up to a point even as parts of that sequence in the pathogen mutate. So it’s unlikely to face the evolved resistance that typically renders chemical treatments ineffective.
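
To make the mechanism concrete, the short Python sketch below illustrates the general idea rather than the authors' actual method: a pathogen-derived sense strand is paired with its antisense strand to form a dsRNA trigger, and because the short interfering fragments processed from a long trigger are only about 21 nucleotides each, a single point mutation in the pathogen still leaves many exact matches. The sequence, the window size and every name in the code are illustrative assumptions.

```python
# A minimal sketch, assuming a hypothetical 60-nt pathogen sequence and a 21-nt
# "effective window". It is not the study's pipeline, only the general logic of
# pairing a pathogen-derived sense strand with its antisense strand, and of why
# a long target tolerates a point mutation.

COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}


def antisense(rna):
    """Return the reverse-complement (antisense) strand of an RNA sequence."""
    return "".join(COMPLEMENT[base] for base in reversed(rna))


def make_dsrna(target):
    """Pair the pathogen-derived sense strand with its antisense strand."""
    return target, antisense(target)


def matching_windows(target, mutated, window=21):
    """Count length-`window` stretches that still match exactly after mutation.

    The short interfering RNAs processed from a long dsRNA are roughly this
    size, so a long target with a few point mutations still yields many
    exact matches.
    """
    return sum(
        target[i:i + window] == mutated[i:i + window]
        for i in range(len(target) - window + 1)
    )


if __name__ == "__main__":
    # Hypothetical 60-nt stretch of a viral gene (not a real pathogen sequence).
    target = "AUGGCUUACGAUCGUAGCUAGCUAGGCAUUCGAUCGGAUCCGAUGCUAGCAUCGAUCGAA"
    sense, anti = make_dsrna(target)

    # Introduce a single point mutation in the pathogen.
    mutated = target[:30] + "G" + target[31:]

    print("dsRNA sense strand:    ", sense)
    print("dsRNA antisense strand:", anti)
    print("21-nt windows still matching after one mutation:",
          matching_windows(target, mutated), "of", len(target) - 20)
    # 19 of the 40 windows still match, so the trigger keeps most of its targets.
```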

A plant (left) infected with the tobacco mosaic virus, visible in fluorescent markings, compared to a plant (right) treated with a double-stranded RNA "vaccine," which activated its immune system. HEINLEIN ET AL. 2018, PLANT BIOTECHNOLOGY JOURNAL

Heinlein adds a few caveats: It’s not technically a vaccine. Instead of eliciting persistent immunity as a vaccine would do, it works only if the pathogen is present and even then only for the 10 or 20 days it takes for the dsRNA to biodegrade. The research also needs to progress from successful testing in the laboratory against the tobacco mosaic virus to field tests against other common crop pathogens. That includes learning whether dsRNA imposes unexpected stresses on the plant or produces non-target effects — for instance, against beneficial insects. If those tests go well, though, Heinlein speculated that the technology could be replacing chemical pesticides in some applications within as little as five years.

CRISPR gene editing technology, Science magazine’s 2015 Breakthrough of the Year, has already begun to put altered crops in the field, and the pace of introductions is likely to accelerate. It’s a more palatable alternative to genetic engineering, which has faced regulatory limits and a “frankenfood” reputation because it introduces genes of one species into another. Gene editing, by contrast, isn’t transgenic. Instead, using a technology called CRISPR/Cas9, scientists precisely and inexpensively snip minute genetic sequences from the genome of an organism, or add in sequences from another individual of the same species. Such organisms contain nothing that could not in theory have resulted from natural breeding, and they have so far entered the market without regulatory delays.
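
The phrase "precisely snip minute genetic sequences" can be pictured with a toy example. The sketch below, a rough illustration and not any lab's actual design tool, scans an invented DNA string for the kind of site the widely used SpCas9 enzyme recognizes: a 20-nucleotide target immediately followed by an NGG motif (the PAM); the enzyme then cuts within that target.

```python
# A toy sketch, assuming an invented DNA fragment: it only shows how candidate
# CRISPR/Cas9 target sites are located, i.e. a 20-nt protospacer immediately
# followed by an NGG PAM on the same strand. Real guide design also checks the
# opposite strand, off-target matches and many other criteria.
import re


def find_cas9_targets(dna, guide_length=20):
    """Yield (position, protospacer, PAM) for every NGG-adjacent site on this strand."""
    pattern = r"(?=([ACGT]{%d})([ACGT]GG))" % guide_length
    for match in re.finditer(pattern, dna):
        yield match.start(), match.group(1), match.group(2)


if __name__ == "__main__":
    # Hypothetical gene fragment (not a real wheat or mildew sequence).
    dna = "ATGGCCTTACGGATCGTTAGCCTAGGCATTCGATCGGATCCGATGCTAGCATCGATCGGAAGG"
    for pos, protospacer, pam in find_cas9_targets(dna):
        print(f"candidate site at {pos:2d}: target {protospacer}  PAM {pam}")
```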

Much of the appeal of gene editing has to do with the potential to get plants to do work currently done by pesticides and other costly inputs. Wheat, for instance, typically requires heavy use of fungicides against powdery mildew. But researchers in China have already developed a gene-edited variety resistant to the disease. Gene editing technology is cheap enough that it could also potentially help developing regions that cannot afford, or otherwise lack access to, commercial pesticides, and where genetic engineering is now the other likely way to address diseases that can devastate rice, maize, cassava, and other staple crops.

It may, however, be a more realistic indicator of things to come that the first commercially available gene-edited crop planted in U.S. fields, in 2015, was an oilseed rape modified for herbicide resistance. It’s a reminder that past improvements in agricultural technology have at times served only to bind farmers even more inextricably to chemical inputs — for instance, with the 15-fold increase in use of the herbicide glyphosate brought on by the introduction of genetically-engineered Roundup-ready crops in the mid-1990s.

Scientists have developed a gene-edited wheat variety resistant to powdery mildew, caused by the fungus Blumeria graminis. THOMAS LUMPKIN/CIMMYT

“It’s a curious vision of sustainable agriculture … that sees overcoming resistance to agrochemicals as progress,” commented Maywa Montenegro, a University of California at Berkeley environmental policy researcher. “Should we really be enabling farmers to spray more glyphosate into their fields when the World Health Organization has found the chemical to be a ‘probable’ carcinogen and when it’s been associated with collapsing populations of monarch butterflies?”

One answer to that question came in March from the agribusiness giant (and glyphosate manufacturer) Monsanto. It committed $100 million to found a startup called Pairwise focused, according to a Reuters report, on using gene editing “to alter commodity crops, including corn, soy, wheat, cotton, and canola, exclusively for Monsanto.”

New genomic tools have for the first time allowed detailed analysis of microbial communities living in and around plants — the agricultural counterpart to the human microbiome. That has made the study of these mutualistic microbes a hot topic of academic and commercial research, with the potential to substitute biopesticides and biofertilizers for agrochemical inputs. Researchers have touted microbes that can help various crops pull phosphorus, nitrogen, and other nutrients from the environment; survive droughts, soil salinity, heat, cold, and other forms of stress; resist insect pests and disease; and improve photosynthesis, among other things. But these promising results often prove disappointing in the field, where hundreds of microbial species may interact with one another and a plant in ways scientists are only beginning to understand.

The classic example of a plant-microbe partnership is of course nitrogen fixation by peanuts, chickpeas, and other legumes. These plants house and feed specialized bacteria in nodules on their roots. The bacteria in turn pull nitrogen from the air and turn it into food for the plant. This partnership seems as if it ought to be easy enough to manipulate, and farmers have for decades been spending a little extra cash at planting time to boost the nitrogen-fixing process by inoculating soybeans and other seeds with an extra dose of these beneficial bacteria.

“We want to know if we can breed a plant to basically control the way a fungus forages for a nutrient,” says one scientist.

“It works great the first time you grow soybeans in a field,” says R. Ford Denison, an agricultural ecologist at the University of Minnesota. But after that, for reasons scientists don’t entirely understand, the soil builds up a population of less efficient bacteria. They demand more support from the plant that houses them, and supply less nitrogen in return. Farmers paradoxically end up adding nitrogen fertilizer to plants designed by evolution to provide nitrogen on their own.

But research into the agricultural microbiome is just beginning, and it will inevitably yield more sophisticated ways to put microbes to work in the cause of sustainable agriculture. For instance, in 2003, Toby Kiers, then a post-doctoral student in Denison’s lab, discovered that nitrogen-fixing plants have the ability to detect and starve out inefficient bacteria. She’s now working at Amsterdam’s Free University to develop plants with more nuanced control over their microbial partners.

“We want to know,” says Kiers, “if we can breed a plant to basically control the way a fungus forages for a nutrient. If a plant is deficient in a nutrient, can we align it so it uses the fungi to get what it needs?” The fungi she studies are already completely dependent on the host plant, so “we think we can tweak the partnership to make sure the plant is even more in control.” The eventual goal is a plant-microbe partnership that can essentially farm itself, so that “no matter where you put it, or what the conditions are, that partnership is able to maximize uptake.”

Up to now, all the attention in agriculture has gone to the above-ground portion of crops, and to the pesticides, herbicides, fertilizers, and other inputs farmers could add to make plants grow faster. “And now,” says Kiers, “I think people are realizing that the real revolution is going to happen underground.”

Richard Conniff is a National Magazine Award-winning writer whose articles have appeared in The New York Times, Smithsonian, The Atlantic, National Geographic, and other publications. His latest book is "House of Lost Worlds: Dinosaurs, Dynasties, and the Story of Life on Earth."

Why This Insurer Wants to Put the Spotlight on Growing ‘Ocean Risk’

Chip Cunliffe, director for sustainable development at XL Catlin, a global insurance and reinsurance group, explains why his company is hosting the first-ever Ocean Risk Summit in May and what it hopes to accomplish.

BY JESSICA LEBER • APRIL 19, 2018

When a coastal community floods or a drought kills crops, people, businesses and communities lose out – but so can the insurers who cover them. That’s why, for years, the insurance industry has been at the forefront of thinking about the avalanche of new risks that come with a warming climate.

XL Catlin, a leading insurance and reinsurance group that suffered a net loss in 2017 after a year of “severe natural catastrophes,” believes more attention also needs to be paid to how the ocean is changing – and how communities can build resilience to warming, acidifying or overfished waters. The company is hosting the first Ocean Risk Summit in Bermuda, where it’s headquartered, on May 8–10, along with several scientific and Bermuda-based organizational partners.

To Chip Cunliffe, director for sustainable development for XL Catlin, the conference “is very much a first” in its effort to engage the insurance community and broader business, finance and government sectors in these issues. While the company has a history of taking part in ocean research – in the past decade, it has sponsored major scientific surveys of global coral reef health, Arctic sea ice and the deep ocean and is a member of the Bermuda Institute of Ocean Science’s Risk Prediction Initiative – increasingly, it is also focusing on oceans as an industry imperative, Cunliffe said.

For instance, at a conference in Mexico in March, XL Catlin executives backed efforts to write the first-ever insurance policy for a coral reef – in this case, the Mesoamerican Reef that protects the Yucatan peninsula and its $9 billion tourist industry from hurricanes.

Oceans Deeply spoke with Cunliffe about how XL Catlin thinks about ocean risk and what it hopes to achieve at the summit.

Oceans Deeply: What does XL Catlin mean by ‘ocean risk’?

Chip Cunliffe: The oceans are changing very rapidly. We are interested in understanding how the changes in the oceans are impacting the future risk landscape.

What we’re finding is that it means quite a few things. We’re looking at sea-level rise because as the ocean warms, water expands, but you also have a corresponding likelihood of an increased intensity of storms. Looking at the storm intensity over the last couple of years, you’ve seen those increase. You take those two things together and you’re likely to see greater impacts from storm surges, so [there is] coastal flooding inundation and impacts to coastal infrastructure. But not only that, you’re also likely to see impacts to human health.

We have illegal and unreported fishing and overfishing of fish stocks around the world. But what we’re also seeing, there is a fairly big problem in that the fish are also moving poleward, because of the warming ocean. If those fish stocks are moving, what does that mean for the fishing community and what does that mean for food security? Another area is looking at aquaculture, and the potential impact that a warming ocean might have and whether those current areas that are used for aquaculture might need to move.

We’re also seeing Arctic sea-ice loss, and there may be some weather system changes related to that. We’re also seeing militarization of the Arctic.

Oceans Deeply: There’s clearly a lot going on.

Cunliffe: From the point of risk perspective, there’s a lot of stuff going on. These risks are far too big for just one business to get a hold of. So [we’re looking at] identifying ways in which the insurance industry could potentially help identify solutions to mitigate some of those risks. It’s also looking at how to build resilience and identify where resilience needs to be built.

Oceans Deeply: What do you hope to achieve with this conference?

Cunliffe: Number one is to put ocean risk up the agenda. Oceans are really only just coming into the political consciousness, if I may say that, given that it’s taken 25 years or so for the Intergovernmental Panel on Climate Change to add it to its AR5 [Fifth Assessment] report on climate change – and [there’s] just one chapter on it.

Partly, it’s about communicating the fact that the oceans are changing and we need to be cognizant of that. Ultimately, the big thing for this conference is to identify solutions. But it can’t be done just through insurance. It can’t be done just by policy. We need a multi-sectoral approach to mitigating these risks.

That obviously comes across from a financial point of view: Where might funds come from to help build that resilience? We certainly could focus in on the legal aspects. We also need to be focusing on regulation and technology and data. Everyone says we all need more data, and I think that is particularly true in the ocean space.

Oceans Deeply: Has XL Catlin quantified the future costs of ocean risk or even seen its effects in the claims you’re seeing today?

Cunliffe: As a business, and probably more globally, no. Ultimately this is the first conference of its kind looking at these kinds of risks. Out of the back end of this we want to move that area of focus forward.

Oceans Deeply: Are ocean issues viewed not just as a potential cost but also as a future business opportunity?

Cunliffe: At the moment, and up until now, it’s certainly been a focus for us from a corporate responsibility perspective. That, for us, is really key. In fact, it started out as a sponsorship opportunity but it quickly became obvious to us that it wasn’t just about branding and, in fact, that became fairly secondary to really being a good corporate citizen.

The ocean risk perspective has come from the fact that we have done all these different expeditions, and we’ve focused on this area for quite a long time. As we’ve been listening to the likes of the United Nations and the IPCC report and others … there’s a whole lot more communication now about risk and resilience. And so it makes sense for us to work with others to identify how we could potentially help mitigate those risks.

Oceans Deeply: Anything else you want to mention?

Cunliffe: This is quite a new narrative, really. It’s an area of science and risk that is not particularly well known. It seems right that we should be doing this now when the oceans are coming more front and center in the national and international news, related to plastics and fishing and other things. Things like the BBC’s “Blue Planet” and other efforts to communicate. We’re all running in the same direction and all helping each other along.

Fewer than 1 percent of offshore drilling tracts auctioned by Trump receive bids

BY MIRANDA GREEN - 08/15/18 02:27 PM EDT

Oil and gas companies bid on fewer than one percent of the offshore tracts made available by the Trump administration during an auction Wednesday.

Of the 14,622 tracts made available by the Interior Department for bidding in federal waters off the Gulf of Mexico, only 144 received bids, covering 801,288 acres.

The share of tracts receiving bids was slightly lower than in the previous lease sale in March, which leased just over 1 percent of the tracts made available.

The last sale in March was the biggest offshore lease sale in United States history, with 77.3 million acres made available. It saw 33 companies bid a total of $124.8 million on plots off the coasts of Texas, Louisiana, Mississippi, Alabama and Florida. Of the 14,474 tracts available for bidding, only 148 received any bids.
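
The percentages follow directly from the tract counts; a quick sketch using the figures reported above:

```python
# Reproduces the shares cited above from the reported tract counts.
sales = [
    ("August 2018", 14_622, 144),
    ("March 2018", 14_474, 148),
]
for label, offered, bid_on in sales:
    share = bid_on / offered * 100
    print(f"{label}: {bid_on} of {offered} tracts received bids ({share:.2f}%)")

# August 2018: 144 of 14622 tracts received bids (0.98%)
# March 2018: 148 of 14474 tracts received bids (1.02%)
```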

The Obama administration held much smaller sales that focused only on areas where oil companies had expressed interest.

The administration nevertheless hailed Wednesday’s sale as a success, promoting in a press release the nearly $180 million in sales it generated.

"Today’s lease sale is yet another step our nation has taken to achieve economic security and energy dominance,” Interior Deputy Secretary Bernhardt said in a statement. “The results from the lease sale will help secure well-paying offshore jobs for rig and platform workers, support staff onshore, and related industry jobs, while generating much-needed revenue to fund everything from conservation to infrastructure.”

Despite the revenue, the limited interest is a considerable blow to an administration that has been preaching the need to expand oil and gas drilling on public lands and waters in order to increase U.S. energy independence and revenue. Fewer than 30 companies submitted bids.

In January, Interior Secretary Ryan Zinke announced a desire to expand offshore drilling to new areas off the coast. The call met resistance from a number of state leaders opposed to opening up drilling off their coastlines. Zinke has yet to formally announce where new drilling might occur, but he has acknowledged that oil and gas companies have shown more interest in plots on land than offshore.

Critics say the bids show a lack of interest among fossil fuel companies in investing in offshore drilling, which is expensive, risky and not always profitable.

Environmental groups have also seized on the lackluster results as proof that offshore drilling is, on the whole, not desirable.

"When will Ryan Zinke finally get the message that it’s time to scrap his reckless offshore drilling plans? Millions of Americans and elected officials from both sides of the aisle have made it clear that the public does not want dangerous drilling off our coasts, and even corporate polluters aren’t buying what Zinke’s selling," Athan Manuel, director of the Sierra Club’s land protection program, said in a statement.

Electric Bills Skyrocket Due to Costs of Coal in Appalachia as Region’s Economy Collapses

As natural gas and renewables get cheaper elsewhere, residents in Appalachia are stuck paying for coal-fired power plants that no longer make economic sense.

By James Bruggers, Aug 14, 2018

WHITESBURG, Kentucky — Buddy Sexton opened a community meeting about the financial troubles of local volunteer fire departments with his head bowed: "Let people understand what a need we have in this county, in the mighty name of Jesus, we pray."

Sexton then delivered some grim news to the gathering of about two dozen eastern Kentucky residents sitting in folding chairs set up in the engine bays of the Mayking Fire Department.

Just like the townspeople, his tiny department is facing crushing electric bills. They have gone up by several hundred dollars a month during the winter. That's a lot in a local budget that is already strained as revenues from taxes on mining go steadily down.

"I've got financial reports here that I am afraid to look at anymore," said Sexton, the department's treasurer. "We're in danger of shutting down."

As coal mining has collapsed across Appalachia, residents in eastern Kentucky and West Virginia have been socked with a double whammy—crippling electric bills to go along with a declining economy.

The pain in a region once known for cheap power has been felt in homes, businesses, schools and even at volunteer fire departments.

The problem of rising power bills has many causes.

Mines and other businesses have shut down and people have moved away—mining jobs are off 70 percent since 2010—so utilities are selling less power and spreading their fixed costs across fewer customers.

Electricity customers are shouldering the costs of shuttering old coal-burning power plants and cleaning up the toxic messes they leave behind.

And experts say it hasn't helped that utilities in eastern Kentucky and West Virginia have continued to invest in burning coal.

"They're doubling down on coal at a time when coal is not competitive," said James M. Van Nostrand, a professor at the West Virginia University College of Law with decades of experience in the energy field. "It's really tragic."

AEP's Rising Rates and Surcharges

The common denominator is American Electric Power, one of the nation's largest utilities. It owns Kentucky Power, along with two subsidiaries in neighboring West Virginia, Wheeling Power and Appalachian Power.

In May, Wheeling Power and Appalachian Power requested permission from the Public Service Commission of West Virginia to boost their monthly residential bill 11 percent because of declining sales. That was on top of a 29 percent increase between 2014 and 2018.

Customers in both states are furious that the regulators are going along.

"Our jobs available in this state are not a living wage, and many are working two, three jobs just to make it," wrote Elizabeth Bland of Beckley, West Virginia, in her protest letter to the commission. "Please turn down this request from Appalachian Power for the sake of all West Virginians."

Rising rates are just part of the problem.

Kentucky Power's monthly bill also includes surcharges, and a line for each customer's share of the utility's fixed costs. These add up to precious dollars.

Kentucky Power's customers received a bit of a break in January after vigorous protests. State regulators rejected most of the utility's proposed rate increases and they reduced some surcharges. That will save an average residential customer about $7 a month. Even so, that bill has nearly doubled in the past decade and is now about $151 plus local government fees and taxes, state regulators said.

The average bill per customer at Kentucky Power has been among the highest in the nation for an investor-owned utility, according to 2016 numbers from the U.S. Energy Information Administration, the most recent comparisons available.

"We're hit hard," Alice Craft, a Whitesburg-area resident, told InsideClimate News. "The power companies, they are just greedy, greedy, greedy."

Surcharges that pay for retiring a coal plant are especially irritating, said state Rep. Angie Hatton, a lawyer and Democrat from Whitesburg.

"They are charging us for shutting down our coal-fired plants that were keeping us all employed," she said.

Hatton, who is the daughter of one coal miner and wife of another, also sees the value of solar power as an option for businesses and residents with high power bills. She helped defeat legislation in Kentucky this spring that would have devalued rooftop solar power. Utilities backed it, but it was rejected by lawmakers driven by constituents' rage about power bills.

With Poverty High, Something Has to Give

Allison Barker, a spokeswoman for Kentucky Power, blamed last winter's unusually cold weather for part of the high electric bills and suggested customers could turn their thermostats down.

"Keeping the thermostat the same during extreme weather can significantly increase usage and bills," she said.

But the underlying problem, she acknowledged, was the loss of hundreds of industrial and commercial customers and thousands of residential customers since 2014. During that time, she said, energy use has dropped 15 percent, and the company has faced new costs to comply with environmental protections.

AEP's West Virginia utility, Appalachian Power, said in May that its electricity demand had dropped 14 percent since 2013 and it had lost 11,000 customers.

Poverty also comes into play.

Kentucky Power's service area includes 10 of the 100 poorest counties in the United States, with poverty rates as high as three times the national average. The region has a lot of substandard housing with poor insulation, adding to the expense of electric heat.

Something has to give when people on small, fixed incomes are hit with higher power bills, said Roger McCann, executive director of Community Action Kentucky, an umbrella group for poverty-fighting agencies with services including home-energy assistance.

"Seniors will cut their medicine in half, or they will go without food," he said. "If you don't pay the bill, the lights go out."

McCann said he knew of one woman in a mobile home who had a $600 monthly electric bill while living on a Social Security benefit of just $660 a month.

"Some mobile homes are not insulated and the heat just goes through them," he said.

Shifting Coal Costs to Consumers

Kentucky State Rep. Rocky Adkins, the Democratic minority floor leader and a potential candidate for governor, says high bills and the region's economic woes both stem from a 2012 decision by Kentucky Power as Adkins and others were trying to salvage the utility industry.

Together, AEP and the Kentucky Public Service Commission reached a decision to pull the plug on nearly $1 billion in environmental upgrades at the utility's Big Sandy power plant. It stopped burning coal three years later. One unit was converted to natural gas.

Meanwhile the Kentucky regulators steered the company to invest more than $500 million in the nearby coal-fired Mitchell plant, which had already been upgraded. That solution, the commission argued, would save customers money.

At the time, Adkins railed that this was waving the "white flag" of surrender to natural gas, environmentalists and regulators.

But the rout was already under way.

In the years that followed, the Mitchell plant's coal boilers have been undercut on the regional grid by cheaper natural gas. Kentucky Power found itself with an unneeded mountain of 400,000 tons of coal there. In June, Kentucky regulators gave the utility permission to sell the surplus, raising $17.6 million.

"We lost the power plant and the rates have gone through the roof," Adkins told ICN. "It has happened in a region of Kentucky that is more challenged than any other region of the country because of the downturn in the coal economy."

But there's another way of looking at the outcome, said Cathy Kunkel, a West Virginia-based analyst for the Institute for Energy Economics and Financial Analysis.

She explained how the maneuver supported coal-fired power while shifting its costs from shareholders to consumers.

There are two main business models for selling electricity. In deregulated wholesale markets, merchant vendors compete to sell power to the grid, and high-cost generators lose out. In regulated markets, the rates are set by the state and utilities are guaranteed a profit.

Before 2012, the Mitchell plant had been owned by a deregulated AEP company which sold its power in the open marketplace. The 2012 deal put half of Mitchell under a regulated AEP subsidiary, and that gave it economic protection, at least for a time. Two years later, West Virginia regulators allowed Wheeling Power, also regulated, to buy the other half.

Acting together, Kunkel said, AEP and its regulators had shielded the struggling coal-fired plant from the pressures of the marketplace and shifted the costs onto customers. "The problem when the plant is uncompetitive is that it ends up not running very much, and not producing much revenue for customers, who end up losing out because they're still stuck paying for all of the fixed costs and operating costs of the plant," she said.
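
Kunkel's point can be illustrated with a back-of-the-envelope sketch; all of the numbers below are hypothetical, not Mitchell plant figures. Under the regulated model, customers cover the plant's fixed costs whether or not it runs much, so the less it generates, the more each delivered megawatt-hour effectively costs them.

```python
# Hypothetical figures only: fixed costs recovered from customers per MWh of
# generation, at different capacity factors for a regulated plant.
fixed_costs = 200_000_000   # assumed annual fixed costs, dollars
capacity_mw = 1_500         # assumed plant capacity
hours_per_year = 8_760

for capacity_factor in (0.70, 0.40, 0.20):
    generation_mwh = capacity_mw * hours_per_year * capacity_factor
    print(f"capacity factor {capacity_factor:.0%}: "
          f"fixed costs recovered at ${fixed_costs / generation_mwh:.2f}/MWh")

# capacity factor 70%: fixed costs recovered at $21.74/MWh
# capacity factor 40%: fixed costs recovered at $38.05/MWh
# capacity factor 20%: fixed costs recovered at $76.10/MWh
```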

It's a method the company has used elsewhere.

"Those plants are not competitive in the wholesale market," said Van Nostrand, the law professor. "We are buying them and sticking them on the backs of the ratepayers," he said.

A spokesman for the Kentucky regulatory agency defended its 2012 actions, saying that exhaustive analysis showed that it diversified AEP's fuel mix and avoided an even more costly debacle at the Big Sandy plant.

A spokesman for West Virginia's regulators provided ICN a recent staff report to lawmakers. The state's electric prices are still competitive compared to elsewhere, it asserted, but they are rising, "because of our nearly 100 percent reliance on coal."

'These People Have Been Sold a Bill of Goods'

The solution, Kunkel said, should include more federal help for an economic transition away from coal.

"If you were to do that in a smart way it would include money for energy efficiency and money for rooftop solar," she said.

Some eastern Kentucky residents blame coal's slide on environmental regulations in general, and former President Barack Obama in particular. Van Nostrand said that's just not the case.

"These people have been sold a bill of goods for the last 10 years that everything in the coal industry would have been fine if the (U.S. Environmental Protection Agency) had just left us alone," Van Nostrand said. "It's been cheap natural gas and it's been cheap renewables." The EPA is further down on the list of the decline in the coal industry, he added.

At the Mayking Fire Department community meeting, residents were told that coal severance taxes—payments to their county from taxes collected on coal as it leaves the ground—had fallen from about $2 million a year to about $180,000 a year, and that the county budget had been slashed from $11 million to $6 million—all in the last five years. That left little money to help keep local volunteer fire departments afloat.

Wintertime power bills at the station just outside Whitesburg have risen from as low as about $800 a month to about $1,400 a month in recent years, Fire Chief Tony Fugate told ICN.

By the time the Mayking department, with its approximately $25,000-a-year budget, pays its bills for insurance and equipment, "there's nothing left for the power bill, unless you have fundraisers," he said.

But cookouts and other fundraisers, even asking drivers at intersections to drop money in a fireman's boot, can only go so far, he said.

"You can only have so many roadblocks and hot dog dinners," Fugate said.

James Bruggers covers the U.S. Southeast, part of ICN's National Environment Reporting Network. He came to InsideClimate News in May 2018 from Louisville's Courier Journal, where he covered energy and the environment for more than 18 years. He has also worked as a correspondent for USA Today and was a member of the USA Today Network environment team. Before moving to Kentucky, Bruggers worked as a journalist in Montana, Alaska, Washington and California, covering a variety of issues including the environment. Bruggers' work has won numerous recognitions, including the National Press Foundation's Thomas Stokes Award for energy reporting. He served on the board of directors of the Society of Environmental Journalists for 13 years, including two years as president. He lives in Louisville with his wife, Christine Bruggers, and their cat, Lucy.

James can be reached at james.bruggers@insideclimatenews.org.

How Energy Companies and Allies Are Turning the Law Against Protesters

In at least 31 states, lawmakers and governors have introduced bills and orders since Standing Rock that target protests, particularly opposition to pipelines.

By Nicholas Kusnetz

AUG 22, 2018

A version of this ICN story was published in the Washington Post.

The activists were ready for a fight. An oil pipeline was slated to cross tribal lands in eastern Oklahoma, and Native American leaders would resist. The Sierra Club and Black Lives Matter pledged support.

The groups announced their plans at a press conference in January 2017 at the State Capitol. Ashley McCray, a member of a local Shawnee tribe, stood in front of a blue "Water is Life" banner, her hair tied back with an ornate clip, and told reporters that organizers were forming a coalition to protect native lands.

They would establish a rural encampment, like the one that had drawn thousands of people to Standing Rock in North Dakota the previous year to resist the Dakota Access Pipeline.

The following week, an Oklahoma state lawmaker introduced a bill to stiffen penalties for interfering with pipelines and other "critical infrastructure." It would impose punishments of up to 10 years in prison and $100,000 in fines—and up to $1 million in penalties for any organization "found to be a conspirator" in violating the new law. Republican Rep. Scott Biggs, the bill's sponsor, said he was responding to those same Dakota Access Pipeline protests.

The activists established the camp in March, and within weeks the federal Department of Homeland Security and state law enforcement wrote a field analysis identifying "environmental rights extremists" as the top domestic terrorist threat to the Diamond Pipeline, planned to run from Oklahoma to Tennessee. The analysis said protesters could spark "criminal trespassing events resulting in violence." It told authorities to watch for people dressed in black.

An FBI team arrived to train local police on how to handle the protest camp.

McCray recalls a surveillance plane and helicopters whirring above the Oka Lawa camp. Demonstrators were pulled over and questioned on their way in or out, though the local sheriff said people were only pulled over for violating traffic laws.

In May the governor signed the bill to protect critical infrastructure. Merely stepping onto a pipeline easement suddenly risked as much as a year in prison.

"That was really pretty successful in thwarting a lot of our efforts to continue any activism after that," McCray said.

Oka Lawa never drew the kind of participation and attention that made the Dakota Access Pipeline a national cause, and the Diamond Pipeline was completed quietly later that year.

McCray has since channeled her energy toward politics, running for a seat on the Oklahoma Corporation Commission, which regulates the energy industry. But even as a candidate, McCray says, she now watches her words and her Facebook posts, afraid of being implicated as a conspirator if someone were to violate the law, even if she doesn't know the person. "I don't feel safe, honestly," she said.

Ashley McCray says Oklahoma's new law has made protesters think carefully about what they have the freedom to say.

Across the country, activists like McCray are feeling increasingly under assault as energy companies and their allies in government have tried to turn the law—and law enforcement—against them.

In Louisiana, which enacted a similar law in May, at least nine activists have been arrested on felony charges stemming from the new law since it went into effect on Aug. 1. In one incident, three people were pulled off a canoe and kayak after they maneuvered the boats on a bayou to protest construction of an oil pipeline. The arrests were conducted by off-duty officers with the state Department of Public Safety and Corrections who were armed and in uniform, but at the time were working for a private security firm hired by the pipeline developer.

Dozens of bills and executive orders have been introduced in at least 31 states since January 2017 that aim to restrict high-profile protests that have ramped up as environmentalists focus on blocking fossil fuel projects.

In addition to Oklahoma's infrastructure bill and similar legislation enacted in two other states, these bills would expand definitions of rioting and terrorism, and even increase penalties for blocking traffic. Twelve have been enacted, according to the International Center for Not-for-Profit Law. The bills have all come since the election of President Donald Trump, who openly suggested violence as a way to handle protesters on the campaign trail, and once in office called the nation's leading news organizations "the enemy of the American people."

At the same time, law enforcement and private companies have conducted surveillance on campaigners, while some federal and state officials have suggested pipeline protesters who break laws be charged as terrorists. Corporations have hit landowners and environmental groups with restraining orders and hundred-million-dollar lawsuits.

Some pipeline opponents have conducted dangerous and illegal stunts, cutting pipelines with oxyacetylene torches or closing valves. But most protests have been peaceful. If they've broken laws by trespassing, activists say, they've done so as part of a tradition of civil disobedience that stretches to the nation's colonial roots.

"All of the social progress we've made has depended, over the entire history of this nation, from the very beginning, on that ability to speak out against things that are wrong, things that are legal but should not be," said Carroll Muffett, president of the Center for International Environmental Law. "This country, for all its failing, has long respected the importance of that. These bills put that fundamental element of our democracy in jeopardy."

The modern environmental movement sprang out of mass protest, when millions took to the streets for the first Earth Day in 1970. In the decade that followed, civil disobedience emerged as a core tactic as greens joined with anti-war protesters to launch anti-nuclear campaigns.

The movement returned to those roots with its fight to stop the Keystone XL oil pipeline. Beginning in 2011, activists staged sit-ins ending in arrests that galvanized the movement and drew national attention to what had been the mundane work of building pipelines.

Across the country, people began physically obstructing fossil fuel infrastructure. Protesters blocked train tracks carrying coal and oil in Washington state, halted trucks at the gates of a gas storage facility in upstate New York and kayaked in front of an oil rig in Seattle. This was before hundreds were arrested at Standing Rock.

In the absence of a clear federal energy plan, fossil fuel projects effectively became the policy, locking in oil, gas and coal infrastructure—and their greenhouse gas emissions—for generations. They became the focus of environmental groups, who found common cause with tribes and landowners fighting to protect their land and water. The groups launched campaigns that delayed or blocked several major coal export facilities, pipelines and other projects, costing energy companies millions or even billions of dollars.

To push back, the industry turned to its supporters in government.

Soon after Oklahoma's critical infrastructure bill passed last year, the conservative American Legislative Exchange Council (ALEC) used it to write model legislation for other states.

In at least six more states, lawmakers introduced similar bills that would impose steep penalties for trespassing on, or tampering with, pipeline property and other infrastructure. Two were enacted this year. Two others are pending.

In Wyoming, one bill was openly proposed on behalf of energy companies. In other states the ties are barely hidden.

Louisiana state Rep. Major Thibaut, a Democrat, introduced a bill that followed ALEC's model by adding pipelines to a list of critical infrastructure facilities. When he presented it to a Senate committee in April, he brought Tyler Gray, a lobbyist for the Louisiana Mid-Continent Oil and Gas Association. During the hearing, Gray answered most of the questions. At one point, he leaned over to Thibaut to recommend that he accept an amendment. Neither Thibaut nor Gray responded to requests for comment.

Gov. John Bel Edwards signed it into law in May, after an amendment removed the threat of conspiracy charges.

Iowa enacted a law that makes "critical infrastructure sabotage" a felony. Lawmakers in Wyoming and Minnesota also approved critical infrastructure bills this year, but the governors there vetoed the legislation. Similar bills are pending in Ohio and Pennsylvania.

The bills may get their first test soon in Louisiana after the arrests of the three activists on the bayou. Activists had been conducting tree-sits and other actions for weeks and were careful to stay off the pipeline easements once the new law took effect, said Cherri Foytlin, an activist organizer in the state. The three who were taken from their boats, for example, claim to have been in navigable waters, which are supposed to be public. The definition of what's navigable has been the subject of much debate and legal wrangling in the state, however, and the off-duty officers pulled the activists off their boats to arrest them.

"It was really scary," said Cindy Spoon, one of the activists, in a video posted to Facebook. "They grabbed my wrist, grabbed my waist and they started to pull one of my arms behind my back and put me in a stress position."

The local district attorney has not yet formally filed charges. If the case proceeds, the activists will challenge the law itself, said William Quigley, a professor at Loyola University College of Law in New Orleans who is representing them pro bono. They could face up to five years in prison if convicted.

Two more incidents occurred over the weekend involving activists who had erected a stand high in the trees to block construction. Quigley said the group had permission from property owners to be on the land. But deputies with the St. Martin Parish Sheriff's Office arrested six people, including a journalist. The property is co-owned by hundreds of parties, and some of them have not signed easement agreements with Energy Transfer Partners, the primary owner of the pipeline.

The St. Martin Parish Sheriff's Office did not return requests for comment on the weekend arrests.

"I think this shows how ridiculous this law is if this is the way it's going to be applied," Quigley said.

The legislation is not without precedent. Since 1990, nearly a dozen states have passed bills known as "ag-gag" laws that prohibit surreptitiously recording inside feedlots and breeding facilities. Many came after exposés of horrific conditions and animal abuse. Three of those laws were subsequently overturned by courts.

But environmentalists and free speech advocates say the new bills are part of a broader effort to recast environmental activists as criminals, even terrorists.

For example:

In May 2017, the American Fuel and Petrochemical Manufacturers published a blog post about activists who vandalized pipelines, under the headline: "Pipelines Are Critical Infrastructure—and Attacking Them Is Terrorism."

Last year, Energy Transfer Partners filed a federal lawsuit against Greenpeace and other groups seeking hundreds of millions of dollars. The complaint accuses some of the nation's leading environmental organizations of operating an "eco-terrorist" conspiracy at Standing Rock. (A judge dismissed the case against one of the defendants last month, but has yet to rule on a motion to dismiss the case against Greenpeace.)

Another industry group launched a database to track "criminal acts on critical energy infrastructure," claiming eco-terrorism incidents were on the rise.

In Congress, 84 members wrote a letter to Attorney General Jeff Sessions asking if protestors who tamper with pipelines could be prosecuted as domestic terrorists. Sponsors of the state pipeline bills have also invoked terrorism.

Supporters of the bills say they do not suppress lawful protests.

"There's a legal process to stop something," said Oklahoma state Rep. Mark McBride, who sponsored another bill that assigns civil liability to anyone who pays protesters to trespass. "But if you're chaining yourself to a bulldozer or you're standing in the way of a piece of equipment digging a ditch or whatever it might be, yes, you're causing harm to the project and to the person that's contracted to do that job."

Advocacy groups say the bills are unnecessary—trespassing and vandalism are already unlawful, and protesters who have disrupted operations have largely been charged under existing statutes. They fear the legislation uses a handful of dangerous incidents as a pretext to intimidate mainstream advocates and target more widespread acts of peaceful civil disobedience, like temporarily blocking access to a construction site. Several environmental and civil liberties organizations are now in talks about how to respond to the industry's actions.

"The clear attempt there is to bring environmental justice, environmental advocacy organizations into a realm of criminal liability," said Pamela Spees, a senior staff attorney at the Center for Constitutional Rights, which represents activists in Louisiana. "They're basically trying to silence and minimize the impact of environmental organizations."

Ever since the state's first gusher began spurting crude in 1897, oil has dominated life in Oklahoma. Today, it is home to fracking pioneers, including Continental Resources. The town of Cushing, the self-dubbed "pipeline crossroads of the world," is a critical oil trading hub. "Environmental activist" is not a badge many wear openly in Oklahoma.

Dakota Raynes spent the past few years at Oklahoma State University writing his dissertation about how people have responded to a fracking boom that's literally shaken the state. As drillers began injecting more wastewater into wells from 2010-2015, the number of earthquakes jumped more than 20-fold to 888.

Raynes interviewed dozens of activists, lawmakers, regulators and ordinary citizens. He found that almost everyone is reluctant to speak publicly against the industry. They fear neighbors or colleagues or parents of their children's friends will catch wind and shun them. Raynes said he's attended events with activists who have returned to their cars afterward to find screws driven into tires.

"Many people read Oklahoma as a hostile context in which to engage in any kind of pro-environmental work," he said. "And they all link that back to the phenomenal amount of power that the oil and gas industry has, both as a cultural force, as a legal force, as a political force."

Standing Rock changed that deference, said Mekasi Camp Horinek, a member of the state's Ponca tribe.

Horinek grew up on tribal land near an oil refinery and a factory that produces carbon black, a sooty petroleum product. He blames the facilities for contributing to disease and deaths in his community. When the call came to join the resistance at Standing Rock, he became a leading organizer. (The Ponca's ancestral land in Nebraska lies in the path of the Keystone XL pipeline, another project he's fought.)

"I think the state of Oklahoma felt threatened and was worried that people would rise up," he said about the critical infrastructure bill. "And they should be."

Horinek says he remains undaunted, but he said the new law had an immediate effect on others.

People were suddenly threatened with long prison terms if they crossed onto pipeline property. "That definitely weighs heavy on a person's thoughts," he said.

Oklahoma has 39 American Indian tribes, most forced to move there during the Trail of Tears in the 19th Century, and indigenous activists led the fight against the Diamond Pipeline. "They can't risk a six-month sentence for trespassing. Who's going to take care of their kids, their parents, their grandparents?" he said.

Biggs, the state representative who sponsored the bill, now works for the federal Department of Agriculture. He declined to comment for this article. The Oklahoma Oil and Gas Association also declined to make anyone available for an interview.

Activists say the bill was a clear assault.

The Sierra Club has a policy against engaging in civil disobedience. But the bill's conspiracy element—which says a group can be charged with 10 times the fines given to a person who violates the provisions—worries the organization's Oklahoma director, Johnson Grimm-Bridgwater. He was among those who spoke at the 2017 press conference. While Grimm-Bridgwater says participating in a press conference should qualify as constitutionally protected speech, there's no telling how a prosecutor might apply the new statute.

"The law is punitive and is designed to create friction and divisions among groups who normally wouldn't have a second thought at working together," he said.

Brady Henderson, legal director for the Oklahoma chapter of the American Civil Liberties Union, said there's widespread concern in both liberal and libertarian circles about the law, and he's fielded questions from several nonprofits about the conspiracy clause.

The ACLU in Oklahoma is considering legal challenges, but may have to wait until a district attorney tries to use the new law. Still, Henderson said, the language is so broad that it can apply to almost anything.

"By equating those kinds of things together, essentially political speech on one end and on the other end outright terrorism," he said, "the bill is a pretty gross instrument."

Protesters Under Surveillance

As activists were fighting the Diamond Pipeline in Oklahoma, Energy Transfer Partners was planning another oil pipeline at the southern end of its network.

The Bayou Bridge pipeline is slated to run from Texas to St. James Parish, Louisiana, already home to many petrochemical facilities. Along the way, it crosses through the Atchafalaya Basin, the nation's largest river swamp and a center of the state's crawfish industry.

Environmentalists were alarmed by the risks the project posed to residents of St. James Parish and the swamp's fragile ecosystem. And they quickly felt as if law enforcement agencies were working against them. For one thing, some officials openly supported the project. Joseph Lopinto, now Jefferson Parish sheriff, spoke at a hearing last year on behalf of the National Sheriffs' Association and urged regulators to approve the pipeline.

Anne Rolfes, an organizer of the pipeline resistance and founder of the advocacy group Louisiana Bucket Brigade, said activists also suspected they were being watched.

During the protests at Standing Rock, a private security company hired by Energy Transfer Partners had compiled daily intelligence briefings and coordinated with local law enforcement, as detailed in reports by The Intercept.

Rolfes' group this year also obtained, through a public records request, a handful of documents indicating that state officials were tracking their efforts.

One state police report from November said the agency sent an investigator from its criminal intelligence unit to a hearing where activists had allegedly planned a protest, and that a local sheriff's office planned to send a plainclothes officer. Rolfes said no protest was planned; activists merely intended to speak at a public hearing.

In the following months, an intelligence officer with the Governor's Office of Homeland Security and Emergency Preparedness sent two emails to colleagues describing environmental groups' priorities, quoting Rolfes' newsletter and an article about the groups and adding a photograph of her.

Louisiana State Police declined to comment. Mike Steele, a spokesman for the state Homeland Security Department, rejected the notion that his agency spies on activists. He said this type of open-source tracking was similar to what they conduct ahead of football games or festivals. "We do that with any type of event, because safety is the number one concern," he said.

Energy Transfer Partners spokeswoman Alexis Daniel issued a statement saying, "any claims that our company or our security contractors have inappropriately monitored protestors are false." It's unclear what surveillance, if any, the company or its contractors have deployed in Louisiana. Energy Transfer Partners has also been accused of spying by a family in Pennsylvania.

Anne Rolfes, founder of the advocacy group Louisiana Bucket Brigade, speaks at a protest in 2017. A similar photo of her was circulated by a law enforcement agency. Credit: Julie Dermansky

Rolfes said Louisiana officials are treating activists like criminals. "We're going to be followed while we participate in the democratic process?" she said.

The arrests in August of the three protesters on boats who were detained by off-duty officers only deepened the activists' feelings that the state and energy industry were working against them. While the arresting officers were working as contractors, they wore their state uniforms and badges and carried weapons.

Spees, with the Center for Constitutional Rights, said the incident represented a problematic melding of public and private authorities. After the industry-supported bill was enacted by the state, she said, state employees acted on behalf of an energy company to detain activists under the new law.

"It's as though they've become the hired hands for private oil companies and pipelines companies," she said.

Activists in Virginia, where two gas pipelines are drawing protests, have also been tracked by that state's Fusion Center—which shares information across agencies to combat terrorism and crime—according to a report by the Richmond Times-Dispatch.

In Washington state, a local sheriff's office has monitored activists opposed to a major pipeline project that connects with Canada's tar sands, sharing information with that state's Fusion Center, documents obtained by The Intercept show. Protest groups there already tried to block tanker traffic associated with the project, and local authorities have conducted trainings to prepare for mass protests.

The nation's intelligence community has a dark history of tracking political movements, including the FBI's COINTELPRO, which snooped on Martin Luther King, Jr., and others. Earlier this decade, FBI agents monitored opponents of the Keystone XL pipeline, according to The Guardian, and the bureau had at least one informant at Standing Rock.

Even if local law enforcement agencies today are doing little more than collecting news articles and other open source information, covert surveillance risks equating political protest with criminal activity, said Keith Mako Woodhouse, a historian at Northwestern University who wrote a book about radical environmentalists.

"It gives a sense that law enforcement is, if not on the side of, at least more sympathetic to the industries and practices that are being protested," he said. "Presumably they're not surveilling the energy companies."

Pipeline companies have cultivated that relationship. They have reimbursed state law enforcement millions of dollars for providing security for their projects in North Dakota, Iowa and Massachusetts, for example. Most major pipeline companies also run charitable campaigns that have contributed millions of dollars to local emergency responders and other agencies and groups along their routes. The National Sheriffs' Association, in turn, supported the Bayou Bridge and Dakota Access pipelines while it called protesters "terrorists."

A Worrying Trend

Over the past decade, the sense of urgency around climate change has intensified. Scientists say there is little time before the planet is committed to potentially devastating warming. The only practical way to avoid that is to largely eliminate fossil fuel use over the next several decades.

The government's response has been minimal, "so the next step is civil disobedience," said Woodhouse, the historian. "And if anything, it's surprising that it hasn't happened sooner and at a greater scale."

Woodhouse said efforts to suppress environmental protest trace back to the rise of radical environmental groups in the 1980s and '90s, when public officials slapped names like "eco-terrorism" and "ecological terrorism" on activists who vandalized equipment or animal testing facilities.

Now, Woodhouse said, environmentalists are threatening one of the country's most powerful industries. Some advocates say the new push against activists seems more widespread and unabashed.

"I think there is a very worrying trend—and the trend is on both sides of the political aisle to be perfectly frank and candid—a trend of trying to suppress political speech that you don't agree with," said David Snyder, executive director of the First Amendment Coalition.

"I think a lot of people feel a lot more threatened than they have in a long time," he said, "and I think that's true on all sides of the political spectrum. And when people feel threatened, they tend to lash out."

ICN reporter Georgina Gustin contributed to this story.

Flint water crisis: Michigan's top health official to face trial over deaths

State’s health director to stand trial for involuntary manslaughter in two deaths linked to legionnaires’ disease in the Flint area

Associated Press in Flint

Tue 21 Aug 2018 10.44 EDT Last modified on Tue 21 Aug 2018 11.03 EDT

Special agent Jeff Seipenko walks out of the courtroom with signed paperwork in hand after a judge authorized charges on 14 June 2017 in Flint, Michigan for Nick Lyon in relation to the water crisis. Photograph: Jake May/AP

A judge has ordered Michigan’s health director to stand trial for involuntary manslaughter over two deaths linked to legionnaires’ disease in the Flint area, the highest-ranking official to face criminal charges as a result of the city’s tainted water scandal.

Nick Lyon is accused of failing to issue a timely alert about the outbreak. District court judge David Goggins said deaths probably could have been prevented if the outbreak had been publicly known. He said keeping the public in the dark was “corrupt”.

Goggins found probable cause for a trial in Genesee county court, a legal standard that is not as high as beyond a reasonable doubt. Lyon also faces a charge of misconduct in office.

When the judge announced his decision, a woman in the gallery said, “Yes, yes, yes.”

“It’s a long way from over,” Lyon told the Associated Press. He declined further comment.

Some experts have blamed legionnaires’ on Flint’s water, which was not properly treated when it was drawn from the Flint river in 2014 and 2015. Legionella bacteria can emerge through misting and cooling systems, triggering a severe form of pneumonia, especially in people with weakened immune systems.

At least 90 cases of legionnaires’ occurred in Genesee county, including 12 deaths. More than half of the people had a common thread: they spent time at McLaren hospital, which was on the Flint water system.

The outbreak was announced by the governor, Rick Snyder, and Lyon in January 2016, although Lyon concedes that he knew that cases were being reported many months earlier. He is director of the health and human services department.

Nonetheless, he denies wrongdoing. Lyon’s attorneys said there was much speculation about the exact cause of legionnaires’ and not enough solid information to share earlier with the public.

The investigation by the office of the state attorney general, Bill Schuette, is part of a larger investigation into how Flint’s water system became contaminated when the city used Flint river water for 18 months. The water was not treated to reduce corrosion. As a result, lead leached from old pipes.

“We’re not looking at today as a win or a loss. We’re looking at today as the first step and the next step for justice for the moms, dads and kids of Flint,” said Schuette’s spokeswoman, Andrea Bitely, who specifically mentioned the families of two men whose deaths the prosecution blames on Lyon – 85-year-old Robert Skidmore and 83-year-old John Snyder.

An additional 14 current or former state and local officials have been charged with crimes, either related to legionnaires’ or lead in the water. Four agreed to misdemeanor plea deals; the other cases are moving slowly.

“Normally we don’t see government officials accused of manslaughter based on what they didn’t do,” said Peter Henning, a professor at Wayne State University law school in Detroit. “That does make it an unusual case, and it will make government officials be much more cautious. Maybe that’s the message here.”

Defense attorney John Bursch said the judge’s decision was “mystifying”. Goggins spent more than two hours summarizing evidence from weeks of testimony, but he did not specifically explain what swayed him to send Lyon to trial.

“We had 20 pages of argument in our legal brief that he didn’t address,” Bursch said outside court. “He didn’t talk about the law at all.”

A trial would be many months away, after Snyder’s term as governor ends on 1 January. He said Lyon “has my full faith and confidence” and will remain as Michigan’s health director.

A courtroom spectator, Karina Petri, 30, of Milwaukee, said sending a senior official to trial is “long overdue”.

“He withheld the truth. There’s no excuse,” said Petri, who wore a “Flint Lives Matter” shirt. “He could have changed hundreds of lives.”

Shopping for Community Solar? Contract Terms Are Getting Friendlier

Solstice introduces a “no-risk” contract, following a trend of increasingly flexible terms.

More flexible terms could lower the barrier to entry for consumers interested in solar power.

Community solar is supposed to make investing in PV easier. But many contracts do exactly the opposite.

Common contract terms put customers on the hook for cancellation fees or signup periods stretching into two decades. The lack of flexibility is generally a turnoff for customers, limiting signups from the 50 to 75 percent of U.S. consumers who can’t access traditional rooftop solar.

Solstice, a community solar organization focused on customer management, recently introduced a “no-risk” contract tied to a new 2.73-megawatt Delaware River Solar project in New York. The contract includes no cancellation fee and lasts just one year. Solstice called the release a “milestone in U.S. solar accessibility” and said the terms “allow renters to participate without fear of getting stuck with a contract that they can’t take with them if they move.” The project will serve 400 households after its estimated Q4 completion.

Other community solar providers are moving to simplify contract terms as well, but this particular offering is “rare,” according to Wood Mackenzie Power & Renewables Senior Solar Analyst Michelle Davis.

“This new Solstice contract is a great example of program designers meeting customers’ demands,” said Davis. “Community solar customers want community solar to be easy, not fraught with lengthy contracts, cancellation fees, or requirements to transfer a contract in the same utility territory.”

The offering from Solstice follows a trend of community solar developers moving beyond traditional power-purchase agreements. Increasingly, community solar companies are looking to tailor flexible options for consumers rather than using the template set out by the solar industry’s long contract commitments.

Other companies, like Arcadia Power, have also shunned cancellation fees.

In an interview with Greentech Media earlier this month, Arcadia’s CEO Kiran Bhatraju said the company is working to make good on the promise of community solar, which should look more like “universal access” than a rooftop solar sale. Arcadia’s data shows that on average Americans move every 5.5 years.

According to EnergySage, many of Arcadia’s term lengths last 20 years, but the company has no agreement cancellation fee, and its partnerships with utilities nationwide mean the company can transfer a customer if they move.

That’s more flexible than offerings from other companies like Renovus Solar, with projects in New York that have 25-year contracts and don’t allow cancellations. EnergySage data identifies other developers like the Clean Energy Collective that require customers to transfer contracts to other consumers if they move outside the service area, or pay a cancellation fee. A past Clean Energy Collective contract included a 50-year term and its currently available contracts last up to 25 years.

Other companies offer terms closer to those from Solstice and Arcadia. Community Power Partners offers a three-year term, but requires a cancellation fee in the first year. BlueWave Solar has a 20-year contract, but no termination fee and asks for 90 days' notice. Nexamp offers a pay-as-you-go model with no cancellation fee, though the company asks for a notification period that differs by state, ranging from 90 days in states including New York to six months in Massachusetts.

The flexibility of the Solstice model, though, could come with added cost. Davis said the turnover from one-year contracts will also mean faster churn on Solstice’s waiting list and customer pipeline.

“I can see this contract structure leading to an increased need for customer management for this specific project,” Davis said. “This is not an insignificant cost for community solar projects, and many developers outsource customer acquisition and management since it takes substantial resources and expertise. Time will tell if the benefits of increased customer engagement will outweigh the costs of managing customer wait lists and pipelines.”

In a conversation with Greentech Media in July, Solstice co-founder and CEO Steph Speirs said narrowing the customer base through unappealing contract terms makes customer acquisition costs spike. She said Solstice's wait list is designed specifically to lessen the impact of any churn, with more consumers waiting to fill demand.

As customer-friendly options become the norm, Speirs said it should push bigger players like NRG, with 20-year contracts and a required transfer, to follow suit.

“People aren’t signing up for marriage for 20 years,” Speirs said. “Why are we trying to force people to sign up for community solar for 20 years?”

TimberFish Launches IndieGoGo to Raise Trout on Brewery Waste and Wood Chips

By Catherine Lamb - July 12, 2018

It’s no secret that wild-caught seafood is fraught, what with its declining supply and associations with inhumane labor practices. Many tout farmed fish as a more ethical and sustainable (not to mention cheaper) way to satisfy our seafood cravings, which is why aquaculture is the fastest growing food-producing sector. In fact, as of 2016, aquaculture produced half of all fish for human consumption.

While aquaculture doesn’t lead to overfishing of limited ocean resources, it can have other unsavory consequences. Farmed fish produce a lot of waste (AKA fish poop), and can sometimes cause chemicals to leak into our drinking water.

And then there’s the fish food. Often, farmed fish are fed pellets of corn, soy, unwanted chicken parts, or even fish meal. Sometimes people even catch smaller, less popular fish from the ocean and grind them up to feed their farmed brethren. Obviously, it takes a lot of energy and environmental resources to create all this fish food, and even more to filter out waste from fish enclosures.

TimberFish Technologies' eponymous technology promises to offer a more palatable alternative to conventional aquaculture. The company launched in 2008 and has so far raised or won $260K, which it used to build a test facility at Five & 20 Spirits & Brewing in Westfield, New York.

There, they feed their fish not with animal parts or corn, but with a combination of nutrient-rich wastewater from food processors (such as breweries, distilleries, and wineries) and woodchips. Microbes grow on the woodchips, small invertebrates (like worms and snails) eat the microbes, and the fish eat the invertebrates. The fish poop is grub for the microbes, and the whole cycle starts again.

In addition to seafood, the TimberFish system’s only outputs are clean water and spent wood chips, which can be used as a biofuel or soil supplement. Another benefit is that TimberFish can build their aquaculture farms close to cities, shortening the supply chain and guaranteeing fresher fish.

This is obviously not as idyllic as plucking salmon from the Alaskan seas or catching trout in a mountain stream, but, as aquaculture operations go, it’s not bad. And it’s certainly cost-efficient: it diverts a waste product and turns it into profit.

This week TimberFish Technologies launched an IndieGoGo campaign to raise funds for their no-waste, sustainable aquaculture system. If they reach their $10,000 goal, they’ll use the funds to design plans for a larger commercial facility, which they estimate could produce 2 to 3 million pounds of fish per year.

Why recycling is now a money loser, not a moneymaker for Philly

Updated: JULY 23, 2018 — 5:00 AM EDT

by Frank Kummer, Philly.com @FrankKummer | fkummer@philly.com

A truck loaded with curbside recycling opened its giant maw and five tons of plastic, cardboard, and metal spewed to the ground next to a 25-foot-high pile of recyclables recently dumped at the Republic Services facility in Philadelphia.

By the end of the day, the Grays Ferry location had received up to 500 tons of recyclables — processed at a rate of 25 tons an hour. It’s a staggering amount to move, sort, bale, and ship on a daily basis. About 20 percent goes to landfills because of non-recyclable items such as hoses, rigid plastic containers, and pizza boxes mixed in.

Five years ago, Philadelphia was getting paid good money by contractors for recyclables — about $65 a ton.

By last year, it was paying $4 a ton just to get rid of it. Now, it’s paying $38 a ton.

The change has not gone unnoticed. That plastics and other scraps meant for recycling are ending up in landfills — at a cost to taxpayers — was an issue raised through Curious Philly, our new question-and-response forum that allows readers to submit questions about their community in need of further examination.

“We made the argument that recycling was so much cheaper, but that’s getting closer to not being true,” said Nic Esposito, director of Philadelphia’s Zero Waste and Litter Cabinet. “That’s scary.”

City taxpayers will likely chip in about $2 million to cover the increase this year, according to budget figures.

If the trend continues, recyclables could become just as expensive to get rid of as ordinary waste, which means items once recycled will go straight to landfills, undoing decades of work to get residents and businesses to go green. (It costs about $63 a ton to dispose of waste at a landfill.)
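
As a rough back-of-envelope sketch of how those per-ton figures compare, assuming a purely illustrative annual tonnage (the article does not report the city's total):

```python
# Back-of-envelope sketch of Philadelphia's recycling economics using the
# per-ton figures quoted above. The annual tonnage is a made-up placeholder
# for illustration only.

price_received_2013 = 65   # $/ton the city was once paid for recyclables
cost_now = 38              # $/ton the city pays today
landfill_cost = 63         # $/ton to dispose of ordinary waste

swing_per_ton = price_received_2013 + cost_now
print(f"Swing since 2013: ${swing_per_ton} per ton")                 # $103 per ton
print(f"Recycling still saves ${landfill_cost - cost_now} per ton vs. landfill")  # $25

hypothetical_tons_per_year = 100_000  # placeholder, not a city statistic
print(f"Illustrative annual bill at today's rate: ${cost_now * hypothetical_tons_per_year:,}")
```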

All the eggs in one basket

The cause of what many call a recycling crisis began in 2013 when China shifted away from its role as the world’s major recycler. As part of its crackdown on pollution, China began demanding ever purer loads of recyclables. Previously, Chinese facilities might have accepted loads that were 10 percent or 20 percent contaminated by non-recyclable plastics or paper and cardboard that was wet or dirty.

But the last year has been particularly painful for recycling. China is now demanding that loads be no more than half a percent contaminated, and contain no mixed paper (such as office paper mixed in with newspaper) — an impossible standard to meet for most municipalities. In Philadelphia, as much as one-fifth of each load might be contaminated.

Recycling companies had to figure out how to unload millions of tons elsewhere. Recyclers have been able to find some domestic and overseas markets such as Vietnam and India for plastics, but those countries are now overwhelmed.

China distressed markets even more when it refused all recycling from May 4 through June 4. Recyclers had to stockpile loads.

“We put so many eggs in the basket in China,” Esposito said.

Bales of aluminum cans and cardboard stacked high off the floor at Republic Services’ processing plant in the Grays Ferry neighborhood of Philadelphia.

Philadelphia and many municipalities use contractors to process single-stream recycling. Contractors are now appealing to residents to use less plastic and be cautious of what they put in recycling bins.

The goal is to get loads as pure as possible.

“We weren’t exporting plastic to China in recent years,” said Frank Chimera, a senior manager with Republic Services, the city’s contractor to process recycling. “But we were exporting mixed paper, cardboard. Prior to all this we were sending about 60 percent to China. As of March 1, we stopped sending anything.”

The company has had to stockpile mixed paper loads at times until it found domestic buyers. Some loads are being shipped to Indonesia and India. But that poses other problems.

“The cost to ship is greater than the value of paper,” Chimera said.

The crunch has caused turmoil for recyclers, who have always borne the brunt of processing but made up for it by selling sorted recyclables as a commodity on the open market.

“Recycling economically does not make sense right now,” he said. “So companies like Republic are propping up the recycling industry.”

Shift lead Tim Spross stands between aisles of landfill waste (left) and bales of cardboard at Republic Services’ processing plant in Grays Ferry.

Plastic bags a big problem

On Tuesday, Chimera stood with Tim Spross, also of Republic Services, high atop a steel platform where thousands of pieces of paper and cardboard rushed down a conveyor belt.

Workers scrambled to pick out as many plastic bags, pieces of dirty cardboard, and non-paper items as possible from the swift-moving stream. Plastic bags, almost ubiquitous, are especially onerous because they clog machinery, forcing lines to shut down.

“Plastic bags are one of our biggest problems,” Spross said.

He walked to a wall of bales of non-recyclables that were removed from the trucks and conveyors. The wall ran 10 feet tall and 50 feet long. All of it will go to a landfill.

Employees at a Waste Management recycling processing facility work to free a machine clogged with plastic bags. Plastic bags are a significant source of machine shutdowns.

Contamination drives up costs

John Hambrose, a spokesman for Waste Management, which contracts with other municipalities, says his company is experiencing similar issues. Waste Management has a recycling processing facility on Bleigh Avenue in Holmesburg.

Hambrose said the situation is not at the point where recyclables are going straight to landfills. But he said costs are being passed down to municipalities and, ultimately, taxpayers.

“In the last few months, we have been working with municipalities,” Hambrose said. “We’re evaluating the quality of materials they are sending us. When we find high levels of contamination, that’s when additional fees may be incurred.”

Contamination occurs when residents either don’t know what’s recyclable or simply don’t care.

Food waste gets tossed in with plastic, contaminating an entire load. Liquids left in bottles drip down, ruining paper. Pizza boxes stained with grease are unusable. Heavy rigid plastics, such as crates and bread carriers, also get tossed in, although they are not recyclable. Some residents toss in dirty diapers.

“If there’s too much trash in recycling, then it’s basically all trash,” Hambrose said. “You’re really trying to do this efficiently and make a business out of it. But contaminated loads drive up your costs. At the end, you’ve got low commodity values. So we’re really caught at both ends.”

Recycling companies such as Republic Services and Waste Management are sometimes locked into longer-term contracts with municipalities and have to eat the additional costs.

For example, Camden County has a multiyear contract with Republic Services that locks in a cooperative agreement with 37 municipalities to pay $5 a ton to unload recycling — $33 less per ton than Philly.

“Other than the fact that we were getting money for recyclables, and now we are paying money, it hasn’t hit us as hard as other municipalities because of our participation in the co-op,” said Thomas Cardis, business administrator for Gloucester Township in Camden County. “If we weren’t in that bid, then we would have some serious concerns right now. But, ultimately, we’re concerned we will eventually be susceptible to the market.”

Back at the Republic Services facility in Grays Ferry, workers were sorting through the just-delivered load from the recycling truck. They loaded the non-recyclables into carts and wheeled them to a 12-foot-high hill of debris waiting to be sent to a landfill. As soon as workers clear the pile, it starts to rise anew.

What’s recyclable in Philly?

Plastics: food containers, bottles, jars, detergent and shampoo bottles, and pump and spray bottles – all rinsed. (Philly takes all plastics from #1 to #7, which might be stamped on the bottom with the recycling symbol)

Paper: Newspapers, magazines, brochures, junk mail, envelopes, scrap paper, paper bags, paperback books – any plastic sleeves removed and not soiled with food waste.

Cartons: Milk, juice, wine and soup – all rinsed.

Aluminum: Cans, paint cans (dried latex), aluminum baking dishes, foil

Glass: bottles and jars – rinsed.

Cardboard: boxes, paper towel rolls, egg cartons and dry food and shipping boxes.

What can’t be recycled in Philly:

Plastic Bags

Styrofoam

Disposable Plates, Cups, and Takeout Containers

Greasy or Food-Soiled Paper & Cardboard

Tissues, Paper Towels, and Napkins

Light Bulbs

Cassette Tapes (VHS and audio)

Garden Hoses

Needles and Syringes

Propane Tanks

Pots & Pans

NOTE: What gets recycled may vary from town to town, but most have a list on their municipal websites.

Global warming set to exceed 1.5C, slow growth - U.N. draft

Alister Doyle Environment Correspondent

OSLO (Reuters) - Global warming is on course to exceed the most stringent goal set in the Paris agreement by around 2040, threatening economic growth, according to a draft report that is the U.N.’s starkest warning yet of the risks of climate change.

Governments can still cap temperatures below the strict 1.5 degrees Celsius (2.7 Fahrenheit) ceiling agreed in 2015 only with “rapid and far-reaching” transitions in the world economy, according to the U.N.’s Intergovernmental Panel on Climate Change (IPCC).

The final government draft, obtained by Reuters and dated June 4, is due for publication in October in South Korea after revisions and approval by governments.

It will be the main scientific guide for combating climate change.

“If emissions continue at their present rate, human-induced warming will exceed 1.5C by around 2040,” according to the report, which broadly reaffirms findings in an earlier draft in January but is more robust, after 25,000 comments from experts and a wider pool of scientific literature.

The Paris climate agreement, adopted by almost 200 nations in 2015, set a goal of limiting warming to “well below” a rise of 2C above pre-industrial times while “pursuing efforts” for the tougher 1.5 goal.

The deal has been weakened after U.S. President Donald Trump decided last year to pull out and promote U.S. fossil fuels.

Temperatures are already up about 1C (1.8F) and are rising at a rate of about 0.2C a decade, according to the draft, requested by world leaders as part of the Paris Agreement.
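
Those two figures are enough to reproduce the report's rough timeline with a simple linear extrapolation. This is a sketch only; the IPCC's "around 2040" estimate rests on far more than a straight-line trend.

```python
# Linear extrapolation implied by the draft's figures. Illustrative only.

warming_to_date_c = 1.0      # roughly 1C above pre-industrial, per the draft
rate_c_per_decade = 0.2      # current rate of warming, per the draft
ceiling_c = 1.5              # the Paris agreement's stricter goal

decades_left = (ceiling_c - warming_to_date_c) / rate_c_per_decade
print(decades_left)                    # 2.5 decades, i.e. about 25 years
print(2018 + decades_left * 10)        # ~2043, in the ballpark of "around 2040"
```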

“Economic growth is projected to be lower at 2C warming than at 1.5 for many developed and developing countries,” it said, with growth drained by impacts such as floods and droughts that can undermine crops, and by an increase in human deaths from heatwaves.

In a plus-1.5C world, for instance, sea level rise would be 10 centimeters (3.94 inches) less than with 2C, exposing about 10 million fewer people in coastal areas to risks such as floods, storm surges or salt spray damaging crops.

It says current government pledges in the Paris Agreement are too weak to limit warming to 1.5C.

IPCC spokesman Jonathan Lynn said it did not comment on the contents of draft reports while work was still ongoing.

A man holds an umbrella as he walks next to his buffalo on the banks of the river Ganga on a hot summer day in Allahabad, India May 24, 2016. REUTERS/Jitendra Prakash

“It’s all a bit punchier,” said one official with access to the report, adding that it seemed slightly less pessimistic about prospects of limiting a rise in global temperatures that will affect the poorest nations hardest.

The report outlines one new scenario to stay below 1.5C, for instance, in which technological innovations and changes in lifestyles could mean sharply lower energy demand by 2050 even with rising economic growth.

And there is no sign that the draft has been watered down by Trump’s doubts that climate change is driven by man-made greenhouse gases.

The draft says renewable energies, such as wind, solar and hydro power, would have to surge by 60 percent from 2020 levels by 2050 to stay below 1.5C “while primary energy from coal decreases by two-thirds”.

By 2050, that meant renewables would supply between 49 and 67 percent of primary energy.

The report says governments may have to find ways to extract vast amounts of carbon from the air, for instance by planting vast forests, to turn down the global thermostat if warming overshoots the 1.5C target.

It omits radical geo-engineering fixes such as spraying chemicals high into the atmosphere to dim sunlight, saying such measures “face large uncertainties and knowledge gaps.”

Boston has new rules to help buildings withstand climate change

A parking lot on Long Wharf was one of many stretches of downtown Boston that flooded during a storm in January.

By Jon Chesto GLOBE STAFF, JUNE 15, 2018

Memories of the storm last winter that flooded parts of downtown Boston are still fresh at City Hall. Now, the Walsh administration is pushing developers to make their buildings better able to withstand another watery apocalypse.

The Boston Planning & Development Agency on Thursday approved new rules to make big buildings more resilient to the effects of climate change. City officials hope the measures will help minimize flooding, keep the lights on in more buildings during power outages, and make it easier to upgrade street lights and other public works.

“We think we’ve identified a way forward that appears to be the first of its kind in the nation,” said Brian Golden, the agency’s director.

The rules are initially being tested for a two-year period and differ for projects based on size. For the largest developments — at least 1.5 million square feet — developers will need to assess installing an on-site power plant, and build one if it’s financially feasible. They will also have to consolidate all wiring for cable, Internet, and other telecom services into one underground tube, so there is less disruption to streets and sidewalks during repairs.

Any new development above 100,000 square feet will have to retain more rainfall than currently required, to help prevent runoff during storms from contributing to floods in the surrounding area. In 2017, the planning agency received applications for 39 projects over 100,000 square feet.

Meanwhile, developers for projects above 50,000 square feet would need to install extra wiring and technology for “smart” traffic signals and street lights if the projects require new or improved signals or lights.
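
A minimal sketch of the tiered thresholds described above, expressed as a simple lookup; the function name and wording are hypothetical illustrations, not the BPDA's actual rule text.

```python
# Hypothetical sketch of the tiered requirements described in this article.
# Thresholds come from the article; names and phrasing are illustrative only.

def resilience_requirements(square_feet: int) -> list:
    """Return the article's tiered requirements for a project of a given size."""
    reqs = []
    if square_feet >= 1_500_000:
        reqs.append("assess an on-site power plant; build it if financially feasible")
        reqs.append("consolidate telecom wiring into one underground conduit")
    if square_feet > 100_000:
        reqs.append("retain more rainfall on site than currently required")
    if square_feet > 50_000:
        reqs.append("install extra wiring/technology for smart signals and street "
                    "lights, when the project triggers new or improved signals")
    return reqs

print(resilience_requirements(1_600_000))   # all three tiers apply
print(resilience_requirements(120_000))     # stormwater and smart-signal tiers only
```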

The agency’s work on this policy began about two years ago. But recent storms have added to the urgency.

Golden said the rules could also help reduce traffic jams caused by construction, by requiring more coordination for underground utility work.

He said developers offered input during the process; HYM Investment Group, for example, expects to use these standards for a complex it will build at the old Suffolk Downs track.

But the development industry’s chief local lobbying group, NAIOP Massachusetts, is pushing back, opposing both the on-site power and stormwater retention requirements.

“To increase this holding capacity is a bit problematic,” chief executive David Begelfer said of the new rainwater rule. “Boston’s an old urban city. It’s very dense.”

Matthew Kiefer, a development lawyer with Goulston & Storrs, said it could be hard to connect on-site power supplies with the local electricity grid. “We’ve talked to clients who have tried to do it in different settings,” Kiefer said. “It’s kind of challenging.”

While Kiefer applauded some of the agency’s goals, he said city officials should in exchange allow developers to build bigger projects to compensate for the added expense.

“I would be careful about adding too many costs to developments,” Kiefer said. “When the market starts to soften a little bit, can you really support the costs of these things?”

As Nuclear Struggles, A New Generation Of Engineers Is Motivated By Climate Change

June 15, 2018, 5:01 AM ET

JEFF BRADY

Entering the control room at Three Mile Island Unit 1 is like stepping back in time. Except for a few digital screens and new counters, much of the equipment is original to 1974, when the plant began generating electricity.

The number of people graduating with nuclear engineering degrees has more than tripled since a low point in 2001, and many are passionate about their motivation.

"I'm here because I think I can save the world with nuclear power," Leslie Dewan told the crowd at a 2014 event as she pitched her company's design for a new kind of reactor.

Dewan says climate change, and the fact that nuclear plants emit no greenhouse gases, is the big reason she became a nuclear engineer. And she is not the only one.

"The reason that almost all of our students come into this field is climate change," says Dennis Whyte, head of the Department of Nuclear Science and Engineering at the Massachusetts Institute of Technology.

The surge in new engineers comes as the nuclear industry, just like coal, is struggling to compete against cheaper natural gas and renewable energy. The Nuclear Energy Institute estimates that more than half of the nation's 99 nuclear reactors are at risk of closing in the next decade.

President Trump has ordered Energy Secretary Rick Perry to take steps to help financially troubled coal and nuclear power plants, though he has cited the reason as grid resilience and national security.

But nuclear plant operators are echoing young engineers like Dewan as they lobby for public subsidies to keep reactors open.

"If you are concerned about climate change, or concerned about the environment, you should be very concerned about the future of [Three Mile Island]," says David Fein, Exelon's senior vice president of state governmental and regulatory affairs.

TMI parent company Exelon announced last year it will close Three Mile Island Unit 1 in 2019 unless there are policy changes that would make the plant profitable again. A different reactor on the site near Middletown, Pa. — Unit 2 — was involved in the country's worst nuclear accident in 1979.

Fein is among those who argue that nuclear plants should be recognized as clean energy and paid for the public benefit of not emitting greenhouse gases or other pollutants. It's a strategy that has worked in other states: Illinois, New York and — most recently — New Jersey.

"If you are concerned about climate change, or concerned about the environment, you should be very concerned about the future of [Three Mile Island]," says David Fein, the senior vice president of state governmental and regulatory affairs at Exelon, which owns Three Mile Island Unit 1.

Opponents of new subsidies include anti-nuclear activist Eric Epstein with the watchdog group Three Mile Island Alert. "If you consider nuclear green then you have to ignore high-level radioactive waste," he says.

The federal government still doesn't have permanent storage for that waste, and Epstein says there are the environmental costs of uranium mining to consider as well.

Others question giving nuclear plants public money that could be used for renewable energy instead.

"It really is this sort of philosophical battle: Are we building the energy economy of the future? Or are we just, sort of, keeping with the status quo?" says Abe Silverman, vice president for regulatory affairs group and deputy general counsel for NRG, which has opposed state subsidy proposals.

The nuclear industry thinks it has a good chance of persuading people to support its side of this debate.

Ann Bisconti, who does opinion research for the industry, says a lot fewer people oppose nuclear energy now than just after the Three Mile Island accident.

"People have moved, very much, into middle positions — they're very mushy on nuclear energy," Bisconti says. And that creates an opportunity to win them over by talking about the need for nuclear to limit the effects of climate change, she says.

"You can't get there without nuclear in the fuel mix," says Chris Wolfe, who works as a generation planning engineer at South Carolina Electric & Gas and is on the board of North American Young Generation in Nuclear.

Despite the industry's struggles, the U.S. still gets a fifth of its electricity from nuclear, and there are jobs to be had.

The number of nuclear engineering degrees awarded each year peaked in the late 1970s. Then it dropped steadily after the Three Mile Island partial meltdown in Pennsylvania and the Chernobyl disaster in the former Soviet Union. Numbers started to climb again in 2001 as talk of climate change increased. That left an age gap in the nuclear workforce, with many now ready to retire.

"Baby boomers are leaving the industry, so there are a lot of job opportunities," says Wolfe.

Wolfe chose to work at a utility for job security, but she is cheering on nuclear entrepreneurs like Dewan, who co-founded a startup business called Transatomic Power. She says the company has raised about $6 million from investors.

"We're adapting something that's called a molten salt reactor that was first developed in the very, very early days of the nuclear industry — back in the 1950s and 1960s," Dewan says. With modern materials and a few changes, she believes such a reactor can be financially viable today.

Her company's first design aimed to use spent nuclear fuel to generate electricity. But it turned out some of the calculations were wrong.

Still, she says this new design would produce about half the waste of existing nuclear plants. And Dewan says it's safer because the effects of a meltdown would be limited to the plant site.

"Part of why we started this company is that we wanted the type of nuclear reactor that people would want to have in their communities," says Dewan.

That's a hard sell to anti-nuclear activists like Epstein. "I'm seeing the same arrogance I saw in the '70s," he says. "I think the new generation is like the old generation, in that they view themselves as flawless high priests of technology."

Dewan sees that skepticism as a legacy her generation will have to address. But she thinks the problem of climate change is too important to give up on nuclear energy now.

THE BIKE SHARE WAR IS SHAKING UP SEATTLE LIKE NOWHERE ELSE

BY MARK HARRIS

Seattle has amassed a trove of data on its dockless bike share program, which could help it better manage the 10,000 new bikes now active on its streets.

SEAN HEALY WAS driving for Uber in Seattle when a passenger offered him an intriguing job. It wasn't the first time a rider had proposed a new line of work; in the bustling tech scene in Seattle, his passengers often seemed to be scouting for people to hire. But last summer, the person in the backseat of his family’s minivan was the general manager of a company called Ofo.

THE MAN PITCHED Healy on a new gig-economy job transporting bikes around the city. The work itself sounded just so-so, but the man painted it as something bigger—a vision of a new kind of city, with fewer cars clogging the landscape and a metropolis made safe for people on two wheels. Kind of like Europe. "He seemed like a decent guy," says Healy, 33. "He wanted to do things that were not just environmentally sustainable but ethically sustainable."

Ofo's business is dockless bike sharing, and it was about to launch its US operations in Seattle. Dockless bike share is just the latest of a dozen new approaches to urban mobility in increasingly congested cities. Ride-hailing services, app-powered carpools, on-demand car rentals, electric bikes, scooters, and even self-driving taxis are all jockeying for riders on the streets of American cities. Together they are reinventing the way we navigate urban environments, reducing private car usage, improving traffic and commute times, and cutting emissions.

But where alternatives to car ownership are well-established in the US's major metropolises, bike shares are still finding their niche. Paris, London, and New York have all adopted bike share programs that use docks: bulky stations built into parking spaces that dictate where riders must start and end their trips. Though they cost a fraction of a more traditional, multibillion-dollar transit project, the stations are still expensive to install and maintain, and their fixed locations limit the number of riders they can attract.

What makes a dockless bike share program appealing is that, beyond the bikes themselves, it doesn't need any infrastructure. With nothing to build, a city can introduce a new way of getting around virtually overnight. A smartphone app tells users where cheap, GPS-enabled bikes are located and lets them rent one. Upon arriving at a destination, a rider simply leaves the self-locking bike there for the next user. Dallas, Los Angeles, Washington, DC, and several small Florida cities, among other places, have all embraced small dockless bike share programs.

Seattle could use a transportation reboot. The home of a thriving tech sector, Seattle is a fast-growing city and home to some of the country's worst traffic. Last year, when it decided to give dockless bikes a chance, it didn't have a bike share program of any kind. Success here could set up dockless bike share for a nationwide roll-out. Failure could mean more cars, more fumes, and more traffic jams for all. But what looked on the surface like an easy win ended up revealing the limits of a startup-led revolution. As Seattle residents discovered, just because the city could get bikes on the streets with little investment didn't mean it should.

WHEN DOCKLESS BIKE share began in Chinese cities three years ago, the downsides of the idea soon became apparent. Tens of thousands of broken or stranded bikes littered those cities before their governments cracked down, impounding bikes and setting limits on their use. Seattle's Department of Transportation wanted to avoid that mess.

So last July, the city allowed three companies—Ofo, LimeBike, and Spin—to deploy up to 4,000 bikes each in a six-month trial, in return for a deluge of data about their customers and operations. Seattle planners wanted to understand in granular detail how the systems would work, and how its citizens would use them. Now the data is in, much of it sourced by WIRED through a series of public records requests.

Seattleites have taken to dockless bike sharing like nowhere else in the country. Not only does Seattle have nearly a quarter of all the nation's dockless bikes, its bikes get three times the usage of those elsewhere in the US. More than 350,000 riders have covered more than a million miles in the scheme's first five months, and 74 percent of the city is in favor of them, according to a transportation department survey. Three-quarters of rides are being used to access public transit, helping to fill in the gaps left by those established systems.

When Healy started his new gig in September, he was tasked with leading a team of workers redistributing Ofo's bright yellow bikes around the city and bringing in damaged bicycles for repair. Bikes can pile up at popular destinations, block sidewalks, or end up stranded in low-traffic parts of town. Healy's job was to keep the bikes deployed in the places where they are most likely to get used, and prevent them from becoming a menace to the city.

At first, he loved his work. An engineering student with four children who once built a basic ion thruster for satellites for fun, Healy used the work to cover his bills while taking classes. Not only was he helping to bring a cheap, healthy transportation option to Seattle, he liked the company itself. Ofo was hiring people with housing and addiction challenges through a local employment charity. "My manager was taking people from the bottom, helping them to grow," Healy says.

But a few months later, that manager was promoted, and things started to change. The three companies selected by Seattle were in the thick of a price war to lure more riders. Every bike company has an internal goal, the number of rides per bike per day, that it uses to woo investors and predict future earnings. While the companies were expanding in Seattle, they did everything they could to hit that figure—including regularly driving down the cost of a ride to zero. To attract and retain riders, the companies had to make sure their bikes were always available where customers looked for them. Keeping teams like Healy's in constant motion became essential to the program's success.

Healy noticed that Ofo kept deploying more and more bikes. "They weren't hiding their strategy, which was to overrun the city," he says. "They wanted a bike on every corner." The work was hard, involving lifting the 42-pound bikes into and out of vans many times a day, he recalls. Workers rode in the back of the van alongside poorly secured bikes, and they lacked protective gloves.

Eventually, the pressure to keep deploying bikes to desirable locations led to a new rule, Healy says: Badly damaged bikes would no longer be painstakingly cannibalized for parts but simply thrown out. "I wondered, why are we working to save the Earth with bike share if we're just going to create more trash?" he says. (Ofo says that irreparably damaged bicycles are now recycled.)

The thousands of new bikes in circulation inevitably led to conflict with residents. According to the feedback collected by Seattle's transportation department, car owners are blaming shared bikes for scraping their vehicles. Residents are peeved that unsightly bikes are clogging up sidewalks, parks, and driveways, making the streets less navigable for pedestrians and annoying local businesses. Vandals have been systematically cutting brake cables of bikes from all three companies. Some activists are now trying to oust the bike companies.

Dockless bike share was meant to be the opposite of Uber—green, healthy, equitable, and affordable. But as 10,000 new bikes sprawl over the city, Seattle's latest transport revolution is pitting residents against each other, with the program's fans and foes arguing over its effect on the city. Seattle residents want access to bikes, that much is clear—but couldn't the bikes be just a little less annoying?

FOR SEATTLE, A bike share program was a gamble from the start. One year earlier, a different attempt at a bike share had failed in this hilly, drizzly city. Between 2014 and 2017, a nonprofit called Pronto ran a docked bike share program in Seattle that got pretty much everything wrong. It installed too few stations, which were located away from bike paths and tourist attractions. Rides were expensive and required a helmet sourced from unappealing communal bins. The program launched just as the weather turned at the end of a glorious summer, dooming it to a slow start.

There are currently no penalties for riders who violate the parking rules for Seattle's dockless bikes, such as blocking a sidewalk. JENNY RIFFLE

When Pronto went bust in early 2016, the city bought it but changed little about its operation. Early last year, Seattle decided to cut its losses and close Pronto down. "Seattle has this love-hate relationship with bikes," says Andrew Glass-Hastings, director of transit and mobility at the Seattle Department of Transportation. Despite a mild climate, decent cycling infrastructure, and traffic jams among the worst in the world, only about 3 percent of commuters used a bike to get to work in 2017. By contrast, 62 percent of commuters in the world's top cycling city, Copenhagen, ride to work. Some Seattle residents blame the rain, others the hills or a tough helmet law.

But maybe they were just waiting for something cheaper, simpler, and more convenient. Around that same time, dockless bike share programs were starting to take off in cities outside the US. Ofo was one of the first companies to find early success in this market. In 2014, members of the Peking University cycling club started a campus project they called Ofo, chosen because the word looks like a cyclist. The idea was that some students would share their bikes with others who would pay to use them, but the founders soon realized their fellow students were more interested in conveniently renting bikes than sharing their own.

In a world of carbon fiber frames and electric motors, Ofo's heavy steel bikes were lumbering and basic. The company's innovation was to package the entire tracking and rental system on the bike itself. A solar-powered unit mounted on the rear wheel powered a cellular link, GPS receiver, and lock. When a user scanned a unique QR code on the bike's frame using a smartphone app, the system would send an unlock signal to the bike and charge the rider's credit card.
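
As a purely hypothetical illustration of that scan-to-unlock-to-charge flow, the sketch below flips a lock flag and tallies a fare; the class and method names (and the $1-per-half-hour default) are invented for illustration and are not Ofo's code.

```python
# Highly simplified, hypothetical sketch of the rental flow described above:
# scan a QR code, the server unlocks the bike, and the rider's card is charged.

import time

class DocklessBike:
    def __init__(self, bike_id, lat, lon):
        self.bike_id = bike_id
        self.position = (lat, lon)   # reported over the bike's GPS/cellular unit
        self.locked = True

class RentalService:
    def __init__(self, price_per_half_hour=1.00):   # illustrative price
        self.price = price_per_half_hour
        self.active_rides = {}                      # bike_id -> (rider_id, start)

    def start_ride(self, rider_id, bike):
        # In practice the app decodes the QR code and the server pushes an
        # unlock command over the cellular link; here we just flip a flag.
        bike.locked = False
        self.active_rides[bike.bike_id] = (rider_id, time.time())

    def end_ride(self, bike):
        rider_id, start = self.active_rides.pop(bike.bike_id)
        bike.locked = True
        half_hours = int((time.time() - start) // 1800) + 1  # round up
        return half_hours * self.price   # amount charged to the rider's card
```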

Ofo was an immediate success. From its launch in 2015 with 20,000 bikes in Beijing, it now operates more than 10 million bicycles in more than 250 cities globally. If it has taken you 10 minutes to read this far, another quarter of a million people have taken one of its yellow bikes for a spin. That scale and speed of growth caught the attention of Didi Chuxing, Alibaba, and others, who have provided Ofo with more than $2 billion in funding. Soon, other dockless bike share companies popped up: Chinese copycat Mobike, bankrolled by Alibaba rival Tencent, claims to be the world's biggest dockless operator. In the US, dockless bike share companies include Spin, LimeBike, and Jump (which was acquired by Uber).

"When Pronto shut down, we found ourselves the largest city in the country without a bike share system," says Glass-Hastings, which he says created ripe conditions for a dockless company to step in. Once Ofo, Spin, and LimeBike began offering rides costing just $1 for 30 minutes (or longer, in some cases) last July, the program immediately surpassed Pronto both in volume and in trips per bike. As the summer turned to fall, the number of riders climbed steadily, topping 4,000 in October before the rains caused it to slump.

Even with the drizzle, by December, dockless bike share riders had covered more than a million miles. If all those rides had replaced car journeys, around 400 fewer tons of carbon dioxide would have been emitted, to say nothing of the time and emissions saved by relieving congestion on the roads.
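
The 400-ton figure is easy to sanity-check under the article's own assumption that every bike mile replaced a car mile; the per-mile emission factor below is an assumption on my part, roughly in line with EPA's average-passenger-car estimate.

```python
# Sanity check of the "around 400 fewer tons of CO2" figure, assuming every
# bike mile replaced a car mile. The emission factor is an assumed value,
# roughly EPA's average for a passenger car (~0.4 kg CO2 per mile).

bike_miles = 1_000_000
kg_co2_per_car_mile = 0.4

tons_avoided = bike_miles * kg_co2_per_car_mile / 1000   # metric tons
print(tons_avoided)   # 400.0 -- consistent with the article's rough figure
```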

The pilot program is now officially in an evaluation stage, and Glass-Hastings hinted strongly that the transportation department will recommend that it continue when the current permits expire in July. "For the first time, Capitol Hill recently ran out of bikes," he says of a popular downtown neighborhood. "There is a great deal of pent-up demand. It's really rewarding to see people use a system that is a great addition to the transportation system at almost zero cost to the city. It's been a win-win."

THE DOCKLESS BIKES have equally passionate critics, however. As head of Washington state's Department of Transportation from 2001 to 2007, Doug McDonald's job was all about keeping his part of the country moving. Now retired and living in Seattle, McDonald describes himself as a pedestrian activist. He has long been fighting a campaign against a city rule that allows cyclists on sidewalks.

Doug McDonald has been sending several photos a day of badly parked dockless bikes to Seattle's transportation department. JENNY RIFFLE

The arrival of dockless bike share got McDonald even more worked up. "If you watch people on yellow bikes, they tend to be inexperienced cyclists. There is crazy weaving in and out," he says. "And whatever profit comes from lending public rights of way to the bikeshare companies, Seattle gets not a dime. That's why my hair is on fire."

He finds the idea that there's no cost to the city ridiculous. "That ignores the cost to everybody for whom the bicycles are in the way or an annoyance," he says. Full safety statistics have not been made available, but the city says that five collisions with pedestrians were reported during the program. McDonald himself sends about five pictures a day of badly parked dockless bikes to Seattle's transportation department, and he regularly fires off emails debating the finer points of city and state laws on who has the right of way.

McDonald is the most outspoken of the program's critics, but he is not the only one. "The entire city is starting to look like the backyard of ill-behaved 7-year-olds who refuse to pick up after themselves," reads one complaint filed to the city in early September. Another one reads, "It is as if the main priority in Seattle now is to be sure that no one ever has to be farther than a half-block from a bike. Must we remain a nursery school for entitled children?"

Defenders of Seattle's parks—the city's natural treasures (Seattle spends more than three times as much on parks and recreation per capita as New York City or Washington, DC)—also take issue with the bikes. "I see bikes by all three companies parked haphazardly all over [Discovery] Park—blocking trails, crushing native plants, etc.," reads a complaint from December. "What is the city doing to ensure these companies and their customers follow city rules?"

In September, a LimeBike employee got into an altercation outside a downtown café when dropping off bikes, according to another complaint. When a worker from the café confronted the LimeBike employee about the bikes parked out front, the LimeBike operative told him that moving a bike was a felony and that he would be charged if he did. (The city told LimeBike that this was incorrect and "unacceptable" behavior.)

The city’s rules on where to park these bikes are strict and clear. Bikes should not be left on corners, driveways, or ramps, nor blocking building entrances, benches, bus stops, or fire hydrants, and they should always leave six feet clear for pedestrians on sidewalks. Companies have two hours to move bikes that are reported as being incorrectly parked. But while the smartphone apps communicate these restrictions to users, nothing prevents riders from leaving a bike wherever they want, nor are there currently any penalties for doing so.

Bikes started to pile up at popular places like the ferry dock and light rail stations, clogging up the walkways and obstructing commuters. They were also showing up in more worrying places. Bikes left in the road were getting hit by cars, and a mangled Spin bike was found near a train track, presumably damaged by a passing locomotive.

Mike Hemion, a diver, has been finding bikes in the water on a regular basis.

They were also getting tossed into lakes. "As soon as I saw the bikes on the roads, that same week we started seeing them in the water," says scuba instructor Mike Hemion, who teaches commercial divers in Seattle's bays and lakes. "Three out of four times when we dive downtown on the waterfront now, there's a bike in the water." In the early days of these water retrievals, workers were expected to fish them out themselves; LimeBike workers even cobbled together a makeshift grappling hook to snare them. Now the three bike share companies just call Hemion.

The three dockless bike startups call Hemion when they need their bikes retrieved. JENNY RIFFLE

McDonald thinks expanding the fleets, and especially the addition of electric bikes (LimeBike has already replaced nearly half of its manual bikes with ebikes), will only make matters worse. "I think there's going to be a bad, bad accident," he says. McDonald wants to tie approval of future dockless permits to a new rule banning all bikes from sidewalks, saying, "The stink I've made is going to get bigger."

FOR THE COMPANIES themselves, gaining significant ridership in Seattle has come at a cost. LimeBike and Spin have significantly less investor cash in their coffers than Ofo. But even the Chinese company's billions might not last forever in an all-out price war. Although all three dockless bike companies in Seattle officially charge at least $1 per ride, the average cost has probably been closer to zero. The companies give out free rides to new customers, and Ofo has only recently started charging riders at all. In Singapore, Ofo riders can even earn cryptocurrency with every ride.

LimeBike has been attempting to lure riders with a range of incentives it calls Bonus Bikes, which unlock free rides and (for a while) entered users into a drawing for a free iPhone X. Spin has shunned price cuts so far—and has probably suffered for it. The smallest of the three, Spin has deployed around 2,000 bikes in Seattle, only half as many as its competitors. "We're not here to ride a hype train," says CEO Derrick Ko. "We want to be a permanent fixture in the US."

To get there, it will need to attract many more riders. A Spin document, submitted for a new bike share scheme, says the company has a target of two rides per bike per day, "a level at which bikes are adequately available to riders and the system can operate profitably." However, city data shows that Seattle users only briefly reached that number, during warmer months. Its average over the pilot was 0.84 rides per bike per day—and the US average for dockless companies was a meagre 0.3 last year.
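As a point of reference, the utilization figure quoted above is simple arithmetic: total trips divided by fleet size and days in service. Here is a minimal sketch of that calculation; the trip and fleet numbers below are illustrative placeholders, not Spin's or the city's actual data:

# Sketch of the rides-per-bike-per-day metric cited in the Spin document
# and the city's pilot data. Inputs below are hypothetical placeholders.

def rides_per_bike_per_day(total_trips: int, fleet_size: int, days: int) -> float:
    """Average number of daily trips taken on each deployed bike."""
    return total_trips / (fleet_size * days)

# Example: a 2,000-bike fleet logging 50,400 trips in a 30-day month
print(rides_per_bike_per_day(50_400, 2_000, 30))  # 0.84, the pilot's average rate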

The tight competition has put a strain on the three companies' street teams, like the one run by Sean Healy. They start early in the morning and work well into the night, moving bikes around the city. According to an Ofo document, the company rebalances about 10 percent of its fleet—400 bikes—every day.

One day, while moving a new bike, Healy sliced his fingers on a metal edge. "It was razor sharp and cut right into my hand," he says. "There was no Band-Aid, no first aid kit, no procedure. I got fired up." He started to notice the other ways Ofo seemed to be cutting corners on safety. At that time the company had set a goal of deploying 310 bikes each day, which meant moving dozens of bikes at a time per vehicle. "We were working pretty hard," Healy says. "Lifting, extending your arms, lowering bikes off a truck 60 times. One guy injured his back doing that because we were doing so many. We weren't thinking about ergonomics, we weren't thinking about safety."

Ofo says that all the workers, even the ones like Healy who were recruited elsewhere, were actually employees of the homeless charity, which the company says was responsible for providing their protective equipment. The charity says that it is now providing gloves and other protective gear, although it remains Ofo’s responsibility to ensure the workplace itself is not hazardous.

The more Healy thought about it, the more his view of the company, especially its treatment of its disadvantaged workers, changed. What had seemed like a socially conscious move was looking more and more to him like exploitation. "There was a power dynamic here," says Healy. "This was a billion-dollar company and these guys off the streets couldn't get gloves?"

In December, he staged a walkout to protest the working conditions. Although none of his homeless coworkers joined in (for fear of losing their jobs, according to Healy), social media coverage of his action forced some changes. The employment charity conducted a safety audit of Ofo in January. It would not say whether it found any problems, only that it provided Ofo with recommendations on best safety practices. Ofo complied. Workers no longer ride in the back of vans with unsecured bikes, and the company now supplies personal protective equipment like gloves and offers safety workshops.

Healy, however, has gone back to driving for Uber.

THE DOCKLESS BIKE share companies faced their biggest showdown with the University of Washington. Yet that altercation also led to a possible solution that could help the bikes coexist peacefully on crowded city streets, in Seattle and beyond.

Tensions with the university’s Seattle campus first arose over Ofo workers using its bathrooms, so the company instructed its workers to stop relieving themselves on campus. Then, in November, the university banned Ofo altogether from dropping off bikes on its grounds. Early this year, the school revised its position and decided to start charging companies for the privilege of serving its students and staff. It recently put out a request for proposals for bike share schemes that came with strict contractual terms. They include offering a 50 percent discount on campus and allowing the university to impound any bike "at any time for any reason or for no reason."

A key requirement of the university, which all three companies agreed to, is to set up geo-fences to force bikes to be parked only in permitted areas. If a bike's GPS senses it is being locked outside an authorized area, the app will warn the user. Spin and LimeBike said they might impose a fine on riders who disobey, and all three said they would restrict or even ban habitual bad parkers. Good behavior, on the other hand, could be rewarded with free rides. "We are currently testing a carrots/sticks approach for smart parking, including bonuses for parking in a preferred location," wrote LimeBike in its response to the university. All the companies agreed to a one-hour response time for badly parked bikes, at the university's command.
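At its core, the geo-fencing requirement is a point-in-polygon check run at the moment a bike is locked. The sketch below shows one common way such a check can be implemented (ray casting); the coordinates, function names, and policy details are invented for illustration and do not describe any company's actual system.

# Minimal sketch of a geo-fence parking check: when a rider locks a bike,
# the app tests whether the bike's GPS fix falls inside any approved
# parking polygon. Coordinates and zones here are made up.

from typing import List, Tuple

Point = Tuple[float, float]   # (longitude, latitude)
Polygon = List[Point]         # vertices of an approved parking zone

def point_in_polygon(point: Point, polygon: Polygon) -> bool:
    """Ray-casting test: count edge crossings of a ray extending from the point."""
    x, y = point
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        if (yi > y) != (yj > y):
            x_cross = (xj - xi) * (y - yi) / (yj - yi) + xi
            if x < x_cross:
                inside = not inside
        j = i
    return inside

def parking_allowed(gps_fix: Point, approved_zones: List[Polygon]) -> bool:
    return any(point_in_polygon(gps_fix, zone) for zone in approved_zones)

# Hypothetical zone near the UW campus; a lock attempt outside it would
# trigger the in-app warning (and, potentially, a penalty for repeat offenders).
zone = [(-122.3100, 47.6530), (-122.3030, 47.6530),
        (-122.3030, 47.6580), (-122.3100, 47.6580)]
print(parking_allowed((-122.3065, 47.6555), [zone]))  # True: inside the zone
print(parking_allowed((-122.3200, 47.6555), [zone]))  # False: outside the zone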

They also agreed to pay the university for operating on campus, albeit at different rates. LimeBike suggested an annual $5 per bike fee to cover the university's costs for managing the parking infrastructure and other aspects of the scheme. Spin proposed handing over 10 percent of net profits; Ofo was more generous, offering 10 percent of its revenue from rentals on campus. The university says it is still negotiating the contracts but hopes to sign them soon.

The school’s negotiations with the three companies stand in contrast to the adversarial relationships that cities and new transportation networks have fallen into in recent years. If a city tries to impose restrictions on Uber and Lyft, for example, they threaten to leave, as has happened in Austin, Houston, Quebec, and Seattle. (In Austin, the companies followed through on their promise and left.) Banning new services, as San Francisco recently did with electric scooters, means losing any benefits they also bring.

Yet the willingness—even eagerness—of Seattle's three dockless bike sharing companies to enter into expensive and restrictive agreements with the university suggests that another way is possible. Instead of squeezing into gaps between the rules, these companies are entering into operational and financial partnerships with a jurisdiction, albeit a small one.

"It's a good sign that public entities are negotiating with private services that are using the public right of way," says Jennifer Dill, professor of Urban Studies and Planning at Portland State University. "Most cities did not do so with ride-hailing services. But it's a fast-changing landscape. If companies consolidate or the economic model shifts, so could the agreements."

Some people are always going to ride too fast on sidewalks, park in front of bus stops or throw bikes into lakes. Agreements holding companies accountable for the jerks who use their services help discourage those behaviors, while revenue sharing gives cities the resources to deal with other issues that emerge—and defend against accusations of selling off the public realm. Seattle has gathered the data showing that people want dockless bike sharing. Now, it also has evidence that companies are willing to bear responsibility for its drawbacks, putting the city in the perfect position to drive a harder bargain for its permanent program. If it works, it could provide a model for other cities around the world struggling to adapt to new transportation platforms.

But whether Seattle’s program gets extended, revised, or killed outright, Mike Hemion pictures a busy few months ahead. "In the summer, almost every other time we dive, we find a bike," he says. He doesn't seem to mind. "The bikes are pretty cool. I use them all the time myself."

Nuclear Power Won’t Survive Without A Government Handout

By Maggie Koerth-Baker

Nuclear power plants in the U.S. aren’t as profitable as they used to be. Without government support, plants are likely to close.

Once upon a time, if you were an American who didn’t like nuclear energy, you had to stage sit-ins and marches and chain yourself to various inanimate objects in hopes of closing the nation’s nuclear power plants. Today … all you have to do is sit back and wait.

There are 99 nuclear reactors producing electricity in the United States today. Collectively, they’re responsible for producing about 20 percent of the electricity we use each year. But those reactors are, to put it delicately, of a certain age. The average age of a nuclear power plant in this country is 38 years (compared with 24 years for a natural gas power plant). Some are shutting down. New ones aren’t being built. And the ones still operational can’t compete with other sources of power on price. Just last week, several outlets reported on a leaked memo detailing a proposed Trump administration plan that would direct electric utilities to buy more power from nuclear generators and coal plants in an effort to prop up the two struggling industries. The proposal is likely to butt up against political and legal opposition, even from within the electrical industry, in part because it would involve invoking Cold War-era emergency powers that constitute an unprecedented level of federal intervention in electricity markets. But without some type of public assistance, the nuclear industry is likely headed toward oblivion.

“Is [nuclear power] dying under its own weight? Yeah, probably,” said Granger Morgan, professor of engineering and public policy at Carnegie Mellon University.

Morgan isn’t pleased by this situation. He sees nuclear energy as a crucial part of our ability to reduce the risks of climate change because it is our single largest source of carbon emissions-free electricity. Morgan has researched what the U.S. could do to get nuclear energy back on track, but all he’s come up with is bad news (or good news, depending on your point of view).

The age of the nuclear fleet is partly to blame. That’s not because America’s nuclear reactors are falling apart — they’re regularly inspected, and almost all of them have now gone through the process of renewing their original 40-year operating licenses for 20 more years, said David McIntyre, a public affairs officer at the U.S. Nuclear Regulatory Commission. A few, including the Turkey Point Nuclear Generating Station in Florida, have even put in for a second round of renewals that could give them the ability to operate through their 80th birthdays.

Instead, it’s the cost of upkeep that’s prohibitive. Things do fall apart — especially things exposed to radiation on a daily basis. Maintenance and repair, upgrades and rejuvenation all take a lot of capital investment. And right now, that means spending lots of money on power plants that aren’t especially profitable. Historically, nuclear power plants were expensive to build but could produce electricity more cheaply than fossil fuels, making them a favored source of low-cost electricity. That changed with the fracking boom, Morgan told me. “Natural gas from fracking has gotten so cheap, [nuclear plants] aren’t as high up in the dispatch stack,” he said, referring to the order of resources utilities choose to buy electricity from. “So many of them are now not very attractive economically.”

Meanwhile, new nuclear power plants are looking even less fetching. Since 1996, only one plant has opened in the U.S. — Tennessee’s Watts Bar Unit 2 in 2016. At least 10 other reactor projects have been canceled in the past decade. Morgan and other researchers are studying the economic feasibility of investment in newer kinds of nuclear power plants — including different ways of designing the mechanical systems of a reactor and building reactors that are smaller and could be put together on an assembly line. Currently, reactors must be custom-built to each site. Their research showed that new designs are unlikely to be commercially viable in time to seriously address climate change. And in a new study that has not yet been published, they found that the domestic U.S. market for nuclear power isn’t robust enough to justify the investments necessary to build a modular reactor industry.

Combine age and economic misfortune, and you get shuttered power plants. Twelve nuclear reactors have closed in the past 22 years. Another dozen have formally announced plans to close by 2025. Those closures aren’t set in stone, however. While President Trump’s plan to tell utilities that they must buy nuclear power has received criticism as being an overreach of federal powers, states have offered subsidies to keep some nuclear power plants in business — and companies like Exelon, which owns 22 nuclear reactors across the country, have been happy to accept them. “Exelon informed us that they were going to close a couple plants in Illinois,” McIntyre said. “And then the legislature gave them subsidies and they said, ‘Never mind, we’ll stay open.’”

So intervention can work to keep nuclear afloat. But as long as natural gas is cheap, the industry can’t do without the handouts.

Maggie Koerth-Baker is a senior science writer for FiveThirtyEight.

Study of Treated Wastewater Detects Chemicals from Drug Production Facilities

Nationwide study shows how manufacturing facilities can increase levels of medical compounds in waterways.

Thursday, June 14, 2018 - 14:15

Anna Katrina Hunter, Contributor

(Inside Science) -- With many a daily flush of a toilet, a dilute cocktail of the chemical compounds we excrete, from hormones to antidepressants to caffeine to nicotine, is sluiced into the sewers of American life, eventually making its way into municipal wastewater treatment plants.

Most facilities typically process and remove the majority of sediment and organic solids in the waste stream before releasing the treated water into the ocean or waterways.

However, they don't remove everything. In the last 15 years, there have been numerous reports in the United States about pharmaceutical chemicals found in waterways and drinking water supplies. Wastewater treatment plants are a well-known source for the discharge of these chemicals into the environment.

Scientists from the U.S. Geological Survey recently sampled wastewater treatment plants in nine states and Puerto Rico, including 13 that receive waste from pharmaceutical manufacturers, to learn whether the drug manufacturers themselves are a major source of contamination in waterways.

Although there were sparse, early indications more than 30 years ago that manufacturers were a source of pharmaceuticals in waterways, it wasn’t until the 1990s, when estrogen found in liquid sewage was linked to the feminization of fish, that interest grew in understanding how pharmaceuticals reach the environment.

“We were seeing papers actually coming out of India, where they were seeing astronomically high concentrations [of pharmaceuticals],” said Dana Kolpin, a research hydrologist with the U.S. Geological Survey who has worked in water quality research for the past 30 years.

Kolpin said that there was initial skepticism that the same problem existed here in the United States, but their findings suggest otherwise. While levels were not quite as elevated as in India, the USGS study found "pretty hefty concentrations" of pharmaceutical chemicals coming from manufacturers, he said.

In a 2010 study in New York state, USGS scientists found that two municipal wastewater treatment plants receiving waste from pharmaceutical manufacturing facilities had concentrations of opioids in treated water up to 1,000 times higher than in the discharge from wastewater plants processing the waste stream coming from domestic households.

“We’ve gotten to the point where it’s not breaking news to find pharmaceuticals in this river or that stream or lake,” said Tia-Marie Scott, a physical scientist with the U.S. Geological Survey. They can even be found in pristine environments, she added.

Scientists recently embarked on a new, wider study to identify the sources of such pollution nationwide.

The USGS scientists tested the treated discharge coming from each of 20 wastewater treatment plants in nine states across the U.S. and Puerto Rico over a 24-hour period once a year from 2012 to 2014, for 200 pharmaceuticals, nonpharmaceuticals and other chemicals of concern.

Thirteen of the plants received wastewater from pharmaceutical manufacturers and domestic households, while six received water from domestic households only. The remaining plant initially qualified for the former category, but a nearby factory closed during the study.

The team selected their study sites based on nearby manufacturing of seven commonly used pharmaceuticals, including an antidepressant, an opioid, an antibiotic, an anticonvulsant, a synthetic estrogen, a breast cancer drug and an immunosuppressant.

Of those seven common pharmaceutical chemicals, only bupropion, the active ingredient in an antidepressant drug known as Wellbutrin, was found at high concentrations in the treated discharge coming from plants receiving waste from pharmaceutical manufacturers.

The concentrations of bupropion were as much as 452 times higher in the treated discharge from one of these wastewater plants as compared to even the highest concentration found in discharge from plants receiving only domestic wastewater.

This discharged water carried nearly 90 micrograms of bupropion per liter, which is roughly 35 times the anticipated therapeutic dose for fish such as rainbow trout. That much bupropion, even if subsequently diluted in the waterways, could impact the mating behavior and survival of fish populations.
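A quick back-of-the-envelope check, using only the two figures quoted above, shows what that ratio implies; these are derived numbers, not values reported in the study:

# Back-of-the-envelope arithmetic using only the figures quoted above.
measured_ug_per_liter = 90      # bupropion in the treated discharge
ratio_to_fish_dose = 35         # "roughly 35 times the anticipated therapeutic dose"

implied_fish_dose = measured_ug_per_liter / ratio_to_fish_dose
print(round(implied_fish_dose, 1))  # ~2.6 micrograms per liter
# Put differently, the discharge would need roughly 35-fold dilution before
# dropping below that anticipated therapeutic level for fish.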

The work was published online in Science of the Total Environment in April.

In studying less commonly used pharmaceuticals, the scientists found elevated levels of as many as 33 of the 200 compounds they tested in the treated water of plants receiving waste from drug manufacturers.

The researchers also found much higher concentrations of some pharmaceuticals in the discharge from plants receiving wastewater from drug manufacturers than from plants with no wastewater coming from these sources.

One such pharmaceutical, an antifungal medication called fluconazole, was present in the discharge of all 20 wastewater treatment plants. But the concentration was up to 3,000 times higher in the treated discharge from one wastewater treatment plant receiving waste from a nearby drug manufacturer.

The researchers suggest that when drug manufacturers produce large amounts of one particular drug, the wastewater discharge that eventually results can carry these dramatic concentrations.

“This is another interesting study from the USGS group,” said Bryan Brooks, an environmental scientist at Baylor University, in Waco, Texas, and an expert in water quality and toxicology. “The elevated levels as they note in their conclusions really call for additional research to determine what consequences might be presented to organisms living in waterways receiving these effluents.”

The notion that any one treatment could remove all pharmaceuticals from wastewater discharge is "pie in the sky," said Dana Kolpin. Such treatment would cost millions of dollars; instead, he said, scientists and managers should figure out which compounds are the most threatening and prioritize accordingly.

“All of these pharmaceuticals, and almost every emerging contaminant that we are looking at these days are not regulated,” said Tia-Marie Scott. “There are no thresholds for what is and what is not considered safe.”

More Information:

https://www.sciencedirect.com/science/article/pii/S0048969718313354?via%3Dihub

Here’s another climate change concern: Superheated bugs in the soil, belching carbon

By Stuart Leavenworth

sleavenworth@mcclatchydc.com

Updated August 01, 2018 10:36 AM

WASHINGTON

Rising temperatures that are contributing to wildfires and droughts are also changing the world’s soil so that it pumps out more carbon dioxide, a “feedback loop” that could aggravate climate change, according to a study published Wednesday in the journal Nature.

Increased heat is activating microbes in the soil, converting organic matter into carbon dioxide at a heightened rate, much like a compost heap that decomposes quickly when exposed to a lengthy heat wave.

Scientists say this feedback loop has implications for understanding the build-up of greenhouse gases in the atmosphere. In the past, many researchers assumed that increased carbon dioxide would trigger a boost in growth of forests and vegetation that would capture carbon and counteract impacts of more rapid soil decay.

This week’s study casts doubt on that theory.

“One thing we show here is not only a higher rate of respiration from the soil, but it is rising relative to the growth of vegetation,” said Ben Bond-Lamberty, the study’s lead author and a forest ecologist at the Pacific Northwest National Laboratory, a U.S. Department of Energy facility in Richland, Wash.

Numerous scientists say the atmospheric buildup of carbon dioxide, primarily from industrial emissions, is a prime driver of recent record-hot conditions that have contributed to deadly wildfires, from Northern California to Greece. Charting future impacts not only involves tracking man-made emissions but understanding how ecosystems absorb carbon and are affected by warming temperatures.

In the last year, fires have devastated neighborhoods in the Northern California wine country city of Santa Rosa, the Southern California beach city of Ventura and, now, the inland city of Redding. Hotter weather from changing climates is drying out vegetation, creating more intense fires that spread quickly from rural areas to city subdivisions, climate and fire experts say. But they also blame cities for expanding into previously undeveloped areas susceptible to fire.

Forests and grasslands are obvious reservoirs of carbon, but the planet’s soils are actually larger storehouses — unseen and out of mind. These soils include a mix of living roots and decaying layers of leaves, twigs and other matter, their decomposition affected by temperature, amounts of available oxygen and the work of bacteria, fungi and other microbes.

Scientists have long known that certain soils worldwide were releasing increasing amounts of carbon dioxide, but this week’s study is the first to synthesize all of that research and provide a global estimate of the increase. The Nature study — a product of the Joint Global Change Research Institute, a partnership between the University of Maryland and the Pacific Northwest laboratory — found that the conversion of soil carbon to carbon dioxide has increased 1.2 percent worldwide over a quarter century, from 1990 through 2014.

While 1.2 percent may not sound like a significant increase, soil respiration worldwide emits about five times more carbon dioxide than human activities, said Bond-Lamberty. So “a small percentage of a large number is still a very large number,” he said.
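That point can be made concrete with the two figures quoted above; the arithmetic below uses human emissions as an arbitrary unit, and only the ratios come from the article:

# "A small percentage of a large number is still a very large number":
# soil respiration is roughly five times human CO2 emissions, and it has
# risen about 1.2 percent over 1990-2014. Using human emissions as the unit:

human_emissions = 1.0                       # arbitrary unit
soil_respiration = 5.0 * human_emissions    # about five times larger
increase = 0.012 * soil_respiration         # the 1.2 percent rise

print(round(increase / human_emissions, 2))  # 0.06: the added flux equals ~6% of all human emissions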

The paper’s authors relied upon a database that drew from 1,500 soil respiration studies worldwide, as well as data from FLUXNET, an international network of sites that monitor the interactions of carbon from land to air. The research was funded by DOE’s Office of Science.

“This study asks the question on a global scale,” said Vanessa Bailey, a soil scientist at the Pacific Northwest laboratory who contributed to the research. “We are talking about a huge quantity of carbon.”

Not all that carbon is the same. Some of it is new, the product of recently fallen leaves and other vegetation. Some of it has built up over time, ancient and sequestered.

“Imagine a compost bin that you don’t turn over completely,” said Bond-Lamberty. “You end up with some bottom layers where decomposition doesn’t happen. Then, all of a sudden one summer, the temperature spikes and some of those long stored layers will get decomposed.”

Numerous scientists worldwide are studying climate change’s impact on soils, from the Arctic tundra to the loose layers of the tropics. At least one study has suggested that supercharged microbes are robbing nutrients from the soil, which could ultimately stunt the growth of vegetation. These studies contradict the claims of climate change doubters who say that industrial increases of CO2 are “greening the planet,” benefiting humankind with faster-growing crops and vegetation.

In a blog post last year, the Competitive Enterprise Institute, which has received funding from fossil fuel industries, celebrated a study that documented how plants have increased their uptake of carbon, because of increased CO2 emissions. “So-called carbon pollution has done much more to expand and invigorate the planet’s greenery than all the climate policies of all the world’s governments combined,” wrote a senior fellow for the think tank.

Bond-Lamberty and his co-authors acknowledge that uncertainties remain about carbon emissions from soil, including how emissions vary around the globe, as well as the shortage of data from the poles and the tropics. While their study documents soil emissions rising faster than what global vegetation is absorbing, that finding is likely to be scrutinized.

“Our result does not square well with a number of other studies suggesting the land is a really robust carbon sink,” he said. “So the question is: Are those studies overestimating the strength of the land carbon sink? Or are the studies we used not representative of what is happening globally? How do we reconcile that?”

Plastics Emit Greenhouse Gases as They Degrade

The materials are a previously unaccounted-for source of methane and ethylene, researchers find.

Aug 2, 2018

Shawna Williams

Need another reason to ditch straws? A study published yesterday (August 1) in PLOS ONE reports that plastics—ranging from construction materials to plastic bags—release the greenhouse gases methane and ethylene after being exposed to sunlight and beginning to degrade.

“Our results show that plastics represent a heretofore unrecognized source of climate-relevant trace gases that are expected to increase as more plastic is produced and accumulated in the environment,” the study authors write in their paper.

The researchers, all based at the University of Hawai‘i at Manoa, tested the emissions of seven types of plastics as they degraded: polyethylene terephthalate, polycarbonate, high-density polyethylene and low-density polyethylene (LDPE), acrylic, polypropylene, and polystyrene. All gave off methane and ethylene in the days after being exposed to sunlight, they found, but polyethylene, which is used to make plastic bags, was the worst offender.

Jennifer Provencher, a plastic pollution researcher at Acadia University in Canada who was not involved in the work, tells Reuters that the results pointed to “another piece of evidence suggesting that losing plastic to the environment is not good.”

The authors note that plastics degradation has not been accounted for in climate change research. “Based on the rates measured in this study and the amount of plastic produced worldwide CH4 [methane] production by plastics is likely to be an insignificant component of the global CH4 budget,” they write. “However, for the other hydrocarbon gases with much lower global emissions to the atmosphere compared to CH4, the production from the plastics might have more environmental and global relevance.”

Full report: http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0200574

Heat Wave Reveals German Mines, Grenades and Live Explosives Submerged Since World War II

By David Brennan On 8/3/18 at 5:13 AM

As Europe wilts under its most intense heat wave in years, residents in eastern Germany are facing an additional hazard—World War Two explosives uncovered by retreating water.

German police have warned people living in the states of Saxony-Anhalt and Saxony that low water levels in the Elbe River are uncovering deadly treasures in multiple locations, Deutsche Welle reported.

So far, 22 grenades, mines or other types of explosives have been found this year. Grit Merker, a spokeswoman for Saxony-Anhalt police, said the authorities “ascribe that to the low water level. That's pretty clear.”

This week, the water level of the Elbe fell as low as 20 inches in Magdeburg, just above the historic low of 18.8 inches recorded in 1934. July has been Germany’s hottest month since records began, and July 31 was the warmest day on record, with temperatures hitting 103 degrees Fahrenheit in Saxony-Anhalt.

When explosives are found, Merker said, people are always warned to report the materials and to stay well away from them. Explosives specialists can then either transport the deadly finds or—if they are too unstable—detonate them in place. Though 70 years on the riverbed has surrounded the bombs with a protective layer of sediment, police still urge caution first and foremost.

But some people apparently do not quite understand the danger of military-grade explosives. Saxony police spokeswoman Wibke Sperling told Deutsche Welle, “Today there was a photo in a newspaper of someone holding up pieces in their hands. That is a classic example of what gives weapons disposal experts a fright.”

Unexploded ordnance is common in Germany, where some of the most brutal fighting of World War Two took place. Remnants of the most destructive war in human history still litter the country. From the air alone, British and American air forces dropped 2.7 million tons of bombs on Germany between 1940 and 1945, and that is before ground combat on two fronts is taken into account.

The Elbe stretches from the Czech Republic all the way to the North Sea coast, traversing battlefields where millions fought. Like the rest of Germany, it carries mementoes of the country’s tragic history, and many munitions were dumped into the river at the close of the war. On Saturday, for example, two anti-tank mines found in the Elbe were blown up as they were considered too dangerous to move.

While urging caution, Merker said the discoveries were not a surprise. “I don't think people from the weapons disposal service find it a glaring anomaly,” she said.

Around 5,500 unexploded munitions are discovered in Germany each year, with a daily average of 15. Most are bombs dropped by aircraft. It is estimated there are around 100,000 unexploded bombs buried around Germany, each with deadly potential. Police often have to evacuate large numbers of people while uncovered bombs are cleared. The largest post-war evacuation took place in Frankfurt in 2017, when 70,000 people had to leave their homes to allow authorities to defuse a 1.4 ton British bomb.

Green Energy Producers Just Installed Their First Trillion Watts

The next trillion watts will arrive by 2023 and cost about $1.2 trillion, almost half the price tag of the first, according to Bloomberg New Energy Finance

By Jeremy Hodges

August 2, 2018

Global wind and solar developers took 40 years to install their first trillion watts of power generation capacity, and the next trillion may be finished within the next five years.

That’s the conclusion of research by BloombergNEF, which estimated the industry reached the 1-terawatt milestone sometime in the first half of the year. That’s almost as much generation capacity as the entire U.S. power fleet, although renewables work less often than traditional coal and nuclear plants and therefore yield less electricity over time.

The findings illustrate the scale of the green energy boom, which has drawn $2.3 trillion of investment to deploy wind and solar farms at the scale operating today. BloombergNEF estimates that the falling costs of those technologies mean the next terawatt of capacity will cost about half as much – $1.23 trillion – and arrive sometime in 2023.

"Hitting one terrawatt is a tremendous achievement for the wind and solar industries, but as far as we’re concerned, it’s just the start,"said Albert Cheung, BloombergNEF’s head of analysis in London. "Wind and solar are winning the battle for cost-supremacy, so this milestone will be just the first of many.’’

The world had a total of about 6.2 terawatts of installed capacity in 2016, about 1 terawatt of that being coal plants in China, according to the research group. Like all milestones, reaching 1 terawatt is an arbitrary mark that scratches the surface of the debate about how much renewables will contribute to the world's energy system.

Each power plant works at a different "capacity factor," a measure capturing both the efficiency of the facility in generating electricity and how often it works. On average, wind farms have a capacity factor of about 34 percent worldwide, meaning they work about a third of the time, according to BloombergNEF. Some of the best sites have factors above 60 percent. For solar photovoltaics that track the sun, those readings range from 10 percent in the U.K. to 19 percent in the U.S. and 24 percent in Chile's Atacama desert. By comparison, coal plants have a 40 percent capacity factor and nuclear sometimes double that.
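To see what those capacity factors mean in delivered energy, multiply capacity by the capacity factor and the hours in a year. A minimal sketch using the percentages quoted above (the 15 percent solar figure is simply a midpoint of the 10-24 percent range given):

# Rough illustration: annual generation = capacity x capacity factor x hours per year.
HOURS_PER_YEAR = 8760

def annual_twh(capacity_tw: float, capacity_factor: float) -> float:
    """Approximate annual generation, in terawatt-hours."""
    return capacity_tw * capacity_factor * HOURS_PER_YEAR

print(annual_twh(1.0, 0.34))  # ~2,978 TWh from 1 TW of wind (34% average factor)
print(annual_twh(1.0, 0.15))  # ~1,314 TWh from 1 TW of solar PV (midpoint of 10-24%)
print(annual_twh(1.0, 0.40))  # ~3,504 TWh from 1 TW of coal plants (40% factor)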

Even so, the terawatt of installed capacity for renewables marks substantial growth for an industry that barely existed at the start of the century. More than 90 percent of all that capacity was installed in the past 10 years, reflecting incentives that Germany pioneered in the early 2000s that made payouts for green power transparent for investors and bankers alike.

Asian nations absorbed 44 percent of the new wind and 58 percent of solar developments to date, with China accounting for about a third of all those installations.

Wind made up 54 percent of the first terawatt, but solar is expected to overtake wind in early 2020. China has led the world in installing solar power over the last five years, holding 34 percent of global solar capacity, and it’ll continue to be the world’s largest market for both power sources, reaching 1.1 terawatts in the country by 2050.

"As we get into the second and third terawatts, energy storage is going to become much more important," Cheung said. "That’s where we see a lot of investment and innovation right now."

A big analysis of environmental data strengthens the case for plant-based diets

The same foods grown in various ways can also have less impact on the planet, scientists say

BY SUSAN MILIUS 7:00AM, JUNE 6, 2018

FOOD CHOICE MATTERS A new study calculates the bonus for the planet of choosing more foods from plants.

From beef to beer, coffee to chocolate, there are environmental costs in what humanity chooses to eat and drink. Now a new study that quantifies the impact on the planet of producing and selling 40 different foods shows how these choices make a difference.

Agricultural data from 38,700 farms plus details of processing and retailing in 119 countries show wide differences in environmental impacts — from greenhouse gas emissions to water used — even between producers of the same product, says environmental scientist Joseph Poore of the University of Oxford. The amount of climate-warming gases released in the making of a pint of beer, for example, can more than double under high-impact production scenarios. For dairy and beef cattle combined, high-impact providers released about 12 times as many greenhouse gases as low-impact producers, Poore and colleague Thomas Nemecek report in the June 1 Science.

Those disparities mean that there is room for high-impact producers to tread more lightly, Poore says. If consumers could track such differences, he argues, purchasing power could push for change.

The greatest changes in the effect of a person’s diet on the planet, however, would still come from choosing certain kinds of food over others. On average, producing 100 grams of protein from beef leads to the release of 50 kilograms of greenhouse gas emissions, which the researchers calculated as a carbon-dioxide equivalent. By comparison, 100 grams of protein from cheese releases 11 kilograms in production, from poultry 5.7 kilograms, and from tofu 2 kilograms.
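Those per-protein averages make it easy to estimate what a single substitution is worth; the sketch below does that arithmetic (the serving counts are illustrative and not part of the study):

# Average greenhouse-gas cost of producing 100 g of protein, in kg CO2-equivalent,
# using the figures quoted above from Poore and Nemecek.
KG_CO2E_PER_100G_PROTEIN = {"beef": 50.0, "cheese": 11.0, "poultry": 5.7, "tofu": 2.0}

def annual_savings_kg(servings_per_week: float, swap_from: str, swap_to: str) -> float:
    """CO2e avoided per year by swapping a 100 g protein serving, once per week."""
    per_serving = KG_CO2E_PER_100G_PROTEIN[swap_from] - KG_CO2E_PER_100G_PROTEIN[swap_to]
    return per_serving * servings_per_week * 52

# Illustrative: one beef-to-tofu swap per week avoids roughly 2,500 kg CO2e a year
print(round(annual_savings_kg(1, "beef", "tofu")))  # 2496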

Proteins’ consequences

Proteins are not equal in the amount of climate-warming gases, classified as CO2 equivalent, emitted during the farm-to-retail production chain. Wide variation exists even for a single type of protein depending on the producer, a new study finds.

Replacing meat and dairy foods from producers with above-average environmental effects with plant-based products could make a notable difference in greenhouse gas emissions. If cuts came from these higher-impact suppliers, replacing half of each kind of animal product with something from a plant could reduce food’s share of emissions by 35 percent. That’s not too far from the 49 percent drop that could be achieved if the whole world, somehow, went vegan.

The case for switching to a plant-based diet was already pretty powerful, says Ron Milo of the Weizmann Institute of Science in Rehovot, Israel, who studies cell metabolism and environmental sustainability. The new data “make it even stronger, which is an important thing given how strongly we tend to adhere to our food choices,” he says.

Production matters

Depending on the food, greenhouse gas emissions can differ considerably between low-impact producers (at the far left end of the bars) and high-impact ones (at the far right) for a variety of foods, including dark chocolate and rice.

In their study, Poore and Nemecek, of the Swiss government research organization Agroscope in Zurich, also considered the amounts of water and land used as well as nutrient runoff and air pollution created from food production. For such an unusually broad analysis, the researchers crunched the numbers from 570 studies of what are called life-cycle assessments for 40 kinds of food. These studies calculated the environmental impacts of the whole process from growing or processing to transporting and retailing each food.

Producing food overall accounts for 26 percent of global climate-warming emissions, and takes up about 43 percent of the land that’s not desert or covered in ice, the researchers found. Out of the total carbon footprint from food, 57 percent comes from field agriculture, livestock and farmed fish. Clearing land for agriculture accounts for 24 percent and transporting food accounts for another 6 percent.

After the first year of putting the study together, Poore himself gave up eating animal products.

CO2 Levels Break Another Record, Exceeding 411 Parts Per Million

Two years of CO2 measurements at the Mauna Loa Observatory, showing how seasonal highs and lows are steadily rising.

Levels of carbon dioxide in the atmosphere exceeded 411 parts per million (ppm) in May, the highest monthly average ever recorded at the Mauna Loa Observatory in Hawaii, home to the world’s longest continuous CO2 record. In addition, scientists found that the rate of CO2 increase is accelerating, from an average 1.6 ppm per year in the 1980s and 1.5 ppm per year in the 1990s to 2.2 ppm per year during the last decade.
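For a sense of what that acceleration means, a simple linear extrapolation from the quoted numbers is easy to run; this is only arithmetic on the current rate, not a projection by Scripps or NOAA, and the 450 ppm threshold below is an arbitrary illustrative target:

# Linear extrapolation from the figures above: 411 ppm in May 2018,
# rising about 2.2 ppm per year over the last decade. Illustrative only.
current_ppm = 411.0
growth_ppm_per_year = 2.2

def years_until(target_ppm: float) -> float:
    return (target_ppm - current_ppm) / growth_ppm_per_year

print(round(years_until(450.0), 1))  # ~17.7 years to reach 450 ppm at the current rate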

“Many of us had hoped to see the rise of CO2 slowing by now, but sadly that isn’t the case,” said Ralph Keeling, director of the University of California San Diego’s Scripps CO2 Program, which maintains the Mauna Loa record with the National Oceanic and Atmospheric Administration. “It could still happen in the next decade or so if renewables replace enough fossil fuels.”

Annual CO2 concentrations ebb and flow depending on the season. The lowest levels are generally recorded in late August or early September, when vegetation growth in the Northern Hemisphere is at its peak. The highest concentrations are generally measured in May, following winter months with little or no plant growth and just before the springtime boost in productivity.

From 2016 to 2017, the global CO2 average increased by 2.3 ppm — the sixth consecutive year-over-year increase greater than 2 ppm, according to Scripps researchers. Prior to 2012, back-to-back increases of 2 ppm or greater had occurred only twice.

“CO2 levels are continuing to grow at an all-time record rate because emissions from coal, oil, and natural gas are also at record high levels,” Pieter Tans, lead scientist of NOAA’s Global Greenhouse Gas Reference Network, said in a statement. “Today’s emissions will still be trapping heat in the atmosphere thousands of years from now.”

A 1,000-year flood in Maryland shows the big problem with so much asphalt

By Greta Jochem on Jun 5, 2018

The rain started to fall in Ellicott City, Maryland on the afternoon of May 27. Nearby tributaries of the Patapsco River were already dangerously swollen from last month’s steady precipitation. The storm intensified, and floodwaters soon tore through Ellicott City’s main street, submerging the first floors of buildings, sweeping away cars, and killing at least one person.

The storm was a so-called “1,000-year flood,” meaning it had a 0.1 percent chance of occurring in any given year. But this “exceptionally rare” event is déjà vu for residents — they’re still picking up the pieces from a similar flood that destroyed the area back in July 2016.

After that big flood, Robin Holliday spent months rebuilding her business, HorseSpirit Arts Gallery. She didn’t expect a flood like that to happen again, but she also didn’t think the proposed watershed management plan was strong enough. Discouraged, she started to think about leaving. The recent flood solidified her decision.

So what’s behind the propensity for floods in Ellicott City? Part of the problem is its vulnerable location: the town lies at the foot of a hill where river branches meet the Patapsco River. And, of course, climate change makes storms wetter and increases the frequency of severe, record-breaking weather. But there’s another thing people are pointing out: concrete.

When hard, impermeable concrete replaces absorbent green spaces, it’s much easier for floodwaters to overwhelm stormwater drainage. “That’s what happened in Ellicott City,” says Marccus Hendricks, an assistant professor at the University of Maryland School of Architecture, Planning, and Preservation.

In Ellicott City, development has flourished.

“Nearly one-third of the Tiber-Hudson sub-watershed that feeds into historic Ellicott City is now covered by roads, rooftops, sidewalks and other hard surfaces that don’t absorb water,” the Baltimore Sun wrote in 2016.

In a press release, the Sierra Club’s Maryland Chapter called for a stop to development in the Tiber-Hudson watershed: “We may not have control over severe weather events (except by fighting climate change), [but] we can take ownership over the role that development played in this disaster.”

At a recent press conference, a local county official said that Howard County, home to Ellicott City, has been taking steps to prepare for more floods.

“We’re focusing on making sure that what has been approved is being done by the code and by law, making sure that stormwater regulations are being abided by,” said Allan Kittleman, the Howard County executive. Since the flood in 2016, he said the county has designed and engineered more stormwater retention facilities, but larger projects will take time.

This is far from the first time that development and asphalt have had a violent run-in with climate change. Last summer, Hurricane Harvey drenched sprawling Houston with trillions of gallons of water and caused $125 billion in damage. The area saw a 25 percent increase in paved surfaces between 1996 and 2011, according to Texas A&M professor Samuel Brody. Brody found that every square meter of Houston’s pavement cost about $4,000 more in flood damage.

And, rapidly developing or not, our cities are full of these paved surfaces. In the majority of the country, surfaces like pavement or brick make up just 1 percent of the land. Yet in cities, hardscapes account for upwards of 40 percent of land area.

Environmental change coupled with development will likely make this issue one of major national importance, Brody tells Grist.

“Every week, there’s some urbanized area that floods. We look up and say, ‘Oh that’s never happened before and it’s never going to happen again.’ But if you look at the big picture, it’s happening all the time with increasing severity.”

Oil trains on the rebound to Northeast refineries, federal data show

Curtis Tate, Staff Writer, @tatecurtis Published 6:18 a.m. ET June 5, 2018

After a steady decline throughout 2017, the volume of crude oil moving on trains to Northeast refineries is on the rise, federal data show.

After peaking in 2014 and 2015, rail shipments of crude oil from Midwestern sources to the Northeast slowed to a trickle late last summer. Since then, numbers from the U.S. Energy Information Administration show a rebound.

Northeast refiners consumed 3.1 million barrels of Midwestern oil shipped by rail in March, according to EIA data, the most since January 2017, and about 10 times what they received in August and September.

The most recent numbers are still well below the peak of 13.8 million barrels shipped to the Northeast by rail in November 2014. Each barrel contains 42 gallons. An entire unit train of crude oil can carry about 3 million gallons.
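Those unit figures can be combined directly; a quick worked conversion using only the numbers in this article:

# Unit check: 3.1 million barrels in March, 42 gallons per barrel,
# and roughly 3 million gallons per unit train of crude oil.
barrels = 3_100_000
gallons_per_barrel = 42
gallons_per_unit_train = 3_000_000

total_gallons = barrels * gallons_per_barrel
print(total_gallons)                                  # 130,200,000 gallons
print(round(total_gallons / gallons_per_unit_train))  # ~43 unit trains' worth that month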

Northeast refineries also continue to receive crude oil by rail from Canada, EIA data show. They consumed 881,000 barrels of Canadian oil in March, after receiving nearly none in June and July of 2017.

Crude by rail shipments declined after 2015 when benchmark oil prices plunged from more than $100 a barrel to below $30. In recent weeks, the price has been hovering around $70 a barrel.

Northeast refiners had turned away from North American oil, importing crude from the North Sea and West Africa instead.

The construction of new pipelines in the Midwest caused a precipitous drop in oil shipments by rail to the Gulf Coast, home to about half of U.S. refining capacity.

A series of fiery derailments from 2013 to 2016 across North America prompted a major overhaul in rail safety on both sides of the northern border.

Next month will mark the fifth anniversary of the massive oil train derailment and fire in Lac-Megantic, Quebec, that killed 47 people and leveled the lakeside town's center.

The crashes resulted in stronger tank cars, more training for emergency responders, improved inspections of track and equipment and new testing requirements for the oil itself.

Still, railroads have fought against greater transparency for shipments of flammable liquids in several states.

Last summer, then-Gov. Chris Christie vetoed legislation that would have imposed greater disclosure requirements on railroads shipping crude oil through New Jersey after a lobbying effort from the state's biggest freight haulers.

One CSX executive who lobbied Christie's office in opposition to the bill, Howard "Skip" Elliott, became chief of the U.S. Pipeline and Hazardous Materials Safety Administration, the agency in Washington that regulates the shipment of crude oil by rail.

Schumer demands tougher safety rules for oil-by-train shipments in NY

Updated 1:47 PM; Posted 1:47 PM

By Rick Moriarty rmoriarty@syracuse.com,

Syracuse, N.Y. -- U.S. Sen. Charles Schumer is renewing his calls for greater safety standards for the shipment of highly explosive crude oil through New York by rail, accusing the Trump administration of dragging its feet on new safety rules.

Schumer, D-NY, demanded in a letter to the U.S. Department of Transportation and the Department of Energy this week that they quickly finalize new federal standards for the shipment of crude oil by train.

He said current law allows dangerous Bakken crude oil from North Dakota to be shipped by rail to refineries in the Northeast without first being stabilized, making explosions more likely.

"Oil tank cars go through our state with regular frequency, and they pass through, because of the way the tracks were laid out 100 years ago, some of our major population centers - Buffalo, Rochester, Syracuse, Utica and Albany, and even Plattsburgh and east of Glens Falls," he said in a teleconference with reporters. "And then they pass down through the Hudson Valley, down into Rockland County before they head to New Jersey."

The amount of oil shipped by train to refineries in the Northeast is increasing after slowing last year. Northeast refineries processed 3.1 million barrels of Midwestern oil shipped by rail in March, the most since January 2017, according to the U.S. Energy Information Administration.

The safety of oil-by-rail shipments became an issue when a train carrying 77 tank cars full of Bakken crude oil from North Dakota derailed and exploded in the town of Lac-Megantic in Quebec province, killing 47 people and destroying 30 buildings in July 2013.

Though no such disaster has occurred in New York, "it will sooner or later if we don't do something," Schumer said.

"The bottom line is, any time you are transporting volatile chemicals, there is a risk of explosion," he said. "Things like safer tank cars, better braking, lower speed limits - they all help make the rails safer."

The Department of Transportation and the Department of Energy are studying how crude oil properties affect its combustibility in rail accidents, information that could be used to set new safety rules. However, Schumer said the study is taking too long to complete.

"I am calling on the DOT today to stop dragging their feet and immediately release the necessary study," he said. "The new administration, particularly with some of the people in charge, has been far less open to having the government step in and do some of this."

He said light Bakken crude from North Dakota is more flammable than heavier crude oil but can be made safer through a heating process that forces out some of its more volatile gases before shipment.

"Judge Throws Out New York Climate Lawsuit"

"A federal judge has dismissed a suit brought by the city that would have forced fossil fuel companies to pay for some costs of climate change."

"A federal judge has rejected New York City’s lawsuit to make fossil fuel companies help pay the costs of dealing with climate change.

Judge John F. Keenan of United States District Court for the Southern District of New York wrote that climate change must be addressed by the executive branch and Congress, not by the courts.

While climate change “is a fact of life,” Judge Keenan wrote, “the serious problems caused thereby are not for the judiciary to ameliorate. Global warming and solutions thereto must be addressed by the two other branches of government.”"

California hit its climate goal early — but its largest source of pollution keeps rising

By TONY BARBOZA and JULIAN H. LANGE

JUL 22, 2018 | 3:00 AM

California hit its target to reduce greenhouse gas emissions below 1990 levels four years early, a milestone regulators and environmentalists are cheering as more proof that you can cut pollution while growing the economy.

But a closer look at data released by the state Air Resources Board shows California’s planet-warming emissions aren’t declining across the board.

While emissions from electricity generation have plunged, transportation pollution is rising, and other key industries are flat.

That uneven progress shows the big challenges that loom as California advances toward its more ambitious goal: slashing greenhouse gas emissions another 40% by 2030.

Growth in renewable energy was the main reason California met its 2020 climate goal in 2016, the emissions report released this month shows.

“We’ve seen a substantial increase in solar and wind power, particularly rooftop solar installations,” says Dave Edwards, chief of the Air Resources Board’s greenhouse gas and toxics emissions Inventory branch.

Driving the shift is early compliance with the state’s mandate that 33% of electricity come from renewable sources by 2020 and the falling cost of solar panels, which has spurred more commercial and rooftop installation. By 2016, the state was already at 46% renewable electricity. Solar electricity grew 33% in 2016, while natural gas decreased by more than 15%.

California also got an assist from the weather. After five years of punishing drought, rains swelled rivers and generated more hydroelectric power. During drier years, the state relied more on natural gas.

Overall, the growth in renewables combined with waning imports of coal power to send emissions from electricity generation plunging 18% in 2016 compared to 2015.

Now for the downside.

Emissions from cars and trucks, already California’s biggest source of greenhouse gases, have been on the rise for the past few years in step with post-recession economic growth. Increased driving is the main reason why transportation pollution ticked up another 2% in 2016.

“The deep reductions from electric power generation are compensating for lackluster performance in other sectors of the economy, including an uptick in the transportation sector where we know we have our work cut out for us,” said Alex Jackson, senior attorney at the Natural Resources Defense Council who tracks California climate policy.

To blame for the increase in vehicle pollution is a combination of low gas prices, a growing economy, consumers’ preference for roomier, less efficient vehicles and a slower-than-anticipated transition to electric models. Those factors are essentially wiping out gains from the state’s emissions-cutting regulations.

Other key sectors of the economy, such as oil refineries, residential heating and agriculture, saw greenhouse gas emissions remain relatively flat or even rise slightly in 2016.

State regulators say that in itself is a triumph. Pollution from transportation and industry, they contend, would have been much higher without California’s climate change policies, which functioned as a lid keeping emissions in check even as its economy grew.

Air Resources Board officials played down the significance of rising car and truck pollution. Instead they are crediting measures such as cap-and-trade and the low-carbon fuel standard — market-based programs the state is using to push industry to cut pollution and shift to cleaner transportation fuels — with preventing emissions from rising even higher.

“All the indicators we’re looking at are moving in the right direction,” Edwards said. “We think that we’re on the right trajectory right now toward 2030.”

For now, California’s reductions in greenhouse gas emissions remain modest, and are broadly consistent with a nationwide decrease in recent years. The trend across the U.S. is the result of an economic shift: We’re getting less electricity from dirty, coal-fired power plants and more from cheaper, lower-polluting natural gas.

Yet while California’s greenhouse emissions dipped below 1990 levels in 2016, the national rate remained 2.4% above 1990 levels, according to the Environmental Protection Agency.

And though California’s electricity grid is powered increasingly by renewables, it was cleaner than the nation’s to begin with. California’s per-capita greenhouse gas emissions today are about half that of the nation as a whole. And they keep dropping, from a peak of 14 metric tons per person in 2001 to 10.8 in 2016.

That was viewed as an obstacle when California adopted its landmark 2006 climate law, AB 32, which enshrined the goal of cutting greenhouse gases below 1990 levels by the year 2020. At the time, industry and other critics argued that California’s relatively clean power grid would make its climate goals painful and prohibitively costly to reach. That fear, regulators and environmentalists say, has now been proven wrong.

To reach its tougher 2030 goal, however, California will have to pick up the pace and roughly double its greenhouse gas reductions. That feat, environmentalists say, will require not only a cleaner electrical grid but a rapid shift to zero-emission vehicles powered by it.

“Looking at the electricity sector 10 years ago, solar in that time went from exotic to conventional,” says Jimmy O’Dea, senior vehicles analyst for the Union of Concerned Scientists. “And that’s where we’re at with electric vehicles going forward. They’re going to go from exotic to conventional in a short period of time.”

Tony Barboza is a reporter who covers air quality and the environment with a focus on Southern California. He has been on staff at the Los Angeles Times since 2006, is a graduate of Pomona College and completed a Ted Scripps Fellowship in Environmental Journalism at the University of Colorado.

Earth's resources consumed in ever greater destructive volumes

Study says the date by which we consume a year’s worth of resources is arriving faster

Jonathan Watts

Sun 22 Jul 2018 19.01 EDT

Humanity is devouring our planet’s resources in increasingly destructive volumes, according to a new study that reveals we have consumed a year’s worth of carbon, food, water, fibre, land and timber in a record 212 days.

As a result, the Earth Overshoot Day – which marks the point at which consumption exceeds the capacity of nature to regenerate – has moved forward two days to 1 August, the earliest date ever recorded.

To maintain our current appetite for resources, we would need the equivalent of 1.7 Earths, according to Global Footprint Network, an international research organisation that makes an annual assessment of how far humankind is falling into ecological debt.
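
As a rough illustration of how that 1.7-Earths figure maps onto an overshoot date, the back-of-envelope sketch below simply divides a 365-day budget by the footprint-to-biocapacity ratio quoted above. It is not Global Footprint Network's actual methodology, which uses detailed national footprint accounts; the ratio and start date are the only inputs.

from datetime import date, timedelta

# Back-of-envelope sketch, not Global Footprint Network's methodology:
# if humanity uses resources at 1.7 times the rate Earth regenerates them,
# the year's ecological "budget" runs out after roughly 365 / 1.7 days.
footprint_to_biocapacity = 1.7  # "the equivalent of 1.7 Earths", per the article
days_in_budget = 365 / footprint_to_biocapacity
overshoot_day = date(2018, 1, 1) + timedelta(days=round(days_in_budget) - 1)

print(f"Budget lasts about {days_in_budget:.0f} days")    # ~215 days
print(f"Estimated Overshoot Day: {overshoot_day:%d %B}")  # early August

The simple ratio lands within a couple of days of the 1 August date reported above.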

The overshoot began in the 1970s, when rising populations and increasing average demands pushed consumption beyond a sustainable level. Since then, the day at which humanity has busted its annual planetary budget has moved forward.

Thirty years ago, the overshoot was on 15 October. Twenty years ago, 30 September. Ten years ago, 15 August. There was a brief slowdown, but the pace has picked back up in the past two years. On current trends, next year could mark the first time the planet’s budget is busted in July.

While ever greater food production, mineral extraction, forest clearance and fossil-fuel burning bring short-term (and unequally distributed) lifestyle gains, the long-term consequences are increasingly apparent in terms of soil erosion, water shortages and climate disruption.

The day of reckoning is moving nearer, according to Mathis Wackernagel, chief executive and co-founder of Global Footprint Network.

Replacing 50% of meat consumption with a vegetarian diet would push back the overshoot date by five days. Photograph: Scott Olson/Getty

“Our current economies are running a Ponzi scheme with our planet,” he said. “We are borrowing the Earth’s future resources to operate our economies in the present. Like any Ponzi scheme, this works for some time. But as nations, companies, or households dig themselves deeper and deeper into debt, they eventually fall apart.”

The situation is reversible. Research by the group indicates political action is far more effective than individual choices. It notes, for example, that replacing 50% of meat consumption with a vegetarian diet would push back the overshoot date by five days. Efficiency improvements in building and industry could make a difference of three weeks, and a 50% reduction of the carbon component of the footprint would give an extra three months of breathing space.

In the past, economic slowdowns – which tend to reduce energy consumption – have also shifted the ecological budget in a positive direction. The 2007-08 financial crisis saw the date push back by five days. Recessions in the 90s and 80s also lifted some of the pressure, as did the oil shock of the mid 1970s.


But the overall trend is of costs increasingly being paid by planetary support systems.

Separate scientific studies over the past year have revealed that a third of land is now acutely degraded, while tropical forests have become a source rather than a sink of carbon. Scientists have also raised the alarm about increasingly erratic weather, particularly in the Arctic, and worrying declines in populations of bees and other insect pollinators, which are essential for crops.

The $3 Billion Plan to Turn Hoover Dam Into a Giant Battery

By Ivan Penn

July 24, 2018

Hoover Dam helped transform the American West, harnessing the force of the Colorado River — along with millions of cubic feet of concrete and tens of millions of pounds of steel — to power millions of homes and businesses. It was one of the great engineering feats of the 20th century.

Now it is the focus of a distinctly 21st-century challenge: turning the dam into a vast reservoir of excess electricity, fed by the solar farms and wind turbines that represent the power sources of the future.

The Los Angeles Department of Water and Power, an original operator of the dam when it was erected in the 1930s, wants to equip it with a $3 billion pipeline and a pump station powered by solar and wind energy. The pump station, downstream, would help regulate the water flow through the dam’s generators, sending water back to the top to help manage electricity at times of peak demand.

The net result would be a kind of energy storage — performing much the same function as the giant lithium-ion batteries being developed to absorb and release power.


The Hoover Dam project may help answer a looming question for the energy industry: how to come up with affordable and efficient power storage, which is seen as the key to transforming the industry and helping curb carbon emissions.

Because the sun does not always shine, and winds can be inconsistent, power companies look for ways to bank the electricity generated from those sources for use when their output slacks off. Otherwise, they have to fire up fossil-fuel plants to meet periods of high demand.

And when solar and wind farms produce more electricity than consumers need, California utilities have had to find ways to get rid of it — including giving it away to other states — or risk overloading the electric grid and causing blackouts.

“I think we have to look at this as a once-in-a-century moment,” said Mayor Eric M. Garcetti of Los Angeles. “So far, it looks really possible. It looks sustainable, and it looks clean.”

The target for completion is 2028, and some say the effort could inspire similar innovations at other dams. Enhancing energy storage could also affect plans for billions of dollars in wind projects being proposed by the billionaires Warren E. Buffett and Philip F. Anschutz.

But the proposal will have to contend with political hurdles, including environmental concerns and the interests of those who use the river for drinking, recreation and services.

In Bullhead City, Ariz., and Laughlin, Nev. — sister cities on opposite sides of the Colorado, about 90 miles south of the dam — water levels along certain stretches depend on when dams open and close, and some residents see a change in its flow as a disruption, if not a threat.

“Any idea like this has to pass much more than engineering feasibility,” said Peter Gleick, a co-founder of the Pacific Institute, a think tank in Oakland, Calif., and a member of the National Academy of Sciences who is internationally known for his work on climate issues. “It has to be environmentally, politically and economically vetted, and that’s likely to prove to be the real problem.”

Housed inside Hoover Dam’s 726-foot structure are massive power-generating units. The proposed pump station would help regulate the water flow through the dam’s generators, sending water back to the top to help manage electricity at times of peak demand.

Using Hoover Dam to help manage the electricity grid has been mentioned informally over the last 15 years. But no one pursued the idea seriously until about a year ago, as California began grappling with the need to better manage its soaring alternative-electricity production — part of weaning itself from coal-fired and nuclear power plants.

In California, by far the leading state in solar power production, that has sometimes meant paying other states to take excess electricity. Companies like Tesla have gotten into the picture, making lithium-ion batteries that are deployed by some utilities, but that form of storage generally remains pricey.

Lazard, the financial advisory and asset management firm, has estimated that utility-scale lithium-ion batteries cost 26 cents a kilowatt-hour, compared with 15 cents for a pumped-storage hydroelectric project. The typical household pays about 12.5 cents a kilowatt-hour for electricity.
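
To put those per-kilowatt-hour figures side by side, the short sketch below computes the ratios between the quoted Lazard estimates and the typical retail rate; the numbers are the ones in the paragraph above, and nothing else is assumed.

# Ratios between the per-kilowatt-hour figures quoted above (Lazard estimates plus
# the typical household retail rate); purely illustrative arithmetic.
lithium_ion = 0.26   # $/kWh, utility-scale lithium-ion storage
pumped_hydro = 0.15  # $/kWh, pumped-storage hydroelectric project
retail_rate = 0.125  # $/kWh, typical household electricity price

print(f"Lithium-ion storage costs {lithium_ion / pumped_hydro:.1f}x pumped hydro per kWh")  # ~1.7x
print(f"Pumped hydro costs {pumped_hydro / retail_rate:.1f}x the household retail rate")    # ~1.2x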

Some dams already provide a basis for the Hoover Dam proposal. Los Angeles operates a hydroelectric plant at Pyramid Lake, about 50 miles northwest of the city, that stores energy by using the electric grid to spin a turbine backward and pump water back into the lake.

But the Hoover Dam proposal would operate differently. The dam, with its towering 726-foot concrete wall and its 17 power generators that came online in 1936, would not be touched. Instead, engineers propose building a pump station about 20 miles downstream from the main reservoir, Lake Mead, the nation’s largest artificial lake. A pipeline would run partly or fully underground, depending on the location ultimately approved.

“Hoover Dam is ideal for this,” said Kelly Sanders, an assistant professor of civil and environmental engineering at the University of Southern California. “It’s a gigantic plant. We don’t have anything on the horizon as far as batteries of that magnitude.”

Sri Narayan, a chemistry professor at the university, said his studies of lithium-ion batteries showed that they simply weren’t ready to store the loads needed to manage all of the wind and solar power coming online.

“With lithium-ion batteries, you have durability issues,” Mr. Narayan said. “If they last five to 10 years, that would be a stretch, especially because we expect to use these facilities at full capacity. It has to be 10 times more durable than it is today.”

Mr. Narayan said he felt the Hoover Dam project should be given serious consideration because pumped-storage projects had been tested and proven for decades. In a comparison with lithium-ion batteries, he said, “I think the argument is very good.”

An aerial view of the Colorado River downstream from Lake Mead and Hoover Dam. Harnessing the river’s force, the dam helped transform the American West.

The Los Angeles Department of Water and Power, the nation’s largest municipal utility, says its proposal would increase the productivity of the dam, which currently operates at just 20 percent of its potential so that it avoids releasing too much water at once and flooding towns downstream.

Engineers have conducted initial feasibility studies, including a review of locations for the pump station that would have as little adverse impact on the environment and nearby communities as possible.

But because Hoover Dam sits on federal land and operates under the Bureau of Reclamation, part of the Interior Department, the bureau must back the project before it can proceed.

“We’re aware of the concept, but at this point our regional management has not seen the concept in enough detail to know where we would stand on the overall project,” said Doug Hendrix, a bureau spokesman.

If the bureau agrees to consider the project, the National Park Service will review the environmental, scientific and aesthetic impact on the downstream recreation area. If the Los Angeles utility receives approval, Park Service officials have told it, the agency wants the pumping operation largely invisible to the public, which could require another engineering feat.

Among the considerations is the effect on bighorn sheep that roam Black Canyon, just below the dam, and on drinking water for places like Bullhead City. Some environmentalists worry that adding a pump facility would impair water flow farther downstream, in particular at the Colorado River Delta, a mostly dry riverbed in Mexico that no longer connects to the sea.

Another concern is that the pump station would draw water from or close to Lake Mohave, where water enthusiasts boat, fish, ride Jet-Skis, kayak and canoe.

Keri Simons, a manager of Watercraft Adventures, a 27-year-old rental business in Laughlin, said water levels already fluctuated in stretches of the Colorado close to the river towns. The smaller Davis Dam, just north of Laughlin, shuts off the flow overnight.

One morning this year, the water level just outside town dropped so low that you could walk across the riverbed, Ms. Simons said. “We couldn’t put any boats out until noon,” she said. “Half the river was a sandbar.”

Even if no water is lost because of the pumping project, the thought of any additional stress on the system worries Toby Cotter, the city manager of Bullhead City.

The town thrives on the summer tourism that draws some two million visitors to the area for recreation on the greenish-blue waters, Mr. Cotter said. “That lake is the lifeblood of this community,” he said. “It’s not uncommon to see 100 boats on that lake.”


Environmentalists have been pushing Los Angeles to stop using fossil fuels and produce electricity from alternative sources like solar and wind power. And Mayor Garcetti said he would like his city to be the first in the nation to operate solely on clean energy, while maintaining a reliable electric system.

“Our challenge is: How do we get to 100 percent green?” he said. “Storage helps. There’s no bigger battery in our system than Hoover Dam.”

But old wounds are still raw with some along the Colorado. A coal-fired power plant in Laughlin that the Department of Water and Power and other utilities operated was shut down in 2006, costing 500 jobs and causing the local economies to buckle. And a decision long ago to allot Nevada a small fraction of the water that California and Arizona can draw remains a sore point.

“There’s nothing going on in California with power that has given people who are dealing with them any comfort,” said Joseph Hardy, a Nevada state senator. “I think from a political standpoint, we would have to allay the fears of California, Nevada and Arizona. There will be a myriad of concerns.”

The decision to close the coal plant angered many residents. They wanted the utility to simply add emission-control features known as scrubbers to reduce carbon pollution. The community later hoped a natural-gas plant would replace the coal facility, but Los Angeles could not agree with the local communities on a site.

The 2,500-acre parcel where the coal plant stood remains largely vacant. “There’s still some sting here,” said Mr. Cotter, the Bullhead City official.

There have been local efforts to convert the site into a development of housing and businesses — or to build a solar farm on a plot of land, if Los Angeles would buy the power.

Mr. Garcetti said other states and cities had worked with Los Angeles to build economic development projects for their communities, so he would like to consider similar ideas for the Hoover Dam project, as well as ways to benefit the entire region. “I’m all open ears to what their needs are,” he said.

Mr. Hardy is wary of big-city promises. The Department of Water and Power has treated Nevada so cavalierly, he said, that a security guard at the old coal plant site once refused to return a ball to children after it bounced over the property’s fence. He said the guard had told the children’s parents that they could file a claim to get it back — a process that would take two to three years.

“Not the kindest neighbor,” Mr. Hardy said.

But he said he was willing to meet with Los Angeles officials to make the project successful.

“The hurdles are minimal and the negotiations simple, as long as everybody agrees with Nevada,” Mr. Hardy said. “It would be nice if there was a table that they would come to. I’ll provide the table.”

Caribbean islands rev up for electric car revolution

Given the high costs of imported fuel for energy, several Caribbean islands are considering increasing the number of electric vehicles on their roads. But they face several barriers, including high initial costs and stiff import tariffs on vehicles.

July 25, 2018

By Sophie Hares Thomson Reuters Foundation

BRIDGETOWN, BARBADOS

With her foot down to show off the acceleration of the zippy electric car, Joanna Edghill spins around the car park before plugging the vehicle into a charging point beneath rows of solar panels converting Caribbean rays into power for the grid.

In the five years since she and her husband started their company Megapower, it has sold 300 electric vehicles and set up 50 charging stations plus a handful of solar car-ports on the 21-mile-long island of Barbados.

They are now expanding elsewhere in the Caribbean.

"The main factor with islands is we don't have range anxiety. I can develop and roll out a charging network in Barbados where customers are never more than a few kilometers from a charging point," said Ms. Edghill, who previously worked in international development.

"We have at least 220 days of pure sunlight every year, so why not take advantage of the resources that we do have here?"


Burdened by a costly dependence on imported fuel for energy, Barbados and many other Caribbean islands are considering boosting the number of electric vehicles on their roads.

But they face barriers, including high initial costs, stiff import duties on electric vehicles, and a lack of regulatory support, say people working in the sector.

Globally, the number of electric vehicles topped 3 million in 2017, according to the International Energy Agency (IEA), which predicts there will be 125 million in use by 2030.

That figure could go as high as 220 million if action to meet global climate targets and other sustainability goals becomes more ambitious, says the IEA.

More research, policies and incentives are needed to drive further uptake, it notes.

In Barbados, the island's electricity utility, government departments, and private firms are among the customers buying the left-hand-drive electric cars and delivery vans Megapower imports from Britain, said Edghill.

It has also built solar car-ports, which power charging points, for the Barbados arm of courier giant DHL and the government of St. Vincent and the Grenadines.

Megapower's other solar panels feed into the grid, offsetting the equivalent of the non-renewable power used by 400 electric cars.

"The Caribbean is ripe for the electrification of transportation," said Curtis Boodoo, assistant professor at the University of Trinidad and Tobago who also works on electric vehicles with the CARICOM regional group of 15 countries.

"If you invest in electric vehicles, you are able to use your existing electrical infrastructure, and save the costs of the transportation fuel that you have to import."

Using imported fuel to generate electricity is two to three times more efficient than burning it directly in car engines, he said, meaning more electric vehicles on the roads could help shave the region's hefty fuel bills.

For many Caribbean countries, more than half the fuel they import is used for transport. Barbados spent $300 million last year on fuel imports, government data shows.

Crippled with public debt and dogged by rising oil prices, the island's new government wants to make its bus network electric, and eventually switch all government transport too.

Mr. Boodoo said increased state investment in electric buses would help upgrade transport systems, while cutting climate-changing emissions and paving the way for consumers to follow.

Plug-in vehicles could also "piggy-back" on a push to inject more power into the grid from renewables like solar, wind, and hydro, said Devon Gardner, CARICOM's energy program manager.

Costs are high, however, and on some islands, import duties for electric vehicles are higher than on combustion-engine cars.

Trinidad and Tobago has scrapped taxes and import duties for most electric cars, but taxes elsewhere can add up to 100 percent depending on the model.

High purchase prices mean vehicles remain unaffordable for most Caribbean drivers.

A Nissan Leaf electric car, for example, costs about $50,000 in Barbados, compared with $30,000 in Britain.

Heavily indebted Caribbean countries are torn between collecting much-needed revenue from car imports and supporting the roll-out of private electric vehicles, said Mr. Gardner.

"The Caribbean doesn't have the luxury of using some of the levers of incentives that were used by the richer, more developed countries," he said.

Nonetheless, power utilities in the Bahamas, Turks and Caicos, and St. Lucia are starting to install charging networks, which could help the sector expand, said Megapower's Edghill.

"Privately owned utilities want people buying electricity, so every person that is plugging in is a person not buying gas or diesel, but buying their product," she said.

John Felder, who founded Cayman Automotive, plans to open an office soon in Havana and anticipates a healthy market in Cuba for electric bikes and scooters which start at $800.

Low import duties on electric vehicles in Cuba make them cheaper to buy, said Mr. Felder. He has sold about 60 electric cars in the Cayman Islands – which has cut import duties – and installed 15 charging stations he wants to convert to solar.

"The eco-system is very fragile – there are no freeways where you can go 70, 80 miles per hour for hundreds of miles," he said. "Electric vehicles are perfect for the Caribbean."

While fast-improving battery technology is making electric cars more attractive globally, on hurricane-prone Caribbean islands, emerging vehicle-to-grid technology could use power stored in batteries to keep the lights on if disaster strikes.

Power stored in one electric bus could provide energy for up to 50 homes for a day, or power shelters and community centers if overhead electricity cables are knocked out, said Boodoo.

Driving down prices might be key to kick-starting an electric car revolution. But some are betting that islands will gradually wake up to the benefits plug-in vehicles can bring, from improving public transport to taming expensive diesel habits.

"I'd like to say that within five years, 10 percent of the [Barbados] population will be driving electric vehicles – I think that's realistic," said Edghill.

The world's largest solar farm rises in the remote Egyptian desert

By RACHEL SCHEIER

JUL 30, 2018 | 3:00 AM

KarmSolar's Tayebat Workers' Village in Egypt's Bahariya Oasis was built using local sandstone as a model of a sustainable off-grid structure that combined green energy technology with traditional local building methods. It houses up to 500 seasonal farmworkers. (Courtesy of Karmsolar)

In 1913 on the outskirts of Cairo, an inventor from Philadelphia named Frank Shuman built the world’s first solar thermal power station, using the abundant Egyptian sunshine to pump 6,000 gallons of water a minute from the Nile to irrigate a nearby cotton field.

World War I and the discovery of cheap oil derailed Shuman’s dream of replicating his “sun power plant” on a grand scale and eventually producing enough energy to challenge the world’s dependence on coal.

More than a century later, that vision has been resurrected. The world’s largest solar park, the $2.8-billion Benban complex, is set to open next year 400 miles south of Cairo in Egypt’s Western Desert.

It will single-handedly put Egypt on the clean energy map.

That is no small feat for a country that’s been hobbled by its longtime addiction to cheap, state-subsidized fossil fuels and currently gets more than 90% of its electricity from oil and natural gas.

But the prospects for green energy here have never been better as the government has been scaling back fossil-fuel subsidies in line with an International Monetary Fund-backed reform program that aims to rescue an economy ravaged by political upheaval. Meanwhile, the rapidly falling cost of equipment for solar and wind power has increased their allure.

“This is a big deal,” said Benjamin Attia, a solar analyst with U.S.-based Wood Mackenzie, talking about the Benban complex. “I can’t think of another example where so many big players have come together to fill the gap.”

Officials and international finance organizations tout the potential of Egypt’s renewables sector to create jobs and growth as well as reduce emissions in a country whose capital was recently named the second-most polluted large city on Earth by the World Health Organization.

The government’s aim is that by 2025 Egypt will get 42% of its electricity from renewable sources.

The Benban complex, which will be operated by major energy companies from around the world, is expected to generate as much as 1.8 gigawatts of electricity, or enough to power hundreds of thousands of homes and businesses. It will consist of 30 separate solar plants, the first of which began running in December, and employ 4,000 workers.
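
As a rough sanity check on the "hundreds of thousands of homes" figure, the sketch below converts 1.8 gigawatts of capacity into annual output and divides by a per-household consumption figure. The capacity factor and household consumption used here are illustrative assumptions, not numbers from the article.

# Back-of-envelope check on "1.8 gigawatts ... enough to power hundreds of thousands of homes".
# The capacity factor and per-household consumption are assumptions for illustration only.
capacity_gw = 1.8             # Benban's expected output, per the article
capacity_factor = 0.25        # assumed average utilisation of a desert solar plant
household_mwh_per_year = 10   # assumed annual electricity use of one home, in MWh

annual_output_mwh = capacity_gw * 1000 * capacity_factor * 8760  # GW -> MW, times hours per year
homes_powered = annual_output_mwh / household_mwh_per_year

print(f"~{annual_output_mwh / 1e6:.1f} TWh per year, roughly {homes_powered:,.0f} homes")  # ~3.9 TWh, ~394,000 homes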

The U.S. government is backing a local program to train hundreds of technical school students in solar and wind energy.

Hatem Tawfik of Cairo Solar with the "solar flowers" his company installed in the courtyard of a bank in New Cairo. As energy prices have gone up, so has demand for solar.

Last week, Egyptian President Abdel Fattah Sisi inaugurated several big electricity projects, including the expansion of massive wind farms on the gusty Gulf of Suez in the Red Sea. Russia has promised to help build and finance a $21-billion nuclear power plant on Egypt’s north coast.

Driving all these projects is the not-so-distant memory of the electricity crisis that gripped Egypt in the years following the 2011 revolution. Factory and shop closures from rolling blackouts fed mounting public anger that culminated in the 2013 ouster of President Mohamed Morsi, whose prime minister once infamously suggested that Egyptians confront the power shortage by wearing cotton and sleeping in one room.

Today residents no longer face nightly outages, but Egypt — once a gas exporter — must import expensive liquefied natural gas to meet the energy needs of its 96 million people. And power demand is expected to more than double by 2030, much faster than in any other country in the region, according to Victoria Cuming of Bloomberg New Energy Finance.

In preparation, Egypt launched a scheme in 2014 to enable private players to sell power to the public grid, jump-starting its clean energy market, which last year saw a 500% increase in investment.

But if flashy green energy megaprojects like Benban are drawing much of the attention, small businesses are responsible for 80% of private-sector jobs, said Khaled Gasser, who founded the Solar Energy Development Assn., a local industry group. “This is the real market,” he said.

Driven by the idealism and lack of compelling job opportunities that followed the 2011 revolution, several clean-energy entrepreneurs have quietly established a grass-roots market in Egypt.

Ahmed Zahran, 38, started KarmSolar with four friends working out of a cafe after his old bosses fled the country. He had tried in vain to persuade the private equity firm to invest in clean energy. “We were fired by an … who represented everything we hated about this country, so we decided, let’s do it ourselves,” said Zahran, who figured solar power was a no-brainer in a nation that is more than 90% desert.

With private investments, the company began by making solar water pumps for off-grid desert farms that traditionally rely on diesel to pull water from beneath the sand. Now with more than 80 employees, it also builds solar stations to power poultry factories and malls under so-called power-purchasing agreements.

Zahran has also delved into green architecture, designing off-grid eco-lodges and a sustainable “workers’ village” for seasonal farmhands in Egypt’s Bahariya Oasis, a desert outpost of olive and date groves.

As the government has made good on its promise to gradually cut energy subsidies — on July 1 electricity prices rose an average of 26% — more companies are exploring the possibility of adding solar systems to the apartment buildings where most of the country’s urban population lives.

The country’s green energy start-ups are grappling with the same problems that have bedeviled small businesses here for years. In an effort to confront a big one — lack of financing — the Central Bank launched a low-interest loan program in 2016 that aimed to encourage small players, especially in key industries like renewable energy.

Hatem Gamal used it to grow his waste-to-energy startup, Empower, which currently operates two plants, in a wastewater treatment facility and a beef farm, that turn human sludge and animal waste into biogas that he sells to the government. The company has four more plants under construction.

One thing that hasn’t changed: the layers of bureaucracy that businesses must navigate. Gamal had to get licenses or permissions from at least 10 government agencies. He’s become so adept at dealing with red tape that he has a printer in his car.

“People say, ‘You must know someone’ or ‘You must have bribed someone’ or ‘That 5% loan isn’t real,’ ” Gamal said. “But the opportunities are there. I just never take ‘no’ for an answer.”

Toxic algae spreads in Baltic waters in biggest bloom in years

Reuters Staff

HELSINKI/SOPOT, Poland (Reuters) - A huge bloom of toxic algae has spread in the Baltic Sea, forcing people off beaches but delighting scientists who research cancer and antibiotics.

Toxic algae are seen on the beach in Gdynia, Poland, July 3, 2015. Lukasz Glowala/Agencja Gazeta/via REUTERS

The bloom of blue-green algae, or cyanobacteria, is primarily caused by an excess of nutrients in the water, such as the nitrogen and phosphorus used in food production.

Exceptionally hot weather and a lack of wind have exacerbated the effect, scientists say. The Finnish Meteorological Institute said that this July was the hottest on record.

“The environment is giving us back what we have put into it, so it works as a kind of boomerang,” said Hanna Mazur-Marzec, a professor at the Polish Academy of Sciences’ Oceanology Institute in Sopot on the Baltic sea’s southern shore.

The Finnish environment institute SYKE said the outbreak, which has hit particularly hard around the Gulf of Finland and stretches down to Poland’s shores, is among the worst in the past decade.

“In the lakes, we’ve had a five-week period that’s more extensive than the average in the last 20 years,” SYKE said. “Climate change and warming in the Baltic Sea area and lakes will increase the risk of cyanobacterial blooms.”

Cyanobacteria toxins are a risk to the marine ecosystem - especially for mussels and flounder - and also pose a threat to human health, SYKE said.

People in Poland, Lithuania and Sweden have been advised by the authorities not to swim in waters where the algae is blooming.

“We are here until Saturday and it is a real pity that we are not allowed to swim in the Baltic Sea but what can we do?”, said Bron Starby, a tourist from Sweden, who is spending his holidays in Sopot, a popular resort in Poland.

But some scientists are pleased, because cyanobacteria are a source of unique compounds that help researchers identify substances bacteria are still sensitive to, a way to counter the growing prevalence of antibiotic resistance.

“It’s a great opportunity to collect material (for research). In recent years, we had trouble collecting enough,” said Mazur-Marzec. “Now we can simply fill up a bucket.”

Last year was warmest ever that didn't feature an El Niño, report finds

State of the climate report found 2017 was the third warmest with a record high sea level and destructive coral bleaching

Oliver Milman in New York

Wed 1 Aug 2018 12.08 EDT Last modified on Wed 1 Aug 2018 15.40 EDT

Last year was the warmest ever recorded on Earth that didn’t feature an El Niño, a periodic climatic event that warms the Pacific Ocean, according to the annual state of the climate report by 500 climate scientists from around the world, overseen by the National Oceanic and Atmospheric Administration (Noaa) and released by the American Meteorological Society.

Climate change cast a long shadow in 2017, with the planet experiencing soaring temperatures, retreating sea ice, a record high sea level, shrinking glaciers and the most destructive coral bleaching event on record.

Overall, 2017 was the third warmest year on record, Noaa said, behind 2016 and 2015. Countries including Spain, Bulgaria, Mexico and Argentina all broke their annual high temperature records.

Puerto Madryn in Argentina reached 43.4C (110.12F), the warmest temperature ever recorded so far south in the world, while Turbat in Pakistan baked in 53.5C (128.3F), the global record temperature for May.

Concentrations of planet-warming carbon dioxide continued on an upward march, reaching 405 parts per million in the atmosphere. This is 2.2ppm greater than 2016 and is the highest level discernible in modern records, as well as ice cores that show CO2 levels back as far as 800,000 years. The growth rate of CO2 has quadrupled since the early 1960s.

The consequences of this heat, which follows a string of warm years, was felt around the world in 2017.

In May of last year, ice extent in the Arctic reached its lowest maximum level in the 37-year satellite record, covering 8% less area than the long-term average. The Arctic experienced the sort of warmth that scientists say hasn’t been present in the region for the last 2,000 years, with some regions 3 or 4 degrees Celsius hotter than an average recorded since 1982. Antarctic sea ice was also below average throughout 2017.

Land-based ice mirrored these reversals, with the world’s glaciers losing mass for the 38th consecutive year on record. According to the report, the total ice loss since 1980 is equivalent to slicing 22 metres off the top of the average glacier.

Prolonged warmth in the seas helped spur a huge coral bleaching event, which is when coral reefs become stressed by high temperatures and expel their symbiotic algae. This causes them to whiten and, in some cases, die off.

A three-year stretch to May 2017 was the “longest, most widespread and almost certainly most destructive” coral bleaching event on record, the report states, taking a notable toll on places such as the Great Barrier Reef in Australia. Global average sea levels reached the highest level in the 25-year satellite record, 7.2cm (3in) above the 1993 average.

“I find it quite stunning, really, how these record temperatures have affected ocean ecosystems,” said Gregory Johnson, an oceanographer at Noaa.


There were several major rainfall events in 2017 contributing to a wetter than normal year, with the Indian monsoon season claiming around 800 lives and devastating floods occurring in Venezuela and Nigeria. Global fire activity was at the lowest level since 2003, however.

While exceptionally warm years could occur without human influence, the rapidly advancing field of climate change attribution science has made it clear the broad sweep of changes taking place on Earth would be virtually impossible without greenhouse gas emissions from human activity.

The loss of glaciers and coral reefs threatens the food and water supplies of hundreds of millions of people, while heatwaves, flooding, wildfires and increasingly powerful storms are also a severe risk to human life.

These dangers have been highlighted in stunning fashion this year, with a scorching global heatwave causing multiple deaths from Canada to Japan, while wildfires have caused further fatalities in places such as Greece and the western US.

How summer heat has hit Nordic nuclear plants

Lefteris Karagiannopoulos

OSLO (Reuters) - This year’s unusually warm summer in the Nordic region has increased sea water temperatures and forced some nuclear reactors to curb power output or shut down altogether, with more expected to follow suit.

The summer has been 6-10 degrees Celsius above the seasonal average so far and has depleted the region’s hydropower reservoirs, driving power prices to record highs, boosting energy imports from continental Europe and driving up consumer energy bills.

Nuclear plants in Sweden and Finland are the region’s second largest power source after hydropower dams and have a combined capacity of 11.4 gigawatts (GW).

Reactors need cold sea water for cooling but when the temperature gets too high it can make the water too warm for safe operations, although the threshold varies depending on the reactor type and age.

Unscheduled power output cuts in Swedish and Finnish reactors could push prices even higher, said Vegard Willumsen, section manager at Norway’s energy regulator NVE.

“If nuclear reactors in the Nordics shut down or reduce power due to the heatwave, it could also put pressure on the supply and consequently on the Nordic power prices,” he added.

The Nordic region’s nuclear plants comprise either pressurized water reactors (PWR) or boiling water reactors (BWR) - and both can be affected by warm sea water.

Typically, output at the 12 reactors would be reduced once a certain water temperature threshold is reached, and the units would be fully shut down at a higher threshold.

BWRs can keep operating for longer and would only shut down after a several-degree rise in water temperatures from the moment power reductions are triggered.

However, PWRs require a shorter time to shut down after they start reducing power.
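
The threshold behaviour described above can be sketched as a simple curtailment function. The specific temperatures and the linear ramp between the two thresholds in the sketch below are assumptions for illustration, not actual operator settings.

# Sketch of the threshold logic described above: full output below a first cooling-water
# temperature, reduced output between the two thresholds, shutdown above the second.
# The thresholds and the linear ramp are illustrative assumptions, not operator data.
def output_fraction(sea_temp_c: float, reduce_at: float, shutdown_at: float) -> float:
    """Fraction of rated power available at a given cooling-water temperature."""
    if sea_temp_c <= reduce_at:
        return 1.0   # normal operation
    if sea_temp_c >= shutdown_at:
        return 0.0   # full shutdown
    return (shutdown_at - sea_temp_c) / (shutdown_at - reduce_at)  # assumed linear ramp

# Example: a unit assumed to start curtailing at 25 C and shut down at 30 C
for temp in (23.0, 26.0, 28.0, 31.0):
    frac = output_fraction(temp, reduce_at=25.0, shutdown_at=30.0)
    print(f"{temp:.0f} C -> {frac:.0%} of rated power")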

Utility Vattenfall, which operates seven reactors in Sweden, shut a 900 megawatt (MW) PWR unit - one of the four located at its Ringhals plant - this week as water temperatures exceeded 25 degrees Celsius.

The firm’s second plant at Forsmark consists of three BWRs, and Vattenfall had to reduce output by 30-40 megawatts per reactor earlier in July as the sea water in the area exceeded 23 degrees Celsius.

Finland’s Fortum reduced power at its Loviisa plant last week when water temperatures reached 32 degrees C, close to a threshold of 34 degrees.

The extent to which water temperature affects nuclear plants also depends on the depth from which they draw cooling water; deeper water is colder.

It also depends on how warm the water is after being used in the reactors and released back into the sea. If used water exceeds 34 degrees Celsius, it can cause major output reductions or shutdowns for certain plants due to safety regulations.

Sweden’s biggest reactor - the 1.4 GW Oskarshamn 3 - should be less vulnerable to very hot summers because of the depth of its water intake, said a spokesman for operator OKG, a unit of Uniper Energy (UN01.DE).

“Water intake (is) at a depth of 18 metres where the water naturally is cooler than on the surface ... should it be too hot, we would of course reduce the capacity accordingly,” he said.

Oskarshamn 3 will reduce power if sea water reaches 25 degrees but it was below 20 degrees on Tuesday.

Similarly, Teollisuuden Voima’s Olkiluoto plant in Finland draws its cooling water from greater depth, where it remains below the plant’s 27-degree threshold.

TVO has also built an additional safety mechanism - a canal - which it can use under certain conditions to release used warm water at the other side of the Olkiluoto island.

Teachers lacking educational background in science use inquiry-oriented instruction least

PUBLIC RELEASE: 30-JUN-2018

Not using the pedagogical approach heralded by the National Research Council could be linked to unfilled STEM jobs

UNIVERSITY OF VERMONT

A new study shows that eighth-grade science teachers without an educational background in science are less likely to practice inquiry-oriented science instruction, a pedagogical approach that develops students' understanding of scientific concepts and engages students in hands-on science projects. This research offers new evidence for why U.S. middle-grades students may lag behind their global peers in scientific literacy. Inquiry-oriented science instruction has been heralded by the National Research Council and other experts in science education as best practice for teaching students 21st-century scientific knowledge and skills.

Published in The Elementary School Journal, the study investigated whether the educational backgrounds of 9,500 eighth-grade science teachers in 1,260 public schools were predictive of the extent to which they engaged in inquiry-oriented instruction. The authors found that, nationwide, there are two distinct groups of middle-grades science teachers: 1) those with very little formal education in science or engineering, and 2) those with degrees and substantial coursework in science.

Teachers who were most likely to use inquiry-based teaching were those with both education and science degrees, and teachers with graduate-level degrees in science were most likely to teach this way. However, nationally, just half of teachers had these preferred credentials, and nearly one-quarter of eighth-grade teachers had an education-related degree with no formal educational background in science or engineering.

The study's findings raise the question: are middle-grades teachers well-prepared to engage in the kinds of teaching that have been shown to improve student engagement, interest, and preparation for STEM careers? Study author Tammy Kolbe, assistant professor of educational leadership and policy studies at the University of Vermont, says results "point toward the disparate nature of middle-level teachers' educational backgrounds as a possible leverage point for change."

Another key finding was that teachers with undergraduate or graduate degrees in science continued to use inquiry-oriented instruction throughout their careers at a higher rate than their peers. That said, novice teachers with undergraduate minors in science who initially were less likely to teach this way eventually caught up to their peers with stronger educational backgrounds in science. "This suggests that even having an undergraduate minor in science better positions a teacher to adopt and integrate reform-oriented science teaching, compared to teachers with little-to-no formal education in science or engineering," says Kolbe.

"We cannot expect that goals for reforming science education in the United States can be achieved without carefully examining how teachers are prepared," says Kolbe, who co-authored the study with Simon Jorgenson, assistant professor in the Department of Education at UVM. "We show that teachers' educational backgrounds matter for how they teach science, and suggest that teachers' degrees and coursework are valid proxies for what teachers know and can do in the classroom. The study's findings call into question existing state policies and teacher preparation programs that minimize content knowledge requirements for middle-level teachers."

Climate change is making night-shining clouds more visible

PUBLIC RELEASE: 2-JUL-2018

AMERICAN GEOPHYSICAL UNION

WASHINGTON -- Increased water vapor in Earth's atmosphere due to human activities is making shimmering high-altitude clouds more visible, a new study finds. The results suggest these strange but increasingly common clouds seen only on summer nights are an indicator of human-caused climate change, according to the study's authors.

Noctilucent, or night-shining, clouds are the highest clouds in Earth's atmosphere. They form in the middle atmosphere, or mesosphere, roughly 80 kilometers (50 miles) above Earth's surface. The clouds form when water vapor freezes around specks of dust from incoming meteors.

Humans first observed noctilucent clouds in 1885, after the eruption of Krakatoa volcano in Indonesia spewed massive amounts of water vapor in the air. Sightings of the clouds became more common during the 20th century, and in the 1990s scientists began to wonder whether climate change was making them more visible.

In a new study, researchers used satellite observations and climate models to simulate how the effects of increased greenhouse gases from burning fossil fuels have contributed to noctilucent cloud formation over the past 150 years. Extracting and burning fossil fuels delivers carbon dioxide, methane and water vapor into the atmosphere, all of which are greenhouse gases.

The study's results suggest methane emissions have increased water vapor concentrations in the mesosphere by about 40 percent since the late 1800s, which has more than doubled the amount of ice that forms in the mesosphere. They conclude human activities are the main reason why noctilucent clouds are significantly more visible now than they were 150 years ago.

"We speculate that the clouds have always been there, but the chance to see one was very, very poor, in historical times," said Franz-Josef Lbken, an atmospheric scientist at the Leibniz Institute of Atmospheric Physics in Khlungsborn, Germany and lead author of the new study in Geophysical Research Letters, a journal of the American Geophysical Union.

The results suggest noctilucent clouds are a sign that human-caused climate change is affecting the middle atmosphere, according to the authors. Whether thicker, more visible noctilucent clouds could influence Earth's climate themselves is the subject of future research, Lübken said.

"Our methane emissions are impacting the atmosphere beyond just temperature change and chemical composition," said Ilissa Seroka, an atmospheric scientist at the Environmental Defense Fund in Washington, D.C. who was not connected to the new study. "We now detect a distinct response in clouds."

Conditions must be just right for noctilucent clouds to be visible. The clouds can only form at mid to high latitudes in the summertime, when mesospheric temperatures are cold enough for ice crystals to form. And they're only visible at dawn and dusk, when the Sun illuminates them from below the horizon.

Humans have injected massive amounts of greenhouse gases into the atmosphere by burning fossil fuels since the start of the industrial period 150 years ago. Researchers have wondered what effect, if any, this has had on the middle atmosphere and the formation of noctilucent clouds.

In the new study, Lübken and colleagues ran computer simulations to model the Northern Hemisphere's atmosphere and noctilucent clouds from 1871 to 2008. They wanted to simulate the effects of increased greenhouse gases, including water vapor, on noctilucent cloud formation over this time period.

The researchers found the presence of noctilucent clouds fluctuates from year to year and even from decade to decade, depending on atmospheric conditions and the solar cycle. But over the whole study period, the clouds have become significantly more visible.

The reasons for this increased visibility were surprising, according to Lübken. Carbon dioxide warms Earth's surface and the lower part of the atmosphere, but actually cools the middle atmosphere where noctilucent clouds form. In theory, this cooling effect should make noctilucent clouds form more readily.

But the study's results showed increasing carbon dioxide concentrations since the late 1800s have not made noctilucent clouds more visible. It seems counterintuitive, but when the middle atmosphere becomes colder, more ice particles form but they are smaller and therefore harder to see, Lübken explained.

"Keeping water vapor constant and making it just colder means that we would see less ice particles," he said.

On the contrary, the study found more water vapor in the middle atmosphere is making ice crystals larger and noctilucent clouds more visible. Water vapor in the middle atmosphere comes from two sources: water vapor from Earth's surface that is transported upward, and methane, a potent greenhouse gas that produces water vapor through chemical reactions in the middle atmosphere.

The study found the increase in atmospheric methane since the late 1800s has significantly increased the amount of water vapor in the middle atmosphere. This more than doubled the amount of mesospheric ice present in the mid latitudes from 1871 to 2008, according to the study.

People living in the mid to high latitudes now have a good chance of seeing noctilucent clouds several times each summer, Lübken said. In the 19th century, they were probably visible only once every several decades or so, he said.

"The result was rather surprising that, yes, on these time scales of 100 years, we would expect to see a big change in the visibility of clouds," Lbken said.

The American Geophysical Union is dedicated to advancing the Earth and space sciences for the benefit of humanity through its scholarly publications, conferences, and outreach programs. AGU is a not-for-profit, professional, scientific organization representing 60,000 members in 137 countries.

Company makes old polystyrene new again

Jessie Darland Monday, April 30, 2018

A company in Tigard has created a way to recycle polystyrene back into polystyrene. Very little polystyrene is currently recycled; most of it ends up in landfills or the ocean.

Portland has an extensive waste and recycling program for garbage, bottles, cans, food waste, yard waste and paper — but where do all the plastics go? Things like Styrofoam, or packing peanuts that fill boxes from online deliveries or electronics purchases?

Polystyrene is a commonly used material that often gets thrown into landfills. According to Agilyx, an environmental solutions company located in Tigard, the U.S. recycles just 1.3 percent of polystyrene used. The company has developed the first way of recycling polystyrene back into polystyrene. They held a ribbon cutting Thursday, April 19, to celebrate the new direction of the company.

"Polystyrene is a very versatile, cost effective and valuable polymer used in our everyday lives," said Joe Vaillancourt, chief executive officer of Agilyx. "However, it is one of the least recycled materials in today's recycling programs. We are proud to have commercialized the first chemical recycling solution, creating a more sustainable end-of-life solution for polystyrene."

Polystyrene can be identified by its "resin code" — a number six within the recycling symbol on most materials. Some material doesn't have this code, but can still be recycled.

"The easiest way to identify polystyrene foam is it looks like a bunch of balls stuck together and it will break or flake off if bent," said John Desmarteau, project engineer at Agilyx.

Agilyx receives this material and melts it back down into a liquid. The liquid is sent to a refiner to be cleaned, then to a manufacturer to make new products. By recycling the material, less plastic will end up in landfills and the ocean, and by recycling the oil used in the production of these plastics, less will be taken out of the ground to make future products.

"Styrofoam (polystyrene foam) has a useful life of mere hours or days, but exists in the environment for hundreds or thousands of years," said Jeanne Roy, co-director of the Center for Earth Leadership. "It can be carried by the wind, washed into storm drains, and make its way to the ocean, contributing to a mass of floating plastic debris. Marine mammals mistake it for food, and it remains in their guts, often causing death by starvation."

Polystyrene is used to make items including fast food containers, packaging, bike helmets, parts of cars, and coolers. The molecule used in the process of making polystyrene can be used again and again, which is what Agilyx is taking advantage of.

Tyrone Neighbors, owner of Active Daily Fitness, has partnered with Agilyx to give them all of the Styrofoam that comes with exercise equipment he orders. Within the past year, he started bringing Agilyx truckloads of packing material from treadmills and strength equipment.

"It gives me peace of mind that's it's not ending up in a landfill," Neighbors said. He gets the material prepped and bagged, then Agilyx takes it from there.

According to Agilyx, this process has a 50 percent smaller carbon footprint than traditional polystyrene manufacturing.

"They're doing something even the biggest petrochemical companies aren't doing. It's impressive," Vaillancourt said at the ribbon cutting. At the event, attendees could get coffee in Styrofoam cups, then the cups were thrown into the polystyrene compactor and eventually put through the entire system. There was also a tour, where a member of the Agilyx team explained the process and the equipment used.

If you’re puzzling over the recycling codes on your plastics, here’s the scoop on what those codes mean. If you want to know which plastics you can toss in your curbside bin, please check with your local recycling service provider, as the types of plastics accepted differ by location.

Infographic: https://envnewsbits.info/2018/06/29/infographic-plastic-recycling-codes/

Why Vietnam is shutting out some materials

Posted on May 30, 2018

by Colin Staub

Vietnamese authorities have boosted inspections of scrap imports and plan to halt shipments to key ports next month.

According to a number of sources, the changes are a result of a glut of scrap paper and plastic imports and numerous instances of customs violations.

Because of China’s restrictions, Vietnam has become a larger destination for exports of U.S. recyclables. According to the U.S. Census Bureau, U.S. exporters sent more than 173,000 short tons of recovered fiber to Vietnam from January through March, up dramatically from 57,000 short tons during the prior-year period. U.S. ports sent Vietnam nearly 79 million pounds of recovered plastics during the first quarter, up from 40 million pounds during the prior-year period.
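
Taken at face value, those Census Bureau figures imply steep year-over-year growth. The short sketch below simply computes the percentage increases from the rounded totals quoted above.

# Year-over-year growth implied by the U.S. Census Bureau figures quoted above
# (inputs are the rounded totals from the article).
shipments = {
    "recovered fiber (short tons)": (57_000, 173_000),         # Q1 2017 -> Q1 2018
    "recovered plastics (pounds)":  (40_000_000, 79_000_000),  # Q1 2017 -> Q1 2018
}

for material, (prior, current) in shipments.items():
    growth_pct = (current - prior) / prior * 100
    print(f"{material}: {prior:,} -> {current:,} (+{growth_pct:.0f}%)")  # ~+204% and ~+98%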

In recent months, exporters have cited the Southeast Asian country as a major alternative market that’s increased in the past year. And the country has been a destination for Chinese scrap processors looking to invest in Southeast Asia as a way to work around China’s restrictions.

Now, Vietnamese officials are restricting imports, similar to the actions taken by the Chinese government over the past year.

Giant wave of imports

A May 21 letter from Vietnam’s Tan Cang-Cai Mep terminal to shipping companies, obtained and shared by the Institute of Scrap Recycling Industries (ISRI), describes a major increase in containers of scrap paper and plastic coming through the port.

The letter notes that a different terminal, Tan Cang-Cat Lai, which is among the largest Vietnamese shipping terminals, has amassed more than 8,000 TEUs (twenty-foot equivalent units, a measurement for container quantity) of scrap plastic and paper on-site as of May 21. Tan Cang-Cai Mep, a smaller terminal, has stockpiled 1,132 TEUs that can’t be moved to the larger terminal due to the buildup. Both terminals are operated by Saigon Newport Corporation (SNP).

The larger terminal has already stopped accepting scrap materials, and the smaller one will stop accepting all containers of scrap plastic June 25 through Oct. 15. The terminals will require more thorough documentation beginning June 15, according to the letter. This means shipments must be accompanied by valid import permits and a written guarantee of when the container will be picked up by the buyer.

“This notice comports with rumors that Vietnamese customers had no more room for imported materials and that the build-up of containers of recovered paper and plastic scrap diverted from China were causing delays at Vietnam’s main import terminal,” ISRI wrote.

ISRI added it is “confident this is more to do with port capacity constraints and not a permanent regulatory shift.”

Violations discovered

A handful of recent announcements from Vietnamese customs officials confirm growing concerns over scrap imports since the beginning of 2018. They describe activities similar to those targeted by Chinese customs officials over the past year.

In April, the General Department of Vietnam Customs (GDVC) described numerous recent violations involving scrap paper and plastic coming into the country. The violations included material not meeting Vietnamese quality standards, mislabeling, the use of false import permits and a lack of permits altogether. In response, customs officials have ramped up inspections. According to GDVC, in some areas agents began inspecting 100 percent of scrap paper and plastic imports.

Most recently, GDVC said it had implemented a “plan on risk management for scrap imports,” which includes continuing the heightened inspections and taking stock of all the containers of recyclables sitting at ports.

GDVC alluded to the Chinese material ban, and said it presented “the potential danger for Vietnam and other Southeast Asian countries to become scrap destinations.”

Restricted import permits

An announcement from SNP in late April shows the overcapacity problem has been growing since the beginning of the year.

On April 23, SNP wrote to its customers that the volume of scrap paper and plastic had “soared” since China banned most of the material. Because of that, Vietnam’s Ministry of Natural Resources and Environment and the Ministry of Finance “restricted the import permits for plastic/paper scrap into Vietnam,” according to the SNP letter.

“Consequentially, consignees are not able to complete customs clearance for many shipments of plastic/paper scrap which have arrived at Vietnam seaports,” SNP wrote.

At that time, 1,000 TEUs of material that Cai Mep had sent to the larger Cat Lai port were stuck with nowhere to go.

“If there is no punctual action, this volume will escalate and cause high yard occupancy at ports” and other storage areas, SNP wrote, “affecting routine operation of both ports and shipping lines.”

China says its waste restrictions will spur US job growth

Posted on June 19, 2018

by Colin Staub

Chinese officials have responded to concerns from other nations about recent import restrictions. The Chinese comments directly address the “waste versus scrap” debate as well as global economic repercussions of National Sword.

The country’s World Trade Organization (WTO) Notification and Enquiry Center, an office of the Ministry of Commerce (MOFCOM), responded this month to comments from several WTO members and industry stakeholders. China’s note was shared by the Institute of Scrap Recycling Industries (ISRI).

Over the past year, some U.S. recycling stakeholders have expressed frustration that Chinese regulators do not understand the impact of the policies they are enacting, or that imports of scrap material have value. The new response offers evidence to the contrary.

Some of the key takeaways from the letter include the following:

Spurring domestic investment: China says its restrictions will spur economic development within the U.S. The letter explains that restricting imports into China will lead to increasing volumes kept within the U.S., “thereby spurring a new round of investment in the relevant processing industries.”

“Apart from solving environmental problems faced by both countries, this could also bring more employment opportunities to the U.S. recycling industry and create a win-win situation,” Chinese officials wrote.

Waste versus scrap: Chinese regulators say they are aware of the difference between waste and scrap, a point ISRI and others have questioned many times over the past year. The letter states there is “not any globally recognized standard for scrap materials and recyclable materials,” so the Chinese government went with the internationally recognized HS code (Harmonized Commodity Description and Coding System) to define the materials that would be restricted from import.

“China would like to reiterate that solid wastes are different from raw materials in general and inherently pollutive,” the letter states.

‘Disposal to the nearest’: The MOFCOM office noted that ISRI’s comments described the infrastructure for processing recovered materials in the U.S. as a “highly efficient system that incorporates the latest sorting technologies and quality control methods.”

Because of that technical advancement, the Chinese officials believe the U.S. has the ability to manage its recyclables domestically. Doing so, the letter states, would be in line with what China describes as the principles of “waste producers responsibility” and “disposal to the nearest,” meaning countries process their own waste material domestically.

“Over decades, enterprises from other countries including the United States have exported large quantities of solid wastes to China and derived huge financial gains,” Chinese officials wrote. “We earnestly hope that these enterprises could also actively fulfil their international social responsibility and contribute to the alleviation of the environmental and health problems in China.”

Adjusting actions: The letter notes that China modified some of the policy changes due to concerns from affected stakeholders. For example, the letter points to the lower 0.5 percent contamination threshold requirement being deferred until March 1.

“The Chinese government made this decision in consideration of the fact that the relevant industries need transitioning and make adjustments to adapt to the new standards, while the risks of pollution resulting from the accumulation of large quantities of solid waste imports during the transitioning period must also be avoided,” the agency wrote.

0.5 percent is realistic: Also on the contamination standards, the letter notes that plastics have been subject to the 0.5 percent contamination limit since 2005. The letter raises that point to show that China views the 0.5 percent figure, which this year was applied to paper and most other recyclables, as a realistic limit.

“As our past experience with implementing the standards and inspecting waste imports over more than ten years has shown, imported wastes can meet China’s standards if they are properly categorized or pretreated at their sources,” Chinese officials wrote. “Therefore, this standard is not intended to impose any de facto import restriction.”

Global warming may be twice what climate models predict

Past warming events suggest climate models fail to capture true warming under business-as-usual scenarios

University of New South Wales

Future global warming may eventually be double what climate models project under business-as-usual scenarios, and even if the world meets the 2C target, sea levels may rise six metres or more, according to an international team of researchers from 17 countries.

The findings published last week in Nature Geoscience are based on observational evidence from three warm periods over the past 3.5 million years when the world was 0.5C-2C warmer than the pre-industrial temperatures of the 19th Century.

The research also revealed how large areas of the polar ice caps could collapse and how significant changes to ecosystems could see the Sahara Desert become green and the edges of tropical forests turn into fire-dominated savanna. "Observations of past warming periods suggest that a number of amplifying mechanisms, which are poorly represented in climate models, increase long-term warming beyond climate model projections," said lead author Prof Hubertus Fischer of the University of Bern.

"This suggests the carbon budget to avoid 2C of global warming may be far smaller than estimated, leaving very little margin for error to meet the Paris targets."

To get their results, the researchers looked at three of the best-documented warm periods, the Holocene thermal maximum (5000-9000 years ago), the last interglacial (129,000-116,000 years ago) and the mid-Pliocene warm period (3.3-3 million years ago).

The warming of the first two periods was caused by predictable changes in the Earth's orbit, while the mid-Pliocene event was the result of atmospheric carbon dioxide concentrations that were 350-450ppm - much the same as today.

Combining a wide range of measurements from ice cores, sediment layers, fossil records, dating using atomic isotopes and a host of other established paleoclimate methods, the researchers pieced together the impact of these climatic changes.

In combination, these periods give strong evidence of how a warmer Earth would appear once the climate had stabilized. By contrast, today our planet is warming much faster than during any of these periods as human-caused carbon dioxide emissions continue to grow. Even if our emissions stopped today, it would take centuries to millennia to reach equilibrium.

The changes to the Earth under these past conditions were profound - there were substantial retreats of the Antarctic and Greenland ice sheets and as a consequence sea-levels rose by at least six metres; marine plankton ranges shifted reorganising entire marine ecosystems; the Sahara became greener and forest species shifted 200 km towards the poles, as did tundra; high altitude species declined, temperate tropical forests were reduced and in Mediterranean areas fire-maintained vegetation dominated.

"Even with just 2C of warming - and potentially just 1.5C - significant impacts on the Earth system are profound," said co-author Prof Alan Mix of Oregon State University.

"We can expect that sea-level rise could become unstoppable for millennia, impacting much of the world's population, infrastructure and economic activity."

Yet these significant observed changes are generally underestimated in climate model projections that focus on the near term. Compared to these past observations, climate models appear to underestimate long-term warming and the amplification of warmth in polar regions.

"Climate models appear to be trustworthy for small changes, such as for low emission scenarios over short periods, say over the next few decades out to 2100. But as the change gets larger or more persistent, either because of higher emissions, for example a business-as-usual-scenario, or because we are interested in the long term response of a low emission scenario, it appears they underestimate climate change.," said co-author Prof Katrin Meissner, Director of the University of New South Wales Climate Change Research Centre.

"This research is a powerful call to act. It tells us that if today's leaders don't urgently address our emissions, global warming will bring profound changes to our planet and way of life - not just for this century but well beyond."

China’s global infrastructure spree rings alarm bells

by Basten Gokkon on 17 July 2018

Governments across Southeast Asia have embraced billions of dollars in construction projects backed by China as they rely on infrastructure-building to drive their economic growth.

But there are worries that this building spree, under China’s Belt and Road Initiative (BRI), makes no concessions for environmental protections, and even deliberately targets host countries with a weak regulatory climate.

Beijing has also been accused of going on a debt-driven grab for natural resources and geopolitical clout, through the terms under which it lends money to other governments for the infrastructure projects.

In parallel, China is also building up its green finance system, potentially as a means to channel more funding into its Belt and Road Initiative.

KUCHING, Malaysia — As governments in Southeast Asia target economic growth through infrastructure development, China, the world’s second-largest economy, has emerged as a ready funder for some of the most ambitious and expensive projects.

Regional leaders have been quick to seize the opportunity offered by Beijing, but environmental experts warn that many of these projects could cause irreversible environmental damage in highly biodiverse areas.

The infrastructure push has come in two waves this past decade, both aimed at establishing new roads, ports and railways across the region in pursuit of improved trade and logistics. In 2010, the Association of Southeast Asian Nations (Asean) announced its Master Plan for Connectivity, which seeks to boost regional integration among the 10 member countries of the bloc through infrastructure projects. Three years later, China inaugurated its Belt and Road Initiative (BRI), a $1 trillion transportation and energy infrastructure construction juggernaut aimed at giving Beijing a strong presence in markets across the region as well as in Africa and Europe. The initiative is slated for completion in 2049.

Representatives of governments pose for a photo at the Belt and Road International Forum in May 2017 in Beijing. The BRI has the backing of dozens of nations that are home to billions of people. Image courtesy of the Russian Presidential Press and Information Office (by CC 4.0).

China’s Belt and Road Initiative, showing China in red and the members of the Asian Infrastructure Investment Bank in orange. The BRI comprises six proposed corridors of the Silk Road Economic Belt, a land transportation route running from China to southern Europe via Central Asia and the Middle East; and the 21st-Century Maritime Silk Road, a sea route connecting the port of Shanghai to Venice, via India and Africa. Image by Lommes (by CC 4.0).

Overall, these and other initiatives would see paved roads in Asia’s developing nations double in length in the next few years, according to a 2017 report.

Asean is one of China’s largest trading partners, and given its vast natural resources and geographical proximity to China, the association of fast-growing nations is a key priority for the BRI.

More than 60 percent of Chinese overseas direct investment (ODI) to BRI countries from 2013 to 2015 went to Asean member states, according to The Economist Intelligence Unit. China’s total ODI in BRI countries was $21.4 billion in 2015, up from $13.6 billion in 2014 and $12.6 billion in 2013. Much of this money has gone to finance large, and often highly controversial, infrastructure projects.

One of them, for instance, is a high-speed rail line in Indonesia, which the government in Jakarta shelved due to a lack of proper environmental impact studies and conflicts with local zoning plans. Also in Indonesia, a massive dam project supported by the Bank of China and Sinohydro, China’s hydropower authority, has been widely criticized for threatening the only known habitat of the world’s rarest great ape, the Tapanuli orangutan.

There have also been complaints in Laos, Vietnam and Cambodia about potential damage to the environment and communities from Chinese-backed hydropower projects along the Mekong River.

What’s at stake?

The BRI may appear to be an ambitious take on globalization, by covering 68 countries through trade and commerce and linking them to China, but there’s more to it than that, says Mason Campbell, a postdoctoral research fellow at the Centre for Tropical Environmental and Sustainability Science at James Cook University in Australia.

“It’s basically under the guise of internationalizing China, but it’s more about resource extraction and attaining materials,” Campbell told Mongabay on the sidelines of the 2018 conference of the Association for Tropical Biodiversity Conservation in Kuching, Malaysia, in early July.

Wood that was cut for charcoal production near Katha, northern Myanmar. Villages in this area used to cut teak and other valuable trees for export, but have since started producing charcoal. The charcoal being exported to China can be made from any tree, regardless of size or species. This amount of trees would produce about 150 bags of charcoal. Image by Nathan Siegel for Mongabay.

Campbell, whose work has focused on the Pan Borneo Highway — a project linking the two Malaysian states on the island of Borneo with the nation of Brunei — said China’s way of asserting its influence in most infrastructure projects was typically by pushing for Chinese companies to work abroad or finance a development project in a foreign country that would eventually benefit China.

China has pursued minerals, fossil fuels, agricultural commodities and timber from other nations under this model. China is a net importer of coal, and Chinese firms are scheduled to build more than 1,600 coal plants in more than 62 countries, including in Southeast Asia. The country is also investing $100 billion annually in Africa for extractive mineral industries and the associated transportation and energy infrastructure. The effort to secure these resources has spawned its own infrastructure boom that typically involves building large-scale roads, railways and other infrastructure to transport commodities from interior areas to coastal ports for export.

In an analysis that only considers the backbone BRI projects, and not these various side projects, the WWF estimates that the Chinese initiative will directly impact 265 threatened species, including endangered tigers, giant pandas, gorillas, orangutans and Saiga antelopes.

“There is significant potential overlap between the terrestrial BRI corridors and areas that are important for biodiversity conservation and for the provision of social and economic benefits to people,” the wildlife NGO wrote in the report. “These overlaps indicate risk areas for potentially negative impacts of infrastructure development.”

It said the major BRI corridors would cut through or broadly overlap with 1,739 Important Bird Areas or Key Biodiversity Areas, as well as 46 biodiversity hotspots or Global 200 Ecoregions.

“It’s huge. It’s as big as oil palm,” Alex Lechner, a researcher from the School of Environmental and Geographical Sciences at the University of Nottingham Malaysia Campus, told Mongabay.

In a report published earlier this year, Lechner highlighted the rampant biodiversity loss in Asia from the BRI, which crosses several terrestrial and marine biodiversity hotspots, wilderness areas and other key conservation areas, such as Southeast Asia’s Coral Triangle.

Road development, the report said, will create direct and indirect impacts, such as habitat loss, fragmentation, and illegal activities such as poaching and logging. In the marine environment, increased sea traffic exacerbates the movement of invasive species and pollution. Poorly planned infrastructure has the risk of locking in undesirable environmental practices for decades to come, it added.

“BRI could have disastrous consequences for biodiversity,” Lechner wrote in the report.

This map provides an overview of the land areas slated for infrastructure development, and likely to be at highest environmental risk, as a result of the BRI corridors. According to a WWF analysis, the BRI is likely to impact threatened species, environmentally important areas, protected areas, water-related ecosystem services, and areas with important wilderness characteristics. Image by WWF.

China is not the only nation promoting its own economic interests over those of other countries and their environmental health. It’s a story that has played out going back to the colonial period and earlier, when European nations ruthlessly exploited the resources and people of Africa, Asia and the Americas.

The difference with China is both scale and speed, says William Laurance, a professor at James Cook University, whose work in the past two decades has primarily focused on infrastructure projects and their impact on the environment.

“No nation has ever changed the planet so rapidly, on such a large scale, and with such single-minded determination,” Laurance wrote in an op-ed last year. “It is difficult to find a corner of the developing world where China is not having a significant environmental impact.”

He also noted that China currently lacks other key factors that would make it easier to track and mitigate the impacts of projects, such as a free press or laws regulating the practice of Chinese businesses abroad.

According to a major World Bank analysis of nearly 3,000 projects, Chinese foreign investors and companies often predominate in poorer nations with weak environmental regulations and controls. This, Laurance said, makes those nations prime “pollution havens” for China and Chinese enterprises, as the latter wouldn’t take any blame for the environmental damage wrought by their activities in the host countries.

Mohammed Alamgir, an environmental scientist at James Cook University, described China’s BRI as also a way of buying long-term political influence across the globe.

“In this arena of political economy, this huge investment from China, they’re the only player in that field,” he said.

Campbell said the political favors sought by China would become clearer once the country receiving the funding for infrastructure projects struggled to pay back the money. A prominent case is that of the $1.5 billion Hambantota port in Sri Lanka. With the host government overleveraging itself to finance the project, it has had to give China a 99-year lease on the port in exchange for debt relief.

The scale of China’s international resource exploitation is only likely to increase. The Beijing-backed Asian Infrastructure Investment Bank (AIIB), the primary investor in BRI projects, is heavily capitalized and moving rapidly to fund overseas projects with “streamlined” environmental and social safeguards. However, the host nations can, in theory, negotiate for much more stringent standards for individual projects.

In the meantime, the World Bank in 2016 announced new environmental and social safeguards to remain competitive with the AIIB. But those new standards have been described by some experts, including Laurance, as weaker than the lender’s previous framework.

Laurance said the AIIB and other Chinese development banks could force a “race to the bottom” among multilateral lenders — with potentially grave consequences for the global environment.

In 2016, Chinese President Xi Jinping called for a “green, healthy, intelligent and peaceful” Silk Road. He said the participating countries should “deepen cooperation in environmental protection, intensify ecological preservation and build a green Silk Road.” Over the past decade, Chinese government ministries have released a series of “green papers” outlining lofty environmental and social guidelines for China’s overseas ventures and corporations.

But researchers remain skeptical about this commitment.

“The Chinese government readily admits that compliance with its guidelines is poor, but accepts no blame for this. Instead, it insists that it has little control over its corporations and blames the host nations themselves for not controlling Chinese corporations more carefully,” Laurance said.

“If China really wanted to rein in its freewheeling corporations, it could easily do so by making some strong official statements and visibly punishing a few extravagant sinners. It hasn’t done this for one simple reason: Despite their often-egregious environmental activities, China’s corporations operating overseas are enormously profitable.”

Alamgir said there was still hope for China to improve its environmental commitment by imposing more stringent strategic environmental assessments throughout the project development and funding processes.

“At the end of the day, it’s political will,” he said. “If the Chinese government had the political will to do that.”

Lechner called for more scientists from the global conservation community to look into the environmental impacts of the BRI, and conduct joint research with colleagues in China to better understand the dynamics of the initiative. He said researchers in China had produced about 90 papers reviewing the BRI, but they were in Chinese and only a small percentage were related to environmental aspects.

“I think there’s a lot to be learned about what’s happening internally [in China], because maybe we think [that] it’s a vacuum because it’s not in English, but maybe there are things going on internally,” Lechner said.

Green finance framework

In parallel with the BRI push, China appears to be developing its green financing system. A discussion involving 120 policymakers, financial regulators and practitioners from more than 35 countries in Asia, Africa and Latin America took place in China in May.

The six-day event was also attended by experts from more than 50 international organizations and commercial entities, including the Asian Development Bank, the Climate Bonds Initiative and the Commercial Bank of China. It highlighted Chinese roadmaps for green financial systems and the barriers related to regulatory policies, green definitions, raising awareness, building capacity, and collecting data.

“It was extremely important for senior government leaders at the highest level to send a strong policy signal to regulators and market participants on the importance of green finance to the economy,” said Ma Jun, director of the Center for Finance and Development at Tsinghua University in Beijing, who also led the drafting of China’s green finance guidelines in 2015-2016.

Sean Kidney, CEO of the Climate Bonds Initiative, said the green bond market has taken off in the past couple of years, with China and the United States leading the global market.

The Chinese experience, Kidney said, demonstrates that developing a clear green bond taxonomy, verifiers, and disclosure rules will help avoid the problem of greenwashing and also paves the way for smooth accreditation and verification of bond issuances and proceeds management.

Financial institutions based in Shanghai are also seen driving green finance through product innovation, according to Clair Liu, head of international business at the Shanghai Stock Exchange. She cited the exchange’s experience in promoting environmental information disclosure, green bonds, and green index development.

To achieve a successful green financial system, Ma called on governments to do more to encourage private capital, and to train and develop human resources in the sector.

“Policy coordination among ministries, development of taxonomies, and information disclosure are also key to success,” said Ma, whose current research priorities include investment in the BRI.

Editor’s note: William Laurance is a member of Mongabay’s advisory board.

Company takes meat off the menu at employee events

By The Associated Press

Posted at 6:41 PM, Updated at 7:24 PM

NEW YORK — Office space sharing company WeWork says it is no longer serving red or white meat at company events.

In an email to employees Thursday, co-founder and Chief Creative Officer Miguel McKelvey said the company won’t serve pork, poultry or red meat, and it won’t allow employees to expense meals that include those meats to the company. Fish will stay on the menu.

McKelvey said the change means WeWork will use less water and produce less carbon dioxide, as well as save the lives of animals.

The company said employees are welcome to bring whatever food they want to work.

The policy is effective immediately and also applies to the company’s Summer Camp gathering in the United Kingdom in August. McKelvey wrote that WeWork could save 10,000 animals by eliminating meat at the upcoming Summer Camp event. It has 6,000 employees and some 5,000 attended the event in 2017.

WeWork has locations in 22 countries and a total of 75 cities, including 24 in the U.S. The company declined to say how much water it expects to save in 2018 or 2019 from the new policy, or how much carbon dioxide it could save in the next two years. The company gave longer term projections based on its estimates for future growth over five years.

WeWork’s meat policy may be unique, but the company is joining a group of companies that have recently looked for ways to reduce their impact on the environment. Coffee chain Starbucks, airlines including American and Alaska, and the Hilton and Hyatt hotel chains have all recently announced that they will stop using plastic straws so they produce less plastic waste. Some cities have banned the straws as well.

This San Francisco Brand Claims To Have The World's Most Eco-Friendly Jeans

Esha Chhabra, Contributor for Forbes

May 15, 2018 @ 01:19 PM

Amour Vert's new lineup of jeans claims to be the most sustainable denim in the world.

For the past seven years, San Francisco-based fashion brand Amour Vert has been focused on manufacturing clothes in the Bay Area that are made from natural, eco-friendly materials. Now, in collaboration with LA-based denim label AGOLDE, it has released a collection of jeans that may well be the world's most eco-friendly.

The average American woman owns seven pairs of jeans at any one time, and recent research has put the average quantity of water used to produce a single pair at 1,800 gallons (jeans are washed repeatedly to achieve the right color and feel). For perspective, that works out to about the equivalent of 30 years' worth of drinking water for the average American. In the face of a global water crisis, the fashion industry's insatiable thirst is one of its most damaging attributes, and one that Amour Vert's new approach aims to combat.
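As a rough check on the scale of those figures, the short Python sketch below works through the arithmetic. The drinking-water rate of about one gallon per person per day is an assumption for illustration only (the article does not state the rate behind its comparison), so the year counts should be read as ballpark values.

```python
# Ballpark check of the denim water figures quoted above.
# Assumption (not from the article): ~1 gallon of drinking water per person per day.
GALLONS_PER_PAIR = 1_800      # water to produce one pair of jeans (from the article)
PAIRS_OWNED = 7               # average pairs owned at one time (from the article)
DRINKING_GAL_PER_DAY = 1.0    # assumed drinking-water rate

def years_of_drinking_water(gallons, rate=DRINKING_GAL_PER_DAY):
    """Convert a volume of water into years of drinking water at the assumed rate."""
    return gallons / rate / 365

print(f"One pair:    ~{years_of_drinking_water(GALLONS_PER_PAIR):.1f} years of drinking water")
print(f"Seven pairs: ~{years_of_drinking_water(GALLONS_PER_PAIR * PAIRS_OWNED):.1f} years")
# Under this assumed rate, the ~30-year comparison lines up with the seven-pair total.
```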

Produced at the AGOLDE factory in downtown Los Angeles using an ozone wash technology, the AGOLDE x Amour Vert jeans achieve the desired vintage look, they argue, without the high environmental cost usually associated with denim manufacturing.

The ozone wash procedure relies on a combination of electricity and gas through which denim can be aged in a natural and non-polluting process. It reportedly consumes less than one-tenth of the water used in traditional production methods. It also eliminates the need for harmful chemical processes such as bleaching. Subsequently, lasers are used to ensure that the washes and finishes are consistent. Amanda Halper Salinas, director of Marketing at Amour Vert, says this significantly reduces the quantity of water and the number of wash cycles needed to achieve the desired vintage-look of the denim.

The fashion industry is a major contributor to global pollution, and though the industry seems to be waking up to the impact it’s having globally, few brands are making tangible efforts to combat their carbon footprint and provide customers with a more ethical alternative. Amour Vert has been chipping away at that goal for nearly a decade -- and jeans, Salinas acknowledges, have been one of the hardest products to create in an eco-friendly manner.

The jeans are produced in Los Angeles using more eco-friendly approaches.

Founded by Parisian native Linda Balti and her husband Christoph Frehsee in 2011 on the principle that “a woman should never have to sacrifice style for sustainability,” the company embraces a zero-waste philosophy and has worked alongside American Forests to plant over 160,000 trees to offset its manufacturing footprint.

Amour Vert chose to collaborate with AGOLDE, the LA-based denim manufacturer, because they had worked for years on the technology and had the know-how, Salinas says, to transform denim from a polluting garment to a sustainable one.

Citizens of Humanity, also known for their jeans, is the parent company of AGOLDE. Federico Pagnetti, the Chief Operating Officer at Citizens of Humanity, says, “We have employees who have been in the business for nearly 15 years and during those years have had time to master their craft. They’ve been constantly evolving and are continuously on the lookout for new sustainable technology and methods to implement.”

Then there are the materials themselves: the new lineup uses certified organic cotton, sourced globally, because conventional cotton reportedly accounts for 16 percent of the world’s insecticide use and an estimated two billion dollars of pesticides annually. Overall, the company opts to source organic materials where possible. Because organic cotton is grown without insecticides and pesticides, the company estimates that this yields a 46 percent reduction in CO2 production, a 91 percent reduction in water consumption and a significantly diminished contribution to the acidification of both nearby land and water.

Amour Vert is one of a growing number of brands focusing on ethical fashion.

Made in what Amour Vert refers to as one of the country’s only vertically integrated denim production facilities, the jeans come in three cuts and nine washes. With prices between $148 and $198, they’re definitely investment pieces, but the female-led startup maintains that the premium pricing reflects the craftsmanship and quality that go into the garments, rather than the customer covering the costs of sustainability.

Pagnetti adds: “We will never price the garments higher because they are more ecological than other jeans. As a company, we chose to invest in establishing ourselves as a sustainable brand because that is what we believe in and aligns with our goal to make as little of a footprint on our environment. We don’t want our customer to pay for our choice to be sustainable, we just hope they believe in the product and intention as much as we do.”

So does this new denim range demonstrate that ethical fashion need not involve a compromise on style or a huge financial commitment? Could this indicate a change in the fashion industry’s relationship with environmental and social sustainability?

Frankly, there have been numerous denim brands focused on using more eco-friendly materials, like organic cotton, or repurposing waste fabrics. Brands such as Nudie Jeans have been tracking the transparency of their supply chain and producing modern cuts of raw denim for decades; SOURCE Denim uses a biodegradable material (made of crab shells, of all things) for its finish, cutting down on water usage and eliminating chemicals. In fact, even Levi’s announced this year that it’s going to incorporate lasers in its dyeing process to cut down on chemical usage.

Nevertheless, Amour Vert offers a certain fit, look, and style that will appeal to women, giving them something that’s chic and yet eco-friendly, which has been a gap in the denim market.

Flint estimates 14,000 lead water service lines still in the ground

Updated May 27; Posted May 27

By Ron Fonger | rfonger1@mlive.com

FLINT, MI -- The city estimates 14,000 damaged lead and galvanized water service lines remain buried, about 15 percent more than past projections.

Director of Public Works Robert Bincsik provided the update in a May 16 letter to the U.S. Environmental Protection Agency, basing the projection on what crews have uncovered so far in work for the Flint Action and Sustainability Team (FAST) Start project.

City spokeswoman Kristin Moore said she doesn't believe the new estimate will require work to continue beyond next year and said crews are running ahead of schedule with work so far in 2018.

"During the first four phases of FAST Start, the city has conducted excavations at approximately 8,843 homes," Bincsik's letter says. "Of those ... 6,256 lead or galvanized steel service lines were identified or replaced ... approximately 30 percent of the lines were identified as already being copper from the water main to the house."

With no state bottled water, Flint pipe replacement starts with new urgency

"They said the (water points of distribution) would stay open until (service line removals were complete), and yet again they have backed off of their word and what they said they would do," Weaver said in a statement released by the city earlier this week. "This is exactly why the people's trust has not been restored."

Experts have said lead and galvanized lines became damaged by corrosive Flint River water used by the city from April 2014 until October 2015. Those lines essentially absorbed -- and have the potential to release -- lead particles.

The city has about 30,000 water accounts, 28,400 of which are residential.

Bincsik's letter says if the percentage of copper service lines still in the ground continues as it has so far, Flint still has about 14,000 lines to replace.

The city's past shoddy record-keeping has slowed the replacement of galvanized and lead lines, forcing the use of hydro-excavation contractors to determine which lines need replacement and which can remain.

Flint data on lead water lines stored on 45,000 index cards

FLINT, MI — The city knows which homes in Flint have pipes most likely to leach lead into tap water but can't easily access the information because it's kept on about 45,000 index cards.

City officials have said previously that information about the composition of the service lines was often stored on blueprints or index cards, and in some cases, no information on the lines was maintained by the city.

In February, the EPA asked the city to update its efforts to rebuild its service line inventory, prompting Bincsik's letter.

Bay mussels in Puget Sound show traces of oxycodone

May 9, 2018

By Jeff Rice

The opioid epidemic has now hit the waters of Puget Sound. State agencies tracking pollution levels in Puget Sound have discovered traces of oxycodone in the tissues of native bay mussels (Mytilus trossulus) from Seattle and Bremerton area harbors.

The mussels were part of the state’s Puget Sound Mussel Monitoring Program. Every two years, scientists at the Washington Department of Fish and Wildlife (WDFW) transplant uncontaminated mussels from an aquaculture source on Whidbey Island to various locations in Puget Sound to study pollution levels. Mussels, which are filter feeders, concentrate contaminants from the local marine environment into their tissues. After two to three months at the transplant site, scientists analyze the contaminants in the collected mussel tissues.

The areas where the oxycodone-tainted mussels were sampled are considered highly urbanized and are not near any commercial shellfish beds. “You wouldn’t want to collect (and eat) mussels from these urban bays,” explained PSI’s Andy James, who assisted with the study. The oxycodone was found in amounts thousands of times lower than a therapeutic dose for humans and would not be expected to affect the mussels, which likely don’t metabolize the drug, James said. The findings may raise concerns for fish, however, which are known to respond to opioids. Lab studies show that zebrafish will learn to dose themselves with opioids, and scientists say salmon and other Puget Sound fish might have a similar response.

Scientists typically find many chemical compounds in Puget Sound waters, ranging from pharmaceuticals to illicit drugs such as cocaine, but this is the first time that opioids have been discovered in local shellfish. The contaminants in this case are thought to be passed into Puget Sound through discharge from wastewater treatment plants. Even filtered wastewater can potentially include traces of thousands of chemicals known as contaminants of emerging concern (CECs). Runoff from agriculture and stormwater are also common sources of CECs.

In addition to oxycodone, the mussels also showed high levels of the chemotherapy drug Melphalan, which is a potential carcinogen due to its interactions with DNA. The drug was found at “levels where we might want to look at biological impacts,” said James. Relative to body weight, the mussels had ingested amounts of Melphalan comparable to a recommended dose for humans.

These Puget Sound mussel monitoring studies occur every two years and are currently funded by WDFW, the state’s Stormwater Action Monitoring program, and various other regional partners. The monitoring is led by Jennifer Lanksbury of WDFW’s Toxics-focused Biological Observing System (TBiOS), along with help from a host of citizen science volunteers from across Puget Sound. PSI’s Andy James worked with TBiOS on the chemical analysis and presented the findings at last month’s Salish Sea Ecosystem Conference.

James and his team at the University of Washington’s Center for Urban Waters in Tacoma are now using high resolution mass spectrometry to look for additional chemical exposures in the mussel tissues and to evaluate potential biological impacts on Puget Sound species.

Waste Heat: Innovators Turn to an Overlooked Resource

Nearly three-quarters of all the energy produced by humanity is squandered as waste heat. Now, large businesses, high-tech operations such as data centers, and governments are exploring innovative technologies to capture and reuse this vast energy source.

By Nicola Jones • May 29, 2018

When you think of Facebook and “hot air,” a stream of pointless online chatter might be what comes to mind. But the company will soon be putting its literal hot air — the waste heat pumped out by one of its data centers — to good environmental use. That center, in Odense, Denmark, plans to channel its waste heat to warm nearly 7,000 homes when it opens in 2020.

Waste heat is everywhere. Every time an engine runs, a machine clunks away, or any work is done by anything, heat is generated. That’s a law of thermodynamics. More often than not, that heat gets thrown away, dribbling out into the atmosphere. The scale of this invisible garbage is huge: About 70 percent of all the energy produced by humanity gets chucked as waste heat.

“It’s the biggest source of energy on the planet,” says Joseph King, one of the program directors for the U.S. government’s Advanced Research Projects Agency-Energy (ARPA-E), an agency started in 2009 with the mission of funding high-risk technology projects with high potential benefit. One of the agency’s main missions is to hike up energy efficiency, which means both avoiding making so much waste heat in the first place, and making the most of the heat that’s there. ARPA-E has funded a host of innovative projects in that realm, including a $3.5 million grant for RedWave Energy, which aims to capture the low-temperature wasted heat from places like power plants using arrays of innovative miniature antennae.

“The attitude has been that the environment can take this waste,” says one expert. “Now we have to be more efficient.”

The problem is not so much that waste heat directly warms the atmosphere — the heat we throw into the air accounts for just 1 percent of climate change. Instead, the problem is one of wastage. If the energy is there, we should use it. For a long time, says Simon Fraser University engineer Majid Bahrami, many simply haven’t bothered. “The attitude has been that the environment can take this waste; we have other things to worry about,” he says. “Now we have to be more efficient. This is the time to have this conversation.”

The global demand for energy is booming — it’s set to bump up nearly 30 percent by 2040. And every bit of waste heat recycled into energy saves some fuel — often fossil fuels — from doing the same job. Crunching the exact numbers on the projected savings is hard to do, but the potential is huge. One study showed that the heat-needy United Kingdom, for example, could prevent 10 million tons of carbon dioxide emissions annually (about 2 percent of the country’s total) just by diverting waste heat from some of the UK’s biggest power stations to warm homes and offices. And that’s not even considering any higher-tech solutions for capturing and using waste heat, many of which are now in the offing.

To help reduce carbon emissions — not to mention saving money and lessening reliance on foreign fuel imports — governments are increasingly pushing for policies and incentives to encourage more waste heat usage, big businesses like IBM are exploring innovative technologies, and start-ups are emerging to sell technologies that turn lukewarm heat into usable electricity.

For more than a century, waste heat has been used for its most obvious application: heat (think of your car, which uses waste heat from its engine to heat your interior). In 1882, when Thomas Edison built the world’s first commercial power plant in Manhattan, he sold its steam to heat nearby buildings. This co-generation of electricity and usable heat is remarkably efficient. Today, in the United States, most fossil fuel-burning power plants are about 33 percent efficient, while combined heat and power (CHP) plants are typically 60 to 80 percent efficient.
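To see why those efficiency numbers matter, here is a back-of-the-envelope comparison, sketched in Python, of the fuel needed to meet the same heat and electricity demand separately versus with co-generation. The 33 percent and 60 to 80 percent figures come from the ranges quoted above; the demand values and the 90 percent boiler efficiency are hypothetical assumptions.

```python
# Back-of-the-envelope comparison of fuel use: separate generation vs. CHP.
# Efficiencies come from the ranges quoted above; demand figures are hypothetical.
electric_demand = 100.0   # units of electricity needed (arbitrary energy units)
heat_demand = 120.0       # units of useful heat needed

# Separate production: a 33%-efficient power plant plus a 90%-efficient boiler
# (the boiler efficiency is an assumption, not a figure from the article).
fuel_separate = electric_demand / 0.33 + heat_demand / 0.90

# Combined heat and power: assume 75% of fuel energy becomes useful electricity
# plus heat (roughly the midpoint of the 60-80% range quoted above).
fuel_chp = (electric_demand + heat_demand) / 0.75

print(f"Fuel, separate plants: {fuel_separate:.0f} units")        # ~436
print(f"Fuel, CHP plant:       {fuel_chp:.0f} units")              # ~293
print(f"Savings: {(1 - fuel_chp / fuel_separate) * 100:.0f}%")     # ~33%
```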

When it opens in 2020, Facebook's new data center in Odense, Denmark will channel its waste heat to warm nearly 7,000 homes. Facebook

That seems to make co-generation a no-brainer. But heat is harder to transport than electricity — the losses over piped distances are huge — and there isn’t always a ready market for heat sitting next to a power plant or industrial facility. Today, only about 10 percent of electricity generation in the U.S. produces both power and usable heat; the Department of Energy has a program specifically to boost CHP, and considers 20 percent a reasonable target by 2030.

Other countries have an easier time thanks to existing district heating infrastructure, which typically uses locally produced heat to pipe hot water into homes. Denmark is a leader here. In response to the 1970s oil crisis, the country began switching to other energy sources, including burning biomass, which lend themselves to district heating. As a result, Denmark has an array of innovative waste-heat capture projects that can be added onto existing systems, including the upcoming Facebook data center.

In 2010, for example, Aalborg’s crematorium started using its waste heat to warm Danish homes (after the Danish Council of Ethics judged it a moral thing to do). Others are joining in. In Cologne, Germany, the heat of sewage warms a handful of schools. In London, the heat from the underground rail system is being channelled to heat homes in Islington. An IBM data center in Switzerland is being used to heat a nearby swimming pool. “Data centers crop up again and again as having huge potential,” says Tanja Groth, an energy manager and economist with the UK’s Carbon Trust, a non-profit that aims to reduce carbon emissions.

An alternative option is to turn waste heat into easier-to-transport electricity. While many power plants do that already, regulators striving for energy security are keen to push this idea for independent power producers like large manufacturers, says Groth. Businesses that make their own power would reduce carbon emissions by getting any extra electrical juice they need by squeezing it out of their waste heat, rather than buying it from the grid.

Waste heat is a problem of a thousand cuts, requiring a mass of different innovations.

Several companies have popped up to help do just this. One of the largest, Turboden, based in Brescia, Italy, sells a mechanical system based on the Organic Rankine Cycle. This is a type of external combustion engine — an idea that pre-dates the internal combustion engine used in cars. Rankine engines and similar technologies have contained, closed-loop systems of liquid that expand to gas to do work, thanks to a temperature difference on the outside of the system — so you can drive a power-generating engine off waste heat. When a cement plant in Bavaria, for example, added a Rankine engine to its system a decade ago, it reduced its electricity demand by 12 percent and its CO2 emissions by about 7,000 tons.
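How much electricity any such heat engine can extract is bounded by thermodynamics: the Carnot limit depends only on the temperatures of the waste-heat source and the surroundings. The short Python sketch below illustrates that limit for a few hypothetical source temperatures; the temperatures are examples, not figures from the article, and real Organic Rankine Cycle systems recover only a fraction of the Carnot maximum.

```python
# Carnot limit on converting waste heat to work: eta_max = 1 - T_cold / T_hot,
# with temperatures in kelvin. The temperatures below are hypothetical examples,
# not figures from the article; real heat engines recover only a fraction of this.
def carnot_efficiency(t_hot_c, t_cold_c):
    """Maximum possible conversion efficiency for the given temperatures (deg C)."""
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return 1 - t_cold_k / t_hot_k

for t_hot in (90, 150, 300):            # assumed waste-heat temperatures, deg C
    eta = carnot_efficiency(t_hot, 25)  # rejecting heat to surroundings at 25 deg C
    print(f"{t_hot:>3} C source: Carnot limit ~{eta * 100:.0f}%")
# 90 C -> ~18%, 150 C -> ~30%, 300 C -> ~48%
```

The drop-off at low temperatures is why the lukewarm end of the waste-heat spectrum, discussed later in the article, is so hard to exploit.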

Since 2010, Turboden says it has sold waste heat recovery systems to 28 production plants, with seven more under construction now. Turboden is just one of many; the Swedish-based company Climeon, for example, endorsed by spaceflight entrepreneur Richard Branson, uses a related but distinct technique to make an efficient heat engine that can be bolted onto anything industrial, from cement plants to steel mills, in order to recycle their waste heat.

Waste heat is a problem of a thousand cuts, requiring a mass of innovations to tackle different slices of the problem: a system that works for one temperature range, for example, might not work for another, and some waste heat streams are contaminated with corrosive pollutants. “We aren’t looking for a silver bullet,” says Bahrami. “There are so many different things that can be done and should be done.”

About 70 percent of all the energy produced globally gets discarded as waste heat.

Bahrami and others are pursuing solid-state systems for waste heat recovery, which, with no moving parts, can in theory be smaller and more robust than mechanical engines. There are a wide array of ways to do this, based on different bits of physics: thermoacoustics, thermionics, thermophotovoltaics, and more, each with pros and cons in terms of their efficiency, cost, and suitability to different conditions.

“Thermoelectrics have been the major player in this space for years,” says Lane Martin, a materials scientist at the University of California, Berkeley and Lawrence Berkeley National Laboratory. Seiko released a “thermic watch” in 1998 that ran off the heat of your wrist, for example, and you can buy a little thermoelectric unit that will charge your cell phone off your campfire. Researchers are trying hard to increase the efficiency of such devices so they make economic sense for wide-scale use. That means screening thousands of promising new materials to find ones that work better than today’s semiconductors, or tweaking the microstructure of how they’re built.

The biggest technological challenge is to pull energy from the lukewarm end of the spectrum of waste heat: More than 60 percent of global waste heat is under the boiling point of water, and the cooler it is, the harder it is to pull usable energy from it. Martin’s group is tackling this by investigating pyroelectrics (which, unlike thermoelectrics, work by exploiting electron polarization). This isn’t near commercial application yet; it’s still early days in the lab. But the thin-film materials that Martin’s team is investigating can be tuned to work best at specific temperatures, while thermoelectrics always work better the larger the temperature difference. Martin imagines future systems that stack thermoelectric materials to suck up some of the warmer waste heat, say above 212 degrees Fahrenheit, and then use pyroelectrics to mop up the rest. Martin says his recent work on such materials drummed up interest from a few bitcoin mining operations. “They have a real problem with waste heat,” says Martin. “Unfortunately, I had to tell them it’s a little early; I don’t have a widget I can sell them. But it’s coming.”

“Waste heat is an afterthought — we’re trying to make it a forethought,” says a U.S. government scientist.

Perhaps one of the best applications for waste heat is, ironically, cooling. Air conditioners and fans already account for about 10 percent of global energy consumption, and demand is set to triple by 2050. In urban areas, air conditioners can actually heat the local air by nearly 2 degrees F, in turn driving up the demand for more cooling.

One solution is to use waste heat rather than electricity to cool things down: absorption or adsorption coolers use the energy from heat (instead of electrically driven compression) to condense a refrigerant. Again, this technology exists — absorption refrigerators are often found in recreational vehicles, and tri-generation power plants use such technology to make usable electricity, heat, and cooling all at once. “Dubai and Abu Dhabi are investing heavily in this because, well, they’re not stupid,” says Groth.

But such systems are typically bulky and expensive to install, so again research labs are on a mission to improve them. Project THRIVE, led in part by IBM Research in Rüschlikon, Switzerland, is one player aiming to improve sorption materials for both heating and cooling. They have already shown how to shrink some systems down to a reasonable size. Bahrami’s lab, too, is working on better ways to use waste heat to cool everything from long-haul trucks to electronics.

It’s very hard to know which strategies or companies will pan out. But whatever systems win out, if these researchers have their way, every last drop of usable energy will be sucked from our fuel and mechanical systems. “Waste heat is often an afterthought,” says King. “We’re trying to make it a forethought.”

Climate change course aims to persuade evangelicals to go green

Christian Today staff writer Tue 29 May 2018 11:23 BST

A new climate change course is targeting evangelical Christians, hoping to persuade the notoriously sceptical group to go green.

Only four per cent of Christians think the environment is one of the most important issues facing the UK, according to 2015 polling, and conservative evangelicals are much less likely than other Christians to be concerned by the impact of climate change in the future.

Plastic waste is increasingly recognised as an environmental disaster, in part thanks to the BBC's Blue Planet documentary.

Tenants of the King is a new course designed to persuade climate-sceptic Christians that the New Testament teaches them to live sustainably.

'We wanted to provide something for Christians who struggle to see the connection between environmental issues, the teaching of the Bible and their daily walk as Christians,' said Stephen Edwards, from Operation Noah, a Christian climate change charity launching the programme.

'We've put together a Bible-based, Jesus-focused study guide which we hope can help communicate environmental issues from a Christian perspective, equipping believers to take confident personal and political action.'

It is backed by the bishop of Kensington, Graham Tomlin, as well as Rev Mark Melluish, from New Wine, Dr Ruth Valerio, of Tearfund, and Dr Justin Thacker, of Cliff College, who have contributed video reflections for the small group discussions.

Valerio said: 'We live in an amazing world that God has placed us in and has asked us to look after. As a church we have got a part to play in responding to God's call. I want to recommend this resource as a way to learn about the calling God has given us to take care of this world.'

Liberals to buy Trans Mountain pipeline for $4.5B to ensure expansion is built

Canadian public could also incur millions to construct expansion project with estimated price tag of $7.4B

Kathleen Harris CBC News Posted: May 29, 2018 8:15 AM ET

Finance Minister Bill Morneau announced today the government is buying the Trans Mountain pipeline for $4.5 billion.

The Liberal government will buy the Trans Mountain pipeline and related infrastructure for $4.5 billion, and could spend billions more to build the controversial expansion.

Finance Minister Bill Morneau announced details of the agreement reached with Kinder Morgan at a news conference with Natural Resources Minister Jim Carr this morning, framing the short-term purchase agreement as financially sound and necessary to ensure a vital piece of energy infrastructure gets built.

"Make no mistake, this is an investment in Canada's future," Morneau said.

Morneau said the project is in the national interest, and proceeding with it will preserve jobs, reassure investors and get resources to world markets. He said he couldn't state exactly what additional costs will be incurred by the Canadian public to build the expansion, but suggested a toll paid by oil companies could offset some costs and that there would be a financial return on the investment.

Kinder Morgan had estimated the cost of building the expansion would be $7.4 billion, but Morneau insisted that the project will not have a fiscal impact, or "hit."

He said the government does not intend to be a long-term owner, and at the appropriate time, the government will work with investors to transfer the project and related assets to a new owner or owners. Investors such as Indigenous groups and pension funds have already expressed interest, he said.

Until then, the project will proceed under the ownership of a Crown corporation. The agreement, which must still be approved by Kinder Morgan's shareholders, is expected to close in August.

A senior government official, speaking on background, said the government hopes to get a new commercial buyer for the pipeline by August, but if that doesn't happen, it will put up the $4.5 billion to purchase the assets.

The government won't publicly discuss construction cost for the expansion because it wants private companies to carry out their own assessments, then bid on the project, the official said.

Conservative Leader Andrew Scheer said today's decision does nothing to advance the project, since the legal questions and obstacles still remain. He said the government has failed to take action to ensure certainty around the expansion by resolving jurisdictional issues.

"This is a very, very sad day for Canada's energy sector. The message that is being sent to the world is that in order to get a big project build in this country, the federal government has to nationalize a huge aspect of it," he said.

NDP Leader Jagmeet Singh called it a "bad deal that will solve nothing." Pushing ahead with the pipeline betrays the government's promise to ease reliance on fossil fuels, he said.

"Climate change leaders don't spend $4.5 billion dollars on pipelines," he said. "We need a government with a vision that takes our future seriously."

The pipeline expansion project has faced intense opposition from the B.C. government, environmental activists and some Indigenous groups.

Carr said the plan does not sacrifice the environment for economic benefits.

"Canadians want both and we can have both," he said.

Kinder Morgan issued a statement that says the deal represents the best way forward for shareholders and Canadians.

"The outcome we have reached represents the best opportunity to complete Trans Mountain Expansion Project and thereby realize the great national economic benefits promised by that project," said chairman and CEO Steve Kean.

"Our Canadian employees and contractors have worked very hard to advance the project to this critical stage, and they will now resume work in executing this important Canadian project."

Green Party Leader Elizabeth May, who pleaded guilty Monday to criminal contempt for protesting the pipeline, tweeted that Kinder Morgan is "laughing all the way to the bank."

She called it a bad public policy decision that future generations will regret.

"Historically, I'm quite certain, this will go down as an epic financial, economic boondoggle that future students of political science will say, 'Why on earth did they do that? That made no sense,'" she said.

Alberta Premier Rachel Notley called it "a major step forward for all Canadians." She believes any efforts to "harass" the project will have less effect with the federal government as the owner, because it will have Crown immunity in legal proceedings.

She said the pipeline remains a commercially viable project that will turn a profit. She conceded that governments could be on the hook if there is a spill, but said spills are becoming less frequent.

"Just like any project, there is risk. In this case, the risk is very low," she said.

Prime Minister Justin Trudeau took to Twitter to praise the deal.

"Today, we've taken action to create and protect jobs in Alberta and B.C., and restart construction on the TMX pipeline expansion, a vital project in the national interest," his post says.

Under the arrangement, the government will indemnify a potential buyer for additional costs caused by provincial or municipal attempts to delay or obstruct the expansion. It also promises to underwrite costs if the proponent abandons the project because of an adverse judicial decision, or because it can't be completed by a predetermined date despite "commercially reasonable efforts."

Under either of those scenarios, the government will have the option to re-purchase the pipeline before the expansion is abandoned.

Notley has been locked in a bitter dispute over the pipeline with B.C. Premier John Horgan.

Today, Horgan said a change of ownership doesn't alter his concerns about the risk of a spill that could harm the coastal environment and said he'll proceed with a legal challenge.

"The good news is I think I have a better chance of progress with a Crown corporation and a government that is responsive to people rather than a company that is only responsive to its shareholders," he told CBC in Vancouver.

In a news conference, Horgan said the dispute should have been resolved through a joint reference to the Supreme Court.

"Now we have both Ottawa and Alberta, rather than going to court to determine jurisdiction, they're making financial decisions that affect taxpayers, and they'll have to be accountable for that."

The twinning of the 1,150-kilometre-long Trans Mountain pipeline will nearly triple its capacity to an estimated 890,000 barrels a day and increase traffic off B.C.'s coast from approximately five tankers to 34 tankers a month. (CBC News)

The federal government had looked at three options for moving the project forward:

compensating Kinder Morgan — or any other company — for financial losses caused by British Columbia's attempts to block the project;

buying and building the expansion itself, and then selling it once the work is complete; or

buying the project from Kinder Morgan, then putting it on the market for investors willing to pick up the project and build it themselves.

Morneau's announcement comes just two days before a deadline that had been set by Kinder Morgan. The company had said it needed clarity on a path forward for the project by May 31 or it would walk away from construction.

The original Trans Mountain pipeline was built in 1953. The expansion would be a twinning of the existing 1,150-kilometre pipeline between Strathcona County (near Edmonton), Alta., and Burnaby, B.C. It would add 980 kilometres of new pipeline and increase capacity from 300,000 barrels a day to 890,000 barrels a day.

According to Kinder Morgan's project website, the construction and the first 20 years of expanded operations would mean a combined government revenue of $46.7 billion, with $5.7 billion for B.C., $19.4 billion for Alberta and $21.6 billion for the rest of Canada.

At Two Power Plants, Scientists Are Racing Each Other To Turn Carbon Into Dollars

The 10 finalists in the four-year-long Carbon XPRIZE have been selected, and will embark on a two-year project to show that there’s a market for products made from captured carbon.

By Eillie Anzilotti

It sounds like something out of science fiction: 10 teams of scientists and innovators working on plans to convert carbon emissions into useful products will ship out to two carbon-dioxide-emitting power plants. Five teams will travel to a natural gas-fired plant in Alberta, Canada, and the five others will go to a coal-powered plant in Gillette, Wyoming. There, they’ll have two years to prove the validity of their models.

This is, in fact, the final stage of the Carbon XPRIZE–a four-and-a-half-year-long, $20 million global competition to develop and scale models for converting carbon emissions into valuable products like enhanced concrete, liquid fuel, plastics, and carbon fiber. XPRIZE runs various competitions around topics ranging from water quality to public health. From 47 ideas first submitted to the challenge, a panel of eight energy and sustainability experts whittled the list down to the final 10. The two winners–one from the Canada track, and the other from Wyoming–will receive a $7.5 million grand prize to bring their innovation to market.

“We give the teams literally the pipes coming out of the power plants, and they can bring whatever technology they’re developing to plug into that source,” says Marcius Extavour, XPRIZE senior director of Energy and Resources and the lead on the Carbon XPRIZE competition. Teams will be judged on how much CO2 they convert, and the net value of their innovations.

The finalists stationed at Wyoming include C4X, a team from Suzhou, China producing bio-foamed plastics, and Carbon Capture Machine from Aberdeen, Scotland, which is making solid carbonates potentially to be used in building materials. Carbon Cure from Dartmouth, Canada, and Carbon Upcycling UCLA from Los Angeles are both experimenting with CO2-infused concrete, and Breathe from Bangalore is making methanol, which can be used as fuel.

In Alberta, Carbicrete from Montreal is making concrete with captured CO2 emissions and waste from steel production, and Carbon Upcycling Technologies from Calgary is producing nanoparticles that can strengthen concrete and polymers. CERT from Toronto is making ingredients for industrial chemicals, C2CNT from Ashburn, Virginia, is making tubing that can serve as a lighter alternative to metal, say for batteries, and Newlight from Huntington Beach, California, is making bioplastics.

“This XPRIZE is about climate change, sustainability, and getting to a low-carbon future,” Extavour says. “The idea is to take emissions that are already being produced, and preventing them from leaking out into the atmosphere or oceans or soil, and converting them, chemically, into valuable material.”

Carbon capture is not a new idea. The concept of trapping carbon emissions before they seep out from a power plant by sequestering them in the ground, or sucking them out of the air, as a facility in Zurich does, has been around for years, but not without controversy. If innovators are able to scale carbon-capture and conversion models, will it stop the push toward renewables?

The two entities sponsoring the prize certainly make it seem like that’s a possibility. One, NRG, is a large energy company that manages power plants across the U.S.; the other is Canada’s Oil Sands Innovation Alliance, a consortium of oil sands producers. (NRG has made efforts to reduce its emissions; it’s retiring three natural gas-fired plants across California over the next year.)

But Extavour does not see carbon conversion as antithetical to reducing emissions overall. Rather, “I think it’s complementary,” he says. “We just don’t have the option of turning off our CO2-emitting resources today.” While emission-free options like solar, wind, and geothermal are scaling, he says, they’re not doing so fast enough to completely replace carbon. “There’s still a hard core of emissions from sectors like manufacturing that we have to get our hands around,” Extavour says.

“This isn’t about a proposal anymore,” Extavour says. “This is about: Can you build it in a way that works and is reliable? And can you do it in a way that’s not just climate and carbon sustainable, but economically sustainable? Can you build a business around this technology? Because if you can, that’s how we can get emitters of CO2 today to actually adopt these solutions and scale them up, and really take a bite out of emissions.”

Burning Wood as Renewable Energy Threatens Europe’s Climate Goals

Scientists say a new EU policy on biomass is 'simplistic and misleading' and will increase emissions. U.S. forests are being turned into wood pellets to feed demand.

BY BOB BERWYN, INSIDECLIMATE NEWS

JUN 22, 2018

The European Union declared this week that it could make deeper greenhouse gas cuts than it has already pledged under the Paris climate agreement. But its scientific advisors are warning that the EU's new renewable energy policy fails to fully account for the climate impacts of burning wood for fuel.

By counting forest biomass, such as wood pellets used in power plants, as carbon-neutral, the new rules could make it impossible for Europe to achieve its climate goals, the European Academy of Sciences Advisory Council (EASAC) wrote in a strongly worded statement.

The council said the renewable energy policy's treatment of biomass is "simplistic and misleading" and could actually add to Europe's greenhouse gas emissions over the next 20 to 30 years.

That bump in emissions would come just as the planet's carbon emissions budget is running out, said William Gillett, EASAC's energy director. The Paris agreement aims to reduce net emissions from energy systems to zero within the next several decades.

"The Paris agreement put the time dimension into stark focus," Gillett said. "We don't have 200 years to get to carbon balance. We only have 10 to 20 years. Our carbon budget is nearly used up, and burning trees uses up the budget even faster," he said.

The Math Doesn't Add Up

The countries in the Paris treaty have been encouraged to adopt new, more ambitious goals in the next few years to further reduce the greenhouse gas emissions that are driving global warming.

So, European nations, among the treaty's strongest backers, have engaged in prolonged negotiations toward deeper emissions cuts. Over the past two weeks, they agreed to increase renewable energy to 32 percent of the power mix and set a goal of 32 percent energy efficiency savings.

The EU's climate commissioner, Miguel Arias Cañete, told a meeting of environment leaders from Europe, Canada and China on Wednesday that the new policies would mean the European Union could increase its emissions reduction target from 40 percent to just over 45 percent by 2030.

But the renewable energy policy includes burning wood for fuel. Over a year ago, the EU's science advisors published a comprehensive report debunking the logic behind treating all wood fuel as beneficial to the climate. Because burning wood gives off more CO2 than coal per unit of electricity produced, the climate math doesn't add up, scientists say.

Large-scale forest harvests have a climate warming effect for at least 20 to 35 years, said University of Helsinki climate and forest scientist Jaana Bäck, who noted that scores of evidence-based studies all say basically the same thing.

"And if we look at the Paris targets, we are in critical times at the moment. We need to reduce emissions now, not in 50 or 100 years," she said.

Of particular concern is the harvesting of mature trees. Converting waste wood or fast-growing agricultural products has less climate impact.

The adoption of the rules partly reflects a disconnect between science and policy, and partly the political compromises made in the EU confederation, which strives for consensus. Some countries, including those in forest-rich Scandinavia, pushed for the rules in their current form, Gillett said.

EASAC noted that while it may be too late to change the EU directive itself, each country can now decide how to implement it. The national science academies will be advising policy makers in their respective countries on how to implement the rules without adding to emissions, Gillett said.

What About Sustainable Forests?

In creating the energy policy, the EU attempted to address the climate impacts of wood-burning by setting standards, such as requiring biomass supply chains in the heat and power sector to emit 80 to 85 percent less greenhouse gas than fossil fuels. And the forest biomass is supposed to come from certified sustainable forests.

But that doesn't cover the emissions from burning the biomass or the loss of stored carbon when trees are harvested, "in other words, all the main things," said Alex Mason, who tracks EU energy policy for the World Wide Fund for Nature Europe, formerly the World Wildlife Fund.

Other proposed climate safeguards, such as limiting subsidies to wood-burning facilities that produce both electricity and heat, were watered down during the negotiations. In the final version, the low standards for subsidies will encourage more inefficient biofuel projects with high emissions, Mason said.

"That has potentially disastrous consequences for the climate and for global forests. That's precisely why nearly 800 scientists wrote to the members of the European Parliament in January—but they were ignored," he said. He said watchdog groups like his would continue to press the EU and its member countries to change course.

U.S. Southeast Is a Major Biomass Source

Biomass rules in Europe have a direct impact on the United States, as well.

European subsidies have been driving deforestation in the Southeastern U.S. since 2009, when the EU adopted its first renewable energy standards, said David Carr, with the Southern Environmental Law Center.

Logging in areas like northeastern North Carolina and adjacent parts of Virginia has spread so fast that environmental groups haven't been able to compile an accurate regional picture. But they do know that some of the logging is happening in ecologically valuable wetlands forests.

In one part of North Carolina, the wood fuel industry has been logging about 50,000 acres per year (about the size of Washington, D.C.) to meet demand at four wood pellet factories for export to Europe.

"The pellet producers say they are taking residues, but they're taking trees up to 2 feet in diameter, big trees that store a lot of carbon," Carr said. "You're burning that immediately and putting all that carbon in the atmosphere."

"There's no commitment those forests will regrow, no legal obligation to replant them, and it's nearly all being exported," he said. "We are the third world on this one."

Recycled plastic could supply three-quarters of UK demand, report finds

Circular economy could recycle more plastic and meet industry demand for raw materials, finds Green Alliance research

Sandra Laville

Thu 14 Jun 2018 01.01 EDT Last modified on Thu 14 Jun 2018 01.40 EDT

Recycled plastic products: the UK does not have an adequate system to capture, recycle and re-use plastic materials, the report found.

Plastic recycled in the UK could supply nearly three-quarters of domestic demand for products and packaging if the government took action to build the industry, a new report said on Thursday.

The UK consumes 3.3m tonnes of plastic annually, the report says, but exports two-thirds to be recycled. It is only able to recycle 9% domestically.

Measures including increased taxes on products made with virgin plastic, and mandatory targets for using recycled plastic in packaging, could encourage an additional 2m tonnes of plastic to be recycled in the UK, the report from Green Alliance said.

The analysis said simply collecting plastic and sending it abroad for recycling does not solve the problem of the global scourge of plastic pollution.

“The UK does not have an adequate system to capture, recycle and re-use plastic materials,” the report said.

It recommends three new measures to ensure more plastic is recovered in the UK and used as raw material in manufacturing. These are:

Mandatory recycled content requirements for all plastic products and packaging;

Short-term support to kickstart the plastic reprocessing market; and

A fund to stabilise the market for companies investing in recycling plastic domestically.

Green Alliance produced the report for a group of businesses that have formed a circular economy taskforce.

Peter Maddox, director of Wrap UK, said the UK had to take more responsibility for its own waste.

“Our mission is to create a world where resources are used sustainably. To make this happen in the UK, we need to design circular systems for plastics and other materials that are sustainable both economically and environmentally. This will require some fundamental changes from all of us.”

The report said government action is necessary to create and support a secondary plastic market in the UK. “The government is uniquely placed to address the market failures that have led to unnecessary reliance on virgin materials to the detriment of the environment, industry and the economy.”

UK businesses including supermarkets recently signed up to a pact to cut plastic.

But voluntary pacts were not enough, the report said, and government action was needed.

“A secondary plastic market ... could recycle an additional 2m tonnes in the UK and fulfil 71% of UK manufacturing’s raw material demand ... Voluntary initiatives like the UK plastic pact ... only thrive when supported by a credible prospect of government regulation if industry does not deliver.”

First Solar to build new manufacturing plant in northwest Ohio

By Sarah Elms | BLADE STAFF WRITER, Published on April 26, 2018 | Updated 2:18 p.m.

First Solar Inc. plans to build a new solar panel manufacturing facility near its North American factory complex in Perrysburg Township, a move that’s expected to create 500 jobs.

The Tempe, Ariz.-based company in an announcement Thursday said the expansion plan calls for a $400 million, 1 million-square-foot facility in Lake Township, just east of Perrysburg Township in Wood County. Construction is expected to begin mid-year and the site should be able to reach full production in late 2019, a news release said.

The company began in Toledo and has its flagship site in Perrysburg Township, the largest solar panel manufacturing facility in the U.S. First Solar also has manufacturing facilities in Malaysia and Vietnam.

“First Solar started in the Toledo area and our first manufacturing facility was in Perrysburg,” spokesman Steve Krum said. “We’re really excited to be able to bring U.S. manufacturing growth to northwest Ohio.”

The new facility will produce the company’s relatively new Series 6 solar panel, a thin-film panel roughly three times the size of the previous model, the Series 4. Last year the company invested $175 million into retooling its existing Perrysburg Township plant to produce the Series 6 product.

“The continued, sustained demand validates that we can make and sell more, so that’s what’s driving this decision to build an additional factory that will also make the Series 6,” Mr. Krum said.

Once the new plant is complete, First Solar will produce enough panels at its two sites to generate 1.8 gigawatts of power annually. That’s three times the 600 megawatts of annual power its panels produce now.

The company laid off 350 workers when it slowed production last year to convert to building the new Series 6, but Mr. Krum said there soon will be a need for those jobs again, and then some.

Glenn Richardson, JobsOhio managing director for advanced manufacturing, called the expansion “great news for the area.”

“While it’s always difficult to see companies have to reduce their force, it’s great to see that they were able to retool so quickly and produce components that the market is now looking at as very much in demand,” he said. “I think the 500 jobs is just a start.”

Mr. Krum said the new jobs will be a combination of professional engineers and manufacturing technicians, with an average annual salary of $60,000.

The Lake Township plant is contingent on confirmation of state and local incentive packages still under negotiation, Mr. Krum said.

First Solar’s panels are for commercial, industrial, and utility-scale use, and are used by large utility solar farms, on public buildings, and at off-grid industrial sites.

The company does not make panels for personal consumer use or on rooftops of private homes.

Nationally, solar energy is expected to more than double over the next five years, according to the Solar Energy Industries Association.

Although the 10.6 gigawatts of solar-generated capacity in 2017 was down from the record-setting 15 gigawatts in 2016, it still marked a 40 percent increase over 2015’s installation total and was the second consecutive year for double-digit increases. As of now, the industry has installed 53.3 gigawatts, enough to power 10.1 million American homes, the trade group said recently.

Abigail Ross Hopper, SEIA president and chief executive officer, said she was especially encouraged by the “increasing geographic diversity in states deploying solar, from the Southeast to the Midwest, that led to a double-digit increase in total capacity.”

Ohio ranked 29th for new solar installations in 2017, according to SEIA’s latest report.

Recent applications for new construction filed with the Ohio Power Siting Board show solar is coming on strong in this state, with supporters pointing to Gov. John Kasich’s decision in late 2016 not to extend the controversial two-year ban on renewable energy mandates he signed into law in 2014.

Bowling Green currently has the state’s largest solar farm, with 85,000 solar panels erected across 165 acres.

But the state power board has approved plans by Hardin Solar Energy LLC to build one in Hardin County, to be known as the Hardin Solar Center, that will be seven times larger.

It also has approved plans by Blue Planet Renewable Energy LLC for a project east of Cincinnati called Hillcrest Solar I that is to be six times larger than the Bowling Green project. The board is also reviewing an application for a large project that would be in Vinton County, southeast of Columbus.

Staff writer Tom Henry contributed to this report.

New York City Aims for All-Electric Bus Fleet by 2040

NYC has more than 5,700 MTA buses. Taking the fleet electric would reduce climate-warming emissions and cut fuel, maintenance and health costs.

Phil McKenna, APR 26, 2018

New York City plans to take its buses electric. Credit: Chris Hondros/Getty Images

New York City is testing 10 electric buses this year and plans to go all electric by 2040. How quickly it reaches that goal depends in part on technology and EV charging infrastructure.

New York City plans to convert its public bus system to an all-electric fleet by 2040, a new target announced this week by NYC Transit President Andy Byford.

"It does depend on the maturity of the technology—both the bus technology and the charging technology—but we are deadly serious about moving to an all-electric fleet," Byford, who became head of NYC Transit in January, said at a Metropolitan Transit Authority board meeting on Wednesday.

Byford's comments follow an ambitious action plan released on Monday that seeks to address flagging ridership and sluggish service on the nation's largest municipal bus network. The average speed of an MTA bus in Manhattan is among the slowest of large metropolitan systems at 5.7 miles per hour. That means pollution from idling engines is much higher per mile than if the buses were going faster.

The plan calls for a "transition to a zero-emissions fleet to improve air quality and reduce greenhouse gas emissions."

Environmental and community advocates applauded the plan.

"It's a surprising development and a big deal big because this is the largest transit fleet in the country, with over 5,000 buses—that is the equivalent to over 100,000 electric cars," Kenny Bruno, a clean energy consultant, said. "It's a big deal on climate change and public health. All New Yorkers will benefit, not just drivers and passengers but everyone who lives along bus routes and depots, a lot of whom have high asthma rates."

A report released earlier this month by New York City Environmental Justice Alliance found 75 percent of bus depots in New York City are located in communities of color. It noted that fossil-fuel-powered buses emit air pollution linked to respiratory distress, asthma and hospitalization for people of all ages.

"These communities have been overburdened by noxious emissions for too long," Eddie Bautista, executive director of the New York City Environmental Justice Alliance, said in a statement. The announcement by the MTA "signals to us that the Authority has heard our call for a clean bus fleet. We are pleased to receive MTA's commitment to zero emissions and applaud their efforts."

A study in 2016 by a researcher at Columbia University found that if New York shifted from diesel to electric buses, it could reduce health costs from respiratory and other illnesses by roughly $150,000 per bus. The study also showed that fuel and maintenance costs would drop by $39,000 per year by shifting to electric vehicles, and the city could cut carbon dioxide emissions across the fleet by 575,000 metric tons per year.
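
To put those per-bus figures in rough context, a back-of-envelope calculation (not part of the Columbia study) can scale them to the MTA's roughly 5,700-bus fleet described below. The short Python sketch that follows is purely illustrative and assumes the per-bus savings apply uniformly across the fleet, which the study does not claim.

```python
# Illustrative back-of-envelope extrapolation of the article's figures;
# the uniform per-bus scaling is an assumption, not a study result.
FLEET_SIZE = 5_700                        # approximate number of MTA buses
HEALTH_SAVINGS_PER_BUS = 150_000          # rough health-cost reduction per bus (USD)
FUEL_MAINT_SAVINGS_PER_BUS_YR = 39_000    # fuel + maintenance savings per bus per year (USD)
CO2_REDUCTION_FLEET_TONS_YR = 575_000     # fleet-wide CO2 cut cited by the study (metric tons/yr)

health_savings_fleet = FLEET_SIZE * HEALTH_SAVINGS_PER_BUS
fuel_maint_savings_fleet_yr = FLEET_SIZE * FUEL_MAINT_SAVINGS_PER_BUS_YR

print(f"Estimated fleet-wide health savings: ${health_savings_fleet / 1e9:.2f} billion")
print(f"Estimated fleet-wide fuel/maintenance savings: ${fuel_maint_savings_fleet_yr / 1e6:.0f} million per year")
print(f"Fleet-wide CO2 reduction cited: {CO2_REDUCTION_FLEET_TONS_YR:,} metric tons per year")
```

On those assumptions, the fuel and maintenance savings alone would come to more than $200 million a year across the fleet.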

The MTA, which has more than 5,700 buses in its fleet, already is testing 10 all-electric buses and has plans to purchase 60 more by 2019. With these purchases representing only 1 percent of the entire fleet, the agency would have to significantly increase its electric bus purchases to meet its 2040 target.

Los Angeles is also shifting to electric buses. The city's public transportation agency agreed last year to spend $138 million to purchase 95 electric buses, taking it closer to its goal of having a zero-emissions fleet, comprising some 2,300 buses, by 2030.

Details about the planned conversion to electric vehicles and how the New York agency will pay for the new buses and charging stations were not included in this week's report. The MTA will release a full modernization plan for New York City transit in May, Byford said.

Hawaii upends utility model, adds incentives for solar

Anne C. Mulkern, E&E News reporter Energywire: Thursday, April 26, 2018

Hawaii is overhauling how utilities get paid, upending a century-old business model and ordering incentives for affordability, renewable power and helping homeowners add rooftop solar.

Gov. David Ige (D) on Tuesday signed legislation that directs the state's Public Utilities Commission to create a framework for rewarding utilities based on performance. Criteria for rewards or penalties include "rapid integration of renewable energy sources, including quality interconnection of customer-sited resources."

Utilities also face evaluation on electricity reliability, on timely execution of adding power purchased through contracts and on customer satisfaction — including on their options for managing electricity costs. The measure, S.B. 2939 S.D. 2, states that the change is needed to "break the direct link between allowed revenues and investment levels," such as investments in power plants.

It's the latest move as the state pushes toward clean energy. Hawaii in 2015 passed a mandate to hit 100 percent renewable electricity by 2045, becoming the first state in the country to approve that goal. The Aloha State at the time had no blueprint for how to make it happen. Much remains in the planning stage, though leaders argue it's achievable.

The law appears to be the first under which utilities statewide could earn rewards tied to renewable energy projects located in communities, known as distributed energy generation. In other states there are rules affecting some utilities. Rhode Island allows a financial incentive for utility National Grid PLC, based on the value of its long-term renewable energy contracts. National Grid also earns a percentage of payments made to distributed generation facilities. In New York, National Grid was allowed an earnings mechanism tied to the total energy from distributed energy resources.

Hawaii largely burns fuel oil to generate electricity, making consumer bills costly and volatile. That has triggered a rooftop solar boom in the state, one so large that the utility on Oahu — the most populous island — severely limited new connections for months. The state later revised its rooftop solar benefits by eliminating net energy metering, where residents earn bill credit for excess power sent to the grid.

S.B. 2939 S.D. 2 argues that a realignment of utility customer and company interests is needed to prevent "economic and environmental harm" by the electrical system. The rewrite is also critical "to ensure the ongoing viability of the State's regulated electric utilities, as they face increasing need to rapidly adapt business models" that enable new technologies and customer choice, it says.

Hawaiian Electric Co. runs electrical power in the state through subsidiaries on Oahu, Maui and the Big Island. The parent company said it welcomes performance-based evaluations.

"We're getting a lot done, including tripling the amount of clean, renewable energy on our grids," Shannon Tangonan, director of Hawaiian Electric's corporate communications, said in a statement. "We're already well below the state's 2020 target for greenhouse gas emissions. The companies have been advocating this new performance-based approach for 20 years."

Hawaiian Electric is at 27 percent renewables across its five-island territory. It's on track to reach 30 percent as required in 2020, it said.

Oil use is dropping, the company said. It used 8.55 million barrels last year, down from 10.7 million seven years ago. Across the five islands, Hawaiian Electric said, one-third of single-family home residents have rooftop solar.

The company said it has added few new power plants that it owns in recent years, and those were at the request of the military. Hawaiian Electric will own and operate an 80,760-panel solar facility under construction at Pearl Harbor on Oahu. The Navy sought the project. Additionally, a 50-megawatt biomass plant is on land the Army leased to Hawaiian Electric.

A report issued last week from the Rhodium Group and Smart Growth America, commissioned by Elemental Excelerator, a nonprofit that funds energy and other startups, said Hawaii could reach more than 80 percent clean energy in 2030. One move that would help, it said, is changing the utility model so that the companies get paid for accelerating the renewables transition (Energywire, April 20).

S.B. 2939 S.D. 2 will give consumers "improved electric services with more options for innovative renewables and batteries," said Hawaii state Rep. Chris Lee (D), chairman of the Legislature's Committee on Energy and Environmental Protection. It's also "a responsible step forward helping our utilities transition to a sustainable business model that can survive disruption in the energy market," he said.

Anne Hoskins, chief policy officer at Sunrun Inc., said other states should see it as an example.

"The time to make these changes is now, before billions of dollars are spent in rebuilding our outdated electrical networks," Hoskins said. Rooftop solar and home batteries allow choosing a system "that maximizes public benefits, not utility shareholder profits."

The Bill:

https://www.capitol.hawaii.gov/session2018/bills/SB2939_SD2_.htm

Forced to Move: Climate Change Already Displacing U.S. Communities

Benjamin Goulet for KCET

April 26, 2018

Extreme drought pushes rural inhabitants of a Middle Eastern nation into the cities, leading to social stressors and a devastating civil war. Decades of drought conditions and war send African migrants into neighboring countries by the tens of thousands. Residents of a small Alaskan village decide to vote in favor of relocating their entire population, as their living area crumbles around them due to storm surges.

These scenarios aren’t cribbed from a dystopian novel or the plot of an apocalyptic big-budget movie; they’re real life. And they’re happening – now.

The role of climate change in human displacement and migration is being cited by experts as the number one global security threat of the 21st century. According to the United Nations Refugee Agency, annually 21.5 million people are, in their words, “forcibly displaced by weather-related sudden onset hazards – such as floods, storms, wildfires, [and] extreme temperatures.” These threats are not just concerning climate scientists and environmental activists; the U.S. State Department and the Global Military Advisory Council have both stated that global instability and uprisings, such as the Arab Spring, the war in Syria, and Boko Haram’s terrorism, all have links to climate change.

And while the global displacement and migration of lower and middle-income countries make headlines, the U.S. is not immune to these situations. In 2016, two U.S. communities — both very different in terms of geography and demographics — made history for relocating their respective towns because climate change greatly altered their land.

The Alaskan village of Shishmaref is an Inupiat community of about 600 people and is located on an island called Sarichef (north of the Bering Strait). The island has been shrinking for over 40 years, as storm surges have become more and more powerful. In a relocation study published in February of 2016, experts calculated that warmer sea water is eroding the permafrost and undercutting the protective embankments, which then topple into the ocean. In short, the island is shrinking.

The report states that “the Community has continued to shrink over the past 40 years with erosion eating away over 200 feet since 1969. A few storms have caused 25-30 foot erosion losses from a single storm.” After much debate, in August of 2016 the town voted to relocate completely, and decided to move the village about five miles away from its current location.

More than 6,000 miles away, the community of Isle de Jean Charles, a narrow piece of land located in Terrebonne Parish, Louisiana, is in the process of completely relocating. With a $48.3 million grant, the U.S. Department of Housing and Urban Development has begun planning the logistics of resettling the residents as a whole community. The official website for the project states:

In the face of rising sea levels, subsiding land and frequent flood events, the south Louisiana residents of Isle de Jean Charles need to move to higher, safer ground. With the loss of more than 98 percent of the community’s land during the past 60 years, only 320 of the island’s original 22,400 acres remain. Although the island has been both a home and a historically significant landmark for nearly 200 years, community resettlement is inevitable. The only question is how.

Because the island is the historical homeland and burial ground of the Isle de Jean Charles Band of Biloxi-Chitimacha-Choctaw Indians, HUD officials explicitly recognize the sensitive nature of the move, asserting that the resettlement must, in their words, “protect the island’s American Indian culture and support future generations.” In late 2017, a 515-acre farm 40 miles north of the island was chosen as the site for relocation of the residents. The new location is considered to be more convenient, as it is closer to schools, work and shopping. The island currently has no services and its one bridge is frequently impassable due to flooding, a major problem since most residents commute off the island for work.

Many other states are grappling with how to respond to stronger storms, decaying infrastructure and population shifts. In the wake of 2012’s Hurricane Sandy, a powerful storm whose strength is still debated as a possible result of climate-related changes in the atmosphere, insurance companies rewrote home insurance plans for homeowners living in coastal areas. The Federal Emergency Management Agency’s National Flood Insurance Program, created by Congress in 1968, is now – no pun intended – drowning in debt of $25 billion, as payouts to homeowners have skyrocketed in the face of more powerful hurricanes.

As sea levels rise, will residents eventually walk away from their million-dollar homes for higher land? The state of Florida is especially vulnerable to these possibilities. In a 2017 study by the independent organization Climate Central, nine out of the top 10 cities most vulnerable to coastal flooding by 2050 were located in Florida (New York was number one). There are reports that some Floridians aren’t waiting for the floods to arrive and are actively selling their homes.

Even as the 2016 presidential race gained steam and the term “fake news” became candidate Donald Trump’s slur du jour against climate change, the U.S. military was already sounding the alarm. In July of 2015, the Department of Defense published a report titled “National Security Implications of Climate-Related Risks and a Changing Climate.” The report, compiled at the request of Congress, did not hold back in its findings, unambiguously listing a cascading series of events, such as drought, flooding, melting sea ice and global displacement, as major national and global security threats. The agency states that these threats and their unpredictability will precipitate a greater response from the DoD in future crises around the globe. The report cites the conflict in Syria as just one example of this looming scenario:

[Climate change] could result in increased intra-and inter-state migration, and generate other negative effects on human security. For example, from 2006-2011, a severe multi-year drought affected Syria and contributed to massive agriculture failures and population displacements. Large movements of rural dwellers to city centers coincided with the presence of large numbers of Iraqi refugees in Syrian cities, effectively overwhelming institutional capacity to respond constructively to the changing service demands. These kinds of impacts in regions around the world could necessitate greater DoD involvement in the provision of humanitarian assistance and other aid.

The DoD sees climate change as a “threat multiplier,” exacerbating already existing issues, such as economic inequalities, social vulnerability, drought, disease and more, leading countries to suffer “systemic breakdowns.” These breakdowns, according to the DoD, will trigger population displacement, creating a dangerous cycle of instability around the globe.

In 2007, the United Nations published a report that cited climate change as a major contributor to conflict in Darfur. According to the report, rainfall in the area dropped 30 percent over the last 40 years, and extreme drought conditions inflamed tensions between farmers and herders as pasture land disappeared.

Beginning in 2003, the war in Darfur resulted in the deaths of over 200,000 and created a migration crisis, as Sudanese migrants spilled over into neighboring Chad, attempting to escape widespread ethnic cleansing. Although a peace accord was signed in 2005, Darfur is still cited as one of the first modern examples of climate change triggering global conflict across regions. In fact, rural tensions continue to threaten a new war between North and South Sudan as the drought continues.

Barrier islands such as this one off the Louisiana coast are one of the first lines of defense against dangerous storms. | Nicky Milne/Thomson Reuters Foundation

Looking forward, as many as 200 million people could be displaced by 2050 because of climate change, according to a 2007 report. What will the displaced be called? “Climate refugees”? “Climate migrants”? This phenomenon is still so new and disorienting, there isn’t an official designation for people who are displaced by climate change in their communities. According to the United Nations Refugee Agency, “many of those who are displaced across borders as a result of climate change may not meet the refugee definition.”

But climate displacement is happening and help is needed, regardless of the lack of an official designation. As the planet warms and displacements accelerate, an official term is likely to emerge.

"EU Agrees Total Ban On Bee-Harming Pesticides"

"The world’s most widely used insecticides will be banned from all fields within six months, to protect both wild and honeybees that are vital to crop pollination"

"The European Union will ban the world’s most widely used insecticides from all fields due to the serious danger they pose to bees.

The ban on neonicotinoids, approved by member nations on Friday, is expected to come into force by the end of 2018 and will mean they can only be used in closed greenhouses.

Bees and other insects are vital for global food production as they pollinate three-quarters of all crops. The plummeting numbers of pollinators in recent years have been blamed, in part, on the widespread use of pesticides. The EU banned the use of neonicotinoids on flowering crops that attract bees, such as oil seed rape, in 2013."

"Jury Awards Hog Farm Neighbors $50 Million"

"RALEIGH - A North Carolina jury awarded $50 million to neighbors of a 15,000-hog farm in Eastern North Carolina in a case being closely watched across the country by environmentalists and the hog farm industry.

The verdict, revealed late Thursday after a jury deliberated less than two days, is the first to come in a series of federal lawsuits filed against Murphy-Brown/Smithfield Foods, the world’s largest pork producer.

In this case decided in a federal courtroom in Raleigh, 10 neighbors contended that industrial-scale hog operations have known for decades that the open-air sewage pits on their properties were the source of noxious, sickening and overwhelming odors. The stench was so thick, the neighbors argued, that it was impossible to get it out of their clothes."

Minnesota team looks for synergy between solar and electric vehicles

WRITTEN BY

Frank Jossi

April 27, 2018

A $150,000 federal grant will pay for study of solar-powered charging stations in Minnesota.

A Minneapolis nonprofit has received a federal grant to study potential synergies between distributed solar and electric vehicle charging stations.

And it doesn’t have to look farther than its own rooftop for an example.

The Great Plains Institute is the lead organization on a $150,000 grant from the National Renewable Energy Laboratory’s Solar Energy Innovation Network.

With a growing expectation that EVs will someday represent a significant share of the car market, the government and utilities are using this study and others to learn how to manage growing electricity loads by making the most of the increasing amount of renewable energy on the grid.

“We’re looking to create a roadmap for how we would deploy a solar-plus-EV technology in the marketplace,” said Brian Ross, senior program director for the Great Plains Institute.

A 30-kilowatt solar array on the institute’s rooftop feeds power to three electric chargers in its parking lot. A minimal charge is always available but when the sun is shining the chargers ramp up to a Level 2 power setting. If a station is not in use, power is transferred to the other chargers.

Ross said their own solar charging station pilot is an example of what could be used for park-and-ride lots, business and government fleets, homes, apartments and even retail establishments.

Another nearby example comes from Lake Region Electric Cooperative, which offers members a solar program called “Go West,” in which solar panels are tilted southwest at 240 degrees to maximize production for summer peak demand. The utility uses a Go West solar installation next to its headquarters to reduce peak demand, but during off-peak daytime hours it uses the solar energy to charge a Chevy Bolt the cooperative purchased.

Other partners on the research grant include the Minnesota Department of Commerce and ZEF Energy, which operates the largest independently owned and operated fast charging network in Minnesota and Wisconsin. The company is building out a network of Level 2 chargers to link transportation corridors and reduce range anxiety among EV drivers.

CEO and founder Matthew Blackler said ZEF has developed technology allowing utilities to adjust or program Level 2 chargers based on electricity demand or time of day. A utility could fully charge vehicles when solar is plentiful, for example, and then limit charging during periods of peak demand for electricity.

“You could turn (charging) down to 25 percent or switch it off and on every 15 minutes” during peak demand hours, Blackler said.

The idea is to synchronize solar availability with EV charging and peak demand to improve grid stability, he added.
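
As a rough illustration of the kind of solar-following control Blackler describes, the Python sketch below shows one way a charger setpoint could track solar output and back off during peak demand. The power levels, names and thresholds (other than the 25 percent setback quoted above) are assumptions made for illustration, not ZEF Energy's actual product logic.

```python
# Illustrative sketch of solar-following EV charger control; constants are
# assumed values, and only the 25 percent curtailment comes from the article.

LEVEL2_MAX_KW = 7.2   # typical Level 2 charger output (assumed)
TRICKLE_KW = 1.4      # minimal charge kept available at all times (assumed)

def charger_setpoint_kw(solar_output_kw: float, peak_demand: bool) -> float:
    """Return a charging power setpoint based on solar output and grid peak status."""
    if peak_demand:
        # During peak demand hours, curtail to 25 percent of full power,
        # as in the utility example quoted above.
        return 0.25 * LEVEL2_MAX_KW
    if solar_output_kw >= LEVEL2_MAX_KW:
        # Plenty of sun: run the charger at full Level 2 power.
        return LEVEL2_MAX_KW
    # Otherwise track available solar, but never drop below the trickle rate.
    return max(TRICKLE_KW, solar_output_kw)

# Example: mid-morning sun producing 5 kW, off-peak -> charger follows solar.
print(charger_setpoint_kw(5.0, peak_demand=False))   # 5.0
# Example: evening peak with no solar -> curtailed to 1.8 kW.
print(charger_setpoint_kw(0.0, peak_demand=True))    # 1.8
```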

The group is one of nine teams chosen across the country. Others are looking at solar and grid resiliency, impacts on rate design, storage, and distribution generation in rural areas.

AS NUCLEAR POWER LOSES GROUND, ENVIRONMENTALISTS ARE TORN: SAVE NUCLEAR FOR CLIMATE’S SAKE?

REID FRAZIER, APRIL 27, 2018, for the Allegheny Front

The natural gas boom was supposed to help the electric sector lower its carbon footprint by replacing old, carbon dioxide-spewing coal plants with newer, lower-emitting natural gas plants. And to a degree, that’s happened. As coal plants have retired, carbon emissions from the power sector have decreased.

But the flood of cheap natural gas is also threatening the nation’s biggest source of carbon-free electricity: nuclear power.

In Pennsylvania and Ohio, four nuclear plants have said they are shutting down, years ahead of their scheduled retirement dates, unless they receive state aid. Those include Three Mile Island, near Harrisburg, and Beaver Valley, in Shippingport, Pa., as well as the Davis-Besse and Perry plants in Ohio. If those plants close, that’s bad news for climate hawks.

“If your concern is about climate change and you view it as an urgent and perhaps even existential threat to human society, then you should be very concerned about the closure of nuclear power plants,” said Jesse Jenkins, a Ph.D. candidate at MIT studying the electric grid.

That’s because the power currently generated by soon-to-close nuclear plants will likely be replaced mainly by natural gas, which produces about half as much carbon dioxide as coal.

If nuclear plants close, emissions likely to rise

Together, the four plants provide enough electricity to power 4 million homes. A recent industry-funded report found the plants provide more electricity than all the wind and solar in the PJM Interconnection, the electric grid that serves 65 million people in the Mid-Atlantic region. The report estimates that replacing the four plants with coal and natural gas would produce the same amount of carbon pollution as putting 4.5 million more cars on the road.

“Each one of those nuclear power plants is a significant source of low-carbon, emissions-free electricity, and losing each one is a significant step back at a time we should be making rapid progress towards a zero carbon energy system,” Jenkins said.

This Brattle Group chart shows that four nuclear power plants in Ohio and Pennsylvania produce more carbon-free electricity than all renewable sources in PJM Interconnection, the mid-Atlantic electric grid. The consultancy used funding from the nuclear industry to analyze planned closures.

But keeping these plants online might be difficult in an electricity market flooded by cheap natural gas. Jenkins’ research found that low natural gas prices were the main reason for the struggles of the nuclear industry.

To stay in business, the owners of nuclear plants are turning to states and the federal government for help. FirstEnergy has asked the federal government to declare a grid emergency to keep its coal and nuclear plants running.

New York and Illinois are subsidizing struggling nuclear plants through ‘zero emissions credits’ — subsidies that pay plants based on the amount of carbon pollution they save.

Matt Wald, a spokesman with the Nuclear Energy Institute, an industry trade group, says the subsidies are necessary.

“We are seeking to have the market recognize the things that we provide to the system,” Wald said. “It doesn’t do that now.”

Environmental groups conflicted on nuclear power

Nuclear’s precarious situation creates a dilemma for environmental groups, which have long opposed nuclear power.

“If they’re replaced by plants that burn fracked gas, then that’s definitely a problem,” said Tom Schuster, a senior campaign representative for the Sierra Club in Pennsylvania.

But, Schuster said, nuclear energy carries its own environmental problems: radioactive waste, impacts from uranium mining, and the risks of a Fukushima-like disaster. Also, he says, money spent keeping nuclear plants open could be spent building out the renewables sector.

“If policies to bail nuclear plants out are structured such that we’re spending a lot of money to keep plants online that we could otherwise be spending on truly clean sources of energy, then that’s problematic.”

But nuclear is still the country’s top source of carbon-free electricity. So the Sierra Club supported a nuclear bailout in Illinois that also promoted renewables, Schuster says.

“I kind of think about it as thinking about nuclear plant policy as a phase out rather than a bailout,” Schuster said.

In New Jersey, the state will spend $300 million a year to keep three nuclear plants online, while it also raised its renewable energy targets to 50 percent by the year 2030, and took separate actions to boost solar power and energy storage.

States take lead on nuclear bailouts

Schuster says the best way to lower carbon emissions is to simply tax them. That would make coal and natural gas more expensive, he said, which would increase the use of renewables and nuclear energy. But a federal tax on carbon emissions is unlikely. So most of the activity to keep nuclear plants online is in the states.

Any action in Pennsylvania to bail out its nuclear plants would face stiff opposition. Pennsylvania’s large natural gas industry opposes any such idea.

Stephanie Catarino Wissman, executive director of the Associated Petroleum Industries of Pennsylvania, said in an email that her organization “opposes any effort to subsidize one form of energy over another because when the government picks winners and losers in the electricity markets, consumers pay the price.”

The state legislature has a bipartisan nuclear caucus, which is looking for policy solutions to keep the nuclear plants operating. But to date, the caucus has yet to introduce any bills to keep the state's nuclear plants running.

U.S.-U.K. science armada to target vulnerable Antarctic ice sheet

By Paul Voosen, Apr. 30, 2018, 6:00 AM

An armada of 100 scientists will soon be descending on West Antarctica, and understanding the future of global sea levels might depend on what they find. Today, after several years of planning, the U.S. and U.K. science agencies announced the details of a joint $50 million (or more) plan to study the Thwaites glacier, the Antarctic ice sheet most at risk of near-term melting.

The International Thwaites Glacier Collaboration plans to deploy six teams to the remote ice sheet, where they will study it using a host of tools, including instrument-carrying seals and earth-sensing seismographs. The researchers will concentrate their work in the Antarctic summers of 2019–20 and 2020–21. An additional two teams will channel the findings of the field teams into global models.

Overall, the collaboration is the largest joint effort between the two nations in Antarctica since the 1940s. “We’ll see what until now has been inferred playing out right in front of our sensors,” says Ted Scambos, a glaciologist at the National Snow and Ice Data Center in Boulder, Colorado, who is serving as the lead U.S. scientific coordinator.

Over the past decade, thanks to a variety of satellite and aircraft observations and modeling insights—including signs that the glacier’s ice has started thinning and flowing faster toward the ocean—scientists have been paying special attention to Thwaites. It is, they believe, the Antarctic ice sheet most at risk of accelerated melting in the next century, making it the wild card in projections of sea level rise. But its remote location, 1600 kilometers from the nearest research station, has made it inaccessible to scientists seeking to understand these risks up close.

Thwaites, a 182,000-square-kilometer glacier in the Amundsen Sea, acts as a plug, blocking the rest of the West Antarctic Ice Sheet from flowing into the ocean. Melt from the glacier already accounts for 4% of modern sea level rise, an amount that has doubled since the 1990s. Scientists are concerned that if it retreats, it could become unstable, making the collapse of the ice sheet irreversible and ultimately increasing sea levels by 3.3 meters over the span of centuries or millennia.

“It could contribute to sea level in our lifetimes in a large way, in a scale of a meter of sea level rise,” says Sridhar Anandakrishnan, a glacial seismologist at Pennsylvania State University in State College who is co-leading one Thwaites project. “Which is just an unthinkable possibility.”

Over the past decade, the Thwaites glacier has risen to the forefront of scientists' Antarctic melt concerns. JEREMY HARBECK

Overall, the U.S. National Science Foundation and the United Kingdom’s Natural Environment Research Council will spend $25 million on the science, with each of the eight teams led by researchers from both countries. Funders expect to spend another $25 million or more on the logistics of moving so much heavy equipment toward the shelf. Several ships will work off the coastline while scientists will be based at the former drilling site of the West Antarctic Ice Sheet Divide ice core, flying from there to their research sites in small airplanes or helicopters.

The scientific teams will focus on what puts Thwaites particularly at risk. Researchers have noticed that shifts in winds seem to be pushing warm, deep ocean waters on to Antarctica’s continental shelf at the base of its glaciers. Thwaites is perched on a ridge that holds these waters back, but beyond that ridge, the land under the glacier slopes downward, creating an inland bowl that is below sea level. But the researchers are uncertain about the composition and slipperiness of that geologic bowl.

One study, co-led by Anandakrishnan, will seek to understand the actual composition of the bowl and a second ridge, 70 kilometers inland, on which the glacier might catch. “In some models [the melting glacier] stabilizes” on that ridge, Anandakrishnan says. “In some it doesn’t.” By detonating explosives on the surface of the ice sheet and using seismic sensors to measure their reflections, his team will tease out whether the rock underlying Thwaites, including this critical ridge, is soft and pliable or hard and crystalline.

Another project will target the warm intruding waters. “We plan to have a pincer movement,” says Karen Heywood, an oceanographer at the University of East Anglia in Norwich, U.K., who will be co-leading the team. One effort will involve drilling sensors through the ice and then driving several autonomous submersibles and gliders toward these stations. Another, starting this year, will involve outfitting 10 seals a year with scientific instruments, using the animals to make routine, repeated studies of the ocean, a technique that should produce a torrent of sustained data. They’ll be running similar measurements on the nearby Dotson Ice Shelf, seeking to understand whether differences in the ocean waters explain why Thwaites is retreating so rapidly compared with Dotson.

Scientists also expect to explore the glacier’s grounding zone, planning to drill through 800 meters of ice to observe over several seasons where the triangle of ice, ocean, and rock meet. That will include dropping a new autonomous vehicle, developed by the Georgia Institute of Technology in Atlanta, that can deploy through a borehole and explore the grounding zone—an unprecedented view.

Other projects will seek geological evidence of whether Thwaites has previously retreated and reformed in the past 10,000 years, giving a clue to whether the modern melting threat is truly unprecedented. And another team will examine the glacier’s connections to the broader ice sheet—its shear margins—using radar and seismic reflections to detect whether neighboring ice helps hold it in place or lets it go, like a game of high-stakes red rover.

For now, the researchers are eager to get started. The United States and United Kingdom announced the opportunity almost a year and a half ago and it took some time to come together. This project is expected to launch a new generation of Antarctic researchers and, in the process, might reduce some uncertainty about the future of climate change. “No doubt we’re going to learn something that’s important to refining those predictions,” Scambos says. “This is kind of the missing piece right now.”

"US Judge Scraps Oakland, California, Ban On Coal Shipments"

"SAN FRANCISCO — A federal judge in California on Tuesday struck down the city of Oakland’s ban on coal shipments at a proposed cargo terminal, siding with a developer who wants to use the site to transport Utah coal to Asia.

In a scathing ruling, U.S. District Judge Vince Chhabria in San Francisco said the information the city relied on to conclude that coal operations would pose a substantial health or safety danger to the public was “riddled with inaccuracies” and “faulty analyses, to the point that no reliable conclusion about health or safety dangers could be drawn from it.”

The decision cheered coal proponents while opponents said they would continue to fight for cleaner air. Oakland is reviewing its options and may appeal, said Justin Berton, a spokesman for Mayor Libby Schaaf.

Enbridge likely to wrap protective sleeves around damaged Line 5 portions

Updated May 14; Posted May 14

By Michael Kransz, mkransz@mlive.com

LANSING, MI -- Protective sleeves will likely be wrapped around dented portions of the Line 5 oil and gas pipeline, which was damaged last month by a suspected anchor strike.

A timeline for the fix is not currently available, as Enbridge Energy still has to confirm plans with regulatory officials, according to company spokesperson Ryan Duffy.

A section of the Line 5 pipeline crosses the Straits of Mackinac in Lake Michigan.

The Line 5 damage occurred April 1 when, according to a lawsuit filed by Michigan Attorney General Bill Schuette, a boat owned by an Escanaba-based shipping company dragged its anchor through the Straits, causing the damage.

Although visible "impacts" to the company's twin pipelines crossing the Straits of Mackinac haven't compromised line integrity, they did warrant a precautionary reduction in maximum operating pressure, Duffy said.

The line's eastern leg sustained a dent of a little more than three-fourths of an inch and an abrasion to the outer coating, according to Peter Holran, Enbridge director of U.S. government affairs.

The two dents on the western leg were a little less than three-quarters of an inch and under a half-inch, he said. On both legs, the impacts were spaced about 24 inches apart.

The pipes are 20 inches in diameter and have a wall thickness of 0.8 inches, Holran said.

An underwater photograph of Enbridge Line 5's eastern leg shows what the company calls "apparent contact areas," which are circled, believed to be damage resulting from an April 1 incident in the Straits of Mackinac.

Once the protective sleeve is implemented, the maximum operating pressure precaution will be lifted, Duffy said.

The suspected strike also damaged American Transmission Company's high-voltage power cables running through the straits. An estimated 600 gallons of dielectric fluid leaked into the water as a result.

Jerome Popiel, a representative from the U.S. Coast Guard, categorized the spill as "minor" with negligible environmental effects.

"It wasn't a big deal in terms in what actually happened," Popiel said. "What could've happened is getting everyone's attention."

Broken cables capped as Straits of Mackinac spill response continues

The broken ends from which an estimated 600 gallons of toxic fluid leaked were capped last week, 25 days after the spill began.

Popiel declined to give further information about the incident, citing the ongoing investigation by the U.S. Coast Guard.

The Line 5 pipeline, built in 1953, runs 645 miles from Superior, Wisconsin, to Sarnia, Canada, and transports up to 540,000 barrels of light crude oil and natural gas liquids per day.

The April 1 incident and the possibility of a major spill renewed calls for the aging pipeline to be shut down.

Study puts $6.3 billion price tag on potential Mackinac Straits oil spill

Prepared for advocacy group For Love of Water (FLOW), the study includes natural resource damages and various economic impacts.

There are no restrictions to dropping or dragging anchor in the Straits of Mackinac, only an advisory.

"Mariners should use extreme caution when operating vessels in depths of water comparable to their draft in areas where pipelines and cables may exist and when anchoring, dragging or trawling," the National Oceanic and Atmospheric Administration states.

At the Michigan Pipeline Safety Advisory Board's May 14 meeting, the first since the Straits incident, the possibility of a "no-anchor zone" to protect from further, and possibly worse, anchor-strike incidents was briefly mentioned.

"I think that's definitely being looked at," said Scott Dean, a spokesperson for the Michigan Department of Environmental Quality. "I think everyone is very focused on never seeing this happen again."

A problem with any anchor-drop ban, Dean said, is there must be stipulations that ensure protection of human life in cases where, for example, a ship is headed for collision with the Mackinac Bridge.

Dean cited awareness and warning campaigns as additional options.

Holran declined to say whether he's in favor of an anchor-drop ban in the Straits.

He told the board that Enbridge is working with the state and looking into safeguards which ensure vessels aren't acting "recklessly" or "negligently."

Mike Shriberg, Great Lakes executive director for the National Wildlife Federation and a member of Michigan's Pipeline Safety Advisory Board, said he supports an anchor-drop ban in the Straits, with the only exceptions being emergencies of life or death.

"We know that this is the No. 1 risk factor," Shriberg said. "We don't know how many times anchors have been dropped in the Straits."

Toxic Algae Blooms Happen More Often, May Involve Climate Feedback Loop

"The blooms, primarily fed by farm runoff but exacerbated by warming, release methane and CO2. Lake Erie is a 'poster child' for the challenge. "

"Blooms of harmful algae in the nation's waters appear to be occurring much more frequently than in the past, increasing suspicions that the warming climate may be exacerbating the problem.

The Environmental Working Group (EWG) published newly collected data on Tuesday reporting nearly 300 large blooms since 2010. Last year alone, 169 were reported. While NOAA issues forecasts for harmful algal blooms in certain areas, the advocacy group called its report the first attempt to track the blooms on a nationwide scale.

The study comes as scientists have predicted proliferation of these blooms as the climate changes, and amid increasing attention by the news media and local politicians to the worst cases.

Just as troubling, these blooms could not only worsen with climate change, but also contribute significantly to greenhouse gas emissions.

Electric School Buses Can Be Backup Batteries For the US Power Grid

American school buses guzzle $3.2B of carcinogen-laden diesel a year, but they could do a lot of good if electrified.

Tracey Lindeman

May 15 2018, 10:30am

We often think of electricity as a one-way transaction. Need to toast a bagel, wash the sheets, or charge your phone? Your fuse box sends you the juice you need. Electric vehicles, though, have the capacity to send power back to the electrical grid using vehicle-to-grid (V2G) technology—and that’s good news for an aging grid already operating at full capacity.

Vehicle-to-grid does this by letting electric vehicle (EV) batteries switch between providing and consuming energy on an as-needed basis. As EV adoption rates steadily climb, this technology could help stabilize the electrical grid, lessen the need for new power plants, and reduce kids’ exposure to cancer-causing exhaust. Essentially, electric vehicles can be like a backup battery for your phone, but for the entire US power grid.

“You’re looking for a vehicle that’s plugged in a decent amount of hours each day,” Marc Trahand, chief operating officer of San Diego-based V2G startup Nuvve, told me on the phone. The bigger the battery, the bigger the potential to help the electrical grid and the people using it.

Enter school buses. There are approximately 500,000 school buses in the US alone, and the majority of them are basically rolling cancer machines due to their diesel engines. In 2012, the World Health Organization said diesel exhaust can definitely cause lung cancer, and might also be associated with an increased risk of bladder cancer. Meanwhile, according to the American School Bus Council, the US’s school buses consume a combined $3.2 billion worth of diesel a year.

Lance Noel, a postdoctoral V2G researcher at the Center for Energy Technologies at Aarhus University in Denmark, said school buses are not only ripe for electrification, but also V2G technology because they’re only in use for a few hours at a time. “It’s a giant battery sitting in a parking lot for at least 18 hours a day,” he told me on the phone.

Using a bidirectional charger—that is, one that is able to charge and discharge power on command dynamically using smart software—these batteries can supplement the grid in peak-demand times, and act as storage when demand is low.
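
As a rough illustration of the dispatch logic a bidirectional charger's control software might follow, here is a minimal sketch; the function name, thresholds, and power ratings are assumptions for the example, not any vendor's actual system.

```python
# Minimal sketch of vehicle-to-grid (V2G) dispatch for a parked school bus.
# All names and thresholds are illustrative assumptions, not a real vendor API.

def dispatch(grid_demand_mw: float, peak_threshold_mw: float,
             state_of_charge: float, route_reserve: float = 0.6,
             max_rate_kw: float = 60.0) -> float:
    """Return power in kW: positive = discharge to the grid, negative = charge."""
    if grid_demand_mw > peak_threshold_mw and state_of_charge > route_reserve:
        # Peak demand, and the battery holds more than the reserve needed for
        # the next route: sell energy back to the grid.
        return max_rate_kw
    if grid_demand_mw <= peak_threshold_mw and state_of_charge < 1.0:
        # Off-peak: top the battery back up.
        return -max_rate_kw
    return 0.0  # Otherwise sit idle.

# Example: evening peak (demand above threshold), bus at 80% charge -> discharge.
print(dispatch(grid_demand_mw=35_000, peak_threshold_mw=30_000, state_of_charge=0.8))
```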

The technology was created by a professor and co-founder of Nuvve, Willett Kempton, who has been working on V2G at the University of Delaware since 1996. Today Nuvve, which wants to be the intermediary between EV owners and grids, is working with automakers such as Nissan and Mitsubishi to integrate bidirectionality into batteries and chargers.

The Japanese automakers’ interest was partly piqued by the 2011 Fukushima nuclear disaster that resulted in blackouts, according to Trahand. “They realized, ‘We have all these electric cars with quite a big capacity,’” he said, adding that a Nissan Leaf could probably light up 10 houses for an evening, or a single house for far longer.

At its core, V2G advocates for the democratization of energy markets by allowing small-time players to use their vehicles to provide electricity—either to the grid, to their neighbors, or to their own homes—and potentially earn a little bit of revenue while doing so. According to PJM Interconnection, a grid operator and wholesale electricity market in the eastern half of the US, V2G tests done with electric BMW Minis earned each car user about $100 a month. Electric-vehicle owners in the US who drive 15,000 miles a year and charge exclusively at home can expect to pay $540/year in electricity costs, according to Plug In America.

Single EVs can’t participate in this market on their own because they don’t produce enough power; they’d have to be aggregated with other vehicles to participate in the market. This is why fleets are especially attractive for V2G technology. School buses are a particularly interesting fleet to work with, not only because of their V2G capability and revenue-generating potential, but also because of their primary clientele: kids.

In the US, 25 million children collectively ride 62 billion miles a year in school buses. Many of these use diesel engines, which again have been shown to cause cancer. Kids of color and children from low-income households in particular are more likely to take transit or the school bus. Research published in the American Journal of Respiratory and Critical Care Medicine showed that diesel pollution is far worse inside of bus cabins than in the air around the vehicle. That same research shows that cleaner fuel policies positively impact children’s health.

While a grid that still largely depends on coal isn’t an ideal power source for any electric vehicle, V2G’s storage capabilities could actually help promote renewable energy like solar and wind—the output of which fluctuates depending on time of day and weather conditions—by offering unique energy storage solutions that are lacking on the existing grid.

Given how skittish some people still are about electric cars, it might take some cajoling to convince school boards and PTAs they’ve got enough range and power to safely deliver their kids to school and back.

In North America, at least a couple of major school-bus manufacturers—Blue Bird and Lion—are working on proving the benefits of electrification and vehicle-to-grid technology. Electric school buses are a particularly good candidate for V2G, said Noel and Trahand, because their range far exceeds most route distances. The lower fuel and maintenance costs make up for their higher sticker price, they added.

Noted Trahand, “It’s going to be cheaper to go electric.”

Illinois to sue EPA for exempting Foxconn plant from pollution controls

Valerie Volcovici

WASHINGTON (Reuters) - Illinois’ Attorney General said on Friday she plans to sue the U.S. Environmental Protection Agency for allowing a proposed Foxconn Technology Co Ltd plant in neighboring Wisconsin to operate without stringent pollution controls.

On Tuesday, the EPA identified 51 areas in 22 states that do not meet federal air quality requirements for ozone, a step toward enforcing the standards issued in 2015.

Among the exempted areas was Racine County, Wisconsin, just north of the Illinois border and known for heavily polluted air, where Taiwan-based Foxconn is building a $10 billion liquid-crystal display plant.

Pollution monitoring data show the county’s ozone levels exceed the 70 parts per billion (ppb) limit. If Racine County had been designated a “non-attainment” area, it would have required Foxconn to install stringent pollution control equipment.

Attorney General Lisa Madigan said she would file a lawsuit in the District of Columbia Circuit Court of Appeals challenging the EPA’s ozone designations, saying its failure to name Racine County a “non-attainment” area puts people at risk.

“Despite its name, the Environmental Protection Agency now operates with total disregard for the quality of our air and water, and in this case, the U.S. EPA is putting a company’s profit ahead of our natural resources and the public’s health,” Madigan said in a statement.

The EPA, under Administrator Scott Pruitt, left Racine County off its non-attainment list despite an agency staff analysis of ozone levels in Wisconsin published in December, which found that the county’s air exceeded federal ozone limits.

Wisconsin Governor Scott Walker, who supports bringing Foxconn to Wisconsin, tweeted on Tuesday that the state would work with EPA “to implement a plan that continues to look out for the best interest of Wisconsin.”

Wisconsin’s Republican-controlled state Assembly last year voted to approve a bill that paves the way for a $3 billion incentives package for the plant proposed by Foxconn.

Tourism's carbon impact three times larger than estimated

By Matt McGrath, Environment correspondent

Travellers from affluent countries are a key part of emissions growth in tourism

A new study says global tourism accounts for 8% of carbon emissions, around three times greater than previous estimates.

The new assessment is bigger because it includes emissions from travel, plus the full life-cycle of carbon in tourists' food, hotels and shopping.

Driving the increase are visitors from affluent countries who travel to other wealthy destinations.

The US tops the rankings followed by China, Germany and India.

Tourism is a huge and booming global industry worth over $7 trillion, and employs one in ten workers around the world. It's growing at around 4% per annum.

Previous estimates of the impact of all this travel on carbon suggested that tourism accounted for 2.5-3% of emissions.

However, in what is claimed to be the most comprehensive assessment to date, this new study examines the global carbon flows between 160 countries between 2009 and 2013. It shows that the total is closer to 8% of the global figure.

As well as air travel, the authors say they have included an analysis of the energy needed to support the tourism system, including all the food, beverage, infrastructure construction and maintenance as well as the retail services that tourists enjoy.

"It definitely is eye opening," Dr Arunima Malik from the University of Sydney, who's the lead author of the study, told BBC News.

"We looked at really detailed information about tourism expenditure, including consumables such as food from eating out and souvenirs. We looked at the trade between different countries and also at greenhouse gas emissions data to come up with a comprehensive figure for the global carbon footprint for tourism."

The researchers also looked at the impacts in both the countries where tourists came from and where they travelled. They found that the most important element was relatively well-off people from affluent countries travelling to other well-to-do destinations.

In the leading countries, US, China, Germany and India, much of the travel was domestic.

Travellers from Canada, Switzerland, the Netherlands and Denmark exert a much higher carbon footprint elsewhere than in their own countries.

Small island states like the Maldives are hugely dependent on long distance tourism

When richer people travel they tend to spend more on higher-carbon transportation, food and pursuits, says Dr Malik.

"If you have visitors from high income countries then they typically spend heavily on air travel, on shopping and hospitality where they go to. But if the travellers are from low income countries then they spend more on public transport and unprocessed food, the spending patterns are different for the different economies they come from."

When measuring per capita emissions, small island destinations such as the Maldives, Cyprus and the Seychelles emerge as the leading lights. In these countries tourism is responsible for up to 80% of their annual emissions.

"The small island states are in a difficult position because we like travelling to these locations and those small island states very much rely on tourist income but they are also at the same time vulnerable to the effects of rising seas and climate change," said Dr Malik.

Demand for international tourism is also being seen in emerging countries like Brazil, India, China and Mexico, highlighting a fundamental problem - wealth.

The report underlines the fact that when people earn more than $40,000 per annum, their carbon footprint from tourism increases 13% for every 10% rise in income. The consumption of tourism does "not appear to satiate as incomes grow," the report says.
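
To make that relationship concrete, here is a back-of-the-envelope sketch of how a constant income elasticity of roughly 1.3 (13% more footprint for every 10% more income) would scale a traveller's tourism footprint; the function and the example numbers are illustrative, not figures from the study itself.

```python
# Back-of-the-envelope illustration of the reported income elasticity of
# tourism emissions: a 10% rise in income -> roughly a 13% rise in footprint.

def footprint_multiplier(income_growth: float, elasticity: float = 1.3) -> float:
    """Scale factor for a tourism carbon footprint after income grows by
    `income_growth` (e.g. 0.21 for +21%), assuming a constant elasticity."""
    return (1.0 + income_growth) ** elasticity

# Two consecutive 10% income rises (about +21% overall) would, on this crude
# model, increase the tourism footprint by roughly 28%.
print(round(footprint_multiplier(0.21), 2))  # ~1.28
```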

The World Travel and Tourism Council (WTTC) has welcomed the research but doesn't accept that the industry's efforts to cut carbon have been a flop.

As countries get wealthier their citizens' appetite for global travel rapidly increases

"It would be unfair to say that the industry is not doing anything," said Rochelle Turner, director of research at WTTC.

"We've seen a growing number of hotels, airports and tour operators that have all become carbon neutral so there is a momentum."

Experts say that offsetting, where tourists spend money on planting trees to mitigate their carbon footprint will have to increase, despite reservations about its effectiveness.

Awareness is also the key. The WTTC say that the recent water crisis in Cape Town has also helped people recognise that changes in climate can impact resources like water.

"There is a real need for people to recognise what their impact is in a destination," said Rochelle Turner, "and how much water, waste and energy you should be using compared to the local population."

"All of this will empower tourists to make better decisions and only through those better decisions that we'll be able to tackle the issue of climate change."

Full Report:

http://www.nature.com/articles/s41558-018-0141-x.epdf?referrer_access_token=emVYaUKHfxMWOg4S_k4CU9RgN0jAjWel9jnR3ZoTv0OWchRe3LnakvyDKR-guL4qsoWG671RwjXXASZXjuYXVuiSQc7egCtBzuj1yos1I25z6K4rbm0_vmdaALM6M8TrWUOs2rI9wyEjSB3tDPDx4yY-hZm_J7JwUk1k4EBa3DWKaFyhQCYYZRM5wINmX104QVc7pAwDiWk_XfdtSUk6PU_ENs_pjWL0qFU_g1daH5Y%3D&tracking_referrer=www.bbc.com

DOE’s maverick climate model is about to get its first test

By Gabriel Popkin, May 3, 2018, 1:20 PM

The world's growing collection of climate models has a high-profile new entry. Last week, after nearly 4 years of work, the U.S. Department of Energy (DOE) released computer code and initial results from an ambitious effort to simulate the Earth system. The new model is tailored to run on future supercomputers and designed to forecast not just how climate will change, but also how those changes might stress energy infrastructure.

Results from an upcoming comparison of global models may show how well the new entrant works. But so far it is getting a mixed reception, with some questioning the need for another model and others saying the $80 million effort has yet to improve predictions of the future climate. Even the project's chief scientist, Ruby Leung of the Pacific Northwest National Laboratory (PNNL) in Richland, Washington, acknowledges that the model is not yet a leader. "We really don't expect that our model will be wowing the world," she says.

Since the 1960s, climate modelers have used computers to build virtual globes. They break the atmosphere and ocean into thousands of boxes and assign weather conditions to each one. The toy worlds then evolve through simulated centuries, following the laws of physics. Historically, DOE's major role in climate modeling was contributing to the Community Earth System Model (CESM), an effort based at the National Center for Atmospheric Research (NCAR) in Boulder, Colorado. But in July 2014, DOE launched its Accelerated Climate Model for Energy. The goal was to predict how storms and rising seas could affect power plants, dams, and other energy infrastructure, and to focus on regions such as North America or the Arctic. DOE officials also wanted a model that could run on a generation of megapowerful "exascale" computers expected to turn on around 2021.

The project pulled in researchers from eight DOE national labs. It began as a carbon copy of the CESM and retains similar atmosphere and land models, but includes new ocean, sea-ice, river, and soil biochemistry simulations. The DOE team doubled the number of vertical layers, extended the atmosphere higher, and adopted a number-crunching method that is computationally intensive but may be easier to break into chunks and run in parallel on the anticipated exascale machines. "For them, it makes a lot of sense to go in that direction," says Richard Neale, a climate scientist at NCAR.

In 2017, after President Donald Trump took office and pulled the nation out of the Paris climate accords, DOE dropped "climate" from the project name. The new name, the Energy Exascale Earth System Model (E3SM), better reflects the model's focus on the entire Earth system, says project leader David Bader of Lawrence Livermore National Laboratory in California.

The E3SM's first results highlight its potential; they include model runs with ultrasharp, 25-kilometer-wide grid cells—fine enough to simulate small-scale features such as ocean eddies and mountain snow packs. But this sharp picture is still too coarse to resolve individual clouds and atmospheric convection, major factors limiting models' precision. And some scientists doubt it will improve forecasts. The last intercomparison effort, which ended in 2014, included 26 modeling groups—nine more than the previous round—yet yielded collective predictions that were no more precise. "Just having more models—I don't think there's any evidence that that's key to advancing the field," says Bjorn Stevens, a climate scientist at the Max Planck Institute for Meteorology in Hamburg, Germany, and co-leader of the new intercomparison, code-named CMIP6.

Gavin Schmidt, who heads NASA's Goddard Institute for Space Studies in New York City, which also produces a global climate model, questions the new model's rationale, given that DOE's exascale computers do not yet exist. "No one knows what these machines will even look like, so it's hard to build models for them ahead of time," he wrote on Twitter. And the computational intensity of the E3SM has drawbacks, says Hansi Singh, a PNNL climate scientist who uses the CESM for her research. The sheer number of calculations needed to get a result with the E3SM would overwhelm most university clusters, limiting outside scientists' ability to use it.

One preliminary result, on the climate's sensitivity to carbon dioxide (CO2), will "raise some eyebrows," Bader says. Most models estimate that, for a doubling of CO2 above preindustrial levels, average global temperatures will rise between 1.5°C and 4.5°C. The E3SM predicts a strikingly high rise of 5.2°C, which Leung suspects is due to the way the model handles aerosols and clouds. And like many models, the E3SM produces two bands of rainfall in the tropics, rather than the one seen in nature near the equator.
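
For context, the sensitivity figures quoted here describe equilibrium warming per doubling of CO2, which is conventionally related to concentration through a logarithm. The sketch below illustrates that standard textbook relation only; it is not code from the E3SM, and the 280 ppm preindustrial baseline is an assumption for the example.

```python
import math

# Illustrative use of the standard logarithmic relation between CO2 and
# equilibrium warming: warming = S * ln(C/C0) / ln(2), with S the sensitivity.
def warming(sensitivity_c: float, co2_ppm: float, preindustrial_ppm: float = 280.0) -> float:
    return sensitivity_c * math.log(co2_ppm / preindustrial_ppm) / math.log(2.0)

# A doubling of CO2 (280 -> 560 ppm) yields exactly the sensitivity value:
print(warming(5.2, 560.0))  # E3SM's reported sensitivity: 5.2 C of eventual warming
print(warming(3.0, 560.0))  # a mid-range model: 3.0 C
```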

The first test of the E3SM will be its performance in CMIP6. Nearly three dozen modeling groups, including newcomers from South Korea, India, Brazil, and South Africa, are expected to submit results to the intercomparison between now and 2020. Each group will devote thousands of computer-hours to standard scenarios, such as simulating the impact of a 1% per year CO2 increase and an abrupt quadrupling of it.

But given the plodding rate of improvement since previous intercomparisons, few are expecting the E3SM or any other model to yield revolutionary insights. Stevens hopes to revise the exercise to encourage innovations, such as modeling the climate at the 1-kilometer resolution needed to make out individual clouds, or campaigns to gather new kinds of data. "The whole premise of CMIP is trying to get everyone to do the same thing," he says. "Everyone knows that breakthroughs come from getting people to do different things."

Port Harcourt: Anger, anxiety as soot takes over skyline, environs

By Ann Godwin (Port Harcourt)

06 May 2018 | 4:35 am

The ranking of Port Harcourt as the most polluted city in the world has not just heightened panic among residents but has forced some wealthy residents, and those who have alternative homes, to begin relocating from the city.

On April 29, 2018, AirVisual ranked Port Harcourt as the most polluted city in the world with an air quality index of 188, followed by Beijing, China at 182 and Delhi, India at 181, among others. Continued monitoring has since shown Port Harcourt's index rising from 188 to 200 and above.

In fact, all parts of Port Harcourt, including neighbouring Local Government Areas such as Eleme, Oyigbo, Ikwerre and Obio/Akpor and the four Local Councils in Ogoni, have been covered in dark clouds since the deposition of black soot began. Preliminary test samples showed that the soot was caused by “incomplete combustion of hydrocarbons, as well as asphalt processing and illegal artisanal refining operations”.

City air quality index data is analysed every day by AirVisual, a global air quality monitor founded in 2015 “on the belief that data enables action and that without air quality data, the hazards of air pollution will remain invisible and unbeatable”.

The soot is not like rain or sun, which fall only on people outside a shelter; it filters through locked doors and closed windows into rooms, cars and offices, even underground spaces. In fact, there is no safe place for anyone residing in Port Harcourt, as its effect is indiscriminate.

Current findings show that people in the city now fall sick more often. Cough, catarrh and sore throat are now common. Most hospital beds are occupied by sick children, women, men and the elderly.

Undoubtedly, the black soot portends great danger to people's lives, as it has been associated with asthma, heart disease, bronchitis and other respiratory illnesses; its toxicity can also cause cancer, which may lead to premature death.

Panicked residents who could no longer keep calm took to the streets three weeks ago to protest the unending black soot in the one-time Garden City, which might now be likened to a dark city.

The hundreds of protesters, including pregnant women, children, youths and the elderly, as well as labour leaders and civil society organisations, marched from Isaac Boro Park as early as 8am to Government House, Port Harcourt, and the State House of Assembly, demanding clean air to breathe.

The leader of the Stop-the-Soot campaign, Tunde Bello, said in a 14-point protest letter that it was high time the government prioritised the environment, adding that millions of Rivers residents risk cancer, among other diseases, if the soot is not urgently stopped.

He also stressed the need for the state government to conduct an environmental audit of all the oil-bearing communities and the operations going on in such areas.

Part of the letter reads, “There should be a street to street health campaign, and awareness on the health issue by local governments and various stakeholders.”

He added: “We want Governor Wike to lead a bi-partisan campaign on behalf of the state and its citizens, requesting that the state assembly and national assembly should change the rules governing the issue of environment and instigate the part of the Petroleum Industry Bill on environment.”

Meanwhile, the Chief organiser of the protest, Eugene Abels, said the campaign would be sustained until something concrete is done by the Federal and State government to stop it.

He disclosed that the group would kick off a public health campaign next week to enlighten and educate the populace about how to stay safe, adding that on June 5 this year another massive protest would be staged against the soot.

Abels said the soot is injurious to health and kills gradually. He maintained that serious steps need to be taken to draw the needed attention, to avoid it going the way of the United Nations Environment Programme (UNEP) report on Ogoniland.

However, the Deputy Governor, Dr. Ipalibo Banigo, received the letter from the protesters and promised to pass the message to the State Governor, Nyesom Wike.

Meanwhile, Governor Wike, while inaugurating the Neighbourhood Safety Corps Agency at Government House, Port Harcourt, urged the protesters to channel their protest to the Federal Government, stating that the government at the centre was responsible for the degradation and pollution of the Rivers environment.

Wike said the Rivers government has done all that it is supposed to do, including advocacy and enlightenment campaigns, with no action from the Federal Government.

He said: “Help us demonstrate against the Federal government because we have no control over the sources of the soot, do I go to shut down refinery, will they not say, it is economic sabotage?

“Do I even have the security, do i control the police and the army to go and stop production at the refinery, not even the navy.”

However, during the Governor’s meeting with a United Nations delegation on the soot, he tabled the concerns of the state government and the people before the international community.

While addressing the UN delegation, he called on the United Nations to prevail on the Federal Government to act on the soot.

It is, however, devastating that despite the warnings by medical experts on the effects of the soot menace, nothing serious has been done to bring a permanent solution. The federal agency responsible for the environment, NESDRA, has remained silent about the soot.

Though the State Ministry of Environment, working with the task force set up by the state government, has shut down an asphalt processing plant operating in the city, sealed off a Chinese company for breaching environmental laws and seized abandoned tyres, thick smoke still fills the air daily. And instead of uniting to find a solution, the All Progressives Congress (APC) and the Peoples Democratic Party (PDP) have kept politicking with the health of the people.

The state government has taken no further steps to address the challenge; rather, it is blaming the Federal Government, while the state chapter of the APC exonerates the Federal Government and instead blames Governor Wike for causing the soot.

When The Guardian visited an Ogoni community last week in Khana Local Government Area of the State, the rain water was completely dark.

No one can wash with, drink or bathe in such water, and since their water and land are polluted and no alternative supply has yet been provided, residents have resorted to buying water from vendors at high prices.

An environmental expert, Akuro Gobo, Professor of Applied Meteorology and Environmental Management at Rivers State University, Port Harcourt, has advised the Federal Government to work in synergy with the state government to find a solution to the soot menace.

He further stated that a state of emergency should be declared in the state's environment sector so that the necessary funds would be released to tackle the soot problem.

Gobo said: “There is need to work in synergy, the federal and state governments, the National Assembly and State Assembly, the Federal Ministry of Environment and State Ministry of Environment, the Federal Emergency Management Agency and the State Emergency Management Agency, to work together to enable them get some review of laws that can bring permanent solution. There should be a better approach to it and encouragement in some studies.”

Similarly, the Senator representing Rivers South East Senatorial District, Magnus Abe, said in an interview with The Guardian that it was time to declare a national health emergency in Rivers State.

He urged all to put politics aside and come together to find a lasting solution to the soot, saying that should be the real problem that bothers everyone in Rivers State.

On his part, the chairman of the NUPENG and PENGASSAN Joint National Committee on the PIB, Chika Onuegbu, who also joined the soot protest, said the more devastating issue is that there is no health facility in Rivers State, or any other part of Nigeria, to manage the medical emergency that the soot has already unleashed.

He said: “Look my friends, the Soot is like rain. It does not select who will breathe it. Whether you are in APC, PDP, Labour party etc does not matter here, let’s join hands and fight it.

“Just because we are in the gestation period does not mean that all is well. That is the way cancer operates. So for me I don’t care what people say. I will continue to sensitize people and put pressure on all concerned to #STOPTHESOOT”.

Also, the National Coordinator of Ken Saro-Wiwa Associates, Chief Gani Tobpa, urged government at all levels to work together to achieve a permanent solution to the soot menace.

Is the revolving door syndrome harming Europe's climate change fight?

By Chris Harris

last updated: 06/05/2018

Campaigners have sounded the alarm over climate change policy in Europe after uncovering evidence about the closeness of the relationship between governments and major energy firms.

A new report claims to have found 88 cases of senior politicians or staff moving from the public sector to the energy industry or vice versa.

Greens-European Free Alliance, the political grouping in the European Parliament that commissioned the research, says it opens the door to fossil fuel-friendly firms having an influence on Europe’s climate change policy.

It’s part of a wider ‘revolving doors’ issue in Brussels that sees former European Commissioners and other high-ranking staff move into the private sector to work for firms lobbying to shape EU legislation.

The report calls for safeguards to be put in place to help prevent conflicts of interest arising because climate change is "the greatest and most urgent challenges of our time".

“It is difficult to measure the effect the revolving door has on climate policy,” reads the report’s executive summary.

“We highlight numerous cases of revolving doors between fossil fuel companies and the public sector but more research is needed before a definitive link with its effect on climate policy can be made.

“Nevertheless, the cases documented highlight the major potential for conflicts of interest, and when one takes into account what is at stake for large fossil fuel companies, and how much lobbying they conduct on climate policy more generally, weak revolving door policy provides another avenue of influence for private fossil fuel interests to exploit.”

The report looks at the revolving door issue in 13 countries across Europe.

It highlighted scores of cases, among them Charles Hendry, the UK’s former minister for climate change, who since leaving office has netted three jobs with oil firms.

Full Report:

https://www.greens-efa.eu/files/doc/docs/3d2ec57d6d6aa101bab92f4396c12198.pdf

Scientists Find New Climate ‘Feedback Loop’ in Lakes

May 4, 2018

Reeds in a lake near Sudbury, Ontario, in the Canadian Boreal Shield. Andrew Tanentzap

Warming global temperatures have helped boost the growth of freshwater plants like cattails in the world’s lakes in recent decades. Now, scientists have found that this surge in aquatic plant growth could double the methane being emitted from lakes — already significant sources of methane — over the next 50 years.

The research, published in the journal Nature Communications, reveals a previously unknown climate feedback loop, where warming triggers the release of greenhouse gases, which in turn triggers more warming — similar to what is happening with the Arctic’s melting permafrost. Freshwater lakes currently contribute as much as 16 percent of the world’s methane emissions, compared with just 1 percent from oceans.

Lakes produce methane when plant debris is buried in sediment and consumed by microbes. The scientists studied differences in methane production from biomass that originated in lakeside forests and from dead aquatic plants growing in the water. They found that forest-derived biomass helped to actually trap carbon in the lake sediment, reducing methane emissions. But aquatic plant biomass actually fueled methane production. Lake sediment full of decaying cattails produced over 400 times the amount of methane as sediment with plant debris from coniferous trees, and almost 2,800 times the methane from deciduous tree-filled sediment.

“The organic matter that runs into lakes from the forest trees acts as a latch that suppresses the production of methane within lake sediment,” Erik Emilson, an ecologist at Natural Resources Canada and lead author of the study, said in a statement. “Forests have long surrounded the millions of lakes in the northern hemisphere, but [they] are now under threat. At the same time, changing climates are providing favorable conditions for the growth and spread of aquatic plants such as cattails, and the organic matter from these plants promotes the release of even more methane from the freshwater ecosystems of the global north.”

Using models of the Boreal Shield, a lake-filled ecosystem that stretches across central and eastern Canada, Emilson and his colleagues calculated that the number of lakes colonized by just the common cattail (Typha latifolia) could double in the next 50 years — resulting in a 73 percent increase in lake-produced methane in that part of the world alone.

Full Report:

https://www.nature.com/articles/s41467-018-04236-2

Avoiding meat and dairy is ‘single biggest way’ to reduce your impact on Earth

Biggest analysis to date reveals huge footprint of livestock - it provides just 18% of calories but takes up 83% of farmland

Damian Carrington Environment editor

Thu 31 May 2018 14.00 EDT Last modified on Fri 1 Jun 2018 08.20 EDT

Avoiding meat and dairy products is the single biggest way to reduce your environmental impact on the planet, according to the scientists behind the most comprehensive analysis to date of the damage farming does to the planet.

The new research shows that without meat and dairy consumption, global farmland use could be reduced by more than 75% – an area equivalent to the US, China, European Union and Australia combined – and still feed the world. Loss of wild areas to agriculture is the leading cause of the current mass extinction of wildlife.

The new analysis shows that while meat and dairy provide just 18% of calories and 37% of protein, it uses the vast majority – 83% – of farmland and produces 60% of agriculture’s greenhouse gas emissions. Other recent research shows 86% of all land mammals are now livestock or humans. The scientists also found that even the very lowest impact meat and dairy products still cause much more environmental harm than the least sustainable vegetable and cereal growing.

The study, published in the journal Science, created a huge dataset based on almost 40,000 farms in 119 countries and covering 40 food products that represent 90% of all that is eaten. It assessed the full impact of these foods, from farm to fork, on land use, climate change emissions, freshwater use and water pollution (eutrophication) and air pollution (acidification).

“A vegan diet is probably the single biggest way to reduce your impact on planet Earth, not just greenhouse gases, but global acidification, eutrophication, land use and water use,” said Joseph Poore, at the University of Oxford, UK, who led the research. “It is far bigger than cutting down on your flights or buying an electric car,” he said, as these only cut greenhouse gas emissions.

Humans just 0.01% of all life but have destroyed 83% of wild mammals – study

“Agriculture is a sector that spans all the multitude of environmental problems,” he said. “Really it is animal products that are responsible for so much of this. Avoiding consumption of animal products delivers far better environmental benefits than trying to purchase sustainable meat and dairy.”

The analysis also revealed a huge variability between different ways of producing the same food. For example, beef cattle raised on deforested land result in 12 times more greenhouse gases and use 50 times more land than those grazing rich natural pasture. But the comparison of beef with plant protein such as peas is stark, with even the lowest impact beef responsible for six times more greenhouse gases and 36 times more land.

The large variability in environmental impact from different farms does present an opportunity for reducing the harm, Poore said, without needing the global population to become vegan. If the most harmful half of meat and dairy production was replaced by plant-based food, this still delivers about two-thirds of the benefits of getting rid of all meat and dairy production.

Cutting the environmental impact of farming is not easy, Poore warned: “There are over 570m farms all of which need slightly different ways to reduce their impact. It is an [environmental] challenge like no other sector of the economy.” But he said at least $500bn is spent every year on agricultural subsidies, and probably much more: “There is a lot of money there to do something really good with.”

Labels that reveal the impact of products would be a good start, so consumers could choose the least damaging options, he said, but subsidies for sustainable and healthy foods and taxes on meat and dairy will probably also be necessary.

One surprise from the work was the large impact of freshwater fish farming, which provides two-thirds of such fish in Asia and 96% in Europe, and was thought to be relatively environmentally friendly. “You get all these fish depositing excreta and unconsumed feed down to the bottom of the pond, where there is barely any oxygen, making it the perfect environment for methane production,” a potent greenhouse gas, Poore said.

The research also found grass-fed beef, thought to be relatively low impact, was still responsible for much higher impacts than plant-based food. “Converting grass into [meat] is like converting coal to energy. It comes with an immense cost in emissions,” Poore said.

The new research has received strong praise from other food experts. Prof Gidon Eshel, at Bard College, US, said: “I was awestruck. It is really important, sound, ambitious, revealing and beautifully done.”

He said previous work on quantifying farming’s impacts, including his own, had taken a top-down approach using national level data, but the new work used a bottom-up approach, with farm-by-farm data. “It is very reassuring to see they yield essentially the same results. But the new work has very many important details that are profoundly revealing.”

Giving up beef will reduce carbon footprint more than cars, says expert

Prof Tim Benton, at the University of Leeds, UK, said: “This is an immensely useful study. It brings together a huge amount of data and that makes its conclusions much more robust. The way we produce food, consume and waste food is unsustainable from a planetary perspective. Given the global obesity crisis, changing diets – eating less livestock produce and more vegetables and fruit – has the potential to make both us and the planet healthier.”

Dr Peter Alexander, at the University of Edinburgh, UK, was also impressed but noted: “There may be environmental benefits, eg for biodiversity, from sustainably managed grazing and increasing animal product consumption may improve nutrition for some of the poorest globally. My personal opinion is we should interpret these results not as the need to become vegan overnight, but rather to moderate our [meat] consumption.”

Poore said: “The reason I started this project was to understand if there were sustainable animal producers out there. But I have stopped consuming animal products over the last four years of this project. These impacts are not necessary to sustain our current way of life. The question is how much can we reduce them and the answer is a lot.”

‘Sunny day flooding’ worsens at NC beaches — a sign sea rise is decades too soon, studies say

By Abbie Bennett

May 03, 2018 10:31 PM

RALEIGH

Living in cities threatened by sea-level rise could be like living near an active volcano, according to NOAA oceanographer William Sweet.

Some parts of the Earth are seeing sea levels rise far beyond average, and it's just a waiting game before some areas are inundated with sea water, studies show.

The East Coast of the U.S. is experiencing "sunny day flooding" that scientists didn't expect for decades yet.

Sea levels are rising at a rate of about an inch per year (5 inches from 2011-15) in some areas along the East Coast, from North Carolina to Florida, according to one study — that's faster than researchers expected.

Residents of coastal communities most often feel the effects of sea level rise during tidal flooding.

Tidal flooding, also known as "sunny day flooding" is the temporary inundation of low-lying areas, such as roads, during high-tide events — especially during "king tides," the highest tides of the year.

King tides aren't caused by sea level rise in and of themselves, but because they are the annual peak tides, they demonstrate how sea level has already risen over the past 100 years.

Sea levels aren't rising equally "like water in a bathtub," according to a report from Yale Environment 360. "The oceans are more akin to a rubber kiddie pool where the water sloshes around unevenly, often considerably higher on one side than another."

More flooding, higher costs

Climate scientists view sea level rise as one of the most obvious signals of a warming planet. Sea water expands as it warms, and melting land-based ice sheets add to rising water levels.

There are neighborhoods that now flood on sunny days, but didn’t years ago even during especially high tides, according to the National Oceanic and Atmospheric Administration.

And as sea levels continue to rise, the frequency, depth and extent of coastal flooding will continue to worsen, according to NOAA.

In 2016, Charleston saw 50 days of tidal flooding.

Fifty years ago? Just four days.

Flooding projections are set at about 25 percent above average for 2017-18 for areas including Wilmington, according to a recent NOAA report.

Wilmington had 84 days of high-tide flooding in 2016, according to NOAA.

"It is important for planning purposes that U.S. coastal cities become better informed about the extent that high-tide flooding is increasing and will likely increase in the coming decades," according to the February 2018 NOAA report.

Sea levels rise and waters inundate storm drains and wash over flood barricades — flooding buildings and streets.

While flooding impacts might be limited or not obvious in those areas right now, stormwater systems are reported to be degraded, "increasing the risk of compound flooding during heavy rains," according to NOAA.

And coastal cities should be particularly concerned that the cost of dealing with an increase of many smaller floods will be greater than major, but much rarer, flood events, NOAA said.

A 2017 report "When Rising Seas Hit Home: Hard Choices Ahead for Hundreds of U.S. Coastal Communities" analyzed three projected scenarios of when towns and cities along U.S. coasts can expect to see the ocean rise enough to disrupt daily life.

That report found that as many as 20 North Carolina communities could be submerged by sea water in the next 15 years.

The report was created by the Union of Concerned Scientists, a U.S. nonprofit science advocacy group founded in 1969 by faculty and students of the Massachusetts Institute of Technology.

20 NC coastal communities could be flooded by sea water in 15 years, new report says

The Union of Concerned Scientists report predicts that by 2035, 13 communities clustered mostly on the mainland side of Pamlico Sound will be “chronically inundated.” The study defines that as the point at which 10 percent of a community’s usable land floods at least 26 times a year.

By 2060, that number rises to 25 communities. By 2100, it says, people in 49 communities may be forced to adapt to rising water or move out.

Familiar vacation spots on North Carolina’s Outer Banks would suffer, the report said. Nags Head would have 11 percent of its land area, and Hatteras 14 percent, chronically inundated by 2045 under the highest sea level rise scenario. By 2060, it predicts, flooded areas would grow to 19 percent of the land at Nags Head and 28 percent at Hatteras.

Projections of global sea level rise range widely, from as little as 2 feet to more than 6 feet by 2100.

The Intergovernmental Panel on Climate Change estimated in a 2013 report that sea level will rise between 10.2 inches and nearly 39 inches by 2100, depending in part on future greenhouse gas emission scenarios and the effect of greenhouse gas concentrations on global temperatures and thermal expansion.

Sea-level rise scenarios have prompted opposition from some economic development interests on North Carolina’s coast that say long-range forecasts could be wrong.

When a state science panel reported in 2010 that seas on the coastline could rise by as much as 39 inches over the next century, legislators passed a law forbidding communities from using the report to make new rules.

A new report in 2015 looked only 30 years to the future and forecast a rise of 2 to 10 inches, depending on location.

Satellites and tide gauges have been used to report sea level at regular intervals in North Carolina.

That data shows that sea level has been rising steadily along the North Carolina coast over the 30 or more years since those gauges were installed, according to the N.C. Department of Environmental Quality.

Tide gauge measurements show that relative sea-level rise is higher in the northern coastal plain (north of Cape Lookout) than in the southern coastal plain, according to NCDEQ. This is at least partly because the northern coastal plain is subsiding, or sinking, at a higher rate than the southern plain.

The NCDEQ said possible impacts of sea-level rise in the state include:

▪ Accelerated coastal erosion.

▪ Higher storm surge flooding and property damage.

▪ Contamination of drinking water with seawater.

▪ Increased likelihood of flooding during heavy rain.

▪ More frequent flooding and drainage issues.

▪ Saltwater intrusion and salinity changes.

▪ Changes in the availability and distribution of fish.

Possible causes of East Coast flooding

Scientists have isolated several factors that appear to make the U.S. southeastern coast a hot spot of sea level rise.

One is the role of the Gulf Stream, a warm and fast Atlantic Ocean current that runs from the Gulf of Mexico to the tip of Florida, and then follows the East Coast. The Gulf Stream influences the East Coast's climate.

An example of the Gulf Stream's effect on sea level rise can be seen in 2016's Hurricane Matthew, the first Category 5 Atlantic hurricane since 2007's Felix. Matthew caused massive flooding, power outages and millions of dollars in damage throughout North Carolina because of its relentless rain and high sea levels that blocked drainage in North Carolina and in Virginia.

Matthew slowed in the Gulf Stream, stalling out over parts of the Southeast — worsening its effects.

Scientists attribute the rapidly rising sea levels from Cape Hatteras in North Carolina to Miami from 2011 to 2015 (as much as 5 inches in some areas) to El Niño and other weather phenomena, including wind patterns that lead to higher water on the Eastern Seaboard.

Charlotte Observer staff writer Bruce Henderson contributed to this report.

Weeds take over kelp in high CO2 oceans

Adelaide, Australia (SPX) May 04, 2018

Kelp forests are increasingly being replaced by extensive covers of weeds where local pollution occurs. Global pollution, in the form of carbon emissions, is likely to accelerate this switch.

Weedy plants will thrive and displace long-lived, ecologically valuable kelp forests under forecast ocean acidification, new research from the University of Adelaide shows.

Published in the journal Ecology, the researchers describe how kelp forests are displaced by weedy marine plants in high CO2 conditions, equivalent to those predicted for the turn of the century.

Carbon emissions will fuel the growth of small weedlike species, but not kelps - allowing weeds to take over large tracts of coastal habitats, the researchers say.

"Carbon emissions might boost plant life in the oceans, but not all plant life will benefit equally," says project leader Professor Sean Connell, from the University of Adelaide's Environment Institute. "Weedy species are quicker to capitalise on nutrients, such as carbon, and can grow faster than their natural predators can consume them.

"Unfortunately, the CO2 that humans are pumping into the atmosphere by burning fossil fuels gets absorbed by the ocean and favours weedy turfs, which replace kelp forests that support higher coastal productivity and biodiversity."

Led by the University of Adelaide, the international team from Europe, Canada, the USA and Hong Kong used natural volcanic CO2 seeps to compare today's growth of weeds and kelps with their growth at the levels of CO2 that are predicted for the turn of the century.

"In our study, we found that while elevated CO2 caused some weeds to be eaten in greater amounts, the dominant sea urchin predator ate these weeds at reduced amounts. This enabled the weeds to escape their natural controls and expand across coasts near the elevated CO2," says Professor Connell.

Fellow researchers Dr Zoe Doubleday and Professor Ivan Nagelkerken, from the University's Southern Seas Ecology Laboratories, visited the volcanic vents with Professor Connell.

"We could clearly see the effect of CO2 on promoting the dominance of weedy species and the suppression of their natural predators," says Dr Doubleday.

Professor Nagelkerken says: "Under the level of acidification we will find in oceans in a few decades, marine life is likely to be dominated by fast-growing and opportunistic species at the expense of longer-lived species with specialist lifestyles, unless we can set some change in place.

"We need to consider how natural enemies might be managed so that those weedy species are kept under control," Professor Nagelkerken says.

Europe's first solar panel recycling plant opens in France

by Reuters

Tuesday, 26 June 2018 06:32 GMT

The plant is set to recycle 1,300 tonnes of solar panels in 2018 - virtually all solar panels that will reach their end of life in France this year

* Veolia to recycle virtually all French solar panels

* PV panels for now recycled mainly with old glass

* Renewables agency expects huge growth in PV recycling value

* Veolia aims to build PV recycling plants abroad

By Geert De Clercq

PARIS, June 25 (Reuters) - French water and waste group Veolia has opened what it says is Europe's first recycling plant for solar panels and aims to build more as thousands of tonnes of ageing solar panels are set to reach the end of their life in coming years.

The new plant in Rousset, southern France, has a contract with solar industry recycling organisation PV Cycle France to recycle 1,300 tonnes of solar panels in 2018 - virtually all solar panels that will reach their end of life in France this year - and is set to ramp up to 4,000 tonnes by 2022.

"This is the first dedicated solar panel recycling plant in Europe, possibly in the world," Gilles Carsuzaa, head of electronics recycling at Veolia, told reporters.

The first ageing photovoltaic (PV) panels - which have lifespans of around 25 years - are just now beginning to come off rooftops and solar plants in volumes sufficiently steady and significant to warrant building a dedicated plant, Veolia said.

Up until now, ageing or broken solar panels have typically been recycled in general-purpose glass recycling facilities, where only their glass and aluminium frames are recovered and their specialty glass is mixed in with other glass. The remainder is often burned in cement ovens.

Employees work at Veolia’s solar panel recycling plant in Rousset, France, June 25, 2018. At the plant, photovoltaic panels are disassembled and their constituent parts, such as glass, aluminium, silicon and plastics, are recycled. REUTERS/Jean-Paul Pelissier

In a 2016 study on solar panel recycling, the International Renewable Energy Agency (IRENA) said that in the long term, building dedicated PV panel recycling plants makes sense. It estimates that recovered materials could be worth $450 million by 2030 and exceed $15 billion by 2050.

The robots in Veolia's new plant disassemble the panels to recover glass, silicon, plastics, copper and silver, which are crushed into granulates that can be used to make new panels.

A typical crystalline silicon solar panel is made up of 65-75 percent glass, 10-15 percent aluminium for the frame, 10 percent plastic and just 3-5 percent silicon. The new plant does not recycle thin-film solar panels, which make up just a small percentage of the French market.
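As a rough, illustrative reading of those composition figures, the short sketch below applies midpoints of the quoted ranges to the plant's 1,300-tonne contract for 2018; the exact split varies by panel model, so the tonnages are only indicative and are not Veolia's own numbers.

```python
# Illustrative only: approximate material recovered from 1,300 tonnes of
# crystalline silicon panels, using midpoints of the composition ranges
# quoted above (assumed values, not Veolia figures).
CONTRACT_TONNES = 1_300

composition = {
    "glass": 0.70,       # 65-75 percent
    "aluminium": 0.125,  # 10-15 percent
    "plastic": 0.10,     # ~10 percent
    "silicon": 0.04,     # 3-5 percent
}

for material, fraction in composition.items():
    print(f"{material:>9}: ~{CONTRACT_TONNES * fraction:,.0f} tonnes")
# The remaining few percent covers copper, silver and other components.
```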

Veolia said it aims to recycle all decommissioned PV panels in France and wants to use this experience to build similar plants abroad.

Installed solar capacity is growing 30 to 40 percent per year in France, with 53,000 tonnes installed in 2016 and 84,000 tonnes in 2017, Veolia said.

Worldwide, Veolia expects the tonnage of decommissioned PV panels to grow to several tens of millions of tonnes by 2050.

IRENA estimated that global PV waste streams will grow from 250,000 tonnes at the end of 2016 - less than one percent of installed capacity - to more than five million tonnes by 2050. By then, the amount of PV waste will almost match the mass contained in new installations, it said.

Justice Kennedy’s Retirement Could Reshape the Environment

A new justice will likely weaken the Clean Air Act, Clean Water Act, and Endangered Species Act.

ROBINSON MEYER

JUN 27, 2018

Justice Anthony Kennedy arrives at the funeral of fellow justice Antonin Scalia in February 2016. CARLOS BARRIA / REUTERS

The retirement of Justice Anthony Kennedy, announced Wednesday in a letter hand-delivered to President Trump, could bring about sweeping changes to U.S. environmental law, endangering the federal government’s authority to fight climate change and care for the natural world.

With Kennedy gone, a more conservative Supreme Court could overhaul key aspects of the Clean Air Act, the Clean Water Act, and the Endangered Species Act, legal scholars say. And any new justice selected by President Trump would likely seek to weaken the Environmental Protection Agency, curtail its ability to fight global warming, and weaken its protections over wetlands.

The reason has to do with simple math. As on many other issues, Kennedy has functioned as the court’s swing vote on the environment, occasionally joining with the court’s four more liberal justices to preserve some aspect of green law.

“He’s been on the court just over 30 years, and he’s been in the majority in every single environmental case but one. You don’t win without Kennedy,” said Richard Lazarus, a law professor at Harvard who has argued 14 cases in front of the Supreme Court.

“I think more than the other more conservative justices, Kennedy seemed open to embracing the idea that tough national laws were necessary to address some types of problems,” he told me. “He was concerned about private-property rights and the marketplace, but open to the necessity of tough environmental laws.”

Other legal scholars and environmental advocates agreed.

“The loss of Kennedy is not good news for environmental regulation,” said Ann Carlson, a law professor at UCLA and the co-director of the Emmett Institute on Climate Change and the Environment.

The nation’s highest court would now “almost certainly” be more hostile to environmental law than it has been since the founding of the EPA in 1970, said Jonathan Z. Cannon, a law professor at the University of Virginia.

“With the departure of Justice Kennedy, this is no time to mince words: We are in for the fight of our lives,” said Trip Van Noppen, the president of the environmental-legal-advocacy group Earthjustice, in a statement. “Trump intends to fill this Supreme Court vacancy with someone who will put corporations, the wealthy, and the powerful above the rest of us. We must do everything in our power to resist this.”

Experts said that Kennedy’s departure could change the outcome of near-term rulings on three different questions: Can the government fight climate change? How broadly can it regulate clean water? And does the Constitution even allow it to protect endangered species and regulate pollution?

Kennedy provided the crucial fifth vote in Massachusetts v. EPA, which is the most important court case in U.S. climate law. In that decision, the Supreme Court said the EPA could regulate greenhouse gases under the Clean Air Act.

The ruling meant that the president could regulate carbon dioxide and other heat-trapping gases without requiring Congress to pass a new law. It set the stage for President Obama’s broad set of climate-targeted regulations, including his fuel-economy rules for cars and anti-carbon plan for the electricity sector.

Cannon, the University of Virginia law professor whose legal arguments shaped the case, told me that Kennedy’s vote in Massachusetts was “an environmentalist triumph that would not have happened if he had held ranks with other conservatives.”

Had Kennedy not joined with the liberals, then the EPA would likely not have authority over greenhouse gases. With Kennedy gone, this may soon come to pass.

“It’s easy to think about the loss of Kennedy leading to either the repeal of Mass. v. EPA or a serious restriction to the Clean Air Act’s ability to regulate greenhouse gases,” Carlson told me.

Lazarus, the Harvard professor, disagreed that a more conservative court would overturn Massachusetts v. EPA. “I assume [the decision] itself, that greenhouse gases are air pollutants, will hold. That’s a Constitutional law question … I don’t think we’re going to run roughshod over that,” he told me.

But he worried a future court could limit the ability of environmentalists to gain standing. That is, it could seriously restrict the ability of private Americans to sue the federal government for failing to respond to climate change.

Even more likely than these changes to climate law, experts said, is that Kennedy’s successor will curtail the Clean Water Act. Specifically, he or she would make it much easier to—and this is not a joke—drain the swamps.

The Clean Water Act, passed over President Nixon’s veto in 1972, allows the government to regulate the conservation and protection of rivers, streams, and any other “waters of the United States.” It also prevents companies from draining wetlands or dumping pollutants in those waters without a permit.

The only problem: The phrase “the waters of the United States” has never been completely defined. This means that it isn’t totally clear which bodies of water—and especially which wetlands, which tend to touch many different rivers and streams—are subject to EPA conservation and pollution control.

When the Supreme Court last examined that problem, in 2006, the justices ruled in an unusual way: 4-1-4. The four liberal justices wanted to preserve broad protections for wetlands; the four conservative justices wanted to narrow them.

Kennedy wound up in the middle. He wrote his own opinion, recommending that wetlands be subject to federal regulation only if they had a “significant nexus” with navigable waters. This argument would have preserved much of the federal government’s ability to regulate wetlands.

A few years later, when the Obama administration tried to define “waters of the United States” once and for all, EPA lawyers looked to Kennedy’s opinion. “The Obama administration’s approach to regulating wetlands basically came straight out of Kennedy’s reasoning in that case,” Carlson told me.

It was a savvy bit of rule-writing: Since Kennedy would likely rule on a case about their rule, why not adopt his legal thinking? The only problem: “Obviously, now, Kennedy will be gone,” Lazarus said.

Had he stayed on the court for another year or two, Kennedy likely would have ruled on this very question. In January, the EPA Administrator Scott Pruitt announced that the Trump administration would suspend the Obama administration’s Kennedy-inspired wetlands rule and replace it with a far weaker policy. States and environmental groups promptly sued Pruitt, setting up a legal fight that has a good chance of reaching the high court.

“I think Kennedy would have struck down a serious attempt to repeal federal jurisdiction [over wetlands],” Carlson told me. “Kennedy had a much more sophisticated view of why the environmental protection of wetlands made sense—and not just for the wetlands themselves. Losing him could wreak havoc.”

Finally, Kennedy’s retirement could allow the high court to rule on a broad Constitutional question about whether the government even has the power to make and enforce environmental policy in the first place.

In October, for instance, the Supreme Court will hear arguments in Weyerhaeuser Company v. United States, which could address whether parts of the Endangered Species Act are unconstitutional. Plaintiffs in that case argue that the federal government doesn’t have Constitutional authority to force land owners to protect the habitats of endangered species.

“That is a hotly contentious issue in the Supreme Court, and it’s a case that one would expect going in would have a good chance of coming out 5-4,” Lazarus said. “So replacing Kennedy with anybody else makes a big difference.”

His absence from the bench could also reopen the Supreme Court to rule on the “takings clause,” which limits the federal government’s ability to take private property “for public use, without just compensation.” During the 1980s, conservative justices pushed for more and more aggressive readings of the takings clause. Decisions increasingly leaned toward the idea that the government should compensate companies that it regulated.

But when Kennedy joined the court in 1988, the speed of those rulings slowed. A more conservative court could push the idea again. “If you got a Supreme Court more interested in protecting private-property rights, then you could really get a curtailment of the government’s right to regulate under the Endangered Species Act and the Clean Water Act,” Carlson said.

A more conservative court could also end the legal principle, established during the Reagan administration, that government agencies like the EPA should have a wide latitude in interpreting the laws that affect them. The idea, named “Chevron deference” after a key court case, has undergirded many Democratic policy victories in the past three decades. Now, says Lazarus, it is “certainly at risk.”

He told me he expects deference to be a big issue during the confirmation hearings for Kennedy's replacement.

No matter who Trump picks for the Supreme Court, it is clear that an era of American environmental law—built around the preferences of one man—is coming to an end. Environmental lawyers may no longer work as hard to make their arguments Kennedy-esque: respectful of private property, but comfortable with a muscular government.

“Kennedy shared the right’s concerns about the scope of federal power, but he was reluctant to draw sharp bright lines limiting such power,” said Jonathan Adler, a law professor at Case Western Reserve, in an email. “He was similarly reluctant to endorse a view of standing that would significantly constrain environmental claims. These tendencies meant he was often the swing vote, and his solo opinions often determined the contours of the law.”

I asked Lazarus whether Kennedy’s departure made him more worried to argue another environmental case in front of the court. He waved me off.

“No, not at all, you work with what you’re given,” he said. “It’s the American voters’ job to elect the right people to the White House, and then it’s my job to argue the best case I can in front of the justices that are presented.

“And these sorts of moments,” he added, “underscore the stakes.”

ROBINSON MEYER is a staff writer at The Atlantic, where he covers climate change and technology.

Our Planet Lost 40 Football Fields of Tree Cover Every Single Minute in 2017

MIKE MCRAE 28 JUN 2018

Last year, 39 million acres of forest cover were lost from the world's tropics.

The good news is that this figure is a little lower than the record amounts of canopy destroyed in 2016. But that's pretty much the only silver lining here. A less generous interpretation of the data suggests there's no sign of the trend reversing.

The snapshot of how much tropical forest lost significant cover in 2017 is based on data gathered by the University of Maryland as part of the US-based World Resources Institute's Global Forest Watch.

To be precise, lost tree cover isn't quite the same thing as deforestation, which actually – thankfully – seems to be declining.

Tree cover loss describes the removal of 30 percent of the canopy in both managed and wild wooded ecosystems, most commonly as a result of natural disasters or fires set by humans.

If 39 million acres is hard to wrap your head around, it's close to 160,000 square kilometres (about 60,000 square miles).

Still can't picture it? Nepal covers 147,181 square kilometres. So this is bigger than Nepal.

To lose that much cover, you would have to strip the trees from an area of 40 standard American football fields. Every minute. For a year.

When we picture tropical forest, it's hard to not think of the Amazon first. And in 2016, Brazil lost 9.1 million acres of its portion of Amazonian tree cover – three times more than the previous year.

This jump was caused by widespread fires rather than deforestation, which still manage to do a thorough job of reducing biodiversity and biomass storage.

Considering the Amazon region had more fires in 2017 than in any other year since recording began in 1999, any ground gained in locking up carbon through past deforestation laws was well and truly set back.

Climate change is a significant contributing factor not just towards large scale fires, but various tree-stripping weather events. On the island of Dominica, an extreme hurricane season stripped bare a third of its tree cover in 2017. Similarly, Puerto Rico lost 10 percent of its island's canopy.

The biggest loser in the report was Colombia, which saw close to a 50 percent spike in tree cover loss. Yet the cause was more political than climate-related, and just as challenging to resolve.

The recent disarming of the major guerrilla movement known as the Revolutionary Armed Forces of Colombia (FARC) has seen the group lose control over large sections of remote forest.

Where once the rebel faction kept commercial interests out of the wilderness, their removal has opened the way for illegal land clearance for coca plantations, logging, and pastures.

It's important to note that while this year's report technically shows an improvement, an average taken over the past three years still shows the problem is worsening. And if we want to see a trend, that's the more accurate number to go for.

Not that it's doom and gloom everywhere. Indonesia saw such a big reduction in 2017's tree cover loss, its three year average has improved as well.

A 2016 moratorium on converting peatland for agriculture is thought to have played a role in the drop, which, combined with ongoing extensions to a 2011 halt on licences for using primary forest, is worth cheering.

Indonesia's efforts show the rest of the world what can be done, which is a step in the right direction.

Measures to limit deforestation are to be applauded, and seem to be working. But curbing the impact of global warming on our climate and reining in the poverty that encourages people to clear forest cover for crops and pastures remain important challenges.

Ironically, we need healthy forests to play a role in managing both of these issues. It's a vicious cycle, and one that we quickly need to put the brakes on.

Green electricity isn't enough to curb global warming

by Brooks Hays, Washington (UPI) Jun 26, 2018

The adoption of clean energies to power electric grids won't be sufficient to meet the Paris climate targets established by the United Nations.

According to new research published in the journal Nature Climate Change, the continued use of fossil fuels for a variety of industrial processes, to power vehicles and heat buildings, is likely to push CO2 emissions beyond manageable levels.

"We focused on the role of fossil fuel emissions that originate in industries like cement or steel making, fuel our transport sector from cars to freight to aviation and goes into heating our buildings," Shinichiro Fujimori, researcher from the National Institute for Environmental Studies and Kyoto University in Japan, said in a news release. "These sectors are much more difficult to decarbonize than our energy supply, as there are no such obvious options available as wind and solar electricity generation."

According to Fujimori and his colleagues, green transportation is essential to meeting CO2 emissions targets.

The Paris agreement called on nations to progressively curtail CO2 emissions in order to limit global warming to 1.5 degrees Celsius. To meet this target, scientists suggest no more than 200 gigatons of CO2 can be released between now and 2100. If current fossil fuel-use trends continue, however, 4,000 gigatons of CO2 will have been emitted by the end of the century.
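To make the scale of that gap concrete, here is a minimal back-of-the-envelope sketch using only the two figures quoted above; it is illustrative arithmetic, not a calculation from the study itself.

```python
# Remaining CO2 budget for 1.5 C versus projected emissions under current
# fossil-fuel trends, as quoted in the article (gigatonnes of CO2).
budget_gt = 200
projected_gt = 4_000

print(f"Overshoot: {projected_gt - budget_gt} Gt CO2")
print(f"Projected emissions are {projected_gt / budget_gt:.0f}x the budget")
```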

Authors of the new study argue that relying on carbon capture and storage technologies is a dangerous strategy. Pulling carbon from the atmosphere is likely a necessity, scientists admit, but major industries, including the transportation industry, must also end their use of fossil fuels.

Researchers argue that climate pledges made by individual countries must be strengthened sooner rather than later in order to prevent continued investments in fossil fuel infrastructure, investments that lock in continued CO2 emissions.

"Climate mitigation might be a complex challenge, but it boils down to quite simple math in the end: If the Paris targets are to be met, future CO2 emissions have to be kept within a finite budget," said Elmar Kriegler, a scientist at the Potsdam-Institute for Climate Impact Research. "The more the budget is overrun, the more relevant will carbon dioxide removal technologies become, and those come with great uncertainties."

According to Kriegler, the precise CO2 budget may be difficult to calculate, but the solution to the threat of global warming is clear.

Last straw for McDonald's, Burger King in Mumbai plastic ban

by Staff Writers

Mumbai (AFP) June 26, 2018

Burger King, McDonald's and Starbucks are among dozens of companies fined for violating a new ban on single-use plastics in India's commercial capital Mumbai, an official said Tuesday.

The rules, in force since Saturday, prohibit the use of disposable plastic items such as bags, cutlery, cups and bottles under a certain size.

Businesses and residents face fines ranging from 5,000 rupees ($73) for a first-time offence to 25,000 rupees ($367), or even three months in jail, for repeat offending.

Some 250 officials, wearing blue uniforms and dubbed Mumbai's "anti-plastic squad", have been deployed to carry out inspections of restaurants and shops across the teeming coastal city of 20 million.

Nidhi Choudhari, a deputy municipal commissioner in charge of enforcing the ban, said 660,000 rupees ($9,684) in fines had been collected during the first three days.

She said 132 premises had been issued with penalties including outlets of Burger King, McDonald's and Starbucks.

A branch of Godrej Nature's Basket, a high-end Indian supermarket, had also been penalised, Choudhari added.

"All were fined for using banned plastic straws and disposable cutlery etc," she told AFP.

A spokesperson for Starbucks in India said the company complies with local laws in all of its markets and was committed to "environmental sustainability".

Hardcastle Restaurants, which runs the McDonald's franchise in Mumbai, said it had "successfully transitioned from plastic to eco-friendly and biodegradable alternatives" such as wooden cutlery.

Authorities hope the ban will help clean up Mumbai's beaches and streets, which like other cities in India are awash with vast mountains of plastic rubbish.

Plastic has also been blamed for blocking drains and contributing to flooding during the city's four-month-long summer monsoon.

Authorities first announced the ban -- which covers the whole of Maharashtra state, of which Mumbai is the capital -- three months ago to allow businesses to prepare.

The majority of India's 29 states have a full or partial ban on single-use plastics but the law is rarely enforced.

Choudhari said more than 8,000 businesses had been searched in Mumbai alone and at least 700 kilogrammes (1,500 pounds) of plastic seized.

Small traders, however, have claimed that the crackdown threatens their livelihoods.

Retailers' associations say confusion over what is and isn't allowed has led small grocery stores to remain closed for fear of being fined.

The Plastic Bags Manufacturers Association of India estimates that 300,000 people employed in the industry could lose their jobs.

The United Nations warned earlier this month that the world could be awash with 12 billion tonnes of plastic trash by the middle of the century if use is maintained at current levels.

Prime Minister Narendra Modi recently pledged to make India, which was the host of this year's World Environment Day, free of single-use plastic by 2022.

Industrial microbes could feed cattle, pigs and chicken with less damage to the environment

Potsdam, Germany (SPX) Jun 26, 2018

Deforestation, greenhouse gas emissions, biodiversity loss, nitrogen pollution - today's agricultural feed cultivation for cattle, pigs and chicken comes with tremendous impacts for the environment and climate. Cultivating feed in industrial facilities instead of on croplands might help to alleviate these impacts along the agricultural food supply chain.

Protein-rich microbes, produced in large-scale industrial facilities, are likely to increasingly replace traditional crop-based feed. A new study now published in the journal Environmental Science and Technology for the first time estimates the economic and environmental potential of feeding microbial protein to pigs, cattle and chicken on a global scale. The researchers find that by replacing only 2 percent of livestock feed by protein-rich microbes, more than 5 percent of agricultural greenhouse gas emissions, global cropland area and global nitrogen losses could each be decreased.

"Chicken, pigs and cattle munch away about half of the protein feed cultivated on global croplands," says Benjamin Leon Bodirsky, author of the study from the Potsdam Institute for Climate Impact Research (PIK, member of the Leibniz Association). Without drastic changes to the agro-food system, the rising food and animal feed demand that comes with our meat-based diets will lead to continuous deforestation, biodiversity loss, nutrient pollution, and climate-impacting emissions.

"However, a new technology has emerged that might avoid these negative environmental impacts: Microbes can be cultivated with energy, nitrogen and carbon in industrial facilities to produce protein powders, which are then fed instead of soybeans to animals. Cultivating feed protein in labs instead of using croplands might be able to mitigate some environmental and climatic impacts of feed production. And our study expects that microbial protein will emerge even without policy support, as it is indeed economically profitable".

Small feed changes could have a substantial environmental impact

The study is based on computer simulations that assess the economic potential and environmental impacts of microbial protein production until the middle of the century. The simulations show that globally between 175 and 307 million tons of microbial protein could replace conventional concentrate feed like soybeans. So by replacing just roughly 2 percent of the livestock feed, pressure on deforestation, agricultural greenhouse gas emissions and nitrogen losses from cropland could be decreased by more than 5 percent - namely 6 percent for global cropland area, 7 percent for agricultural greenhouse gas emissions and 8 percent for global nitrogen losses.

"In practice, breeding microbes like bacteria, yeast, fungi or algae could substitute protein-rich crops like soybeans and cereals. This method was originally developed during the cold war for space travel and uses energy, carbon and nitrogen fertilizers to grow protein-rich microbes in the lab," explains Ilje Pikaar from the University of Queensland in Australia.

For their new study, the researchers considered five different ways to breed microbes: By using natural gas or hydrogen, feed production could be completely decoupled from cultivating cropland. This landless production avoids any pollution due to agricultural production, but it also comes with a huge energy demand. Other processes that make use of photosynthesis by upgrading sugar, biogas or syngas from agricultural origin to high-value protein result in lower environmental benefits; some eventually even in an increase in nitrogen pollution and greenhouse gas emissions.

Microbial protein alone will not be enough for making our agriculture sustainable

"Feeding microbial protein would not affect livestock productivity," stresses author Isabelle Weindl from PIK. "In contrast, it could even have positive effects on animal growth performance or milk production". But even though the technology is economically profitable, the adoption of this new technology might still face constraints such as habitual factors in farm management, risk-aversion towards new technologies, or lacking market access. "However, pricing environmental damages in the agricultural sector could make this technology even more economically competitive," says Weindl.

"Our findings clearly highlight that the switch to microbial protein alone will not be enough for sustainably transforming our agriculture," says co-author Alexander Popp from PIK. To reduce the environmental impact of the food supply chain, major structural changes in the agro-food system are required as well as changes in human dietary patterns towards more vegetables.

"For our environment and the climate as well as our own health, it might actually be another considerable option to reduce or even skip the livestock ingredient in the food supply chain. After further advances in technology, microbial protein could also become a direct part of the human diet - using space food for people's own nutrition."

Article: Ilje Pikaar, Silvio Matassa, Benjamin L. Bodirsky, Isabelle Weindl, Florian Humpenoder, Korneel Rabaey, Nico Boon, Michele Bruschi, Zhiguo Yuan, Hannah van Zanten, Mario Herrero, Willy Verstraete, Alexander Popp (2018): Decoupling Livestock from Land Use through Industrial Feed Production Pathways. Environmental Science and Technology [DOI:10.1021/acs.est.8b00216]

Waste Heat: Innovators Turn to an Overlooked Energy Resource

Nearly three-quarters of all the energy produced by humanity is squandered as waste heat. Now, large businesses, high-tech operations such as data centers, and governments are exploring innovative technologies to capture and reuse this vast renewable energy source.

By Nicola Jones • May 29, 2018

When you think of Facebook and “hot air,” a stream of pointless online chatter might be what comes to mind. But the company will soon be putting its literal hot air — the waste heat pumped out by one of its data centers — to good environmental use. That center, in Odense, Denmark, plans to channel its waste heat to warm nearly 7,000 homes when it opens in 2020.

Waste heat is everywhere. Every time an engine runs, a machine clunks away, or any work is done by anything, heat is generated. That’s a law of thermodynamics. More often than not, that heat gets thrown away, dribbling out into the atmosphere. The scale of this invisible garbage is huge: About 70 percent of all the energy produced by humanity gets chucked as waste heat.

“It’s the biggest source of energy on the planet,” says Joseph King, one of the program directors for the U.S. government’s Advanced Research Projects Agency-Energy (ARPA-E), an agency started in 2009 with the mission of funding high-risk technology projects with high potential benefit. One of the agency’s main missions is to hike up energy efficiency, which means both avoiding making so much waste heat in the first place, and making the most of the heat that’s there. ARPA-E has funded a host of innovative projects in that realm, including a $3.5 million grant for RedWave Energy, which aims to capture the low-temperature wasted heat from places like power plants using arrays of innovative miniature antennae.

The problem is not so much that waste heat directly warms the atmosphere — the heat we throw into the air accounts for just 1 percent of climate change. Instead, the problem is one of wastage. If the energy is there, we should use it. For a long time, says Simon Fraser University engineer Majid Bahrami, many simply haven’t bothered. “The attitude has been that the environment can take this waste; we have other things to worry about,” he says. “Now we have to be more efficient. This is the time to have this conversation.”

The global demand for energy is booming — it’s set to bump up nearly 30 percent by 2040. And every bit of waste heat recycled into energy saves some fuel — often fossil fuels — from doing the same job. Crunching the exact numbers on the projected savings is hard to do, but the potential is huge. One study showed that the heat-needy United Kingdom, for example, could prevent 10 million tons of carbon dioxide emissions annually (about 2 percent of the country’s total) just by diverting waste heat from some of the UK’s biggest power stations to warm homes and offices. And that’s not even considering any higher-tech solutions for capturing and using waste heat, many of which are now in the offing.

To help reduce carbon emissions — not to mention saving money and lessening reliance on foreign fuel imports — governments are increasingly pushing for policies and incentives to encourage more waste heat usage, big businesses like IBM are exploring innovative technologies, and start-ups are emerging to sell technologies that turn lukewarm heat into usable electricity.

For more than a century, waste heat has been used for its most obvious application: heat (think of your car, which uses waste heat from its engine to heat your interior). In 1882, when Thomas Edison built the world’s first commercial power plant in Manhattan, he sold its steam to heat nearby buildings. This co-generation of electricity and usable heat is remarkably efficient. Today, in the United States, most fossil fuel-burning power plants are about 33 percent efficient, while combined heat and power (CHP) plants are typically 60 to 80 percent efficient.
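The back-of-the-envelope sketch below shows why those efficiency figures matter, assuming a 33 percent electric-only plant and a 75 percent combined efficiency for CHP (a midpoint of the 60 to 80 percent range above); the numbers are illustrative and not drawn from any specific facility.

```python
# Illustrative comparison: useful energy delivered per 100 units of fuel
# burned, for an electricity-only plant versus a combined heat and power
# (CHP) plant, using the efficiencies quoted above (assumed midpoint for CHP).
fuel_units = 100.0

electricity_only = fuel_units * 0.33   # electricity out; the rest is waste heat
chp_total = fuel_units * 0.75          # electricity plus captured, usable heat

print(f"Electricity-only plant: {electricity_only:.0f} of {fuel_units:.0f} units useful")
print(f"CHP plant:              {chp_total:.0f} of {fuel_units:.0f} units useful")
```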

When it opens in 2020, Facebook's new data center in Odense, Denmark will channel its waste heat to warm nearly 7,000 homes. Facebook

That seems to make co-generation a no-brainer. But heat is harder to transport than electricity — the losses over piped distances are huge — and there isn’t always a ready market for heat sitting next to a power plant or industrial facility. Today, only about 10 percent of electricity generation in the U.S. produces both power and usable heat; the Department of Energy has a program specifically to boost CHP, and considers 20 percent a reasonable target by 2030.

Other countries have an easier time thanks to existing district heating infrastructure, which uses locally produced heat to, typically, pipe hot water into homes. Denmark is a leader here. In response to the 1970s oil crisis, the country began switching to other energy sources, including burning biomass, which lend themselves to district heating. As a result, Denmark has an array of innovative waste-heat capture projects that can be added onto existing systems, including the upcoming Facebook data center.

In 2010, for example, Aalborg’s crematorium started using its waste heat to warm Danish homes (after the Danish Council of Ethics judged it a moral thing to do). Others are joining in. In Cologne, Germany, the heat of sewage warms a handful of schools. In London, the heat from the underground rail system is being channelled to heat homes in Islington. An IBM data center in Switzerland is being used to heat a nearby swimming pool. “Data centers crop up again and again as having huge potential,” says Tanja Groth, an energy manager and economist with the UK’s Carbon Trust, a non-profit that aims to reduce carbon emissions.

An alternative option is to turn waste heat into easier-to-transport electricity. While many power plants do that already, regulators striving for energy security are keen to push this idea for independent power producers like large manufacturers, says Groth. Businesses that make their own power would reduce carbon emissions by getting any extra electrical juice they need by squeezing it out of their waste heat, rather than buying it from the grid.

Several companies have popped up to help do just this. One of the largest, Turboden, based in Brescia, Italy, sells a mechanical system based on the Organic Rankine Cycle. This is a type of external combustion engine — an idea that pre-dates the internal combustion engine used in cars. Rankine engines and similar technologies use sealed, closed-loop systems in which a working fluid expands from liquid to gas to do work, driven by a temperature difference on the outside of the system — so you can drive a power-generating engine off waste heat. When a cement plant in Bavaria, for example, added a Rankine engine to its system a decade ago, it reduced its electricity demand by 12 percent and its CO2 emissions by about 7,000 tons.
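For a sense of the physics behind such heat engines, the short sketch below computes the Carnot limit, the maximum fraction of a heat stream that any engine can convert to work given its hot and cold temperatures. Real Organic Rankine Cycle systems recover only part of this limit, and the 150 C / 25 C figures are assumed for illustration, not taken from Turboden or the article.

```python
# Carnot limit: the theoretical ceiling on converting heat to work for a
# given temperature difference. Real waste-heat engines fall well below it.
def carnot_efficiency(t_hot_c: float, t_cold_c: float) -> float:
    """Maximum efficiency for heat supplied at t_hot_c and rejected at
    t_cold_c, both in degrees Celsius."""
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return 1.0 - t_cold_k / t_hot_k

# Example with assumed temperatures: a 150 C exhaust stream, 25 C ambient.
print(f"Carnot limit: {carnot_efficiency(150, 25):.0%}")  # about 30 percent
```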

Since 2010, Turboden says it has sold systems for waste heat recovery to 28 production plants, with seven more under construction now. Turboden is just one of many; the Swedish-based company Climeon, for example, endorsed by spaceflight entrepreneur Richard Branson, uses a similar but different technique to make an efficient heat engine that can be bolted onto anything industrial, from cement plants to steel mills, in order to recycle their waste heat.

Waste heat is a problem of a thousand cuts, requiring a mass of innovations to tackle different slices of the problem: a system that works for one temperature range, for example, might not work for another, and some waste heat streams are contaminated with corrosive pollutants. “We aren’t looking for a silver bullet,” says Bahrami. “There are so many different things that can be done and should be done.”

Heat radiates from the Grangemouth Oil Refinery in Scotland. About 70 percent of all energy produced globally gets discarded as waste heat.

Bahrami and others are pursuing solid-state systems for waste heat recovery, which, with no moving parts, can in theory be smaller and more robust than mechanical engines. There are a wide array of ways to do this, based on different bits of physics: thermoacoustics, thermionics, thermophotovoltaics, and more, each with pros and cons in terms of their efficiency, cost, and suitability to different conditions.

“Thermoelectrics have been the major player in this space for years,” says Lane Martin, a materials scientist at the University of California, Berkeley and Lawrence Berkeley National Laboratory. Seiko released a “thermic watch” in 1998 that ran off the heat of your wrist, for example, and you can buy a little thermoelectric unit that will charge your cell phone off your campfire. Researchers are trying hard to increase the efficiency of such devices so they make economic sense for wide-scale use. That means screening thousands of promising new materials to find ones that work better than today’s semiconductors, or tweaking the microstructure of how they’re built.

The biggest technological challenge is to pull energy from the lukewarm end of the spectrum of waste heat: More than 60 percent of global waste heat is under the boiling point of water, and the cooler it is, the harder it is to pull usable energy from it. Martin’s group is tackling this by investigating pyroelectrics (which, unlike thermoelectrics, work by exploiting electron polarization). This isn’t near commercial application yet; it’s still early days in the lab. But the thin-film materials that Martin’s team is investigating can be tuned to work best at specific temperatures, while thermoelectrics always work better the larger the temperature difference. Martin imagines future systems that stack thermoelectric materials to suck up some of the warmer waste heat, say above 212 degrees Fahrenheit, and then use pyroelectrics to mop up the rest. Martin says his recent work on such materials drummed up interest from a few bitcoin mining operations. “They have a real problem with waste heat,” says Martin. “Unfortunately, I had to tell them it’s a little early; I don’t have a widget I can sell them. But it’s coming.”

Perhaps one of the best applications for waste heat is, ironically, cooling. Air conditioners and fans already account for about 10 percent of global energy consumption, and demand is set to triple by 2050. In urban areas, air conditioners can actually heat the local air by nearly 2 degrees F, in turn driving up the demand for more cooling.

One solution is to use waste heat rather than electricity to cool things down: absorption or adsorption coolers use the energy from heat (instead of electrically driven compression) to condense a refrigerant. Again, this technology exists — absorption refrigerators are often found in recreational vehicles, and tri-generation power plants use such technology to make usable electricity, heat, and cooling all at once. “Dubai and Abu Dhabi are investing heavily in this because, well, they’re not stupid,” says Groth.

But such systems are typically bulky and expensive to install, so again research labs are on a mission to improve them. Project THRIVE, led in part by IBM Research in Rüschlikon, Switzerland, is one player aiming to improve sorption materials for both heating and cooling. They have already shown how to shrink some systems down to a reasonable size. Bahrami’s lab, too, is working on better ways to use waste heat to cool everything from long-haul trucks to electronics.

It’s very hard to know which strategies or companies will pan out. But whatever systems win out, if these researchers have their way, every last drop of usable energy will be sucked from our fuel and mechanical systems. “Waste heat is often an afterthought,” says King. “We’re trying to make it a forethought.”

Nicola Jones is a freelance journalist based in Pemberton, British Columbia, just outside of Vancouver. With a background in chemistry and oceanography, she writes about the physical sciences, most often for the journal Nature. She has also contributed to Scientific American, Globe and Mail, and New Scientist and serves as the science journalist in residence at the University of British Columbia.

South Georgia declared rat-free after centuries of rodent devastation

World’s biggest project to kill off invasive species to protect native wildlife is hailed a success

Fiona Harvey Environment correspondent

Tue 8 May 2018 19.01 EDT

Last modified on Wed 9 May 2018 05.31 EDT

Scientists check the island for rodents. The last of the poisoned bait was dropped more than two years ago.

The world’s biggest project to eradicate a dangerous invasive species has been declared a success, as the remote island of South Georgia is now clear of the rats and mice that had devastated its wildlife for nearly 250 years.

Rats and mice were inadvertently introduced to the island, off the southern tip of South America and close to Antarctica, by ships that stopped there, usually on whaling expeditions. The effect on native bird populations was dramatic. Unused to predators, they laid their eggs on the ground or in burrows, easily accessible to the rodents.

Two species of birds unique to the island, the South Georgia pipit and pintail, were largely confined to a few tiny islands off the coast, which the rodents could not reach, and penguins and other seabird populations were also threatened.

Mike Richardson, the chair of the decade-long £10m project, said: “No rodents were discovered in the [final] survey. To the best of our knowledge, for the first time in two and a half centuries this island is rodent-free. It has been a long, long haul.”

The South Georgia pipit was one of two native birds that had largely been confined to a few tiny islands off the coast as a result of the rodent invasion.

The last of the poisoned bait was dropped more than two years ago, but scientists have spent the intervening period monitoring the island for rodents. Two experienced dog handlers from New Zealand walked three dogs – named Will, Ahu and Wai – across nearly 1,550 miles (2,500km) in often extreme weather, beset by heavy rain and often fierce storms, seeking out signs of rats and mice.

Only when none were found over the whole period and in every area was the project deemed a success according to international standards. The leaders of the eradication effort declared the island free of rats and mice on Wednesday, and said the air resounded with the once-rare song of the native pipit.

The project was led by the South Georgia Heritage Trust, a charity set up to protect the island, and an associate organisation, the US-based Friends of South Georgia Island. The UK government also played a role, but the bulk of the funding came from private fundraising and philanthropy.

Scientists hope the success could become an inspiration and model for other projects around the world to eliminate invasive species, which in the worst cases can drive native animals close to extinction.

The island’s inaccessibility added to the difficulty of the project.

Lord Gardiner, the parliamentary under secretary at the Department for Environment, Food and Rural Affairs, said: “We must not rest on our laurels. In our overseas territories, which make up 90% of the UK’s biodiversity, [many species] are highly vulnerable.”

The South Georgia programme involved dropping hundreds of tonnes of poisoned bait on areas known to be infested with rodents. Harsh weather, mountainous terrain and the island’s limited accessibility made the project fraught with danger.

At times, when scientists believed they had eradicated rats from one area, the rodents returned from another nearby section of the island, meaning the poisoning regime had to begin again. Richardson paid tribute to the teams’ endurance and bravery.

South Georgia, one of the UK’s numerous overseas territories, was first noted by Captain Cook in 1775 on one of his voyages of discovery. It became a stop for the hundreds of whalers which plied the southern seas, providing shelter, dry land and a meeting site for ships that may have spent weeks or months without sight of land.

About 2,000 people lived on the island during whaling’s heyday, but today the main activities are centred on two scientific research stations run by the British Antarctic Survey. It was claimed for Argentina during the Falklands war, and a UK garrison was only withdrawn in 2001. The island is also known as the burial ground of the Antarctic explorer Sir Ernest Shackleton, who died there in 1922.

One of the dogs used to hunt for any remaining rodents. Photograph: Oliver Prince/South Georgia Heritage Trust

About 100 miles long, the island covers about 350,000 hectares (865,000 acres) and much of it is covered in snow and ice. The remainder is very mountainous. Some of the coastal regions have vegetation, which has provided a haven for seabirds, and the pipits and pintails.

Other species on the island include seals – 98% of the world’s population of fur seals breed here, and it is home to about half the global population of elephant seals – and four penguin species, including 450,000 breeding pairs of king penguins. All four of the penguin species are listed as threatened. About 30 million birds are thought to nest and raise chicks on the island, with 81 species recorded.

Invasive species are one of the worst threats to biodiversity around the world. When non-native species are introduced, often inadvertently but also sometimes as pets or ornamental features, they can disrupt natural ecosystems evolved over millennia to the detriment of the native species.

China Has Refused To Recycle The West's Plastics. What Now?

June 28, 20184:02 PM ET by SARA KILEY WATSON

For more than 25 years, many developed countries, including the U.S., have been sending massive amounts of plastic waste to China instead of recycling it on their own.

Some 106 million metric tons — about 45 percent — of the world's plastics set for recycling have been exported to China since reporting to the United Nations Comtrade Database began in 1992.

But in 2017, China passed the National Sword policy banning plastic waste from being imported — for the protection of the environment and people's health — beginning in January 2018.

Now that China won't take it, what's happening to the leftover waste?

According to the authors of a new study, it's piling up.

"We have heard reports of waste accumulating in these places that depend on China," says Amy Brooks, a doctoral student of engineering at the University of Georgia and the lead author of the study published in Scientific Advances last week.

She says some of it is ending up in landfills, being incinerated or sent to other countries "that lack the infrastructure to properly manage it."

By 2030, an estimated 111 million metric tons of plastic waste will be displaced because of China's new law, the study estimates. This is equal to nearly half of all plastic waste that has been imported globally since 1988.
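As a rough, illustrative cross-check of how those two figures fit together, the arithmetic below works only from the numbers reported in this article; it is not a calculation from the study itself.

```python
# Implied total global plastic-waste exports for recycling, derived from the
# article's figures: 106 million metric tons sent to China, said to be about
# 45 percent of the world's plastics set for recycling.
exported_to_china_mt = 106
share_of_global = 0.45
global_imports_mt = exported_to_china_mt / share_of_global   # ~236 million t

displaced_by_2030_mt = 111   # waste displaced by China's ban, per the study
print(f"Implied global imports: ~{global_imports_mt:.0f} million metric tons")
print(f"Displaced share: ~{displaced_by_2030_mt / global_imports_mt:.0%}")  # ~47 percent
```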

The rapid expansion of disposable plastics — and single-use containers — in the 1990s drove imports up sharply. Yearly global imports grew 723 percent, to around 15 million metric tons, from 1993 to 2016.

For developed nations like the United States, it can be more economical to push plastics out of the country rather than recycling them, says Jenna Jambeck, an associate professor of engineering at the University of Georgia and another of the study's authors.

The U.S., Japan and Germany are all at the top of the list when it comes to exporting their used plastic. In the U.S. alone, some 26.7 million tons were sent out of the country between 1988 and 2016.

Hong Kong is the biggest exporter of plastic waste, at 56.1 million tons. But it has acted as an entry point to China — having imported 64.5 million tons from 1988 to 2016 from places like the U.S. (which sent more than 372,000 metric tons there in 2017) and then having sent most of that on to China.

Other nations can buy and recycle these plastics and manufacture more goods for sale or export, as China did, making it profitable for them as well, the study notes. Industry publication Waste Dive reports that the U.S. sent 137,044 metric tons to Vietnam in 2017, up from 66,747 in 2016.

But nations like Malaysia, Thailand or Vietnam, which have picked up some of what China is leaving behind, don't have as well-developed waste management systems. Jambeck says Vietnam has already reached a cap on how much waste it can handle: The country has announced it will not accept any more imports of plastic scraps until October.

"Not one country alone has the capacity to take what China was taking," Jambeck says. "What we need to do is take responsibility in making sure that waste is managed in a way that is responsible, wherever that waste goes — responsible meaning both environmentally and socially."

Marian Chertow, the director of the program on solid waste policy at Yale who was not involved in the study, says these new findings are useful because they confirm what experts thought they knew — nearly half of plastic waste exports have ended up being recycled in China.

"There's a tremendous shift in the market when China won't take half of these plastics. I really think that this export mindset that has developed in the U.S. is one that has to change," she said".

Seattle bans plastic straws, utensils at restaurants, bars

SEATTLE (AP) — Looking for a plastic straw to sip your soda? It’s no longer allowed in Seattle bars and restaurants.

Neither are plastic utensils in the latest push to reduce waste and prevent marine plastic pollution. Businesses that sell food or drinks won’t be allowed to offer the plastic items under a rule that went into effect Sunday.

Seattle is believed to be the first major U.S. city to ban single-use plastic straws and utensils in food service, according to Seattle Public Utilities. The eco-conscious city has been an environmental leader in the U.S., working to aggressively curb the amount of trash that goes into landfills by requiring more options that can be recycled or composted.

The city’s 5,000 restaurants - including Seattle-based Starbucks outlets - will now have to use reusable or compostable utensils, straws and cocktail picks, though the city is encouraging businesses to consider not providing straws altogether or switch to paper rather than compostable plastic straws.

“Plastic pollution is surpassing crisis levels in the world’s oceans, and I’m proud Seattle is leading the way and setting an example for the nation by enacting a plastic straw ban,” Seattle Public Utilities General Manager Mami Hara said in a statement last month.

Proposals to ban plastic straws are being considered in other cities, including New York and San Francisco.

California’s Legislature is considering statewide restrictions, but not an outright ban, on single-use plastic straws. It would block restaurants from providing straws as a default but would still allow a customer to request one. It’s passed the state Assembly and now awaits action in the Senate.

In the United Kingdom, Prime Minister Theresa May announced in April a plan to ban the sale of plastic straws, drink stirrers and plastic-stemmed cotton buds. She called plastic waste “one of the greatest environmental challenges facing the world.”

Smaller cities in California, including Malibu and San Luis Obispo, have restricted the use of plastic straws. San Luis Obispo requires single-use straws only be provided in restaurants, bars and cafes when customers ask for them. City officials said most customers will say “no” if asked if they want a straw.

Business groups have opposed the idea in Hawaii, where legislation to ban plastic straws died this year, the Honolulu Star-Advertiser reported Sunday, with the Hawaii Restaurant Association and Hawaii Food Industry Association testifying against the measure.

Seattle’s ban is part of a 2008 ordinance that requires restaurants and other food-service businesses to find recyclable or compostable alternatives to disposable containers, cups, straws, utensils and other products.

Businesses had time to work toward complying with the ban, said Jillian Henze, a spokeswoman for the Seattle Restaurant Alliance, an industry trade group.

“We’ve almost had a year to seek out products to protect the environment and give customers a good experience (with alternatives),” she said.

The city had allowed exemptions for some products until alternatives could be found. With multiple manufacturers offering alternatives, the city let the exemption for plastic utensils and straws run out over the weekend.

Environmental advocates have been pushing for restaurants and other businesses to ditch single-use straws, saying they can’t be recycled and end up in the ocean, polluting the water and harming sea life.

A “Strawless in Seattle” campaign last fall by the Lonely Whale, in which more than 100 businesses voluntarily took part, helped remove 2.3 million single-use plastic straws.

Supporters say it will take more than banning plastic straws to curb ocean pollution but that ditching them is a good first step and a way to start a conversation about waste and ocean conservation.

Seattle urged businesses to use up their existing inventory of plastic utensils and straws before Sunday. Those who weren’t able to use up their supply have been told to work with the city on a compliance schedule.

Businesses that don’t comply may face a fine of up to $250, but city officials say they will work with businesses to make the changes.

Climate Change Will Leave Many Pacific Islands Uninhabitable by Mid-Century, Study Says

As sea level rises, waves will more frequently wash over many of the atolls—and the U.S. military facilities built there—damaging water supplies and infrastructure.

BY NICHOLAS KUSNETZ, APR 25, 2018

Storm-driven waves wash over man-made barriers on Roi-Namur island in 2014, showing the risk as sea level rises. Credit: U.S. Geological Survey

"Even if we don't worry about the groundwater issue, when seawater floods through cars and buildings and infrastructure every year, that's going to have an impact," U.S. Geological Survey scientist Curt Storlazzi said. Credit: USGS

Long before the Pacific Ocean subsumes thousands of low-lying islands, waves will begin washing over them frequently enough to ruin groundwater supplies and damage crops and fragile infrastructure. A new report says this likely will render many coral atolls uninhabitable within decades—before 2030 in a worst case scenario and by 2065 in a more optimistic one.

"When you're walking around the islands and you see kids there, you realize this isn't something that's next generation or the generation after that," said Curt Storlazzi, a scientist with the U.S. Geological Survey and the study's lead author. "If these scenarios play out, we're talking about those kids' generation."

Some of these reef-lined islands are home to hundreds of thousands of people and American military sites.

In the study, commissioned by the Defense Department and published Wednesday in the journal Science Advances, the scientists combined climate projections with weather and wave modeling to look at the impacts of rising seas on an island in Kwajalein Atoll. Kwajalein is part of the Marshall Islands and home to the Ronald Reagan Ballistic Missile Defense Test Site, which has been used to test U.S. defenses against a nuclear attack.

The Air Force spent nearly $1 billion in recent years building a facility there to track space debris. But a report by the Associated Press found in 2016 that the Defense Department and contractor Lockheed Martin paid little attention to projections for rising seas when planning and constructing the site, which is intended to have a 25-year life span.

John Conger, who was Assistant Secretary of Defense for Energy, Installations and Environment under President Obama and is now director of the Center for Climate and Security, said the project illustrates how much money the Pentagon could waste if it fails to adequately plan for rising seas and climate change.

"You've got to think this through," he said. "A lot of people have asked me in the past about how much the department is going to invest in dealing with climate change, and I think it's the wrong question to ask. I think climate change is an important factor to study to save money."

Lessons from a 2008 Storm Surge

As the planet warms, rising seas will pose a risk to hundreds of the United States' coastal military sites, with more than 200 already reporting effects from storm surges. The new study suggests that without global efforts to rapidly cut emissions or expensive mitigation projects, the facilities on low-lying islands may have even shorter lifespans than sea level rise projections would indicate.

Storlazzi began studying the issue after a winter storm hundreds of miles away sent water surging over the Marshall Islands in 2008.

"It had destroyed the groundwater so much, they had to fill up tankers and take freshwater out to a lot of these atoll islands," he said.

Storlazzi and his colleagues found that with about 16 inches of sea level rise, such waves would wash saltwater over Roi-Namur island on Kwajalein Atoll about once a year. When that starts happening annually, the island's aquifer will be unable to recover, the authors said.

"Even if we don't worry about the groundwater issue, when seawater floods through cars and buildings and infrastructure every year, that's going to have an impact," Storlazzi said. While the study was specific to Kwajalein's topography, the effects of sea level rise will likely be similar on thousands of other atolls and reef-lined islands across the tropics.

Temporary Adaptation — at a Cost

At the Defense Department's request, the authors considered three scenarios for rising seas: one in which emissions stabilize mid-century and two in which they continue to rise. In the most optimistic scenario, annual flooding would still likely begin between 2055 and 2065.

The island can be protected, at least for a time, with measures like building seawalls and shipping in water. But the costs will continue to rise, Conger said.


"The nearest-term impact is going to be on drinking water," he said. "How expensive is it going to be to get drinking water out there?"

Most of the atolls, of course, do not host military bases, and governments of these island nations may not have the money to pay for coastal protections, leaving residents exposed.

Military Studies Show Bases Face Climate Risks

The Pentagon has been studying the impact of climate change on its facilities and national security for more than a decade. In 2014, an Army Corps of Engineers study determined that about 1.5 feet of sea level rise could bring a "tipping point" for the Navy's largest base, in Norfolk, Virginia, where the risk of damage to infrastructure would increase dramatically. That level is expected to be reached in the next few decades.

In January, the Defense Department published the results of a survey of nearly 1,700 military installations that found about half reported experiencing at least one weather-related effect associated with climate change, such as wildfires, drought and flooding. Last year, the Government Accountability Office said the military was failing to adequately plan for climate change at overseas facilities.

Now, the military's work may be caught up in the Trump administration's hostility toward action on climate change. The Defense Department's response published with the GAO report, for example, said that blaming infrastructure damage from weather on climate change was "speculative at best and misleading." Two key strategy documents issued by the Trump administration this year—the National Defense Strategy and National Security Strategy—omitted references to climate change that had been included in earlier versions.

But where the administration may be pushing one way, Congress has begun to push in the other. Last year, lawmakers included language in an annual defense budget bill that declared climate change to be a national security threat and required the Pentagon to issue a report to Congress this year assessing vulnerabilities to climate change over the next 20 years, including a list of the 10 most at-risk facilities in each service.

Trump plan to tackle lead in drinking water criticized as 'empty exercise'

Sources within the EPA tell the Guardian that proposals are threadbare and muddled – ‘they’re just making it up as they go along’

Oliver Milman in New York, Thu 26 Apr 2018 06.00 EDT

Four years ago officials chose to switch Flint’s water to the Flint river, without lead corrosion controls, prompting the public health crisis.

Donald Trump has overseen an onslaught against environmental regulations while insisting, in the wake of the Flint lead crisis, that he would ensure “crystal-clean water” for Americans.

The federal government says it is currently drawing up a new plan to tackle lead contamination, which the Environmental Protection Agency says will be unveiled in June.


Public details of the plans are, however, scant. And some sources inside the EPA, speaking to the Guardian anonymously, said they were skeptical about whether what was being developed could meet such a challenge. “It’s a fig leaf,” one source claimed.

Scott Pruitt, the administrator of the EPA, which is spearheading the new strategy, has vowed to eliminate lead from drinking water and banish the specter of Flint. Wednesday this week was the fourth anniversary of the decision to switch the Michigan city’s water supply to the Flint river, without lead corrosion controls, prompting the public health crisis.

In February, Pruitt met fellow Trump cabinet members, including Alex Azar, the health and human services secretary, and Ben Carson, the housing and urban development secretary, along with officials from other agencies, to tout a new approach aimed at reducing lead exposure in children. “I really believe that we can eradicate lead from our drinking supply within a decade,” said Pruitt, who has touted a “back-to-basics approach” that has steered the EPA towards toxic clean-ups and away from challenges such as climate change.


The agency’s administrator warned the “mental-acuity levels of our children are being impacted adversely” by lead and called for a coordinated approach to ensure the disaster in Flint is not replicated.

One of the few concrete proposals put forward by Pruitt is to replace the millions of lead lines that funnel drinking water to Americans’ homes, a process that could cost about $45bn. A survey of the nation’s water utilities found that nearly half a trillion dollars of investment is required to restore crumbling drinking water systems and ensure lead and other pollutants don’t endanger the public.

Some agency staff, while pleased that the administration is raising the profile of lead poisoning, described the new plan as threadbare and muddled.

“Everyone was running around talking about a war on lead, but there was no conversation about how it will work, which is typical of this administration,” said one senior EPA official. “The lead problem is huge and multifaceted and they are just making it up as they go along,” the source said.

Asked this week about the plan’s progress, an EPA spokeswoman declined to discuss details but said: “The 17 federal agencies that comprise the president’s taskforce on environmental health risks and safety risks to children is developing a new federal strategy to address childhood lead exposure and expects the strategy to be finalized and made public in mid-June.”

Pruitt has, according to staff sources, used meetings to demand a “single standard” for lead, which has caused confusion as the EPA has an action level of 15 parts per billion in water samples, although the agency and other health bodies concur there is no safe level of lead exposure.

The Flint disaster came about after the city switched its drinking water supply to the Flint river on 25 April 2014 but then failed to add proper controls to prevent lead, a known neurotoxin, leaching from pipes and joints into the water.

Lead levels in children’s blood soared, and a further dozen people died from a Legionnaires’ disease outbreak linked to the contaminated water. A total of 15 people have been charged by the Michigan attorney general over the disaster.

Nationally, the risk of a similar outcome is vast – about 18 million Americans are served by 5,300 water systems that have violated the EPA’s lead and copper rule by, among other things, failing to properly test water for lead or neglecting to adequately treat systems to avoid lead contamination.

Many cities, as the Guardian revealed, have used techniques known to mask the true level of lead in drinking water. Lead can also be present in soil and paint, particularly in older buildings.

Lead line replacement – which Pruitt has backed – would provide an overdue upgrade to a huge tangle of ageing underground pipes, many of which were installed in the wake of the Civil War.

But experts have warned that tearing up lead lines, even if funding could be found, could exacerbate the problem by dislodging lead during replacement work. Utilities only have jurisdiction over pipes up to the curbside, with residents potentially either unwilling or unable to afford to pay for their share of pipe replacement.

“It can be incredibly expensive to replace lines but also complicated because so much pipe is privately owned,” said Maura Allaire, a water resources expert at the University of California.

“The other problem is we don’t even know where the lead service lines are. You could spend a ton of money and make the problem worse. There are smarter ways of doing things rather than hodge-podge lead-line replacements.”

Allaire said the EPA could demand improvements to lead corrosion treatments and tighten up an unusual patchwork testing regime that relies on individual utilities to get volunteer homeowners to test their own water. Another glaring problem is the EPA’s lack of data on what the vast range of water utilities, some serving just a few hundred people, are doing.

Other uncertainties abound. Trump’s requirement that two federal rules be repealed for every new regulation enacted could dampen any enthusiasm for introducing new standards. And Pruitt’s own future as EPA head remains unclear because of a series of controversies over his use of taxpayer money for first-class flights, office furniture and personal security.

“I think this is likely to be an empty exercise,” said a former EPA water official, who declined to be named. “Pruitt needs a fig leaf to suggest he’s doing something protective of the environment and has landed on eradicating lead. It wouldn’t surprise me if we got to the end of the administration and there’s no significant change to the rules.”

Tim Epp, a lawyer with no prior experience in dealing with lead in water, has been handed the role within the EPA of coordinating the new plan. That move, along with the administration’s previous attempt to slash funding for lead clean-ups, has led to some cynicism about the new strategy within and outside the EPA. But staff insist that any effort to address the issue is worthwhile.

“It could’ve been handled better but it’s good it’s out there, it can build momentum,” said an EPA source. “Everyone in the building wants to work on lead, we all want progress on this. No one gives a shit about dissenting with the administration over this. If it’s lead, we’re all in.”

Bin liners to takeaway containers – ideas to solve your plastic conundrums

Plastic has become an environmental disaster. Microplastic pollution has been found in our waterways, fish stocks, salt, tap water and even the air we breathe. Reducing our reliance on plastic by refusing it wherever possible has never been more important, especially as Australia’s recycling system is in crisis.

Yet there are conundrums that continue to defeat even those dedicated to going plastic-free. From bin liners to takeaway containers, Guardian Australia has tried to solve them. And we want to hear from you: share your plastic conundrums and the solutions. We’ll round up the best ideas for a follow-up article.

What should I do about plastic liners for the kitchen bin?

Many people justify their continued use of plastic bags by arguing each one is reused in the kitchen bin. But that’s not actually recycling – it’s barely even reusing, as each bag is still destined for landfill. Others buy bin liners, spending good money on a product made to be ditched after just one use.

Skip the unnecessary plastic bag altogether and simply rinse your bin after emptying it into the council bin. To help cut down on smelly garbage “juice”, separate out food scraps for composting (community compost groups help if you don’t have space) or throw it in the green waste bin. If a naked bin still revolts you, try lining it with newspaper instead.

How can I avoid plastic packaging in supermarkets?

The ultimate solution is filling up your own reusable containers at bulk food stores (check out Sustainable Table’s bulk food directory), but strategic shopping at major supermarkets also helps.

Skip plastic bags altogether – including the smaller produce bags – by taking your own reusable bags. Prioritise nude food: Lebanese cucumbers over plastic-wrapped continental versions, for example. Buy staples such as flour and sugar in paper bags, rather than plastic. Buy cheese and meat at deli counters, using your own reusable container. And if staff refuse BYO containers, or too much food is unnecessarily packaged, write to management demanding change.

Can I take my plastic somewhere to be recycled and transformed directly, rather than rely on nontransparent recycling systems?

Unfortunately not, in most cases. But do go the extra mile to save soft plastics – they can be recycled into sturdy outdoor furniture, bollards and signage. You’ll need to collect them separately, as soft plastics such as shopping bags, courier satchels, bubble wrap and chip packets can’t be processed in council recycling systems. Instead, drop soft plastics – essentially anything that can be scrunched into a ball – at REDcycle bins. They are then sent on to Australian manufacturer Replas to be made into long-lasting recycled products.

What to do about getting takeaway food without plastic containers?

Discarded food and beverage containers made up almost half of the 15,552 ute loads of rubbish collected during last year’s Clean Up Australia Day. Opt out by taking your own reusable containers to the local takeaway. Trashless Takeaway helps consumers find Aussie eateries happy to fill reusable containers, while Fair Food Forager and Responsible Cafes highlight low-waste locations. Boycott places that refuse to embrace sustainable practices – as zero-waste lifestyle advocate Tammy Logan explains, refusing BYO containers is most often a business decision, not a legal requirement.

What’s the best alternative to a plastic water bottle?

Refuse bottled water point blank – plastic bottles are derived from crude oil and take thousands of years to break down in landfill. But select reusable water bottles with care, as many have plastic components, such as lids. Instead, go for brands such as Pura Stainless; it makes stainless steel bottles with medical-grade silicone lids. While you’re at it, grab a reusable coffee cup and stainless steel straw, too.

How do I clean and disinfect the house without using disposable wipes and cleaners sold in plastic bottles?

Most cleaning products are packaged in plastic – and packed with toxic chemicals – but almost the entire home can be cleaned with just three ingredients. Use Castile soap on floors and sinks, baking soda for scrubbing jobs, and vinegar for mould. (Bea Johnson from Zero Waste Home has an extensive natural cleaners recipe list.) Microfibre cloths can also clean windows and more with water only, no products needed.

Avoid all types of cleaning wipes, as they’re designed to be thrown away. As Adelaide zero waster Niki Wallace says: “We don’t need [wipes] for protection from germs and we don’t need them for their cleaning capability or even for their convenience.” Instead use cloths or old clothes cut to size.

Is it possible to get toothpaste, shampoo and conditioner without packaging?

Buying in bulk or making your own are the only real ways to access packaging-free toiletries. Toothpaste is easy to make from coconut oil, bicarb soda and salt, though some dentists baulk at using abrasive materials on tooth enamel. The easiest plastic-free shampoo and conditioner options sidestep bottles completely – choose soap bars instead, preferably unwrapped versions.

Should I be worried about mini toiletries offered in hotels?

Go easy on free toiletries when travelling, as millions of half-used bottles are ditched each day. Take your own or support hotels with more sustainable practices, such as refillable wall dispensers. Also keep an eye out for hotels involved with Melbourne non-profit Soap Aid, which recycles hotel soaps for distribution in disadvantaged communities.

How can I clean my cat litter without plastic bags?

The key is using natural litter, such as Oz-Pets’ wood pellet litter, made from waste plantation sawdust that would otherwise be dumped. The wood’s natural eucalyptus oils help kill bacteria and keep smells under control. After use, the lot can go straight into the compost (ensure it’s not subsequently used for food-producing crops, though), on to the garden as mulch or into council green bins. Recycled paper pellets are also an option. Scoop the poop and soiled litter into a small bucket rather than a plastic bag, before turfing into compost or green bins. Disinfect litter trays with boiling water and vinegar before refilling.

How can I ensure my clothes don’t add to plastic pollution?

Microfibres shed by synthetic clothing during washing are a growing concern. Synthetic fabrics such as polyester and nylon shed thousands of these tiny plastic particles with each wash, which end up polluting our rivers and oceans. Guppy Friend filter bags, developed in Berlin, are an emerging solution. Washing less, using front-loading machines, and opting for natural fabrics and fibres wherever possible also helps.

Going plastics-free is as easy as calico bags and reusable coffee cups

Australians throw away a lot of plastic, often after only one use. Here’s how to give it up

Koren Helbig

Fri 23 Mar 2018 17.51 EDT Last modified on Wed 28 Mar 2018 00.33 EDT

Say no to single-use plastics such as water bottles, straws and takeaway containers.

It’s almost everywhere you look – and it’s undeniably destroying our planet.

Over the past half a century, plastic has infiltrated modern life to such an extent that our oceans may have more of the stuff than fish by 2050.

Once hailed as an innovation, plastic is now clearly very bad news. Non-renewable fossil fuels are needed for its production, belching out greenhouse gases in the process. Once tossed – often after mere minutes of use – plastic then takes hundreds of years to break down, emitting toxic methane gas as it gradually breaks into smaller and smaller pieces. The end point is too often our waterways, and an estimated one million seabirds and 100,000 mammals are killed every year by marine rubbish, much of it plastic.

Recycling helps, but a far more powerful solution is reducing and even eliminating single-use plastics altogether. That’s actually easier than one might think – read on for a life with less plastic.

Say no to single-use plastics

Refuse to use plastic bags, for a start. Australians go through 13 million new bags each day and around 50m of them end up in landfill each year – enough to cover Melbourne’s CBD, according to Clean Up Australia.

Be cautious with so-called “green bags” though; those commonly sold at Coles and Woolworths are made of polypropylene, a type of plastic. Instead, opt for calico, canvas, jute or hessian. Or join the Boomerang Bags movement, from Queensland’s Burleigh Heads, which encourages volunteers to make fabric bags from recycled materials.

Skip other single-use plastics too, such as coffee cups, takeaway containers, water bottles and straws – even the Queen last month banned the latter two across her royal estates.

A little pre-purchase prep makes saying no easier. Online stores including Brisbane’s Biome and Sydney’s Onya stock reusable alternatives, such as stainless steel water bottles, reusable coffee cups, bamboo toothbrushes, mesh produce bags and material food covers. Consider buying in bulk – The Source Bulk Foods has stores around the country and is Australia’s biggest bulk food retailer.

“A reusable coffee cup only needs to be used 15 times to break even on its life cycle, including cleaning. Every use after that is a bonus for the planet,” says Biome founder Tracey Bailey. “The cumulative use of reusable products makes it easy for individuals to have a long-term environmental impact. Within 12 months, the Biome community saved the waste of over 6m single-use plastic items.”

If plastics can’t be avoided, look to recycle as much as possible. Soft, scrunchable plastics, for example, are too often turfed, but can actually be collected at home then dropped at REDcycle bins in major supermarkets.

Avoid microplastics and reduce microfibre shedding

Some of the most environmentally damaging plastics can barely be seen by the human eye, yet are used daily. Microbeads are tiny plastic pellets used in cosmetics and household products, such as exfoliating face scrubs, whitening toothpastes and deep-cleaning washing powders. Flushed down the drain, microbeads are too small to be captured by wastewater treatment plant filters and end up in our waterways. According to the Australian Marine Conservation Society, marine wildlife that mistake microbeads for fish eggs often end up starving to death.

The Australian government has ordered a voluntary phase-out of microbeads by mid-2018, but many believe an outright ban is needed. Consumers can sign a Surfrider Foundation petition calling for a ban, and consult the Australian Good Scrub Guide or Beat the Microbead app for help to choose microbead-free products.

Microfibres shed by synthetic clothing during washing are a growing concern. Synthetic fabrics such as polyester and nylon shed thousands of these tiny plastic particles with each wash, which end up polluting our rivers and oceans. Guppy Friend filter bags, developed in Berlin, are an emerging solution. Washing less, using front-loading machines and opting for natural fabrics and fibres wherever possible also helps.

A whopping 269,000 tonnes of rubbish is now estimated to be floating in our oceans – weighing more than 1,300 blue whales combined – and about 80% of it comes from land, according to the Australian Marine Conservation Society. The Pacific Ocean is already besieged by more plastic than plankton.

To turn that around, humans can’t just reduce our current consumption; we must also clean up the mess already created. Sydney-based non-profit Take 3 has inspired thousands to pick up three pieces of rubbish each when leaving the beach or waterways. Clean Up Australia Day, held annually each March, last year attracted 590,350 volunteers, who collected an estimated 15,500 ute loads of rubbish.

Perth surfers Andrew Turton and Pete Ceglinski have created the floating Seabin rubbish bin for marinas and ports, which moves with the tide, collecting about 1.5kg of floating rubbish each day. Dutch inventor Boyan Slat went further, inventing The Ocean Cleanup passive drifting systems – due to launch this year – which he believes will clean up half the Great Pacific Garbage Patch within just five years.

Despite how thoroughly plastic permeates modern life, it is possible to largely avoid it – with some effort. In Melbourne, Erin Rhoads has lived a “zero waste” existence since June 2014; her entire trash output since fits inside one old coffee jar. (Other notables on similar paths include Gippsland’s Tammy Logan, Adelaide’s Niki Wallace and Perth-based Lindsay Miles.)

“We have the power to dictate how things are packaged and presented to us by participating in everyday activism through our purchases,” Rhoads says. “The more businesses see and hear us saying no to plastics, the more likely we will see changes that will lay the foundation to a cleaner, safer and healthier planet.”

While going fully plastic-free remains a pipe dream for most, Rhoads advocates small shifts that build into big lifestyle changes. On her blog, The Rogue Ginger, Rhoads outlines five easy first steps, and encourages consumers to embrace Plastic-Free Tuesday or Plastic-Free July (which began in Perth).

“Going plastic-free helps us take responsibility for our actions, while reminding us that most of the plastic we use today is not necessary at all,” Rhoads says. “We should focus on preserving our earth’s resources for something other than a single-use plastic fork.”

Additional research and reporting by Nicole Lutze

The great Australian garbage map: 75% of beach rubbish made of plastic

Data compiled from rubbish collected by volunteers aims to encourage industry to control plastic pollution at the source

https://www.theguardian.com/environment/ng-interactive/2018/apr/18/the-great-australian-garbage-map-75-of-beach-rubbish-made-of-plastic

Maine PUC should not sink ocean wind project

BY THE EDITORIAL BOARD, April 29, 2018

Saving customers a few cents would not be worth further damaging the state's credibility as a business partner.


VolturnUS generates power off Castine in 2013. The prototype is a scale model of the floating turbines to be used in a wind project planned for deep water off Monhegan Island – if the state doesn't renege on an agreement to buy clean, home-grown energy.

The Maine Public Utilities Commission usually sets prices for electricity, natural gas, telecommunications and water systems. It’s not every day that it also regulates the state’s credibility.

But that’s what will be happening when the PUC reconsiders a power purchase agreement it signed in 2013 with Maine Aqua Ventus, a public-private partnership involving the University of Maine that is trying to develop the nation’s first deep-water wind power generator.

The three PUC commissioners, all appointed by Gov. LePage since the power purchase agreement was signed, question whether Maine ratepayers should have to pay higher-than-market prices for electricity that’s produced by the demonstration project, and they’ve hinted that they could pull Maine out of the contract.

They say they would be looking out for Maine consumers, who could save as much as 73 cents a month on their electric bills. But those savings would come at a tremendous cost.

Ripping up the contract would probably kill the project. In addition to losing the money from the energy sales, it would almost certainly result in the loss of $87 million in grant money from the U.S. Department of Energy aimed at getting this technology ready to market commercially.

The end of this project would be bad news not only for the Maine Aqua Ventus partners. It would also hurt the Maine companies that would have helped build and supply a new manufacturing facility, and it would be bad news for the people who would have been hired to fill newly created jobs.

But the biggest loser would be Maine itself. A decade-long effort to take a leadership role in a home-grown energy industry – one that could sell clean power to states that don’t have the capacity to make their own – would go down the drain. To save consumers that potential 73 cents a month, the state would throw away millions already invested, including an $11 million bond (which passed in 2010 with nearly 60 percent of the vote).

The ultimate price would come at the expense of the state’s credibility as a business partner, giving investors a very good reason to look elsewhere before deciding to put their money to work here. They would have to doubt whether Maine could be trusted to keep its word through inevitable shifts in the political climate. That’s not a good position to be in for a state that cites a lack of capital as an obstacle to economic growth.

The agreement looks fairly straightforward: Maine Aqua Ventus would build a 12-megawatt wind project off Monhegan Island, and the PUC agreed to terms that commit the state’s ratepayers to buy the electricity it produces at 23 cents per kilowatt-hour in the first year, or as much as $187 million over 20 years.

That price is roughly three times the average price of power now, which, to some eyes, makes the agreement look like a bad deal for Maine.
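As a rough, illustrative check of those figures (a back-of-envelope sketch only, assuming a flat 23 cents per kilowatt-hour across all 20 years rather than the contract's actual escalating price), the quoted $187 million works out to roughly 40 gigawatt-hours of electricity a year, or a capacity factor of just under 40 percent for a 12-megawatt project:

    # Rough check of the contract figures quoted above.
    # Assumption: a flat $0.23/kWh for all 20 years; the real contract price
    # escalates, so this is illustrative only.
    price_per_kwh = 0.23            # dollars, first-year contract price
    contract_total = 187_000_000    # dollars, "as much as" over 20 years
    years = 20
    capacity_mw = 12

    implied_kwh = contract_total / price_per_kwh       # total energy the payments imply
    annual_gwh = implied_kwh / years / 1e6              # gigawatt-hours per year
    max_annual_gwh = capacity_mw * 1000 * 8760 / 1e6    # GWh/yr if the turbines ran flat out
    capacity_factor = annual_gwh / max_annual_gwh

    print(f"Implied output: about {annual_gwh:.0f} GWh per year")
    print(f"Implied capacity factor: about {capacity_factor:.0%}")
    # Prints roughly 41 GWh/yr and about 39% -- a plausible capacity factor for
    # offshore wind, so the quoted figures hang together.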

But this is not just a power purchase agreement – it’s also a research and development project.

The over-market energy sales enable Maine Aqua Ventus to develop a new technology – floating platforms for wind-driven turbines – and prove that it can work. The Legislature envisioned this financing system a decade ago when it created the Ocean Energy Task Force in 2009 and adopted its recommendations in statute a year later. The law set a limit on how much the research project could add to electric bills, and this term sheet is below the limit.

This is not the only source of revenue for this project. It has been heavily subsidized by the federal Energy Department, as well as Maine taxpayers, who have approved two bonds.

And while the kilowatt-hour price is high, the amount of energy that ratepayers would have to buy from this experimental project is very small. It’s likely that consumers would not see any noticeable change in their electric bills because of this project.

What they are more likely to see is the economic benefit that would come from Maine becoming a supply hub for an emerging industry. Private companies would buy made-in-Maine equipment or license patented technology that is being developed here. The university would cement its status as a leader in the field, making it eligible for more federal research funds that percolate through the state’s economy.

But what do we get if the PUC pulls its support?

Five years ago, Maine gave itself a black eye by reneging on a power purchase agreement with Statoil, the Norwegian energy business, which was working on a different ocean wind idea.

The company took its $120 million and invested it in Scotland instead of Maine. If the PUC repeats that sorry episode, the damage to Maine’s reputation will be complete.

Anything as complicated as the transition from fossil fuels to renewable energy will take time. It can’t be accomplished by one Legislature or one governor. Leaders come and go, but the commitments they make to a shared vision should not shift with the political winds.

Maine has put too much into ocean wind power to abandon it. The PUC should not sink this proposal, or the state’s credibility.

Fracking chemicals “imbalance” the immune system

Brian Bienkowski , May 01, 2018

Mice exposed to fracking chemicals during pregnancy were less able to fend off diseases; scientists say this could have major implications for people near oil and gas sites

Che