Blog » Energy & the Environment

 
     
 

18th April 2017

Device pulls water from dry air, powered only by Sun

Researchers at MIT and the University of California, Berkeley, have created a device that pulls water from dry air, powered only by the Sun. Even under conditions of relatively low (20-30%) humidity, it can produce 2.8 litres of water over a 12-hour period.

 

Credit: University of California, Berkeley

 

Imagine a future in which every home has an appliance that pulls all the water the household needs out of the air, even in dry or desert climates, using only the power of the Sun. That future may be just around the corner, with the demonstration of a water harvester that uses only ambient sunlight to pull litres of water out of the air each day in conditions as low as 20 percent humidity, a level common in arid areas.

The solar-powered harvester, reported in the journal Science, was constructed at the Massachusetts Institute of Technology using a special material called a metal-organic framework – or MOF – produced at the University of California, Berkeley.

"This is a major breakthrough in the long-standing challenge of harvesting water from the air at low humidity," said Omar Yaghi from UC Berkeley, one of two senior authors of the paper. "There is no other way to do that right now, except by using extra energy. Your electric dehumidifier at home 'produces' very expensive water."

The prototype, under conditions of 20-30 percent humidity, was able to pull 2.8 litres (3 quarts) of water from the air over a 12-hour period, using one kilogram (2.2 pounds) of MOF. Rooftop tests at MIT confirmed that the device works in real-world conditions.
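As a rough sanity check on those figures, the sketch below works out the prototype's yield per kilogram of MOF and scales it to a hypothetical household demand of 50 litres per day. The demand figure is our assumption for illustration, not a number from the study.

```python
# Back-of-the-envelope check of the harvester figures quoted above.
mof_mass_kg = 1.0     # MOF used in the prototype
water_litres = 2.8    # water collected in one cycle
cycle_hours = 12      # length of one day/night harvesting cycle

litres_per_kg = water_litres / mof_mass_kg
print(f"Yield: {litres_per_kg:.1f} L per kg of MOF per {cycle_hours}-hour cycle")

# Hypothetical household demand (an assumption, not from the article):
household_litres_per_day = 50
print(f"MOF needed for {household_litres_per_day} L/day: "
      f"~{household_litres_per_day / litres_per_kg:.0f} kg")
```

On these numbers, a "personalised water" appliance of the kind Yaghi describes would need on the order of 20 kg of MOF, before any of the material improvements discussed below.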

 

Schematic of a metal-organic framework (MOF). Credit: UC Berkeley, Berkeley Lab image.

 

"One vision for the future is to have water off-grid, where you have a device at home running on ambient solar for delivering water that satisfies the needs of a household," said Yaghi, who is the founding director of the Berkeley Global Science Institute, a co-director of the Kavli Energy NanoSciences Institute and the California Research Alliance by BASF. "To me, that will be made possible because of this experiment. I call it personalised water."

Yaghi worked with Evelyn Wang, a mechanical engineer at MIT, alongside students at the university. The system they designed consists of approximately two pounds of dust-sized MOF crystals compressed between a solar absorber and a condenser plate, inside a chamber open to the air. As ambient air diffuses through the porous MOF, water molecules preferentially attach to the interior surfaces. X-ray diffraction studies have shown that the water vapour molecules often gather in groups of eight to form cubes.

Sunlight entering through a window heats up the MOF and drives the bound water toward the condenser, which is at the temperature of the outside air. The vapour condenses as liquid water and drips into a collector.

"This work offers a new way to harvest water from air that does not require high relative humidity conditions and is much more energy efficient than other existing technologies," said Wang.

This proof-of-concept harvester leaves much room for improvement, Yaghi said. The current MOF can absorb only 20 percent of its weight in water, but other MOF materials could absorb 40 percent or more. The material could also be tweaked to work more effectively at higher or lower humidities.

"It's not just that we made a passive device that sits there collecting water; we have now laid both the experimental and theoretical foundations so that we can screen other MOFs, thousands of which could be made, to find even better materials," he said. "There is a lot of potential for scaling up the amount of water that is being harvested. It is just a matter of further engineering now."

Yaghi and his team are working to improve their MOFs, while Wang continues to improve the harvesting system to produce more water.

"To have water running all the time, you could design a system that absorbs the humidity during the night and evolves it during the day," he said. "Or design the solar collector to allow for this at a much faster rate, where more air is pushed in. We wanted to demonstrate that if you are cut off somewhere in the desert, you could survive because of this device. A person needs about a Coke can of water per day. That is something one could collect in less than an hour with this system."

 

 

 

---


 

 

11th April 2017

Two-thirds of Great Barrier Reef hit by back-to-back mass coral bleaching

Australia's Great Barrier Reef is reported to be experiencing a second consecutive mass coral bleaching event, affecting two-thirds of its area.

 

Credit: Bette Willis/ARC Centre of Excellence for Coral Reef Studies

 

For the second time in just 12 months, scientists have recorded severe coral bleaching across huge tracts of the Great Barrier Reef, after completing aerial surveys along its entire length. In 2016, bleaching was most severe in the northern third of the Reef, while one year on, the middle third has experienced the most intense coral bleaching.

"The combined impact of this back-to-back bleaching stretches for 1,500 km (900 miles), leaving only the southern third unscathed," says Prof. Terry Hughes, Director of the ARC Centre of Excellence for Coral Reef Studies, who undertook the aerial surveys in both 2016 and 2017.

"The bleaching is caused by record-breaking temperatures driven by global warming. This year, 2017, we are seeing mass bleaching, even without the assistance of El Niño conditions."

 

Credit: Ed Hawkins

 

The aerial surveys in 2017 covered more than 8,000 km (5,000 miles) and scored nearly 800 individual coral reefs, closely matching the aerial surveys in 2016, which were carried out by the same two observers.

Dr. James Kerry, who also undertook the aerial surveys, explains further: "This is the fourth time the Great Barrier Reef has bleached severely – in 1998, 2002, 2016, and now in 2017. Bleached corals are not necessarily dead corals, but in the severe central region we anticipate high levels of coral loss."

"It takes at least a decade for a full recovery of even the fastest growing corals, so mass bleaching events 12 months apart offers zero prospect of recovery for reefs that were damaged in 2016."

 


 

Coupled with the 2017 mass bleaching event, Tropical Cyclone Debbie struck a corridor of the Great Barrier Reef at the end of March. The intense, slow-moving system is likely to have caused varying levels of damage along a path up to 100 km wide. Any cooling effects related to the cyclone are likely to be negligible compared with the damage it caused; the cyclone struck a section of the reef that had largely escaped the worst of the bleaching.

"Clearly the reef is struggling with multiple impacts," says Prof. Hughes. "Without a doubt the most pressing of these is global warming. As temperatures continue to rise, the corals will experience more and more of these events: 1°C of warming so far has already caused four events in the past 19 years."

"Ultimately, we need to cut carbon emissions, and the window to do so is rapidly closing."

---


 

 

5th April 2017

Atmosphere could resemble Triassic by 22nd century

If greenhouse gas emissions continue to rise, or if subsequent efforts to reverse climate change end in failure, the world's atmosphere could resemble that of the Triassic period by the 22nd century, according to a new study.

 

Credit: Massachusetts Institute of Technology (MIT)

 

New research led by the University of Southampton suggests that, over the next 100 to 200 years, carbon dioxide concentrations in the Earth's atmosphere will head towards values not seen since the Triassic period, 200 million years ago. Furthermore, by the 23rd century, the climate could reach a warmth not seen in 420 million years.

The study, published in Nature Communications, compiled over 1,200 estimates of ancient atmospheric carbon dioxide (CO2) concentrations to produce a continuous record dating back nearly half a billion years. It concludes that if humanity burns all available fossil fuels in the future, the levels of CO2 contained in the atmosphere may have no geologically-preserved equivalent during this 420 million year period.

The researchers examined published data on fossilised plants, the isotopic composition of carbon in soils and the oceans, and the boron isotopic composition of fossil shells. Gavin Foster, lead author and Professor of Isotope Geochemistry at the University of Southampton, explains: "We cannot directly measure CO2 concentrations from millions of years ago. Instead we rely on indirect 'proxies' in the rock record. In this study, we compiled all the available published data from several different types of proxy to produce a continuous record of ancient CO2 levels."

This wealth of data shows that CO2 concentrations have naturally fluctuated on multi-million year timescales over this period, from around 200-400 parts per million (ppm) during cold 'icehouse' periods to as much as 3,000 ppm during intervening warm 'greenhouse' periods. Although evidence tells us our climate has fluctuated greatly in the past (with the Earth currently in a colder period), it also shows the current speed of climate change is highly unusual.

 



Proxy-based atmospheric CO2 and climate forcing on a log timescale, with future projections to 2500 AD. Credit: Foster et al. / Nature Communications.

 

Carbon dioxide is a potent greenhouse gas and, during the last 150 years, humanity's burning of fossil fuels has increased its atmospheric concentration from 280 ppm in the pre-industrial era to nearly 405 ppm today. However, it is not just CO2 that determines the climate of our planet; ultimately it is both the strength of the greenhouse effect and the amount of incoming sunlight that matter. Changes in either parameter can force the climate to change.

"Due to nuclear reactions in stars, like our Sun, over time they become brighter," adds co-author Dan Lunt, Professor of Climate Science at the University of Bristol. "This means that, although carbon dioxide concentrations were high hundreds of millions of years ago, the net warming effect of CO2 and sunlight was less. Our new CO2 compilation appears on average to have gradually declined over time by about 3-4 ppm per million years. This may not sound like much, but it is actually just about enough to cancel out the warming effect caused by the Sun brightening through time, so in the long-term it appears the net effect of both was pretty much constant on average."

This interplay between carbon dioxide and the Sun's brightness has fascinating implications for the history of life on Earth. Co-author Professor Dana Royer, from Wesleyan University in the US, explains: "Up until now it's been a bit of a puzzle as to why, despite the Sun's output having increased slowly over time, scant evidence exists for any similar long-term warming of the climate. Our finding of little change in the net climate forcing offers an explanation for why Earth's climate has remained relatively stable, and within the bounds suitable for life for all this time."

This long-term view also offers a valuable perspective on future climate change. It is well recognised that the climate today is changing at rates well above the geological norm. If humanity fails to tackle rising CO2 and burns all the readily available fossil fuel, then within the next two centuries atmospheric CO2 will reach 2,000 ppm – levels not seen for 200 million years.

Professor Foster adds: "However, because the Sun was dimmer back then, the net climate forcing 200 million years ago was lower than we would experience in such a high CO2 future. So, not only will the resultant climate change be faster than anything the Earth has seen for millions of years, the climate that will exist is likely to have no natural counterpart, as far as we can tell, in at least the last 420 million years."

---


 

 

4th April 2017

Graphene sieve turns seawater into drinking water

Researchers at the University of Manchester have demonstrated a graphene-based sieve able to filter common salts out of seawater. This could lead to affordable desalination technologies.

 

Credit: University of Manchester

 

In recent years, graphene-oxide membranes have attracted major attention as promising candidates for new filtration technologies. Now, the much sought-after breakthrough of making membranes capable of sieving common salts has been achieved. New research demonstrates the real-world potential of providing clean drinking water for millions of people who currently struggle to obtain adequate water resources. The findings, by scientists from the University of Manchester, were published yesterday in the journal Nature Nanotechnology.

Graphene-oxide membranes developed at the National Graphene Institute have already demonstrated the potential of filtering out small nanoparticles, organic molecules, and even large salts. Until now, however, they couldn't filter common salts, which require even smaller sieves. Previous research at the University of Manchester found that if immersed in water, graphene-oxide membranes become slightly swollen and smaller salts flow through the membrane along with water, but larger ions or molecules are blocked.

The team has now further developed these graphene membranes and found a way to prevent the swelling of the membrane when exposed to water. Pore size in the membrane can be precisely controlled, to filter common salts out of salty water and make it safe to drink.

 

Man with a bucket of seawater on the coast of Morocco, Africa. Credit: Salvador Aznar

 

As climate change continues to strain water supplies, wealthy countries are also investing in desalination technologies. Following the recent disasters in California, major cities are looking increasingly to alternative water solutions.

When common salts are dissolved in water, they form a 'shell' of water molecules around the salt ions. This allows the tiny capillaries of the graphene-oxide membranes to block salt from flowing through along with the water. Water molecules, by contrast, pass through the membrane barrier and flow anomalously fast, which makes these membranes ideal for desalination.
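The size argument can be made concrete with a toy comparison of approximate hydrated diameters against a sub-nanometre channel. The diameters below are rounded literature values and the 0.65 nm channel width is our assumption, chosen to be consistent with the "less than one nanometre" pores described below; neither is a figure from the paper.

```python
CHANNEL_NM = 0.65   # assumed capillary width, consistent with "sub-nanometre"

# Approximate hydrated diameters in nanometres (illustrative values):
hydrated_diameter_nm = {
    "H2O (bare)": 0.28,
    "K+": 0.66,
    "Cl-": 0.66,
    "Na+": 0.72,
    "Mg2+": 0.86,
}

for species, diameter in hydrated_diameter_nm.items():
    verdict = "passes" if diameter < CHANNEL_NM else "blocked"
    print(f"{species:>10}: {diameter:.2f} nm -> {verdict}")
```

Bare water slips through while every hydrated ion is turned away, which is the sieving behaviour the Manchester team reports.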

"To make it permeable, you need to drill small holes in the membrane. But if the hole size is larger than one nanometre, the salts go through that hole," said Rahul Nair, Professor of Materials Physics. "You have to make a membrane with a very uniform, less-than-one-nanometre hole size to make it useful for desalination. It is a really challenging job. When the capillary size is around one nanometre, which is very close to the size of the water molecule, those molecules form a nice interconnected arrangement, like a train."

"Realisation of scalable membranes with uniform pore size down to the atomic scale is a significant step forward and will open new possibilities for improving the efficiency of desalination technology," he continued. "This is the first clear-cut experiment in this regime. We also demonstrate that there are realistic possibilities to scale up the described approach and mass produce graphene-based membranes with required sieve sizes."

"The developed membranes are not only useful for desalination, but the atomic scale tunability of the pore size also opens new opportunity to fabricate membranes with on-demand filtration, capable of filtering out ions according to their sizes." said Jijo Abraham, co-author on the research paper.

By 2025, the UN expects that 14% of the world's population will encounter water scarcity. This new technology has the potential to revolutionise water filtration across the world, particularly in nations which cannot afford large-scale desalination technology. It is hoped that graphene membrane systems can be utilised on smaller scales – making them accessible to regions that do not have the financial infrastructure to fund large plants.

 

 


---


 

 

18th March 2017

Artificial "power island" to be built in the North Sea

A trio of European energy firms is collaborating to build a gigantic wind power hub in the middle of the North Sea. By providing clean energy generation and transmission for six neighbouring countries, the project would be a major step towards meeting Europe's 2050 climate goals.

 

 

 

Denmark's Energinet and the German and Dutch arms of TenneT will sign an agreement on 23rd March to explore ways to build a giant artificial island in the middle of the North Sea. This would create a new "hub" for the generation and transmission of renewable energy across northern Europe that could provide up to 100,000 megawatts (MW) to Belgium, Denmark, Germany, the Netherlands, Norway and the UK.

Known as the North Sea Wind Power Hub, this project would be located on Dogger Bank, a large sandbank in a shallow area of sea about 100 km (62 mi) off the east coast of England. During the last ice age, around 10,000 BC, this bank was part of Doggerland, a large landmass connecting Europe and the British Isles. Today, the water remains relatively shallow, which – combined with optimal wind conditions and a central location – makes it an ideal site for land reclamation, according to TenneT.

The artificial island could be surrounded by up to 7,000 wind turbines, providing green energy for 80 million Europeans – not only generating and transmitting energy from the North Sea, but simultaneously forming a power link between six countries, enabling them to trade electricity. With an area of 6 sq km, the island would have its own landing strip and harbour. Staff, components and assembly workshops would be stationed there. The exact schedule for construction is currently unknown and will depend on feasibility studies, but Energinet and TenneT believe the artificial island could be built on Dogger Bank sometime between 2030 and 2050.
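Taking the article's headline numbers at face value gives a feel for the scale. The 50% capacity factor below is our assumption for a good offshore site, not a project figure.

```python
capacity_gw = 100.0    # up to 100,000 MW
people = 80_000_000    # Europeans served
turbines = 7_000       # turbines surrounding the island

print(f"Capacity per person:  {capacity_gw * 1e9 / people / 1e3:.2f} kW")
print(f"Implied turbine size: {capacity_gw * 1e3 / turbines:.1f} MW each")

cf = 0.5   # assumed offshore capacity factor
annual_twh = capacity_gw * cf * 8760 / 1e3
print(f"Annual output: ~{annual_twh:.0f} TWh "
      f"(~{annual_twh * 1e9 / people:,.0f} kWh per person per year)")
```

The implied ~14 MW turbines would be larger than anything installed today, consistent with the project's 2030-2050 horizon.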

 

Credit: TenneT

 

Mel Kroon, the CEO of TenneT, commented on the multi-billion euro plan: "This project can significantly contribute to a completely renewable supply of electricity in Northwest Europe. TenneT and Energinet both have extensive experience in the fields of onshore grids, the connection of offshore wind energy and cross-border connections. Transmission System Operators (TSOs) are best placed to play a leading role in the long-term development of the offshore infrastructure. I am happy that we are going to take this step with our Danish colleagues and I look forward to the participation of other TSOs and possibly other partners."

Peder Østermark Andreasen, the CEO of Energinet, said: "Offshore wind has in recent years proved to be increasingly competitive and it is important to us to constantly focus on further reduction in prices of grid connections and interconnections. We need innovative and large-scale projects so that offshore wind can play an even bigger part in our future energy supply."

The island "could make the wind power of the future a lot cheaper and more effective," said Torben Nielsen, technical director of Energinet. In an interview with the Copenhagen Post, he added: "We haven't let our fantasy gain the upper hand, although it may sound a little crazy and like something out of science fiction. We who have the responsibility of transporting the electricity generated by offshore wind turbines back to land and the consumers must constantly push and make sure that the price continues to fall. That requires innovative, big-scale solutions, and an energy hub in the North Sea is worth thoroughly looking into."

The European targets for reducing CO2 emissions cannot be accomplished by individual member states on their own. Cooperation and synergy will be required on a broader scale – and projects like the North Sea Wind Power Hub could achieve that. Wind power and solar energy complement each other: there is more Sun from spring to autumn, and stronger winds during the colder and darker months of the year. By taking advantage of these variations, Europe and indeed many other parts of the world could fully optimise their energy grids at all times of the year. Longer term, continent-wide supergrids may emerge, allowing the generation and transmission of clean energy on massive scales and over very long distances.

---


 

 

16th March 2017

New York 2140

The latest novel by Kim Stanley Robinson depicts a scarily plausible future, in which New York has been inundated by rising sea levels.

 


 

Kim Stanley Robinson is an award-winning science fiction author, best known for his Mars trilogy (Red Mars, Green Mars and Blue Mars), which follows the terraforming efforts on Mars over a 200-year period. He also wrote 2312, depicting a number of futuristic concepts including asteroid terrariums and rewilding of extinct species on Earth. His many other books explore a wide range of scientific, environmental, cultural and political themes. He is noted for his use of "hard" science fiction to convey a sense of realism and plausibility.

Now, Robinson is back with his latest novel: New York 2140. With yet another futuristic storyline, it tells the tale of a 22nd century Manhattan that is struggling to survive amid rising sea levels. A synopsis of the book reads as follows:

"The waters rose, submerging New York City.
But the residents adapted and it remained the bustling, vibrant metropolis it had always been. Though changed forever.
Every street became a canal. Every skyscraper an island.
Through the eyes of the varied inhabitants of one building, Kim Stanley Robinson shows us how one of our great cities will change with the rising tides.
And how we too will change."

For those who might be wondering whether New York 2140 is set in the same universe as the Mars trilogy or 2312, Robinson had this to say in an interview with sfsite.com:

"I don't like linking up my various projects into one larger future history. I've never done it, and so of course now it's too late, and I don't regret it. I don't see that the advantages of some larger macro-history are very large, compared to the flexibility that I've gained by making each novel have its own future history. Even within my Mars stories there are a couple alternative historical lines to the main one described in the trilogy. I think it's best to keep on updating one's views on what is "most likely to happen," and write accordingly. And doing it this way means each time I have a chance to invent a whole new history, and even if they are somewhat similar, there's still a lot of pleasure to be had there in the details."

New York 2140 is published this week in hardcover, by Orbit. It is also available in e-book and audio formats. A paperback version is scheduled for 2018.

---


 

 

14th March 2017

Renewables are now Australia's cheapest energy option

A major new report claims that renewable energy is now Australia's cheapest energy option, even when the cost of storage to make the intermittent power sources reliable is added.

 


 

A major new study into the cost of emissions reductions and energy storage in the Australian power sector indicates that the rising price of gas, coupled with the falling cost of energy storage, has now made renewable energy the cheapest source of "reliable" generation in Australia – surpassing gas as the 'least cost' source of energy supply – even if the Sun is not shining and the wind not blowing.

The study also shows that "clean coal" technologies such as Carbon Capture and Storage (CCS) will not be commercially mature before 2030 – and will therefore not help Australia to meet its 2030 emissions reduction target under the Paris Agreement.

The study, by energy and carbon advisory firm RepuTex, is expected to re-shape thinking on the role of renewable energy in providing affordable, clean and reliable energy – the so-called "energy trilemma" – as Australia seeks to meet its 2030 emissions reduction target.

RepuTex analysis, supported by extensive consultation with over 45 electricity generators, industrial and commercial consumers and investors, identifies emissions reduction activities in the power sector – such as retrofitting existing coal-fired plants, developing new wind, solar, gas and "clean coal" generation – with analysis mapping the size and cost of abatement through to 2030.

Their analysis also calculates the "full cost" for renewables to supply "reliable" power – including the cost of batteries, pumped hydro, or thermal storage – to determine which technologies can supply electricity at least cost, while improving security.

According to RepuTex, advancements in the cost of energy storage technology, coupled with significant rises in the domestic gas price, have now made wind and solar – with storage – competitive with gas in providing system reliability in the form of instantaneous peaking or load-following generation.

This means new renewable facilities, with storage, are the least cost source of firm power, and able to provide energy supply even if the Sun is not shining, or the wind not blowing.

 


 

"Traditionally, gas-fired generators have been the least cost technology that could provide energy security, such as load-following and peaking services," explained Bret Harper, head of research at RepuTex. "However, the rising price of gas has increased the levelised cost of any new gas build in Australia. At the same time, the decline in capital costs for new wind and solar projects, and improvements in storage performance, have seen renewable project costs fall. When we consider the 'full cost' of renewables to supply dispatchable power – including storage costs to ensure supply even when the wind is not blowing or the Sun not shining – we find that renewables have overtaken gas as the least cost source of new firm supply," he said.

The analysis is significant for the federal debate on energy security, with findings indicating that load-following wind and solar may now be able to strengthen the grid – overcoming intermittency concerns – while strengthening the claims of state governments in South Australia, Queensland and Victoria as they seek to cash in on new renewable investment.

"As older coal and gas-fired generation leave the market, new dispatchable renewables will be able to provide energy during daily peaks, adjust as demand changes throughout the day, or provide reserve peaking generation capacity to alleviate critical situations such as those in South Australia and New South Wales," said Harper. "Moreover, they can now provide that service at 'least cost', surpassing gas. Our view is that this will create a decreasing need for baseload-only facilities, with potential for states to rely on new storage technologies to provide affordable, clean, and secure energy, while improving system reliability."

 

Yallourn coal-fired power plant in Victoria, Australia

 


Findings indicate that four groups of measures have potential to deliver the vast majority of the power sector's emissions reductions by 2030, including distributed generation, the closure of emissions intensive generators, improving the greenhouse gas intensity of existing fossil fuel plants, and investing in renewables and energy storage.

However, while analysis shows there are many opportunities for emissions reductions in the sector, "clean coal" technology is not among the cheapest.

"While clean coal is promoted as a critical emissions reduction technology, findings indicate that cost will be a major barrier to the implementation of a commercial scale project in Australia," said Mr Harper. "We see costs for CCS coming down as low as $100/MWh around 2030, at which point a large-scale project may be feasible if there is any appetite for a baseload-only generator. On that timeline, we assume CCS will play little role to meet Australia's 2030 emissions reduction target."

"Moreover, with a premium placed on flexible generation that can ramp up or down, we see baseload-only generation as being too inflexible to compete in Australia's future electricity system. That is not good news for coal generation, irrespective of how clean it is," said Mr Harper.

The study is expected to provide a new reference point for the cost of emissions reductions, and energy storage technologies, as policymakers seek to solve the "energy trilemma" of providing affordable and reliable energy supply, while meeting Australia's 2030 emissions target. RepuTex notes that a clear market signal is needed to guide investment toward a long-term target well beyond 2030, to better match the investment timeframes of the sector.

"To determine a cost-effective pathway, Australia needs to define its long-term target to better match investment decisions that have lifetimes of 20 to 40 years" said Mr Harper. "Are we aiming for a 26 per cent target by 2030, or 100 per cent clean energy by 2050? Each of those targets have different least-cost pathways, but the same investment timeframe. Identifying a long-term target, with a clear signal on the rate and pace of change, will therefore help to guide the correct investment in the sector."

---


 

 

13th March 2017

Soil's contribution to climate change much higher than previously thought

Deeper soil layers are more sensitive to warming than previously thought, scientists have found. By 2100, this could release carbon to the atmosphere at a rate equal to 30% of today's human-caused annual emissions.

 


 

Soils could release much more CO2 than expected into the atmosphere as the climate warms, according to new research by scientists from the Department of Energy's Lawrence Berkeley National Laboratory.

Their findings are based on a field experiment that, for the first time, explored what happens to organic carbon trapped in soil when all soil layers are warmed, which in this case extend to a depth of 100 centimetres. The scientists discovered that warming both the surface and deeper soil layers at three experimental plots increased the plots' annual release of CO2 by 34 to 37 percent over non-warmed soil. Much of the CO2 originated from deeper layers, indicating that deeper stores of carbon are more sensitive to warming than previously thought.

The results shed light on what is potentially a big source of uncertainty in climate projections. Soil organic carbon harbours three times as much carbon as Earth's atmosphere. In addition, warming is expected to increase the rate at which microbes break down soil organic carbon, releasing more CO2 into the atmosphere and contributing to climate change.

But, until now, the majority of field-based soil warming experiments only focused on the top five to 20 centimetres of soil – which leaves a lot of carbon unaccounted for. Experts estimate soils below 20 centimetres in depth contain more than 50 percent of the planet's stock of soil organic carbon. The big questions have been: to what extent do the deeper soil layers respond to warming? And what does this mean for the release of CO2 into the atmosphere?

"We found the response is quite significant," says Caitlin Hicks Pries, a postdoctoral researcher in Berkeley Lab's Climate and Ecosystem Sciences Division. She conducted the research with co-corresponding author Margaret Torn, and Christina Castahna and Rachel Porras, who are also Berkeley Lab scientists.

"If our findings are applied to soils around the globe that are similar to what we studied, meaning soils that are not frozen or saturated, our calculations suggest that by 2100 the warming of deeper soil layers could cause a release of carbon to the atmosphere at a rate that is significantly higher than today, perhaps even as high as 30 percent of today's human-caused annual carbon emissions depending on the assumptions on which the estimate is based," adds Hicks Pries.

 

An innovative deep soil warming experiment in full swing. Scientist Caitlin Hicks Pries downloads soil temperature data, while fellow Berkeley Lab scientists Cristina Castanha (left) and Neslihan Tas (middle) work on an experimental plot in the background. (Credit: Berkeley Lab)

 

The need to better understand the response of all soil depths to warming is underscored by projections that, over the next century, deeper soils will warm at roughly the same rate as surface soils and the air. In addition, Intergovernmental Panel on Climate Change simulations of global average soil temperature, using a "business-as-usual" scenario in which carbon emissions rise in the decades ahead, predict that soil will warm 4° Celsius by 2100.

To study the potential impacts of this scenario, the Berkeley Lab scientists pioneered an innovative experimental setup at the University of California's Blodgett Forest Research Station, which is located in the foothills of California's Sierra Nevada mountains. The soil at the research station is representative of temperate forest soils, which in turn account for about 13.5 percent of soil area worldwide.

The scientists built their experiment around six soil plots that measure three metres in diameter. The perimeter of each plot was ringed with 22 heating cables that were vertically sunk more than two metres underground. They warmed three of the plots 4° Celsius for more than two years, leaving the other three plots unheated to serve as controls.

They monitored soil respiration three different ways over the course of the experiment. Each plot had an automated chamber that measured the flux of carbon at the surface every half hour. In addition, one day each month, Hicks Pries and the team measured surface carbon fluxes at seven different locations at each plot.

A third method probed the all-important underground realm. A set of stainless steel "straws" was installed below the surface at each plot. The scientists used the straws to measure CO2 concentrations once a month at five depths between 15 and 90 centimetres. By knowing these CO2 concentrations and other soil properties, they could model the extent to which each depth contributed to the amount of CO2 released at the surface.
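The modelling step can be pictured with a toy version of the gradient method: given CO2 concentrations at several depths, Fick's law attributes upward flux to each interval. The concentration profile and diffusivity below are invented for illustration; in reality the effective diffusivity varies with soil moisture and porosity.

```python
# Toy gradient method: attribute upward CO2 flux to depth intervals.
depths_cm = [15, 30, 50, 70, 90]
co2_ppm = [4_000, 9_000, 15_000, 20_000, 24_000]  # hypothetical profile
D_EFF = 2.0e-6   # assumed effective diffusivity, m^2/s

for i in range(len(depths_cm) - 1):
    dz_m = (depths_cm[i + 1] - depths_cm[i]) / 100.0
    gradient = (co2_ppm[i + 1] - co2_ppm[i]) / dz_m   # ppm per metre
    flux = D_EFF * gradient                           # ppm*m/s, relative units
    print(f"{depths_cm[i]}-{depths_cm[i + 1]} cm: upward flux ~ {flux:.3f}")
```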

 

One of the experimental heating plots. Credit: Berkeley Lab.

 

They discovered that, of the 34 to 37 percent increase in CO2 released at the three warmed plots, 40 percent was due to CO2 that came from below 15 centimetres. They also found the sensitivity of soil to warming was similar across the five depths.

The scientists say these findings suggest the degree to which soil organic carbon influences climate change may be currently underestimated.

"There's an assumption that carbon in the subsoil is more stable and not as responsive to warming as in the topsoil, but we've learned that's not the case," says Torn. "Deeper soil layers contain a lot of carbon, and our work indicates it's a key missing component in our understanding of the potential feedback of soils to the planet's climate."

Their work is published in the journal Science.

---


 

 

17th February 2017

U.S. solar installations grew by 95% in 2016

Annual solar PV installations in the U.S. nearly doubled last year – growing from 7,492 megawatts to 14,626 megawatts.

This new record-breaking figure is revealed by GTM Research and the Solar Energy Industries Association (SEIA) in advance of their upcoming U.S. Solar Market Insight report, due for release on 9th March.

 


 

In addition, solar claimed the greatest share of new capacity additions for the first time ever in the U.S., accounting for 39% of new installations among all energy types in 2016, ahead of natural gas (29%) and wind (26%). This is almost a ten-fold improvement on its 2010 share of 4%.
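The headline arithmetic is easy to reproduce from the figures quoted above:

```python
# Reproducing the report's headline figures.
prev_mw, new_mw = 7_492, 14_626
print(f"Annual growth: {(new_mw - prev_mw) / prev_mw:.0%}")   # ~95%

shares_2016 = {"solar": 0.39, "natural gas": 0.29, "wind": 0.26}
print(f"All other sources combined: {1 - sum(shares_2016.values()):.0%}")
```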

“What these numbers tell you is that the solar industry is a force to be reckoned with,” said Abigail Hopper, SEIA’s president. “Solar's economically winning hand is generating strong growth across all market segments nationwide, leading to more than 260,000 Americans now employed in solar.”

 


 

"In a banner year for U.S. solar, a record 22 states each added more than 100 megawatts," said Cory Honeyman, GTM Research's associate director of U.S. solar. "While U.S. solar grew across all segments, what stands out is the double-digit-gigawatt boom in utility-scale solar, primarily due to solar's cost-competitiveness with natural gas alternatives."

The U.S. now has more than 1.3 million solar PV installations, with a cumulative capacity of over 40 gigawatts.

---


 

 

17th February 2017

Oxygen levels in the world's oceans have fallen 2% since the 1960s

A study published in the journal Nature finds that oxygen levels in the oceans have declined by 2% globally in the last 50 years, due to warming and stratification.

 


 

Oxygen is essential to life on land, and the same applies to almost all organisms in the ocean. However, the ocean's oxygen supply is threatened by global warming in two ways: warmer surface waters take up less oxygen than colder waters, and warmer water also stabilises the stratification of the ocean, weakening the circulation that connects the surface with the deep ocean so that less oxygen is transported into the deep sea. Models therefore predict a decrease in the global oceanic oxygen inventory due to global warming. The first global evaluation of millions of oxygen measurements appears to confirm this trend and points to the first impacts of global change.

In the journal Nature this week, oceanographers from the GEOMAR Helmholtz Centre for Ocean Research Kiel in Germany have published the most comprehensive study ever conducted on the global oxygen content of the world's oceans. It demonstrates that overall oxygen levels have dropped by more than 2% over the last 50 years. While 2% might not sound like much, the effects can be dramatic.

"Just a little loss of oxygen in coastal waters can lead to a complete change in ecosystems – a small decrease in oxygen like this can transform from something desirable to very undesirable," said David Baker, a Professor at the University of Hong Kong's Swire Institute of Marine Sciences. "It's almost like the oceans are getting ready for a heart attack. You're essentially slowing the heartbeat of the ocean, and you're getting less oxygen to the ocean."

"Since large fishes in particular avoid or do not survive in areas with low oxygen content, these changes can have far-reaching biological consequences," comments Dr. Sunke Schmidtko, lead author of the study.

The researchers used all historic oxygen data available around the world for their work, supplemented it with current measurements and refined the interpolation procedures to more accurately reconstruct the development of the oxygen "budget". In some regions, previous research had already shown a decrease in oxygen.

 

Credit: Martin Visbeck, GEOMAR

 

"To quantify trends for the entire ocean, however, was more difficult, since oxygen data from remote regions and the deep ocean is sparse," explains Dr. Schmidtko. "We were able to document the oxygen distribution and its changes for the entire ocean for the first time. These numbers are an essential prerequisite for improving forecasts for the ocean of the future."

The study also shows that, with the exception of a few regions, oxygen content decreased throughout the entire ocean during the period investigated. The greatest loss was found in the North Pacific. So-called "dead zones" are also multiplying – shallow areas where the water is so low in dissolved oxygen that most sea creatures can't survive. The global decline in ocean oxygen may worsen from 2% to 7% by the year 2100.

"While the slight decrease of oxygen in the atmosphere is currently considered non-critical, the oxygen losses in the ocean can have far-reaching consequences because of the uneven distribution. For fisheries and coastal economies this process may have detrimental consequences," said co-author Dr. Lothar Stramma.

"With measurements alone, we cannot explain all the causes," adds Prof. Martin Visbeck. "Natural processes occurring on time scales of a few decades may also have contributed to the observed decrease." However, the team says their results are consistent with most model calculations that predict a further decrease in oxygen in the oceans due to higher atmospheric CO2 concentrations and consequently higher global temperatures.

"The oceans are really a mirror of human health – if they're sick and dying, then that's the future of humanity as well," said Baker.

 


 

---


 

 

14th February 2017

Human activity is changing the climate 170 times faster than natural processes

Humans are causing the climate to change 170 times faster than natural forces, research has found. The study produced, for the first time, a mathematical equation to describe the global impact of human activity on the Earth system, known as the Anthropocene equation.

 


 

"Over the past 7,000 years, the primary forces driving change have been astronomical – changes in solar intensity, and subtle changes in orbital parameters, along with a few volcanoes. They have driven a rate of change of 0.01°C per century," said Professor Will Steffen from the Australian National University (ANU), one of the study authors. "Human-caused greenhouse gas emissions over the past 45 years have increased the rate of temperature rise to 1.7°C per century, dwarfing the natural background rate."

His paper, co-authored by Owen Gaffney from the Stockholm Resilience Centre, is published in The Anthropocene Review. It examines our planet as a single complex system and assesses the impact of human activities on the system's trajectory. Under current astronomical forcing, and if atmospheric levels of CO2 had remained at their pre-industrial level of 280 parts per million (ppm), Holocene-like conditions could have been expected for another 50,000 years, the paper says.
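Schematically – and as shorthand for the paper's formal treatment rather than a verbatim reproduction of it – the Anthropocene equation expresses the rate of change of the Earth system E as a function of astronomical forcing A, geophysical forcing G, internal dynamics I and human activity H, with the first three terms now negligible on decadal timescales:

\[
\frac{dE}{dt} = f(A, G, I, H) \;\approx\; f(H)
\]

The headline factor follows directly from the rates Steffen quotes: 1.7°C per century divided by 0.01°C per century gives 170.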

"We are not saying the astronomical forces of our Solar System or geological processes have disappeared – but in terms of their impact in such a short period of time, they are now negligible compared with our own influence," said Steffen. "Crystallising this evidence in the form of a simple equation gives the current situation a clarity that the wealth of data often dilutes. It also places the contemporary human impact in the context of the great forces of nature that have driven Earth system dynamics over billions of years. The human magnitude of climate change looks more like a meteorite strike than a gradual change."

In addition to CO2, the researchers looked at a range of other impacts. For example, the release of methane (an even more powerful greenhouse gas) has occurred 285 times faster than the natural background rate, leading to a 150% increase in atmospheric concentration since 1750. Humans have also disrupted the nitrogen cycle, now undergoing its largest and most rapid change in 2.5 billion years. Before the Industrial Revolution, only about 5% of land cover was intensively used, but this has now expanded to 55%. The falling pH level of the oceans is yet another concern – they are currently acidifying at their fastest rate since the Carboniferous period, 300 million years ago. Biodiversity is collapsing, with extinction rates up to 100 times faster than normal.

Humanity still has a chance to prevent catastrophic climate change, according to Steffen, but time is rapidly running out: "The global economy can function equally well with zero emissions. Research shows we can feed nine billion people – the projected world population by 2050 – and reduce greenhouse gas emissions at the same time."

---


 

 

13th February 2017

Japan to build world's largest floating solar power plant

In a joint venture, Kyocera Corporation and Century Tokyo Leasing Corporation have announced plans to start construction of the world's largest floating solar power plant, on the Yamakura Dam reservoir, managed by the Public Enterprises Agency of Chiba Prefecture in Japan for industrial water services.

The 13.7 megawatt (MW) plant is scheduled for launch during 2018 and will comprise approximately 51,000 Kyocera modules installed over a fresh water surface area of 180,000 m². The project will generate an estimated 16,170 megawatt hours (MWh) per year – enough to power almost 5,000 typical households – while offsetting 8,170 tons of CO2 emissions annually, equivalent to the emissions from consuming around 19,000 barrels of oil.
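Dividing the quoted figures gives a sense of the plant's expected performance:

```python
# Implied performance from the article's numbers.
capacity_mw = 13.7
annual_mwh = 16_170
modules = 51_000
households = 5_000

print(f"Capacity factor: {annual_mwh / (capacity_mw * 8760):.1%}")    # ~13.5%
print(f"Power per module: {capacity_mw * 1e6 / modules:.0f} W")       # ~269 W
print(f"Per household: {annual_mwh * 1e3 / households:,.0f} kWh/yr")  # ~3,234
```

The ~13.5% capacity factor is roughly in line with fixed solar at Japanese latitudes, and the per-household figure implies an assumed annual consumption of about 3,200 kWh.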

The project was initiated in October 2014, when the Public Enterprises Agency of Chiba Prefecture publicly sought companies to construct and operate a floating solar power plant to help reduce environmental impacts. With a decrease in tracts of land suitable for utility-scale solar power plants in Japan, Kyocera has been developing floating solar power plants since 2014, utilising the surfaces of Japan's abundant agricultural and flood-control reservoirs. The company began operating 1.7MW and 1.2MW plants in March 2015, followed by the launch of a 2.3MW plant in June of that year.

Like many other nations around the world, Japan is seeing rapid growth in deployment of solar energy. Based on current trends in generating capacity, the country has the potential to be almost 100% solar powered by 2040.

 


 

---


 

 

30th January 2017

U.S. crops at risk of "abrupt and substantial yield losses"

A study published in Nature Communications warns that some of the most important crops in the U.S. are at risk of "abrupt and substantial yield losses" from rising temperatures later this century, with harvests potentially declining by 20% for wheat, 40% for soybean and almost 50% for maize.

 


 

Some of the most important crops in the U.S. are at risk of substantial damage from rising temperatures. To better assess how climate change is likely to impact wheat, maize and soybean, an international team of scientists ran ultra-detailed computer simulations of past, present and future yields. These were shown to accurately reproduce the observed reduction in past crop yields induced by high temperatures, thereby confirming that they captured one main mechanism for future projections. Importantly, the scientists found that increased irrigation could help to reduce the negative effects of global warming on crops – but this is possible only in regions where sufficient water is available. Ultimately, limiting global warming is needed to keep overall crop losses in check.

“We know from observations that high temperatures can harm crops, but now we have a much better understanding of the processes,” says lead author of the study, Bernhard Schauberger from the Potsdam Institute for Climate Impact Research. “The computer simulations that we do are based on robust knowledge from physics, chemistry, biology; on a lot of data and elaborate algorithms. But they of course cannot represent the entire complexity of the crop system, hence we call them models. In our study, they have passed a critical test.”

In their work, the scientists compare the model results to data from actual observations. This way, they can find out if they include the critical factors into their calculations, from temperature to CO2, from irrigation to fertilisation.

For every day above 30°C (86°F), maize and soybean plants can lose 5 percent of their harvest. The simulations developed at the Potsdam Institute show that even small heat increases beyond this threshold can result in abrupt and substantial yield losses. Such temperatures will be more frequent under unabated climate change and could severely harm agricultural productivity. Harvest losses of 20 percent for wheat, 40 percent for soybean and almost 50 percent for maize, relative to non-elevated temperatures, can be expected by the end of this century without substantial emission reductions. These losses do not even consider extremely high temperatures above 36°C (97°F), which are expected to lower yields further. The effects go far beyond the U.S., one of the largest crop exporters: world market crop prices are likely to increase, which is an issue for food security in poor countries. This will be a particular concern as demand for food increases due to both population growth and rising affluence.
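The 5-percent-per-day rule gives a feel for how quickly hot days compound over a season. The sketch below assumes losses multiply independently across days – our simplification, not the study's crop model – and the day counts are hypothetical scenarios.

```python
LOSS_PER_HOT_DAY = 0.05   # maize/soybean yield loss per day above 30 C

for hot_days in (5, 10, 20):
    surviving = (1 - LOSS_PER_HOT_DAY) ** hot_days
    print(f"{hot_days:>2} days above 30 C -> "
          f"~{1 - surviving:.0%} of harvest lost")
```

Even 10 hot days in a season is enough to wipe out roughly 40 percent of a harvest under this simple compounding assumption.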

“The losses got substantially reduced when we increased irrigation of fields in the simulation, so water stress resulting from temperature increase seems to be a bigger factor than the heat itself,” says co-author Joshua Elliott from the University of Chicago. When water supply from the soil to the plant decreases, the small openings in the leaves gradually close to prevent water loss. They thereby preclude the diffusion of CO2 into the cells, which is an essential building material for the plants. Additionally, crops respond to water stress by increasing root growth at the expense of above-ground biomass and, eventually, yields. “Irrigation therefore could be an important means of adaptation to dampen the most severe effects of warming,” says Elliott. “However, this is of course limited by the lack of water resources in some regions.”

Burning fossil fuels elevates the amount of CO2 in the air. This usually increases the water use efficiency of plants, since they lose less water for each unit of CO2 taken up from the air. However, this cannot be confirmed as a safeguard of yields at high temperatures, the scientists argue. The additional CO2 fertilisation in the simulations does not alleviate the drop in yields associated with higher temperatures above 30°C.

The study, "Consistent negative response of US crops to high temperatures in observations and crop models", appears this month in Nature.

---


 

 

26th January 2017

Doomsday Clock moves closer to midnight

The Bulletin of the Atomic Scientists has moved the hands of the iconic "Doomsday Clock" forwards by 30 seconds.

 


 

The Doomsday Clock is now at two and a half minutes to midnight, having previously been at three minutes to midnight. Normally when changes occur, the hands are moved forwards or backwards in increments of a minute. But today, for the first time in the 70-year history of the clock, the Bulletin of the Atomic Scientists' Science and Security Board has moved the hands 30 seconds closer to midnight. In another first, the Board has decided to act, in part, based on the words of a single person: Donald Trump, the new President of the United States.

The decision to move the hands of the clock is made in consultation with the Bulletin's Board of Sponsors, which includes 15 Nobel Laureates. The Science and Security Board's full statement about the Clock is available online.

In January 2016, the Doomsday Clock's minute hand did not change, remaining at three minutes before midnight. The Clock was changed in 2015 from five to three minutes to midnight, the closest it had been since the arms race of the 1980s.

In a statement today, the Bulletin's Science and Security Board notes: "Over the course of 2016, the global security landscape darkened as the international community failed to come effectively to grips with humanity's most pressing existential threats — nuclear weapons and climate change... This already-threatening world situation was the backdrop for a rise in strident nationalism worldwide in 2016, including in a US presidential campaign during which the eventual victor, Donald Trump, made disturbing comments about the use and proliferation of nuclear weapons and expressed disbelief in the overwhelming scientific consensus on climate change. The board's decision to move the clock less than a full minute — something it has never before done — reflects a simple reality: As this statement is issued, Donald Trump has been the US president only a matter of days..."

The statement continues: "Just the same, words matter, and President Trump has had plenty to say over the last year. Both his statements and his actions as President-elect have broken with historical precedent in unsettling ways. He has made ill-considered comments about expanding the US nuclear arsenal. He has shown a troubling propensity to discount or outright reject expert advice related to international security, including the conclusions of intelligence experts. And his nominees to head the Energy Department and the Environmental Protection Agency dispute the basics of climate science. In short, even though he has just now taken office, the president's intemperate statements, lack of openness to expert advice, and questionable cabinet nominations have already made a bad international security situation worse."

In addition to addressing the statements made by President Trump, the Board also expressed concern about the greater global context of nuclear and climate issues: "The United States and Russia—which together possess more than 90 percent of the world's nuclear weapons—remained at odds in a variety of theatres, from Syria to Ukraine to the borders of NATO; both countries continued wide-ranging modernisations of their nuclear forces, and serious arms control negotiations were nowhere to be seen. North Korea conducted its fourth and fifth underground nuclear tests and gave every indication it would continue to develop nuclear weapons delivery capabilities. Threats of nuclear warfare hung in the background as Pakistan and India faced each other warily across the Line of Control in Kashmir after militants attacked two Indian army bases."

 


 

In surveying the status of climate matters, the Board concluded: "The climate change outlook was somewhat less dismal (in 2016) —but only somewhat. In the wake of the landmark Paris climate accord, the nations of the world have taken some actions to combat climate change, and global carbon dioxide emissions were essentially flat in 2016, compared to the previous year. Still, they have not yet started to decrease; the world continues to warm. Keeping future temperatures at less-than-catastrophic levels requires reductions in greenhouse gas emissions far beyond those agreed to in Paris—yet little appetite for additional cuts was in evidence at the November climate conference in Marrakech."

Rachel Bronson, executive director and publisher, Bulletin of the Atomic Scientists, said: "As we marked the 70th anniversary of the Doomsday Clock, this year's Clock deliberations felt more urgent than usual. In addition to the existential threats posed by nuclear weapons and climate change, new global realities emerged, as trusted sources of information came under attack, fake news was on the rise, and words were used by a President-elect of the United States in cavalier and often reckless ways to address the twin threats of nuclear weapons and climate change."

Lawrence Krauss, the Bulletin Board of Sponsors chair, said: "Wise men and women have said that public policy is never made in the absence of politics. But in this unusual political year, we offer a corollary: Good policy takes account of politics, but is never made in the absence of expertise. Facts are indeed stubborn things, and they must be taken into account if the future of humanity is to be preserved, long term. Nuclear weapons and climate change are precisely the sort of complex existential threats that cannot be properly managed without access to and reliance on expert knowledge. In 2016, world leaders not only failed to deal adequately with those threats; they actually increased the risk of nuclear war and unchecked climate change through a variety of provocative statements and actions, including careless rhetoric about the use of nuclear weapons and the wanton defiance of scientific evidence. To step further back from the brink will require leaders of vision and restraint. President Trump and President Putin can choose to act together as statesmen, or as petulant children, risking our future. We call upon all people to speak out and send a loud message to your leaders so that they do not needlessly threaten your future, and the future of your children."

Retired Rear Admiral David Titley, Bulletin Science and Security Board, said: "Climate change should not be a partisan issue. The well-established physics of Earth's carbon cycle is neither liberal nor conservative in character. The planet will continue to warm to ultimately dangerous levels so long as carbon dioxide continues to be pumped into the atmosphere—irrespective of political leadership. The current political situation in the United States is of particular concern. The Trump administration needs to state clearly and unequivocally that it accepts climate change, caused by human activity, as reality. No problem can be solved unless its existence is first recognised. There are no 'alternative facts' here."

 

 

 

 

26th January 2017

Technological progress alone won't stem resource use

A study by the Massachusetts Institute of Technology (MIT) finds no evidence of an overall reduction in the world’s consumption of materials.

 

technology and resource consumption future timeline
Credit: MIT

 

Are humans taking more resources from the Earth than the planet can safely produce? The answer lies partly in whether we can “dematerialise,” or reduce the amount of materials needed to produce goods and services.

While some scientists believe that the world can achieve significant dematerialisation through improvements in technology, a new MIT-led study finds that technological advances alone will not bring about dematerialisation and, ultimately, a sustainable world.

The researchers found that no matter how much more efficient and compact a product becomes, consumers simply demand more of it, increasing in the long run the total amount of materials used to make it.

Take, for instance, one of the world’s fastest-improving technologies: silicon-based semiconductors. Over the last few decades, technological improvements in the efficiency of semiconductors have greatly reduced the amount of material needed to make a single transistor. As a result, today’s smartphones, tablets, and computers are far more powerful and compact than computers built in the 1970s.

Nonetheless, the researchers find that consumers’ demand for silicon has outpaced the rate of its technological improvement: the world’s consumption of silicon has grown by 345 percent over the last four decades. As others have found, by 2005 more transistors were in use than printed text characters.
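
As a rough check on that figure, a 345 percent increase over four decades works out to under 4 percent compound growth per year – modest-sounding, yet relentless. The back-of-envelope calculation below treats "four decades" as exactly 40 years, which is our reading rather than a number from the study:

# Back-of-envelope check: what annual growth rate does "+345% over
# four decades" imply? (Reading "four decades" as 40 years is an
# assumption made here for illustration.)
growth_factor = 1 + 3.45                 # +345% means 4.45x the starting level
years = 40
cagr = growth_factor ** (1 / years) - 1  # compound annual growth rate
print(f"Implied annual growth in silicon use: {cagr:.1%}")  # ~3.8% per year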

“Despite how fast technology is racing, there’s actually more silicon used today, because we now just put more stuff on, like movies, and photos, and things we couldn’t even think of 20 years ago,” says Christopher Magee, a professor of the practice of engineering systems in MIT’s Institute for Data, Systems, and Society. “So we’re still using a little more material all the time.”

The researchers found similar trends in 56 other materials, goods and services – from basic resources such as aluminium and formaldehyde to hardware and energy technologies such as hard disk drives, transistors, wind energy, and photovoltaics. In all cases, they found no evidence of dematerialisation, or an overall reduction in their use, despite technological improvements to their performance.

 

technology and resource consumption future timeline

 

“There is a techno-optimist’s position that says technological change will fix the environment,” Magee observes. “This says, probably not.”

Magee and his co-author, Tessaleno Devezas, a professor at the University of Beira Interior, in Portugal, published their findings recently in the journal Technological Forecasting and Social Change.

In their research, Magee and Devezas examined whether the world’s use of materials has been swayed by an effect known as Jevons’ Paradox. In 1865, the English economist William Stanley Jevons observed that as improvements to coal-fired steam engines reduced the price of coal, England’s consumption of coal actually increased.

While experts believed technological improvements would reduce coal consumption, Jevons countered that the opposite was true: making coal-fired engines more efficient would only increase consumer demand for the energy they supplied, further depleting coal reserves.

Magee and Devezas looked to see whether Jevons’ Paradox – and consumer demand in general – has prevented dematerialisation of today’s goods and services. They sought to identify a general relationship between dematerialisation, technological change, and Jevons’ Paradox – also referred to as a rebound effect.

The team developed an equation to calculate whether dematerialisation is taking place for a given product. Their model considers a number of variables, including population and economic growth, a product’s yearly increase in technological performance, and demand elasticity – the degree to which demand for a product varies with its price.
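
To make that logic concrete, here is a minimal Python sketch of such a model. It is an illustrative toy built from the description above, not the published Magee–Devezas equation: the function name, the linear rebound term and the example numbers are all assumptions.

# Illustrative toy model only -- NOT the equation from the paper.
# It mirrors the logic described above: technology cuts the material
# needed per unit of performance, but it also cuts the effective price
# of performance, and demand responds to that price drop in proportion
# to its elasticity (the rebound effect).

def materials_growth_rate(pop_growth, econ_growth, tech_rate, elasticity):
    """Approximate annual growth rate of a product's total material use.

    pop_growth  -- annual population growth rate (0.01 = 1%)
    econ_growth -- annual growth in per-capita income
    tech_rate   -- annual gain in performance per unit of material
    elasticity  -- extra demand per unit drop in effective price
    """
    rebound = elasticity * tech_rate            # demand stimulated by cheaper performance
    demand_growth = pop_growth + econ_growth + rebound
    return demand_growth - tech_rate            # net change in material use

# Dematerialisation requires a negative result. With the high demand
# elasticity and rapid improvement typical of IT goods (hypothetical
# numbers below), the rebound swamps the efficiency gain:
g = materials_growth_rate(pop_growth=0.01, econ_growth=0.02,
                          tech_rate=0.30, elasticity=1.2)
print(f"Annual change in material use: {g:+.1%}")   # about +9% per year

In this toy version, material use falls only when elasticity sits well below one or improvement is very fast relative to population and income growth – consistent with the finding reported in the next paragraph.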

Not surprisingly, the researchers’ model indicates that dematerialisation is more likely when demand elasticity for a product is relatively low and the rate of its technological improvement is high. But when they applied the equation to common goods and services used today, they found that demand elasticity and technological change worked against each other – the better a product was made to perform, the more consumers wanted it.

“It seems we haven’t seen a saturation in demand,” Magee explains. “People haven’t said, ‘That’s enough,’ at least in anything that we can get data to test for.”

 

technology and resource consumption future timeline

 

Magee and Devezas gathered data for 57 common goods and services, including widely used chemical components such as ammonia, formaldehyde, polyester fibre, and styrene, along with hardware and energy technologies such as transistors, laser diodes, crude oil, photovoltaics, and wind energy. They worked the data for each product into their equation, and, despite seeing technological improvements in almost all cases, they failed to find a single case in which dematerialisation – an overall reduction in materials – was taking place.

In follow-up work, the researchers were eventually able to identify six cases in which an absolute decline in materials usage has occurred. However, these cases mostly include toxic chemicals such as asbestos and thallium, whose dematerialisation was due not to technological advances, but to government intervention.

There was one other case in which the researchers observed dematerialisation: wool. The material’s usage has fallen significantly due to innovations in synthetic alternatives such as nylon and polyester. In this case, Magee says, substitution rather than dematerialisation has occurred: wool has simply been replaced by another material that fills the same function.

So what will it take to reduce our materials consumption and achieve a sustainable world?

“What it’s going to take is much more difficult than just letting technological change do it,” Magee says. “Social and cultural change, people talking to each other, cooperating, might do it. That’s not the way we’re going right now, but that doesn’t mean we can’t do it.”

However, others are more hopeful that technology will bring about sustainability, albeit at significant cost.

“[Technology] will get us to a sustainable world – it has to,” says J. Doyne Farmer, a professor of mathematics at the University of Oxford who was not involved in the research. “I say this not only because we need it, but because there is only so much we can suck out of the Earth, and eventually we will be forced into a sustainable world, one way or another. The question is whether we can do that without great pain. Magee’s paper shows that we need to expect more pain than some of us thought.”
