We tend to think of climate change as a threat that lies decades in the future, but climate disasters are already displacing millions of people worldwide.
The Internal Displacement Monitoring Centre (IDMC) has released its Global Estimates 2012 report. This reveals that over 32.4 million people were forced to flee their homes in 2012 by disasters such as floods, storms and earthquakes. While Asia and west and central Africa bore the brunt, 1.3 million were displaced in rich countries, with the USA particularly affected.
Some 98% of all displacement in 2012 was related to climate and weather events, with flood disasters in India and Nigeria accounting for 41% of the total: monsoon floods displaced 6.9 million people in India, and 6.1 million were newly displaced in Nigeria. While 81% of global displacement over the past five years has occurred in Asia, in 2012 Africa saw a record 8.2 million people newly displaced, over four times more than in any of the previous four years.
"In countries already facing the effects of conflict and food insecurity such as in Nigeria, Pakistan, and South Sudan, we observe a common theme," says Clare Spurrell, Chief Spokesperson for IDMC. "Here, vulnerability to disaster triggered by floods is frequently further compounded by hunger, poverty and violence; resulting in a 'perfect storm' of risk factors that lead to displacement."
There is also increasing scientific evidence that climate change will become a factor. A 2012 Special Report from the Intergovernmental Panel on Climate Change (IPCC) found that there is evidence to support the claim that "[d]isasters associated with climate extremes influence population mobility and relocation, affecting host and origin communities."
IDMC's report highlights how disaster-induced displacement takes a toll in both rich and poor countries, with the USA appearing among the top ten countries for new displacement: over 900,000 people there were forced to flee their homes in 2012. People in poorer countries, however, remain disproportionately affected, accounting for 98% of the global five-year total.
"In the US following Hurricane Sandy, most of those displaced were able to find refuge in adequate temporary shelter while displaced from their own homes," says Spurrell. "Compare this to communities in Haiti, where hundreds of thousands are still living in makeshift tents over three years after the 2010 earthquake mega-disaster, and you see a very different picture."
According to the IDMC report, a critical component to improving community resilience and government responses to disasters is better data collection on people who have been displaced. "Currently, the information available is biased, often only focusing on the most visible people who take shelter in official evacuation sites or camps," says Spurrell. "We need to know more about those who seek refuge with families and friends, people who are repeatedly displaced by smaller disasters, or those who are stuck in prolonged displacement following a disaster – not just those that make headlines."
Improvements in lithium ion (Li-ion) battery technology are helping to accelerate the worldwide market for electric vehicles (EVs).
In the last few years, automakers have shifted from nickel-metal hydride (NiMH) batteries to Li-ion batteries. This shift represents a major endorsement of Li-ion chemistry and its ability to perform consistently in an automotive environment. According to a new report from Navigant Research, total worldwide capacity of Li-ion batteries for transportation applications will increase more than ten-fold, from 4,400 megawatt-hours (MWh) in 2013 to nearly 49,000 MWh by 2020.
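As a rough check of the forecast, those two capacity figures imply a compound annual growth rate of about 41 percent. The calculation below is our own sketch, not Navigant's methodology:

```python
# Rough check of the growth implied by the Navigant forecast:
# 4,400 MWh in 2013 rising to ~49,000 MWh by 2020.
capacity_2013_mwh = 4_400
capacity_2020_mwh = 49_000
years = 2020 - 2013

# Compound annual growth rate (CAGR)
cagr = (capacity_2020_mwh / capacity_2013_mwh) ** (1 / years) - 1
print(f"Growth factor: {capacity_2020_mwh / capacity_2013_mwh:.1f}x")
print(f"Implied CAGR: {cagr:.1%}")  # roughly 41% per year
```

Sustaining that growth rate for seven consecutive years is what the report's "more than ten-fold" headline amounts to.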
"Li-ion technology continues to improve, as increased energy densities translate into smaller and lighter battery packs with more power," says David Alexander, senior research analyst with Navigant Research. "At the same time, leading battery cell manufacturers have built new factories utilising the latest production techniques, including greater automation and faster throughput. This will lead to a reduction in the cost per kilowatt-hour (kWh) over the next few years, provided that volumes continue to increase."
The market for Li-ion batteries will primarily be driven by the growth of battery electric vehicles (BEVs), as they utilise much larger battery packs than plug-in hybrid electric vehicles (PHEVs). Today, most BEVs use battery packs ranging from 16 kWh to 85 kWh, compared to PHEVs that typically use packs ranging from 4 kWh to 16 kWh. Additionally, many recently introduced hybrid vehicles, such as the Honda Civic Hybrid, use Li-ion batteries, and the percentage of hybrids using Li-ion technology is expected to grow steadily as automakers update their models.
The report, "Electric Vehicle Batteries", provides a detailed examination of the growing market for Li-ion batteries, including profiles of all of the leading Li-ion battery manufacturers. Forecasts for revenues from Li-ion batteries, segmented by vehicle type, are included, along with vehicle roadmaps for hybrid, PHEV, and BEV sales by region. The report also includes a review of competing energy storage technologies, including ultracapacitors and nickel-metal hydride batteries. An Executive Summary of the report is available for free download on the Navigant Research website.
Scientists have produced the largest flexible, plastic solar cells ever made in Australia – 10 times the size of anything they could previously produce – thanks to a new printer installed at CSIRO, the country's national science agency.
The printer has allowed researchers from the Victorian Organic Solar Cell Consortium (VICOSC) – a collaboration between CSIRO, the University of Melbourne, Monash University and industry partners – to print organic photovoltaic cells the size of an A3 sheet of paper.
According to CSIRO materials scientist Dr Scott Watkins, printing cells on such a large scale opens up a huge range of possibilities for pilot applications: "There are so many things we can do with cells this size. We can set them into advertising signage, powering lights and other interactive elements. We can even embed them into laptop cases to provide backup power for the machine inside."
The new printer, worth A$200,000, is a big step up for the VICOSC team. In just three years they have gone from making cells the size of a fingernail to cells 10 cm square. Now, with the new printer, they have jumped to cells that are 30 cm wide.
VICOSC project coordinator and University of Melbourne researcher Dr David Jones says that one of the great advantages of the group's approach is that they're using existing printing methods, making it a very accessible technology.
"We're using the same techniques that you would use if you were screen printing an image onto a T-shirt," he says.
Using semiconducting inks, the researchers print the cells straight onto paper-thin flexible plastic or steel. Printing at speeds of up to ten metres per minute, they can produce one cell every two seconds.
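The quoted output rate squares with the web speed, assuming roughly one cell per length of film passing through the printer; the arithmetic here is ours, not CSIRO's:

```python
# Sanity check: web speed vs. the stated output of one cell every two seconds.
web_speed_m_per_min = 10   # printing speed from the article
seconds_per_cell = 2       # "one cell every two seconds"

web_speed_m_per_s = web_speed_m_per_min / 60
film_length_per_cell_m = web_speed_m_per_s * seconds_per_cell
print(f"{film_length_per_cell_m * 100:.1f} cm of film per cell")  # ~33.3 cm
```

About 33 cm of film per cell is consistent with the 30 cm-wide cells described earlier, plus a small margin between them.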
As the researchers continue to scale up their equipment, the possibilities will become even greater.
"Eventually we see these being laminated to windows that line skyscrapers," Dr Jones says. "By printing directly to materials like steel, we'll also be able to embed cells onto roofing materials."
The organic photovoltaic cells, which produce up to 50 watts of power per square metre, could even be used to improve the efficiency of more traditional silicon solar panels.
"The different types of cells capture light from different parts of the solar spectrum. So rather than being competing technologies, they are actually very complementary," Dr Watkins says.
The scientists predict that the future energy mix for the world, including Australia, will rely on many non-traditional energy sources. "We need to be at the forefront of developing new technologies that match our solar endowment, stimulate our science and support local, high-tech manufacturing.
"While the consortium is focused on developing applications with current industrial partners there are opportunities to work with other companies through training programs or pilot-scale production trials," he says.
As part of the consortium, a complementary screen printing line is also being installed at nearby Monash University. Combined, they will make the Clayton Manufacturing and Materials Precinct one of the largest organic solar cell printing facilities in the world.
Fish have been migrating toward Earth's poles in search of cooler waters since at least 1970, according to a new study by the University of British Columbia (UBC) that reveals yet more evidence of a warming planet. This has major implications for global food security in the future.
In a Nature study published this week, UBC researchers used temperature preferences of fish and other marine species as a sort of “thermometer” to assess effects of climate change on the world's oceans between 1970 and 2006.
They found that global fisheries catches were increasingly dominated by warm-water species, as a result of fish migrating towards the poles in response to rising ocean temperatures.
“One way for marine animals to respond to ocean warming is by moving to cooler regions,” says the study’s lead author William Cheung, an assistant professor at UBC’s Fisheries Centre. “As a result, places like New England on the northeast coast of the U.S. saw new species typically found in warmer waters, closer to the tropics.
“Meanwhile in the tropics, climate change meant fewer marine species and reduced catches, with serious implications for food security.”
“We’ve been talking about climate change as if it’s something that’s going to happen in the distant future – our study shows that it has been affecting our fisheries and oceans for decades,” says co-author Daniel Pauly, principal investigator with UBC’s Sea Around Us Project. “These global changes have implications for everyone in every part of the planet.”
Limiting the amount of warming experienced by the world's oceans in the future could buy some time for tropical coral reefs, say researchers from the University of Bristol.
The study, published by the journal Geophysical Research Letters, used computer models to investigate how shallow-water tropical coral reef habitats may respond to climate change over the coming decades.
Dr Elena Couce and colleagues found that restricting greenhouse warming to three watts per square metre (equivalent to just 50-100 parts per million carbon dioxide, or approximately half again the increase since the Industrial Revolution) is needed in order to avoid large-scale reductions in reef habitat occurring in the future.
Shallow-water tropical coral reefs are amongst the most productive and diverse ecosystems on the planet. They are currently in decline due to the increasing frequency of bleaching events, linked to rising temperatures and fossil fuel emissions.
Credit: Skeptical Science
Dr Couce said: "If sea surface temperatures continue to rise, our models predict a large habitat collapse in the tropical western Pacific which would affect some of the most biodiverse coral reefs in the world. To protect shallow-water tropical coral reefs, the warming experienced by the world's oceans needs to be limited."
The researchers modelled whether artificial means of limiting global temperatures – known as solar radiation 'geoengineering' – could help. Their results suggest that if geoengineering could be successfully deployed, the decline of suitable habitats for tropical coral reefs could be slowed. They found, however, that over-engineering the climate could actually be detrimental, as tropical corals do not favour overly cool conditions. Solar radiation geoengineering also leaves unchecked a carbon dioxide problem known as 'ocean acidification'.
Dr Couce said: "The use of geoengineering technologies cannot safeguard coral habitat long term, because ocean acidification will continue unabated. Decreasing the amount of carbon dioxide in the atmosphere is the only way to address reef decline caused by ocean acidification."
Dr Erica Hendy, one of the co-authors, added: "This is the first attempt to model the consequences of using solar radiation geoengineering on a marine ecosystem. There are many dangers associated with deliberate human interventions in the climate system and a lot more work is needed to fully appreciate the consequences of intervening in this way."
Solar engineers from the University of New South Wales (UNSW) have developed a new method to dramatically improve the quality of low-grade silicon, boost electrical efficiency and reduce the cost of solar panels.
The UNSW team has discovered a mechanism to control hydrogen atoms so they can better correct deficiencies in silicon – by far the most expensive component used in the making of solar cells.
"This process will allow lower-quality silicon to outperform solar cells made from better-quality materials," says Scientia Professor Stuart Wenham from the School of Photovoltaics and Renewable Energy Engineering.
Standard commercial silicon cells currently have a maximum efficiency of around 19%. The new technique, patented by UNSW researchers earlier this year, is expected to produce efficiencies between 21% and 23%, says Wenham.
"By using lower-quality silicon to achieve higher efficiencies, we can enable significant cost reductions," he says.
The solar industry has long been focused on bringing down the cost of silicon. However, cheaper silicon also means lower-quality silicon, with more defects and contaminants that reduce efficiency.
It's been known for several decades that hydrogen atoms can be introduced into the atomic structure of silicon to help correct these defects, but until now researchers have had limited success in controlling the hydrogen to maximise its benefits, or even in understanding why the technique works.
Atomic structure of silicon.
"Our research team has worked out how to control the charge state of hydrogen atoms in silicon – something that other people haven't previously been able to do," says Wenham.
Hydrogen atoms can exist in three 'charge' states – positive, neutral and negative. The charge state determines how well the hydrogen can move around the silicon and its reactivity, which is important to help correct the defects.
"We have seen a 10,000 times improvement in the mobility of the hydrogen and we can control the hydrogen so it chemically bonds to things like defects and contaminants, making these inactive," says Wenham.
The UNSW team currently has eight industry partners interested in commercialising the technology, and is also working with manufacturing equipment companies to implement the new capabilities.
The project, which has been generously supported by the Australian Renewable Energy Agency, is expected to be completed in 2016.
UNSW still holds the world record for silicon cell efficiency, at 25%. Last week, Scientia Professor and solar pioneer Martin Green was elected a Fellow of the United Kingdom's prestigious Royal Society.
This week, atmospheric CO2 reached a worrying milestone: 400 parts per million, a level not seen in more than five million years. The last time Earth had this concentration of greenhouse gases, average sea levels were 25 m higher than today, and steaming jungles covered northern Canada.
Before the Industrial Revolution, global average CO2 was about 280 parts per million (ppm). During the last 800,000 years, CO2 fluctuated between 180 ppm during ice ages and 280 ppm during interglacial warm periods. Today's rate of increase is over 100 times faster than the increase that occurred when the last ice age ended. Scientists warn that a "safe" limit for CO2 concentration is 350 ppm. On current trends, we are heading for somewhere between 900 and 1,000 ppm by the end of this century, with a global average temperature rise of 6°C (11°F). In other words, the end of modern civilisation.
Once emitted, CO2 added to the atmosphere and oceans remains there for thousands of years. Climate changes forced by CO2 therefore depend primarily on cumulative emissions, making it progressively more difficult to avoid further substantial climate change. Fully 99.8 percent of peer-reviewed, published climate studies agree that global warming is real. Any "slowdown" of atmospheric and land warming in recent years is vastly outweighed by rising ocean heat content, a fact often ignored by the mainstream media and blogosphere.
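The arithmetic behind the end-of-century scenario is straightforward to check. The concentrations are those quoted above; the average-rate calculation is our own sketch:

```python
# Average annual CO2 rise implied by the article's end-of-century scenario:
# from 400 ppm now (2013) to 900-1,000 ppm by 2100.
ppm_now, year_now = 400, 2013
ppm_low, ppm_high, year_end = 900, 1_000, 2100

years = year_end - year_now
rate_low = (ppm_low - ppm_now) / years    # ppm per year
rate_high = (ppm_high - ppm_now) / years  # ppm per year
print(f"Implied average rise: {rate_low:.1f}-{rate_high:.1f} ppm per year")
```

That works out to roughly 6-7 ppm per year on average, around three times the current rate of about 2 ppm per year, reflecting the accelerating emissions the scenario assumes.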
In a letter to Barack Obama, 150 high-profile Democrats have urged the president to use his next four years to take meaningful action on climate change. This includes blocking the controversial Keystone Pipeline, which would be equivalent to 51 new coal-fired power plants if allowed to go ahead. The letter, reproduced in full below, follows in the wake of a similar message from businesses last month.
President Barack Obama
The White House
1600 Pennsylvania Ave NW
Washington, DC 20500
Dear President Obama,
As business leaders, philanthropists, and supporters of your 2008 and 2012 campaigns, we write to urge you to reject the Keystone XL tar sands pipeline and to do everything in your power to accelerate the transition away from fossil fuels and to clean energy sources.
We have read of your admiration for President Lincoln, surely the most beloved of all presidents. He made one of the most important decisions of his presidency and for our nation when he decided that he would fight for the 13th Amendment to end slavery even if it took every ounce of his political capital. Your decision on Keystone may not be so weighty, but we believe it holds a comparable urgency and importance, not strictly as a pipeline decision but as a presidential choice that will signal a fundamentally new direction for our nation.
We urge you to proclaim with clarity and purpose that our nation will transition away from carbon-based fossil fuels to job-creating clean energy. As challenging as this may be, the costs pale in comparison to the human consequences of unchecked climate disruption. We must help impacted communities and industries. We cannot make these changes overnight, but we must make them. Yours is the last presidency in which it is possible for America to choose a responsible path forward for itself, before climate disruption becomes unmanageably dangerous. "Winning" a safe climate future is a long game, but we can lose it very quickly - on your watch. As the IEA starkly warned, continued investment in capital intensive, long-lived fossil fuel infrastructure like Keystone XL will "lock in" emission trajectories that make catastrophic climate disruption inevitable.
The Keystone decision affords you a rare opportunity to pivot away from fossil fuels and towards a clean energy future in a way that signals the necessary sea change. The controversy associated with the decision is commensurate with its historic significance. Of course, no single decision is technically decisive with respect to climate disruption. But those who dismiss the Keystone decision as "merely symbolic" underestimate both its substantive importance and its place in history and your presidency.
This decision more than any other will signal your direction, your commitment, your resolve. It is the biggest, most explicit statement you will make in this historic moment, the moment when America turns from denial to solutions - or fails to.
Under trying circumstances and against entrenched opposition, you have led America toward a clean energy future by improving fuel efficiency standards, extending clean energy production tax credits, and asserting EPA authority to regulate coal-fired power plants. Your call to action on climate change in your State of the Union and Inaugural addresses inspired us. We thank you for this leadership, and urge you to push now, beyond what official Washington deems possible, toward what we know is necessary.
We pledge to support you in every way possible as you help our nation "respond to the threat of climate change, knowing that failure to do so would betray our children and future generations." We believe in the power and promise of clean energy. We believe it's time to look our kids and grandkids - the prospective victims of still-preventable climate disasters - in the eye and say, "We will do what must be done to protect you. We will make this better." But they won't believe us until we stop making it worse. That's why we urge you in the strongest possible terms to reject the Keystone XL tar sands pipeline.
With hope and determination to build a healthy future, and the deepest respect for your leadership,
In the last few weeks, two large animals have been officially declared extinct – the Formosan clouded leopard and the rhinoceros in Mozambique.
Credit: Georgios Kollidas
The Formosan clouded leopard was the second largest carnivore in Taiwan, after the Formosan black bear. A team of local and US zoologists had been trying for 13 years to find the species, using thousands of infrared cameras and scent traps. The last known evidence of these animals came in the 1990s, in the form of pugmarks located near Yushan National Park. Despite an extensive search, none have been found since then. As with many extinctions, the likely cause of their demise is poaching and destruction of natural habitat due to development projects. The only Formosan clouded leopard remaining in Taiwan is now a stuffed specimen at the National Taiwan Museum.
Another large animal – the rhinoceros – has disappeared from Mozambique, according to both a leading rhino expert and the warden in charge of Great Limpopo Transfrontier Park. Wiped out more than a century ago by hunters, they were reintroduced several years ago, but have again been driven to extinction by poachers seeking their horns for sale in Asia. Somewhat ironically, the picture above depicts a rhino on Mozambique's national currency.
Other notable extinctions in recent years include the following:
The Alaotra Grebe
A freshwater diving bird, once endemic to Lake Alaotra and surrounding lakes in Madagascar. The species declined over the course of the 20th century, mainly because of habitat destruction, entanglement with monofilament gillnets and predation by the introduced snakehead murrel fish. This was the 162nd bird extinction since 1600 AD.
Image credit: L. Shyamal
The Eastern Cougar
Also known as "ghost cats", these animals were decimated by European settlers arriving in the eastern United States during the 1700s and 1800s. The last confirmed Eastern cougar was trapped in the late 1930s, and the species was officially declared extinct in 2011. Many biologists consider it to have been a subspecies of the North American cougar.
Image credit: US Fish & Wildlife Service
The Western Black Rhinoceros
One of the four subspecies of black rhino. In 2006, the World Conservation Union (IUCN) announced that it was tentatively being declared extinct, but efforts to locate surviving individuals continued. The last western black rhino is believed to have been killed in 2011. The remaining three subspecies are critically endangered.
Image credit: US Fish and Wildlife Service
The Japanese River Otter
Once widespread across Japan, the otter's population suddenly collapsed in the 1930s and the mammal nearly vanished. It was spotted only a handful of times thereafter, with the last official sighting in 1979, and was subsequently classified as "Critically Endangered" on the Japanese Red List. In August 2012, it was officially declared extinct by the Japanese Ministry of the Environment.
Image credit: Hiroshi Kibe
Pinta Island Tortoise
The Pinta Island tortoise was a subspecies of Galápagos tortoise. By 1900, most had been wiped out by hunting, and by the mid-20th century the subspecies was assumed to be extinct, until a single male was discovered on the island in 1971. Efforts were made to mate the male, named Lonesome George, with females of other subspecies, but no viable eggs were produced. Lonesome George died on 24th June 2012, and the subspecies was presumed extinct. There is hope, however: 17 first-generation hybrids were recently found on Isabela Island, and genetic analysis showed that each had a parent genetically similar to Lonesome George. Since these specimens are juveniles, their parents may still be alive.
Image credit: Mike Weston
In a world increasingly dominated by human industrial activity, many more species of both animal and plant life will go extinct in the coming decades. If present trends continue, it is estimated that rhinos could disappear completely by 2025 – not just in Mozambique, but worldwide. Elephants are under severe threat too, with industrial-scale poaching reducing their numbers by 40,000 each year. If nothing is done, the world's biggest land animal could vanish from the wild by 2024, a prospect that seems almost unthinkable, yet is fast becoming a reality. Prices for ivory and rhino horn have soared in recent years, a situation made worse by the corruption of wildlife rangers, who are offered money by criminal poaching syndicates.
Hunting and poaching pale into insignificance, however, when compared with a far greater problem: climate change. Some scientists estimate that up to half of presently existing species may become extinct by 2100. Already, the rate of species extinctions is between 100 and 1,000 times the normal "background" rate seen in the fossil record, and this could increase tenfold by the mid-21st century. We face the prospect of a genuine mass extinction, something that has happened on only five previous occasions in the whole of Earth's 3.5-billion-year evolutionary history.
At the end of a long day, it can be more convenient to order your groceries online while sitting on the living room couch instead of making a late-night run to the store. New research shows it's also much more environmentally friendly to leave the car parked and opt for groceries delivered to your doorstep.
University of Washington engineers have found that using a grocery delivery service can cut carbon dioxide emissions by at least half when compared with individual household trips to the store. Trucks filled to capacity that deliver to customers clustered in neighbourhoods produced the most savings in carbon dioxide emissions.
"A lot of times, people think they have to inconvenience themselves to be greener, and that actually isn't the case here," said Anne Goodchild, UW associate professor of civil and environmental engineering. "From an environmental perspective, grocery delivery services overwhelmingly can provide emissions reductions."
Consumers have an increasing number of grocery delivery services to choose from. AmazonFresh operates in the Seattle area, while Safeway's service is offered in many U.S. cities. FreshDirect delivers to residences and offices in the New York City area. Last month, Google unveiled an experimental shopping delivery service in the San Francisco Bay Area, and UW alumni recently launched the grocery service Geniusdelivery in Seattle.
As companies continue to weigh the costs and benefits of offering a delivery service, Goodchild and Erica Wygonik, a UW doctoral candidate in civil and environmental engineering, looked at whether using a grocery delivery service was better for the environment, with Seattle as a test case. In their analysis, they found delivery service trucks produced 20 to 75 percent less carbon dioxide than the corresponding personal vehicles driven to and from a grocery store.
They also discovered significant savings for companies – 80 to 90 percent less carbon dioxide emitted – if they delivered based on routes that clustered customers together, instead of catering to individual household requests for specific delivery times.
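The intuition is simple: one truck tracing a single route drives far less than every household making its own round trip. The toy example below is our own illustration with made-up coordinates, not the UW model or its data:

```python
import math

# Toy comparison: every household driving to the store vs. one truck
# visiting clustered households on a single route.
store = (0.0, 0.0)
homes = [(2.0, 1.0), (2.5, 1.5), (3.0, 1.0), (2.0, 2.0), (3.0, 2.0)]

def dist(a, b):
    """Straight-line distance between two points, in km."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

# Scenario 1: each household drives to the store and back.
individual_km = sum(2 * dist(store, h) for h in homes)

# Scenario 2: one truck visits every household in turn, then returns.
route = [store] + homes + [store]
truck_km = sum(dist(a, b) for a, b in zip(route, route[1:]))

print(f"Individual trips: {individual_km:.1f} km")
print(f"Clustered truck route: {truck_km:.1f} km")
```

Even this naive visiting order cuts total distance by roughly two-thirds in the example; the UW study's figures came from detailed EPA emissions modelling rather than straight-line distances, but the underlying effect is the same.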
Credit: Goodchild/Wygonik, UW
"What's good for the bottom line of the delivery service provider is generally going to be good for the environment, because fuel is such a big contributor to operating costs and greenhouse gas emissions," Wygonik said. "Saving fuel saves money, which also saves on emissions."
The researchers used an EPA modelling tool that calculated emissions in much finer detail than previous studies. Their work was funded by the Oregon Department of Transportation and published in the Journal of the Transportation Research Forum.
Design approval has been given for a crucial reactor component of the ITER nuclear fusion project, which is currently under construction in France and expected to begin generating power in 2022.
The ITER blanket system, a crucial technology on the way to fusion power, will now proceed to the manufacturing stage. "The development and validation of the final design of the ITER blanket and first wall technology is a major achievement on our way to deuterium-tritium operation — the main goal of the ITER project," said Rene Raffray, in charge of the blanket for the ITER Organisation. "We are looking at a first-of-a-kind fusion blanket which will operate in a first-of-a-kind fusion experimental reactor."
The ITER blanket system (illustrated above) provides the physical boundary for the plasma and contributes to the thermal and nuclear shielding of the vacuum vessel and the external machine components such as the superconducting magnets, which operate at around 4 kelvin (-269°C). Directly facing the ultra-hot plasma and having to cope with large electromagnetic forces, while interacting with major systems and other components, the blanket is arguably the most critical and technically challenging component in ITER.
Due to the high heat deposition expected during plasma operation (the blanket is designed to take a maximum thermal load of 736 MW), ITER will be the first fusion device with an actively cooled blanket. The cooling water is fed to and from the shield blocks — of which there are 440, with each module weighing up to 4.5 tons — through manifolds and branch pipes. Furthermore, the modules have to provide passage for the multiple plasma diagnostic technologies, for viewing systems, and for the plasma heating systems.
The procurement of the shield blocks is equally shared between China and Korea. The first wall panels will be manufactured by Europe (50%), Russia (40%) and China (10%). Russia will, in addition, provide the flexible supports, the key pads and the electrical straps.
The assembly of the blanket system will be among the final stages of the ITER project. Completion had originally been scheduled for 2019, but delays with the construction and commissioning phases have pushed this back to 2022.
98 percent of the new lamp's energy goes to lighting the street instead of the night sky.
Streetlights illuminate the night, shining upon roadways and sidewalks across the world, but these ubiquitous elements of the urban environment are notoriously inefficient and major contributors to light pollution that washes out the night sky. Recent innovations in light emitting diodes (LEDs) have improved the energy efficiency of streetlights, but, until now, their glow still wastefully radiated beyond the intended area. A team of researchers from Taiwan and Mexico has developed a new lighting system design that harnesses high-efficiency LEDs and ensures they shine only where they’re needed, sparing surrounding homes and the evening sky from unwanted illumination. The team report their findings in the Optical Society's journal Optics Express.
A unique feature of the new LED system is its adaptability to different street lamp layouts, “to all kinds of streets and roads, providing a uniform illumination with high energy efficiency,” says co-author Ching-Cherng Sun of National Central University in Taiwan. For example, some modern lamps that line a thoroughfare or suburban sidewalk lean into the middle of the road, lighting the street from above. But more often, lamps are posted to one side of a street, or alternate in a “zig-zag” pattern from one side to the other – a layout that may be more efficient for roads with high traffic flow. The new design is flexible enough to meet these different illumination requirements while maintaining high efficiency, Sun says.
The proposed lamp is based on a novel three-part lighting fixture. The first part contains a cluster of LEDs, each of which is fitted with a special lens, called a Total Internal Reflection (TIR) lens, that focuses the light so the rays are parallel to one another instead of intersecting—a process called collimation. These lens-covered LEDs are mounted inside a reflecting cavity, which “recycles” the light and ensures that as much of it as possible is used to illuminate the target. Finally, as the light leaves the lamp it passes through a diffuser or filter that cuts down on unwanted glare. The combination of collimation and filtering also allows researchers to control the beam’s shape: the present design yields a rectangular light pattern ideally suited for street lighting, the researchers say.
The team tested their design’s performance by analysing how little the beam would spread as it hit its target – a road or sidewalk 10 metres or more away from the light source. They quantified the lamp’s performance using the optical utilisation factor (OUF), the ratio of the luminous flux reaching the target to the luminous flux emitted directly by the LEDs; a higher OUF indicates better performance. Simulations show that the new design achieves an OUF of up to 81 percent, greatly outperforming a recent “excellent” design that reached 45 percent. The proposed streetlamp also meets high expectations for power and brightness. Light pollution is significantly reduced: conventional street lamps direct up to a fifth of their total energy horizontally or upward into the sky, and the best LED streetlamps reduce this to a tenth. In the new model, just 2 percent of the lamp’s total energy would contribute to light pollution.
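The OUF metric described above reduces to a single ratio. The sketch below uses illustrative flux figures (not numbers from the paper) to show how the 81 percent result would be computed:

```python
def optical_utilisation_factor(flux_at_target_lm: float,
                               flux_from_leds_lm: float) -> float:
    """Ratio of the luminous flux reaching the target area to the
    luminous flux emitted by the LEDs; higher means less wasted light."""
    return flux_at_target_lm / flux_from_leds_lm

# Illustrative numbers only: a lamp whose LEDs emit 10,000 lumens
# and which delivers 8,100 lumens onto the road surface
ouf = optical_utilisation_factor(8100, 10000)
print(f"OUF = {ouf:.0%}")  # OUF = 81%, matching the simulated figure quoted above
```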
In addition to cutting light pollution and glare, the new model could also save energy. “A general LED street light could reduce power consumption by 40 to 60 percent,” Sun says; the increased efficiency of the proposed design would likely save an additional 10 to 50 percent. Furthermore, he adds, the module would be simple to fabricate, since it comprises just four parts, including a type of LED bulb commonly used in the lighting industry.
Sun’s team expects to finish a prototype of their design in the next 3 to 6 months, and to begin practical installations of the new street lamp as early as next year. According to a recent report by the Energy Saving Trust, LED lamps will dominate the commercial and domestic lighting markets in 2015. Another report, by Navigant Research, forecasts that worldwide unit shipments of LED lamps will grow from 68 million in 2013 to 1.28 billion annually by 2021. The markets for every other lighting technology will contract over that period.
Schematic of the new street lamp. Credit: Optics Express
Researchers from the U.S. Department of Energy’s (DOE) SLAC National Accelerator Laboratory and Stanford University have designed a low-cost, long-life battery that could enable solar and wind energy to become major suppliers to the electrical grid.
Credit: Matt Beardsley/SLAC
"For solar and wind power to be used in a significant way, we need a battery made of economical materials that are easy to scale and still efficient," said Yi Cui, associate professor at Stanford. "We believe our new battery may be the best yet designed to regulate the natural fluctuations of these alternative energies."
Currently, the electrical grid cannot tolerate large and sudden power fluctuations caused by wide swings in sunlight and wind. As solar and wind's combined contributions to an electrical grid approach 20 percent, energy storage systems must be available to smooth out the peaks and valleys of this "intermittent" power – storing excess energy and discharging when input drops.
Among the most promising batteries for intermittent grid storage today are "flow" batteries, because it's relatively simple to scale their tanks, pumps and pipes to the sizes needed to handle large capacities of energy. The new flow battery developed by Cui's group has a simplified, less expensive design that presents a potentially viable solution for large-scale production.
Today's flow batteries pump two different liquids through an interaction chamber where dissolved molecules undergo chemical reactions that store or give up energy. The chamber contains a membrane that only allows ions not involved in reactions to pass between the liquids while keeping the active ions physically separated. This battery design has two major drawbacks: the high cost of liquids containing rare materials such as vanadium – especially in the huge quantities needed for grid storage – and the membrane, which is also very expensive and requires frequent maintenance.
The new Stanford/SLAC battery design uses only one stream of molecules and does not need a membrane at all. Its molecules mostly consist of the relatively inexpensive elements lithium and sulfur, which interact with a piece of lithium metal coated with a barrier that permits electrons to pass without degrading the metal. When discharging, the molecules, called lithium polysulfides, absorb lithium ions; when charging, they lose them back into the liquid. The entire molecular stream is dissolved in an organic solvent, which doesn't have the corrosion issues of water-based flow batteries.
"In initial lab tests, the new battery also retained excellent energy-storage performance through more than 2,000 charges and discharges, equivalent to more than 5.5 years of daily cycles," Cui said.
To demonstrate their concept, the researchers created a miniature system using simple glassware. Adding a lithium polysulfide solution to the flask immediately produces electricity that lights an LED. A utility version of the new battery would be scaled up to store many megawatt-hours of energy.
In the future, Cui's group plans to make a laboratory-scale system to optimize its energy storage process and identify potential engineering issues, and to start discussions with potential hosts for a full-scale field-demonstration unit.
For the first time, solar energy accounted for all new utility electricity generation capacity added to the U.S. grid last month, according to the Federal Energy Regulatory Commission’s (FERC’s) latest report.
More than 44 megawatts (MW) of solar electric capacity was brought online from seven projects in California, Nevada, Arizona, New Jersey, Hawaii, and North Carolina. All other energy sources combined added no new generation.
Solar also had a strong showing in FERC’s quarterly generation numbers, accounting for about 30 percent of all new utility-scale capacity. The report focuses exclusively on larger facilities and does not include energy generated by net-metered installations. Net-metered systems account for more than half of all U.S. solar electric capacity.
“This speaks to the extraordinary strides we have made in the past several years to bring down costs and ramp up deployment,” said Rhone Resch, president and CEO of the Solar Energy Industries Association. “Since 2008, the amount of solar powering U.S. homes, businesses and military bases has grown by more than 600 percent — from 1,100 megawatts to more than 7,700 megawatts today. As FERC’s report suggests, and many analysts predict, solar will grow to be our nation’s largest new source of energy over the next four years.”
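Resch's growth figure is straightforward percentage growth over the 2008 baseline, which a one-line check confirms:

```python
mw_2008, mw_now = 1_100, 7_700   # megawatts of installed U.S. solar capacity
growth_pct = (mw_now - mw_2008) / mw_2008 * 100
print(f"{growth_pct:.0f}%")      # 600%, i.e. a sevenfold increase since 2008
```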
FERC’s report supports other findings which show solar power to be one of the fastest growing energy sources in the U.S., powering homes, businesses and utility grids across the nation. The Solar Market Insight annual edition shows the U.S. installed 3,313 megawatts (MW) of solar photovoltaics (PV) in 2012, a record for the industry.
Some of this growth is attributed to the fact that the cost of a solar system has dropped by nearly 40 percent over the past two years, making solar more affordable than ever for utilities and consumers.
“In 2012, the U.S. brought more new solar capacity online than in the three prior years combined,” Resch added. “These new numbers from FERC support our forecast that solar will continue a pattern of growth in 2013, adding 5.2 GW of solar electric capacity. This sustained growth is enabling the solar industry to create thousands of good jobs and to provide clean, affordable energy for more families, businesses, utilities, and the military than ever before.”
Today, America’s solar industry employs 119,000 workers throughout the country – a 13.2 percent increase over 2011’s jobs numbers, making solar one of the fastest-growing job sectors in the nation. Solar power is forecast to reach grid parity in almost 10 percent of the U.S. by 2022.
Researchers at the University of Illinois have developed a new type of battery that could revolutionise the way consumer electronics and electric vehicles are powered.
Led by William King, the Bliss Professor of mechanical science and engineering, the researchers published their results in Nature Communications. They describe a new class of "microbatteries" which owe their high performance to an internal three-dimensional microstructure.
"The thinking parts of computers have gotten small," said King. "And the battery has lagged far behind. This is a microtechnology that could change all of that. Now, the power source is as high-performance as the rest of it."
Batteries have two key components: the anode (minus side) and cathode (plus side). Building on a novel fast-charging cathode design by materials science and engineering professor Paul Braun's group, King and his colleague James Pikul developed a matching anode, then developed a new way to integrate the two components at the microscale to make a complete battery with superior performance.
"Our key insight," they report, "is that the battery micro-architecture can concurrently optimize ion and electron transport for high-power delivery, realized here as three-dimensional bi-continuous interdigitated microelectrodes. The battery microarchitecture affords trade-offs between power and energy density, resulting in a high-performance power source which is scalable to larger areas."
With so much raw power, the batteries could enable sensors that broadcast 30 times farther, or devices 30 times smaller. The batteries are rechargeable and can charge 1,000 times faster than competing technologies, potentially allowing a smartphone to be replenished in a matter of seconds. As well as consumer electronics, a vast range of other applications could benefit – from tiny medical devices, up to large objects like electric vehicles.
The team is now working on integrating their batteries with other components and will begin trials on electronic equipment before the end of the year. Safety issues will also need to be resolved, as well as manufacturability at low cost. However, this appears to be a very promising development.
How will the revolutionary technology of 3D printing help us rise to the future challenge of peak oil? In his latest video, futurist Christopher Barnatt explains. For more information on 3D printing, see explainingthefuture.com.
The BIQ House took around three years to build, with design and construction costs of €5 million ($6.5 million). It features "bio-reactors" in the facade which contain microalgae. These live in a water solution, with nutrients and carbon dioxide provided by an automated system. Each of the 129 tanks can be rotated towards the Sun, generating biomass that can either cool or heat the building, while serving as a renewable energy source.
Even if you don't believe in man-made climate change, this type of building makes sense from an economic viewpoint – given the finite supply of fossil fuels. Such "living" buildings could actually produce more resources than they consume, potentially easing the population crisis. They are expected to be commonplace by 2050.
Josef Hargrave, consultant in Arup's Foresight + Innovation team: "By producing food and energy, and providing clean air and water, buildings can evolve from being passive shells into adaptive and responsive organisms – living and breathing structures supporting the cities of tomorrow."
As President Obama unveils his budget for the coming year, dozens of major U.S. companies have signed a "Climate Declaration," urging federal policymakers to take action on climate change, asserting that a bold response to the climate challenge is one of the greatest economic opportunities of the 21st century.
Signatories of the Climate Declaration are among the country's best-known consumer brands – including Starbucks, Intel, eBay, Nike, Levi Strauss & Co, IKEA, Jones Lang LaSalle, L'Oréal, the North Face, the Portland Trail Blazers, Timberland and Unilever, among others (a full list of signatories is available at www.climatedeclaration.us).
Over the course of an ongoing campaign by Ceres and its BICEP (Business for Innovative Climate & Energy Policy) coalition, other businesses, as well as individuals, will be encouraged to sign the Declaration and join the call to action.
"The signers of the Climate Declaration have a clear message for Washington: Act on climate change. We are, and it's good for our businesses," said Anne Kelly, Director of BICEP. "The cost of inaction is too high. Policymakers should see climate change policy for what it is: an economic opportunity."
Together, the Declaration signatories provide approximately 475,000 U.S. jobs and generate a combined annual revenue of approximately $450 billion. Extreme weather events like Hurricane Sandy are affecting growing numbers of these companies and exposing the United States’ economic vulnerability to climate change.
"From droughts that affect cotton crops to Hurricane Sandy, which caused extensive damage to our operations, climate affects all aspects of our business," said Eileen Fisher, CEO of New York-based apparel firm Eileen Fisher, which suffered severe damage and business interruption during the 2012 storm. "As a socially and environmentally responsible company, we are trying to affect positive change, but business can't do it alone. We need the support of strong climate legislation."
The signatories of the Declaration are calling for Congress to address climate change by promoting clean energy, boosting efficiency and limiting carbon emissions – strategies that these businesses already employ within their own operations.
"Businesses understand that planning for a successful future takes investment today. One of the most important things Congress can do to grow our economy and protect our planet is to pass smart climate change legislation this year. Our workforce, supply chain and consumers are counting on us to lead the way," said Anna Walker, Director, Government Affairs and Public Policy at Levi Strauss & Co.
BICEP members have supported several climate-driven policies, including historic automotive fuel economy standards signed into law in 2012 and the extension of the Production Tax Credit for wind power. Innovation within the transportation, electric power and IT sectors, among others, will be essential to meeting the climate challenge.
"eBay is committed to driving a future for commerce that embraces clean energy innovation and is ultimately more sustainable," said Lori Duvall, Global Director, Green at eBay Inc. "Our efforts extend across our data, employee and distribution center portfolios, our shipping and logistics infrastructure, as well as the actions of buyers, sellers, and merchants on our platforms. We see our participation in this coalition as a key element in bringing to life our vision for enabling greener forms of commerce over the long term."
The Climate Declaration comes on the heels of Obama’s renewed commitment to combat the threat of climate change and a recent study from Ceres, Calvert Investments and WWF indicating that a strong majority of Fortune 100 companies have set renewable energy or greenhouse gas reduction goals. Recent polls conducted by Gallup and Yale University, respectively, indicate that a majority of Americans believe climate change is happening and that corporations, as well as government officials, should be doing more to address the issue.
Below is the trailer for an upcoming sci-fi movie, Elysium. It is directed by Neill Blomkamp, whose previous work includes the critically-acclaimed District 9. Set in the 22nd century, Elysium portrays two groups of people: a hyper-rich overclass who live in the utopian paradise of an orbiting space habitat; and the billions of others, who struggle in the crime-ridden, overpopulated and ruined environment of Earth. For more information, visit the official website.
A team of Virginia Tech researchers has discovered a way to extract large quantities of hydrogen from any plant, a breakthrough that has the potential to bring a low-cost, environmentally-friendly fuel source to the world.
"Our new process could help end our dependence on fossil fuels," said Y.H. Percival Zhang, an associate professor of biological systems engineering in the College of Agriculture and Life Sciences and the College of Engineering. "Hydrogen is one of the most important biofuels of the future."
Zhang and his team have succeeded in using xylose, the most abundant simple plant sugar, to produce a large quantity of hydrogen that previously was attainable only in theory. Zhang's method can be performed using any source of biomass. The discovery is a featured editor's choice in an online version of the chemistry journal Angewandte Chemie.
This new environmentally friendly method of producing hydrogen utilises renewable natural resources, releases almost zero greenhouse gases, and does not require costly or heavy metals. Previous methods to produce hydrogen were expensive and created greenhouse gases.
The U.S. Department of Energy says that hydrogen fuel has the potential to dramatically reduce reliance on fossil fuels, and automobile manufacturers are aggressively trying to develop vehicles that run on hydrogen fuel cells. Unlike gas-powered engines that spew out pollutants, the only by-product of hydrogen fuel is water. Zhang's discovery opens the door to an inexpensive, renewable source of hydrogen.
Jonathan Mielenz, group leader of the bioscience and technology group within the biosciences division at Oak Ridge National Laboratory, who is familiar with Zhang's work but not affiliated with this project, said this discovery has the potential to have a major impact on alternative energy production: "The key to this exciting development is that Zhang is using the second most prevalent sugar in plants to produce this hydrogen. This amounts to a significant additional benefit to hydrogen production and it reduces the overall cost of producing hydrogen from biomass."
Mielenz said Zhang's process could reach the marketplace in as little as three years if the technology is made available. Zhang said that when it becomes commercially available, it could have an enormous impact.
"The potential for profit and environmental benefits are why so many automobile, oil, and energy companies are working on hydrogen fuel cell vehicles as the transportation of the future," Zhang said. "Many people believe we will enter the hydrogen economy soon, with a market capacity of at least $1 trillion in the United States alone."
Obstacles to commercial production of hydrogen gas from biomass previously included the high cost of processes used and the relatively low quantity of the end product. But Zhang thinks he has the answers to those problems. For seven years, his team has been focused on finding non-traditional ways to produce high-yield hydrogen at low cost, specifically researching enzyme combinations, discovering novel enzymes, and engineering enzymes with desirable properties.
The team liberates high-purity hydrogen under mild reaction conditions at 50°C (122°F) and normal atmospheric pressure. The biocatalysts used to release the hydrogen are a group of enzymes artificially isolated from different microorganisms that thrive at extreme temperatures, some of which could grow at around the boiling point of water.
The researchers chose to use xylose, which comprises as much as 30 percent of plant cell walls. Despite its abundance, the use of xylose for releasing hydrogen has been limited. The natural or engineered microorganisms that most scientists use in their experiments cannot produce hydrogen in high yield because these microorganisms grow and reproduce instead of splitting water molecules to yield pure hydrogen.
To liberate the hydrogen, Virginia Tech scientists separated a number of enzymes from their native microorganisms to create a customised enzyme cocktail that does not occur in nature. When combined with xylose and a polyphosphate, the enzymes liberate an unprecedentedly high volume of hydrogen from xylose – about three times as much as is produced by other hydrogen-producing microorganisms.
The energy stored in xylose splits water molecules, yielding high-purity hydrogen that can be directly utilised by proton-exchange membrane fuel cells. Even more appealing, this reaction occurs at low temperatures, generating hydrogen energy that is greater than the chemical energy stored in xylose and the polyphosphate. This results in an energy efficiency of more than 100 percent – a net energy gain. That means that low-temperature waste heat can be used to produce high-quality chemical energy hydrogen for the first time. Other processes that convert sugar into biofuels such as ethanol and butanol always have energy efficiencies of less than 100 percent, resulting in an energy penalty.
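The "more than 100 percent" claim is an energy balance, not a perpetual-motion claim: the surplus comes from low-temperature heat absorbed by the reaction. A minimal sketch of the bookkeeping, using hypothetical unit energies (not figures from the paper):

```python
# Hypothetical unit energies, chosen only to illustrate the bookkeeping:
h2_energy_out  = 1.1   # chemical energy of the hydrogen produced
feed_energy_in = 1.0   # chemical energy of the xylose + polyphosphate consumed
heat_absorbed  = 0.1   # low-temperature waste heat drawn in by the reaction

efficiency = h2_energy_out / feed_energy_in
print(f"{efficiency:.0%}")  # 110%: the surplus equals the absorbed heat

# Energy is still conserved once the heat input is counted:
assert abs(h2_energy_out - (feed_energy_in + heat_absorbed)) < 1e-9
```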
In his previous research, Zhang used enzymes to produce hydrogen from starch, but the reaction required a food source that made the process too costly for mass production.
The commercial market for hydrogen gas is now around $100 billion for hydrogen produced from natural gas, which is expensive to manufacture and generates a large amount of the greenhouse gas carbon dioxide. Industry most often uses hydrogen to manufacture ammonia for fertilisers and to refine petrochemicals – but a cheap, plentiful source of green hydrogen could rapidly change that market.
"It really doesn't make sense to use non-renewable natural resources to produce hydrogen," Zhang said. "We think this discovery is a game-changer in the world of alternative energy."
The defence contractor, Lockheed Martin, has reported a new method for desalination that is vastly cheaper and more efficient, using nanotechnology.
Lockheed Martin has been awarded a patent for "Perforene" – a new molecular filtration system that is designed to meet the growing global demand for potable water. This material works by removing sodium, chlorine and other ions from seawater and other sources.
Dr. Ray Johnson, senior vice president and chief technology officer: "Access to clean drinking water is going to become more critical as the global population continues to grow, and we believe that this simple and affordable solution will be a game-changer for the industry. Perforene ... is just one example of Lockheed Martin's efforts to apply some of the advanced materials that we have developed for our core markets, including aircraft and spacecraft, to global environmental and economic challenges."
According to a UN report last year, over 780 million people around the world do not have access to clean drinking water. Tom Notaro, Lockheed business manager for advanced materials: "One of the areas that we're very concerned about in terms of global security is the access to clean and affordable drinking water. As more and more countries become more developed ... access to that water for their daily lives is becoming more and more critical."
Perforene was developed by placing holes one nanometre or less in diameter in a membrane of graphene. These are small enough to trap ions while dramatically improving the flow-through of water molecules, reducing clogging and pressure. Being just one atom thick, graphene is both strong and durable, making it far more effective at seawater desalination, at a fraction of the cost of traditional reverse osmosis systems.
John Stetson, senior engineer: "It's 500 times thinner than the best filter on the market today and 1,000 times stronger. The energy that's required and the pressure that's required to filter salt is approximately 100 times less."
In addition to desalination, the Perforene membrane can be tailored to other applications – including capturing minerals, through the selection of the size of hole placed in the material to filter or capture a specific size particle of interest. Lockheed Martin has also been developing processes that will allow the material to be produced at scale. The company is now seeking commercialisation partners.
A desalination plant in Dubai, United Arab Emirates
After three years of construction, a major milestone has been achieved for renewable energy in the Middle East, with the opening of a 100 megawatt (MW) solar power plant.
The Shams solar power station is located near Abu Dhabi, United Arab Emirates. With 258,000 parabolic trough mirrors, covering 2.5 sq km (0.97 sq mi), it generates up to 100 megawatts (MW) of power, making it the largest station of its kind in the world. It will offset 175,000 tons of CO2 per year – the equivalent of planting 1.5 million trees or taking 15,000 cars off the road – and its electrical output will be enough to power 20,000 homes.
The project is a collaboration between Abu Dhabi Future Energy Company (Masdar), Spain's Abengoa Solar and France's Total S.A. Masdar has a 60% stake, while Abengoa Solar and Total S.A. each have 20%. This newly completed first part, Shams 1, will be followed by two additional stations, Shams 2 and Shams 3, similar in size to the original.
The President of the UAE, Sheikh Khalifa bin Zayed Al Nahyan, expressed his pride in the inauguration of Shams 1: "Expanding our leadership into renewable sources of power demonstrates the United Arab Emirates' commitment to maintaining its position as a major provider of energy. The inauguration of Shams 1 is a major milestone in our country's economic diversification and a step toward long-term energy security."
Abu Dhabi, the nation's capital, has a goal of generating 7% of its power from renewables by 2020. This ambitious target will require 15 plants like Shams 1. Other countries in the region are undertaking similar plans. Saudi Arabia, for example, intends to install 41 gigawatts of solar energy by 2032. It is hoped that much of this energy, along with future additions, will be integrated into a continent-wide "super grid" by 2050.
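The "15 plants like Shams 1" claim implies a figure for Abu Dhabi's total generating capacity. The calculation below is an inference from the article's own numbers, not an official statistic:

```python
shams_mw = 100          # capacity of one Shams 1-sized plant
plants_needed = 15      # quoted number of such plants for the target
renewable_share = 0.07  # 7% renewables goal by 2020

renewable_mw = plants_needed * shams_mw              # 1,500 MW of solar
implied_total_mw = renewable_mw / renewable_share
print(f"{implied_total_mw:,.0f} MW total capacity implied")  # ~21,429 MW
```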
Santiago Seage, the CEO of Abengoa Solar: "The Middle East holds nearly half of the world's renewable energy potential. The abundance of solar energy is an opportunity to integrate sustainable, clean sources of power that address energy security and climate change. The region needs more projects like Shams 1, and we look forward to pushing the boundaries of future energy."
Geoengineering, the use of human technologies to alter Earth's climate system – such as injecting reflective particles into the upper atmosphere to scatter incoming sunlight back to space – has emerged as a potentially promising way to mitigate the impacts of climate change. But such efforts could present unforeseen new risks. That inherent tension, argue two professors from UCLA and Harvard, has thwarted both scientific advances and the development of an international framework for regulating and guiding geoengineering research.
In an article published yesterday in the journal Science, Edward Parson of UCLA and David Keith of Harvard University outline how the current deadlock on governance of geoengineering research poses real threats to the sound management of climate risk. Their article advances concrete and actionable proposals for allowing further research – but not deployment – and for creating scientific and legal guidance, as well as addressing public concerns.
"We're trying to avoid a policy train wreck," said Keith, professor of public policy at John F. Kennedy School of Government and Gordon McKay Professor of Applied Physics at the School of Engineering and Applied Sciences at Harvard University. "Informed policy judgments in the future require research now into geoengineering methods' efficacy and risks. If research remains blocked, in some stark future situation, only untested approaches will be available."
"Our proposals address the lack of international legal coordination that has contributed to the current deadlock," said Parson, a professor of law and faculty co-director of the Emmett Center on Climate Change and the Environment at the UCLA School of Law. "Coordinated international governance of research will both provide the guidance and confidence to allow needed, low-risk research to proceed and address legitimate public concerns about irresponsible interventions or a thoughtless slide into deployment."
Stratospheric Particle Injection for Climate Engineering (SPICE) is a UK government-funded geoengineering research project that aims to assess the feasibility of injecting particles into the stratosphere from a tethered balloon for the purposes of solar radiation management. Credit: Hugh Hunt
In their paper, the authors state that progress on research governance must advance four aims:
Allow low-risk, scientifically valuable research to proceed.
Give scientists guidance on the design of socially acceptable research.
Address legitimate public concerns.
End the current legal void that facilitates rogue projects.
Parson and Keith argue that scientific self-regulation is not sufficient to manage risks and that scientists need to accept government authority over geoengineering research. They emphasise that initial steps should not require new laws or treaties but can come from informal consultation and coordination among governments.
The authors also propose defining two thresholds for governance of geoengineering research: a large-scale threshold to be subject to a moratorium and a separate, much smaller threshold below which research would be allowed. Keith, for example, is currently developing an outdoor experiment to test the risks and efficacy of stratospheric aerosol geoengineering, which would fall below the proposed allowable threshold.
The authors emphasise that this article proposes only first steps. In the near term, these steps frame a social bargain that would allow research to proceed; in the long term, they begin to build international norms of cooperation and transparency in geoengineering.
An oceanic phytoplankton bloom in the South Atlantic Ocean, off the coast of Argentina. Encouraging such blooms with iron fertilisation could lock up carbon on the seabed. Credit: NASA
KLM, the flag carrier airline of the Netherlands, is to operate its first-ever series of biofuel-powered intercontinental flights.
KLM has formed a partnership with Schiphol Group, Delta Air Lines and the Port Authority of New York and New Jersey that will see weekly flights between John F. Kennedy Airport and Schiphol using sustainable biofuel. Flight KL642 is operated by a Boeing 777-200 every Thursday.
The fuel itself is made from recycled cooking oil, refined in Louisiana. It is supplied by SkyNRG, a company KLM founded in 2009 together with ARGOS (North Sea Petroleum) and Spring Associates. SkyNRG is now the world’s market leader for sustainable kerosene, supplying over 15 carriers worldwide, and is the operating partner in KLM’s BioFuel program.
Like all human activities involving combustion, most forms of aviation release carbon dioxide and other greenhouse gases into Earth's atmosphere, contributing to the acceleration of global warming and (in the case of CO2) ocean acidification. Rapid growth of air travel in recent years has produced a large increase in total pollution attributable to aviation. In the European Union, greenhouse gas emissions from aircraft soared by 87% between 1990 and 2006.
Biofuel is widely considered to be one of the primary means by which the industry can reduce its carbon footprint. After a multi-year technical review by aircraft makers, engine manufacturers and oil companies, biofuels were first approved for commercial use in July 2011. Since then, a number of airlines have begun experimenting with their use. KLM flew the world's first commercial biofuel flight – carrying 171 passengers from Amsterdam to Paris – and is now able to offer its first intercontinental service.
Camiel Eurlings, KLM managing director: "I am proud that KLM is once again demonstrating its leading role in developing sustainable biofuel. For eight years in a row, KLM, together with Air France, has been sector leader on the Dow Jones Sustainability Index. Alongside this biofuel series we are starting a study to further identify sustainability gains in fuel, weight and CO2 reduction throughout the entire flight process. We are striving to achieve the 'optimal flight' together with research institutes, suppliers, airports, and air traffic control. We are combining new and existing technology, processes, and efficiency initiatives to achieve this. Cooperation is a priority."
Scientists have expressed concerns about land-use changes in response to greater demand for crops needed in biofuels. The focus is now on second generation sustainable biofuels that do not compete with food. Another major issue is cost. Cooking oil-based fuel, like that used in KLM's new service, currently costs around $10 per gallon – roughly three times the price of regular jet fuel. However, it is hoped this can be reduced in the future. The International Air Transport Association (IATA) believes that a 6% share of sustainable, 2nd generation biofuels is achievable by 2020.
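The cost gap described above can be made concrete with a rough calculation. The fuel-burn figure below is an assumption chosen purely for illustration (airlines in practice also blend biofuel with conventional kerosene rather than flying on biofuel alone):

```python
# Back-of-envelope cost premium, using the prices quoted in the article.
biofuel_price = 10.0                 # $/gallon, cooking oil-based jet fuel
regular_price = biofuel_price / 3    # ~one third the price, per the article

gallons_per_flight = 25_000          # assumed burn for one long-haul 777 flight
premium = (biofuel_price - regular_price) * gallons_per_flight
print(f"Extra fuel cost per all-biofuel flight: ${premium:,.0f}")
```

Even under these rough assumptions, the premium runs well into six figures per flight, which illustrates why cost reduction is central to wider adoption.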
In addition to its use of biofuels, KLM is aiding research by the Delft University of Technology to develop a new aircraft that is 50% more efficient and 50% quieter. This could be ready to fly by 2025.
NASA scientists report that warmer temperatures and changes in precipitation locally and regionally have altered the growth of large forest areas in the eastern United States over the past 10 years.
Using NASA's Terra satellite, scientists examined the relationship between natural plant growth trends, as monitored by NASA satellite images, and variations in climate over the eastern United States from 2000 to 2010.
Trends in forest canopy green cover over the eastern United States from 2000 to 2010. Green shades indicate a positive trend of increasing growing season green cover, whereas brown shades indicate a negative trend of decreasing growing season green cover. Four forest sub-regions are outlined in red, north to south as: Great Lakes, Southern Appalachian, Mid-Atlantic, and southeastern Coastal Plain.
Image credit: NASA
Monthly satellite images from the MODerate resolution Imaging Spectroradiometer (MODIS) Enhanced Vegetation Index (EVI) showed declining density of the green forest cover during summer in four sub-regions: the Upper Great Lakes, southern Appalachian, mid-Atlantic, and southeastern Coastal Plain. More than 20 percent of the non-agricultural area in the four sub-regions that showed decline during the growing season was covered by forests. Nearly 40 percent of the forested area within the mid-Atlantic sub-region alone showed a significant decline in forest canopy cover.
"We looked next at the relationships between warmer temperatures, rainfall patterns, and reduced forest greenness across these regions," said Christopher Potter, a research scientist at NASA's Ames Research Center. "This comprehensive data gave us the evidence to conclude that a series of relatively dry years since 2000 has been unfavourable for vigorous growth of forest cover over much of the eastern U.S. this past decade." Potter is the first author of a paper titled "Declining Vegetation Growth Rates in the Eastern United States from 2000 to 2010," published in the journal Natural Resources.
In the past, scientists were uncertain about what was causing the changes in the forests in the eastern U.S. Based on small-scale field site measurements since 1970, forest growth was thought to be increasing in regions where soil nutrients and water were in good supply. At the same time, there were fewer wildfires throughout the eastern U.S., which scientists believe contributed to the transformation of more open lands into closed-canopy forests with more shade-tolerant, fire-sensitive plants.
More recent studies indicate that climate change could be having many adverse and interrelated impacts on the region. The warming climate this century has caused new stresses on trees, such as insect pest outbreaks and the introduction of new pathogens. Scientists consider both climate change and disease to be dominant driving forces in the health of forests in this region.
NASA's technology is revealing an entirely new picture of these complex impacts. The MODIS satellite captures very broad regional patterns of change in forests, wetlands, and grasslands by continuous monitoring of the natural plant cover over extended time periods. Now, with over a decade of "baseline" data to show how trees typically go through a yearly cycle of leaves blooming, summer growth, and leaves falling, scientists are detecting subtle deviations from the average cycle to provide early warning signs of change at the resolution of a few miles for the entire country.
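The baseline-and-deviation approach described above can be sketched in a few lines: build a monthly climatology from past years of vegetation-index values for a grid cell, then flag a new reading that departs strongly from it. This is an illustrative sketch with invented EVI numbers, not NASA's actual processing pipeline:

```python
# Sketch: flag a vegetation-index anomaly against a multi-year monthly baseline.
# The EVI values below are invented for illustration.
from statistics import mean, stdev

# Ten years of July EVI readings for one hypothetical grid cell
july_history = [0.62, 0.60, 0.63, 0.61, 0.59, 0.62, 0.60, 0.58, 0.61, 0.60]
baseline = mean(july_history)
spread = stdev(july_history)

new_reading = 0.52  # this July's value
z_score = (new_reading - baseline) / spread
if abs(z_score) > 2:
    print(f"Anomaly: EVI {new_reading} is {z_score:+.1f} sigma from baseline")
```

In practice the same comparison would be run per pixel and per month across the whole record, so that persistent negative deviations show up as the browning trends mapped in the study.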
"The next studies at NASA Ames will research areas that appear most affected by drought and warming to map out changes in forest growth at a resolution of several acres," said Potter.
Environmental concerns among citizens around the world have been falling since 2009 and have now reached twenty-year lows, according to a multi-country GlobeScan poll.
The findings are drawn from the GlobeScan Radar annual tracking poll of citizens across 22 countries. A total of 22,812 people were interviewed face-to-face or by telephone. Twelve of these countries have been regularly polled on environmental issues since 1992.
Asked how serious they consider each of six environmental problems to be — air pollution, water pollution, species loss, automobile emissions, fresh water shortages, and climate change — fewer people now consider them “very serious” than at any time since tracking began 20 years ago.
Climate change is the only exception, where concern was lower from 1998 to 2003 than it is now. Concern about air and water pollution, as well as biodiversity, is significantly below where it was even in the 1990s. Many of the sharpest falls have taken place in the past two years.
The perceived seriousness of climate change has fallen particularly sharply since the unsuccessful UN Climate Summit in Copenhagen in December 2009. Climate concern dropped first in industrialised countries, but this year’s figures show that concern has now fallen in major developing economies such as Brazil and China as well.
Despite the steep fall in environmental concern over the past three years, majorities still consider most of these environmental problems to be "very serious." Water pollution is viewed as the most serious environmental problem among those tested, rated by 58 percent as very serious. Climate change is rated second least serious out of the six, with under half (49%) viewing it as "very serious."
GlobeScan Chairman Doug Miller comments: "Scientists report that evidence of environmental damage is stronger than ever — but our data shows that economic crisis and a lack of political leadership mean that the public are starting to tune out. Those who care about mobilising public opinion on the environment need to find new messages in order to reinvigorate a stalled debate."
In experiments mimicking a natural environment, Duke University researchers have demonstrated that the silver nanoparticles used in many consumer products can have an adverse effect on plants and microorganisms.
Fifty days after scientists applied a single low dose of silver nanoparticles, the experimental environments produced about a third less biomass in some plants and microbes. These preliminary findings are important, the researchers said, because little is known about the environmental effects of such tiny particles, which range in size from 1 nm to 100 nm. They are found in textiles, clothing, children's toys and pacifiers, disinfectants and toothpaste.
"No one really knows what the effects of these particles are in the environment," said Benjamin Colman, a post-doctoral fellow in Duke's biology department and a member of the Center for the Environmental Implications of Nanotechnology (CEINT). "We're trying to come up with the data that can be used to help regulators determine the risks to the environment from silver nanoparticle exposures," Colman said. CEINT's research is funded by the National Science Foundation and the Environmental Protection Agency.
Previous studies have involved high concentrations of the nanoparticles in a laboratory setting, which, the researchers point out, doesn't represent "real-world" conditions.
"Results from laboratory studies are difficult to extrapolate to ecosystems, where exposures likely will be at low concentrations and there is a diversity of organisms," Colman said.
Silver nanoparticles are used in consumer products because they can kill bacteria, inhibiting unwanted odours. They work through a variety of mechanisms – including generating free radicals of oxygen, which can damage microbial DNA and cell membranes without harming human cells.
The main route by which these particles enter the environment is as a by-product of sewage treatment plants. The nanoparticles are too small to be filtered out, so they and other materials end up in the resulting wastewater treatment "sludge," which is then spread on the land surface as a fertilizer.
Credit: Benjamin Colman
For their studies, the researchers created mesocosms, which are small, man-made structures containing different plants and microorganisms meant to represent the environment. They applied sludge with low doses of silver nanoparticles in some of the mesocosms, then compared plants and microorganisms from treated and untreated mesocosms after 50 days. Their study appears in the journal PLOS ONE.
The researchers found that one of the plants they studied, a common annual grass known as Microstegium vimineum, had 32 percent less biomass in the mesocosms treated with nanoparticles. Microbes were also affected by the nanoparticles, Colman said. One enzyme associated with helping microbes deal with external stresses was 52 percent less active, while another enzyme that helps regulate processes within the cell was 27 percent less active. The overall biomass of the microbes was also 35 percent lower, he said.
"Our field studies show adverse responses of plants and microorganisms following a single low dose of silver nanoparticles applied via sewage biosolids," Colman said. "An estimated 60 percent of the average 5.6 million tons of biosolids produced each year is applied to the land for various reasons, and this practice represents an important and understudied route of exposure of natural ecosystems to engineered nanoparticles."
"Our results show that silver nanoparticles in the biosolids, added at concentrations that would be expected, caused ecosystem-level impacts," Colman said. "Specifically, the nanoparticles led to an increase in nitrous oxide fluxes, changes in microbial community composition, biomass, and extracellular enzyme activity, as well as species-specific effects on the above-ground vegetation."
The researchers plan to continue studying longer-term effects of silver nanoparticles and to examine another ubiquitous nanoparticle – titanium dioxide.
The rise of connected devices will drive mobile data revenues past voice revenues globally by 2018, according to a new report from the GSM Association (GSMA). This data explosion will provide better access to healthcare and education, help lift people out of poverty, fight hunger and reduce carbon emissions.
Mobile data is being driven by a surge in demand for connected devices and machine-to-machine (M2M) communications, as we accelerate towards a truly networked world. This is transforming the socioeconomic future of people in both developed and developing countries. The new GSMA report, produced in collaboration with PwC, reveals how innovative mobile connected products and services will revolutionise people's lives over the next five years:
In developed countries:
Mobile health could save $400 billion in healthcare costs in OECD countries
Connected cars could save one in nine lives through emergency calling services, providing quicker and more accurate location and response times
Mobile education can reduce student drop-outs by eight per cent
Smart metering can cut carbon emissions by 27 million tonnes – the equivalent of planting 1.2 billion trees
In developing countries:
Mobile health could save one million lives in sub-Saharan Africa
Automotive data will improve food transport and storage, helping feed more than 40 million people annually – equivalent to the entire population of Kenya
Mobile education can enable 180 million students to further their education
Smart cities with intelligent transport systems could reduce commute times by 35 per cent, giving commuters back a whole week each year
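The smart metering comparison above (27 million tonnes of CO2 as the equivalent of planting 1.2 billion trees) implies an absorption rate per tree. This is an illustrative calculation from the figures quoted, not from the GSMA report itself:

```python
# Back-of-envelope: CO2 absorption per tree implied by the smart metering
# comparison (27 million tonnes ~ 1.2 billion trees).
emissions_saved_t = 27e6   # tonnes of CO2
trees = 1.2e9

kg_per_tree = emissions_saved_t * 1000 / trees
print(f"Implied absorption: {kg_per_tree:.1f} kg of CO2 per tree")
```

The implied figure of 22.5 kg per tree is within the range commonly cited for a mature tree's annual CO2 uptake, which suggests the report's comparison is a per-year equivalence.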
Michael O'Hara, Chief Marketing Officer, GSMA: "Mobile data is not just a commodity, but is becoming the lifeblood of our daily lives, society and economy, with more and more connected people and things. This is an immense responsibility and the mobile industry needs to continue collaborating with governments and key industry sectors to deliver products and services that help people around the world improve their businesses and societies."
The increase in mobile operator data revenues is a global trend, across both developed and emerging markets. In 2012, Japan became the first country where data revenues exceeded voice revenues, due largely to the availability of advanced mobile broadband networks and a higher adoption of the latest smartphones, tablets and connected devices. This year, Argentina's data revenues will exceed voice revenues – attaining this milestone ahead of the US and UK, which will reach this point in 2014. Kenya will experience this shift in 2016, with global revenues following in 2018 as mobile broadband continues to thrive.