Vince Cable, UK Business Secretary, has announced measures that give the green light for driverless cars on UK roads from January 2015.
UK cities can now bid for a share of a £10 million (US$17m) competition to host a driverless cars trial. The government is calling on cities to join together with businesses and research organisations to put forward proposals to become a test location. Up to three cities will be selected to host the trials from next year, with each project expected to last between 18 and 36 months, starting in January 2015.
Ministers have also launched a review to look at current road regulations to establish how the UK can stay at the forefront of driverless car technology and ensure there is an appropriate regime for testing driverless cars in the UK. Two areas will be covered in the review: cars with a qualified driver who can take over control of the driverless car, and fully autonomous vehicles where there is no driver.
Speaking at MIRA – a vehicle engineering consultancy, test and research facility – where he tested a driverless car with Science Minister Greg Clark, Business Secretary Vince Cable said: "The excellence of our scientists and engineers has established the UK as a pioneer in the development of driverless vehicles through pilot projects. Today’s announcement will see driverless cars take to our streets in less than six months, putting us at the forefront of this transformational technology and opening up new opportunities for our economy and society.
"Through the government's industrial strategy, we are backing the automotive sector as it goes from strength to strength. We are providing the right environment to give businesses the confidence to invest and create high skilled jobs."
Britain joins a growing number of countries planning to use this technology. Elsewhere in Europe, cities in Belgium, France and Italy intend to operate transport systems for driverless cars. In the USA, four states have passed laws permitting autonomous cars: Nevada, Florida, California, and Michigan. FutureTimeline.net predicts annual purchases of autonomous vehicles will reach almost 100 million worldwide by 2035. The benefits could be enormous, with drastic reductions in accident fatalities, traffic congestion and pollution.
A new report from Stanford University warns that biodiversity is close to a tipping point that will lead to the next mass extinction.
Credit: Smith609, Wikipedia (CC BY-SA 3.0)
Earth's current biodiversity – the product of 3.5 billion years of evolutionary trial and error – is very high when looking at the long history of life. But it may be reaching a tipping point. In a review of scientific literature and data analysis published in Science, an international team of scientists cautions that the loss and decline of animals is contributing to what appears to be the early days of the planet's next mass extinction event.
Since 1500, over 320 terrestrial vertebrates have gone extinct. Populations of the remaining species show a 25 percent average decline in abundance. The situation is similarly dire for invertebrate animal life. And while previous extinctions have been driven by natural planetary transformations or catastrophic asteroid strikes, the current die-off can be attributed to human activity – a situation that lead author Rodolfo Dirzo, professor of biology at Stanford University, calls an era of "Anthropocene defaunation."
Across vertebrates, up to 33 percent of all species are estimated to be globally threatened or endangered. Large animals – described as megafauna and including elephants, rhinoceroses, polar bears and countless other species worldwide – face the highest rates of decline, a trend that matches previous extinction events. Larger animals tend to have lower population growth rates and produce fewer offspring. They need larger habitat areas to maintain viable populations. Their size and meat mass make them easier and more attractive hunting targets for humans.
Although these species represent a relatively low percentage of the animals at risk, their loss would have trickle-down effects that could shake the stability of other species and, in some cases, even human health. For instance, previous experiments conducted in Kenya have isolated patches of land from megafauna such as zebras, giraffes and elephants, and observed how an ecosystem reacts to the removal of its largest species. Rather quickly, these areas become overwhelmed with rodents. Grass and shrubs increase and the rate of soil compaction decreases. Seeds and shelter become more easily available, and the risk of predation drops. Consequently, the number of rodents doubles – and so does the abundance of the disease-carrying ectoparasites that they harbour. If rats dominate ecosystems, they could evolve to giant sizes in the future, according to recent research.
"Where human density is high, you get high rates of defaunation, high incidence of rodents, and thus high levels of pathogens, which increases the risks of disease transmission," says Dirzo. "Who would have thought that just defaunation would have all these dramatic consequences? But it can be a vicious circle."
The scientists also detailed a troubling trend in invertebrate defaunation. Human population has doubled in the past 40 years; during the same period, the number of invertebrate animals – such as beetles, butterflies, spiders and worms – has decreased by 45 percent. As with larger animals, the loss is driven primarily by loss of habitat and global climate disruption, and could have trickle-up effects in our everyday lives. For instance, insects pollinate roughly 75 percent of the world's food crops, which account for an estimated 10 percent of the economic value of the world's food supply. They also play critical roles in nutrient cycling and in decomposing organic materials, which helps to ensure ecosystem productivity.
Dirzo says that the solutions are complicated. Immediately reducing rates of habitat change and overexploitation would help, but these approaches need to be tailored to individual regions and situations. He said he hopes that raising awareness of the ongoing mass extinction – and not just of large, charismatic species – and its associated consequences will help spur change.
"We tend to think about extinction as loss of a species from the face of Earth, and that's very important, but there's a loss of critical ecosystem functioning in which animals play a central role that we need to pay attention to as well," he said. "Ironically, we have long considered that defaunation is a cryptic phenomenon, but I think we will end up with a situation that is non-cryptic because of the increasingly obvious consequences to the planet and to human wellbeing."
Researchers from the University of Bradford have devised a simple blood test that can be used to diagnose whether people have cancer or not.
The test will enable doctors to rule out cancer in patients presenting with certain symptoms – saving time, and preventing costly and unnecessary invasive procedures such as colonoscopies and biopsies being carried out. Alternatively, it could be a useful aid for investigating patients who are suspected of having a cancer that is currently hard to diagnose.
Early results have shown the method gives a high degree of accuracy in diagnosing cancer and pre-cancerous conditions from the blood of patients with melanoma, colon cancer and lung cancer. The Lymphocyte Genome Sensitivity (LGS) test looks at white blood cells and measures the damage caused to their DNA when subjected to different intensities of ultraviolet A (UVA) light, which is known to damage DNA. The results of the empirical study show a clear distinction between the damage to the white blood cells of patients with cancer, patients with pre-cancerous conditions, and healthy volunteers.
The research was led by Professor Diana Anderson, from the University’s School of Life Sciences, who says: “White blood cells are part of the body’s natural defence system. We know that they are under stress when they are fighting cancer or other diseases, so I wondered whether anything measurable could be seen if we put them under further stress with UVA light. We found that people with cancer have DNA which is more easily damaged by ultraviolet light than other people, so the test shows the sensitivity to damage of all the DNA – the genome – in a cell.”
The study looked at blood samples taken from 208 individuals. Ninety-four healthy individuals were recruited from staff and students at the University and 114 blood samples were collected from patients referred to specialist clinics within Bradford Royal Infirmary prior to their diagnosis and treatment. The samples were coded, anonymised, randomised and then exposed to UVA light through five different depths of agar.
UVA damage was observed in the form of DNA fragments being pulled in an electric field towards the positive end of the field, causing a comet-like tail. During the LGS test, the longer the tail the more DNA damage, and the measurements correlated to those patients who were ultimately diagnosed with cancer (58), those with pre-cancerous conditions (56) and those who were healthy (94).
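The grouping logic described above can be sketched in a few lines. This is a purely illustrative model, not the study's method: the tail-length values, units and group sizes below are hypothetical stand-ins, since the raw comet-assay data are not reproduced in the article.

```python
# Illustrative sketch of how comet-assay tail lengths might separate groups.
# All measurements here are hypothetical (arbitrary units); longer tail = more DNA damage.

def mean_tail_length(measurements):
    """Average comet tail length for one group of blood samples."""
    return sum(measurements) / len(measurements)

# Hypothetical tail lengths after UVA exposure, one list per diagnostic group.
healthy = [10, 12, 11, 9, 13]
precancer = [18, 21, 19, 22, 20]
cancer = [30, 28, 33, 31, 29]

for label, group in [("healthy", healthy), ("pre-cancer", precancer), ("cancer", cancer)]:
    print(f"{label}: mean tail length = {mean_tail_length(group):.1f}")
```

In the real study the separation between the three groups was assessed statistically across 208 coded, anonymised samples, not by simple group means as above.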
“These are early results completed on three different types of cancer and we accept that more research needs to be done; but these results so far are remarkable,” said Prof. Anderson. "Whilst the numbers of people we tested are, in epidemiological terms, quite small, in molecular epidemiological terms, the results are powerful. We’ve identified significant differences between the healthy volunteers, suspected cancer patients and confirmed cancer patients of mixed ages at a statistically significant level of P<0.001. This means that the possibility of these results happening by chance is 1 in 1000. We believe that this confirms the test’s potential as a diagnostic tool.”
Professor Anderson believes that if the LGS proves to be a useful cancer diagnostic test, it would be a highly valuable addition to the more traditional investigative procedures for detecting cancer. A clinical trial is currently underway at Bradford Royal Infirmary. This will investigate the effectiveness of the LGS test in correctly predicting which patients referred by their GPs with suspected colorectal cancer would, or would not, benefit from a colonoscopy – currently the preferred investigation method. The University of Bradford has filed patents for the technology and a spin-out company, Oncascan, has been established to commercialise the research.
Globally, June 2014 was the hottest June since records began in 1880. Experts predict that 2014 will be an El Niño year.
According to NOAA scientists, the globally averaged temperature over land and ocean surfaces for June 2014 was the highest for June since records began in 1880. This follows the hottest May on record the previous month. It also marked the 38th consecutive June and 352nd consecutive month with a global temperature above the 20th century average. The last below-average global temperature for June was in 1976 and the last below-average global temperature for any month was February 1985.
Most of the world experienced warmer-than-average monthly temperatures, with record warmth across part of southeastern Greenland, parts of northern South America, areas in eastern and central Africa, and sections of southern and southeastern Asia. Drought conditions in the southwest U.S. continued to worsen, with Lake Mead dropping to its lowest levels ever – triggering fears of major water shortages within the next several years. Australia saw nationally-averaged rainfall 32 percent below average and in Western Australia precipitation was 72 percent below average.
Ocean surface temperatures for June were 0.64°C (1.15°F) above the 20th century average of 16.4°C (61.5°F), the highest for June on record and the highest departure from average for any month. Notably, large parts of the western equatorial and northeast Pacific Ocean and most of the Indian Ocean were record warm or much warmer than average for the month. Although neither El Niño nor La Niña conditions were present across the central and eastern equatorial Pacific Ocean during June 2014, ocean waters in that region continued to trend above average. NOAA's Climate Prediction Centre estimates there is about a 70 percent chance that El Niño conditions will develop during Northern Hemisphere summer 2014 and 80 percent chance it will develop during the fall and winter.
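The paired figures above are internally consistent, and illustrate a detail that often trips people up: a temperature *anomaly* (a difference) converts from Celsius to Fahrenheit with the 9/5 factor alone, while an *absolute* reading also needs the +32 offset. A quick check, with hypothetical helper names:

```python
def anomaly_c_to_f(delta_c):
    """Convert a temperature anomaly (a difference) from °C to °F.
    Differences scale by 9/5 only; no +32 offset applies."""
    return delta_c * 9 / 5

def absolute_c_to_f(temp_c):
    """Convert an absolute temperature from °C to °F (scale plus offset)."""
    return temp_c * 9 / 5 + 32

# Reproduce the figures quoted in the text.
print(round(anomaly_c_to_f(0.64), 2))   # 1.15 °F anomaly
print(round(absolute_c_to_f(16.4), 1))  # 61.5 °F average
```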
With many of Earth's metals and minerals facing a supply crunch in the decades ahead, deep ocean mining could provide a way of unlocking major new resources. Amid growing commercial interest, the UN's International Seabed Authority has just issued seven exploration licences.
Credit: Nautilus Minerals Inc.
To build a fantastic utopian future of gleaming eco-cities, flying cars, robots and spaceships, we're going to need metal. A huge amount of it. Unfortunately, our planet is being mined at such a rapid pace that some of the most important elements face critical shortages in the coming decades – with projected depletion dates including antimony (2022), silver (2029) and lead (2031), among many others. To put the impact of our mining and other activities in perspective: on land, humans are now responsible for moving about ten times as much rock and earth as natural phenomena such as earthquakes, volcanoes and landslides. The UN predicts that on current trends, humanity's annual resource consumption will triple by 2050.
While substitution in the form of alternative metals could help, a longer term answer is needed. Asteroid mining could eventually provide an abundance from space – but a more immediate, technically viable and commercially attractive solution is likely to arise here on Earth. That's where deep sea mining comes in. Just as offshore oil and gas drilling was developed in response to fossil fuel scarcity on land, the same principle could be applied to unlock massive new metal reserves from the seabed. Oceans cover 72% of the Earth's surface, with vast unexplored areas that may hold a treasure trove of rare and precious ores. Further benefits would include:
• Curbing of China's monopoly on the industry. As of 2014, the country is sitting on nearly half the world's known reserves of rare earth metals and produces over 90% of the world's supply.
• Limited social disturbance. Seafloor production will not require the social dislocation, impact on local culture, or disturbance of traditional lands common to many land-based operations.
• Little production infrastructure. As the deposits are located on the seafloor, production will be limited to a floating ship with little need for additional land-based infrastructure. The concentration of minerals is an order of magnitude higher than in typical land-based deposits, with a correspondingly smaller footprint on the Earth's surface.
• Minimal overburden or stripping. The ore generally occurs directly on the seafloor and will not require large pre-strips or overburden removal.
• Improved worker safety. Operations will be mostly robotic and won't require human exposure to typically dangerous mining or "cutting face" activities. Only a hundred or so people will be employed on the production vessel, with a handful more in support logistics.
Credit: Nautilus Minerals Inc.
Interest in deep sea mining first emerged in the 1960s – but consistently low prices of mineral resources at the time halted any serious implementation. By the 2000s, the only resource being mined in bulk was diamonds, and even then, just a few hundred metres below the surface. In recent years, however, there has been renewed interest, due to a combination of rising demand and improvements in exploration technology.
The UN's International Seabed Authority (ISA) was set up to manage these operations and prevent them from descending into a free-for-all. Until 2011, only a handful of exploration permits had been issued – but since then, demand has surged. This week, seven new licences were issued to companies based in Brazil, Germany, India, Russia, Singapore and the UK. The number is expected to reach 26 by the end of 2014, covering a total area of seabed greater than 1.2 million sq km (463,000 sq mi).
Michael Lodge of the ISA told the BBC: "There's definitely growing interest. Most of the latest group are commercial companies so they're looking forward to exploitation in a reasonably short time – this move brings that closer."
So far, only licences for exploration have been issued, but full mining rights are likely to be granted over the next few years. The first commercial activity will take place off the coast of Papua New Guinea, where a Canadian company – Nautilus Minerals – plans to extract copper, gold and silver from hydrothermal vents. After 18 months of delays, this was approved outside the ISA system and is expected to commence in 2016. Nautilus has been developing Seafloor Production Tools (SPTs), the first of which was completed in April. This huge robotic machine is known as the Bulk Cutter and weighs 310 tonnes when fully assembled. The SPTs have been designed to work at depths of 1 mile (1.6 km), but operations as far down as 2.5 miles (4 km) should be possible eventually.
As with any mining activity, concerns have been raised from scientists and conservationists regarding the environmental impact of these plans, but the ISA says it will continue to demand high levels of environmental assessment from its applicants. Looking ahead, analysts believe that deep sea mining could be widespread in many parts of the world by 2040.
For the first time, researchers have demonstrated proof-of-concept that the HIV virus can be eliminated from the DNA of human cell cultures. Although years away from clinical application, this breakthrough has been described as an important step forward in the search for a cure.
The HIV-1 virus has proved to be tenacious – inserting its genome permanently into victims' DNA, forcing patients to take a lifelong drug regimen to control the virus and prevent a fresh attack. Now, a team of Temple University School of Medicine researchers has designed a way to "snip out" the integrated HIV-1 genes for good.
"This is one important step on the path toward a permanent cure for AIDS," says Kamel Khalili, PhD. He and colleague, Wenhui Hu, led the work which marks the first successful attempt to eliminate latent HIV-1 virus from human cells. "It's an exciting discovery – but it's not yet ready to go into the clinic. It's a proof-of-concept that we're moving in the right direction," added Dr. Khalili.
In a study published yesterday by the Proceedings of the National Academy of Sciences (PNAS), Dr. Khalili and colleagues detail how they created molecular tools to delete the HIV-1 proviral DNA. When deployed, a combination of a DNA-snipping enzyme called a nuclease and a targeting strand of RNA called a guide RNA (gRNA) hunts down the viral genome and excises the HIV-1 DNA. From there, the cell's own gene repair machinery takes over – soldering the loose ends of the genome back together – resulting in virus-free cells.
"Since HIV-1 is never cleared by the immune system, removal of the virus is required in order to cure the disease," said Khalili, whose work focuses on the neuropathogenesis of viral infections. The same technique could theoretically be used against a variety of viruses, he said. The research shows that these molecular tools also hold promise as a therapeutic vaccine; cells armed with the nuclease-RNA combination proved impervious to HIV infection.
Worldwide, over 35 million people have HIV, including more than 1 million in the United States. Every year, another 50,000 Americans contract the virus, according to the U.S. Centers for Disease Control and Prevention.
Although highly active antiretroviral therapy (HAART) has controlled HIV-1 for infected people in the developed world for the last 15 years, the virus can rage again with any interruption in treatment. Even when HIV-1 replication is well controlled with HAART, the lingering HIV-1 presence has longer-term health consequences. "The low level replication of HIV-1 makes patients more likely to suffer from diseases usually associated with aging," Khalili said. These include cardiomyopathy – a weakening of the heart muscle – bone disease, kidney disease, and neurocognitive disorders. "These problems are often exacerbated by the toxic drugs that must be taken to control the virus," he added.
His team based the two-part HIV-1 editor on a system that evolved as a bacterial defence mechanism to protect against infection, Khalili said. His lab engineered a 20-nucleotide strand of guide RNA to target the HIV-1 DNA and paired it with Cas9 (to induce strand breaks in DNA). The gRNA targets the control region of the gene called the long terminal repeat (LTR). LTRs are present on both ends of the HIV-1 genome. By targeting both LTRs, the Cas9 nuclease snips out the 9,709 nucleotides that comprise the HIV-1 genome. To avoid any risk of the gRNA accidentally binding with part of the patient's genome, the researchers selected nucleotide sequences that do not appear in any coding sequences of human DNA, thereby avoiding off-target effects and subsequent cellular DNA damage.
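The excision step described above – cutting at matching target sites in both LTRs and rejoining the flanking host DNA – can be sketched as a toy string operation. This is only a conceptual model: the sequences and the function below are made up for illustration, not real HIV-1 or human DNA, and the actual mechanism involves Cas9 cleavage followed by cellular DNA repair.

```python
# Toy model of the described excision: a provirus flanked by two identical
# LTR target sites is cut out of a host sequence and the ends rejoined.
# All sequences are hypothetical placeholders.

def excise_provirus(genome, ltr_site):
    """Remove everything from the start of the first LTR match through the
    end of the last LTR match, then rejoin the flanking host DNA."""
    first = genome.find(ltr_site)
    last = genome.rfind(ltr_site)
    if first == -1 or first == last:
        return genome  # fewer than two target sites: nothing to excise
    return genome[:first] + genome[last + len(ltr_site):]

ltr = "TTGACGT"                                    # stand-in for the gRNA-targeted LTR motif
host = "AAAA" + ltr + "CCCGGGCCC" + ltr + "TTTT"   # host DNA carrying an integrated provirus
print(excise_provirus(host, ltr))                  # AAAATTTT
```

In the toy example, the host sequence is left intact except for the excised proviral segment, mirroring the "virus-free cells" outcome the researchers describe.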
The editing process was successful in a number of cell types that can harbour HIV-1 – including microglia and macrophages, as well as in T-lymphocytes. "T-cells and monocytic cells are the main cell types infected by HIV-1, so they are the most important targets for this technology," Dr. Khalili said.
The HIV-1 eradication approach faces several significant challenges before the technique is ready for patients, Dr. Khalili said. Chief among them, the researchers must devise a method to deliver the therapeutic agent to every single infected cell. And because HIV-1 is prone to mutations, treatment may need to be individualised for each patient's unique viral sequences.
"We are working on a number of strategies so we can take the construct into preclinical studies," Dr. Khalili said. "We want to eradicate every single copy of HIV-1 from the patient. That will cure AIDS. I think this technology is the way we can do it."
Last week, a report by the United Nations claimed that AIDS could be brought under control by 2030.
New research has uncovered the structure of one of the most important and complicated proteins in cell division – a fundamental process in life and the development of cancer.
A team from The Institute of Cancer Research in London and the Medical Research Council Laboratory of Molecular Biology in Cambridge has produced the first detailed 3D images of the anaphase-promoting complex (APC/C). Mapping this gigantic protein in unprecedented detail will transform scientists’ understanding of exactly how cells copy their chromosomes and divide, and could reveal binding sites for future cancer drugs.
The APC/C performs a wide range of vital tasks associated with mitosis, the process during which a cell copies its chromosomes and pulls them apart into two separate cells. Mitosis is used in cell division by all animals and plants. Discovering its structure could ultimately lead to new treatments for cancer, which hijacks the normal process of cell division to make thousands of copies of harmful cancer cells.
In the study, which was funded by Cancer Research UK, the researchers reconstituted human APC/C, using a combination of electron microscopy and imaging software to visualise it at a resolution of less than a nanometre (one billionth of a metre). The resolution was so fine that it allowed them to see the secondary structure – the set of basic building blocks which combine to form every protein. Alpha-helix rods and folded beta-sheet constructions were clearly visible within the 20 subunits of the APC/C, defining the overall architecture of the complex.
Previous studies led by the same research team had shown a globular structure for APC/C in much lower resolution, but the secondary structure had not been mapped at all, until now. Each of the APC/C’s subunits bond and mesh with other units at different points in the cell cycle, allowing it to control a range of mitotic processes – including the initiation of DNA replication, the segregation of chromosomes along protein ‘rails’ called spindles, and the ultimate splitting of one cell into two, called cytokinesis. Disrupting each of these processes could selectively kill cancer cells, or stop them dividing.
Professor David Barford, who led the study as Professor of Molecular Biology at The Institute of Cancer Research, London, said: “It’s very rewarding to finally tie down the detailed structure of this important protein, which is both one of the most important and most complicated found in all of nature. We hope our discovery will open up whole new avenues of research that increase our understanding of the process of mitosis, and ultimately lead to the discovery of new cancer drugs.”
Professor Paul Workman, Interim Chief Executive of The Institute of Cancer Research, London, said: “The fantastic insights into molecular structure provided by this study are a vivid illustration of the critical role played by fundamental cell biology in cancer research. The new study is a major step forward in our understanding of cell division. When this process goes awry, it is a critical difference that separates cancer cells from their healthy counterparts. Understanding exactly how cancer cells divide inappropriately is crucial to the discovery of innovative cancer treatments to improve outcomes for cancer patients.”
Dr Kat Arney, Science Information Manager at Cancer Research UK, said: “Figuring out how the fundamental molecular ‘nuts and bolts’ of cells work is vital if we’re to make progress understanding what goes wrong in cancer cells and how to tackle them more effectively. Revealing the intricate details of biological shapes is a hugely important step towards identifying targets for future cancer drugs.”
The European Space Agency has released striking new images of comet 67P/Churyumov-Gerasimenko. As part of the Rosetta mission, a robotic lander will be deployed on the surface.
As the Rosetta spacecraft nears its destination – comet 67P/Churyumov-Gerasimenko – the object is proving to be full of surprises. New images obtained by OSIRIS, the onboard scientific imaging system, confirm the body's peculiar shape hinted at in earlier pictures. Comet 67P is obviously different from other comets visited so far.
"The distance still separating Rosetta from 67P is now far from astronomical," said Holger Sierks, OSIRIS Principal Investigator from the Max Planck Institute for Solar System Research (MPS) in Germany. "It's a trip of less than 14,000 kilometres [about 8,700 miles]. That's comparable to travelling from Germany to Hawaii on a summer holiday."
However, while taking a snapshot of Mauna Kea, Hawaii's highest mountain, from Germany is an impossible feat, Rosetta's OSIRIS camera is doing a great job of catching ever clearer glimpses of its similarly sized destination. Images obtained on 14th July show a tantalising shape: the comet's nucleus consists of two distinctly separated parts.
"This is unlike any other comet we have ever seen before," said OSIRIS project manager Carsten Güttler from the MPS. "The images faintly remind me of a rubber ducky with a body and a head." How 67P developed this intriguing shape is still unclear. "At this point we know too little about 67P to allow for more than an educated guess," said Sierks.
Later this year, the scientists hope to determine more of the comet's physical and mineralogical properties, which may help to confirm whether its body and head were formerly two individual objects. In November, the probe will come within 2.5 km (1.5 miles) of the comet and deploy a small robotic lander called Philae. This will take around two hours to reach the surface, firing a harpoon to counter the extremely low gravity. Screws will then be drilled into the surface to anchor its feet in place, as shown in the video below.
Once attached to the comet, the lander will begin its science mission – using ten instruments to characterise the surface, sub-surface and nucleus, determine the chemical compounds present and study the comet's activities over time. Six cameras mounted on the sides at 60° intervals will provide a 360° panorama around the lander.
Scientists are working to bring the multiverse hypothesis – which sounds like science fiction to some – firmly into the realm of testable science.
The Perimeter Institute for Theoretical Physics in Canada is to begin a series of experiments that could demonstrate – for the first time – direct evidence of the so-called multiverse. This theory postulates that other universes may reside outside our own. The central idea is that a "vacuum" existed prior to what we know as the Big Bang. This vacuum simmered with energy (variously called dark energy, vacuum energy, the inflation field, and the Higgs field). Like water in a pot, its high energy began to evaporate – forming bubbles.
Each bubble contained another vacuum, whose energy was lower, but still greater than zero. This energy drove the bubbles to expand, causing some to collide with each other. It’s possible that some produced secondary bubbles. Maybe the bubbles were rare and far apart; maybe they were packed as close together as foam. But the point is: each of these bubbles was a universe. In this picture, our universe is one bubble in a frothy sea of bubbles, possibly an infinite number of them.
This version of the multiverse hypothesis is based on what's currently known about cosmic inflation. Although cosmic inflation isn’t accepted by everyone – cyclical models of the universe tend to reject the idea – it is nevertheless a leading theory of the universe’s very early development, and there is some observational evidence to support it.
Inflation holds that in the instant after the Big Bang, the universe expanded at a super-exponential rate – so rapidly that a patch of space a nanometre across grew to a quarter-billion light years across in just a trillionth of a trillionth of a trillionth of a second. It’s an amazing idea, but it would explain some otherwise puzzling astrophysical observations.
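The scale of that claim is easy to check with back-of-envelope arithmetic, treating the quoted sizes as linear dimensions and assuming one light year is roughly 9.46 × 10¹⁵ metres:

```python
# Rough order-of-magnitude check of the inflation expansion factor quoted above.
# Assumes one light year ≈ 9.46e15 m; this is an illustration, not a cosmological model.

METRES_PER_LIGHT_YEAR = 9.46e15

start = 1e-9                                  # one nanometre, in metres
end = 0.25e9 * METRES_PER_LIGHT_YEAR          # a quarter-billion light years, in metres

linear_factor = end / start
print(f"linear expansion factor ≈ {linear_factor:.1e}")  # ≈ 2.4e+33
```

So the quoted numbers imply lengths stretching by a factor of around 10³³ in roughly 10⁻³⁶ seconds – which is why inflation is described as super-exponential.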
Inflation is thought to have been driven by an inflation field – which is vacuum energy by another name. Once you postulate that an inflation field exists, it’s hard to avoid an “in the beginning was the vacuum” kind of story. This is where the theory of inflation becomes controversial – when it starts to postulate multiple universes.
Proponents of the multiverse theory argue that it’s the next logical step in the inflation story. Detractors argue that it is not physics, but metaphysics – that it is not science, because it cannot be tested. After all, physics lives or dies by data that can be gathered and predictions that can be checked.
That’s where Perimeter Associate Faculty member Matthew Johnson comes in. Working with a small team that also includes Perimeter Faculty member Luis Lehner, Johnson is working to bring the multiverse hypothesis firmly into the realm of testable science.
“That’s what this research program is all about. We’re trying to find out what the testable predictions of this picture would be, and then going out and looking for them,” says Johnson. Specifically, his research team is considering the rare cases in which our bubble universe might collide with another bubble universe. He lays out the steps: “We simulate the whole universe. We start with a multiverse that has two bubbles in it, we collide the bubbles on a computer to figure out what happens, and then we stick a virtual observer in various places and ask what that observer would see from there.”
Simulating the whole universe – and indeed more than one – might sound extremely difficult, but apparently that’s not so.
“Simulating the universe is easy,” says Johnson. Simulations don't have to include every atom, star, or galaxy – in fact, they account for none of them. “We’re simulating things only on the largest scales,” he says. “All I need is gravity and the stuff that makes these bubbles up. We’re now at the point where if you have a favourite model of the multiverse, I can stick it on a computer and tell you what you should see.”
That’s a small step for a computer simulation program, but a giant leap for multiverse cosmology. By creating testable predictions, the multiverse model has crossed the line from appealing story to real science. In fact, Johnson says, the program has reached the point where it can rule out certain models of the multiverse: “We’re now able to say that some models predict something that we should be able to see, and since we don’t in fact see it, we can rule those models out.”
For example, the collision from a neighbouring bubble universe might leave a mark or imprint – what Johnson calls “a disk on the sky” – in the form of a circular bruise in the cosmic microwave background. The fact that no such pattern has been found yet makes certain collision-filled models less likely. Meanwhile, his team is working to figure out what other kinds of evidence a bubble collision may leave behind. It’s the first time, they say in their paper, that anyone has produced a direct, quantitative set of predictions for observable signatures of bubble collisions. And though none of those signatures has so far been found, some of them are possible to look for.
The real significance of this work is as a proof of principle: it shows that the multiverse hypothesis can be testable. In other words, if we are living in a bubble universe, we might actually be able to tell.
And what might neighbouring alternative universes look like? If there is an infinite number of them, then anything that can happen will happen in at least one of them. A universe may exist in which dinosaurs survived the asteroid impact. A universe may exist where you are President of the United States. A universe may exist where flowers have the ability to talk. Some universes could be subject to entirely new and different sets of physical laws, with bizarre arrangements of matter and energy. If we ever reach Type V status on the Kardashev scale, we may know for sure.
A Japanese humanoid robot called Pepper, which its makers claim can read people's emotions, has been unveiled in Tokyo. Telecoms company Softbank, which created the robot, says Pepper can understand 70 to 80 percent of spontaneous conversations. News agency AFP met the pint-sized chatterbox, who took time out from his day job greeting customers at SoftBank stores.
Scientists at the National Oceanic and Atmospheric Administration (NOAA) have developed a new high-resolution climate model, showing that southwestern Australia's long-term decline in fall and winter rainfall is caused by manmade greenhouse gas emissions and ozone depletion.
"This new high-resolution climate model is able to simulate regional-scale precipitation with considerably improved accuracy compared to previous generation models," said Tom Delworth, a research scientist at NOAA's Geophysical Fluid Dynamics Laboratory in Princeton, N.J., who helped develop the new model and is co-author of the paper. "This model is a major step forward in our effort to improve the prediction of regional climate change, particularly involving water resources."
NOAA researchers conducted several climate simulations using this global climate model to study long-term changes in rainfall in various regions across the globe. One of the most striking signals of change emerged over Australia, where a long-term decline in fall and winter rainfall has been observed over parts of southern Australia. Simulating both natural and manmade climate drivers, scientists showed that the decline in rainfall is primarily a response to manmade increases in greenhouse gases, as well as a thinning of the ozone layer caused by manmade aerosol emissions. Several natural causes were tested with the model, including volcanic eruptions and changes in the sun's radiation. However, none of these natural drivers reproduced the long-term observed drying, indicating that the trend is clearly attributable to human activity.
Southern Australia's decline in rainfall began around 1970 and has intensified over the last four decades. The model projects a continued decline in winter rainfall throughout the rest of the 21st century, with significant implications for regional water resources. The drying is most severe over the southwest, which is predicted to see a 40 percent decline in average rainfall by the late 21st century.
"Predicting potential future changes in water resources, including drought, is an immense societal challenge," said Delworth. "This new climate model will help us more accurately and quickly provide resource planners with environmental intelligence at the regional level. The study of Australian drought helps to validate this new model, and thus builds confidence in this model for ongoing studies of North American drought."
Dubai is already known for its luxury tourist experience, super-tall skyscrapers and extravagant megaprojects. Now developers have announced it will host the world's first temperature-controlled city – incorporating the largest mall, largest domed park, cultural theatres and wellness resorts. Known as the "Mall of the World", this gigantic $7bn project will encompass 50 million square feet of floorspace, taking 10 years to construct.
Intended as a year-round destination, its capacity will be large enough to accommodate 180 million visitors each year in 100 hotels and serviced apartment buildings. Glass-roofed streets, modelled on New York's Broadway and London's Oxford Street, will stretch for 7 km (4.3 miles). These will be air-conditioned in summer as temperatures soar above 40°C, but the mall and its glass dome will be open to the elements during cooler winter months. Cars will be redundant in this "integrated pedestrian city."
Credit: Dubai Holding
"The project will follow the green and environmentally friendly guidelines of the Smart Dubai model," explained Ahmad bin Byat, the chief executive of Dubai Holding. "It will be built using state-of-the-art technology to reduce energy consumption and carbon footprint, ensuring high levels of environmental sustainability and operational efficiency."
In response to concerns about another real estate bubble, he insisted there was demand for such a project: "The way things are growing I think we are barely coping with the demand ... tourism is growing in Dubai," he said in an interview with Reuters. "This is a long-term project and we are betting strongly on Dubai."
Speaking at the launch of the mall, Sheikh Mohammed said: "The growth in family and retail tourism underpins the need to enhance Dubai's tourism infrastructure as soon as possible. This project complements our plans to transform Dubai into a cultural, tourist and economic hub for the 2 billion people living in the region around us – and we are determined to achieve our vision."
Mall of the World is one of several hi-tech, futuristic cities that could set the standard for eco-city designs in the coming decades. Others include China's car-free "Great City" (planned to be finished by 2020) and the Masdar City arcology (due in 2025).
Developed by Microsoft, Project Adam is a new deep-learning system, modelled on the human brain, that achieves greater image classification accuracy and is 50 times faster than other systems in the industry. The goal of Project Adam is to enable software to visually recognise any object. It is being positioned as a competitor to Google's Brain project, currently being worked on by Ray Kurzweil.
Sensitive electro-optical imaging and target-acquisition systems will achieve new levels of range and sensitivity thanks to a UK company’s breakthrough in developing a "super black" material.
Vantablack – created by Surrey NanoSystems – is revolutionary in its ability to be applied to lightweight, temperature-sensitive structures such as aluminium whilst absorbing 99.96% of incident radiation, the highest level ever recorded.
“Vantablack is a major breakthrough by UK industry in the application of nanotechnology to optical instrumentation", says Ben Jensen, the company's Chief Technology Officer. "It reduces stray-light, improving the ability of sensitive telescopes to see the faintest stars, and allows the use of smaller, lighter sources in space-borne black body calibration systems. Its ultra-low reflectance improves the sensitivity of terrestrial, space and air-borne instrumentation.”
Vantablack is the result of applying a low-temperature carbon nanotube growth process. The manufacture of "super-black" carbon nanotube materials has traditionally required high temperatures – preventing their direct application to sensitive electronics or materials with relatively low melting points. This, along with poor adhesion, prevented their application to critical space and air-borne instrumentation. Over a period of two years, the development and testing programme by Surrey NanoSystems successfully transferred its low-temperature manufacturing process from silicon to aluminium structures and pyroelectric sensors. Qualification to European Cooperation on Space Standardisation (ECSS) standards was also achieved.
Vantablack has the highest thermal conductivity and lowest mass-volume of any material that can be used in high-emissivity applications. It has virtually undetectable levels of outgassing and particle fallout – thus eliminating a key source of contamination in sensitive imaging systems. It can withstand launch shock, staging and long-term vibration, and is suitable for coating internal components, such as apertures, baffles, cold shields and Micro Electro Mechanical Systems (MEMS)-type optical sensors.
“We are now scaling up production to meet the requirements of our first customers in the defence and space sectors, and have already delivered our first orders. Our strategy includes both the provision of a sub-contract coating service from our own UK facility, and the formation of technology transfer agreements with various international partners”, added Jensen.
As a spin-off from its work in applying nanomaterials to semiconductor device fabrication, Surrey NanoSystems’ manufacturing process also enables Vantablack to be applied to flat and three-dimensional structures in precise patterns with sub-micron resolution.
The largest ever study of its kind has found significant differences between organic food and conventionally-grown crops. Organic food contains up to almost 70% more antioxidants and significantly lower levels of toxic heavy metals.
Conventionally-grown potatoes on the left of the picture and organically grown potatoes on the right. Credit: Newcastle University
Analysing 343 studies into the differences between organic and conventional crops, an international team of experts led by Newcastle University, UK, found that a switch to eating organic fruit, vegetables and cereals – and food made from them – would provide additional antioxidants equivalent to eating one to two extra portions of fruit and vegetables a day.
The study, published in the British Journal of Nutrition, also shows significantly lower levels of toxic heavy metals in organic crops. Cadmium – one of only three metal contaminants along with lead and mercury for which the European Commission has set maximum permitted contamination levels in food – was found to be almost 50% lower in organic crops than conventionally-grown ones.
Professor Carlo Leifert, who led the study, says: “This study demonstrates that choosing food produced according to organic standards can lead to increased intake of nutritionally desirable antioxidants and reduced exposure to toxic heavy metals. This constitutes an important addition to the information currently available to consumers which until now has been confusing and in many cases is conflicting.”
New methods used to analyse the data
This is the most extensive analysis of the nutrient content in organic vs conventionally-produced foods ever undertaken and is the result of a groundbreaking new systematic literature review and meta-analysis by the international team.
The findings contradict those of a 2009 study commissioned by the UK Food Standards Agency (FSA), which found there were no substantial differences or significant nutritional benefits from organic food. The FSA study based its conclusions on just 46 publications covering crops, meat and dairy, while the Newcastle-led meta-analysis is based on data from the 343 peer-reviewed publications on composition differences between organic and conventional crops now available.
“The main difference between the two studies is time,” explains Professor Leifert, who is Professor of Ecological Agriculture at Newcastle University. “Research in this area has been slow to get off the ground and we have far more data available to us now than five years ago.”
Dr Gavin Stewart, a Lecturer in Evidence Synthesis and the meta-analysis expert in the Newcastle team, added: “The much larger evidence base available in this synthesis allowed us to use more appropriate statistical methods to draw more definitive conclusions regarding the differences between organic and conventional crops.”
What the findings mean
The study, funded jointly by the European Framework 6 programme and the Sheepdrove Trust, found that concentrations of antioxidants such as polyphenolics were 18-69% higher in organically-grown crops. Numerous studies have linked antioxidants to a reduced risk of chronic diseases, including cardiovascular and neurodegenerative diseases and certain cancers. Substantially lower concentrations of the toxic heavy metal cadmium were also detected in organic crops (on average 48% lower).
Nitrogen concentrations were found to be significantly lower in organic crops. Concentrations of total nitrogen were 10%, nitrate 30% and nitrite 87% lower in organic compared to conventional crops. The study also found that pesticide residues were four times more likely to be found in conventional crops than organic ones.
Professor Charles Benbrook, one of the authors of the study and a leading scientist based at Washington State University, explains: “Our results are highly relevant and significant and will help both scientists and consumers sort through the often conflicting information currently available on the nutrient density of organic and conventional plant-based foods.”
Professor Leifert added: “The organic vs non-organic debate has rumbled on for decades now, but the evidence from this study is overwhelming – organic food is high in antioxidants and lower in toxic metals and pesticides.
“But this study should just be a starting point. We have shown without doubt there are composition differences between organic and conventional crops, now there is an urgent need to carry out well-controlled human dietary intervention and cohort studies specifically designed to identify and quantify the health impacts of switching to organic food.”
The authors of this study welcome the continued public and scientific debate on this important subject. The entire database generated and used for this analysis is freely available on the Newcastle University website for the benefit of other experts and interested members of the public.
A new rocket design that incorporates methane fuel can provide a low-cost platform for launching clusters of tiny satellites, greatly improving broadband delivery and Earth observation missions.
A cluster of tiny "CubeSats" is shown in this image photographed by a crew member on the International Space Station.
Firefly Space Systems, a small satellite launch company, has officially announced its first launch vehicle, “Firefly Alpha.” This efficient, brand-new rocket, capable of carrying 400 kg (882 lb) into low Earth orbit, will be the world’s first dedicated light satellite launch vehicle in this mass class.
Following its launch and seed funding in January, the company – which includes highly experienced aerospace engineers from SpaceX and Virgin Galactic – has aggressively moved forward in its goal to reduce the prohibitively high costs of small satellite launches. Clusters of these micro and nanosatellites placed in low orbit could revolutionise broadband data delivery and Earth observation missions, among other uses. CubeSats like those pictured above are only a litre (10 cm cube) in volume, with masses of little more than a kilogram (2.2lb), typically using off-the-shelf components for their electronics.
“What used to cost hundreds of millions of dollars is rapidly becoming available in the single digit millions,” said Firefly CEO Thomas Markusic. “We are offering small satellite customers the launch they need for a fraction of that, around $8 or $9 million – the lowest cost in the world. It’s far cheaper than the alternatives, without the headaches of a multi-manifest launch.”
Simplified and optimised for minimum cost – and incorporating innovations such as a more aerodynamic engine design – Firefly Alpha positions the company as a technologically advanced and cost-effective option for traditional manufacturers of small satellites.
“To say that this is an exciting and significant technological milestone would be an understatement,” said Michael Blum, co-founder of Firefly. “Until now, there existed virtually no dedicated launcher capacity in the small satellite industry to deliver their respective payloads to orbit. This announcement today just changed all that.”
A new interactive graphic and analysis released this week by research and journalism organisation Climate Central illustrates how much hotter summers will be in 1,001 U.S. cities by 2100 if current emissions trends continue – and shows which of today's cities their future summers will most resemble.
"Summer temperatures in most American cities are going to feel like summers now in Texas and Florida — very, very hot," comments Alyson Kenward, lead researcher of the analysis, which looked at projected changes in summer (June-July-August) high temperatures. On average, those temperatures will be 3.9 to 5.6°C (7-10°F) hotter, with some cities as much as 6.7°C (12°F) hotter by the end of the century.
Among the most striking examples featured in the interactive are:
• Boston, where average summer high temperatures will likely be more than 5.6°C (10°F) hotter than they are now, making it feel as steamy as North Miami Beach is today.
• Saint Paul, Minnesota, where summer highs are expected to rise by an average of 6.7°C (12°F), putting it on par with Mesquite, Texas.
• Memphis, where summer high temperatures could average a sizzling 37.8°C (100°F), typical of Laredo, Texas.
• Las Vegas, with summer highs projected to average a scorching 43.9°C (111°F), like summers today in Riyadh, Saudi Arabia.
• Phoenix, where summer high temperatures would average a sweltering 45.6°C (114°F), which will feel like Kuwait City.
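The paired °F/°C figures above are temperature *differences*, so they convert with the scale factor 5/9 only (the 32-degree offset applies to absolute readings, not changes). A minimal sketch of the conversion:

```python
def delta_f_to_c(delta_f):
    """Convert a temperature *difference* from Fahrenheit to Celsius.

    Differences scale by 5/9 alone; the +32 offset used for absolute
    temperatures does not apply to changes.
    """
    return delta_f * 5 / 9

# The projected summer warming figures quoted in the analysis:
for df in (7, 10, 12):
    print(f"{df}F warmer = {delta_f_to_c(df):.1f}C warmer")
# → 7F warmer = 3.9C warmer
# → 10F warmer = 5.6C warmer
# → 12F warmer = 6.7C warmer
```

This reproduces the 3.9-6.7°C range given for the 7-12°F projections.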
This analysis only accounts for daytime summer heat — the hottest temperatures of the day, on average between June-July-August — and doesn't incorporate humidity or dewpoint, both of which contribute to how uncomfortable summer heat can feel. Other impacts the map does not include are rising sea levels and a likely increase in storms and severe weather events.
Recent articles by Fox News and the Daily Telegraph claimed that scientists have been "tampering" with U.S. temperature data. For those who care about real science (as opposed to conspiracy theories), Skeptical Science has published a thorough debunking.
Technology entrepreneur Peter Diamandis has appeared on Reddit to answer questions posed by futurology enthusiasts. Some of the top responses from this "Ask Me Anything" (AMA) are reproduced below.
Reddit: Hi Peter. Were you expecting something like Bitcoin to pop out of nowhere, and just how big do you expect its "market cap" to become in its Disruptive phase?
Peter Diamandis: Regarding bitcoin, in success, it will become the predominant means of financial transaction for not only the developed world but the developing world, the group I call the rising billion. Remember that 3 billion new people are coming online this next decade who do not have access to credit cards or banks. Bitcoin is their method to transact. These 3 billion people represent tens of trillions of dollars that will flow into the global economy. Given that there are only 21 million bitcoins, you do the math.
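Diamandis leaves the arithmetic to the reader, but it can be sketched in a few lines. This is purely illustrative: the "tens of trillions" figure is his claim, and the $20 trillion used below is an assumed midpoint, not a prediction.

```python
# Back-of-envelope version of "you do the math" (illustrative only).
total_supply = 21_000_000   # Bitcoin's hard cap on coins
value_flowing = 20e12       # assume $20 trillion, a midpoint of "tens of trillions"

# If that much value were held in bitcoin, each coin would need to carry:
implied_price = value_flowing / total_supply
print(f"Implied price per bitcoin: ${implied_price:,.0f}")
# → Implied price per bitcoin: $952,381
```

The point of the exercise is only that a fixed supply divided into trillions of dollars of demand implies a very large per-coin value; it says nothing about whether that demand materialises.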
Reddit: This is a question from /r/basicincome: If robots replace workers, as you've said you think they will, and if we will need less human labor to meet humanity's needs and desires, as you have also suggested, how can society ensure that everyone is able to obtain what they need and desire if there aren't enough jobs to get income from? A basic income has been suggested as a policy, which basically replaces the current hodgepodge of welfare programs with Social Security for everyone. What do you think of this idea?
Peter Diamandis: First let me start by saying I would describe myself as a libertarian capitalist. The idea of creating a socialist state where people are getting supported and not having to work, goes against my grain in many ways. Having said that, I recently had a series of conversations, debates and discussions on this very subject with a group I assembled at Singularity University. I was amazed how the data -- in particular, from experiments done in India -- show that a basic income provided to an entire village in India positively transformed their lives in an extraordinary fashion that gives me great hope. I also believe, as I have written in my book Abundance, that the cost of meeting the basic needs of every man, woman and child on this planet will be significantly reduced by technology over the next few decades.
Reddit: I'd like to know your perspective on longevity science and healthy life extension in the short term, let's say from today till 2025. Do you think that there are going to be advances in the lab? Do you think that those advances are going to transmit fast to the society or will this "social spread" be a slower process? Personally I'm worried about social resistance to something like this.
Peter Diamandis: The innovations in human longevity are mostly going to come from two areas: genomics and stem cell sciences. Over the next decade, Human Longevity Inc. has the objective of sequencing 1 million individuals at a minimum, but in addition to their sequence we will also be collecting phenotypic data, microbiome data, imaging data and metabolomic data. All of this information will be crunched using artificial intelligence and machine learning to give us extraordinary insights. In the arena of stem cells, we will begin harnessing stem cells as the regenerative engine of the body. Having said all this, I think we're going to see amazing strides forward this next decade.
Reddit: What development, achievable in the next 10 years, excites you the most?
Peter Diamandis: I'm most excited about developments in the two areas that I'm pioneering: asteroid mining and the extension of the healthy human lifespan. Through Planetary Resources, we expect to be identifying, prospecting and eventually mining materials from near-Earth asteroids well within this decade. This will create an economic engine that will propel humanity beyond low-Earth orbit. Through Human Longevity Inc, we will be creating the largest database of human genotypic, phenotypic, and microbiome data ever assembled and using machine learning to analyze it to truly understand disease and healthy aging. We feel we have the ability to extend healthy human life by 30-40 years. For me, going to space and living longer -- it doesn't get better!
Reddit: What do you think of nuclear power? We often have conflicting reports of how dangerous nuclear waste really is to store, how green nuclear power really is, etc.
Peter Diamandis: I believe that nuclear power has gotten a bad reputation because of the early generation plants. The 4th-generation plant designs coming online are extraordinarily safe and, frankly, I'd put one in my own backyard. I think ignoring nuclear power is ridiculous, especially because of the upside it has for humanity. We're also seeing extraordinary progress in fusion for the first time in 50 years. Having said all this, I am a huge believer that solar power will be able to meet at least 50% of our needs in the United States over the next 2 decades. And moving to a solar economy will transform our entire planet. There is a beautiful alignment that the poorest parts of the world are also the sunniest!
Reddit: How do you see rapid technological advancement impacting the human psyche?
Peter Diamandis: It's incredible how adaptive we are as humans. The change that we have in our lives today would be blinding compared to change 100 or 1,000 years ago. I think the plasticity of the human cortex is fundamental to our survival.
Reddit: What are some fields you're seeing the fastest development in? And what is some of your favourite research being done today?
Peter Diamandis: The field undergoing the fastest development is what I refer to as exponential or accelerating technologies. They are all driven by increases in computational speed. The list specifically includes sensors, networks, artificial intelligence, robotics, synthetic biology, virtual reality, nanomaterials and digital medicine. In my next book, Bold, which comes out in February 2015, I will be focusing on artificial intelligence, 3D printing and synthetic biology. Having said all this, the entire field of crowdsourcing is exploding at the same time and is equally important to the technology. I'm particularly enamored with crowdfunding and incentive competitions as huge leverage points for entrepreneurs today.
Reddit: What advice can you give entrepreneurs (like myself) who want to make billions and better the world?
Peter Diamandis: My advice is simple: find a huge problem on the planet. A problem that affects a billion people, and take a shot at solving it! The best way to become a billionaire is to improve the lives of a billion people. This is exactly the philosophy we teach at Singularity University. It is now possible for an entrepreneur to have an impact at this massive scale. Before, it was only the kings and queens, or the major industrialists. Now it's all of us.
US military research agency DARPA intends to cut the average time to develop new advanced materials from 10 years to less than three.
Military platforms – such as ships, aircraft and ground vehicles – rely on advanced materials to make them lighter, stronger and more resistant to stress, heat and other harsh environmental conditions. Currently, the process for developing new materials to field in platforms frequently takes over a decade. These lengthy schedules often mean that developers of new platforms are forced to rely on decades-old, mature materials, because other potentially more advanced materials are still being tested and aren’t ready to be implemented into platform designs.
To address this problem, US military research agency DARPA has initiated a new program called Materials Development for Platforms (MDP). This aims to develop a methodology and toolset to compress the applied material development process by at least 75 percent: from an average of 10 years or more, to just two and a half years.
To achieve this goal, a cross-disciplinary model will incorporate materials science and engineering, Integrated Computational Materials Engineering (ICME) principles, and platform development disciplines of engineering, design, analysis and manufacturing. DARPA will focus on rapid development of materials with specific platform capabilities and intended missions in view – rather than supporting long-term, generalised materials development followed by assessments of potential applications for the resulting materials.
“In this program, we want to move from the current mindset of sporadic ‘pushes’ in materials technology development to a mindset that ‘pulls’ materials technology forward driven by platform design intent and mission need,” says Mick Maher, DARPA program manager. “Ideally, we could envision materials development happening on timescales more in line with modern commercial automobile development.”
A hypersonic aircraft
As a test case, the program intends to focus its initial efforts on a hypersonic platform design – a bold and pressing challenge, since hypersonic vehicles operate under extreme conditions that push state-of-the-art materials to their thermal, chemical and structural limits. Specifically, the first MDP materials development effort would be applied to the design of an outer aerodynamic shell for a hypersonic vehicle that would glide through the atmosphere. Hypersonic air vehicles travel at more than five times the speed of sound, resulting in shell temperatures of several thousand degrees – hot enough to melt steel. The goal is to prove the MDP concept by developing, manufacturing and independently testing various new material structural elements of an outer shell within two and a half years.
“A key to the program’s success will be integrating expertise from a wide range of relevant technical disciplines,” Maher said. “We want to reach out to potential performers in all of the relevant scientific and engineering communities – and from both large companies and small businesses – so they can team together to create the most effective solutions possible.”
Driven by accelerating urbanisation – particularly in the Asia Pacific region – the global building stock is expected to grow strongly over the next 10 years, putting further pressure on resource demands and the environment.
Construction markets, while still recovering from the 2009 recession, continue to add new commercial and residential floorspace to the world’s buildings. According to a new report from Navigant Research, the global building stock will grow from 138 billion square metres today to over 171 billion by 2023 – an increase of 24 percent.
“Economic growth in developing countries like China and India is slowing, but remains robust, and the rising middle classes in these countries demand a higher quality of life, including improved working and living spaces,” states Eric Bloom, lead research analyst. “The commercial and residential segments will experience compound annual growth rates over the next 10 years of 2.1 percent and 2.2 percent, respectively.”
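The headline figures are easy to cross-check. Assuming the "next 10 years" horizon Bloom refers to, growing 138 billion m² to 171 billion m² implies both the 24 percent total increase and a compound annual growth rate consistent with the 2.1-2.2 percent quoted for the two segments:

```python
# Sanity-check the Navigant figures (10-year horizon is assumed from the quote).
start, end, years = 138.0, 171.0, 10  # billion square metres

total_growth = end / start - 1
cagr = (end / start) ** (1 / years) - 1

print(f"Total growth:  {total_growth:.0%}")  # → Total growth:  24%
print(f"Implied CAGR:  {cagr:.1%}")          # → Implied CAGR:  2.2%
```

The implied blended rate of about 2.2 percent per year sits right where the quoted commercial (2.1%) and residential (2.2%) rates would put it.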
China’s construction boom has fuelled extraordinary growth in that country’s building stock, with nearly 2 billion square metres added every year. Although there is speculation today about the boom turning into a bubble, Navigant predicts growth continuing at a healthy rate of 4.2 percent annually. By 2023, China will have 58 billion square metres of building space – more than one-third of the world’s total.
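The Chinese figures can also be reconciled. A rough sketch, assuming the 4.2 percent rate compounds over the nine years from 2014 to 2023 (an assumption; the report may use a different baseline), backs out the implied current stock and shows why additions run at "nearly 2 billion square metres" a year:

```python
# Back out China's implied current building stock from the 2023 forecast.
# Assumption: 4.2% annual growth applies over 2014-2023 (9 years).
target_2023 = 58.0    # billion m^2, from the Navigant forecast
rate, years = 0.042, 9

implied_2014 = target_2023 / (1 + rate) ** years
annual_addition = implied_2014 * rate

print(f"Implied 2014 stock:  ~{implied_2014:.0f} billion m^2")   # → ~40 billion m^2
print(f"First-year addition: ~{annual_addition:.1f} billion m^2")  # → ~1.7 billion m^2
```

An addition of roughly 1.7 billion m² in the first year is consistent with the "nearly 2 billion square metres added every year" cited above.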
The report, “Global Building Stock Database”, provides data on the size and growth of building stock from now to 2023, examining key drivers and trends covering eight commercial building types (office, retail, education, healthcare, hotels & restaurants, institutional/assembly, warehouse, and transport) and two residential building types (single-family detached and multi-unit residential). An Executive Summary is available on the Navigant website.
Commercial, residential, and industrial buildings are responsible for 47% of global greenhouse gas emissions and 49% of global energy consumption. Much of this energy is consumed needlessly and can be reduced through cost-effective measures. Thankfully, new and innovative technologies will soon become mainstream, such as LED lighting and smart grids. The "sprawl" effect of cities can also be reduced with self-sufficiency and taller skyscrapers.
The boundaries of our home galaxy may have to be redrawn, as two stars have been found orbiting the Milky Way at distances of 775,000 and 900,000 light-years from Earth, respectively.
This simulated image demonstrates how large the Milky Way would look from ULAS J0744+25, nearly 775,000 light years away.
The distant outskirts of the Milky Way harbour valuable clues for understanding the formation and evolution of our galaxy. Yet, due to overwhelming distances and an extremely sparse population of stars, few objects have been identified beyond 400,000 light years, with only seven stars known to date beyond this limit.
Recently, a team of astronomers led by John Bochanski, an assistant professor at Haverford College, began targeting stars in the Milky Way’s outer halo – a sparse shroud of stars that surrounds the disk of our galaxy and stretches to at least 500,000 light years away. The team has now discovered two stars in this halo that are the most distant ever discovered in our galaxy.
On 3rd July, Bochanski and his team, which includes Associate Professor of Astronomy Beth Willman, published a letter in Astrophysical Journal Letters, detailing the discovery of two cool red giants, ULAS J0744+25 and ULAS J0015+01. These stars are extremely far away, at distances of 775,000 and 900,000 light years, respectively. The giant stars were selected from observations in the UKIRT Infrared Deep Sky Survey and Sloan Digital Sky Survey.
Red giant stars are relatively rare when compared to nearby cool red dwarfs, which vastly outnumber giants. Yet giants are nearly 10,000 times brighter than dwarfs, making them visible even at huge distances. Using a combination of filters highlighting different parts of the optical and near-infrared light from these giants, the team was able to identify cool red giant candidates. The scientists then obtained spectroscopic confirmation of the identity of these stars using the 6.5m telescope at the MMT Observatory on Mt. Hopkins in Arizona.
“It really is like looking for a needle in a haystack,” Bochanski says. “Except our haystack is made up of millions of red dwarf stars.”
During a visit last November to the MMT Observatory, Bochanski and his team observed ULAS J0744+25 and ULAS J0015+01, using a variety of methods to estimate the distances to these stars. Every method pointed to the same conclusion: these stars are extremely far away. They are over 50 percent farther from the Sun than any known star in the Milky Way, or about five times more distant than the Large Magellanic Cloud. In fact, they lie one-third of the distance to the Andromeda Galaxy, the Milky Way’s sister spiral in the Local Group.
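The distance comparisons above are easy to verify. In this quick check, the reference distances to the Large Magellanic Cloud (~163,000 light years) and the Andromeda Galaxy (~2.5 million light years) are standard published values, not figures from this article.

```python
# Check the distance comparisons quoted for the two halo giants.
# Reference distances are standard published values (assumed, not from the article).
LMC_LY = 163_000           # Large Magellanic Cloud, light years
ANDROMEDA_LY = 2_500_000   # Andromeda Galaxy (M31), light years

for name, dist in [("ULAS J0744+25", 775_000), ("ULAS J0015+01", 900_000)]:
    print(f"{name}: {dist / LMC_LY:.1f}x the LMC distance, "
          f"{dist / ANDROMEDA_LY:.2f} of the way to Andromeda")
```

The farther star works out to roughly five and a half times the LMC distance and about 0.36 of the way to Andromeda, consistent with the "about five times" and "one-third" comparisons in the text.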
By Andrew Z. Colvin (Own work) [CC-BY-SA-3.0 (http://creativecommons.org/licenses/by-sa/3.0)], via Wikimedia Commons
“The distances to these two stars are almost too large to comprehend,” adds Bochanski. “To put it in perspective, when the light from ULAS J0015+01 left the star, our early human ancestors were just starting to make fires here on Earth.”
“It is remarkable to find stars this far out in the Milky Way galaxy," remarked Daniel Evans, lead for Individual Investigator Programs at NSF's Division of Astronomical Sciences, which funded the research. “These results will undoubtedly shed new light on the formation and evolution of our galactic home.”
The significance of these stars goes beyond their record-holding distances because they inhabit the Milky Way’s halo. Some astronomers believe the halo is like a cloud of galactic crumbs, a result of the Milky Way’s merger with many smaller galaxies over our galaxy’s lifetime, says Beth Willman, co-author of the study: “Theory predicts the presence of such an extended stellar halo, formed by destroyed remains of small dwarf galaxies that merged over the cosmic ages to form the Milky Way itself. The properties of cool red giants in the halo thus preserve the formation history of our Milky Way. These stars are truly ghosts of galaxies past.”
By assembling a larger sample of distant red giants, Bochanski and his team hope to test model predictions for the Milky Way's formation. Their results may already be putting some of these models to the test. “Most models don’t predict many stars at these distances,” says Bochanski. “If more distant red giants are discovered, the models may need to be revised.” The search in the outer reaches of our Milky Way goes on, using the brightest stars to guide the way.
A study co-authored by a University of Guelph scientist that involved fitting bumblebees with tiny radio frequency tags shows long-term exposure to a neonicotinoid pesticide hampers bees’ ability to forage for pollen.
Bees fitted with RFID tags. Credit: Richard Gill
The research by Nigel Raine, a professor in Guelph’s School of Environmental Sciences, and Richard Gill of Imperial College London is published in the British Ecological Society’s journal Functional Ecology. The study shows how long-term pesticide exposure affects individual bees’ day-to-day behaviour, including pollen collection and which flowers the worker bees chose to visit.
“Bees have to learn many things about their environment, including how to collect pollen from flowers,” says Raine, who holds the Rebanks Family Chair in Pollinator Conservation. “Exposure to this neonicotinoid pesticide seems to prevent bees from being able to learn these essential skills.”
The researchers monitored bee activity using radio frequency identification (RFID) tags – seen in the photograph above – similar to those used by courier firms to track parcels. They tracked when individual bees left and returned to the colony, how much pollen they collected and from which flowers.
The bees from untreated colonies got better at collecting pollen as they learned to forage. However, bees exposed to neonicotinoid insecticides became less successful over time at collecting pollen. Neonicotinoid-treated colonies even sent out more foragers to try to compensate for lack of pollen from individual bees. Besides collecting less pollen, said Raine, “flower preferences of neonicotinoid-exposed bees were different to those of foraging bees from untreated colonies.”
Raine and Gill studied the effects of two pesticides – imidacloprid, one of three neonicotinoid pesticides currently banned by the European Commission for use on crops attractive to bees, and a pyrethroid (lambda-cyhalothrin) – used both alone and together, on the behaviour of individual bumblebees from 40 colonies over four weeks.
“Although pesticide exposure has been implicated as a possible cause for bee decline, until now we had limited understanding of the risk these chemicals pose, especially how it affects natural foraging behaviour,” Raine said.
Neonicotinoids make up about 30 per cent of the global pesticide market. Plants grown from neonicotinoid-treated seed have the pesticide in all their tissues, including the nectar and pollen.
“If pesticides are affecting the normal behaviour of individual bees, this could have serious knock-on consequences for the growth and survival of colonies,” explained Raine.
He suggests reform of pesticide regulations, including adding bumblebees and solitary bees to risk assessments that currently cover only honeybees.
“Bumblebees may be much more sensitive to pesticide impacts as their colonies contain a few hundred workers at most, compared to tens of thousands in a honeybee colony,” he added.
"Morning Glory" (pictured below) is the common name for over 1,000 species of flowering plants, noted for their short-lived blooms that normally unfold in the morning and wither by nightfall. A team of scientists at the National Agriculture and Food Research Organisation near Tokyo have reportedly slowed the aging process in one particular Japanese variety of this flower. Their breakthrough could allow bouquets to remain fresh for much longer.
In the study – carried out jointly with Kagoshima University in southern Japan – a gene named "EPHEMERAL1" was suppressed. This resulted in the lifespan of each flower almost doubling, from 13 hours to 24 hours. The finding could lead to developing methods to extend the life of cut flowers.
Kenichi Shibuya, one of the lead researchers, told AFP by telephone: "We have concluded that the gene is linked to petal aging. It would be unrealistic to modify genes of all kinds of flowers – but we can look for other ways to suppress the (target) gene... such as making cut flowers absorb a solution that prevents the gene from becoming active."
A similar breakthrough in plant aging was made by German researchers in January 2013. That study identified a "genetic switch" able to maintain a youthful state in tobacco plants.
Scientists analysing data from NASA's Cassini mission have found firm evidence that the ocean inside Saturn's largest moon, Titan, may be as salty as Earth's Dead Sea.
The new results come from a study of gravity and topography data collected during Cassini's repeated flybys of Titan during the past 10 years. Using the Cassini data, researchers presented a model structure for Titan, resulting in an improved understanding of the structure of the moon's outer ice shell. The findings are published in this week's edition of the journal Icarus.
"Titan continues to prove itself as an endlessly fascinating world, and with our long-lived Cassini spacecraft, we're unlocking new mysteries as fast as we solve old ones," said Linda Spilker, Cassini project scientist at NASA's Jet Propulsion Laboratory in California.
Additional findings support previous indications the moon's icy shell is rigid and in the process of freezing solid. Researchers found that a relatively high density was required for Titan's ocean in order to explain the gravity data. This indicates the ocean is probably an extremely salty brine of water mixed with dissolved salts likely composed of sulfur, sodium and potassium. The density indicated for this brine would give the ocean a salt content roughly equal to the saltiest bodies of water on Earth.
The Dead Sea, Israel
"This is an extremely salty ocean by Earth standards," said the paper's lead author, Giuseppe Mitri of the University of Nantes in France. "Knowing this may change the way we view this ocean as a possible abode for present-day life, but conditions might have been very different there in the past."
Cassini data also indicate the thickness of Titan's ice crust varies slightly from place to place. The researchers said this can best be explained if the moon's outer shell is stiff, as would be the case if the ocean were slowly crystallising and turning to ice. Otherwise, the moon's shape would tend to even itself out over time, like warm candle wax. This freezing process would have important implications for the habitability of Titan's ocean, as it would limit the ability of materials to exchange between the surface and the ocean.
A further consequence of a rigid ice shell, according to the study, is that any outgassing of methane into Titan's atmosphere must happen at scattered "hot spots" – like the hot spot on Earth that gave rise to the Hawaiian Island chain. Titan's methane does not appear to result from convection or plate tectonics recycling its ice shell.
How methane gets into the moon's atmosphere has long been of great interest to researchers, as molecules of this gas are broken apart by sunlight on short geological timescales. Titan's present atmosphere contains about five percent methane. This means some process, thought to be geological in nature, must be replenishing the gas. The study indicates that whatever process is responsible, the restoration of Titan's methane is localised and intermittent.
"Our work suggests looking for signs of methane outgassing will be difficult with Cassini, and may require a future mission that can find localised methane sources," said Jonathan Lunine, a scientist on the Cassini mission at Cornell University, Ithaca, New York, and one of the paper's co-authors. "As on Mars, this is a challenging task."
Size comparison of Titan (lower left), Earth and the Moon.
Because they fail to consider future trends in smoking, most life expectancy projections for low-mortality nations are underestimates.
A new study by demographer John Bongaarts – Population Council Vice President and Distinguished Scholar – finds that mortality projections for most low-mortality countries are more pessimistic than they should be. The flaw is that existing projections fail to recognise that fewer people smoke today than in the past. Indeed, less than 5% of the world's population may smoke by the year 2040. As a result, smoking-related mortality will decline in the future. This also suggests that with more people living longer, pension and health care costs in coming decades will likely be higher than previously estimated.
A country’s future mortality trajectory has important implications for health and social policy, especially in countries with aging populations where pension and health care costs are rising steeply.
Developed countries – such as the United States, Japan, and most nations of Europe – often have government agencies that make mortality projections (e.g. Actuaries of the Social Security Administration in the United States) and the UN Population Division makes projections for 238 countries and regions. All current mortality projections foresee substantial increases in future life expectancy. However, Bongaarts finds that the increases in life expectancy are likely to be even greater than current estimates suggest.
Nearly all methods for projecting mortality ignore trends in causes of death. Rather, they rely wholly or in part on the extrapolation of past trends in mortality rates, longevity measures, or mortality models. Bongaarts examined whether mortality projections could be improved by taking into account smoking trends. He focused on trends in death rates and causes of death in 15 countries with high life expectancy and reliable data on causes of death: Australia, Austria, Canada, Denmark, Finland, France, Italy, Japan, the Netherlands, Norway, Spain, Sweden, Switzerland, the United Kingdom, and the United States. Bongaarts studied mortality data gathered between 1955 and 2010.
A problem arises because most mortality projection methods ignore the past rise and the likely future decline in smoking-related deaths. “Making explicit adjustments for the distorting effects of smoking is likely to improve the accuracy of projections,” says Bongaarts. It would not be possible to improve mortality projections by making adjustments for other causes of death, he found. Unlike other causes of death, future trends in smoking mortality can be predicted with a high degree of certainty.
“Worldwide, we are making notable progress in reducing the number of people who smoke,” he says. “This not only has immediate health benefits, but also long-term public policy implications. To adequately prepare for longer-living older populations, countries must take smoking trends into account.”
The study, "Trends in Causes of Death in Low-Mortality Countries: Implications for Mortality Projections," is published in the journal Population and Development Review.
Researchers have announced the creation of an imaging technology more powerful than anything that has existed before – and fast enough to observe life processes as they actually happen at the molecular level.
Chemical and biological actions can now be measured as they are occurring – or, in old-fashioned movie parlance, one frame at a time. This will allow the creation of improved biosensors to study everything from nerve impulses to cancer metastasis as it occurs.
The measurements, created by short pulse lasers and bioluminescent proteins, are made in femtoseconds, which is one-millionth of one-billionth of a second. A femtosecond, compared to one second, is about the same as one second compared to 32 million years. That’s a pretty fast shutter speed and will change the way biological research and physical chemistry are being done, scientists say.
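The femtosecond analogy is straightforward to verify with a little arithmetic: a second contains 10^15 femtoseconds, and 10^15 seconds comes to roughly 32 million years.

```python
# Verify the analogy: 1 fs is to 1 s as 1 s is to ~32 million years.
FEMTOSECOND = 1e-15                     # one femtosecond, in seconds
SECONDS_PER_YEAR = 365.25 * 24 * 3600   # ~3.16e7 seconds in a year

ratio = 1.0 / FEMTOSECOND               # femtoseconds in one second: 1e15
years = ratio / SECONDS_PER_YEAR        # 1e15 seconds, expressed in years
print(f"1 s / 1 fs = {ratio:.0e}; 1e15 seconds is about {years / 1e6:.0f} million years")
```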
“With this technology we’re going to be able to slow down the observation of living processes and understand the exact sequences of biochemical reactions,” said Chong Fang, assistant professor of chemistry in the Oregon State University (OSU) College of Science, and lead author. “We believe this is the first time ever that you can really see chemistry in action inside a biosensor,” he said. “This is a much more powerful tool to study, understand and tune biological processes.”
The system uses advanced pulse laser technology that is fairly new and builds upon the use of “green fluorescent proteins” that are popular in bioimaging and biomedicine. These remarkable proteins glow when light is shone upon them. Their discovery in 1962, and the applications that followed, were the basis for a Nobel Prize in 2008.
Existing biosensor systems, however, are created largely by random chance or trial and error. By comparison, the speed of the new approach will allow scientists to “see” what is happening at the molecular level and create whatever kind of sensor they want by rational design. This will improve the study of everything from cell metabolism to nerve impulses, how a flu virus infects a person, or how a malignant tumor spreads.
“For decades, to create the sensors we have now, people have been largely shooting in the dark,” Fang said. “This is a fundamental breakthrough in how to create biosensors for medical research from the bottom up. It’s like daylight has finally come.”
The technology, for instance, can follow the proton transfer associated with the movement of calcium ions – one of the most basic aspects of almost all living systems, and also one of the fastest. This movement of protons is integral to everything from respiration to cell metabolism and even plant photosynthesis. Scientists will now be able to identify what is going on, one step at a time, and then use that knowledge to create customised biosensors for improved imaging of life processes.
“If you think of this in photographic terms,” Fang said, “we now have a camera fast enough to capture the molecular dance of life. We’re making molecular movies. And with this, we’re going to be able to create sensors that answer some important, new questions in biophysics, biochemistry, materials science and biomedical problems.”
Findings on the new technology were published yesterday in Proceedings of the National Academy of Sciences (PNAS).