

6th May 2017

Fusion reactor achieves first plasma

A British company, Tokamak Energy, has achieved first plasma in the ST40, its latest prototype design for a fusion reactor. The machine is planned to reach 100 million degrees C in 2018, the temperature required for fusion.


Credit: Tokamak Energy


A new prototype fusion reactor has been switched on for the first time, officially achieving first plasma. The reactor aims to produce a plasma temperature of 100 million degrees C – a record for a privately funded venture. This is roughly seven times hotter than the centre of the Sun, and the temperature necessary for controlled fusion.

The tokamak reactor, called the 'ST40', was built by Tokamak Energy, one of the world's leading private fusion energy ventures. The Oxfordshire-based company grew out of the Culham Centre for Fusion Energy and was established in 2009 to design and develop small fusion reactors. Tokamak Energy's aim is to put fusion power into the grid by 2030.

With the ST40 up and running, the next steps are to complete the commissioning and installation of the full set of magnetic coils, which are crucial to reaching the temperatures required for fusion. This will allow the ST40 to produce a plasma temperature of 15 million degrees C – as hot as the Sun's core – in autumn 2017.
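As a rough sanity check (illustrative arithmetic only, not from Tokamak Energy), the "seven times hotter" comparison follows directly from the two temperatures quoted:

```python
# Temperatures quoted in the article, in degrees C.
sun_core_c = 15e6        # the Sun's core – also the ST40's autumn 2017 target
fusion_target_c = 100e6  # the ST40's 2018 target for controlled fusion

ratio = fusion_target_c / sun_core_c
print(f"Fusion target is ~{ratio:.1f}x the Sun's core temperature")  # ~6.7x, i.e. roughly seven times
```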


Credit: NASA's Goddard Space Flight Center


Following the 15 million degree milestone, the next goal is for the ST40 to produce plasma temperatures of 100 million degrees in 2018. This will be a record-breaking milestone, as the plasma will reach a temperature never before achieved in a privately owned and funded fusion reactor. 100 million degrees is an important threshold, as only at or above this temperature can the charged particles which naturally repel each other be forced together to induce a controlled fusion reaction. This will also prove the vital point that commercially viable fusion can be produced in compact spherical tokamaks.

Tokamak Energy's journey to generating fusion energy is moving at a rapid pace; the company has already reached the half-way point of its long-term plan to deliver fusion power. It is focused on a smaller reactor design – the compact, spherical tokamak – which enables quicker development of devices, speeding up progress towards its ultimate targets: producing first electricity by 2025 and commercially viable fusion power by 2030. Tokamak Energy's research also indicates that this route to fusion power can be much faster than the development of conventional large-scale devices.


Credit: Tokamak Energy


Dr David Kingham, CEO of Tokamak Energy, commented: "Today is an important day for fusion energy development in the UK, and the world. We are unveiling the first world-class controlled fusion device to have been designed, built and operated by a private venture. The ST40 is a machine that will show fusion temperatures – 100 million degrees – are possible in compact, cost-effective reactors. This will allow fusion power to be achieved in years, not decades."

"We will still need significant investment, many academic and industrial collaborations, dedicated and creative engineers and scientists, and an excellent supply chain. Our approach continues to be to break the journey down into a series of engineering challenges, raising additional investment on reaching each new milestone. We are already half-way to the goal of fusion energy; with hard work we will deliver fusion power at commercial scale by 2030."








5th May 2017

Biggest X-ray laser in the world generates its first light

The European X-ray Free Electron Laser (XFEL) in Germany has produced its first beams of x-rays.


Credit: European XFEL / Heiner Müller-Elsner


In Hamburg, Germany, the European XFEL – the biggest X-ray laser in the world – has reached its last major milestone before the official opening in September. The 3.4 km long facility, most of which is located in underground tunnels, has generated its first X-ray laser light. This has a wavelength of just 0.8 nanometres (nm) – about 500 times shorter than that of visible light. At first lasing, this laser had a repetition rate of one pulse per second, which will later be increased to 27,000 per second, compared to the previous record of 120 per second.
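The headline numbers can be checked in a few lines of Python (illustrative only; the 400 nm figure for the short end of the visible spectrum is an assumption for the comparison, not a number from the article):

```python
xray_nm = 0.8        # first-lasing wavelength quoted in the article
visible_nm = 400.0   # shortest visible wavelength, ~400 nm (assumed for comparison)
print(visible_nm / xray_nm)  # 500.0 -> "about 500 times shorter"

target_rate = 27_000  # planned repetition rate, pulses per second
old_record = 120      # previous record, pulses per second
print(target_rate / old_record)  # 225.0 -> a 225-fold jump over the old record
```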

The beams of the XFEL are extremely intense and a billion times brighter than conventional synchrotron light sources. The achievable light wavelength corresponds to the size of an atom, meaning that the X-rays can be used to make pictures and films of the "nanocosmos" at atomic-scale resolution – such as of biomolecules, from which better understanding of illnesses could be developed. Other opportunities include research into chemical processes and new catalytic techniques, with the goal of improving their efficiency or making them more environmentally friendly; materials research; or the investigation of conditions similar to the interior of planets.


First laser light at the European XFEL, recorded by an X-ray detector at the end of the tunnel. Credit: DESY


Helmut Dosch, Chairman of the DESY Directorate, said: "The European X-ray laser has been brought to life! The first laser light produced today with the most advanced and most powerful linear accelerator in the world marks the beginning of a new era of research in Europe. This worldwide unique high-tech facility was built in record time and within budget. This is an amazing success of science. I congratulate all those involved in the research, development, and construction of this facility with passion and commitment: the employees of DESY, European XFEL, and international partners. They have achieved outstanding results and demonstrated impressively what is possible in international cooperation. The European XFEL will provide us with the most detailed images of the molecular structure of new materials and drugs and novel live recordings of biochemical reactions."

The power and speed of the XFEL will allow scientists to investigate samples available only in limited quantities and to perform their experiments more quickly. The facility will also increase the total amount of "beamtime" available worldwide: demand has long outstripped capacity at other X-ray lasers, which are often overbooked.

The X-ray laser is scheduled to open officially at the start of September, when external users will be able to perform experiments at the first two of the eventual six scientific instruments.





18th April 2017

Device pulls water from dry air, powered only by Sun

The University of California, Berkeley, has created a device that pulls water from dry air, powered only by the Sun. Even under conditions of relatively low (20-30%) humidity, it can produce 2.8 litres of water over a 12-hour period.


Credit: University of California, Berkeley


Imagine a future in which every home has an appliance that pulls all the water the household needs out of the air, even in dry or desert climates, using only the power of the Sun. That future may be just around the corner, with the demonstration of a water harvester that uses only ambient sunlight to pull litres of water out of the air each day in conditions as low as 20 percent humidity, a level common in arid areas.

The solar-powered harvester, reported in the journal Science, was constructed at the Massachusetts Institute of Technology using a special material called a metal-organic framework – or MOF – produced at the University of California, Berkeley.

"This is a major breakthrough in the long-standing challenge of harvesting water from the air at low humidity," said Omar Yaghi from UC Berkeley, one of two senior authors of the paper. "There is no other way to do that right now, except by using extra energy. Your electric dehumidifier at home 'produces' very expensive water."

The prototype, under conditions of 20-30 percent humidity, was able to pull 2.8 litres (3 quarts) of water from the air over a 12-hour period, using one kilogram (2.2 pounds) of MOF. Rooftop tests at MIT confirmed that the device works in real-world conditions.
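Expressed as a rate, the prototype's quoted yield works out as follows (simple arithmetic on the article's figures, for illustration only):

```python
water_litres = 2.8   # water collected in the test
period_hours = 12.0  # duration of the test
mof_kg = 1.0         # mass of MOF sorbent used

rate = water_litres / (period_hours * mof_kg)
print(f"~{rate:.2f} litres per hour per kilogram of MOF")  # ~0.23 L/h/kg
```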


Schematic of a metal-organic framework (MOF). Credit: UC Berkeley, Berkeley Lab image.


"One vision for the future is to have water off-grid, where you have a device at home running on ambient solar for delivering water that satisfies the needs of a household," said Yaghi, who is the founding director of the Berkeley Global Science Institute, a co-director of the Kavli Energy NanoSciences Institute and the California Research Alliance by BASF. "To me, that will be made possible because of this experiment. I call it personalised water."

Yaghi worked with Evelyn Wang, a mechanical engineer at MIT, alongside students at the university. The system they designed consists of approximately two pounds of dust-sized MOF crystals compressed between a solar absorber and a condenser plate, inside a chamber open to the air. As ambient air diffuses through the porous MOF, water molecules preferentially attach to the interior surfaces. X-ray diffraction studies have shown that the water vapour molecules often gather in groups of eight to form cubes.

Sunlight entering through a window heats up the MOF and drives the bound water toward the condenser, which is at the temperature of the outside air. The vapour condenses as liquid water and drips into a collector.

"This work offers a new way to harvest water from air that does not require high relative humidity conditions and is much more energy efficient than other existing technologies," said Wang.

This proof of concept harvester leaves much room for improvement, Yaghi said. The current MOF can absorb only 20 percent of its weight in water, but other MOF materials could possibly absorb 40 percent or more. The material could also be tweaked to be more effective at higher or lower humidity.

"It's not just that we made a passive device that sits there collecting water; we have now laid both the experimental and theoretical foundations so that we can screen other MOFs, thousands of which could be made, to find even better materials," he said. "There is a lot of potential for scaling up the amount of water that is being harvested. It is just a matter of further engineering now."

Yaghi and his team are working to improve their MOFs, while Wang continues to improve the harvesting system to produce more water.

"To have water running all the time, you could design a system that absorbs the humidity during the night and evolves it during the day," he said. "Or design the solar collector to allow for this at a much faster rate, where more air is pushed in. We wanted to demonstrate that if you are cut off somewhere in the desert, you could survive because of this device. A person needs about a Coke can of water per day. That is something one could collect in less than an hour with this system."








10th March 2017

IBM unveils roadmap for quantum computers

IBM has announced "IBM Q", an initiative to build commercially available universal quantum computing systems.


Credit: IBM Research


IBM has announced an industry-first initiative to build commercially available universal quantum computing systems. “IBM Q” systems and services will be delivered via the IBM Cloud platform. Current technologies that run on classical computers, such as Watson, can help to identify patterns and insights buried in vast amounts of existing data. By contrast, quantum computers will deliver solutions to important problems where patterns cannot be seen because the data doesn’t exist and the calculations needed to answer questions are too enormous to ever be processed by classical computers.

IBM is also launching a new Application Program Interface (API) for the “IBM Quantum Experience” enabling anyone with an Internet connection to use the quantum processor (via the Cloud) for running algorithms and experiments, working with individual quantum bits, and exploring tutorials and simulations of what might be possible with quantum computing. In the first half of 2017, IBM plans to release a full Software Development Kit (SDK) for users to build simple quantum applications and software programs.

“IBM has invested over decades to growing the field of quantum computing and we are committed to expanding access to quantum systems and their powerful capabilities for the science and business communities,” said Arvind Krishna, senior vice president of Hybrid Cloud and director for IBM Research. “Following Watson and blockchain, we believe that quantum computing will provide the next powerful set of services delivered via the IBM Cloud platform, and promises to be the next major technology that has the potential to drive a new era of innovation across industries.”


Credit: IBM Research


IBM intends to build IBM Q systems to expand the application domain of quantum computing. A key metric will be the power of a quantum computer expressed by the “Quantum Volume” – which includes the number of qubits, quality of operations, connectivity and parallelism. As a first step to increase Quantum Volume, IBM aims to build commercial IBM Q systems with around 50 qubits in the next few years to demonstrate capabilities beyond today’s classical systems, and plans to collaborate with key industry partners to develop applications that exploit the quantum speedup of the systems.

IBM Q systems will be designed to tackle problems that are currently too complex and exponential in nature for classical computing systems to handle. One of the first and most promising applications will be in chemistry. Even for simple molecules like caffeine, the number of quantum states can be astoundingly large – so large that all the conventional computing memory and processing power scientists could ever build could not handle the problem.
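The exponential blow-up can be sketched in a few lines (a back-of-the-envelope illustration, not IBM's analysis): storing the full quantum state of n two-level systems on a classical machine takes 2**n complex amplitudes, at roughly 16 bytes each.

```python
def state_vector_bytes(n: int) -> int:
    """Memory needed to store a full n-qubit state vector as 128-bit complex numbers."""
    return (2 ** n) * 16

for n in (10, 30, 50):
    print(n, state_vector_bytes(n))
# 10 qubits fit in ~16 kB; 30 need ~17 GB; 50 already demand ~18 petabytes –
# which is why a ~50-qubit machine sits at the edge of what classical systems can simulate.
```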

IBM’s scientists have recently developed new techniques to efficiently explore the simulation of chemistry problems on quantum processors and experimental demonstrations of various molecules are in progress. In the future, the goal will be to scale to even more complex molecules and try to predict chemical properties with higher precision than possible with classical computers.

Future applications of quantum computing may include:

• Artificial Intelligence: making facets of artificial intelligence such as machine learning far more powerful when data sets are too large for classical methods, such as when searching images or video
• Cloud Security: making cloud computing more secure by using the laws of quantum physics to enhance the safety of private data
• Drug & Materials Discovery: untangling the complexity of molecular and chemical interactions, leading to the discovery of new medicines and materials
• Financial Services: finding new ways to model financial data and isolating key global risk factors to make better investments
• Supply Chain & Logistics: finding the optimal path across global systems of systems for ultra-efficient logistics and supply chains, such as optimising fleet operations for deliveries during the holiday season




“Classical computers are extraordinarily powerful and will continue to advance and underpin everything we do in business and society,” said Tom Rosamilia, senior vice president of IBM Systems. “But there are many problems that will never be penetrated by a classical computer. To create knowledge from much greater depths of complexity, we need a quantum computer. We envision IBM Q systems working in concert with our portfolio of classical high-performance systems to address problems that are currently unsolvable, but hold tremendous untapped value.”

IBM’s roadmap for scaling to practical quantum computers is based on a holistic approach to advancing all parts of the system. The company will leverage its deep expertise in superconducting qubits, complex high performance system integration, and scalable nanofabrication processes from the semiconductor industry to help advance the quantum mechanical capabilities. The developed software tools and environment will also leverage IBM’s world-class mathematicians, computer scientists, and software and system engineers.

"As Richard Feynman said in 1981, ‘…if you want to make a simulation of nature, you’d better make it quantum mechanical, and by golly it’s a wonderful problem, because it doesn’t look so easy.’ This breakthrough technology has the potential to achieve transformational advancements in basic science, materials development, environmental and energy research, which are central to the missions of the Department of Energy (DOE),” said Steve Binkley, deputy director of science, US Department of Energy. “The DOE National Labs have always been at the forefront of new innovation, and we look forward to working with IBM to explore applications of their new quantum systems."








7th February 2017

New technology could triple sharpness of displays

Researchers have developed a new blue-phase liquid crystal that could triple the sharpness of TVs, computer screens, and other displays while also reducing the power needed to run the device.




An international team of researchers has developed a new blue-phase liquid crystal that could enable televisions, computer screens and other displays that pack more pixels into the same space while also reducing the power needed to run the device. The new liquid crystal is optimised for field-sequential colour liquid crystal displays (LCDs), a promising technology for next-generation displays.

"Today's Apple Retina displays have a resolution density of about 500 pixels per inch," said Shin-Tson Wu, who led the research team at the University of Central Florida's College of Optics and Photonics (CREOL). "With our new technology, a resolution density of 1500 pixels per inch could be achieved on the same sized screen. This is especially attractive for virtual reality headsets or augmented reality technology, which must achieve high resolution in a small screen to look sharp when placed close to our eyes."

Although the first blue-phase LCD prototype was demonstrated by Samsung in 2008, the technology still hasn't moved into production, because of problems with high operation voltage and slow capacitor charging time. To tackle these problems, Wu's research team worked with collaborators from liquid crystal manufacturer JNC Petrochemical Corporation in Japan and display manufacturer AU Optronics Corporation in Taiwan.

In the journal Optical Materials Express, the team explains how combining the new liquid crystal with a special performance-enhancing electrode structure can achieve light transmittance of 74 percent, with 15 volts per pixel – operational levels that could finally be practical for commercial applications.

"Field-sequential colour displays can be used to achieve the smaller pixels needed to increase resolution density," explains Yuge Huang, first author of the paper. "This is important, because the resolution density of today's technology is almost at its limit."




Today's LCD screens contain a thin layer of nematic liquid crystal through which the incoming white LED backlight is modulated. Thin-film transistors deliver the required voltage that controls light transmission in each pixel. The LCD subpixels contain red, green and blue filters that are used in combination to produce different colours to the human eye. The colour white is created by combining all three colours.

Blue-phase liquid crystal can be switched, or controlled, about 10 times faster than the nematic type. This sub-millisecond response time allows each LED colour (red, green and blue) to be sent through the liquid crystal at different times and eliminates the need for colour filters. The LED colours are switched so quickly that our eyes can integrate red, green and blue to form white.

"With colour filters, the red, green and blue light are all generated at the same time," said Wu. "However, with blue-phase liquid crystal, we can use one subpixel to make all three colours – but at different times. This converts space into time, a space-saving configuration of two-thirds, which triples the resolution density."

The blue-phase liquid crystal also triples the optical efficiency because the light doesn't have to pass through colour filters, which limit transmittance to about 30 percent. Another big advantage is that the displayed colour is more vivid because it comes directly from red, green and blue LEDs, which eliminates the colour crosstalk that occurs with conventional filters.
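The resolution and timing claims above can be checked with simple arithmetic (illustrative only; the 60 Hz frame rate is an assumption, not a figure from the paper):

```python
ppi_today = 500          # Apple Retina-class density quoted by Wu
subpixels_per_pixel = 3  # red, green and blue subpixels under colour filters

# Field-sequential colour uses one subpixel for all three colours in turn:
print(ppi_today * subpixels_per_pixel)  # 1500 ppi on the same sized screen

# Timing: at an assumed 60 frames per second, each colour field gets a third of
# a frame, so the liquid crystal must switch well inside ~5.6 ms (hence sub-millisecond).
field_ms = 1000 / 60 / 3
print(round(field_ms, 1))  # 5.6

# Removing ~30%-transmittance colour filters allows up to ~3x more light through.
print(round(1 / 0.30, 1))  # 3.3
```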

Wu's team worked with JNC to reduce the blue-phase liquid crystal's dielectric constant to a minimally acceptable range, to reduce the transistor charging time and get submillisecond optical response time. However, each pixel still needed slightly higher voltage than a single transistor could provide. To overcome this problem, the researchers implemented a protruded electrode structure that lets the electric field penetrate the liquid crystal more deeply. This lowered the voltage needed to drive each pixel while maintaining a high light transmittance.

"We achieved an operational voltage low enough to allow each pixel to be driven by a single transistor while also achieving a response time of less than a millisecond," said Haiwei Chen, a doctoral student in Wu's lab. "This delicate balance between operational voltage and response time is key for enabling field sequential colour displays."

Wu predicts that a working prototype could be available in the next year.





6th February 2017

Directed energy atmospheric lens could revolutionise future battlefields

Within the next 50 years, scientists at BAE Systems believe that battlefield commanders could deploy a new type of directed energy laser and lens system, called a Laser Developed Atmospheric Lens, which would enhance commanders' ability to observe adversaries' activities over much greater distances than existing sensors allow.

At the same time, the lens could be used as a form of 'deflector shield' to protect friendly aircraft, ships, land vehicles and troops from incoming attacks by high power laser weapons that could also become a reality in the same time period.

The Laser Developed Atmospheric Lens (LDAL) concept, developed by researchers at the company's military aircraft facility in Lancashire, UK, has been evaluated by the Science and Technology Facilities Council (STFC) Rutherford Appleton Laboratory and specialist optical sensors company LumOptica, and is based on known science. It works by simulating naturally occurring phenomena, temporarily – and reversibly – changing the Earth's atmosphere into lens-like structures that magnify or alter the path of electromagnetic waves, such as light and radio signals.


Credit: BAE Systems


LDAL is a powerful concept that mimics two effects seen in nature: the reflective properties of the ionosphere, and desert mirages. The ionosphere is a naturally occurring layer of Earth's atmosphere at very high altitude that can reflect radio waves – which is why listeners can tune in to radio stations many thousands of miles away, as signals bounce off the ionosphere and travel long distances through the air and over the Earth's surface. A desert mirage creates the illusion of a distant lake because light from the blue sky is 'bent', or refracted, by the hot air near the surface into the line of sight of a person looking into the distance.

LDAL would simulate both of these effects by using a high pulsed power laser system and exploiting a physics phenomenon called the 'Kerr effect' to temporarily ionise or heat a small region of atmosphere in a structured way. Mirrors, glass lenses, and structures like Fresnel zone plates could all be replicated using the atmosphere, allowing the physics of refraction, reflection, and diffraction to be exploited.




"Working with some of the best scientific minds in the UK, we're able to incorporate emerging and disruptive technologies and evolve the landscape of potential military technologies in ways that, five or ten years ago, many would never have dreamed possible," said Nick Colosimo, BAE Systems' Futurist.

Professor Bryan Edwards, Leader of STFC's Defence, Security and Resilience Futures Programme, said of the work: "For this evaluation project, STFC's Central Laser Facility team worked closely with colleagues at BAE Systems. By harnessing our collective expertise and capabilities, we have been able to identify new ways in which cutting edge technology – and our understanding of fundamental physical processes and phenomena – has the potential to contribute to enhancing the safety and security of the UK."

Craig Stacey, CEO at LumOptica added: "This is a tremendously exciting time in laser physics. Emerging technologies will allow us to enter new scientific territories and explore ever new applications. We are delighted to be working with BAE Systems on the application of such game-changing technologies, evaluating concepts which are approaching the limits of what is physically possible and what might be achieved in the future."





3rd February 2017

Quantum computer blueprint published

Researchers led by the University of Sussex have published the first practical blueprint for how to build a large-scale quantum computer.




An international team, led by a scientist from the University of Sussex, has published the first practical blueprint for how to build a quantum computer – the most powerful computer in the world. This huge leap forward towards creating a universal quantum computer is detailed in the influential journal Science Advances.

It has long been known that such a computer would revolutionise industry, science and commerce on a similar scale to the invention of ordinary computers. But this new work features an actual industrial blueprint for constructing such a large-scale machine – one more powerful in solving certain problems than any computer ever built.

Once operational, the computer would have the potential to answer many questions in science; solve the most mind-boggling scientific and mathematical problems; unravel some of the deepest mysteries of space; help create revolutionary new medicines; and solve problems that an ordinary computer would take billions of years to compute.

The work features a new invention that permits actual quantum bits to be transmitted between individual quantum computing modules, in order to obtain a fully modular large-scale machine with nearly arbitrarily large computational processing power.




Previously, scientists had proposed using fibre optic connections to connect individual computer modules. The new invention introduces connections created by electric fields that allow charged atoms (ions) to be transported from one module to another. This new approach allows 100,000 times faster connection speeds between individual quantum computing modules compared to current state-of-the-art fibre link technology.

The new blueprint is the work of an international team of scientists from the University of Sussex (UK), Google (USA), Aarhus University (Denmark), RIKEN (Japan) and Siegen University (Germany).

Professor Winfried Hensinger, head of the Ion Quantum Technology Group at the University of Sussex, who has been leading this research, said: "For many years, people said that it was completely impossible to construct an actual quantum computer. With our work, we have not only shown that it can be done, but now we are delivering a nuts and bolts construction plan to build an actual large-scale machine."

Lead author Dr Bjoern Lekitsch, also from the University of Sussex, explains: "It was most important to us to highlight the substantial technical challenges as well as to provide practical engineering solutions."

As a next step, the team will construct a prototype quantum computer, based on this design, at the University.




This effort is part of the UK Government's £270m ($337m) plan to accelerate the introduction of quantum technologies into the marketplace. It makes use of a recent invention by the Sussex team that can replace billions of laser beams required for large-scale quantum computer operations with the simple application of voltages to a microchip.

"The availability of a universal quantum computer may have a fundamental impact on society as a whole," said Professor Hensinger. "Without doubt it is still challenging to build a large-scale machine, but now is the time to translate academic excellence into actual application building on the UK's strengths in this ground-breaking technology. I am very excited to work with industry and government to make this happen."

The computer's potential uses could be endless, but its size will be anything but small. The machine is expected to fill a large building, consisting of sophisticated vacuum apparatus featuring integrated quantum computing silicon microchips that hold individual charged atoms (ions) using electric fields.

The blueprint to develop such computers has been made public to ensure scientists throughout the world can collaborate and further develop this awesome technology as well as to encourage industrial exploitation.

Note: All images courtesy of the University of Sussex








27th January 2017

Metallic hydrogen created for the first time

Scientists at Harvard have created a small amount of metallic hydrogen for the first time, a century after it was theorised. This material is thought to be present in the depths of gas giants like Jupiter.


Photographs of hydrogen at different stages of compression: Transparent molecular hydrogen (left) at about 200 GPa, which is converted into black molecular hydrogen at 415 GPa (middle), and finally reflective atomic metallic hydrogen at 495 GPa (right).
Credit: Isaac Silvera / Harvard


Nearly a century after it was theorised, Harvard scientists claim to have succeeded in creating the rarest – and potentially one of the most valuable – materials on the planet.

The material – atomic metallic hydrogen – was created by Professor of Natural Sciences, Isaac Silvera; and his colleague, post-doctoral fellow Ranga Dias. As well as helping scientists answer fundamental questions about the nature of matter, this material is theorised to have a wide range of applications, including as a room-temperature superconductor. The breakthrough is described in a paper published yesterday by the journal Science.

"This is the holy grail of high-pressure physics," Silvera said. "It's the first-ever sample of metallic hydrogen on Earth, so when you're looking at it, you're looking at something that's never existed before."

To create it, Silvera and Dias squeezed a tiny hydrogen sample at 495 gigapascals (GPa), or more than 71 million pounds per square inch – greater than the pressure at the centre of the Earth. At those extreme pressures, Silvera explained, solid molecular hydrogen – which consists of molecules on the lattice sites of the solid – breaks down, and the tightly bound molecules dissociate and transform into atomic hydrogen, which is a metal.
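As a quick sanity check on the figures above, the quoted 495 GPa does convert to roughly 71.8 million psi; the ~360 GPa value for the pressure at Earth's centre used below is a commonly cited approximation, not a figure from the article:

```python
# Sanity check on the quoted pressures (illustrative only).
GPA_TO_PSI = 1e9 / 6894.757  # pascals per gigapascal / pascals per psi

sample_gpa = 495
sample_psi = sample_gpa * GPA_TO_PSI
print(f"{sample_gpa} GPa ≈ {sample_psi / 1e6:.1f} million psi")  # ≈ 71.8 million psi

earth_core_gpa = 360  # approximate pressure at Earth's centre (our assumption)
print(sample_gpa > earth_core_gpa)  # the sample exceeds Earth's core pressure
```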


liquid metallic hydrogen pressure temperature graph

Comparison of this study with other studies. Solid metallic hydrogen was achieved at 495 GPa.


While the work offers an important new window into understanding the general properties of hydrogen, it also offers tantalising hints at potentially revolutionary new materials.

"One prediction that's very important is metallic hydrogen is predicted to be meta-stable," Silvera said. "That means if you take the pressure off, it will stay metallic, similar to how diamonds form from graphite under intense heat and pressure, but remain diamond when pressure and heat is removed."

Understanding whether the material is stable is important, Silvera said, because predictions suggest metallic hydrogen could act as a superconductor at room temperatures.

"That would be revolutionary," he said. "As much as 15 percent of energy is lost to dissipation during transmission, so if you could make wires from this material and use them in the electrical grid, it could change that story."

Among the holy grails of physics, a room-temperature superconductor, Dias said, could radically change our transportation system, making magnetic levitation of high-speed trains possible, as well as ultra-efficient electric cars, and improving the performance of many electronic devices. It could also bring major improvements in energy production and storage: because superconductors have zero resistance, energy could be stored by maintaining currents in superconducting coils, then drawn on when needed.

In addition to transforming life on Earth, metallic hydrogen could also play a key role in helping humans explore the far reaches of space, as the most powerful rocket propellant yet discovered.


manned exploration of saturn via metallic hydrogen fuel


"It takes a tremendous amount of energy to make metallic hydrogen," Silvera explained. "And if you convert it back to molecular hydrogen, all that stored energy is released, so it would make it the most powerful rocket propellant known to man, and could revolutionise rocketry."

The most powerful fuels in use today have a "specific impulse" of 450 seconds – a measure of how efficiently an engine converts propellant into thrust. In other words, a typical chemical rocket engine can produce one pound of thrust from one pound of fuel for 450 seconds. By comparison, the specific impulse for metallic hydrogen is theorised to be 1,700 seconds.
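To see what a jump from 450 to 1,700 seconds would mean in practice, the standard Tsiolkovsky rocket equation (not discussed in the article) relates specific impulse to the total velocity change a rocket can achieve; the mass ratio below is purely illustrative:

```python
import math

G0 = 9.80665  # standard gravity, m/s^2


def delta_v(isp_seconds, mass_ratio):
    """Tsiolkovsky rocket equation: delta-v = Isp * g0 * ln(m0/mf)."""
    return isp_seconds * G0 * math.log(mass_ratio)


mass_ratio = 10  # illustrative wet-to-dry mass ratio
dv_chemical = delta_v(450, mass_ratio)   # best chemical propellants
dv_metallic = delta_v(1700, mass_ratio)  # theorised metallic hydrogen

print(f"chemical:   {dv_chemical / 1000:.1f} km/s")
print(f"metallic H: {dv_metallic / 1000:.1f} km/s")
print(f"ratio: {dv_metallic / dv_chemical:.2f}")  # 1700/450 ≈ 3.78
```

At any fixed mass ratio, delta-v scales linearly with specific impulse, so the theorised propellant would deliver nearly four times the velocity change of today's best chemical engines.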

"That would easily allow you to explore the outer planets," Silvera said. "We would be able to put rockets into orbit with only one stage, versus two, and could send up larger payloads. So it could be very important."

To create the new material, Silvera and Dias turned to one of the hardest materials on Earth – diamond. But rather than natural diamond, they used two small pieces of carefully polished synthetic diamond, which were treated to make them even tougher and then mounted opposite each other in a device known as a diamond anvil cell.


diamond anvil cell liquid metallic hydrogen


"Diamonds are polished with diamond powder, and that can gouge out carbon from the surface," Silvera said. "When we looked at the diamond using atomic force microscopy, we found defects, which could cause it to weaken and break."

The solution, he said, was to use a reactive ion etching process to shave a tiny layer – just five microns thick, or about one-tenth of a human hair – from the diamond's surface. The diamonds were then coated with a thin layer of alumina to prevent the hydrogen from diffusing into their crystal structure and embrittling them.

After more than four decades of work on metallic hydrogen, and nearly a century after it was first theorised, seeing the material for the first time, Silvera said, was thrilling.

"It was really exciting," he said. "Ranga was running the experiment, and we thought we might get there, but when he called me and said, 'The sample is shining,' I went running down there, and it was metallic hydrogen. I immediately said we have to make the measurements to confirm it, so we rearranged the lab... and that's what we did. It's a tremendous achievement – and even if it only exists in this diamond anvil cell at high pressure, it's a very fundamental and transformative discovery."








26th January 2017

Sci-fi holograms a step closer

Scientists at the Australian National University have invented a tiny device that creates the highest quality holographic images ever achieved, opening the door to 3D imaging technologies like those seen in Star Wars.

Lead researcher, Lei Wang, said the team created complex holographic images in infrared with the invention that could be developed with industry.

"As a child, I learned about the concept of holographic imaging from the Star Wars movies. It's really cool to be working on an invention that uses the principles of holography depicted in those movies," said Mr Wang, a PhD student at the ANU Research School of Physics and Engineering.

Holograms perform the most complex manipulations of light. They enable the storing and reproduction of all information carried by light in 3D. In contrast, standard photographs and computer monitors capture and display only a portion of 2D information.


science fiction hologram technology future timeline
Credit: Australian National University (ANU)


"While research in holography plays an important role in the development of futuristic displays and augmented reality devices, today we are working on many other applications such as ultra-thin and light-weight optical devices for cameras and satellites," said Wang.

Mr Wang explained that the device could replace bulky components to miniaturise cameras and save costs in astronomical missions by reducing the size and weight of optical systems on spacecraft. Co-lead researcher, Dr Sergey Kruk, said the device consisted of millions of tiny silicon "pillars", each up to 500 times thinner than a human hair.

"This new material is transparent, which means it loses minimal energy from the light, and it also does complex manipulations with light," said Dr Kruk from the ANU Research School of Physics and Engineering. "Our ability to structure materials at the nanoscale allows the device to achieve new optical properties that go beyond the properties of natural materials. The holograms we made demonstrate the strong potential of this technology to be used in a range of applications."








5th December 2016

Construction of practical quantum computers radically simplified

Scientists at the University of Sussex have invented a ground-breaking new method that puts the construction of large-scale quantum computers within reach of current technology.

Quantum computers could solve certain problems – that would take the fastest supercomputer millions of years to calculate – in just a few milliseconds. They have the potential to create new materials and medicines, as well as solve long-standing scientific and financial problems.

Universal quantum computers can be built in principle, but the technological challenges are tremendous: the engineering required to build one has been considered more difficult than manned space travel to Mars – until now.

Quantum computing experiments on a small scale using trapped ions (charged atoms) are carried out by aligning individual laser beams onto individual ions with each ion forming a quantum bit. However, a large-scale quantum computer would need billions of quantum bits, therefore requiring billions of precisely aligned lasers, one for each ion.

Instead, scientists at the University of Sussex have invented a simple method where voltages are applied to a quantum computer microchip (without having to align laser beams) – to the same effect. The team also succeeded in demonstrating the core building block of this new method with an impressively low error rate.


quantum computer future timeline
Credit: University of Sussex


"This development is a game changer for quantum computing making it accessible for industrial and government use," said Professor Winfried Hensinger, who heads the Ion Quantum Technology Group at the university and is director of the Sussex Centre for Quantum Technologies. "We will construct a large-scale quantum computer at Sussex making full use of this exciting new technology."

Quantum computers may revolutionise society in a similar way as the emergence of classical computers. "Developing this step-changing new technology has been a great adventure and it is absolutely amazing observing it actually work in the laboratory," said Hensinger's colleague, Dr Seb Weidt.

The Ion Quantum Technology Group forms part of the UK's National Quantum Technology Programme, a £270 million investment by the government to accelerate the introduction of quantum technologies into the marketplace.

A paper on this latest research, 'Trapped-ion quantum logic with global radiation fields', is published in the journal Physical Review Letters.


quantum computer future timeline
Professor Winfried Hensinger (left) and Dr Seb Weidt (right).






29th November 2016

The speed of light could be variable, say researchers

Scientists behind a theory that the speed of light is variable – and not constant as Einstein suggested – have produced a model with an exact figure on the spectral index, which they say is testable.


speed of light future timeline


Scientists behind a theory that the speed of light is variable – and not constant as Einstein suggested – have made a prediction that could be tested.

Einstein observed that the speed of light remains the same in any situation, and this meant that space and time could be different in different situations.

The assumption that the speed of light is fixed, and always has been, underpins many theories in physics, such as Einstein's theory of general relativity. It plays an especially important role in models of what happened during the very early universe, seconds after the Big Bang.

But some researchers have suggested that the speed of light could have been much higher in this early universe. Now, one of this theory's originators, Professor João Magueijo from Imperial College London, working with Dr Niayesh Afshordi at the Perimeter Institute in Canada, has made a prediction that could be used to test the theory's validity.

Large structures, such as galaxies, all formed from fluctuations in the early universe – tiny differences in density from one region to another. A record of these early fluctuations is imprinted on the cosmic microwave background – a map of the oldest light in the universe – in the form of a 'spectral index'.


cosmic microwave background future timeline


Working with their theory that the fluctuations were influenced by a varying speed of light in the early universe, Professor Magueijo and Dr Afshordi have now used a model to put an exact figure on the spectral index. The predicted figure and model it is based on are published this month in the peer-reviewed journal Physical Review D.

Cosmologists have been getting ever more precise readings of this figure, so the prediction could soon be tested – either confirming or ruling out the team's model of the early universe. Their figure is a very precise 0.96478. This is close to the current best estimate from measurements of the cosmic microwave background, which puts it at around 0.968, with some margin of error.
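A back-of-the-envelope comparison of the prediction with observation; the 1-sigma uncertainty below is an assumed, Planck-like figure, since the article only says "some margin of error":

```python
predicted_ns = 0.96478   # Magueijo & Afshordi's predicted spectral index
observed_ns = 0.968      # CMB estimate quoted in the article
observed_sigma = 0.006   # assumed 1-sigma uncertainty (Planck-like, our assumption)

deviation_sigma = abs(predicted_ns - observed_ns) / observed_sigma
print(f"prediction sits {deviation_sigma:.2f} sigma from the observed value")
# well within 1 sigma, so the model is currently consistent with observation
```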

"The theory, which we first proposed in the late-1990s, has now reached a maturity point – it has produced a testable prediction. If observations in the near future do find this number to be accurate, it could lead to a modification of Einstein's theory of gravity," explains Professor Magueijo. "The idea that the speed of light could be variable was radical when first proposed – but with a numerical prediction, it becomes something physicists can actually test. If true, it would mean that the laws of nature were not always the same as they are today."

The testability of the varying speed of light theory sets it apart from the more mainstream rival theory: inflation. Inflation says that the early universe went through an extremely rapid expansion phase, much faster than the current rate of expansion of the universe.


big bang universe future timeline
Credit: By Yinweichen, [CC BY-SA 3.0], via Wikimedia Commons


These theories are necessary to overcome what physicists call the 'horizon problem'. The universe as we see it today appears to be broadly the same everywhere. For example, it has a relatively homogeneous density.

This could only be true if all regions of the universe were able to influence each other. However, if the speed of light has always been the same, then not enough time has passed for light to have travelled to the edge of the universe, and 'even out' the energy.

As an analogy, to heat up a room evenly, the warm air from radiators at either end has to travel across the room and mix fully. The problem for the universe is that the 'room' – the observed size of the universe – appears to be too large for this to have happened in the time since it was formed.

The varying speed of light theory suggests that the speed of light was much higher in the early universe, allowing the distant edges to be connected as the universe expanded. The speed of light would have then dropped in a predictable way as the density of the universe changed. This variability led the team to their prediction published this month.

The alternative theory is inflation, which attempts to solve this problem by saying that the very early universe "evened out" while incredibly small, and then suddenly expanded, with the uniformity already imprinted on it. While this means the speed of light and the other laws of physics as we know them are preserved, it requires the invention of an 'inflation field' – a set of conditions that only existed at the time.





3rd November 2016

1,000-fold increase in 3-D scanning speed

Researchers at Penn State University report a 1,000-fold increase in the scanning speed for 3-D printing, using a space-charge-controlled KTN beam deflector with a large electro-optic effect.


3d printer scanner future timeline


A major technological advance in the field of high-speed beam-scanning devices has resulted in a speed boost of up to 1000 times, according to researchers in Penn State's College of Engineering. Using a space-charge-controlled KTN beam deflector – a kind of crystal made of potassium tantalate and potassium niobate – with a large electro-optic effect, researchers have found that scanning at a much higher speed is possible.

"When an electric field is applied to the crystal materials, they generate uniform refracting distributions that can deflect an incoming light beam," said Professor Shizhuo Yin, from the School of Electrical Engineering and Computer Science. "We conducted a systematic study of the limits on speed, and found that the electric field-induced phase transition is one of the limiting factors."

To overcome this issue, Yin and his team eliminated the electric field-induced phase transition in a nanodisordered KTN crystal by operating it at a higher temperature. They went not only beyond the Curie temperature (above which the crystal's ordered phase disappears), but also beyond the critical end point (at which a liquid and its vapour become indistinguishable).


3d printer scanner future timeline

Credit: Penn State


This increased the scanning speed from the microsecond range to the nanosecond range, and led to improved high-speed imaging, broadband optical communications and ultrafast laser display and printing. The researchers believe this could lead to a new generation of 3-D printers, with objects that once took an hour to print now taking a matter of seconds.
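A rough sketch of what the thousand-fold gain implies for the hour-long print mentioned above, under the (hypothetical) assumption that beam scanning dominates the total print time:

```python
# Microsecond-range scanning replaced by nanosecond-range scanning: a
# factor-of-1000 speedup (1 microsecond = 1000 nanoseconds).
speedup = 1000
old_print_time_s = 3600  # an object that previously took an hour to print
new_print_time_s = old_print_time_s / speedup
print(f"{new_print_time_s:.1f} s")  # 3.6 s – "a matter of seconds"
```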

Yin said technology like this would be especially useful in the medical industry, where it would make high-speed imaging possible in real time. For example, eye surgeons who use a non-invasive test that takes light-wave cross-section pictures of a person's retina would be able to view a 3-D image of the retina during surgery, letting them see what needs to be corrected as the procedure takes place.

The group's findings are published in the journal Nature Scientific Reports.





29th October 2016

New world record for fusion reactor

Massachusetts Institute of Technology has announced a new record for plasma pressure in an Alcator C-Mod tokamak nuclear fusion reactor – achieving over two atmospheres of pressure for the first time.


nuclear fusion reactor future timeline
Credit: Bob Mumgaard/Plasma Science and Fusion Center


Scientists and engineers from the Plasma Science and Fusion Center at the Massachusetts Institute of Technology (MIT) have made a leap forward in the pursuit of clean energy. They report a new record for plasma pressure in the Institute's Alcator C-Mod tokamak nuclear fusion reactor, pictured above. Plasma pressure is the key ingredient to producing energy from nuclear fusion, and MIT's new result achieves over two atmospheres of pressure for the first time. Senior researcher, Earl Marmar, presented the results at the IAEA Fusion Energy Conference, in Kyoto, Japan, which ran from 17–22 October.

Nuclear fusion has the potential to produce nearly unlimited supplies of clean, safe, carbon-free energy. Fusion is the same process that powers the Sun, and it can be realised in reactors that simulate the conditions of ultrahot miniature "stars" of plasma – superheated gas – that are contained within a magnetic field.

For over half a century, it has been known that to make fusion viable on Earth, the plasma must be very hot (more than 50 million degrees), must be stable under intense pressure, and must be contained in a fixed volume. Successful fusion also requires that the product of three factors – a plasma's particle density, confinement time, and temperature – reaches a certain value, known as the "triple product". Above this value, the energy released from a reactor exceeds the energy required to keep the reaction going.
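The triple-product criterion can be sketched numerically. The ignition threshold and plasma parameters below are rough, commonly quoted illustrative values for deuterium-tritium fuel, not figures from the article:

```python
# Illustrative Lawson triple-product check for a hypothetical D-T plasma.
# ~3e21 keV·s/m^3 is a commonly quoted rough ignition threshold (assumption).
IGNITION_TRIPLE_PRODUCT = 3e21  # keV · s / m^3


def triple_product(density_m3, temperature_kev, confinement_s):
    """Product of particle density, temperature and confinement time."""
    return density_m3 * temperature_kev * confinement_s


# Hypothetical reactor-grade numbers (not from the article):
tp = triple_product(1e20, 15, 3)  # 1e20 /m^3, 15 keV (~170 million °C), 3 s
print(f"triple product: {tp:.1e} keV·s/m^3")
print("ignition-class" if tp >= IGNITION_TRIPLE_PRODUCT else "below ignition")
```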

Pressure, which is the product of density and temperature, accounts for about two-thirds of the challenge. The amount of power produced increases with the square of the pressure – so doubling the pressure leads to a fourfold increase in energy production.

During the 23 years Alcator C-Mod has been in operation, it has repeatedly advanced the record for plasma pressure in a magnetic confinement device. The previous record of 1.77 atmospheres was set in 2005 (also at Alcator C-Mod). While setting the new record of 2.05 atmospheres, a 15% improvement, the temperature inside Alcator C-Mod reached over 35 million degrees Celsius – twice as hot as the centre of the Sun. The plasma produced 300 trillion fusion reactions per second and had a central magnetic field strength of 5.7 tesla. It carried 1.4 million amps of electrical current and was heated with over 4 million watts of power. The reaction occurred in a volume of approximately 1 cubic metre (not much larger than a coat closet), and the plasma lasted for two full seconds.
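Applying the quadratic pressure scaling described earlier to the record figures gives a feel for what the pressure gain buys in fusion power:

```python
# Fusion power scales roughly with the square of plasma pressure, as the
# article states; the pressures are Alcator C-Mod's record figures.
previous_atm = 1.77  # 2005 record
new_atm = 2.05       # new record

pressure_gain = new_atm / previous_atm
power_gain = pressure_gain ** 2
print(f"pressure up {100 * (pressure_gain - 1):.1f}%")     # ≈15.8%
print(f"fusion power up {100 * (power_gain - 1):.1f}%")    # ≈34.1%
```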




Other fusion experiments conducted in reactors similar to Alcator have reached these temperatures before, but at pressures closer to 1 atmosphere; MIT's result exceeded the next highest pressure achieved in non-Alcator devices by 70 percent.

"This is a remarkable achievement that highlights the highly successful Alcator C-Mod program at MIT," says Dale Meade, former deputy director at the Princeton Plasma Physics Laboratory, who was not directly involved in the experiments. "The record plasma pressure validates the high-magnetic-field approach as an attractive path to practical fusion energy."

"This result confirms that the high pressures required for burning plasma can be best achieved with high-magnetic-field tokamaks such as Alcator C-Mod," says Riccardo Betti, Professor of Mechanical Engineering and Physics and Astronomy at the University of Rochester.

Alcator C-Mod is the world's only compact, high-magnetic-field fusion reactor with advanced shaping in a design called a tokamak, which confines the superheated plasma in a doughnut-shaped chamber. Its high-intensity magnetic field – up to eight tesla, or 160,000 times the Earth's magnetic field – allows the device to create the dense, hot plasmas and keep them stable at such incredibly high temperatures. Its magnetic field is more than double what is typically used in other reactor designs, which quadruples its ability to contain plasma pressure.

Unfortunately, while Alcator C-Mod's contributions to the advancement of fusion energy have been significant, the facility has now been officially closed following this latest experiment. In 2012, the Department of Energy (DOE) decided to cease funding, due to budget pressures from the construction of the ITER project, which is due to be switched on in 2022. Following that decision, Congress restored funding for a few more years – but that funding has now ended.

C-Mod was third in the line of high-magnetic-field tokamaks built and operated at MIT. Unless a new device is announced and constructed, the pressure record just set in C-Mod will likely stand for the next 15 years. ITER will be approximately 800 times larger in volume than Alcator C-Mod, but will operate at a lower magnetic field. It is expected to reach 2.6 atmospheres when it attains full operation by 2032.


nuclear fusion power future timeline
The Alcator C-Mod team celebrates the record setting plasma discharge on its last day of operation.
Credit: Jim Irby/Plasma Science and Fusion Center






20th October 2016

Quantum computers: 10-fold boost in stability achieved

A team at Australia's University of New South Wales has created a new quantum bit that remains in a stable superposition for 10 times longer than previously achieved.


quantum computers stability breakthrough future timeline
Credit: Arne Laucht/UNSW


Australian engineers have created a new quantum bit which remains in a stable superposition for 10 times longer than previously achieved, dramatically expanding the time during which calculations could be performed in a silicon quantum computer.

The new quantum bit, which consists of the spin of a single atom in silicon merged with an electromagnetic field – known as a 'dressed qubit' – retains quantum information for much longer than an 'undressed' atom, opening up new avenues to build and operate the superpowerful quantum computers of the future.

"We have created a new quantum bit where the spin of a single electron is merged together with a strong electromagnetic field," comments Arne Laucht from the School of Electrical Engineering & Telecommunications at University of New South Wales (UNSW), lead author of the paper. "This quantum bit is more versatile and more long-lived than the electron alone, and will allow us to build more reliable quantum computers."

Building a quantum computer is a difficult and ambitious challenge, but it has the potential to deliver revolutionary tools for otherwise impossible calculations – such as the design of complex drugs and advanced materials, or the rapid search of massive, unsorted databases. Its speed and power lie in the fact that quantum systems can host multiple 'superpositions' of different initial states, which in a computer are treated as inputs that, in turn, all get processed at the same time.

"The greatest hurdle in using quantum objects for computing is to preserve their delicate superpositions long enough to allow us to perform useful calculations," said Andrea Morello, Program Manager in the Centre for Quantum Computation & Communication Technology at UNSW. "Our decade-long research program had already established the most long-lived quantum bit in the solid state, by encoding quantum information in the spin of a single phosphorus atom inside a silicon chip placed in a static magnetic field," he said.

What Laucht and colleagues did was push this further: "We have now implemented a new way to encode the information: we have subjected the atom to a very strong, continuously oscillating electromagnetic field at microwave frequencies, and thus we have 'redefined' the quantum bit as the orientation of the spin with respect to the microwave field."


quantum computers stability breakthrough future timeline
Tuning gates (red), microwave antenna (blue), and single electron transistor used for spin readout (yellow).
Credit: Guilherme Tosi & Arne Laucht/UNSW


The results are striking: since the electromagnetic field steadily oscillates at a very high frequency, any noise or disturbance at a different frequency results in a zero net effect. The UNSW researchers achieved an improvement by a factor of 10 in the time span during which a quantum superposition can be preserved, with a dephasing time of T2*=2.4 milliseconds.
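One way to picture the benefit of the longer dephasing time is with an assumed Gaussian decay envelope, exp(-(t/T2*)²) – a common model for dephasing, not something stated in the article. The 'undressed' baseline below is simply taken as ten times shorter, per the reported factor-of-10 improvement:

```python
import math


def coherence(t_ms, t2_star_ms):
    """Assumed Gaussian dephasing envelope: exp(-(t/T2*)^2)."""
    return math.exp(-((t_ms / t2_star_ms) ** 2))


t2_dressed = 2.4    # ms, reported dephasing time of the dressed qubit
t2_undressed = 0.24  # ms, hypothetical 10x-shorter baseline

t = 0.5  # ms, an illustrative duration for a sequence of quantum operations
print(f"undressed coherence remaining: {coherence(t, t2_undressed):.3f}")
print(f"dressed coherence remaining:   {coherence(t, t2_dressed):.3f}")
```

Under this model, a computation lasting half a millisecond would retain almost all of its coherence on the dressed qubit while the shorter-lived baseline would have almost none left.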

"This new 'dressed qubit' can be controlled in a variety of ways that would be impractical with an 'undressed qubit'," adds Morello. "For example, it can be controlled by simply modulating the frequency of the microwave field, just like an FM radio. The 'undressed qubit' instead requires turning the amplitude of the control fields on and off, like an AM radio. In some sense, this is why the dressed qubit is more immune to noise: the quantum information is controlled by the frequency, which is rock-solid, whereas the amplitude can be more easily affected by external noise."

Since the device is built upon standard silicon technology, this result paves the way for the construction of powerful and reliable quantum processors based on the same fabrication processes already used for today's computers. The UNSW team leads the world in developing silicon quantum computing, and Morello's team is part of a consortium that has struck an A$70 million deal between UNSW, researchers, business and the Australian government to develop a prototype silicon quantum integrated circuit – a major step towards building the world's first quantum computer in silicon.

A functional quantum computer would allow massive increases in speed and efficiency for certain computing tasks – even when compared with today's fastest silicon-based 'classical' computers. In a number of key areas – such as searching enormous databases, solving complicated sets of equations, and modelling atomic systems such as biological molecules or drugs – they would far surpass today's computers. They would also be extremely useful in the finance and healthcare industries, and for government, security and defence organisations.

Quantum computers could identify and develop new medicines by vastly accelerating the computer-aided design of pharmaceutical compounds (minimising lengthy trial and error testing), and develop new, lighter and stronger materials spanning consumer electronics to aircraft. They would also make possible new types of computing applications and solutions that are beyond our ability to foresee.

The UNSW study appears this week in the peer-reviewed journal, Nature Nanotechnology.








30th July 2016

Vortex laser offers hope for Moore's Law

A new laser that travels in a corkscrew pattern is shown to carry ten times or more the information of conventional lasers, potentially offering a way to extend Moore's Law.


moores law vortex laser


Like a whirlpool, a new light-based communication tool carries data in swift, circular motions. This optics advancement could become a central component of the next generation of computers, designed to handle society's growing demand for information sharing. It may also help to ease the concerns of those worried about the predicted end of Moore's Law – the observation that the number of transistors on a chip doubles roughly every two years, making computers ever smaller, faster and cheaper.

"To transfer more data while using less energy, we need to rethink what's inside these machines," says Liang Feng, PhD, assistant professor in the Department of Electrical Engineering at the University at Buffalo's (UB) School of Engineering and Applied Sciences.

For decades, researchers have been able to cram exponentially increasing numbers of components onto silicon-based chips. Their success explains why a typical handheld smartphone has more computing power than the world's most powerful computers of the 1980s, which cost millions in today's dollars and were the size of a large filing cabinet.

But researchers are approaching a bottleneck, in which existing technology may no longer meet society's demand for data. Predictions vary, but many suggest this could happen within the next five years. This problem is being addressed in numerous ways, including optical communications, which use light to carry information. Examples of optical communications vary from old lighthouses to modern fibre optic cables used to watch television and browse the web. Lasers are a key part of today's optical communication systems and researchers have been manipulating them in various ways, most commonly by funnelling different signals into one path, to pack more information together. But these techniques are also reaching their limits.


moores law vortex laser


The UB-led research team is pushing laser technology forward using another light-control method, known as orbital angular momentum. This distributes the laser in a corkscrew pattern with a vortex at the centre, as pictured above. Such vortex lasers are usually too large to work in today's computers, but the team was able to shrink theirs to the point where it is compatible with modern chips. Because the laser beam travels in a corkscrew pattern, encoding information into different vortex twists, it can deliver at least 10 times the information of conventional lasers, which move linearly.
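A sketch of why vortex twists add capacity: if the laser can be prepared in any of N distinguishable orbital angular momentum states, each symbol can carry log2(N) additional bits on top of conventional encoding channels. The state counts below are illustrative, not figures from the study:

```python
import math

# Extra information capacity from N distinguishable vortex-twist (OAM)
# states: log2(N) bits per symbol (illustrative state counts).
for n_states in (2, 4, 10, 16):
    print(f"{n_states:2d} OAM states -> {math.log2(n_states):.2f} extra bits/symbol")
```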

However, the vortex laser is just one component of many – such as advanced transmitters and receivers – which will ultimately be needed to continue building more powerful computers and data centres in the future.

The study was published yesterday in the peer-reviewed journal Science. The research was supported with grants from the U.S. Army Research Office, the U.S. Department of Energy and National Science Foundation.





12th June 2016

Four new additions to the periodic table

The International Union of Pure and Applied Chemistry (IUPAC) has proposed the final names of four new additions to the periodic table.

Following earlier reports that the claims for discovery of these elements have been fulfilled, the discoverers have been invited to propose names and they are now disclosed for public review:

• Nihonium and symbol Nh, for the element 113
• Moscovium and symbol Mc, for the element 115
• Tennessine and symbol Ts, for the element 117
• Oganesson and symbol Og, for the element 118


periodic table 2016 update
By Sandbh (Own work) [CC BY-SA 4.0]


The IUPAC Inorganic Chemistry Division has reviewed and considered these proposals and recommends them for acceptance. A five-month public review is now being held, expiring on 8th November 2016, prior to formal approval by the IUPAC Council.

The guidelines for naming elements were recently revised and shared with discoverers to assist in their proposals. Keeping with tradition, a newly discovered element can be named after:
(a) a mythological concept or character (including an astronomical object),
(b) a mineral or similar substance,
(c) a place, or geographical region,
(d) a property of the element, or
(e) a scientist.

Nihonium, with atomic number 113, is a synthetic element (one that can be created in a laboratory, but is not found in nature) and is extremely radioactive; its most stable known isotope, nihonium-286, has a half-life of just 20 seconds. The name comes from one of the pronunciations of the Japanese word for Japan (nihon), which literally means "the Land of the Rising Sun". The research team hopes that pride and faith in science will help restore the trust of those who suffered from the 2011 Fukushima nuclear disaster.

Moscovium, with atomic number 115, recognises the Moscow region and honours the ancient Russian land that is home to the Joint Institute for Nuclear Research, where the discovery experiments were conducted using the Dubna Gas-Filled Recoil Separator, in combination with the heavy ion accelerator capabilities of the Flerov Laboratory of Nuclear Reactions. Like nihonium, it is extremely radioactive; its most stable isotope has a half-life of only 220 milliseconds. About 100 atoms of moscovium have been observed to date.

Tennessine, with atomic number 117, recognises the contribution of the Tennessee region, including Oak Ridge National Laboratory, Vanderbilt University, and the University of Tennessee at Knoxville, to superheavy element research, including the production and chemical separation of unique actinide target materials for superheavy element synthesis at the High Flux Isotope Reactor (HFIR) and Radiochemical Engineering Development Centre.

Oganesson, with atomic number 118, was discovered by teams at the Joint Institute for Nuclear Research, Dubna (Russia) and Lawrence Livermore National Laboratory (USA). The name is in line with the tradition of honouring a scientist and recognises Professor Yuri Oganessian (born 1933) who played a leading role in discovering the heaviest elements of the periodic table, made significant advances in the nuclear physics of superheavy nuclei and produced experimental evidence for the "island of stability". Oganesson has the highest atomic number and mass of all known elements. It is extremely unstable, due to its high mass, and since 2005, only three or possibly four atoms of the isotope oganesson-294 have been detected.
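The half-lives quoted for these elements translate directly into how quickly a sample disappears. A simple sketch of the standard decay formula (illustrative only):

```python
# Surviving fraction of a radioactive sample after time t,
# given its half-life T (same time units for both).
def surviving_fraction(t, T):
    return 0.5 ** (t / T)

# Nihonium's most stable isotope (half-life ~20 s): after one minute,
# only one-eighth of the atoms remain.
print(surviving_fraction(60, 20))     # 0.125

# Moscovium's most stable isotope (half-life ~220 ms)
# is virtually gone within a couple of seconds.
print(surviving_fraction(2.2, 0.22))  # 0.5 ** 10 ≈ 0.001
```

Such fleeting lifetimes are why only handfuls of atoms of these elements have ever been observed.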

"It is a pleasure to see that specific places and names (country, state, city, and scientist) related to the new elements are recognised in these four names. Although these choices may perhaps be viewed by some as slightly self-indulgent, the names are completely in accordance with IUPAC rules", commented Jan Reedijk, who corresponded with each team and invited the discoverers to make proposals. "In fact, I see it as thrilling to recognise that international collaborations were at the core of these discoveries and that these new names also make the discoveries somewhat tangible."

Ultimately, and after the lapse of the public review in November, the final recommendations will be published in the journal Pure and Applied Chemistry.





12th February 2016

Gravitational waves detected for the first time

In a historic scientific landmark, researchers have announced the first detection of gravitational waves, as predicted by Einstein's general theory of relativity 100 years ago. This major discovery opens a new era of astronomy.


gravitational waves 2016 science black holes
Credits: R. Hurt/Caltech-JPL


For the first time, scientists have directly observed "ripples" in the fabric of spacetime called gravitational waves, arriving at the Earth from a cataclysmic event in the distant universe. This confirms a major prediction of Einstein’s 1915 general theory of relativity and opens an unprecedented new window onto the cosmos.

The observation was made at 09:50:45 GMT on 14th September 2015, when two black holes collided. However, given the enormous distance involved and the time required for light to reach us, this event actually occurred some 1.3 billion years ago, during the mid-Proterozoic Eon. For context, this is so far back that multicellular life here on Earth was only just beginning to spread. The signal came from the Southern Celestial Hemisphere, in the rough direction of (but much further away than) the Magellanic Clouds.

The two black holes were spinning together as a binary pair, orbiting each other several tens of times a second, until they eventually collided at half the speed of light. These objects were 36 and 29 times the mass of our Sun. As their event horizons merged, they became one – like two soap bubbles in a bath. During the fraction of a second in which this happened, three solar masses were converted to gravitational waves, and for a brief instant the event hit a peak power output 50 times that of the entire visible universe.
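A back-of-the-envelope check of those figures, using E = mc² (the 20 ms merger duration below is an assumed round number for illustration, not a value from the study):

```python
# Energy radiated as gravitational waves when ~3 solar masses
# were converted via E = m * c^2.
M_SUN = 1.989e30   # kg, solar mass
C = 2.998e8        # m/s, speed of light

energy = 3 * M_SUN * C**2    # joules
print(f"{energy:.2e} J")     # ~5.4e47 J

# Spread over an assumed ~20 ms for the final merger,
# the average power output is staggering.
power = energy / 0.02
print(f"{power:.2e} W")      # ~2.7e49 W
```

For comparison, the total luminosity of all the stars in the observable universe is many orders of magnitude smaller, which is why the article can describe the peak output as briefly outshining everything else combined.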


gravitational waves 2016 science black holes


The gravitational waves were detected by both of the twin Laser Interferometer Gravitational-wave Observatory (LIGO) detectors, located in Livingston, Louisiana, and Hanford, Washington, USA. The LIGO Observatories are funded by the National Science Foundation (NSF), and were conceived, built, and are operated by Caltech and MIT. The discovery was published yesterday in the journal Physical Review Letters.

Prof. Stephen Hawking told BBC News: "Gravitational waves provide a completely new way of looking at the Universe. The ability to detect them has the potential to revolutionise astronomy. This discovery is the first detection of a black hole binary system and the first observation of black holes merging. Apart from testing General Relativity, we could hope to see black holes through the history of the Universe. We may even see relics of the very early Universe during the Big Bang at some of the most extreme energies possible."

"There is a Nobel Prize in it – there is no doubt," said Prof. Karsten Danzmann, from the Max Planck Institute for Gravitational Physics and Leibniz University in Hannover, Germany, who collaborated on the study. In an interview with the BBC, he claimed the significance of this discovery is on a par with the determination of the structure of DNA.

"It is the first ever direct detection of gravitational waves; it's the first ever direct detection of black holes and it is a confirmation of General Relativity because the property of these black holes agrees exactly with what Einstein predicted almost exactly 100 years ago."

"We found a beautiful signature of the merger of two black holes and it agrees exactly – fantastically – with the numerical solutions to Einstein equations ... it looked too beautiful to be true."



gravitational waves 2016 science black holes

LIGO measurement of gravitational waves at the Hanford (left) and Livingston (right) detectors, compared to the theoretical predicted values.
By Abbott et al. [CC BY 3.0]


"Scientists have been looking for gravitational waves for decades – but we’ve only now been able to achieve the incredibly precise technologies needed to pick up these very, very faint echoes from across the universe," said Danzmann. "This discovery would not have been possible without the efforts and the technologies developed by the Max Planck, Leibniz Universität, and UK scientists working in the GEO collaboration."

Researchers at the LIGO Observatories were able to measure tiny and subtle disturbances the waves made to space and time as they passed through the Earth, with machines detecting changes just fractions of the width of an atom. At each observatory, the two-and-a-half-mile (4 km) long L-shaped LIGO interferometer uses laser light split into two beams that travel back and forth along tubes kept at a near-perfect vacuum. The beams are used to monitor the distance between mirrors precisely positioned at the ends of the arms. According to Einstein's theory, the distance between the mirrors will change by an infinitesimal amount when gravitational waves pass by the detector. A change in the lengths of the arms smaller than one-ten-thousandth the diameter of a proton can be detected – equivalent to detecting a change the width of a human hair in a distance of over three light years.
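The "strain" (fractional change in arm length) implied by those figures can be checked with simple arithmetic; the proton diameter used here is an approximate textbook value:

```python
# Strain implied by an arm-length change of one-ten-thousandth
# of a proton diameter over a 4 km interferometer arm.
PROTON_DIAMETER = 1.7e-15   # m (approximate)
ARM_LENGTH = 4000.0         # m

delta_L = 1e-4 * PROTON_DIAMETER   # change in arm length, metres
strain = delta_L / ARM_LENGTH      # dimensionless strain
print(f"{strain:.1e}")             # a few times 1e-23
```

A dimensionless strain of order 10⁻²² to 10⁻²³ is why gravitational-wave detection demanded some of the most precise measurement technology ever built.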

"The Advanced LIGO detectors are a tour de force of science and technology, made possible by a truly exceptional international team of technicians, engineers, and scientists," says David Shoemaker of MIT. "We are very proud that we finished this NSF-funded project on time and on budget."

"We spent years modelling the gravitational-wave emission from one of the most extreme events in the universe: pairs of massive black holes orbiting with each other and then merging. And that’s exactly the kind of signal we detected!" says Prof. Alessandra Buonanno, director at the Max Planck Institute for Gravitational Physics in Potsdam.

"With this discovery, we humans are embarking on a marvellous new quest: the quest to explore the warped side of the universe – objects and phenomena that are made from warped spacetime," says Kip Thorne, Feynman Professor of Theoretical Physics at Caltech. "Colliding black holes and gravitational waves are our first beautiful examples."

Advanced LIGO is among the most sensitive instruments ever built. During its next observing stage, it is expected to detect five more black hole mergers, and around 40 binary star mergers each year, in addition to an unknown number of more exotic gravitational wave sources, some of which may not be anticipated by current theory.







29th January 2016

New research challenges long-held views on time evolution

Research into the nature of time by Griffith University's Centre for Quantum Dynamics shows how an asymmetry for time reversal might be responsible for making the universe move forwards in time.


time travel tunnel


New research from Griffith University's Centre for Quantum Dynamics is broadening perspectives on time and space. In a study published by the journal Proceedings of the Royal Society A, Associate Professor Joan Vaccaro challenges the long-held assumption that time evolution – the incessant unfolding of the universe over time – is an elemental part of Nature. In the paper, titled Quantum asymmetry between time and space, she suggests there may be a deeper origin due to a difference between the two directions of time: to the future and to the past.

"If you want to know where the universe came from and where it's going, you need to know about time," she says. "Experiments on subatomic particles over the past 50 years show that Nature doesn't treat both directions of time equally.

"In particular, subatomic particles called K and B mesons behave slightly differently, depending on the direction of time. When this subtle behaviour is included in a model of the universe, what we see is the universe changing from being fixed at one moment in time to continuously evolving.

"In other words, the subtle behaviour appears to be responsible for making the universe move forwards in time. Understanding how time evolution comes about in this way opens up a whole new view on the fundamental nature of time itself. It may even help us to better understand bizarre ideas such as travelling back in time."

According to her research, an asymmetry exists between time and space in the sense that physical systems inevitably evolve over time, whereas there is no corresponding ubiquitous translation over space. This asymmetry, long presumed to be elemental, is represented by equations of motion and conservation laws that operate differently over time and space.

However, Associate Professor Vaccaro used a "sum-over-paths formalism" to demonstrate the possibility of a time and space symmetry, meaning the conventional view of time evolution would need to be revisited.

"In the connection between time and space, space is easier to understand because it's simply there. But time is forever forcing us towards the future," says Vaccaro. "Yet while we are indeed moving forward in time, there is also always some movement backwards – a kind of jiggling effect – and it is this movement I want to measure using these K and B mesons."

Associate Professor Vaccaro says the research provides a solution to the origin of dynamics, an issue that has long perplexed science.





11th December 2015

Fusion reactor begins testing in Germany

The first helium plasma test has been successfully conducted at the Wendelstein 7-X fusion device in northeastern Germany. Tests with hydrogen plasma will begin in 2016.


wendelstein 7-x fusion device ignition 2015 future timeline


The first helium plasma was produced yesterday in the Wendelstein 7-X fusion device at the Max Planck Institute for Plasma Physics (IPP) in Greifswald, northeastern Germany. Following more than a year of technical preparations and tests, experimental operation has now commenced according to plan. Wendelstein 7-X, the world's largest stellarator-type fusion device, will investigate the suitability of this type of device for a commercial power station.

After nine years of construction work and over a million assembly hours, the Wendelstein 7-X was completed in April 2014. Operational preparations have been underway ever since. Each technical system was tested in turn: the vacuum in the vessels, the cooling system, the superconducting coils and the magnetic field they produce, the control system, as well as the heating devices and measuring instruments.

On 10th December 2015, the day had arrived: the operating team in the control room started up the magnetic field and initiated the computer-operated experiment control system. This fed around one milligram of helium gas into the evacuated plasma vessel, switched on the microwave heating for a short pulse of 1.3 megawatts – and the first plasma was observed by the installed cameras and measuring devices. The exact moment of ignition was captured in this video.

“We’re starting with a plasma produced from the noble gas helium,” explains project leader, Professor Thomas Klinger: “We’re not changing over to the actual investigation object, a hydrogen plasma, until next year. This is because it’s easier to achieve the plasma state with helium. In addition, we can clean the surface of the plasma vessel with helium plasmas.”

The first plasma in the machine had a duration of one tenth of a second and achieved a temperature of around one million ºC. “We’re very satisfied”, concludes Dr. Hans-Stephan Bosch, whose division is responsible for the operation. “Everything went according to plan.” The next task will be to extend the duration of the plasma discharges and to investigate the best method of producing and heating helium plasmas using microwaves. After a break for New Year, the confinement studies will continue in January, which will prepare the way for producing the first plasma from hydrogen.
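Some rough numbers behind that first pulse (an illustrative order-of-magnitude sketch, assuming the whole milligram of helium was heated uniformly, and ignoring electrons and losses):

```python
# Order-of-magnitude numbers for the first W7-X helium plasma.
AVOGADRO = 6.022e23      # atoms per mole
K_BOLTZMANN = 1.381e-23  # J/K
HELIUM_MOLAR = 0.004     # kg/mol

mass = 1e-6  # kg: the ~1 milligram of helium gas fed into the vessel
atoms = mass / HELIUM_MOLAR * AVOGADRO
print(f"{atoms:.1e} atoms")        # ~1.5e20

# Thermal energy of those atoms at around one million kelvin
# (3/2 kT per particle).
T = 1e6
thermal_energy = 1.5 * atoms * K_BOLTZMANN * T
print(f"{thermal_energy:.1e} J")   # a few kilojoules

# For comparison, a 1.3 MW microwave pulse delivers 130 kJ in 0.1 s,
# so there is ample heating power even with substantial losses.
```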


wendelstein 7-x fusion device ignition 2015 future timeline
The Wendelstein 7-X fusion device. Photo: IPP, Thorsten Bräuer


The Wendelstein 7-X is the largest fusion device created using the "stellarator" concept, which refers to the possibility of harnessing the power source of the Sun, a stellar object. It is planned to operate with up to 30 minutes of continuous plasma discharge, demonstrating an essential feature of a future power plant: continuous operation. By contrast, tokamaks such as ITER can only operate in pulses without auxiliary equipment.

The Wendelstein 7-X is based on a five field-period Helias configuration. It is mainly a toroid – consisting of 50 non-planar and 20 planar superconducting magnetic coils, 3.5 m high – which induce a magnetic field that prevents the plasma from colliding with the reactor walls. The 20 planar coils are used for adjusting the magnetic field. It aims for a plasma temperature of 60 to 130 million K.

Stellarators were popular in the 1950s and 60s, but the much better results from tokamak designs led to them falling from favour in the 1970s. Wendelstein 7-X, however, aims to put the quality of the plasma equilibrium and confinement on a par with that of a tokamak for the very first time, potentially offering a new pathway to reliable fusion power.


  wendelstein 7-x fusion device ignition 2015 future timeline
Scheme of coil system (blue) and plasma (yellow) of the Wendelstein 7-X. A magnetic field line is highlighted in green on the plasma surface shown in yellow. Credit: Max Planck Institute for Plasma Physics [CC BY 3.0]





4th December 2015

1,000-fold increase in 3-D imaging resolution

A new system developed by MIT can increase the resolution of conventional 3-D imaging devices by 1,000 times.


1000 times higher resolution 3d imaging


Researchers at the Massachusetts Institute of Technology (MIT) have shown that by exploiting the polarisation of light – the physical phenomenon behind polarised sunglasses and most 3-D movie systems – they can increase the resolution of conventional 3-D imaging devices by up to 1,000 times. This technique could lead to high-quality 3-D cameras built into smartphones, or the ability to snap photos of objects and then use 3-D printing to produce accurate replicas. Further out, the work may also improve the ability of driverless cars to see in rain, snow and other reduced-visibility conditions.

"Today, they can miniaturise 3-D cameras to fit on cellphones," says Achuta Kadambi, a PhD student in the MIT Media Lab and one of the system's developers. "But they make compromises to the 3-D sensing, leading to very coarse recovery of geometry. That's a natural application for polarisation, because you can still use a low-quality sensor, and adding a polarising filter gives you something that's better than many machine-shop laser scanners."

The researchers have described their new system – which they call Polarised 3D – in a paper to be presented at the International Conference on Computer Vision later this month.

Their experimental setup consisted of a Microsoft Kinect – which gauges depth using reflection time – combined with an ordinary polarising photographic lens placed in front of its camera. In each experiment, they took three photos of an object, rotating the polarising filter each time, and their algorithms compared the light intensities of the resulting images.
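The idea behind comparing intensities at several polariser angles can be sketched in closed form: each pixel's brightness varies sinusoidally with twice the polariser angle, and three samples are enough to solve for the azimuth of polarisation, a cue to surface orientation. This is a minimal illustration of that standard model, not the team's actual algorithm:

```python
import math

def polarisation_azimuth(i0, i45, i90):
    """Recover the azimuth of polarisation (radians) from three intensity
    measurements with the polariser at 0, 45 and 90 degrees, assuming the
    standard model I(phi) = A + B * cos(2*phi - 2*theta)."""
    return 0.5 * math.atan2(2 * i45 - i0 - i90, i0 - i90)

# Synthetic pixel: unpolarised offset A, polarised amplitude B,
# and a true azimuth of 0.3 radians.
A, B, theta = 1.0, 0.4, 0.3
i0  = A + B * math.cos(2 * 0           - 2 * theta)
i45 = A + B * math.cos(2 * math.pi / 4 - 2 * theta)
i90 = A + B * math.cos(2 * math.pi / 2 - 2 * theta)

print(polarisation_azimuth(i0, i45, i90))   # ≈ 0.3
```

Per pixel, this azimuth constrains how the surface is tilted; combining it with the Kinect's coarse depth map is what lets the hybrid system sharpen the geometry so dramatically.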


1000 times higher resolution 3d imaging


On its own, at a distance of several metres, the Kinect can resolve physical features as small as a centimetre or so across. But with the addition of the polarisation information, the hybrid system was able to resolve features in the range of tens of micrometres: one-thousandth the size. For comparison, they also imaged several of their test objects with a high-precision laser scanner, which requires that the object be inserted into the scanner bed. Polarised 3D still offered the higher resolution.

A mechanically rotated polarisation filter would probably be impractical in a cellphone camera, but grids of tiny polarisation filters that can overlay individual pixels in a light sensor would work. The paper also offers the tantalising prospect that polarisation systems may help in the development of self-driving cars. Experimental self-driving vehicles of today are reliable under normal illumination conditions – but their vision algorithms go haywire in rain, snow, or fog, due to water particles in the air scattering light in unpredictable ways. Polarised 3D could exploit information contained in interfering waves of light to handle scattering.

Yoav Schechner, associate professor of electrical engineering, comments on the research: "The work fuses two 3-D sensing principles, each having pros and cons. One principle provides the range for each scene pixel – the state of the art for most 3-D imaging systems. The second principle does not provide range. On the other hand, it derives the object slope, locally. In other words, per scene pixel, it tells how flat or oblique the object is."

"The work uses each principle to solve problems associated with the other principle," Schechner explains. "Because this approach practically overcomes ambiguities in polarisation-based shape sensing, it can lead to wider adoption of polarisation in the toolkit of machine-vision engineers."

























