Researchers at Brown University report the transmission of data through a terahertz multiplexer at 50 gigabits per second, which could lead to a new generation of ultra-fast Wi-Fi.
Credit: Mittleman lab / Brown University
Multiplexing, the ability to send multiple signals through a single channel, is a fundamental feature of any voice or data communication system. A team of researchers has now demonstrated a method for multiplexing data carried on terahertz waves – high-frequency radiation that could enable the next generation of ultra-high bandwidth wireless networks.
Writing in the journal Nature Communications, they describe the transmission of two real-time video signals through a terahertz multiplexer at an aggregate data rate of 50 gigabits per second – roughly 100 times faster than the peak data rate of today's fastest cellular networks.
"We showed that we can transmit separate data streams on terahertz waves at very high speeds and with very low error rates," said Daniel Mittleman, professor in Brown University's School of Engineering and the paper's corresponding author. "This is the first time anybody has characterised a terahertz multiplexing system using actual data, and our results show that our approach could be viable in future terahertz wireless networks."
Current voice and data networks use microwaves to carry signals wirelessly. But as with most forms of information technology, demand for data transmission is growing exponentially and is quickly becoming more than microwave networks can handle. Terahertz waves have higher frequencies than microwaves and therefore a much larger capacity to carry data. However, scientists have only just begun experimenting with terahertz frequencies, and many of the basic components needed for terahertz communication don't exist yet.
A system for multiplexing and demultiplexing (also known as mux/demux) is one of those basic components. It's a technology that allows one cable to carry multiple TV channels, or hundreds of users to access a Wi-Fi network.
The mux/demux approach Mittleman and his colleagues developed uses two metal plates placed parallel to each other to form a waveguide, as shown in the illustration below. One plate has a slit cut into it. When a terahertz wave travels through the waveguide, some of the radiation leaks out of the slit. The angle at which radiation beams escape is dependent upon the frequency of the wave.
"We can put several waves at several different frequencies – each of them carrying a data stream – into the waveguide, and they won't interfere with each other because they're different frequencies; that's multiplexing," Mittleman said. "Each of those frequencies leaks out of the slit at a different angle, separating the data streams; that's demultiplexing."
Due to the nature of terahertz waves, signals in terahertz communications networks will propagate as directional beams, not omnidirectional broadcasts like in existing wireless systems. This directional relationship between propagation angle and frequency is key to enabling mux/demux in terahertz systems. A user at a particular location (and therefore at a particular angle from the multiplexing system) will communicate on a particular frequency.
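The frequency-to-angle mapping described above can be sketched numerically. The snippet below is a minimal illustration, assuming the standard leaky-wave relation for the TE1 mode of a parallel-plate waveguide with a slotted plate – sin θ = f_c / f, where f_c = c / (2b) is the cutoff frequency – and a hypothetical plate separation b of 1 mm; the actual device geometry in the paper may differ.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def emission_angle_deg(freq_hz, plate_sep_m):
    """Leaky-wave emission angle (degrees from the waveguide axis)
    for the TE1 mode of a parallel-plate waveguide with one slotted
    plate: sin(theta) = f_c / f, where f_c = c / (2b) is the cutoff."""
    f_cutoff = C / (2 * plate_sep_m)
    if freq_hz <= f_cutoff:
        raise ValueError("frequency is below the TE1 cutoff; no propagation")
    return math.degrees(math.asin(f_cutoff / freq_hz))

# Hypothetical plate separation of 1 mm gives a cutoff near 0.15 THz.
b = 1e-3
for f_thz in (0.2, 0.3, 0.4):
    theta = emission_angle_deg(f_thz * 1e12, b)
    print(f"{f_thz:.1f} THz -> {theta:.1f} degrees")
```

Each frequency emerges at its own angle – higher frequencies bend closer to the waveguide axis – which is why a receiver at a fixed angle picks out a single demultiplexed channel.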
Credit: Mittleman lab / Brown University
In 2015, Mittleman's team first published a paper describing their waveguide concept. For that initial work, they used a broadband terahertz light source to confirm that different frequencies did indeed emerge from the device at different angles. While that was an effective proof of concept, this latest work took the critical step of testing the device with real data.
The team encoded two high-definition television broadcasts onto separate terahertz frequencies and beamed them together through the multiplexer system. Their experiments showed that transmissions were error-free at up to 10 gigabits per second, much faster than today's standard Wi-Fi speeds. Error rates increased somewhat when the rate was boosted to 50 gigabits per second (25 gigabits per channel), but remained well within the range that can be corrected using forward error correction, a technique commonly used in today's communications networks.
The researchers plan to continue developing this and other terahertz components. Mittleman recently received a license from the FCC to perform outdoor tests at terahertz frequencies on the Brown University campus.
"We think that we have the highest-frequency license currently issued by the FCC, and we hope it's a sign that the agency is starting to think seriously about terahertz communication," he said. "Companies are going to be reluctant to develop terahertz technologies until there's a serious effort by regulators to allocate frequency bands for specific uses, so this is a step in the right direction."
Physicists at CERN's Large Hadron Collider (LHC) report the detection of Xi-cc++ (pronounced ka-sigh-see-see-plus-plus), a new particle containing two charm quarks and one up quark.
At the EPS Conference on High Energy Physics in Venice, the LHCb experiment at CERN's Large Hadron Collider has reported the observation of Ξcc++ (Xicc++), a new particle containing two charm quarks and one up quark. The existence of this baryon was predicted by current theories, but physicists had been searching for baryons with two heavy quarks for many years without success. The mass of the newly identified particle is about 3,621 MeV – almost four times that of the most familiar baryon, the proton – a property that arises from its doubly charmed quark content. It is the first time such a particle has been unambiguously detected.
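The "almost four times heavier" comparison is simple arithmetic on the quoted masses. A quick check, using the standard proton mass of about 938.3 MeV/c²:

```python
XICC_MASS_MEV = 3621.0   # reported mass of the Xi-cc++ baryon, MeV/c^2
PROTON_MASS_MEV = 938.3  # proton mass, MeV/c^2

ratio = XICC_MASS_MEV / PROTON_MASS_MEV
print(f"Xi-cc++ is {ratio:.2f} times the proton mass")  # ~3.86
```

The two charm quarks, each far heavier than the up and down quarks in a proton, account for most of that extra mass.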
Nearly all the matter that we see around us is made of baryons, which are common particles composed of three quarks – the best-known being protons and neutrons. But there are six types of quark, and in theory many different combinations could form other kinds of baryons. All baryons observed so far contain, at most, one heavy quark.
"Finding a doubly heavy-quark baryon is of great interest as it will provide a unique tool to further probe quantum chromodynamics, the theory that describes the strong interaction, one of the four fundamental forces," said Giovanni Passaleva, spokesperson for the LHCb experiment. "Such particles will thus help us improve the predictive power of our theories."
"In contrast to other baryons, in which the three quarks perform an elaborate dance around each other, a doubly heavy baryon is expected to act like a planetary system, where the two heavy quarks play the role of heavy stars orbiting one around the other, with the lighter quark orbiting around this binary system," added Guy Wilkinson, former spokesperson of the collaboration.
Measuring the properties of the Ξcc++ will help to establish how a system of two heavy quarks and a light quark behaves. Important insights can be obtained by precisely measuring production and decay mechanisms, and the lifetime of the particle. The observation of this new baryon proved to be challenging and was made possible owing to the high production rate of heavy quarks at the LHC and to the unique capabilities of the LHCb experiment, which can identify the decay products with excellent efficiency. A paper on this finding has been published online.
Previous discoveries of the Large Hadron Collider have included pentaquarks and, of course, the Higgs boson. In 2015, the machine underwent a major upgrade that boosted its collision energy from 8 to 13 trillion electron volts (TeV). It will be given yet another boost in 2026, when it becomes the High Luminosity Large Hadron Collider (HL-LHC), increasing its luminosity by a factor of ten. An even larger successor, known as the Very Large Hadron Collider (VLHC), is expected to be operational from 2035 to 2075.
The European Space Agency (ESA) has confirmed the Laser Interferometer Space Antenna (LISA) as the third large-class mission in its future science programme, with launch planned for 2034.
A trio of satellites to detect gravitational waves has been selected as the third large-class (L3) mission in ESA's Science programme. In terms of the area it spans, the Laser Interferometer Space Antenna (LISA) will be the largest man-made structure ever put into space – each "side" of its triangle stretching across millions of kilometres – forming a giant observatory to probe the dark side of the Universe.
In 2013, the "gravitational universe" was chosen as the theme for a future ESA mission. This would be designed to search for ripples in the fabric of space-time created by celestial objects with extremely strong gravity, such as pairs of merging black holes.
Gravitational waves were predicted a century ago by Albert Einstein's general theory of relativity, but remained elusive until very recently, when the first direct detection was made by the ground-based Laser Interferometer Gravitational-Wave Observatory (LIGO). That signal, announced in February 2016, was triggered by the merging of two black holes some 1.3 billion light-years away.
Since then, two more events have been detected, and a precursor mission, LISA Pathfinder, has demonstrated that such observations can be made from space – not just with ground-based instruments. Pathfinder will conclude on 30th June, after sixteen months of science operations that tested key technologies needed for the more advanced LISA in the 2030s.
Space-based operations will provide major advantages:
• no need to create an artificial vacuum, since the vacuum of space is free and better than anything that can be simulated in a lab;
• no interference from seismic noise – from earthquakes and plate tectonics to passing vehicles and other human activity;
• no limitations in size for the observatory arms, which would otherwise be restricted by the curvature of the Earth.
To detect and measure gravitational waves from distant astronomical sources, a phenomenal level of sensitivity is required. Using laser interferometry over its trio of 2.5 million kilometre arms, LISA will track relative displacements with a resolution of 20 picometres – 1/50 billionth of a metre – less than the width of a helium atom. It will look for ripples in space-time with periods ranging from a few minutes to a few hours. Several thousand objects are expected to be resolved within the first year of operation.
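The sensitivity figures above translate into a dimensionless "strain" – the fractional change in arm length that LISA can resolve. A back-of-the-envelope calculation from the numbers quoted in this article:

```python
ARM_LENGTH_M = 2.5e9       # each arm: 2.5 million km, in metres
DISPLACEMENT_M = 20e-12    # displacement resolution: 20 picometres

# Strain is the relative displacement: how much the arm stretches
# or shrinks as a fraction of its total length.
strain = DISPLACEMENT_M / ARM_LENGTH_M
print(f"strain sensitivity ~ {strain:.0e}")  # ~8e-21
```

A strain of order 10⁻²¹ is comparable to measuring the distance to the nearest star while noticing a change smaller than the width of a human hair – which is why the millions-of-kilometres arms matter: the longer the baseline, the larger the absolute displacement a given gravitational wave produces.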
In addition to studying black holes and compact binaries, LISA will probe the expansion of the universe and the gravitational wave background created during the early universe. It will also look for currently unknown (and unmodelled) sources of gravitational waves. History in astrophysics has shown that whenever a new frequency range/medium of detection is available, new and unexpected sources show up. This may, for example, include kinks and cusps in cosmic strings.
Following its selection, LISA will now enter a more detailed phase of design and costing, before construction begins. Its launch is expected during 2034 and the mission lifetime is four years – but the spacecraft will have enough power and orbital stability to potentially last until 2044.
Chinese scientists report the transmission of entangled photons between a satellite in low-Earth orbit and ground stations on Earth, using the satellite Micius. More satellites could follow in the near future, with plans for a European–Asian quantum-encrypted network by 2020, and a global network by 2030.
In a landmark study, Chinese scientists report the successful transmission of entangled photons between the Micius satellite, in low-Earth orbit, and stations on the ground. Furthermore, whereas the previous record for entanglement distance was 100 km (62 miles), here transmission over more than 1,200 km (746 miles) was achieved.
The distribution of quantum entanglement, especially across vast distances, holds major implications for quantum teleportation and encryption networks. Yet, efforts to entangle quantum particles – essentially "linking" them together over long distances – have been limited to 100 km or less, mainly because the entanglement is lost as they are transmitted along optical fibres, or through open space on land.
One way to overcome this issue is to break the line of transmission into smaller segments and repeatedly swap, purify and store quantum information along the optical fibre. Another approach to achieving global quantum networks is to use lasers and satellite technology. Using a Chinese satellite called Micius, launched last year and equipped with specialised quantum tools, Juan Yin et al. demonstrated the latter approach. The Micius satellite was used to communicate with three ground stations across China, separated by up to 1,200 km.
The separation between the orbiting satellite and these ground stations varied from 500 to 2,000 km. On board, a laser beam was passed through a special crystal, splitting individual photons into pairs of entangled photons with correlated polarisation states. One photon from each entangled pair was beamed down to one ground station, while its partner was sent to another. In this way, entangled photons were received at the separate ground stations.
"It's a huge, major achievement," Thomas Jennewein, physicist at the University of Waterloo in Canada, told Science. "They started with this bold idea and managed to do it."
"The Chinese experiment is quite a remarkable technological achievement," said Artur Ekert, a professor of quantum physics at the University of Oxford, in an interview with Live Science. "When I proposed the entangled-based quantum key distribution back in 1991 when I was a student in Oxford, I did not expect it to be elevated to such heights."
One of the many challenges faced by the team was keeping the beams of photons focused precisely on the ground stations as the satellite hurtled through space at nearly 8 kilometres per second.
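To get a feel for that pointing challenge, consider the angular rate a ground telescope must track at closest approach. This is a rough estimate only, assuming an overhead pass at the 500 km minimum separation quoted in this article and using the small-angle approximation:

```python
import math

SAT_SPEED_MS = 8_000.0    # ~8 km/s orbital speed (from the article)
ALTITUDE_M = 500_000.0    # closest approach, ~500 km (from the article)

# Peak angular rate seen from a ground station as the satellite
# passes directly overhead: omega = v / d (small-angle approximation).
omega_deg_s = math.degrees(SAT_SPEED_MS / ALTITUDE_M)
print(f"peak tracking rate ~ {omega_deg_s:.2f} deg/s")
```

Sweeping at nearly a degree per second while holding a narrow optical beam on a distant telescope aperture – and doing so simultaneously for two ground stations – gives a sense of the precision the tracking system had to achieve.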
Quantum encryption, if successfully developed, could revolutionise communications. Information sent via this method would, in theory, be absolutely secure and practically impossible for hackers to intercept. If two people shared an encrypted quantum message, a third person would be unable to access it without changing the information in an unpredictable way. Further satellite tests are planned by China in the near future, with potential for a European–Asian quantum-encrypted network by 2020, and a global network by 2030.
A British company, Tokamak Energy, has achieved first plasma in the ST40, its latest prototype design for a fusion reactor. The machine is planned to reach 100 million degrees C in 2018, the temperature required for fusion.
Credit: Tokamak Energy
A new prototype fusion reactor has been turned on for the first time, officially achieving first plasma. The reactor aims to produce a plasma temperature of 100 million degrees C – a record for a privately funded venture. This is seven times hotter than the centre of the Sun and the temperature necessary for controlled fusion.
The tokamak reactor, called the 'ST40', was built by Tokamak Energy, one of the world's leading private fusion energy ventures. The Oxfordshire-based company grew out of the Culham Centre for Fusion Energy and was established in 2009 to design and develop small fusion reactors. Tokamak Energy's aim is to put fusion power into the grid by 2030.
With the ST40 up and running, the next steps are to complete the commissioning and installation of the full set of magnetic coils, which are crucial to reaching the temperatures required for fusion. This will allow the ST40 to produce a plasma temperature of 15 million degrees C – as hot as the Sun's core – in autumn 2017.
Credit: NASA's Goddard Space Flight Center
Following the 15 million degree milestone, the next goal is for the ST40 to produce plasma temperatures of 100 million degrees C in 2018 – a temperature never before achieved in a privately owned and funded fusion reactor. 100 million degrees is an important threshold: only at or above this temperature can charged particles, which naturally repel each other, be forced together to induce a controlled fusion reaction. Reaching it would also help prove the vital point that commercially viable fusion can be produced in compact spherical tokamaks.
Tokamak Energy's journey to generating fusion energy is moving at a rapid pace; the company has already reached the half-way point of its long-term plan to deliver fusion power. It is focused on a smaller reactor design – the compact, spherical tokamak – which enables quicker development of devices, speeding up progress towards its ultimate targets: producing first electricity by 2025 and commercially viable fusion power by 2030. Tokamak Energy's research also suggests that this route to fusion power can be much faster than the development of conventional large-scale devices.
Credit: Tokamak Energy
Dr David Kingham, CEO of Tokamak Energy, commented: "Today is an important day for fusion energy development in the UK, and the world. We are unveiling the first world-class controlled fusion device to have been designed, built and operated by a private venture. The ST40 is a machine that will show fusion temperatures – 100 million degrees – are possible in compact, cost-effective reactors. This will allow fusion power to be achieved in years, not decades."
"We will still need significant investment, many academic and industrial collaborations, dedicated and creative engineers and scientists, and an excellent supply chain. Our approach continues to be to break the journey down into a series of engineering challenges, raising additional investment on reaching each new milestone. We are already half-way to the goal of fusion energy; with hard work we will deliver fusion power at commercial scale by 2030."
The European X-ray Free Electron Laser (XFEL) in Germany has produced its first beams of x-rays.
Credit: European XFEL / Heiner Müller-Elsner
In Hamburg, Germany, the European XFEL – the biggest X-ray laser in the world – has reached its last major milestone before the official opening in September. The 3.4 km long facility, most of which is located in underground tunnels, has generated its first X-ray laser light. This has a wavelength of just 0.8 nanometres (nm) – about 500 times shorter than that of visible light. At first lasing, the repetition rate was one pulse per second; this will later be increased to 27,000 pulses per second, compared with the previous record of 120.
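The "500 times shorter" comparison can be checked directly, and the same wavelength fixes the photon energy via the standard relation E = hc/λ. A quick sketch, taking 400 nm (violet light, the short end of the visible spectrum) as the reference – an assumption, since the article does not say which visible wavelength it compares against:

```python
VISIBLE_NM = 400.0   # shortest visible wavelength (violet), assumed reference
XFEL_NM = 0.8        # first-lasing wavelength of the European XFEL
HC_EV_NM = 1239.84   # Planck constant x speed of light, in eV*nm

ratio = VISIBLE_NM / XFEL_NM
photon_energy_ev = HC_EV_NM / XFEL_NM
print(f"~{ratio:.0f} times shorter than violet light")
print(f"photon energy ~ {photon_energy_ev:.0f} eV")
```

A photon energy of around 1.5 keV sits firmly in the soft X-ray regime, which is what makes atomic-scale imaging possible.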
The beams of the XFEL are extremely intense and a billion times brighter than conventional synchrotron light sources. The achievable light wavelength corresponds to the size of an atom, meaning that the X-rays can be used to make pictures and films of the "nanocosmos" at atomic-scale resolution – for example, of biomolecules, leading to a better understanding of illnesses. Other opportunities include research into chemical processes and new catalytic techniques, with the goal of improving their efficiency or making them more environmentally friendly; materials research; or the investigation of conditions similar to the interior of planets.
First laser light at the European XFEL, recorded by an X-ray detector at the end of the tunnel. Credit: DESY
Helmut Dosch, Chairman of the DESY Directorate, said: "The European X-ray laser has been brought to life! The first laser light produced today with the most advanced and most powerful linear accelerator in the world marks the beginning of a new era of research in Europe. This worldwide unique high-tech facility was built in record time and within budget. This is an amazing success of science. I congratulate all those involved in the research, development, and construction of this facility with passion and commitment: the employees of DESY, European XFEL, and international partners. They have achieved outstanding results and demonstrated impressively what is possible in international cooperation. The European XFEL will provide us with the most detailed images of the molecular structure of new materials and drugs and novel live recordings of biochemical reactions."
The power and speed of the XFEL will enable scientists to investigate samples available only in tiny quantities and to perform their experiments more quickly. The facility will therefore increase the amount of "beamtime" available worldwide, as demand has long outstripped capacity at other X-ray lasers, which are often overbooked.
The X-ray laser is due to open officially at the start of September. At that point, external users will be able to perform experiments at the first two of the eventual six scientific instruments.