6th January 2017

IBM predicts five innovations for the next five years

IBM has unveiled its annual "5 in 5" – a list of ground-breaking innovations that will change the way people work, live, and interact during the next five years.

 

 

In 1609, Galileo built his first telescope and saw our cosmos in an entirely new way. His observations supported the theory that the Earth and other planets in our Solar System revolve around the Sun, something that until then had been impossible to observe. IBM Research continues this work through the pursuit of new scientific instruments – whether physical devices or advanced software tools – designed to make what's invisible in our world visible, from the macroscopic level down to the nanoscale.

"The scientific community has a wonderful tradition of creating instruments to help us see the world in entirely new ways. For example, the microscope helped us see objects too small for the naked eye, and the thermometer helped us understand the temperature of the Earth and human body," said Dario Gil, vice president of science & solutions at IBM Research. "With advances in artificial intelligence and nanotechnology, we aim to invent a new generation of scientific instruments that will make the complex invisible systems in our world today visible over the next five years."

Innovation in this area could dramatically improve farming, enhance energy efficiency, spot harmful pollution before it's too late, and prevent premature physical and mental decline. IBM's global team of scientists and researchers is steadily bringing these inventions from laboratories into the real world.

The IBM 5 in 5 is based on market and societal trends, as well as emerging technologies from research labs around the world that can make these transformations possible. Below are the five scientific instruments that will make the invisible visible in the next five years.

 


 

With AI, our words will open a window into our mental health

In five years, what we say and write will be used as indicators of our mental health and physical well-being. Patterns in our speech and writing – including meaning, syntax and intonation – analysed by new cognitive systems will provide tell-tale signs of early-stage developmental disorders, mental illness and degenerative neurological diseases, helping doctors and patients better predict, monitor and track these conditions. What were once invisible signs will become clear signals of a patient's likelihood of entering a certain mental state, or of how well their treatment plan is working, complementing regular clinical visits with daily assessments from the comfort of home.
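
As a rough illustration of the kind of linguistic features such systems might track, the short Python sketch below computes sentence length, lexical variety and first-person pronoun use from a text sample. The feature set is a hypothetical example for illustration, not IBM's method.

```python
import re
from collections import Counter

def speech_features(text: str) -> dict:
    """Compute simple text features that clinical-language research often
    examines: sentence length, lexical variety and pronoun use."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    first_person = sum(counts[w] for w in ("i", "me", "my", "mine"))
    return {
        "mean_sentence_length": len(words) / max(len(sentences), 1),
        "type_token_ratio": len(counts) / max(len(words), 1),   # lexical variety
        "first_person_ratio": first_person / max(len(words), 1),
    }

sample = "I went to the park today. I felt tired. I stayed home after that."
print(speech_features(sample))
```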

 

Credit: IBM

 

 

 

Hyperimaging and AI will give us superhero vision

In five years, new imaging devices using hyperimaging technology and AI will help us "see" beyond visible light by combining multiple bands of the electromagnetic spectrum. This will reveal valuable insights or potential dangers that would otherwise be unknown or hidden from view. Most importantly, these devices will be portable, affordable and widely accessible in our daily lives, giving us the ability to perceive objects, or see through opaque environmental conditions, anytime, anywhere.

A view of invisible or barely visible objects around us could help make road and traffic conditions clearer for drivers and self-driving cars. For example, by using millimetre wave imaging, a camera and other electromagnetic sensors, hyperimaging technology could help a vehicle see through fog or rain, detect hazardous and hard-to-see road conditions such as black ice, or tell us if there is an object up ahead – as well as its distance and size. Cognitive computing technologies will reason over this data and recognise what might be a tipped-over garbage can versus a deer crossing the road, or a pothole that could result in a flat tire.
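
One way to picture the band-combining step: the sketch below blends a visible-light intensity map with a millimetre-wave map so that an obstacle hidden by fog still shows up in the fused view. It assumes co-registered arrays and an arbitrary blending weight; it is not IBM's hyperimaging algorithm.

```python
import numpy as np

def fuse_bands(visible: np.ndarray, mmwave: np.ndarray, alpha: float = 0.6) -> np.ndarray:
    """Blend two co-registered, normalised intensity maps so that features
    visible in either band show up in the combined view."""
    v = (visible - visible.min()) / (np.ptp(visible) + 1e-9)
    m = (mmwave - mmwave.min()) / (np.ptp(mmwave) + 1e-9)
    return alpha * v + (1 - alpha) * m

# Fog hides an obstacle from the camera, but the mmwave sensor still sees it.
visible = np.full((4, 4), 0.2)                  # uniform grey fog
mmwave = np.zeros((4, 4))
mmwave[2, 2] = 1.0                              # strong return from an object
print(fuse_bands(visible, mmwave).round(2))
```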

 

Credit: Lenovo

 

 

 

Macroscopes will help us understand Earth's complexity in infinite detail

Instrumenting and collecting masses of data from every source in the physical world, big and small, and bringing it together will reveal comprehensive solutions for our food, water and energy needs. Today, the physical world only gives us a glimpse into our highly interconnected and complex ecosystem. We collect exabytes of data – but most of it is unorganised. In fact, an estimated 80 percent of a data scientist's time is spent scrubbing data instead of analysing and understanding what that data is trying to tell us.
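
To make the scrubbing problem concrete, the sketch below shows the kind of routine clean-up that consumes so much analyst time: dropping duplicate and incomplete sensor readings before any real analysis can begin. The column names and data are illustrative.

```python
import pandas as pd

# Raw readings as they might arrive from field sensors: duplicates and gaps.
raw = pd.DataFrame({
    "sensor_id": ["a1", "a1", "b2", "b2", "c3"],
    "timestamp": ["2017-01-06 10:00", "2017-01-06 10:00",
                  "2017-01-06 10:00", "2017-01-06 10:05", None],
    "reading":   [21.4, 21.4, None, 19.8, 22.1],
})

clean = (raw
         .drop_duplicates()                        # remove repeated transmissions
         .dropna(subset=["timestamp", "reading"])  # discard incomplete rows
         .assign(timestamp=lambda df: pd.to_datetime(df["timestamp"])))
print(clean)
```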

Thanks to the Internet of Things (IoT), new sources of data are pouring in from millions of connected objects – from refrigerators, light bulbs and heart rate monitors, to remote sensors such as drones, cameras, weather stations, satellites and telescope arrays. There are already more than six billion connected devices generating tens of exabytes of data per month, with a growth rate of over 30% each year. After successfully digitising information, business transactions and social interactions, we are now in the process of digitising the physical world.

By 2022, we will use machine learning algorithms and software to organise the information about the physical world, bringing the vast and complex data gathered by billions of devices within the range of our vision and understanding. IBM calls this idea a "macroscope" – but unlike microscopes to see the very small, or telescopes that can see far away, this will be a system to gather all of Earth's complex data together to analyse it for meaning.

By aggregating, organising and analysing data on climate, soil conditions, water levels and their relationship to irrigation practices, for example, a new generation of farmers will have insights that help them determine the right crop choices, where to plant them and how to produce optimal yields while conserving precious water supplies.
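
A toy version of that aggregation step might look like the sketch below, which joins soil-moisture readings and rainfall forecasts by field and applies a simple irrigation rule. The field names, units and the 0.20 moisture threshold are hypothetical assumptions for illustration.

```python
import pandas as pd

soil = pd.DataFrame({
    "field": ["north", "south"],
    "soil_moisture": [0.18, 0.31],          # volumetric water content
})
weather = pd.DataFrame({
    "field": ["north", "south"],
    "rain_next_24h_mm": [0.0, 12.0],        # forecast rainfall
})

merged = soil.merge(weather, on="field")
# Simple rule: irrigate dry fields that are not about to receive rain.
merged["irrigate"] = (merged["soil_moisture"] < 0.20) & (merged["rain_next_24h_mm"] < 1.0)
print(merged)
```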

 


 

 

 

Medical labs "on a chip" will serve as health detectives for tracing disease at the nanoscale

In five years, new medical labs on a chip will serve as nanotechnology health detectives – tracing invisible clues in our bodily fluids and letting us know immediately if we have reason to see a doctor. The goal is to shrink down to a single silicon chip all of the processes that would normally be carried out in a full-scale biochemistry lab to analyse a disease.

Lab-on-a-chip technology will eventually be packaged in a handheld device. This will allow people to quickly and regularly measure the presence of biomarkers found in small amounts of bodily fluids – such as saliva, tears, blood and sweat – sending this information securely into the cloud from the comfort of their home. There it will be combined with real-time health data from other IoT-enabled devices, like sleep monitors and smart watches, and analysed by AI systems for insights. Taken together, this data will give an in-depth view of our health, alerting us to the first signs of trouble – helping to stop disease before it progresses.
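
One way the cloud-side analysis could flag "the first signs of trouble" is by comparing each new reading against the user's own baseline, as in the sketch below. The biomarker, its units and the three-sigma rule are illustrative assumptions, not IBM's diagnostic criteria.

```python
from statistics import mean, stdev

def flag_anomaly(history: list, today: float, k: float = 3.0) -> bool:
    """Flag today's reading if it deviates more than k standard deviations
    from the user's own historical baseline."""
    if len(history) < 5:
        return False                         # not enough baseline data yet
    mu, sigma = mean(history), stdev(history)
    return sigma > 0 and abs(today - mu) > k * sigma

baseline = [2.1, 2.0, 2.3, 2.2, 1.9, 2.1]    # hypothetical marker, ng/mL
print(flag_anomaly(baseline, today=4.8))     # True: worth a doctor's visit
```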

IBM scientists are developing nanotechnology that can separate and isolate bioparticles down to 20 nanometres in diameter, a scale that gives access to DNA, viruses, and exosomes. These particles could be analysed to potentially reveal the presence of disease even before we have symptoms.

 

Medical lab on a chip. Credit: IBM

 

 

 

Smart sensors will detect environmental pollution at the speed of light

In five years, new sensing technologies deployed near natural gas extraction wells, around storage facilities, and along distribution pipelines will enable the industry to pinpoint invisible leaks in real-time. Networks of IoT sensors wirelessly connected to the cloud will provide continuous monitoring of natural gas infrastructure, allowing leaks to be found in a matter of minutes instead of weeks, reducing pollution and waste and the likelihood of catastrophic events.

IBM is researching silicon photonics – an emerging technology that transfers data by light, allowing computing literally at the speed of light. These chips could be embedded in a network of sensors on the ground or within infrastructure, or even fly aboard autonomous drones, generating insights that, when combined with real-time wind data, satellite data and other historical sources, will produce complex environmental models to detect the origin and quantity of pollutants as they occur.
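
As a simplified illustration of combining sensor readings with wind data, the sketch below picks out the sensor with the strongest methane reading and projects a crude source estimate upwind of it. The sensor layout, background level and single-step back-projection are assumptions for illustration, not IBM's environmental model.

```python
import math

def estimate_source(sensors, wind_from_deg, step_m=50.0):
    """Find the sensor with the highest reading above background and place a
    crude source estimate one step upwind of it (toward the wind's origin)."""
    background_ppm = 2.0                         # hypothetical ambient methane level
    hot = max(sensors, key=lambda s: s["ppm"])
    if hot["ppm"] <= background_ppm:
        return None                              # nothing above background: no leak
    rad = math.radians(wind_from_deg)            # bearing the wind blows from
    return (hot["x"] + step_m * math.sin(rad),   # x = metres east
            hot["y"] + step_m * math.cos(rad))   # y = metres north

sensors = [{"x": 0,   "y": 0, "ppm": 2.1},
           {"x": 100, "y": 0, "ppm": 9.7},       # elevated reading near a pipeline
           {"x": 200, "y": 0, "ppm": 2.4}]
print(estimate_source(sensors, wind_from_deg=270))   # wind from the west
```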

 

Credit: IBM
