
Seventh annual IBM 5-in-5 released

There'll be 'digital tastebuds' and one pixel will be worth 1,000 words


January 1, 2013  



In mid-December, IBM Corp. unveiled the seventh annual “IBM 5 in 5” — a list of innovations that have the potential to change the way people work, live and interact during the next five years.

Touch: You will be able to touch through your phone

Sight: A pixel will be worth 1,000 words

Hearing: Computers will hear what matters

Taste: Digital taste buds will help you to eat smarter

Smell: Computers will have a sense of smell

The IBM 5-in-5 is based on market and societal trends as well as emerging technologies from IBM’s R&D labs around the world.

The 2012 selection, the company says, will be the underpinnings of the next era of computing, which it describes as the era of cognitive systems.

“This new generation of machines will learn, adapt, sense and begin to experience the world as it really is. This year’s predictions focus on one element of the new era, the ability of computers to mimic the human senses — in their own way, to see, smell, touch, taste and hear.

“These sensing capabilities will help us become more aware, productive and help us think — but not think for us. Cognitive computing systems will help us see through complexity, keep up with the speed of information, make more informed decisions, improve our health and standard of living, enrich our lives and break down all kinds of barriers — including geographic distance, language, cost and inaccessibility.”

Below the company explains how each advance will evolve:

Touch: You will be able to touch through your phone

Imagine using your smartphone to shop for your wedding dress and being able to feel the satin or silk of the gown, or the lace on the veil, all from the surface of the screen. Or picture feeling the beading and weave of a blanket made by a local artisan halfway around the world. In five years, industries such as retail will be transformed by the ability to “touch” a product through your mobile device.

IBM scientists are developing applications for retail, healthcare and other sectors that use haptic, infrared and pressure-sensitive technologies to simulate touch, recreating sensations such as the texture and weave of a fabric as a shopper brushes a finger over the image of the item on a device screen.
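As a rough illustration of the idea, the sketch below maps a hypothetical fabric description (its roughness and weave density) to a simple vibration waveform of the kind a haptic actuator might play back. The parameters and the mapping are assumptions made for illustration, not IBM's method.

    import numpy as np

    def texture_to_vibration(roughness, weave_density, duration_s=0.5, sample_rate=8000):
        # Hypothetical mapping: denser weave -> higher-frequency buzz,
        # rougher fabric -> stronger vibration.
        t = np.linspace(0, duration_s, int(duration_s * sample_rate), endpoint=False)
        frequency = 50 + 400 * weave_density
        amplitude = 0.2 + 0.8 * roughness
        return amplitude * np.sin(2 * np.pi * frequency * t)

    # e.g. coarse burlap versus smooth satin (the values are made up)
    burlap = texture_to_vibration(roughness=0.9, weave_density=0.3)
    satin = texture_to_vibration(roughness=0.1, weave_density=0.9)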

Sight: A pixel will be worth a thousand words

Computers today understand pictures only through the text we use to tag or title them; the majority of the information — the actual content of the image — is a mystery.

In the next five years, systems will not only be able to look at and recognize the contents of images and visual data, they will turn the pixels into meaning, beginning to make sense out of it similar to the way a human views and interprets a photograph. In the future, “brain-like” capabilities will let computers analyze features such as colour, texture patterns or edge information and extract insights from visual media. This will have a profound impact for industries such as healthcare, retail and agriculture.
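As a minimal sketch of the kind of low-level features mentioned above, the example below computes a colour histogram and a simple edge-strength map directly from pixel values, using only NumPy. The stand-in image and the feature choices are illustrative assumptions, not the systems IBM describes.

    import numpy as np

    def colour_histogram(image, bins=8):
        # Per-channel histogram summarising the colour content of an RGB image (0-255).
        return [np.histogram(image[..., c], bins=bins, range=(0, 255))[0] for c in range(3)]

    def edge_strength(image):
        # Gradient magnitude of the grayscale image: a crude edge/texture cue.
        gray = image.mean(axis=2)
        gy, gx = np.gradient(gray)
        return np.sqrt(gx ** 2 + gy ** 2)

    image = np.random.randint(0, 256, (64, 64, 3))  # stand-in for a real photo
    features = colour_histogram(image), float(edge_strength(image).mean())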

Within five years, these capabilities will be put to work in healthcare by making sense out of massive volumes of medical information such as MRIs, CT scans, X-rays and ultrasound images to capture information tailored to particular anatomy or pathologies.

Hearing: Computers will hear what matters

Ever wish you could make sense of the sounds all around you and be able to understand what’s not being said?

Within five years, a distributed system of clever sensors will detect elements of sound such as sound pressure, vibrations and sound waves at different frequencies. It will interpret these inputs to predict when trees will fall in a forest or when a landslide is imminent. Such a system will “listen” to our surroundings and measure movements, or the stress in a material, to warn us if danger lies ahead.

Sensors will detect raw sounds, and a system receiving this data will, much as the human brain does, take into account other “modalities,” such as visual or tactile information, to classify and interpret the sounds based on what it has learned. When new sounds are detected, the system will form conclusions based on previous knowledge and the ability to recognize patterns.
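A toy version of that last step might look like the example below, which compares the coarse frequency "fingerprint" of a new sound against a small set of previously heard examples. The reference sounds, labels and nearest-match rule are illustrative assumptions only.

    import numpy as np

    def fingerprint(signal, bands=8):
        # Coarse frequency fingerprint: energy in a few broad frequency bands.
        mag = np.abs(np.fft.rfft(signal))
        edges = np.linspace(0, len(mag), bands + 1, dtype=int)
        energy = np.array([mag[a:b].sum() for a, b in zip(edges[:-1], edges[1:])])
        return energy / (energy.sum() + 1e-12)

    def classify(new_signal, learned):
        # Label the new signal with the closest previously heard fingerprint.
        target = fingerprint(new_signal)
        return min(learned, key=lambda label: np.linalg.norm(fingerprint(learned[label]) - target))

    t = np.linspace(0, 1, 16000, endpoint=False)
    learned = {"low rumble": np.sin(2 * np.pi * 60 * t),    # e.g. ground movement
               "high crack": np.sin(2 * np.pi * 3000 * t)}  # e.g. a branch snapping
    print(classify(np.sin(2 * np.pi * 70 * t), learned))    # -> "low rumble"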

Taste: Digital taste buds will help you to eat smarter 

What if we could make healthy foods taste delicious using a different kind of computing system that is built for creativity?

IBM researchers are developing a computing system that actually experiences flavor, to be used alongside chefs to create the tastiest and most novel recipes. It will break down ingredients to their molecular level and blend the chemistry of food compounds with the psychology behind the flavors and smells humans prefer. By comparing this with millions of recipes, the system will be able to create new flavor combinations that pair, for example, roasted chestnuts with other foods such as cooked beetroot, fresh caviar, and dry-cured ham.
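The pairing idea can be sketched in a few lines: score ingredient pairs by the aroma compounds they share. The compound lists below are toy placeholders rather than real chemistry data, and the scoring rule is an assumption for illustration.

    from itertools import combinations

    compounds = {
        "roasted chestnut": {"furaneol", "pyrazine_a", "vanillin"},
        "cooked beetroot":  {"geosmin", "pyrazine_a", "furaneol"},
        "dry-cured ham":    {"pyrazine_a", "ester_b"},
        "fresh caviar":     {"trimethylamine", "ester_b"},
    }

    def pairing_score(a, b):
        # More shared compounds (relative to the smaller profile) -> higher score.
        shared = compounds[a] & compounds[b]
        return len(shared) / min(len(compounds[a]), len(compounds[b]))

    ranked = sorted(combinations(compounds, 2), key=lambda p: pairing_score(*p), reverse=True)
    for a, b in ranked[:3]:
        print(f"{a} + {b}: {pairing_score(a, b):.2f}")  # chestnut + beetroot ranks first here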

A system like this can also be used to help us eat healthier, creating novel flavor combinations that will make us crave a vegetable casserole instead of potato chips.

Smell: Computers will have a sense of smell

During the next five years, tiny sensors embedded in your computer or cell phone will detect if you’re coming down with a cold or other illness. By analyzing odors, biomarkers and thousands of molecules in someone’s breath, doctors will have help diagnosing and monitoring the onset of ailments such as liver and kidney disorders, asthma, diabetes and epilepsy by detecting which odors are normal and which are not.
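A toy version of that screening step appears below: compare sensor readings for a few breath compounds against a healthy baseline and flag anything outside the normal range. The compound names, baseline values and thresholds are all illustrative assumptions, not clinical figures.

    baseline = {"acetone": 1.0, "ammonia": 0.5, "nitric_oxide": 0.2}   # "healthy" reference levels
    tolerance = {"acetone": 0.5, "ammonia": 0.3, "nitric_oxide": 0.1}  # allowed day-to-day variation

    def screen(sample):
        # Return the compounds whose measured levels fall outside the normal range.
        return [name for name, level in sample.items()
                if abs(level - baseline[name]) > tolerance[name]]

    reading = {"acetone": 2.4, "ammonia": 0.6, "nitric_oxide": 0.2}
    flags = screen(reading)
    print(flags or "within normal range")   # elevated acetone would be flagged here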

Today, IBM scientists are already sensing environmental conditions and gases to preserve works of art. This innovation is beginning to be applied to tackle clinical hygiene, one of the biggest challenges in healthcare today.