Intel Offers Peek At Future Of Mobile Devices

Context-aware devices, which use sensors to gather information about the user and their physical environment, are poised to fundamentally change the mobile computing experience, Intel CTO Justin Rattner said in a Wednesday keynote speech at IDF 2010.

In the future, developers will build applications that take into account sensor data as well as behavior and circumstances that are specific to the user, he said.

"These devices will constantly learn about you, about your day, about your friends, and they'll probably know how you're feeling," Rattner said. "They'll know where you are and more importantly, where you're going."

Rattner demonstrated a working prototype built by Intel researchers and developers, in which the device takes on the role of a "Personal Vacation Assistant." He brought on stage Tim Jarrell, vice president and publisher of Fodor's Travel, to show how this functionality can help travelers locate nearby restaurants, museums, and local tourist attractions.

But context awareness goes beyond just location: It can also take into account the device user's favorite foods, how much they want to spend, and what traffic they'll encounter along the way to their destination.

"These are choices tailored for you, and that makes it, from our standpoint, much more workable," Rattner said. The Personal Vacation Assistant is constantly collecting information, saving it to the Web, and even creating automatic blog entries with text and images, detailing where and when the user went, what they did, and for how long, Rattner said.

Of course, this sort of technology doesn't just happen. Intel says twenty years of research have gone into developing context-aware devices, based on a vision that emerged in 1991. However, Rattner acknowledged that researchers are often overly optimistic about the prospect of turning such visions into consumer products.

Even now, the exact timeframe for when context-aware features will appear in Intel products is unclear. Rattner said this will happen in "the not-too-distant future," but he also acknowledged that it may take some time for a viable market to develop around the technology.

"Now with mobile devices, all the infrastructure is there," Rattner said. "The glue, if you will, is now in place."

The infrastructural support for context-aware technology comes from data centers, which provide a kind of "sense-making pipeline" that combines data from physical sensors with user behavior data to improve the accuracy of contextual information, Rattner said.
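Rattner didn't detail the pipeline's internals, but the core idea of blending physical-sensor evidence with behavioral history can be sketched as a simple weighted inference. Everything below (the function name, the 60/40 weighting, and the context labels) is hypothetical, for illustration only:

```python
from collections import Counter

def infer_context(sensor_guess, behavior_history, sensor_weight=0.6):
    """Toy 'sense-making' step: blend a sensor-based estimate of the
    user's current context with contexts observed in past behavior.

    sensor_guess: dict mapping context label -> sensor confidence
    behavior_history: list of context labels seen in similar situations
    Returns the highest-scoring context label.
    """
    # Turn the behavioral history into a probability distribution
    counts = Counter(behavior_history)
    total = sum(counts.values()) or 1

    scores = {}
    for label in set(sensor_guess) | set(counts):
        s = sensor_guess.get(label, 0.0)   # physical-sensor evidence
        b = counts.get(label, 0) / total   # behavioral evidence
        scores[label] = sensor_weight * s + (1 - sensor_weight) * b
    return max(scores, key=scores.get)

# GPS alone slightly favors "cafe", but at this time of day the user
# has almost always been commuting, so the blend corrects the guess.
context = infer_context(
    sensor_guess={"cafe": 0.55, "commuting": 0.45},
    behavior_history=["commuting"] * 9 + ["cafe"],
)
```

A real pipeline would of course run far richer models server-side, but the design point survives even in this sketch: neither sensors nor history alone is as accurate as the two combined.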

Eventually, Intel claims, users will be able to draw on a whole collection of inference algorithms, without being specialists in those algorithms, and aggregate their results over time. Intel also says users will be able to protect their information and control what context gets shared, with whom, and for how long.

Intel, ever concerned with energy efficiency, says users will be able to run these servers on minimal power. All remaining concerns aside, these are grand implications for a platform supporting devices that must always be able to sense what goes on around them, with audio sensors, camera sensors, even sensors for emotion.

Emotional relationships, which Rattner defined as understanding what people care about and what motivates them, are critical to context-aware devices. Rattner closed the keynote with a video clip of Intel's work with Carnegie Mellon, which uses brain imaging to correlate brain activity with specific thoughts, seeking a more direct understanding of human thought.

Twenty years later, it's more than just a vision.