Intel Zeroes In On Mobility, Sense Computing At IDF


The morning's media briefings focused on three themes: connected visual computing; sensing models that connect physical reality with accurate digital interpretation and the machine actions that reference that real-world data; and the increasingly small mobile devices that could one day deliver practical computing applications built on those developments to on-the-go users.

Intel researchers Jim Held, Andrew Chien and Mary Smiley gave presentations on those topics at Monday's morning session at a San Francisco hotel near the Moscone Center, where IDF officially kicks off Tuesday.

Chien's presentation, "Connecting the Physical and Digital Worlds: Sensing," was of particular interest. The director of Intel Research's Corporate Technology Group showcased a series of projects involving sensor technology and data processing. This research, according to Chien, spans stem cell analysis at the nanometer scale through to Everyday Sensing and Perception (ESP) projects to develop sensory technology at the "human scale" of feet and meters, and beyond, to "even higher-order levels like emotion."

Intel's research and broad collaboration with various universities in developing sense-based technology represents "perhaps the largest change happening in the interaction between the physical and computing worlds," Chien said.


Such work necessarily incorporates a great many individual areas of research at Santa Clara, Calif.-based Intel and in the wider technology community, such as machine learning, data-intensive computation, imaging/vision, sensing, novel applications and human/computer interfaces, he said.

For example, to develop a small mobile device capable of delivering practical, mainstream ESP applications that interpret and act on visual data from the user's immediate surroundings, researchers face challenges such as building small, "egocentric" video sensors capable of 95 percent accuracy in recognizing hundreds of objects, or dozens of human activities, over the course of hundreds of hours in widely varying physical locations and surroundings.

Intel and its academic collaborators have set an ambitious goal: large-scale datasets capable of delivering "90 percent accuracy for 90 percent of your day" in such real-time mobile applications. That goal is doable, Chien said, even though present real-time event detection technology requires computational performance on the order of 4 teraflops and consumes 10 kilowatts of power to get the job done.

That's because "the history of computing is on our side," Chien insisted.

"If you take it as an order of magnitude challenge, history says we can get half of that from better algorithms and half of that from better systems," he said.

Other R&D chalk talks planned over the course of IDF cover such topics as robotics, managing innovation and "navigating future moneyscapes."