At its annual Labs Day event, Intel announced a new research division, Interaction and Experience Research (IXR), focused on defining new user experiences and new computing platforms.
The IXR Lab is expected to produce innovations that help re-imagine how we will experience computing in the future.
Engagement and experience with technology will become much more
personal and social, with individual user contexts informed by sensors, augmented by cloud intelligence, and driven by more natural
interactions such as touch, gesture and voice.
"Better technology isn't enough these days," said Chief
Technology Officer Justin Rattner. "What the individual values today is a deeply personal, information
experience. When I look ahead, this is the biggest change in
computing I see coming. At Intel, we've been building up our
capabilities in the user experience and interaction areas for over a
decade. We've recently assembled an outstanding team of researchers
consisting of both user interface technologists and social
scientists to create the next generation of user experiences. We've
learned, for example, that the television experience isn't the same
thing as the Web experience, even though more and more TV will be
delivered via the Internet. Browsing the Web at 10 feet is an
experience few people relish, but television experienced via the
Internet is a huge step beyond broadcast."
Intel Labs already has a strong focus on the next generation of user
experience technologies. Current work around context and location
has yielded a range of insights and technological possibilities. For example, devices could understand their surroundings, communicate with each other, and change behavior or take actions based on the user's environment.
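To make the context-aware idea concrete, here is a minimal sketch, in Python, of the kind of rule such a device might apply; the Context fields, thresholds and notification modes are illustrative assumptions rather than details of Intel's research.

from dataclasses import dataclass

# Hypothetical context a device could assemble from its sensors.
@dataclass
class Context:
    location: str            # e.g. "home", "office", "car"
    ambient_noise_db: float  # ambient sound level
    user_in_meeting: bool    # pulled from a calendar, say

def choose_notification_mode(ctx: Context) -> str:
    """Decide how to alert the user given the current surroundings."""
    if ctx.user_in_meeting:
        return "silent"      # never interrupt a meeting
    if ctx.location == "car":
        return "voice"       # keep the interaction hands-free
    if ctx.ambient_noise_db > 70:
        return "vibrate"     # too loud for a ringtone to be heard
    return "ring"

print(choose_notification_mode(Context("car", 55.0, False)))  # -> "voice"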
One project on display at the event, dubbed SENS, represents a new wave of social networking: it monitors a user's real-time activities and shares them live with networked friends and family.
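As a rough illustration of the SENS concept as described, the sketch below classifies a hypothetical accelerometer reading into a coarse activity and pushes an update to friends only when the activity changes; the labels, thresholds and send callback are assumptions, not the project's actual design.

import time
from typing import Callable, List

def classify_activity(accel_magnitude: float) -> str:
    """Map a (hypothetical) accelerometer magnitude to a coarse activity label."""
    if accel_magnitude < 1.1:
        return "sitting"
    if accel_magnitude < 2.0:
        return "walking"
    return "running"

def share_activity(readings: List[float], send: Callable[[str], None]) -> None:
    """Publish an update to friends only when the detected activity changes."""
    last = None
    for magnitude in readings:
        activity = classify_activity(magnitude)
        if activity != last:
            send(f"{time.strftime('%H:%M:%S')} now {activity}")
            last = activity

share_activity([1.0, 1.0, 1.5, 2.4], print)  # prints sitting, then walking, then running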
Researchers also demonstrated an experimental, low-cost energy
sensor, which could help change the way consumers manage personal
energy consumption at home. When coupled with a home information display, it would monitor
usage, recommend ways to use energy more efficiently and reward success. The sensor needs only to be plugged into the house wiring to instantaneously measure and wirelessly report the power consumption of each electrical load, providing data for analyzing the energy usage of devices and appliances throughout the home.
[Sidebar: Chronically ill people would benefit from collaborative sensing technology that continuously monitors heart rate.]
This technology forms the heart of a personal energy management system that could lead to valuable changes in behavior and substantial energy savings.
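As a rough sketch of how the per-load measurements described above could feed a personal energy management system, the snippet below rolls hypothetical (appliance, watts, hours) samples into daily totals and applies a simple recommendation rule; the appliance names, threshold and logic are illustrative, not the demonstrated system's.

from collections import defaultdict
from typing import Dict, List, Tuple

def summarize_energy(samples: List[Tuple[str, float, float]]) -> Dict[str, float]:
    """Sum kWh per appliance from (name, watts, hours) samples."""
    usage_kwh: Dict[str, float] = defaultdict(float)
    for name, watts, hours in samples:
        usage_kwh[name] += watts * hours / 1000.0
    return dict(usage_kwh)

def recommend(usage_kwh: Dict[str, float], budget_kwh: float = 2.0) -> List[str]:
    """Flag appliances whose daily use exceeds a per-appliance budget."""
    return [f"Consider reducing use of {name} ({kwh:.1f} kWh today)"
            for name, kwh in usage_kwh.items() if kwh > budget_kwh]

day = [("fridge", 150, 24), ("dryer", 3000, 1.5), ("tv", 100, 4)]
for tip in recommend(summarize_energy(day)):
    print(tip)  # flags the fridge (3.6 kWh) and the dryer (4.5 kWh)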
Another system demonstrated at the event changes how users engage with technology: projection and 3-D cameras light up nearby surfaces, displaying buttons, windows, images and movies on work surfaces, tabletops or other flat spaces. The video and vision system recognizes hand gestures and objects, turning everyday surfaces such as a kitchen counter, coffee table or classroom desk into an interactive portal to the device and the Internet. Also demonstrated was a more futuristic example: a computer that could read a user's thoughts, replacing the need for typing altogether.
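For the projected-surface system described above, a common way to detect a touch with a depth camera is to compare each frame against the known distance to the surface and treat small, near-surface regions as fingertips; the sketch below illustrates that idea with assumed distances and thresholds and is not a description of the demonstrated pipeline.

import numpy as np

SURFACE_MM = 1000.0        # assumed camera-to-tabletop distance
TOUCH_BAND = (5.0, 30.0)   # a fingertip hovers roughly 5-30 mm above the surface

def touch_mask(depth_frame_mm: np.ndarray) -> np.ndarray:
    """Boolean mask of pixels that look like a finger just above the surface."""
    height_above = SURFACE_MM - depth_frame_mm
    return (height_above > TOUCH_BAND[0]) & (height_above < TOUCH_BAND[1])

def touch_point(depth_frame_mm: np.ndarray):
    """Centroid of touch pixels, or None if nothing is touching."""
    ys, xs = np.nonzero(touch_mask(depth_frame_mm))
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

frame = np.full((240, 320), SURFACE_MM)    # empty tabletop
frame[100:105, 150:155] = SURFACE_MM - 15  # a fingertip 15 mm above it
print(touch_point(frame))                  # -> (152.0, 102.0)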
A slide show of "some of the more interesting projects," compiled by Chris Preimesberger of eWEEK, is also available; slides 4, 5 and 13 are particularly AIP related.