Computing reinvented – again
An announcement made at the opening of the Intel Developer Forum 2012 in San Francisco yesterday opens the way to a new vision of computing for developers, writes ARTHUR GOLDSTUCK.
Intel has taken a giant leap into the future of computing. A kit unveiled in San Francisco yesterday will give developers the ability to create gesture, facial and voice recognition applications for computers.
Some computers and gaming systems have offered these capabilities for a while, but now any developer will be able to build applications that take advantage of them quickly and easily.
Making the announcement during his opening keynote address at the Intel Developer Forum 2012, chief product officer David Perlmutter said that the personal computing experience was shifting to one based on “perceptual computing where devices will take on human-like senses to perceive the user's intentions”.
The first Intel Perceptual Computing Software Development Kit (SDK) beta will be released early next quarter, and geared towards existing and future Ultrabook systems and PCs using Intel’s Core processor. The SDK will be available for free download.
Developers will also be able to purchase an Interactive Gesture Camera Developer Kit, which incorporates a USB-powered depth sensor camera tuned for short-range interactivity. It’s intended for Intel-powered Ultrabooks, laptops or PCs, and includes an HD webcam, infra-red depth sensor and dual-array microphone.
Perlmutter demonstrated voice recognition on the Ultrabook, showing a system running the Nuance Dragon Assistant Beta optimised for Intel Core processors. This enables the machine to provide a voice assistance service, much like Siri on the iPhone or S-Voice on the Samsung Galaxy S3 phone. While this is not in itself revolutionary, it is an indication of the direction of computing: taking in the learnings from gaming, smartphones and tablets.
Speaking in a strong Israeli accent, Perlmutter quipped: “In about a year, the voice capabilities will even understand my accent.”
Ultrabooks are expected to come into their own next year, when Intel releases the 4th generation Intel Core processor family, codenamed “Haswell”. It is expected to deliver twice the performance of the 3rd generation at the same power consumption, or the same performance at half the power, doubling battery life.
This would open the way to “all-day computing”, enabling a computer user to work away from a power source for a full working day.
Perlmutter added: “Our focus to deliver even lower power with the great performance that our processors are known for is as fundamentally significant as when we shifted our development focus beyond sheer processor speed in 2001. As a result, you'll see our customers delivering sleek and cool convertible designs, as well as radical breakthrough experiences across a growing spectrum of mobile devices.”
He said that more than 140 different Ultrabook designs were in development, and more than 70 powered by the current 3rd generation Intel Core processors were available today.
He also announced the next-generation Intel Atom processor, codenamed “Clover Trail”, a new system-on-chip (SoC) designed for the portable demands of Windows 8. Intel positions it for lightweight tablets and convertible notebooks, promising longer battery life than anything offered today.
Intel is working with manufacturers to produce Windows 8 tablets and notebooks in a dizzying array of formats, including tablet, clamshell (a normal laptop with a lid), convertible (changes from laptop to tablet), and detachable (a laptop with a removable screen that becomes a tablet). The convertibles, in turn, include sliders (the screen slides out and up), swivels (the screen flips over to turn the laptop into a tablet) and rotators (the screen rotates for use as a tablet or presentation device).
- To be notified of the availability of the Intel Perceptual Computing Software Development Kit beta, register at http://software.intel.com/en-us/vcsource/tools/perceptual-computing-sdk
Follow Arthur Goldstuck’s coverage of the Intel Developer Forum 2012 on Twitter at @art2gee.