TEL AVIV, Israel--(BUSINESS WIRE)--Following hot on the heels of AMD’s product news from CES, eyeSight today announces that its leading gesture control technology has been integrated into AMD’s upcoming Accelerated Processing Unit (APU) platforms, “Richland” and “Temash,” intended primarily for desktop, laptop and tablet PCs. With eyeSight’s gesture recognition closely optimized for and integrated into AMD Gesture Control, AMD’s “Richland” and “Temash” APU solutions can process gestures with optimal speed, accuracy and efficiency. With eyeSight’s gesture technology, AMD is providing an ideal answer to the snowballing demand for intuitive gesture capabilities in both business and consumer devices.
eyeSight’s gesture recognition capabilities have been uniquely optimized for AMD APU technology. These enhancements enable unmatched recognition performance in challenging lighting environments using a standard image sensor.
By running the video processing algorithms on the upcoming APUs, eyeSight was able to achieve remarkably low CPU consumption compared with running on traditional x86 CPUs. In some cases, processing time has been reduced by a factor of 20, allowing eyeSight to introduce additional code and significantly increase recognition accuracy.
Although eyeSight’s gesture technology is already intuitive and accurate, the extreme efficiency of AMD’s upcoming APUs provides capacity for even more accurate and responsive gesture control in devices, as well as better overall performance of concurrently running applications. End users will notice this as a smoother, more seamless gesture experience.
“Working with AMD to bring gesture control to their 2013 APUs in this way is really significant. It validates growing consumer demand, and identifies gesture recognition as essential for digital devices,” commented Gideon Shmuel, CEO, eyeSight. “And pre-integration makes perfect sense: the ‘Richland’ and ‘Temash’ APUs carry out gesture operations, which are usually CPU-intensive, with extremely low impact on the processing load of the system, making for a smooth overall experience. AMD’s solution will clearly be a very attractive option for any OEM looking to build a PC, laptop or tablet with gesture control functionality.”
eyeSight’s technology enables touch-free control of a wide variety of functions, including the Windows 8 modern UI, PowerPoint, Windows Media Player, Windows Photo Gallery, eBooks, PDF readers, and more.
- ENDS -
About eyeSight

eyeSight was founded with the vision of revolutionising the way people interact with digital devices, making that interaction both simple and intuitive.
eyeSight was established in 2005 and is headquartered in Israel.
eyeSight is a leader in touch-free interfaces for digital devices. The company is led by a team of image processing professionals with extensive experience in the research, implementation, and optimisation of real-time algorithms for embedded platforms.
eyeSight is a privately held company.
Further details on the company can be viewed at http://www.eyesight-tech.com/
You can also follow eyeSight Mobile Technologies on Twitter at @eyesightmobile
eyeSight has a number of original use case videos, which demonstrate the broad capabilities of its technologies. Individuals are welcome to embed these videos on their websites, or to contact eyeSight for the original files.
About AMD Gesture Control
AMD Gesture Control is designed to enable gesture recognition as a tool for controlling certain applications on your PC. It is available only on the upcoming AMD A10 and A8 APUs codenamed “Richland” and the upcoming AMD A6 and A4 APUs codenamed “Temash.” It requires a web camera and will operate only on PCs running the Windows 7 or Windows 8 operating system. Supported Windows desktop apps include Windows Media Player, Windows Photo Viewer, Microsoft PowerPoint and Adobe Acrobat Reader. Supported Windows Store apps include Microsoft Photos, Microsoft Music, Microsoft Reader and Kindle. Performance may be degraded in low-light or intensely focused lighting environments.