Jack Spine writes "A robotics researcher at Accenture has demonstrated a 'Pocket Supercomputer' — a phone behaving like a thin client. It can send images and video of objects in real time to a server, where they are identified and linked to relevant information that is then returned to the user. 'The camera on the phone is used to take a video of an object — such as a book ... By offloading the processing from a mobile device onto a server, there are few limits on the size and processing power available to be used for the storage and search of images.' To pinpoint the features needed to identify an object, the image is run through an algorithm called the Scale-Invariant Feature Transform (SIFT), developed by computer-vision researcher David Lowe. The software extracts feature points from a JPEG and matches them against images in the database; if a match is found, the server retrieves the associated information and sends it back to the user's phone. A 'three-dimensional' image of an object can also be uploaded to the phone, letting the user view the virtual object from different angles. The motion-tracking technology Accenture uses for this is OpenCV (Open Source Computer Vision), a free library of algorithms."