Although Apple files patents fairly frequently, this particular patent, which was awarded today, got us excited. The patent details an augmented reality system with the capability to tag real-world items in a live video stream and display information about those objects using a heads-up display.
AppleInsider broke the story, detailing “Synchronized, interactive augmented reality displays for multifunction devices.” Basically, the augmented reality system would draw on features already built into iOS devices, such as the camera and multitouch screen. The AR system turns your world into a Pop-Up Video, overlaying a computer-generated layer of info on real-world images. The patent included several examples, one of which showed a user holding their iOS device over a circuit board.
What’s cool about this AR patent is that users can interact with the computer-generated info. If the system incorrectly identifies an object, or can’t figure out what it is, the user can fill in the blanks using onscreen controls. Plus, if they’re collaborating with someone, they can send the live data view to another device, provided both users have Internet connectivity.
A split-screen mode lets users see the live and computer-generated views side by side, instead of one laid on top of the other. The example from Apple showed a live view of San Francisco in one window and a 3D composite computer-generated image in the other. And users can sync this split-screen view between two devices, which could come in handy when trying to give directions.