Apple’s personal assistant is fairly versatile, but Siri could definitely stand to be more accurate. In fact, she’s so off the mark at times that some dismiss Siri as a novelty. However, thanks to some key hires in the field of deep learning, Siri may soon get a welcome IQ boost.
According to Wired, Apple is scooping up talent to pursue breakthroughs in neural network algorithms. The idea is to better understand spoken words via machine learning models that work like neurons in the human brain.
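To make the "works like neurons" idea concrete, here's a minimal sketch of a single artificial neuron, the building block these speech models stack into layers by the thousands. The weights and features below are invented for illustration; they don't come from any real Siri or Microsoft model.

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum of inputs, squashed to a value between 0 and 1
    # by a sigmoid -- loosely analogous to a neuron "firing".
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))

# Hypothetical acoustic features for one slice of audio:
features = [0.9, 0.1, 0.4]
activation = neuron(features, weights=[0.8, -0.5, 0.3], bias=-0.2)
```

A real speech system chains many layers of these units, and training adjusts the weights so the network maps audio features to the right words.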
What kind of improvement are we talking about? Microsoft reported a 25 percent jump in speech recognition accuracy after adopting this technology. Neural network algorithms are also what let Microsoft-owned Skype translate spoken words into other languages on the fly. Google (for Android voice recognition) and IBM employ neural networks as well.
Among Apple’s key hires is Alex Acero, who researched speech technology at Microsoft for 20 years. The company also nabbed Gunnar Evermann from Nuance and a researcher from the University of Edinburgh.
So what could Apple do with a neural network-enhanced Siri? It’s not just about making the iPhone 6 smarter, but the upcoming iWatch. Google already employs voice recognition on its Android Wear-powered devices, and it makes sense that Apple’s wearable would let users search and perform tasks more easily by speaking into the device.
For example, what if you could say this to your iWatch before your run: “Siri, keep my heart rate at 155 bpm.” Apple also has designs on the smart home with HomeKit, and Siri could be the ultimate liaison. Picture telling the assistant to set your home’s temperature by speaking into your wrist or phone — even while on vacation.