According to the well-known Apple news site MacRumors, citing new details shared by The Wall Street Journal in an article on how the company trains its voice assistant to handle the speech of users with impairments, Apple is studying how to improve Siri to better understand users who stutter.
Apple has built a database of 28,000 audio clips provided by users who stutter. According to an Apple spokesperson, this database will be used to train Siri and improve the speech recognition system's ability to understand users with speech impairments.
In addition to improving Siri's ability to understand users with speech impairments, Apple has added a Hold to Talk feature to Siri, which lets users control how long Siri listens, preventing Siri from cutting off users who stutter before they finish speaking.
With the Type to Siri feature, first introduced in iOS 11, users can also interact with Siri without speaking at all.
Apple plans to release a research paper this week describing its work to improve Siri, with more details about its efforts in this area.
Apple is not alone: Google is collecting voice data from users with speech impairments, and in December Amazon's Alexa Fund backed work on an algorithm that recognizes the unique vocal patterns of users with speech impairments.