Apple users are changing their accents to accommodate Siri, according to The Guardian’s Tom Dart. And the same goes for users of other apps built on speech recognition; many are straining to sound more North American to ensure that their apps can figure out what they’re saying.
Speaking to Dart, Language Technologies professor Alan Black says this is not an unexpected development, noting that people tend to adopt a “machine voice” when speaking to computer systems; indeed, he speculates that Siri was designed to be more of a polite assistant than a casual friend in order to encourage users to address her with a certain diction. But together with demographic changes and the difficult-to-measure influence of mass media, these limitations in speech recognition are helping to dramatically erode regionally specific accents.
Still, this is not an inexorable course. Black predicts that as speech recognition technologies advance, they will better adapt to users’ speech patterns, accommodating them rather than the other way around. And indeed, as voice command technology becomes an increasingly important user interface on mobile devices and in the Internet of Things, it will be in the interest of the companies developing these technologies to ensure that their systems can understand their customers. Google and Apple, for example, appear to be constantly refining their voice recognition technologies. There’s reason to hope that Siri will one day be able to properly advise you when you’re “try’na get oot of the Six” and in need of directions.
Source: The Guardian
(Originally posted on Mobile ID World)