If we weren't frustrated with our phones before thanks to ill-timed autocorrects, we're about to get a whole lot more pissed off. Now we'll be literally cursing at our phones (at Siri, specifically), repeatedly rephrasing and trying to enunciate more clearly whatever it is we're asking of her. She may officially still be in beta, but after getting our iPhone 4S in the mail just yesterday, we're already frustrated to the point of giving her the silent treatment. Find out how speech recognition works!
IBM's Watson and Apple's Siri have been making big news over the past few months. This fantastic infographic from medicaltranscription.net (it's so big we've broken it into 8 parts!) helps explain how automatic speech recognition (ASR) works. While the technology behind ASR is incredibly complex and powerful, the end result from a user-experience point of view has been hit or miss for Siri. As much fun as it is to play around with Siri and ask her silly questions, we ultimately just want to get stuff done with her as a valuable personal assistant - that's what she was pitched as in the first place. As excited as we were to really get to know her, in our first few hours we're finding she's all over the place in what she can understand and what she can do. Our fellow contributor, Chris Perez, seems to agree. So here's hoping that as the technology improves and Siri matures out of beta, she gets much better at understanding what it is we're asking for. Either that or -we-are-going-to-have-to-talk-like-this-from-now-on and then pray she understood the third time before we give up altogether. How have your own experiences been with Siri?
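To give a flavor of one idea the infographic covers: ASR systems typically score candidate transcriptions by combining an acoustic model (how well the audio matches the words) with a language model (how likely the word sequence is in the first place), then pick the best-scoring hypothesis. Here's a minimal toy sketch of that scoring step - the candidate phrases and their log-probabilities are made up for illustration, and this is in no way Siri's actual pipeline:

```python
# Toy sketch of ASR hypothesis scoring: combine an acoustic score
# log P(audio | words) with a language-model score log P(words) and
# keep the highest-scoring candidate. All numbers are invented.
candidates = {
    "wreck a nice beach": {"acoustic": -12.0, "language": -9.5},
    "recognize speech":   {"acoustic": -12.3, "language": -4.1},
}

def total_score(scores, lm_weight=1.0):
    # Log-domain combination: adding logs multiplies the probabilities.
    return scores["acoustic"] + lm_weight * scores["language"]

best = max(candidates, key=lambda words: total_score(candidates[words]))
print(best)  # the language model tips the balance toward "recognize speech"
```

The two candidates sound nearly identical, so their acoustic scores are close; it's the language model that resolves the ambiguity - which is also why unusual phrasing trips these systems up.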