Apple probably wants the semantic analysis technology behind Siri, which is supposedly pretty amazing, and the natural language processing is likely of interest too. They will probably integrate Siri's capabilities into the iPhone to turn it into something like a personal assistant. For example, they could add a "speak" button to the iPhone's search page and let users search their phone (contacts, etc.) or ask it a question like "What movies are playing around me?" They need this functionality to compete with Android, especially now that newer Android versions have voice search and integrated turn-by-turn navigation.

I can see Apple developing its own voice recognition technology rather than using Nuance; I think Apple can do a better job. It also makes sense for Apple to integrate turn-by-turn directions somehow, just to compete with Android. They could let Google provide turn-by-turn on the iPhone, but I don't think they want iPhone users that dependent on Google. So many iPhone users already rely on Google for mail, calendar, contacts, RSS reading, and more. Apple needs to find a way to pull people away from that reliance on Google and tie them more closely to Apple's own services.