Where's Alex when you need him?

If there’s one thing missing from the 3G iPhone that everyone’s calling for, it’s voice-guided navigation. While the Maps application serves a purpose, it can be distracting if you’re trying to follow a route while driving. Having to constantly check the screen isn’t as useful, or as safe, as having voice prompts.

The really baffling thing to me is that it would appear, if Apple are to be believed, that the functionality already exists. Take a look at the original iPhone keynote. Steve Jobs made a point of telling everyone that the iPhone would be running OS X. If that’s true, any “service” that runs on OS X should also run on the iPhone. And with Leopard, Apple introduced a new way to speak written text: Alex. So the question is, why can’t Apple simply use Alex to speak the directions already provided by Google Maps?

When you think about the functionality already available in Google Maps, it wouldn’t be a huge amount of work to adapt it to work with Alex. It already plots a route and follows your position along it via GPS. All you really need is for Alex to “say” the directions Google already gives you.
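On the desktop, Alex is just another voice available through the standard speech synthesis API, so speaking a direction string is almost trivial. Purely as an illustration, here’s a minimal Swift sketch of what that looks like on the Mac using NSSpeechSynthesizer (written against today’s AppKit, not anything Apple has said about the iPhone); the direction text is a made-up example of the kind of instruction Google Maps already produces.

```swift
import AppKit

// A hypothetical direction string, the kind Google Maps already generates.
let direction = "In 200 metres, turn left onto Main Street."

// Ask for the Alex voice that Apple shipped with Leopard.
let alexVoice = NSSpeechSynthesizer.VoiceName("com.apple.speech.synthesis.voice.Alex")

if let synthesizer = NSSpeechSynthesizer(voice: alexVoice) {
    synthesizer.startSpeaking(direction)

    // Keep the process alive until the speech finishes
    // (a real app would use the synthesizer's delegate instead).
    while synthesizer.isSpeaking {
        RunLoop.current.run(until: Date().addingTimeInterval(0.1))
    }
}
```

If something that simple works on the desktop, wiring the same voice up to the turn-by-turn text the phone already has doesn’t seem like a stretch.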

This really got me thinking about some of the other native OS X features that would be welcome on the iPhone. So, apart from the obvious (copy and paste, video capture), which desktop OS X features would you like to see on the iPhone? I’ll throw a more thorough implementation of Cover Flow (how cool would it be for contacts?) into the ring, along with Spotlight. So what would you like to see?