[NTLK] Who here thinks the iPad is a worthy replacement for the Newton?
Ryan
newtontalk at me.com
Sun Mar 14 17:37:26 EDT 2010
I'll say a few things on this...
The reality is that the market has decided what it prefers. The majority of people prefer a keyboard and now more and more people are using virtual keyboards on devices like the iPhone for input.
Pen computing overall failed in the marketplace. It failed for smaller devices like PDAs and larger devices like tablet computers. The market decided this. It had a good, long run but people just never took to it because of its inefficiencies.
Let's look at practical applications. I am standing on the train and there are people everywhere, noise, etc. I can phone someone without even touching my iPhone handset, using its voice control or a noise-cancelling Bluetooth headset. I can create a contact with one hand... I can navigate the web with one hand... I can read RSS feeds with one hand... I can type a quick note with one hand...
With something like the Newton, that's really not possible. It takes two hands... it's much more unwieldy and thus impractical in busy, stand-up environments.
With speech recognition, of course, those environments aren't places where you'd use it. It works well in quiet environments, as you say. However, for basic speech input, BlueAnt's noise-cancelling headset, for instance, works great in noisy environments.
But a laptop with a keyboard is not good in those noisy, busy, "stand up" environments either. So devices like the iPhone are the most practical of anything, and that's why so many people are using them, and not devices driven by a stylus. One hand vs. two hands...
Speech recognition is really here to stay. Doctors and lawyers for starters... it's widely used there. And now more and more consumers are starting to use it. Of course, consumers who are in the right environments at work can use it... freelancers who work from home, etc. Lots of people are using it. I recently did a seminar on it for language translators, and it's a bit of a revolution for them. Since they started using it, their output on the translations they dictate has increased, they have less wrist pain because they type less, and the quality of those translations has gone up. They can now just look at a small paragraph and say it the way they would say it in their language. It's incredibly fast and liberating. When you look at the kind of work they do, slaving away on a keyboard typing tens of thousands of words a week, week after week, it has changed their lives. They can even stand up, away from their computer with a wireless headset, and translate by speaking... this itself abstracts the computer away. More on that in a minute.
Speech recognition has made my life easier too, since I type a lot of emails during the day. Speech recognition has increased my production significantly, and my wrists are not really sore anymore. Like everyone else, I mix keyboard use with speech recognition during a normal day. For those longer emails, I use speech recognition. And so many others I know who have an office use it too. After the dictated email is finished, I sort of just smirk because I know how much longer it would have taken to type it all up. Dragon's speech software is now so good that you can speak as fast as you like and it makes very few errors. Dictation speed can reach 160 wpm and beyond. I demonstrated it at the Newton conference last year, dictating a passage from a book. It made two errors, and what it got right was surprising, like putting capitals on Hewlett Packard, etc. I spoke pretty much as fast as I could.
With the iPad... if you noticed in Steve's keynote... something that does not get much attention in the media, he explained why the iPad is useful: that it is better than a smartphone, better than a laptop/netbook, at a core set of things. For instance, browsing photos, browsing the web, reading eBooks, etc. It's not something to try to put in a pocket to be pulled out on a train. That's where the iPhone comes in. If the iPad really is better at these things, then Apple is doing what it does best: giving people something they need before they know they need it.
So the reality with all this is that the market has really decided what it likes better.
The evolution of computing is what people are starting to write about more... that computers are slowly but surely being abstracted away. That means peripheral devices like mice, keyboards, and styluses are being conjoined into one unit. The iPhone and the iPad are examples... it's the multi-touch that makes this conjunction possible. Instead of all these peripherals, instead of file systems, instead of worrying about specs, all that stuff is becoming less important. And as others have said, we are really getting into an era where computers are becoming easier to use, and devices like the iPhone with simplified GUIs have made this possible.
Using abstraction as an indication, things will further develop on this front, where computers will eventually disappear and become part of us, as in augmented cognition. That would be the most efficient computing paradigm of all. No longer will computers be separated from us; they will be part of us. That increases the bandwidth at which you can compute by many orders of magnitude. Brain-wave computing is something that will enable this. And in that future, we will be able to control things in our environment just by thinking about them. There is already lots of exciting research demonstrating that these things are possible... so I see the iPad as just another step toward completely abstracting the computer away. And like anything, it has its practical applications...
Thank you,
Ryan