Lower Cost Structure - In our industry (diabetes) you give away hardware to get ongoing disposable revenue. This hardware is expensive to produce and develop. Plus you spend a lot of effort on areas that don't add much value, e.g. reinventing the wheel re: display drivers. This completely changes the economics of the industry.
Higher Product Quality - By taking advantage of core tablet attributes like the color touch screen and processor, you can do things feature-wise that would be prohibitive with custom hardware. The amount of "ooohs" we get showing off our iPhone UI vs. the current LCD one is staggering.
New Revenue Opportunities - Again, our business has thrived on a single revenue stream, the disposable test strips. With connections to the web, all manner of "virtual goods", subscription services, and other digital business models get opened up.
Overall it is a huge win for both user and entrepreneur and is going to fundamentally change a bunch of hardware businesses.
If you want to see our product, here is a nice review (http://www.fastcodesign.com/1662351/blood-glucose-monitor-fo...)
Cocoa's for-scientist, rapid-development ideology is perfect for using the OS to create an ethereal device.
Also, that's some solid hardware/software.
In general, yes, it is a bad idea to use consumer technology for devices with the sort of lifetime you find in the medical industry (often up to 10 years). Because of the high levels of regulation and testing, it's not cheap to swap out a component for something equivalent when a manufacturer stops producing the one you originally designed and tested your system with.
No platform is perfect.
(This isn't to say I don't agree with you that they shouldn't be able to kick you off, but, you know what? No mobile platform is good in that respect.)
Google Voice was rejected. It was never in the store. (Although, it is now.)
I think the key question here is whether or not the app-universe grows in size until consumers desire separation of widgets again.
I know from my own experience that I maximize productivity by having separate devices responsible for separate things. For instance, when I pick up my blue iPod it's for education -- I keep books and lectures on there. But when I pick up my black iTouch it's for fun -- I keep only tunes there. My phone -- although it has all kinds of neat wizardry in it -- I use solely for talking to other people.
Perhaps both trends are true. Perhaps we end up individually separating our apps into physical devices based on preference instead of tradition. Neat stuff.
I remember marveling at the YouTube app the first time I picked up an iPad. It was the best YouTube experience I'd seen; it felt like I was holding a purpose-built device for YouTube viewing. The home button simply prepared it to morph into a different purpose-built device.
[NB I noticed the other evening that an iPad at the distance I normally cradle it is larger than our 50" plasma at whatever distance that is from where I sit - no wonder it feels so immersive]
In a sense, the tablet format is more than just a software or hardware platform. It's more akin to a new kind of material - one that you can mold at will, through software, into a myriad of distinct widgets.
Don't underestimate hardware. A slightly dodgy software keyboard like the iPad gets rave reviews, where slightly too small netbook keyboards get slated for not being perfect full size keyboards. It's like Johnson's famous comment on female preachers: "Sir, a woman's preaching is like a dog's walking on his hind legs. It is not done well; but you are surprised to find it done at all."
The key, of course, is that a capacitive touch screen can do lots of different things okay and effectively disappear when not in use, rather than doing one thing really well and sitting there as a big hardware lump the rest of the time.
That sentence is one instance of this template:
Developers have used <platformFunction> in ways <inventor> could never have imagined.
What better argument for open standards, APIs, and community cooperation?
I'm all for open standards, APIs, and community cooperation, of course. I just don't think that Apple is necessarily the antithesis.
The fundamental difference between Apple's position today and Microsoft's monoculture of the previous decade lies in their motivations.
Apple wants to make great stuff.
Microsoft wanted to control the world.
btw PG: rfs8.html is not on the rfs.html page yet.
Apple on the other hand clearly does care what you put on your iOS device, and in ways that go beyond a desire for good UX to a desire to control what is acceptable for society:
MS did try to push people into software as a service (but they failed at it). They support DVD region encoding. They have introduced DRM music services. They struck agreements to hamper attempts to sell dual-boot systems. I didn't experience this but understand that at one point Windows Media Player would happily scan existing content on the drive and helpfully encumber it with DRM. SharePoint takes data in from all sorts of sources but it's much harder to get it out again.
MS have also EOL'd products that dominated major market segments despite users who would have paid plenty for stability: the original VB, Excel before the UI change, every NT release.
These are the actions of a company trying to increase its control over users, attempting to control what they do on their computers.
By way of comparison, traditional unix vendors would be more helpful to big-pocketed companies who wanted to pay to continue to use old software. MS could have backported a firewall and other patches into NT4 that would have hardened it and kept it going if they'd wanted to - these are the teams that reverse-engineered Photoshop and hundreds of games and then hotpatched new code over their memory segments when they loaded, so that they could get DOS and W3.1 software to run fine on Windows 95 for the release. Interesting alternate history: what if MS had tried to keep NT4sp3 tight and stable, kept drivers coming, and brought it forward conservatively?
Oh, how a decade of stagnation can change perceptions...
I wish people would stop saying that Bill Gates has always been a nice guy because he's retired and giving away his money.
Don't fool yourself into thinking that MS is more open, they don't have much of a mobile OS for you to compare with iOS anyways.
And Apple is going to kill off so many hardware companies, because if it wins the way Windows did, it's going to be the sole arbiter of who gets to build hardware for it.
All MS wanted was for you to pay the Windows tax. Apple wants to control the hardware as well. E.g. Windows Phone 7 is an OS that various OEMs can run on different devices. iOS is basically just firmware for Apple's devices.
It may be hard to appreciate today, but in the 90's you couldn't start a software company without a ready answer for how you were going to survive if MS decided they had to kill your product to protect the Windows monopoly.
I don't think anyone claimed that Apple, e.g., censoring porn is the same kind of thing as Microsoft murdering competition and slowing down development.
For porn -- relax, much online video works on the iPad. :-)
I would argue almost all areas are better with competition. If you prefer monopoly solutions, there is still North Korea (Cuba is going a bit capitalist, from what I've seen).
Compare a Mac with windows, instead.
Aside from that, Apple can certainly be prudish, but that's more out of a desire to remain unobjectionable rather than a desire to control what is acceptable for society. They don't care if you play controversial games, they just don't want to be associated with such games.
Just look at WebKit and how Apple's competitors are building (or are in the process of building) their entire foundations on top of that tech, and Apple still contributes.
You don't have to use any Apple device if you don't want to. That choice is yours.
I spent many years working in places where you had to use Microsoft products. There was no choice.
Microsoft didn't care what you put on your machine because all they care about is making money selling you stuff.
That's ridiculous - you had choice - you could have chosen to work elsewhere. Can't get the same pay, conditions, etc? That's not the point.
These days there are plenty of jobs where you'll have to use a Mac because Jobs has dictated that you can only do iOS development on a (recent) Mac.
If their number one priority was really to please the end-user, they'd be selling everything at cost, open-sourcing and giving away OS X for free, etc. People are generally greatly pleased when they can get cool stuff for cheap or free.
Apple wants to please their customers so much that Apple is deciding what apps their customers are allowed to use, because their customers are too stupid to correctly choose the application that works best for their needs. And if you attempt to circumvent this restriction, Apple will do everything in its power, technically and legally, to stop you from doing so.
Did Microsoft ever endeavor to do this? Personally, I find top-down control of the entire distribution channel more of a "platform lock-in" thing than encouraging the use of proprietary IE extensions.
It seems to me that Apple develops more to please Apple than to please end-users. The worship that Apple gets is so very silly, in my opinion.
EDIT: here, straight from the horse's mouth
Your comment, while mostly fiction IMO, does have a nice storyline.
I've known plenty of people whose jobs didn't even require the use of a computer; if you are so picky about the software you use, perhaps you could consider a line of work that doesn't involve much computer usage.
Then that's when I'll start to take issue with them.
And Voltaire, "No snowflake in an avalanche ever feels responsible."
The OP's argument doesn't necessarily follow, because usefulness does not equate to functionality and hack-ability. Good design is not something you can quantify by "featurefulness". If it were, users would have stopped buying new copies of Microsoft Office years ago. Following from the end line of Paul Graham's post: if you give hackers an inch, they will take you a mile. PG doesn't state whether or not that is a mile in your preferred direction of travel.
If there were any way to quantify this -- "quality-of-life improvement credits" -- Apple would likely be quite rich in these as well as simply in financial currencies.
It's hard to imagine now, but there will be an Apple without its fearless leader one day. They are a profit-seeking venture like any other--brilliant marketing has people convinced otherwise.
My question is: what has Apple done to directly propel the world (particularly the 3+ billion people in the "third world" - impoverished and middle class) towards globalization? Microsoft (perhaps in their bid for control) has donated & sold (or subsidized) millions of products to this group of people. In some countries, Apple hasn't even attempted to sell their products at reasonable prices despite the opportunity. Isn't ushering all parts of the world into the next decade as important as creating the next great iOS?
If the workers were not happy with their workplace and conditions, they are free to leave, or even better, organize a labor union.
Also, be advised that GE could also pay their Chinese laborers a bit more, if you were willing to pay more for your microwave, your water heater, your light bulbs, your cookware, etc. It's not going to happen. Let China enjoy being the manufacturing capital of the world while we still have fossil fuels, k?
Also: Developers have used the accelerometer in ways Apple would never approve of. These apps we don't get to see. At least not yet.
After having two jailbroken iPhones, I have to say, I didn't really see any killer accelerometer apps in Rock or Cydia.
If they used the accelerometer to measure how hard they hit people over the head with iPad when mugging them, and triggered a string of copycat attacks, that wouldn't be desirable for Apple or society.
It's almost arguing against firewalls - why restrict computer access when developers can find so many new and interesting uses for it? (Buffer overflows, spam and more are why. Default-deny is the good-practice alternative. Assuming that's my stance, your suggestion isn't a convincing argument at all.)
And your firewall example is not relevant. Hackers aren't using firewalls against the companies that employ them, they are circumventing the firewalls. What you said is more like "people can climb over fences easily, so we should stop building them around our yards"
My example is relevant because it directly counters edw519's implicit claim that "if an idea can be used in surprising ways, that is good" by citing open network connections as an example of something used in surprising ways which are bad, and our opposition to that by using a "not open" stance as the agreed best position.
To be convincing that openness is good, it would need to be established that significant good comes from it, or that more good than bad comes from it, or that at least serious bad cannot come from it or similar.
Saying that X can be used in previously unexpected ways is neither here nor there as an argument for open APIs or collaboration, whether or not I like open APIs and collaboration.
Actually, on rereading your original comment, I think I misunderstood that paragraph. The exact wording is still very confusing to me, but what I think you were trying to say is that edw519's implicit argument (for all X, X should be open because it can be used in surprising ways) can be applied to network connections, where network connections are the X. I thought you meant that firewalls were the X, and that idea is what I was arguing against.
In fact, one entire category of technology that's overlooked right now is sensors. Besides sound (mic) and light (camera), accelerometers and magnetometers are already there. Some other detectors of "state" that make sense:
Pressure (think of being able to predict rain)
Humidity/Temperature (think of an environment controller)
Directional Microphone (4): Conferencing, noise cancellation, etc.
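For what it's worth, the sensors already on these devices are trivially cheap to get at - a minimal CoreMotion sketch of reading the accelerometer (modern Swift shown purely for illustration; the 50 Hz rate is an arbitrary choice):

    import CoreMotion

    let motion = CMMotionManager()

    func startReadingAccelerometer() {
        guard motion.isAccelerometerAvailable else { return }
        motion.accelerometerUpdateInterval = 1.0 / 50.0  // 50 Hz, arbitrary
        motion.startAccelerometerUpdates(to: .main) { data, _ in
            guard let a = data?.acceleration else { return }
            // Acceleration in g along the device's x/y/z axes.
            print(a.x, a.y, a.z)
        }
    }

Any new sensor added to the list above would presumably be exposed through a similarly small API, which is what makes this category so attractive to app developers.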
Galileo might take issue with that: http://www.jimloy.com/physics/galileo.htm
Using an accelerometer would likely be a worse user experience than just stepping on a scale, but a system where the iPad actually measures your body fat could be easier to use.
However, it will likely not be very accurate. Perhaps if one were to set a reference weight at the start, and it just measured the delta using the touch screen?
Using the accelerometer, a possibility is to see how close a person can swing the iPad towards his body. I assume that this would change depending on the amount of fat around his arms.
A common place people gather fat is on their thighs - thin people often have space between the legs and heavier people not as much. Also a possibility, even if not a very clever one.
So it would be possible to give an estimate of upper body weight if a person swung the iPhone with a straight arm. It could give the body mass as a multiple of the hand's mass. It would be easy to cheat though.
(Is this the current world record for deletionism, or have there been more egregious examples?)
The Wikipedia article on ephemeralization is somewhat more complete and is what is linked from the page on Fuller.
Other internet sources seem to indicate that ephemeralization is the term he used.
Edit: Someone has already undeleted it.
The really amazing things are not the genius predictions that are right and venerated, but the predictions that are wrong whose predictors are still venerated.
http://en.wikipedia.org/wiki/Thomas_Robert_Malthus - 1798 predictions of "gigantic inevitable famine" because of population growth. (Industrialization and technology have allowed population growth dramatically beyond Malthusian limits. The same arguments are still made today however.)
http://en.wikipedia.org/wiki/Criticisms_of_Marxism - "the socialist revolution would occur first in the most advanced capitalist nations and once collective ownership had been established then all sources of class conflict would disappear" http://plato.stanford.edu/entries/marx/ "labour intensive industries ought to have a higher rate of profit than those which use less labour" (argue about whatever forces you want that have defeated socialism, but the argument continues)
http://en.wikipedia.org/wiki/Simon%E2%80%93Ehrlich_wager Malthusian environmentalist Ehrlich predicts hundreds of millions of people starving to death in the 1970s and 80s, and "a genuine age of scarcity." He also loses a 10-year bet against economist Simon. "All of [Ehrlich's] grim predictions had been decisively overturned by events...Repeatedly being wrong actually seemed to be an advantage, conferring some sort of puzzling magic glow upon the speaker." [Wired]
Recorded, detailed, insightful, coherent, highly varied, completely wrong.
Oh, and Ehrlich won the MacArthur Foundation genius award AFTER his predictions were proven wrong...
I'd be surprised. Very, very, very surprised.
Ubiquitous Computing is upon us, and much like the PC revolution, it will have been invented at Xerox and perfected at Apple.
Really? I mean, it might help someone not need reading glasses for the tasks they do on the iPad... but they still have to have them to read the dinner menu, instructions on the box of food, and so on. If you can't replace it fully, how good is it? The iPhone replaced regular cell phones because you no longer needed two devices.
Or... I read too much into that.
A macro-capable camera & some sort of lighting and suddenly you replace one of these: http://www.independentliving.com/products.asp?dept=264&d...
It not only magnifies, it lights up with the camera flash, it can reverse the black/white or add a color filter if you find that easier, take a snapshot and let you zoom around it, etc.
Snap a photo, two-finger zoom on the photo, and you're done.
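The black/white reversal bit is a one-filter job. A sketch using Core Image's built-in CIColorInvert filter (the helper function itself is hypothetical, not from any particular magnifier app):

    import UIKit
    import CoreImage

    // Invert a snapshot (white-on-black is easier on some eyes),
    // one of the magnifier tricks mentioned above.
    func inverted(_ image: UIImage) -> UIImage? {
        guard let input = CIImage(image: image),
              let filter = CIFilter(name: "CIColorInvert") else { return nil }
        filter.setValue(input, forKey: kCIInputImageKey)
        guard let output = filter.outputImage,
              let cg = CIContext().createCGImage(output, from: output.extent)
        else { return nil }
        return UIImage(cgImage: cg)
    }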
For now, tablets are great. And Apple is great at supplying them. But by no means does this mean anyone will be enslaved to Apple in the long term – someone else has the opportunity to create an open platform that enables any and all technologies to communicate with each other. Someone else will have to sell this platform to businesses, governments and, most importantly, consumers. And someone else will have to create the other, new interfaces by which we access and derive meaning from this data collection. And the challenge of preventing this from being too closed, too proprietary, is what will distinguish the best approach from the most profitable approach, and where we as users can choose to avoid a "client monoculture."
The tablet approach is just a step in an ongoing direction. It's way bigger than this.
That said, I would love to hear more about your idea, if possible.
On a side note, I really need an innovation for keys, they scratch my iPhone! So, go, that new yc company, go!
What we now call a phone is developing so quickly that naming it after what it does today will mean we have to give it another new name in a few years.
What happens when someone wants to release a NES-inspired D-Pad controller for iOS but wants to allow existing game makers to create apps that support it? Right now that is sort of possible, but it's very high friction.
Apple is a company that likes to build the whole stack, from hardware to software; they feel it is necessary to create beautiful experiences. Will they compromise on this to facilitate a world where you can connect your iPhone to any device in the house?
If they don't, progress may stagnate, hacks (like communication over wifi) will persist, and they are potentially giving up market share. Obviously they need to maintain the integrity and stability of the iOS devices, but in my opinion they err too far on the side of caution.
They won't for people who need all of the functionality of an SLR or dedicated GPS. But how many of the people buying such things actually need all of the functionality? Quite often, they don't. They buy more than they need to indicate status, or because they fear having an inadequate device.
Tablets provide an alternate form of status at the moment. And the fear of inadequacy is mitigated by their popularity: If everyone is buying one, how bad can it be?
Single-purpose cameras and GPS devices are well on their way to marginalization as extremely niche devices. I think that will be true of almost everything a tablet can replace.
I think if phones started coming with a better flash, it'd be very hard to justify bothering with a separate point-and-shoot camera. Although, to be fair, most point-and-shoots have a pretty crappy flash as well.
Another generation or two to sort out the always-on/background bugs of current iOS GPS apps and the battery life bit, and tablets will take over - fast processors, big screens, easily available hardware already present to develop on (sound, internet access, etc). Why not?
So for all intents and purposes, I think it'll replace cameras and GPSes for the mainstream consumer, and you'd only use dedicated hardware in specialized cases.
Isn't the only difference between an iPad and an iPhone the screen size? So really, we're starting to refer to these devices based on size rather than power/memory/speed.
Let me give you a clear example: my computer 10 years ago had 128 MB of RAM. My iPhone has 16 GB of storage. Does that make my old computer not a computer?
My point is that the usage is what matters. I still use my computer to browse the web, compile code, run Excel and Word, etc.
Sure, none of those are specific to the "computer" form factor. But they are all things that people think of when they think about their desktop computer (some mac and linux users excluded, of course: but that's still only 10% of the computer-using population). NONE of those items I just listed are problems with any "Tablet" computer.
The form factor is different, but so is the very model of user software and computing.
Of your 5 points - boot up and software update are reasonable. The other 3 can absolutely be issues.
The point is that they are not perceived in that way, so people are not scared of them. It doesn't matter how true it actually is, it matters what people think.
Phones take a while to boot too, but like most modern computers you just wake them from some kind of sleep mode when you want to use them.
Most phones don't use package management in the linux sense, so if Apps A, B and C are all using some buggy library that isn't part of the OS then who knows on what schedule they're going to be updated since you're carrying three copies of the same buggy library (This is ignoring side-loading and jailbreaking).
Something MS either doesn't get or can't admit publicly.
The only reason we even consider calling them "mobile devices" is that the iPhone preceded the iPad. If the iPad had come first, we wouldn't think of the iPhone as a phone; we'd think of it as a tablet small enough to hold up to your ear. Hence the joke calling the iPad a giant iPhone. That was a pretty good description. If the future of telephony is VoIP, then that is pretty much bang on.
The dashboards of the near future are going to start piping a lot of control over to paired devices, but I think they'll maintain their current level of sophistication in and of themselves. But those paired devices will be a huge opportunity.
And in the same arena: automotive diagnostic machines are going to be replaced (e.g. OBD tools becoming an OBD->Bluetooth adapter + software), as well as repair and service manuals. (Who needs a book to tell them when to rotate the tires if the car tells your phone?)
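To make the OBD point concrete: the protocol side is simple enough that the hard part really is just the adapter plumbing. A sketch of decoding an engine-RPM reply in the text form an ELM327-style adapter returns (the helper function is hypothetical; mode 01 / PID 0C and the formula are the standard OBD-II ones):

    // Mode 01, PID 0C (engine RPM). A reply looks like "41 0C 1A F8",
    // and the standard formula is RPM = (256 * A + B) / 4.
    func rpm(fromOBDReply reply: String) -> Double? {
        let bytes = reply.split(separator: " ").compactMap { UInt32($0, radix: 16) }
        guard bytes.count >= 4, bytes[0] == 0x41, bytes[1] == 0x0C else { return nil }
        return Double(bytes[2] * 256 + bytes[3]) / 4.0
    }

    // rpm(fromOBDReply: "41 0C 1A F8") == 1726.0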
Those aren't anywhere near a fully programmable tablet, but I think that's still a long way off anyway, since there might be regulatory problems, e.g. maybe you have to be able to see the speedometer at all times. I'm not totally convinced I want a fully programmable display anyway - I don't think that a million customisable settings would enhance my car. But I'm sure there are people who would like it...
"Skyline that had a screen with heaps of different displays (g-force meter and what have you)"
You're thinking of the Nissan GTR: http://www.youtube.com/watch?v=BBOPwHgpkFk (skip to about a minute in)
Also in the summer, I was camping when my flashlight died en route to the washroom. On the side of the path, I downloaded and installed a flashlight app, and then used it to find my way.
I volunteer in an after-school guitar class at my son's school, and use my phone to tune the kids' guitars before class starts.
A few weeks ago, a website I maintain went nonresponsive and I used my phone to ssh into the server and restart apache.
Just for fun, I installed an app that measures my heart rate using the camera.
Just five years ago, if you had suggested these uses for a phone, I would have thought you were nuts.
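That camera heart-rate app, incidentally, is just photoplethysmography: a fingertip over the lens gets slightly redder with each pulse. A toy sketch of the signal side, assuming you've already extracted the mean red brightness per frame (the crude peak-counting here is an illustration, not how any shipping app necessarily does it):

    // Estimate BPM from per-frame mean red brightness and the frame rate.
    func bpm(redMeans: [Double], fps: Double) -> Double? {
        guard redMeans.count > 2, fps > 0 else { return nil }
        // 5-point moving average to tame sensor noise.
        let smoothed = redMeans.indices.map { i -> Double in
            let lo = max(0, i - 2), hi = min(redMeans.count - 1, i + 2)
            return redMeans[lo...hi].reduce(0, +) / Double(hi - lo + 1)
        }
        // Count local maxima; each peak is taken as one heartbeat.
        var peaks = 0
        for i in 1..<(smoothed.count - 1)
            where smoothed[i] > smoothed[i - 1] && smoothed[i] > smoothed[i + 1] {
            peaks += 1
        }
        let seconds = Double(redMeans.count) / fps
        return Double(peaks) / seconds * 60
    }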
However, I'd make the following claims:
(1) Some of the functions are simply not used that much. You can get by perfectly well without knowing what double and triple clicks on the home button do, for example. If you don't know they exist, you will never even notice.
(2) iOS has fewer features than Android, and arguably this makes it somehow less powerful but also less confusing for my mom (e.g. no task manager, no intents).
(3) Switching to a Mac was actually quite challenging coming from PC land ("where's maximize?", "where's a working Alt-Tab?", etc.). I'd say the usability still wins out in how easy it is to do your job once the learning curve has passed. Your mileage may vary, but I find myself enjoying using my Mac more than I ever did using Windows on my PC for usual tasks, and that's part of what makes it usable for me. I'd expect a similar phenomenon with iOS.
On the other side, you have textbook publishers, who generally earn a lot of money from ever-so-slightly changing editions and will do a lot to keep that income stream from dying...
In my final days there they were just starting to get interest from some institutions who wanted to buy Sony eBook Readers (as the best devices then on the market) preloaded with 30-50 books, then dish these out to students on some high-cost courses. They felt the students would prefer this (a not unreasonable belief with that volume of material) and it'd be easier for them to manage.
So, yes, this sort of device (in the broad sense) will very likely replace textbooks at least partially. I saw it happening first hand a few years ago and see no reason it should have slowed down since.
I respectfully disagree.
The trend for specialised vs. generalised devices seems to go in cycles over a period of a few years, in a similar way to the classic thick vs. thin client cycle. Consider games consoles vs. gaming on PCs, the iPod vs. mobile phones with media storage, etc. Neither extreme is ever going to take over entirely, and the bias moves as technology evolves.
I think this is mostly driven by trying to balance convenience and power. When new tools come along that are generic enough to make a certain broad class of jobs easier, we tend to jump on them. Many jobs get moved to those devices, and specialist devices that used to perform those jobs become obsolete. On the other hand, if you get too generic, you start to introduce waste and therefore inefficiency, which pushes things back the other way. Also, if your generic device is OK at doing lots of things but not particularly good at any of them, there is still a market for specialised devices that do a particular job better because their priorities are more appropriate.
We used to write software that ran on desktop PCs, but it turned out that a lot of practically useful software is essentially a simple user interface to a simple database. Native applications had common pain points in this field that could be overcome by hosting the code and data centrally, in areas like installation/updating/backup. Thus Web apps were born.
However, today, we're seeing major players in the industry trying to turn just about everything into such an application, and they are failing. It turns out that while Web apps are great for presenting relatively simple database UIs, they are relatively weak at performing most other tasks. Cloud computing is a pretty direct extension of the same argument.
I suspect things will go the same way with phones/tablets/mobile devices. A generic mobile device with a bunch of common built-in peripherals and sensors will solve a wide variety of real world problems, and thus various kinds of mobile app have been born. No doubt many more variations will follow over the next few years, as these devices support new functionality that was not previously available and ideas will spring up to take advantage of that functionality. The devices will be good enough for these purposes and will be widely adopted as a result.
On the other hand, Swiss army phones could easily start to suffer from both overspecification in breadth of features and underspecification in performance of individual features. For example, the suggestion in the article to replace reading glasses with a smart phone seems unrealistic and oversimplified to me: it sounds great initially, given that we have cameras and screens on these devices, but then you consider the vast range of different reasons that people are prescribed glasses, the consequent individuality of each prescription, and the fact that glasses do not generally require holding in your hand to use them.
In short, I'm afraid I don't buy pg's argument here at all. A certain class of applications, some of which already exist and some of which will be developed, will probably move to handheld multipurpose devices. However, specialised tools aren't going away any time soon, because any generic device is always going to be either a poor replacement for a good tool or too highly specified to be efficient for a broad market, even if the technology exists to combine high-quality implementations of all the required features within the required space and cost constraints in the first place.
Devices like the iPhone and iPad replace a lot of things; the example of glasses really wasn't a strong one. Consider this list of things being replaced: phone, mp3 player, GPS, maps, compass, books, wrist watch, stop watch, alarm, photo album, voice recorder, notepad + pen, calculator. From now on, most people with a tablet device will use that instead of anything on the list most of the time (except probably notepad + pen). These things will just never be nearly as popular as they once were, and the list is very incomplete.
It's just too convenient to have all those things condensed into one easy-to-use device, and in many (but certainly not all) cases the tablet is better than the device it's replacing.
In terms of handheld/mobile devices, I suspect you're being a little optimistic with your list of dedicated tools that can be replaced effectively by building on the more general platform.
For example, digital photo frames are great, but you probably have several of them around your home if you use them. Those devices don't each need to come with an accelerometer, a GPS system, and audio connectors, and all those useless (in this context) extras push up the cost.
Books are an interesting case, where I can see technology improving and converging to the point where tablets really do take over, but I expect it will be quite some time before this becomes the norm. I wouldn't be surprised to see a passive, full colour, high resolution screen inside the next five years, but there is much more to this issue than just the technology. I personally believe that reconciling the various commercial, social, logistical and ethical issues is going to be the hard part, and a lot of these probably won't even enter popular debate until the tech is ready to do the job.
Perhaps the underlying theme here is that even if the technology in a mobile device is capable of replacing many other tools, whether it can do so effectively is a different question, and one that has to take into account factors like cost-effectiveness as well as technical capability.
The same argument can be applied to the scale example pg gives. Sometimes when you are dripping wet, in the privacy of your bathroom, a scale is just perfect. That doesn't mean your scale will never talk to an app that lets you track your weigh-ins; it just means a scale is a dedicated device that people will likely want to keep for the foreseeable future.
But pg is dead-on on many items, as are you. There will be, as has been, a migration of specialized devices to software solutions.
Where the line will be drawn (due to consumer preference i.e. value creation) is up for debate though!
And I love good maps, it's quite sweet NOT to get lost when you run out of battery.
Hotel locks are amazingly lame. Audio is an interesting output from the phone; flashing the display in front of a photosensor in a pattern would be cool too.
Mechanical locks are basically obsolete in the age of digital photographs and rapid prototyping...if you let me look at a mechanical key for a second, I can make you a perfect copy in 15 minutes.
The series would have to be time-dependent, or you'd be vulnerable to replay attacks. Or maybe the lock plays a challenge and the phone plays the response.
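A sketch of that challenge-response variant, assuming the lock and phone share a secret key provisioned in advance (CryptoKit is a much later API, used here purely to show the shape of the idea; how the bytes get encoded as audio is hand-waved away):

    import Foundation
    import CryptoKit

    // Lock emits a fresh random challenge (e.g. played as an audio chirp).
    // Because a challenge is never reused, replaying an old exchange fails.
    func makeChallenge() -> Data {
        Data((0..<16).map { _ in UInt8.random(in: .min ... .max) })
    }

    // Phone answers with HMAC(secret, challenge).
    func respond(to challenge: Data, key: SymmetricKey) -> Data {
        Data(HMAC<SHA256>.authenticationCode(for: challenge, using: key))
    }

    // Lock recomputes the expected response and compares.
    func verify(_ response: Data, challenge: Data, key: SymmetricKey) -> Bool {
        HMAC<SHA256>.isValidAuthenticationCode(response,
                                               authenticating: challenge,
                                               using: key)
    }

With a fresh challenge per attempt you get replay resistance without needing the clocks on lock and phone to agree, which is the advantage over the time-dependent scheme.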
Also, it isn't always the case that you have signal in an entryway, especially in weird apartment buildings, basement apartments, etc.
Also, I am not comfortable taking my $800 iPhone out of my pocket in front of a door in a dark alley, whereas a fist full of keys makes a damn fine improvised weapon.
Also, I don't want to do a WAN connection, web API call, WAN link to home, and low-power RF link to the door every time I want to unlock my door. Especially if both of my hands are full of bags.
If this startup is done right, I see no reason to force you to take your phone out of your pocket in order to use the key functionality.
It's not like the physical shape of the phone has anything to do with the key aspect, so it may as well just stay in your pocket. As you approach a door, it'll unlock.
Well, this is how I imagine it anyway.
Apple's key was realizing that the software had to change, too.
I disagree. Reading glasses are close to the eye and magnify without sacrificing the amount of text to viewable surface area. People who need to significantly increase the font size (i.e. the same people who would use reading glasses) are going to be constantly interacting with the iPad to tell it to pan/scroll the viewable (magnified) surface around so that they can see everything. Pagination is only a partial workaround (you still have to interact, just with a large increase in page turns) and only makes sense with text-type data (e.g. pictures lend themselves to panning, not paging).
I'd also like to see more innovation in the space where users hold tablets while they're facing a television set. The tablet-as-remote-control where the program listings are on the tablet. The tablet-as-gaming-controller where you and your opponent both have tablets (draw a path on a map to move a character instead of guiding the character turn-by-turn).
Another idea (likely in the works somewhere): tablet as guitar effects pedal.
On the guitar effects pedal: Apple has actually run an iPad ad that included a shot of a guitar plugged into an iPad running an "amplifier emulator" (not sure what the right term is for that sort of thing). Not sure which app it is, but it's probably $10 or less. :)
Is the Droid Pro a tablet? What about the rumored Playstation Phone? Tablet seems to imply a form factor, and I don't think that's what is important in how we categorize these devices.
It's like if we called PCs "beige boxes" when they came out or if we called TVs "two through thirteen dial machines".
I expect the form factors for these devices to continue to evolve and improve, yet I think the general category of device will be the same (much like how a 3D LED HDTV is still a TV, although looks very different than an old B&W TV from the 50s).
And of course there are dangers in making a machine more general-purpose than it needs to be. Machines that are too general-purpose become more susceptible to, and sometimes tempting targets for, hacking. (As usual, ground well-trod by xkcd: http://xkcd.com/463/, http://xkcd.com/801/)
Let's start with the iPod. Everyone uses it to listen to stuff. My brother listens to rap music, almost exclusively. My niece listens to whatever boy band is en vogue today. Very different genres of music. I love audiobooks on mine. Along with music. I have an aunt that has church sermons on hers.
Move to computers. I have not used a word processor in months. I spend a helluva lot of time in spreadsheets though. My niece, with her school reports, is the opposite. My nephew is almost exclusively a gamer on his computer.
Even the web. Most people have a dozen or so sites that account for 90% (made up stat) of their browsing time. It is a different dozen for all users though.
So yes, the vast majority of users will use their devices for only a handful of purposes. It will be a different handful for each user though.
Google unfortunately discounts this, or rather, is late to understand that UX matters as much in the tangible world as in the intangible. When you pick up a phone, or rather a tablet, your first impressions are based on the physical device. The intrinsic value during this interaction is irreplaceable by any software, no matter how good.
Use makes things interesting. Particularly as relates to platforms. If the platform isn't used, its potential is a moot point.
I could do more on Maemo and even Palm OS than I could ever do on iOS and Android. There were features everywhere... The Sony CLIE series of Palms had multi-tasking abilities, great MP3 players, better organizer features, and more powerful apps on average than my iPod Touch felt like it did. Maemo had features everywhere... I could video chat before FaceTime was ever dreamt up, I had full-featured applications (Abiword with a foldable and pocketable keyboard is missed), and web browsers came out that were fantastic (I could use full Facebook, Google Reader, and such in the Tear browser just as well as on my desktop). And the cool interaction features existed too... on Palm OS I had a universal remote control. On Maemo I could use a Bluetooth'd Wii Classic controller to play Mario in a NES emulator. Features were everywhere.
But why did I switch to iOS and later Android?
They just weren't fit for a touch interface. Using dedicated apps like the iOS model fits much better for ease of use with a touch interface. When I got the iPod, I switched from using full-featured versions of websites to settling for a reduced web experience that was more information-based.. and having to make notes of what to look at later on the computer. It separated bringing the computer experience with me to making a new mobile experience. When I have a computer nearby, it's not so bad. It does make for a nicer experience.
Android feels a lot like iOS... almost like a slightly more functional and open copy of it. It's nowhere near where Maemo was. Still, it's enjoyable as a mobile experience, instead of just transplanting everything I could do on a desktop to a mobile-sized device.
That said, I still miss Palm OS. I almost went for a used Palm OS phone instead of an Android phone, but I decided I wanted something built for the Internet.
Hackability is completely off the average consumer's radar.
What's reassuring is that Apple doesn't have control over all the hardware interfaces that make (and will make) this possible.
I'm trying now to remember a story relayed here about a company that developed a product that bested Apple's; Apple considered buying them, but the negotiator dropped the ball or something, and then Apple wiped them out by besting them.
Anyway, if you beat Apple then they'll either lock you out (why handicap it if they'd let you carry on) or integrate your idea, working around your IP. You're going to need lots of IP/anti-monopoly lawyers. You might get bought out too, especially if that's cheaper than their end of the court cases ... I guess what I'm saying is that building hardware on top of Apple's platform seems like it would always be limited in some way.
But I'm probably wrong.
"Hi Steve, it's Cabel, from Panic."
"Oh, hey Cabel! Nice to meet you. So tell me, what'd you think of iTunes?"
"Well, I think it looks great! You guys have done a great job with it. But, you know, I still feel we'll do all-right with Audion."
"Oh, really? That's interesting, because honestly? I don't think you guys have a chance."