I understand the abstract principle behind it, but that does not help.
Two examples I noticed when fooling around with it. Nothing major, but they illustrate the problem very well. This is using a default and clean install of the consumer preview:
So you boot it up and are presented with a big image that displays the time and nothing else. Clicking just makes the image hop; you actually have to swipe upwards (with the mouse!) to be allowed to log in.
You see this PDF in the Explorer and double click it. A Metro app launches and removes all UI elements you just saw and are used to. There is no way to close it on screen, nowhere to click to get out of it. You have to know about the hot corners or keyboard shortcuts to get out of it. Even then: It’s a jarring transition between two completely different UI paradigms. That is no fun even if you know how to get out of it, even if you know that it’s supposed to be that way.
All of this would be fine if normal Windows apps were on their way out. If we all were to ditch our mice. If the Desktop were only there for compatibility reasons, to run those old apps no one is going to use in a year or two anyway. Only, even Microsoft doesn’t seem to be on that path. It looks like Office will not be a Metro app, for example.
I don’t want to see Windows 8 fail. Metro is incredibly cool for tablets and phones. (Windows 7 also was incredibly great – for Windows – as a desktop OS. It works really well.) It’s really awesome! But I’m not sure about the Metro and old Windows UI hodgepodge. I just can’t convince myself that touch interfaces work well with the mouse. It’s just not fun to swipe with a mouse. And interfaces which are fun to use with swipes and taps are not necessarily fun to use with a mouse. (The reverse is obviously also true.) All the Metro apps seem like incredibly cool tablet apps. I can see that I would have a lot of fun with them on my iPad. The Store is also cool and very well integrated. But with a mouse? In such a leaky environment? Where the mouse user is confronted with touch interfaces over and over?
What Microsoft is doing is incredibly brave. That alone deserves recognition. But I really can’t see it succeeding. Maybe I’m wrong.
So what's the disconnect?
I'll give my opinion at the bottom. You've pointed at the leaky tablet paradigm, but let's just assume Office takes a year to catch up to Windows. That's historically been the case.
"Normal" windows apps should be phased out rapidly and Metro everywhere should become the new Windows UI.
Since when has real office productivity needed a Tablet UI? It seems obvious to me that the iPad (and iPhone) are for casual use or travel. It may be productive to review sales figures, or tweak the wording in the presentation, while on the plane. Anything more requires a keyboard in my opinion.
The physical difference between the iPad and a keyboard is the primary reason I think office workers will hate Metro. Second to that I see them hating Metro because they just want to "get stuff done." Metro lacks the streamlining that the old apps have taken years to achieve. Instead of being able to add a formula with one click, you have to learn a whole new set of interface tricks.
Microsoft wants a quick success. So Metro will probably get ditched for something new by 2016, regardless of its core merits. (Obligatory: I'm biased: I think there is a niche for PC's and a separate niche for iPads. I see no Post-PC world, only an admission that Dell's profit margins will never return to 1990's levels.)
The "Post PC" world doesn't remotely mean PCs will disappear. We've been in the "Post Mainframe" world for 30+ years, yet Big Blue still makes a lot of money on super computers.
When people talk about "Post PC", they are referring to where the majority of dollars will be spent and how the average person will interact with computers. We are still learning how to make that interaction really effective for producing content, so we haven't completely left the PC world, but within a couple years, we will be there.
One other thing that will be a hallmark of this transition will be ubiquity of computing. In the PC world, there were one or two general-purpose computers in a home. In the Post PC, there will be dozens. And even more special-purpose devices.
Up to today, Post PC products have been mostly used for consumption. This is a side effect of two things: most computers are used for consumption and, as I mentioned earlier, the interactions for creation are still a work in progress. But progress is happening. I've used my iPad to author business presentations from scratch, to do mockups for my web product and to author blog posts.
There are a lot of advantages to having a device that works more naturally with my creative energies. If I'm relaxing on the couch, I can continue to do so. The laptop forces me into a different mindset.
But, yes, PCs will certainly continue to be around and useful for a good number of people.
As I recall, average people have jobs. Jobs are becoming more PC-oriented all the time. That is because the PC architecture (screen, local storage, keyboard) is pretty much needed for full-time production on a PC (and when a job has been computerized, that means that less, not more, physical activity will be needed).
Post-PC products may indeed continue making inroads in consumption and those jobs where people need to walk around. But outside that, the form factor that is the tablet's advantage becomes its disadvantage, and only obsessive Kool-Aid swallowers will take it beyond these areas.
And it will remain that way until applications can make more effective use of multiple cores.
My "travel system" has changed three times in the last 9 years (Macbook Pro, Macbook Pro, MacBook Air) - but my productivity desktop has remained the same - a Dell Precision 650 running windows XP. I'm _already_ looking forward to my fourth laptop (picking up a 2012 thunderbolt MacBookAir - local backups over a thunderbolt connection to a high-speed NAS will make local backups both more likely to happen as well as more painless) On the flip side- my circa Q1 2004 productivity desktop _still_ does pretty much everything I need of it - I don't have any real incentive to request a new machine, or upgrade off of Windows XP.
I'm picking up a new iPad on Friday, but I don't really see how Windows 8/Metro is going to be a useful replacement for my fairly optimized Windows XP experience. Eventually the Precision 650 is going to break, and I'll probably upgrade to Windows 7 + whatever Dell desktop will last me another 10 years, but I agree 100% with the parent - mobile/tablets/laptops still have a 2.5-3.5 year lifespan, while desktops have moved into the 4-6 year rotation in the enterprise (and, in my case, even longer).
As the world becomes more mobile, and desktops continue to extend their life, we'll see even more transition of leadership (and profit) to those vendors who focus on the "Mobile Experience" - that's what's driving Microsoft to Metro - not because they believe it will enhance our desktop experience (it really, really won't) - but because it's where the market is moving.
Agreed. But given what they're doing with their "operating system for the post-PC world", one can't help wondering if that is exactly what Microsoft thinks it means.
We're going to look at non touch-sensitive screens like relics of an old era.
If you buy that future, you may also come to the conclusion that your flagship OS and cash machine had better be ready for it. And, clearly the last few shots at making a windowed UI touch-enabled didn't go so well.
With that in mind, I submit that it's prudent to wait for the baby to stop moving before you throw it out with the bathwater.
Mainframes never achieved the dominance of PC's precisely because they weren't as useful; so it is logical to say that PC's will continue to dominate embedded devices because the iPad isn't as useful.
Embedded devices never even had to compete with PC's on units sold. That didn't make the PC an afterthought then, so what's different now?
These paradigm shifts aren't about what is inside the device. They are about how the devices are used and how that usage impacts people and society.
The glaring, obvious sign that the world is changing is a grocery store checkout line. Five years ago, a couple people in the store might be looking at BlackBerries. Now, everybody and their kids have a device, and often, those devices are talking to each other, via wifi, 3G or Bluetooth.
That is the people impact in the Post PC world. And that impact will be as large as the impact PCs had on the world (IMO). I just hope it's a positive impact.
I was on my way out of town and stopped at the new supermarket to pick up a sub for dinner. As I was checking out, a young guy and his girlfriend were in line in front of me.
He says to the cashier, "you know your cell phone coverage sucks."
The cashier says, "yea, only Verizon works."
The reply was, "I bet they're sorry they didn't think of that when they were building it."
> everybody and their kids have a device, and often, those devices are talking to each other
The inter-device communications are just part of the app, another way of describing the cloud.
The ubiquitous embedded device is as unimportant to the PC as the wristwatch. I still don't think a wrist computer (or an iPad) will be so dominant people say, "remember when we used to use keyboards?"
Everyone may have a "device," but they will still use their laptop every day.
3. Touchscreen (finger, stylus)
4. Audio (e.g. Siri)
5. Video (e.g. Kinect, eye tracking)
Some were great, some were terrible. The "great" tended to be so because they were tailored to the software they were programmed for. Playing Guitar Hero with a guitar controller is good, but playing Call of Duty with it is bad. There are plenty of different types of inputs that exist and will be invented, but some work best for some things and terrible for others.
As you mention, Microsoft Office is designed around a keyboard and mouse. A touchscreen is a hybrid of the two, more portable, but clunkier. An accountant who lives by arrow keys, hot keys, and their numberpad in Excel is going to hate a touchscreen. For MS Office to work on a tablet, it'll have to be re-designed from the ground up, and even then it may not be superior to its desktop counterpart in an office setting.
I don't know what MS wants to do with Win 8. To think all desktops are going to die and become tablets is wrong, just like thinking that T.V. was going to kill radio. Both have their places, but a tablet != desktop.
Maybe you're right that they just want a "quick success", but that strategy hasn't paid off for them recently (Zune vs. iPod, Bing vs. Google, their fragmented mobile efforts vs. iPhone). MS has been in reaction mode for a while, and now they're reacting to the iPad with Metro.
One thing Microsoft does have going for them is the corporate market; Apple hasn't directly targeted that yet, though to think that Apple is not going to go after it shortly is foolish. MS is entrenched here with Active Directory and Exchange. But cloud efforts are going to shortly give that a run for its money.
With multiple competing options (some in the cloud), surely one of them will cater to the old UI that uses the arrow keys.
Specific to accounting, my anecdotal experience has been that many accountants don't like change unless it makes them more productive. If it's confusing or has a large learning curve, expect lots of complaints and resistance. I upgraded a few with new computers, from Win98 to Vista, and from Office 2000 to Office 2007. The "ribbon" UI was so terrifying to them that they literally unplugged the new computers, plugged the old ones back in, and used them for 9 more months before they finally crapped out.
Any type of manual data entry needs to be done really fast and accurately. Keyboards are better at this than touchscreens in most cases. Touchscreens will have to become more keyboard-like to compete (this is going to begin happening soon, but we're a few years away from it becoming good enough: http://cnettv.cnet.com/senseg-demos-prototype-touch-feedback...).
You can easily pair a wireless keyboard to an iPad. This is how I got my parents to replace their computer with an iPad, and further lower my family tech support costs.
and the subsequent discussion here (can't find it right now)
Your point on lower tech support costs stands, though :)
My dad is utterly overwhelmed by a computer with a mouse and keyboard. We could never email pics of his grand kids and family events because even something like the gmail login page (that users like you and me take totally for granted) was too much for him.
He'd fish out his login details from the desk drawer, hunt and peck them in and finally gain access (sometimes) only to learn that there is no new email waiting for him. At which point he'd throw the desk upside down and start kicking the computer to shit.
Contrast that with simply tapping the envelope to retrieve email - and I say retrieve, not check, because the badge notifications mean that he never even has to tap the envelope unless there really are emails waiting.
You simply cannot overstate the value of this for people on that side of the digital divide.
He now has a Facebook account, he messages us from that and uses regular email, he has access to family pictures, groups he shares interests with and generally does have an authentic slice of the digital age of his very own that he greatly enjoys.
Even hunt and peck typing is a more fluent thing on a touch screen... Who'd a thunk it?
My whole point is this: horses for courses.
You and I have no frame with which to reference the massive value of a 'Luddite' device just as my dad has no way of understanding the type of functionality guys and girls like us require.
As for my family, a number of them, like my sister, proudly exclaim that they are luddites, and they like wearing it stubbornly like some strange badge of honor.
What is really likely to go the way of the dodo, at least in the office productivity area, are mice. There is pretty much no good reason for mouse-oriented UI's these days.
As someone who frequently uses applications like Illustrator, Autocad, Sketchup, etc. I'd be very interested to hear how you'd qualify that statement.
Even with that changed and the hot corners indicated, the hodgepodge stays, though. The hot corners are only the tip of the iceberg, what you notice immediately. (Or, rather, not at all, like me, who figured out how to use the Windows key pretty quickly – but still after some confusion – but didn’t find the hot corners for a long time after that. I can totally identify with the guy in the video. I wasn’t quite as helpless, but close, and I’m young and grew up with that stuff.)
You can also scroll the mousewheel.
There was an influential blog post years ago documenting the 11 or so ways to shut down Vista. More than likely someone is reciting the "multiple ways bad" mantra.
I'm surprised UI designers let us shut down our computers at all.
95% of the time for 99% of users, this is perfectly reasonable behavior - unlike wanting to shut off your computer which is a far more useful use-case. For the small remainder (power users), there's InsomniaX .
Having some optional hint or tip mechanism built in for new users would be nice. Maybe not in the style of Clippy from Office but something that would point out all the nice little shortcuts.
Clearly Apple traded off a lot of learnability for aesthetics and a pleasant experience after you learn. I see this as very similar.
However, with regard to my personal experience, I picked up the iPhone 4S as my first smart phone. I've never seen an iPhone or iPod touch in action before and this was my first experience with an iOS device. It took me very little time to figure out how to navigate it, close and uninstall apps, etc. I never googled how to do something, and even had fun trying out my new toy.
I think comparing something like the simple iOS and the should-be simple Windows 8 is a very complicated act. The basic nature of a single Home Button and a touch screen simplified my learning of the iPhone. I imagine, without an introductory tutorial, Windows 8, with such complications as an external mouse and full keyboard and no touch-screen monitor, may be a very hectic experience (as seen in the video).
Welcome to DOS.
The user will likely type "help" first. On Kubuntu I get this:
    GNU bash, version 4.2.10(1)-release (x86_64-pc-linux-gnu)
    These shell commands are defined internally.  Type `help' to see this list.
    Type `help name' to find out more about the function `name'.
    Use `info bash' to find out more about the shell in general.
    Use `man -k' or `info' to find out more about commands not in this list.

    A star (*) next to a name means that the command is disabled.

    job_spec [&]               history [-c] [-d offset] [n] or history -anrw [filename] or histor>
    (( expression ))           if COMMANDS; then COMMANDS; [ elif COMMANDS; then COMMANDS; ]... [>
    . filename [arguments]     jobs [-lnprs] [jobspec ...] or jobs -x command [args]
    :                          kill [-s sigspec | -n signum | -sigspec] pid | jobspec ... or kill>
    [ arg... ]                 let arg [arg ...]
    [[ expression ]]           local [option] name[=value] ...
Actually, I don't think it's about experience; it's about willingness to read and attempt to comprehend a system. If a person doesn't want to try, then yes, this will be difficult for them; it needs effort.
I'm not sure about the mouse-tablet thing, in general, sure. But the specific way you use the lock screen, for example, is likely to get fixed before they release.
(I've been using the Consumer Preview for about a week on my primary machine and I'm pretty happy with it.)
All of these specific UI issues result from this impedance mismatch between tablet and desktop UIs. So yes, Microsoft may iron out the specific bugs we bring up. It won't matter, though, because UI issues of this nature will crop up at every seam between the desktop and tablet interfaces. What will happen is that Microsoft will patch these piecemeal and we'll end up with layers and layers of band-aids trying to compensate for a fundamentally broken design.
Junking your existing stuff and making your users into involuntary recruits in your drive to hijack what the pundits say is the latest and greatest thing merits adjectives like "arrogant", "overweening" and "full of hubris".
No one is being forced to upgrade, and security patches for 7 will be coming out for a while yet.
I think the users that don't want Windows 8 will live.
For what it's worth, Kinect for Windows was released on, I think, February 1st. There's some talk that we may see Kinect technology and gesture recognition coming to all laptops within a year or two.
Personally, I've been waiting for the RSI-inducing mouse and keyboard interface to die (or at least be demoted) for a long time now.
People use their computers in very different ways, and while a user may have a windows PC at work and an iPad at home, that doesn't mean they want to "do work" on the iPad. In many cases they actually hate the idea.
Secondly, I get the feeling that many Windows users are still 100% Windows because that's what they know. Microsoft seems insistent on forcing users to re-learn the OS at every release. I can't help but think this is what makes more and more of them think, "well, if I have to learn a new OS, I should try out those Apples/Linux that everyone loves to talk about". That can't be good from a Microsoft POV.
If you're a consumer, you use the Metro UI.
If you're a producer, you use the Desktop.
You can set the desktop to be the default on bootup. And I'm 90% sure this is how home computers will ship from Dell. While tablets will be Metro.
The desktop is not going anywhere; MS has simply re-aligned the default UI to cater to the mass market: the consumers that want to browse YouTube, check their email, tweet, etc.
> Microsoft seems insistent on forcing users to re-learn the OS at every release.
IMO, they are insistent on improving their products.
The line between "consumer" and "producer" is so fuzzy that this doesn't sound feasible to me. [For the record, I haven't used Windows 8 yet.]
Both "types" of users—if we assume for argument's sake that such a distinction can even be made—tend to use a number of apps; each of those individual apps falls somewhere on a continuum of consumption vs. production. And even where each app falls on that continuum for that person may vary throughout the day!
I expect there will continue to be tablets — and they should have tablet UIs — and there will continue to be "desktops" — and they should have desktop UIs. They should be made similar in aspects where it makes sense, kept different in those aspects where it doesn't, and there should definitely be many apps that run on both (with appropriately varying UIs depending on platform) and which constantly sync so that the users' production and consumption can flow between them as desired.
If Microsoft pulls off a single system (even with two modes) that can transition without confusion and frustration between the two and that doesn't heavily compromise either one, then they'll have a very unique offering. But I don't think it's clear that such a goal is even achievable.
The problem with this is that the "new" desktop is fundamentally different from the old one. It's not as simple as saying "oh, just use the desktop".
> IMO, they are insistent on improving their products.
I wouldn't consider presenting users with a puzzle to be an improvement. I can pick up my iPad and instantly figure it out. I can pick up a MacBook Air with iOS-like features and instantly understand how they work. I was stunned at how much effort was required to learn Windows 8. Absolutely stunned. How exactly is that an improvement?
By the same token, I can pick up a Windows Phone 7 device and instantly understand how to use it - but Windows 8 still somehow turned out to be a confusing mess with a mouse/keyboard. Even on a touchscreen device - it's not completely clear how things work. It was as confusing to use as a Blackberry Playbook.
If you can't figure out how to use the desktop without the start button (by going to the left-right sides with a mouse movement that is even easier to do), then you really should be using the Metro UI instead, because it was made exactly for you.
> I can pick up my iPad and instantly figure it out.
1. When I got my iPad it enraged me to learn/figure-out that I had to install iTunes and connect the two.
2. You're now comparing the touch based iPad UI with mouse/keyboard driven Windows 8 Desktop UI.
2. It doesn't matter if it's apples-to-oranges as far as touch vs. mouse/keyboard - the key issue here is how understandable the UI is. Windows 7, if you know how to use a mouse and a keyboard, is completely understandable. Windows 8 makes absolutely no sense.
I predict we will see a number of utilities emerge in the next few months to emulate the Start menu or otherwise provide alternatives to the Metro UI for said tasks.
When has Microsoft made this kind of switch before? Last time I recall this significant of a jump was 3.1 to 95.
Disclaimer: Microsoft employee
2000 to XP was a big change. So much so that many people reverted to "Classic" mode as soon as they could do so.
XP to Vista wasn't even accepted by most users. I have no idea how many old XP users went to OSX like I did, but I would posit that the number is not zero.
Vista/XP to 7 was also a decent sized change, as I learn anytime I've attempted to verbally explain where a particular setting is and realize that it could be called anything. Unlike Vista, this has nothing to do with the quality of the OS, as Windows 7 is very good. However I can't help thinking that at least some users have gone OSX from this change as well.
"Classic" mode only changed the way the OS looked. It had nothing to do with the functionality, and certainly didn't change whether the user had to "re-learn" the OS.
> XP to Vista wasn't even accepted by most users.
How did users have to "re-learn", though? Vista was certainly not adopted at the rate that Microsoft had hoped, but whether users accept the product is a different question from whether they have to "re-learn the OS".
> Vista/XP to 7 was also a decent sized change, as I learn anytime I've attempted to verbally explain where a particular setting is and realize that it could be called anything.
They did move some settings (can't recall if this was actually Vista or 7). I would hardly say that this required re-learning the OS, though. I will admit that one of the first things I always do is switch the Control Panel to "Large Icons" rather than "Category".
P.S. Disclaimer: MSFT employee
A minor aesthetic change which requires a minor intuitive leap for the power user is a major change for most home users. Sure, the steering wheel is on the other side of the car, but that's minor. Except that now the user has to learn how to drive on the other side of the road.
I loaded up Windows 7 to look. Where's add/remove software? Wait, that changed and I need to set the control panel to classic to see it. Uh, where's classic mode at? Turns out you select the drop-down box to Large/Small Icons for it to change the icon selection entirely. What? I spent a while longer searching for where to install OS components (IIS, etc). Minor irritants to me. Major headaches for my Father, Sister, Brother, etc. Even more so when they call the family tech who can't figure out what the hell they're talking about.
A change to the interface IS a change to the OS as far as all but 1% of users are concerned.
Guess what: it works.
Classic what? Click what? Where's what?
Just search for it.
Implying the rest just seems archaic, especially from a user perspective.
Have you used an OS prior to Windows 7 and/or OSX?
Search on Windows actually working is a huge step forward. Many users have simply not adapted to this actually being something worth trying.
Search on Windows is horribly broken. One of the first "shortcuts" I learned on OS X was Cmd+Space and typing out the application I wanted to run. Spotlight immediately brought up what I wanted. Windows never did that for me, or spent 45 seconds or more with a spinning hourglass to return a document that happened to be named similar to a program I wanted. I don't care if it works better now, they've set a precedent in my mind that it is broken, because it was broken for the ~15 years I used their OSes.
And for reference, if you go to the control panel in Windows 7, "Uninstall a program" is right there on the bottom left. If you want to install a program (like IIS), clicking "Programs" takes you to a convenient menu that lets you "Turn Windows features on or off".
It's not quite how it was, but it's actually more intuitive. I'd imagine that you learned the old way of doing it by trial and error. The new system makes that process easier.
You innovate or you die, and Microsoft is dying. The iPad is killing them. Maybe not quickly and obviously, but it will end Microsoft's dominance in less than a decade.
I don't think any of the examples so far justify the claim that users have to "re-learn the OS" every version, though. It's a sensationalist claim with pretty much no backing. Most of the changes have been very incremental. Users have moved straight from XP to Win 7 in droves, and few people have complained about having to "re-learn the OS", despite it being a jump of not just one but two major versions.
Users only understand the most primitive of abstractions; they understand that a button is a button, but what the button does is only learned from experience (good labels help, of course). The change to the grouping-based icons on the taskbar absolutely forced users to relearn.
See, to you the fact that the buttons are in the same place as the old ones is an indicator that they have similar functionality; the typical user does not make these connections at all. They only know that the old task buttons are gone, replaced with something they don't know how to use (yet).
Now the level of change for this is much smaller than some other changes. Most users figured it out, many on their own (but many did not). I only used this example to demonstrate how things that are obvious and simple to you are massive barriers to common consumers.
Sorry, but I just can't accept that the vast majority of computer users are morons who can't understand it at all when interfaces change and have to start at square one every time. This is too cynical.
The taskbar grouping was not that big of a change. Users look for icons in the taskbar, so dropping the labels was pretty minor. The grouping behavior itself was quite discoverable. If you ask an XP user who's just switched to Win7 to open a couple of Word documents, a web browser, and a few more things (enough to trigger grouping), and then ask them to go back to the Word document, they'll be able to do it. They'll look in the taskbar, click on the icon for Word, and then when the window thumbnails pop up, they'll say "what's this?" and then click on one of the thumbnails. Functionality discovered. Yes, there's some initial confusion, but I'd hardly call it a "re-learning" of the entire OS.
I also can't believe that Microsoft would have added the grouping if focus groups showed that average users were so lost that it was like using a new OS. There's not enough value added by grouping to ship it if it significantly hurt typical users' experiences.
I think you're wrong. Before the grouping was added, I watched people click through half a dozen different windows trying to find their document. The labels were never useful once you had multiple documents, because they got truncated so short.
Moreover, I don't believe that the entire UI world just randomly decided to add icons to everything. It seems more likely that all indications are that people use the icons.
> 7 won't allow it at all.
I don't know what you're talking about. The "never combine" setting is still there.
> And don't try to tell anybody that thumbnails of substantially similar-looking text documents are a substitute for taskbar labels.
Actually, they're much more useful, because the thumbnail often reveals a lot about the document, and also because the title is right above the thumbnail.
You're right, that rather hyperbolic interpretation of what's being described isn't quite true.
But it sounds like those kinds of users are quite a bit more common than you realize. And when it comes to driving the PC market, those users are powerful. The reason why most companies in my industry are still standardized on XP is precisely because there are a lot of people like that, and the potential benefits of switching to Windows 7 are minuscule compared to the productivity (read: $$$$) losses that would result from getting these users back up to speed on the new OS.
> See, to you the fact that the buttons are in the same place as the old ones is an indicator that they have similar functionality; the typical user does not make these connections at all. They only know that the old task buttons are gone, replaced with something they don't know how to use (yet).
You're right, they tend not to change. Hence, Windows XP.
Your company's problem isn't that people will use XP forever; it's that when they do buy a new computer, they may not buy a Windows PC, since they'll have nothing to relate to from Windows XP.
Really from an interface perspective Windows 8 is a completely new beast. The only advantage you have is that you still have Word. Otherwise you'd be at major risk of losing higher end customers to Apple, but you're probably still at risk of losing customers to the iPad on the lower end.
Honestly, can you explain why Microsoft chose to all but abandon the desktop market? The only explanation that makes sense to me is infighting: the Windows division successfully killed the Courier and got to be the ones that made the tablet, and it made sense (to them) not to break the team into 2 separate groups (as would be more logical), so they simply picked what seemed like a more future-proof bet.
Absolutely. I think we just disagree on how catastrophic that is. I think most Windows XP users would find that Windows 7 is much less of a change than OS X Lion.
Windows 8 is a different beast altogether and it is indeed a huge bet.
> Honestly, can you explain why Microsoft chose to all but abandon the desktop market? The only explanation that makes sense to me is infighting; the Windows division successfully killed the Courier and got to be the ones that made the tablet, and it made sense (to them) not to break the team into two separate groups (as would be more logical), so they simply picked what seemed like a more future-proof bet.
[What's below are purely my thoughts and guesses. I have zero actual insight into the Windows team or its non-public history or plans.]
I don't think infighting is a factor. I think a fairly unified vision is driving Windows 8. But I personally think that it's for the best that Courier was killed off. After using modern tablets, it becomes obvious that two smaller screens is simply less compelling. It's worse for media consumption, and it is also more awkward to use when not placed on a flat surface. Gates's original vision for tablet computing was close to what users want. However, touch first and pen second is what was missing (and obviously users will give up the pen entirely if they have to). Of course shoehorning the desktop OS onto tablets was not the right approach, any more than it was the right approach for phones.
I also don't believe that Microsoft is abandoning the desktop market. They've got a different vision for it now, though. The Metro world is much different from Windows 7, but it's not worse. I would say that in many ways it is much better. I even think that Metro with a mouse and keyboard will be an improvement over Windows 7. What's jarring is the marriage between the legacy desktop and the modern style. I think the legacy desktop is intended to become less and less important over time, though, and eventually typical users will never see it at all (though I could be wrong). At that point, the only interface they will use will be Metro, and the transitions to and from the legacy desktop will be gone. I expect that the various other differences will eventually either become moot as people adjust or they will be addressed in a future release.
I disagree that splitting the team into separate groups would be more logical. There's no reason that a compelling tablet OS can't be a compelling desktop OS. It does require changing some interaction paradigms, though.
The question at hand seems less about how quickly or readily people can adopt Metro than how often they need to leave it. It seems the context switch is actually the hard bit. Metro Office will go a long way to fixing this.
However, for older folks, this is typically not the case.
I do computer repair and routinely listen to gripes about changes in Vista/7.
For instance, I often hear complaints about things like changing the theme/style settings, searching, modifying network interfaces/settings, or changes to how the control panel works.
Aside from the theme/network settings, these are often improvements: minimal, intuitive changes, IMHO.
But, to someone who is barely computer literate, even "little" things like that can represent a big change and can be very confusing.
And, of course, the change from Win7 et al. -> Win8 is massive, even to me...
I actually agree that the control panel change is annoying, though. I think it was a bad change, because it's not merely different. I think it's actually less usable.
And yes, the change from Win7 to Win8 is massive. Not a lot of people would dispute that.
However, how I interpret that statement is that it is ill-advised to over-estimate the general computer user's ability to adapt to changes, or the way 'little' changes end up disrupting the way they interact with the OS.
I might be interpreting OP's statement with my own bias, or reading into it too much, and we could probably go back and forth on semantics all day.
Regarding the control panel, I think the search functionality, which again a lot of non-techies may not even realize is an option, is the main redeeming quality and makes it much less aggravating to work with.
I agree that it is dangerous to overestimate the general user's ability to adapt to changes. Microsoft does a lot of focus groups for this very reason. I don't think that Microsoft can cripple itself by never changing the interface, though. Maintaining an identical interface might work for a while, but eventually competitors who weren't afraid to innovate on the UI would win.
I agree about the control panel as well. Search fixes a lot of the issues. I'm glad Windows 8 has kept search working (and arguably improved it).
What do you see as the changes that forced the user to "re-learn the OS"?
In WinXP you might do: Start->Find folders and files. Write search terms. Location Dropdown->"Browse..."->Find folder. Press search.
Vista: Find the folder in Explorer, then type your search terms in the search box in the upper-right corner.
This workflow wasn't changed for a long time, as far as I remember.
Using a new operating system takes some getting used to. But first time use isn't the relevant metric for an activity which one will do for years and in which one engages for productivity.
The metrics should match the expected learning curve - i.e., F-16s are not designed for toddlers.
But he's sitting down at a new version of Windows and using it as if it were his first time using an alien OS. In other words, the metrics are completely at odds with what should be the expected learning curve.
Is an operating system upgrade supposed to be like transitioning from a 2008 BMW 3 series to a 2011 BMW 3 series?
Or is it o.k. for the learning curve to be like transitioning from a 2008 BMW 3 series to a 2011 Dodge Grand Caravan where the brake, accelerator, and steering wheel remain the same, but all the other controls, the instrumentation, and cabin layout are different?
Any change has a learning curve which can feel steep at first. I've personally experienced it when changing browsers among IE, firefox, chrome, and Opera (never mind mobile versions) even though they all pretty much work the same way.
And my father makes the same face when he has new email from someone he doesn't know.
Bad analogy. You go on to point out why:
where the brake, accelerator, and steering wheel remain the same
If you've made such a transition, you will find you have to adjust to the headlamp controls moving from the right side stalk to the left side on the console and being replaced on the right side stalk with the wiper control which is on the left side stalk in the other car.
Seat controls, speed control, radio adjustments, door lock operations, etc. all also change.
Not to mention that you can't get a Grand Caravan with a clutch and no sensible person would get a slush bucket in a 3 series.
I speak as a former programming team member from a very high ranking school in the SE Region according to the ACM International Collegiate Programming Contest regionals.
"He's also been primed to expect a difficult task by his annoyingly condescending son. I'm not at all surprised that he struggled with it."
This has an enormous effect on a person's ability to solve problems. I learned this one day speaking with one of our coaches. I took what I recall to be a 1000- or 1100-point TopCoder Level 1 problem and told him it was "easy". I was being sarcastic, but he took me seriously and put more effort into the problem because it was "easy". He even postulated a few theories on how to solve it during our conversation. He said the reason he even did that was that I told him it was easy. Had I said it was hard, the results might have been different.
Having to abandon all the skills you've had in muscle memory for years and learn an entirely new set of cues, moves, and small day-to-day tricks surely won't increase your productivity.
This is basically like changing your keyboard layout (which, by the way, has remained essentially unchanged for nearly 140 years) every two or three years when the "new version" comes out. Or like remapping all your Emacs/Vi keybindings with every "new version". *shudder*
It is clear and simple: MS has basically declared the desktop dead (not profitable enough), and Windows 8 is supposed to encourage (i.e., force) current Windows devs to write tablet-compatible apps, which is where MS sees the profits of the future.
It is a bold, incredibly ballsy move, they're putting all their crown jewels (the UI familiarity, the backward compatibility, the Win32 API, the developer community, the stability required by enterprise customers) on the table for the ONE BIG BET.
And it is not even likely that they'll win. The market is already pretty saturated and mature where they want to go. It is not a vacuum where they can get in and become the sole system provider, like they were on the IBM PC. Even if they get in, it will be nothing more than the proverbial foot in the door. I absolutely can't imagine this going well for them. You only make giant bets like this if you're either absolutely confident that you'll win because you can accurately predict the future, or if you're absolutely insane.
In the short term perhaps.
Over the long term, I have found that switching OS's [or browsers] makes me more productive because I don't continue using all the beginner inefficiencies I was in the habit of using.
An example from switching browsers is learning to use the middle button to open a link in a new tab I picked up when switching to Opera a few years ago rather than using the right click context menu as I had been doing for years.
It took me the better part of a decade to move the task bar from the bottom of the screen to the left side. The value of starting again as "an experienced beginner" should not be dismissed.
Also making the app launcher (metro) a modal window that covers up all of your work is idiotic on a PC. It's a tablet convention, there's no need to cram it down our throats when we're on a PC.
The integration of the Metro interface with the desktop PC requires a paradigm shift, but in the long run it will probably prove better to have done sooner than later because of the benefits it affords developers directly, and thus users indirectly.
If you were to pin all of your main apps to the task bar then conceptually it's not too far off from the OSX dock plus launchpad concept. I really find no use for the launchpad on anything but a tablet so that's probably why I feel the same about Metro.
James Randi's lecture on proving a negative (in this case, "Windows 8 doesn't work for users") is highly relevant and poignant.
EDIT: fixed link formatting
> However, it would be a grievous mistake to insist that someone prove all
> the premises of any argument they might give.
> So why is it that people insist that you can’t prove a negative? I think it
> is the result of two things. (1) an acknowledgement that induction is not
> bulletproof, airtight, and infallible, and (2) a desperate desire to keep
> believing whatever one believes, even if all the evidence is against it.
You can prove negative statements within a set of assumptions. But, as even this author acknowledges, it is impossible to absolutely prove anything in all cases, and I think that is what Randi bases his argument on: that nothing can be absolutely proved false in all cases.
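To make the distinction concrete, here is a minimal sketch in Lean 4 (core library only, not from the linked PDF): within a fixed proof framework, a negative statement is provable just like a positive one, because ¬P is simply P → False.

```lean
-- Proving a negative within a fixed set of assumptions:
-- assume a hypothetical witness, then derive a contradiction
-- using Nat.not_lt_zero : ∀ (n : Nat), ¬ n < 0.
example : ¬ ∃ n : Nat, n < 0 :=
  fun ⟨n, h⟩ => Nat.not_lt_zero n h
```

What remains beyond reach is "absolute" proof: the truth of the premises (here, the axioms of arithmetic) is accepted, not proved.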
Disclaimer: I'm not an epistemologist, nor a philosopher, nor an experienced logician.
I may have misread this PDF and/or Randi's speech.
The "impossible to absolutely prove" phrase has a misuse of "prove". A proof is absolute. A proof is either valid or invalid judged solely on whether it follows the rules of the predicate calculus (or the rules of some other proof-framework you're using which is likely using the predicate calculus or an extension behind the scenes anyway). A proof's conclusions are "true" if and only if they are true. ( http://yudkowsky.net/rational/the-simple-truth )
It is true (so far as we know, we might be wrong in the end) that we can't be absolutely certain that something is false, but it's the same case for being absolutely certain that something is true. (Here "absolutely certain" means "no admittance to even the possibility of being incorrect".) This is more generally identified as the problem of induction, but it's more of a law than a problem. One can guess from the name it has to do with inductive arguments that rely on guessing+evidence rather than deductive proofs that rely on accepting premises (which may be true or false) and a proof framework like the predicate calculus.
Edit: Have a Feynman video on UFOs. :) http://www.youtube.com/watch?v=wLaRXYai19A When he says "I can't prove it's impossible", the interpretation you should take is "I can't produce a set of premises we can all agree with that leads to a deductive proof that UFOs are impossible."
> The start button was the first thing

I landed myself on the desktop (because fuck Metro) and quickly realized I had no idea how I could actually do anything. VirtualBox wasn't capturing my Windows key entries, as they were being nabbed by my host Linux OS, so I was S.O.L. there. But it should not be this frustrating to actually use an OS to install Visual Studio + libraries + FF.
At this point I'm willing to axe that Virtual OS and just dual boot with a Win7 copy that sits on the "you have 30 days to activate" B.S. before axing it next Monday.
The thing is that Microsoft is going the way of thinking "the average user no longer requires a desktop." I tend to agree, the content creators - whether they're programmers or artists - are the ones who require a fully operational OS. Those who simply consume the content are well off with smart devices which Metro was designed for. But these two worlds are DIFFERENT. I need the ability to dig through the internals of my OS. I need the ability to work with my system registry, services, administrative options and everything else mixed in. When a seasoned user suddenly struggles to accomplish tasks that are second nature you've IMMEDIATELY lost a sale.
It wasn't until later that I realized the entire interface was built around the hot corners. This was totally non-discoverable inside the VirtualBox interface, though, because my mouse would continue beyond the edges of the screen to my own desktop. But that's a virtualization-specific problem, so I guess I can't complain about that.
What the hell is a Windows key? Why would an OS be built to expect a special OS specific key?
To be honest, I know what a windows key is, but I haven't used a windows key keyboard in my lifetime.
I can get to my Windows 7 start menu by clicking on it, and failing that, CTRL-ESC will work. If there's nothing to click on, and I'm now forced to CTRL-ESC all the time, I can live with it, but it's definitely not something I would call an improvement.
I know there's a familiarization burden with every new major release, but I'm hearing about an awful lot of extremely inconvenient sounding issues with this one.
I wonder how hard it would've been to include an OS-wide "tablet mode" / "desktop mode" toggle.
You mean, like the Ctrl and Esc keys you mention two sentences later? There are quite a few "special" keys on every computer keyboard. The OS is quite justified in assuming that your keyboard has the requisite ones.
I expect Apple to have an Apple key on their keyboard, because their hardware is tied to their OS and you buy into their entire ecosystem when you buy their desktop. But Windows doesn't work that way. Many people use Windows on the same hardware that they use other operating systems. So a Windows key doesn't make sense there.
I suppose it's just a naming issue and most operating systems now recognize the key as a "meta" key, but it would be nice if its name wasn't tied to a particular operating system. And it's especially hard to do that if it has a particular operating system's logo on it.
I guess people became accustomed to this a long time ago and I should just "let it go".
So don't call it the "Windows" key. Call it the "Command" key instead. That's what Apple does, and it's the same key.
You can bind the key on Linux as well, and you can even get a "Tux" key if you prefer.
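As an illustration of how loosely the key is tied to Windows itself: under X11 the key already arrives at the OS as a generic Super modifier, and a stock xkeyboard-config option can rebind it with no Windows involvement at all. This is a sketch of a login-script fragment, not a recommendation; the option name comes from xkeyboard-config and availability may vary by distribution.

```shell
# ~/.xprofile fragment (X11): to the OS, the "Windows" key is just Super.
# As an example of rebinding it, swap it with Alt using a standard
# xkeyboard-config option (see /usr/share/X11/xkb/rules/base.lst):
setxkbmap -option altwin:swap_alt_win

# To inspect what the key currently produces:
#   xmodmap -pke | grep -i super
```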
> Many people use Windows on the same hardware that they use other operating systems
No they do not. Installing multiple operating systems on a single hardware system is very much a power user behavior, and even there it's restricted to a small set of power users. Even having multiple desktop operating systems in a single household is uncommon.
I'm not talking about multiple simultaneous installs. I'm talking about the hardware being the same across operating systems. If I'm buying hardware to set up a linux machine, I will go to the same place as if I were setting up a windows machine. So I guess I could take your advice in that case and try to find a Tux key keyboard. Got it.
But I wouldn't. Because I don't use full size keyboards. Which brings me to the original and more significant point. I don't think the operating system should expect this "command" key at all. If people use it, that's fine. But it sounds like Windows 8 might be a serious PITA without a "command key" keyboard.
Building a machine from scratch is not normal user behavior. You can't talk about "many people" and then describe uncommon behavior as if it's typical. The vast majority of users buy off-the-shelf computers with bundled keyboards. Power users who build their own machines should know to buy the right keyboard.
> But I wouldn't. Because I don't use full size keyboards.
What keyboard do you use that doesn't include the Windows key? A compact keyboard does not imply a lack of a Windows key.
> I don't think the operating system should expect this "command" key at all.
Why do you believe this? How is the Windows/Command key any different from the Control key or the Escape key or Alt key or any of the Function keys or the other "special" keys? Why is it reasonable for the OS to expect those other keys will be present but not the "Windows" key that's present on virtually every keyboard that's shipped in the past 15 years?
I use this http://www.tigerdirect.com/applications/SearchTools/item-det...
The reason I believe that is that, unless every keyboard has that key, a major operating system that has traditionally been fine without it shouldn't suddenly expect it unless there is some major advantage to requiring it.
According to wikipedia, this key became a standard key on PC keyboards. Obviously I'm of the opinion that this should not have happened. But at this point it's ridiculous for me to complain about it.
No, I don't think that "many people" do this in the grand scheme of things. I'd bet less than 3% (possibly much less) of new PCs for home use are purchased this way. But I don't have any stats for this.
> The reason I believe that is that, unless every keyboard has that key, a major operating system that has traditionally been fine without it shouldn't suddenly expect it unless there is some major advantage to requiring it.
I understand your point, but I disagree. This means that there can be no progress. There have been keyboards in the past that didn't have the Alt key or the Escape key. There have been keyboards with Meta keys. Hardware can and should change, and software should change with it and not be trapped in the past. European keyboards often have an "AltGr" key that allows inserting characters that to my knowledge cannot be easily inserted from a standard US keyboard (Wikipedia says Ctrl+Alt does the same thing, but it doesn't on my system). I think it's perfectly reasonable to say that if you want this functionality, you get a keyboard that supports it.
FYI, the version of the keyboard you use was discontinued in 2009, and the replacement does indeed have the Windows key (but no trackpad). http://support.lenovo.com/en_US/product-and-parts/detail.pag...
The old joke: Escape Meta Alt Control Shift
Super + L is my friend for locking the screen
Besides, open any terminal emulator and Alt/Option either behaves as Meta by default or has the option to do so (e.g., Terminal.app).
For all current practical purposes and in most default cases, Alt behaves as either Alt or Meta depending on the context, and Super as Super. Even emacs folks (maybe the biggest piece of software in use today relying the most heavily on Meta) agree on that.
Meta really does not exist anymore and all mappings one can come up with (whether they are using Alt or Super) are merely fallbacks.
On an Apple keyboard, the Command key does the job. (Or vice versa, if you use a PC keyboard on a Mac.)
To be honest, I haven't seen a keyboard that was manufactured since the late 90s and didn't have a 3rd key down there (whether it's Command or Windows). Even the Happy Hacking keyboard has one.
They trained their users for nearly two decades to use the start menu, and now they abandon it for a weird hybrid between desktop OS and tablet OS which ends up being neither. It's actually rather sad.
I am definitely going to skip this version of Windows and hope that Microsoft will come to its senses and realize that you can either have a Tablet/Mobile interface or a desktop interface but not both.
That's why it's funny.
I suspect that they were thinking about the mountain of user data they have and drawing reasonable conclusions from it.
My family members use the power button to shut down Windows, not the start button. They tend to use icons to launch apps, not the start menu. Though atypical in some ways, I suspect these are not among them.
However, it is easier to draw incorrect conclusions from anecdote.
Tablets are a bad business (read: tablets, not the iPad). The iPad's sales and huge profits are a Black Swan in a lake full of regular White Swan tablets making no money and carving a hole in OEMs' quarterly results, to the point that some are actually considering leaving the nascent tablet market. That's because the majority of people still prefer to pay $500 for a very functional and feature-loaded laptop over paying that same amount (or more) for a barely competitive tablet.
That vast majority uses Windows, and now MSFT, instead of taking a page from Apple and using WP7 for tablets, is giving the finger to all the Windows users that keep the company going (almost every other division is hemorrhaging money or barely breaking even) by forcing them into the limitations of a tablet UI.
This is incorrect.
You can see the latest earnings results here. The division results are at the bottom.
The only division losing money is the Online Services Division. Windows and Windows Live is not even the largest in terms of Revenue or Profit; it is dwarfed by the Microsoft Business Division.
Also, mind that I'm considering the vast deficit left by the devices division: despite the success of the X360, it still has to break even after all the money lost on the first Xbox, the RROD, Zune, Kin, etc...
I've tried to use Gimp and LibreOffice, but they both are close enough to Photoshop and MS Office that I can't ever learn their "dialects", which are very close to the originals', but not quite. Instead, I constantly feel like I'm fighting them.
When I moved to Pages instead, I was forced to learn a whole new language, and so that's what happened. No remapping, just mapping.
People seem to hate change, even for the better. This reminds me of a recurring theme in the Linux world where any Windows originated UI feature seems to follow the pattern:
1) Users complain loudly about not copying useless Windows features.
2) UI features get copied from Windows with a 5 year delay once they become the expected norm.
3) 10 years later, the community staunchly opposes the removal of those same features, the ones that were once "unnecessary Windows stuff".
However, it's mystery meat: it provides no clue as to what is active and what is not (in fact, it provides no activatable element at all). There's a reason why WP7 recommends (if not requires) side arrows, or that a bit of the next screen's content be visible at the edges of the current screen: it hints that there is something there. It's a very nice cue (even with that, the most frequent criticism of WP7 is that it's hard to know what is active and what is not).
Completely hidden "edge of screen" elements are very nice but only work if you know they're there already. What do you do when you don't? You're boggled, and either you try hunting the interface for active areas (looking for pointer changes, old-school adventure games style) or you give up and go back to something you can use at a glance.
Is this really how we expect to progress as an industry? Basically make no meaningful changes so that we never expect our users to learn anything new? Do we have that little faith in our users?
How obvious is it to a new iOS user that they need to press an icon on the Home screen until the icons go wiggly in order to re-arrange them? I'd argue that this is pretty non-obvious, yet every iOS user has learned this action, despite there being no "active areas" that reveal this behavior.
The iOS example is valid, but not as valid as you think. You could happily use your iOS device forever without learning that (especially now that Apple put the same functionality in the Settings app, probably because it wasn't discoverable). The same is not true in Windows 8.
Given you have to know a gesture just to get logged in, you can't even start using it without knowing the magic incantations.
I think it's great MS is experimenting. They have had the same metaphors since 1995. But you have to be very careful when you change 15-year-old metaphors, and I don't think MS has been careful enough. I hope I'm wrong, because that would mean users are getting more savvy, which can only be good for making more interesting apps. But my experience has taught me to never rely on users being more savvy.
People always complain about change. Always. That's not to say that there couldn't be improvements to the W8 interactions, but it's good to remember that a lot of complaints are really about the fact that something has changed, and not how that thing has changed. Look at the hate Facebook gets every time they change, and then notice that despite all the complaints, everyone is still using Facebook, and that the next time Facebook makes a change, those people loudly demand that Facebook revert to the version they complained so much about last time.
I am certain that the Windows team is listening to the feedback they are getting. The removal of the Start button was due to that feedback. People complained that it was confusing and misleading that the old start button did something so different. Whether that was the appropriate reaction is obviously something that not everyone agrees with.
> especially now that Apple put the same functionality in the settings app, probably because it wasn't discoverable
Out of curiosity, where? I looked and couldn't find it there.
> Given you have to know a gesture just to get logged in, you can't even start using it without knowing the magic incantations.
I feel like this is kind of a ridiculous claim. The login screen "swipe" is extremely discoverable. It's so understandable that Apple stole it (bounce and all) to use for exposing the camera functionality from the lock screen in iOS.
On top of that, any key will invoke the "reveal", as will the scroll wheel.
> But you have to be very careful when you change 15 year old metaphors, and I don't think MS has been careful enough.
I agree. I think the Windows team realizes that they have to change, though, or they will get passed by.
Click on an app, and you have the option to delete it.
I would put the original MacOS eject behavior as much closer. That was horrible.
The old MacOS eject behavior was indeed horrible. It was a miserable design. I can't understand how anyone ever thought "throw the floppy away" was a reasonable abstraction for "eject".
Anyway, to a large extent I agree with you - change is often OK and users, while complaining, will pick up on it.
Here's a screenshot, obviously from a long-ago version: http://www.pdastreet.com/images/articles/MoveIconsC.jpg To the best of my knowledge it's still there, though I think only if you set up the device as new.
Assuming it is still there, it wouldn't at all surprise me if a huge percentage of users would say they've never seen it, which is exactly what I was trying to say is the problem with such messages.
The big flaw of Metro is that you can't do anything without figuring out the non-obvious edge of screen elements. That's not a learning curve, it's a learning cliff.
For example, running apps are available from the left-hand side. When an app disappears (e.g., after hitting the Windows key), just having it slide over (perhaps even popping out the running windows to show it slot in) would communicate where you need to look for things.
The RHS charms are probably harder, but I think there's a way to do this given a little extra polish.
The problem was that the sliding animation just gave you a headache after a while, especially using the start screen from the desktop. So in the Consumer Preview it was replaced with a more subtle fade-ish animation, making it less apparent where the app went but protecting the user's long-term sanity.
My Windows desktop is the middle of three computers on my desk using Synergy; at work we use MouseWithoutBorders which doesn't have corner protection. I predict serious frustration in my future.
If Metro actually replaced Windows classic I would agree but it doesn't. It just sits there on the side and randomly throws you out of a comfortable environment. Running two entirely different GUIs in parallel is infinitely confusing to normal people. Microsoft could have probably got away with this if they did it 5 years ago when there was no competition to the PC. People would have had to just deal with the change but that's no longer the case. A lot of those PC buyers are just going to end up with iPads instead.
It may be convenient, but the real problem here is discoverability. There's no way you can realize that screen edges are interactive unless you either knew that beforehand, stumbled upon it by accident, or saw a YouTube video explaining it.
I still agree that excluding the Start button seems like a bad idea. Many novice users don't ever think of using the keyboard for navigation and it's clear from the video that switching back to Metro with the mouse is non-obvious.
No, it doesn't. At least not the Consumer Preview.
And that workflow only works for the very first person to use the system. It's defeated in any situation where you've got multiple people sharing a computer, and all using the same account. (i.e., normal computer usage patterns.) It's also defeated in any situation where the user dismisses the tour without even looking at it first. (i.e., normal computer usage patterns.) What they actually need to have Win8 do is automatically detect if it's running on a standard PC, and if so then have it default to the kind of user interface that PC users have been used to for the past 17 years.
I suspect that Metro is a great interface for tablets, and I could see even making it very easy to get to from PCs, for users who like it. But making it the default for PCs will be a dangerous and stupid move by Microsoft if they decide to stick with it. It's not just that it'll be confusing to users. It's that it sacrifices the single biggest thing which ensures their continuing dominance in the PC market - familiarity. The interface on Win8 CP manages to be different enough from previous versions of Windows that it will feel more alien to their customers than the OS of their primary competitor, Apple. The last thing Microsoft needs to be doing right now is accelerating the rate at which Apple takes their customers by voluntarily jettisoning the primary reason why many of their users haven't switched yet.
it's clear from the video that switching back to Metro with the mouse is non-obvious.
And how. "Non-obvious" is like the official slogan of Windows 8. It's amazing how many things are non-obvious on Win8 Consumer Preview. When I first tried it (having not tried Developer Preview first), I found myself at a loss the first time I tried to find the control panel, get to the desktop, open IE in a window (that is, get the non-Metro browser), get back to the Start screen from the desktop, close a Metro app, switch Metro apps, shut the computer down, open an application that isn't on the start screen (e.g., cmd.exe), add an app to the start screen... All of this might have been easy enough to figure out, except that I was sitting there looking at an OS named "Windows 8", so I kept (perhaps foolishly) expecting it to behave somewhat like the one named "Windows 7".
Long story short, Windows 8 is the first operating system since Unix where I felt the need to keep some sort of instruction manual close at hand while I was getting used to it. Even Unity felt less jarring to me.
It's functionally similar to pressing the old start button, though there's clearly some guesswork involved, since no start button is present.
This has exactly the same problem as the above method.
Not having seen Windows 8, I don't know what UI has replaced the Start Button. I hope that before I use Windows 8 I do get some of those basics from somewhere (unlike my iPhone, which required far too much Googling to make work. Perhaps I'm showing my age).
Edit: Start me up original Windows 95 ad - http://www.youtube.com/watch?v=5VPFKnBYOSI
What should be simple things, like being able to just click the desktop ANYWHERE and drag it back and forth to move it horizontally (like you can do with a finger/touch), don't work. You have to go to the bottom of the screen and use the scroll bar.
Swipe-In events from the top/sides/bottom of the screen don't work, ruining much of the default "usability" (if you call it that) of new apps like IE10(Metro version) or even the default desktop Metro UI functionality and charms.
The mouse is definitely a second-class citizen in Metro, and it shouldn't be. If Microsoft would just make the touch controls work with a mouse, that would eliminate probably 70% of my problems with Metro. Put the Start Menu back, and I'm left with just vague uneasiness because something is changing.
"swipe-in from top/bottom" = right-click, and "swipe from sides" = swing around the corners.
In general the design philosophy for mouse in Win8 is that it shouldn't be used to imitate the touch controls, but rather it should have its own independent set of controls for the same actions, that better fit the mouse.
MS doesn't seem to be "protecting" its market share anymore; it seems to want to be seen as an innovator, rather than make its current users happy (which seems a very odd philosophy for a company with 90% market share).
The only way I can understand this is to believe that MS is consciously "ignoring" internal usability data in order to push forward an innovation agenda.
I don't plan on using Windows 8, but if I did, I'd probably be alienated just as much, since until now I haven't really used a single touch device.
Also, if the users are going to have to simulate touch gestures with a mouse, I foresee an increase in RSI for the next generation.
There are still many command-line apps, and many important parts of Windows require the command line to operate. However, Microsoft hasn't exactly put a great deal of development time into advancing command-line UI technology (and there are many companies who HAVE done just that).
Plainly speaking, it seems they are relegating the desktop to be the "next command line", another relic. However, the transition fails to address the fact that in its current state, Metro is not able to replace the desktop for anything but the most basic applications.
Another problem mentioned elsewhere ("Windows 8 is leaky") can also be explained another way: tablets have touchscreens but no easy way to use a mouse/keyboard, while desktops have a mouse/keyboard but generally no easy way to use a touchscreen. Using one interaction method pretty much precludes the other, and trying to make an interface which works with both creates a mushy, middling kind of experience which combines good and BAD things from both methods. We see this when trying to use a mouse with Metro apps as well as when trying to use touch with desktop ones.
I'm treating Windows 8 as another Vista for now (and I used Vista for many years): it's usable, full of new ideas, but lacking in that final polish where all the new pieces and old pieces meet.
MS wants to translate its power from the desktop monopoly to the tablet market. And therefore they play against users.
It seems that MS Windows team is too powerful to let smaller teams cannibalize Windows market share. "Innovators Dilemma" case?
Personally I would import only the tablet features which make sense for a desktop OS. For example: app sandboxing and the market, a standard app upgrade API, notifications.
People who want to continue using the mouse will continue using W7 for the next 10 years, like they did with XP before.
Assuming that the number of notebook-style PCs continues to rise, your fingers, when on the keyboard, are actually pretty close to the screen, often closer than they are to the mouse. So you would just move your hand forward some 10 or 20 cm every once in a while.
To touch a cell in Excel or mark a couple of words in a text processor. Maybe use two fingers when in Photoshop or AutoCAD. Well, for the latter maybe a mouse is necessary, but that's what W7 will be around for for quite a while.
The mouse is a nightmare: First you have to find it on your desk or wherever you happen to work. Then, you have to focus on the screen and try to find that tiny little arrow. Finally, you have to coordinate movement of the mouse and arrow to click somewhere.
We are very used to it now, but it's a nightmare anyhow.
I don't even start about the trackpad!
How much easier to just stick your finger out; you don't even need to divert your eyes from the screen. It's natural. It's fast. And your hand falls instantly back onto the keyboard.
Anyway, point & click scenarios can be argued. Click and drag scenarios are where touchscreens fail entirely in their current implementations. At least on my tablet, it requires me to long press, drag, long press again, and press copy. Minimum of 5 seconds with some practice to copy some text. With a mouse and a left hand on my home keys, I can copy text in under a second.
I don't need to take my eyes off the screen, but maybe I am not the average user. If you take a look at how less tech-oriented people use a PC, though, a touch interface would definitely be a plus. It's like stylus versus finger.
In addition to that, using your finger is not nearly as accurate as that "tiny little arrow" (which is of adequate size), and it obstructs your view of what you're pointing at. Touch screens are only good for big, bulky, simple UI.
1. Keeping any one finger extended while keeping the rest off the screen is none too pleasant for me, whereas mice are pretty comfy to rest the hand on.
2. Being a female with somewhat longer nails, I find using touch pads and touch screens to be a nightmare in terms of accuracy. I know exactly where my mouse is and what it's going to activate when I click - I can't say the same for my finger nail getting in the way or the larger area of finger-pad that touches as a result of avoiding nail contact. I suppose this could be remedied with a stylus of some sort, but I'm also sad when my hand is in the way of viewing stuff around what I'm trying to manipulate. I've got some pretty slender hands, but this still leads to awkward wrist action trying to keep a finger on the screen and look at everything around it simultaneously.
3. The distance I cover with my mouse on the desk in order to move my pointer across three monitors is a fraction of the actual distance travelled by my pointer, so instead of sweeping my arm dramatically across a meter or so, I've flicked my wrist a few cm to achieve the same effect. But, when I move my mouse very slowly, I can still achieve pixel accuracy.
4. Unless screens develop the ability to change their texture and structure, there is just no screen substitute for the little grooves between the keys, home row extrusions, and physical feedback of my keyboard. I can rest my hands on the keyboard without worry because it takes a certain amount of pressure to activate the keys, and I know for certain when I have pushed a key, even if there is no immediate visual feedback. I can always look down to my keyboard and it reveals a good deal of its functionality to me. It also never takes up any precious screen real estate and I don't have to do anything elaborate to bring it up.
5. Goodness, can you imagine how dirty my screens would get, so fast! Yuck. Finger prints and smudges everywhere.
But this is just my opinion and may be subject to change if better technologies come around. Presently I'm in no hurry to abandon my mouse & keyboard.
-edit for formatting.
You may figure out how to quickly launch the browser and a couple of apps on the Mac, but you'll need months to get used to it and to learn the functionality you only use a few times per month.
His dad would never install the Consumer Preview himself. He did not know it exists. He did not read blog posts about it, articles, interviews, etc.
CP was released for developers and users who are interested in seeing what the new Windows looks like. They are already familiar with it and know how it works.
Obviously, after installing the final release there will be a tutorial explaining all the new features, problem solved.
While I'm not sure about Metro's usefulness for laptops and desktops, I do think that the real test for the removal of the Start button is whether the learning curve is short, and whether the change, once you've mastered it, is perceived as an improvement.
So until someone does that on a reasonable scale (say, 5 people), there's nothing to see here.
I don't know what Microsoft is trying to do with Windows 8. I know they intended to make touch a first class citizen but it seems like they've forgotten about the mouse and keyboard.
Now Microsoft wants to have its cake and eat it too, look Windows is a tablet and desktop! It can do both! No, that means it fails at both as the common user is exposed to a sub-par experience no matter which device they are using.
This reminds me a lot of Windows Media Center. I don't know if it was XP or Vista that came with bundled versions of it, but it left the user thinking, "What the F is this?"
Microsoft is in serious trouble, and they know it. They refuse to take seriously the things that are most vital to their long-term survival as a company selling consumer products. Apple's version of a half-assed product release is Apple TV -- "just a hobby"; compared to Microsoft's expansive portfolio of half-assed products, Apple TV looks like a polished, finished product (and it is.)
If Steve Jobs was still alive and ran Microsoft, one would imagine that everyone responsible for Windows Metro would be fired.
What's funny is that they put Windows on phones and it didn't work, so now they try to put Windows Phone on PCs.
My dad does the same thing, bifocal eyeglasses and a belly make for some weird contortions when trying to view a monitor.
What I don't like is how consumption-oriented Metro is. It doesn't matter how slick it is, I don't want to be bombarded with notifications when I sit down at my PC. But of course, I can avoid it entirely, which I probably will. Windows 7 should do me for the next five years, easy.
Compare OS X 10.0 to the current version - there's a whole lot that has changed, and the user experience is massively superior. And they managed to get there through a series of 7 incremental releases where the single most jarring change was swapping the semantics of the "scroll" gesture. Stuff has come, stuff has gone, but they never once did it in a way that left existing users at a loss for how to interact with the new version.
Heck, skip the tiny increments. I'd go so far as to suggest that if a Mac user were to time-travel forward from 1984, they would be less confused by the current version of OS X than Windows 7 users seem to be by Windows 8.
Second, I'm sure there will be tutorials built into Windows 8 for people that want to learn how to use Metro and any of the other changes. Some of the comments here seem to ignore that. It doesn't excuse the stupidity of removing the start button of course.
I feel like I'm pretty open to changes in software, I'm a habitual upgrader. But ditching the start menu really throws me for a loop. Having to leave my working screen and go to another screen just to launch a new app is super annoying. It could be that tablets are the future, but most of us are still using an actual computer so it sucks to have functionality reduced down to a tablet when you are working at a desktop.
edit: ...the start button, as seen in earlier builds, is that icon on Windows 8