Hacker News
Microsoft introduces Universal Windows apps (wmpoweruser.com)
284 points by chris-at on April 2, 2014 | hide | past | favorite | 183 comments

This is imho the best news ever from Microsoft. Now I can target server side, front end, Windows 8 devices, and Windows Phone all with one language (JavaScript for me). Not bad at all.

Update: I have to add other news just announced. With WinJS we can now also target Xbox One, iOS, and Android devices. And the full framework is open source under the Apache license.

Update 2: WinJS on GitHub: https://github.com/winjs/winjs

I was worried that MS would drop support for WinJS and double down on XAML. Allowing fully native builds for Windows Phone from JavaScript instead of just native wrappers is great; now if only iOS and Android would follow suit.

They've just announced they are making WinJS cross-platform and open source (under Apache license). Now I haven't used WinJS, I haven't ever developed a WinRT app (just a few for the Phone), but doesn't WinJS have a ton of quality components, even more so than WinRT XAML? This seems pretty awesome.

I missed the cross platform part, I've been bouncing between the live stream and work this morning. So the claim is iOS apps with WinJS as if it had been built in Obj C with XIBs and not just a web view hosted in a native harness? If true that would be pretty amazing (it would probably also cause Xamarin to shit a brick)

No, definitely just a WebView. Still having a set of quality cross-platform MVVM-aware components for HTML5 is nice.

Oh, I misunderstood the implications of "cross platform" in this context, then. I would never think that JavaScript wasn't cross-platform. I'll have to look into the MVVM components, but JS already has that with libraries like Knockout, so obviously I need to go back and watch the keynote again.

Thanks for the clarification.

The APIs are what is now cross-platform. Previously WinJS had hooks for opening files (as one example) that obviously only worked in Windows.

Unless it is shoehorned into their app store ... have they said anything about that?

Couldn't you do this with JavaScript before and also get OS X, Linux, and Android? Anything with a web browser?

Win8's JS framework isn't just a browser with your code in it. They expose pretty much all of their API surface to JS with JS-friendly APIs and things like promises for async calls, so you're not stuck using dumbed-down APIs or calling out to native-code shims for things HTML5 can't do yet.

No, web browsers don't have APIs for accessing everything the native platforms are capable of doing.

The amazing thing is that you can potentially write all your code in C# and have backend/frontend logic for the web (ASP.NET), Windows, Windows Phone, and even iOS/Android (via Xamarin). Wow.

The amazing thing is that you can potentially write all your code in C and have backend/frontend logic for the web, Windows, Windows Phone, OSX, FreeBSD, OpenBSD, Tru64, VAX, Arduino, Linux Kernel, Windows Kernel, Mach, Hurd, and practically every processor or operating system ever built, and even iOS/Android (via Xamarin). Wow.

You can even link to and reuse that code from languages such as: C++, C#, Java, Ruby, Perl, Python, assembler, lua, scheme, lisp.

The future is gonna be so 1970.

> The amazing thing is that you can potentially write all your code in C and have backend/frontend logic for the web, Windows, Windows Phone, OSX, FreeBSD, OpenBSD, Tru64, VAX, Arduino, Linux Kernel, Windows Kernel, Mach, Hurd, and practically every processor or operating system ever built, and even iOS/Android (via Xamarin). Wow.

Have you ever done C development like this, ever in your life?

I did. Between 1994 and 2003, we had applications running across Windows, HP-UX, AIX, DG/UX, GNU/Linux, BSD, and Solaris.

An awful mess of #ifdefs, compiler-specific behaviours, lack of POSIX compliance, and OS-specific extensions.

This was on the server side, and let's not forget there are ZERO standards for writing portable GUI code in C.

Finally, C should never have become mainstream given its lack of safety, which we are still suffering from in 2014. There were better systems programming languages in the late 70's; they just weren't at Bell Labs.

The only portable GUI code that I've ever seen work somewhat reliably is HTML/CSS.

Everything else either implements a 3rd GUI (Swing et al.) or looks weird on a few of the platforms (GTK).

Oddly enough the best cross-platform lib I've seen, Qt, is written in C++.

C certainly isn't the be all and end all, however, it's far more portable, reusable, and linkable than C#.

From a platform perspective, Qt is good for all the desktop environments, but for mobile, .NET/Xamarin is the way to go.

From a language perspective, why in the world would you want GUI code in C or C++? Both languages make it very easy to shoot your foot off and have no idea what happened. However, C can be very elegant if used for the right reasons. C# is safer, but also not perfect. That's where F# comes in. :)

I actually had a super hard time with F# and XAML, this was back in F# 1.0/1.1 days.

While not strictly C, ObjC/UIKit is a helluva language for doing GUIs.

I actually don't like GUIs in C++/Java or C# because it's far too restrictive and you spend all your time making the class hierarchy happy and typing boilerplate rather than getting stuff done.

ObjC is a beautiful mix of scriptyness and performance / low memory utilization. If ObjC is too slow, C is right there, and if it's too complicated for C add in a little C++. My typical mix is usually about 90-95% ObjC, 5-10% C, and 1% C++.

I love that if you know an object supports a method you can just cast to id and call it. If an object is missing a method, you just add it. If a base class is stupid, you override it.

Java/C# are total pieces of crap in this regard, although C# is far better for having extension methods.

Too bad you missed out on Flash / Flex. At one point it was absolutely brilliant.

Solution: Lua.

C# is a much higher-level language than C, with a much richer set of libraries that can be relied upon to be everywhere. That's why it matters.

C# is both higher-level and lower-level than C.

If you try and add a "JumpList" to a program in C, you end up using Windows-specific extensions / APIs that make it no longer cross-platform. C# + low-level Win32 APIs is closer to the OS than the generic C libraries designed on a DEC PDP-11 40 years ago.


Besides, your typical C program will not understand SEH Exceptions or DLLs (concepts that are extended as part of the "Win32 Runtime"). It's not like Win32 is POSIX compliant, ya know...

You can rely on C# being on all platforms by default? Linux and OS X? Really?

Last time I used Windows (several years ago), there were always apps that needed to download and install the latest version of the .NET runtime... - is that still a thing?

  #include <stdio.h>
  int main(){ puts("Hello World"); }

This doesn't work in Windows. The standard entry point to Windows is:

  int CALLBACK WinMain(
      _In_ HINSTANCE hInstance,
      _In_ HINSTANCE hPrevInstance,
      _In_ LPSTR lpCmdLine,
      _In_ int nCmdShow
  );

C never really "ran" on Windows. C is built on top of abstractions to even run on Windows. A SegFault in standard C is supposed to be passed to a signal handler (on Linux/OSX).

A SegFault in Win32 is passed to the SEH exception handler, to the __except{} statement in some function as the stack is unwound. (Oh yeah, __try{} and __except{} don't exist in "normal C", do they?)


The "OS Language" of Windows is C++ (through COM and COM+ interfaces). The "OS Language" of Linux and OSX is C.

The sooner you realize this, the easier programming on Windows is going to be. Windows never really supported C as a first class citizen. C code cannot easily call core Win32 libraries like DirectX. (It can, but it is significantly harder than just using the C++ COM interfaces)

C# works very closely with COM interfaces in Win32. And thus, C# is the second language of choice of Windows. The only reason C# hasn't overtaken C++ in everything seems to be politics.

Mind you, Microsoft fully supports C# Device Drivers. The lowest level hardware interactions in Windows can be done with C#.

  #include <stdio.h>
  int main(){ puts("Hello World"); }
> This doesn't work in Windows.

Sure it does. Windows has console programs. Your non-standard main is not something I'd do, though IIRC even that works on Windows; at least it did with older compilers. I haven't done much Windows C programming with the newer ones.

What you point out is an _abstraction_ that compilers include as part of their kits.

You claim to be a low-level programmer who understands C. The truth is right in front of you. Decompile those programs, look at their symbol tables. Notice, every Win32 program starts at WinMain, called with the arguments that I listed above.

Come back when you've done this simple exercise. Realize, WinMain is the _true_ starting point of "C Programs" in Windows. The rest are compiler abstractions.

http://msdn.microsoft.com/en-us/library/windows/desktop/ff38...

> How does the compiler know to invoke wWinMain instead of the standard main function? What actually happens is that the Microsoft C runtime library (CRT) provides an implementation of main that calls either WinMain or wWinMain.

The true starting point for PE executables is AddressOfEntryPoint in the PE header, with a few parameters pushed on the stack. WinMain is a compiler abstraction; it is not looked up by name.

You can't code in a high-level language like C without working with abstractions. Whether you look at the main() level or the WinMain() level, there will still be library initialization hooks running before your end-user code gets to run.

I'm gonna upvote your comment as you are one of the few people in YCombinator who seems to know what they're talking about. Good job catching that, and you're right. The PE Header contains the entry point. (And the MZ Header is potentially a 2nd entry point left in for DOS Compatibility purposes)

The rest of the people here talking as if C is some sort of ultra-portable magic language need to learn about the low level details that differ between OSes.

My primary point remains, however: the easiest and most straightforward way to interface with Windows libraries is through C++ and C#. Even C itself is a high-level language built on abstractions provided by compilers and linkers.

Show me where in the ANSI standard it says this is illegal. Implementation details do not matter. The Microsoft C compiler will compile an ANSI C89-conforming program into a runnable executable on Windows. The rest is undefined.

And while we're on implementation details, please note that most UNIX platforms do not start at main either. Most platforms include some sort of crt0 that must be linked, which contains the real entry point. The dynamic linker will also run code prior to main.

But the C standard only defines what the environment looks like when execution begins and makes no statement on what might run before or how the program got into memory in the first place.

Come to think of it, I don't think I've used any platform where main() is the real entry point to the executable. On Linux the real entry point is _start, all of the bare-metal embedded stuff I've touched does a whole bunch of hardware setup in crt0 before it calls main(), etc...

main() is the entry point function in newer OS X versions. Why ship crt0 in everything when you can put it in dyld?

crt0 would still get linked into static binaries in OSX, I presume.

> Realize, WinMain is the _true_ starting point of "C Programs" in Windows

Actually I do know that, but you simply said "int main() blah blah blah" doesn't "work" (to quote you there). If you're going to require me to be precise, maybe you should look in the mirror first.

ASM output from a C program that prints hello world to the screen. You can figure out the other sections, I'm sure.

No WinMain or other similar concepts here, and built using VS2012.

Microsoft (R) COFF/PE Dumper Version 11.00.61030.0 Copyright (C) Microsoft Corporation. All rights reserved.

Dump of file consoleapplication6.exe


  _main:
    00401000: 68 48 20 40 00     push    offset ??_C@_0M@KIBDPGDE@Hello?5world?$AA@
    00401005: FF 15 00 20 40 00  call    dword ptr [__imp__puts]
    0040100B: 83 C4 04           add     esp,4
    0040100E: 33 C0              xor     eax,eax
    00401010: C3                 ret


        1000 .rdata
        1000 .reloc
        1000 .rsrc
        1000 .text

I've tried to do programs with /nostdlib before; how did you implement the startup and shutdown functions that are required?

Sorry about that, I didn't see your reply! Because I'm on my phone, it's challenging to type out all I did. While I achieved it, I wouldn't say it's sensible or easy!

Anyway, you have to turn off C++, C++ exceptions, and security checks in the compiler options. In the linker, set the entry point to main and turn off error reporting. From there it's finding the right lib, which is dependent on arch, etc.

To be honest it's not worth doing. I only knew of it because I had to patch a vs2010 exe to run on windows 2000.

I always thought that the whole WinMain() thing was an ugly mistake. The right way to do this would be to organise the libraries and startup code so that the simplest Windows GUI program was not 50 or 100 lines of boilerplate, but was instead something like:

  #include <windows.h>
  void main()
Or similar. Extra functions and parameters as required to setup custom icons, window classes, message handlers, whatever.

At least they provide the mainCRTStartup thing [1] so you can use main() in GUI executables. Regarding boilerplate, Win32 was designed as a very low level API, with MFC and other high level libraries to make things easier.

[1] http://stackoverflow.com/questions/11785157/replacing-winmai...

I didn't know about the mainCRTStartup thing, thanks. But it should have been the default - to me WinMain() is just a failure to understand what is best exposed and what is best encapsulated. My boilerplate reduction proposal is to provide a few helper functions to optionally avoid exposure to some of the pipework - just to make life a bit easier when starting out. MFC on the other hand is a massive new layer of compromises and other crud slapped over the top of the Win32 API - somehow managing to make it even more cryptic (YMMV). Also MFC postdated the Win32 API by a few years, so presumably wasn't considered as the first choice solution at the start. The first Petzold book was a C thing, MFC and C++ came later.

If by "boilerplate" you mean creating/registering a window class before creating a window, you don't need to do that if your application's UI largely looks like dialog boxes - DialogBoxParam will create a message loop automatically, and all you need to give it is the layout template and the message callback WndProc.

(I've been working with Win32 for around a decade now, it's got its warts but really isn't that bad once you get used to it. You can do a lot of interesting things with it.)

Sure I am not arguing about the whole of Win32, just the fact that it would have been easy to modify the API a bit so that main() was the entry point for a normal windows app.

It's done like that in C# (for WinForms, at least).

    namespace WindowsFormsApplication1
    {
        static class Program
        {
            /// <summary>
            /// The main entry point for the application.
            /// </summary>
            static void Main()
            {
                Application.Run(new Form1());
            }
        }
    }

What always jumped out at me when I looked at Windows code was all the capitalized type macros/typedefs.

Have to disagree with you there. Windows is older than COM. It's actually older than C++. The classic core Windows libraries and interfaces like GDI are plain old C. Not even a hint of C++.

COM is language agnostic. You can consume COM servers in Delphi that have been written in VB. Again, no C++, not necessarily. The only "true" C++ API I'm aware of is GDI+. Everything else to my knowledge was and is either straight C or COM.

And I have to disagree with one of your statements a little bit:

"The classic core Windows libraries and interfaces like GDI are plain old C."

__cdecl: this is the native C calling convention. The calling function pops the arguments from the stack.

__stdcall: this is the standard Win32 library interface. The called function pops its own arguments from the stack.

Calling conventions explained here: http://msdn.microsoft.com/en-us/library/k2b2ssfy.aspx

> The "OS Language" of Linux and OSX is C.

Not quite. Significant parts of XNU (the Mac OS X kernel) are written in C++.

And significant parts are written in Assembly as well. But Assembly isn't the "OS Language" of either Linux nor OSX :-p

What I mean by OS Language, is the language that interacts with core OS Libraries. The OS Language of Android is Java. iOS is ObjC. OSX core is either ObjC or C. Linux is definitely C based. (ioctl)

Microsoft is either C++ or C#. (DirectX, Ribbons, etc. etc. are behind COM interfaces that are most easily interfaced by C++ or C# code)

If nothing else, OS X's I/O Kit API is C++ish (it's 'Embedded C++').

Wow, is that thing really embedded C++? Bjarne always said that embedded C++ deserved to die if it wasn't already dying, mainly because it appears to remove all the useful bits of C++ to make it some static-typed-only inheritance mush put on top of C, as I understand (most likely wrong).

Yup: http://en.wikipedia.org/wiki/I/O_Kit. I mean, Embedded C++ is not really a language; any C++ compiler can be turned into an "Embedded C++ compiler" by turning some features off. As to the Bjarne quote, a lot of things that are nice in application development can become tricky when you are closer to the metal.

Thanks for the link. The bare metal thing is probably why C guys typically see C++ as bloated and why I look at C and think "how do you write anything?!?!!"; I live happily in C++ land you see.

Microsoft seems to think that it'll work: http://msdn.microsoft.com/en-us/library/bb384838(v=vs.110).a...

MS also has more people using a 14 year old OS than their latest and greatest.

> is that still a thing?

Come on, you make it sound like Windows is the only OS that has prerequisites for software.

Last time I checked, yesterday, I wanted to get a file server running on my Debian machine and I had to apt-get install 5 different packages I didn't yet have.

So, yes, that's still a thing, and it always will be. Although AFAIK since Windows 7, .NET 4 or at least 3.5 should always be there, and 4.5 on Windows 8.

One of the main differences is that apt will do that for you. You made it sound like you had to manually research and ask apt to download each library, when actually you probably just typed 'apt-get install samba' and it took care of researching and downloading all that crap for you.

In Windows, that doesn't exist. The installer either bundles it for you (which can be bad if you weren't expecting it, although is usually fine) or it leaves it in your hands to find and download the correct version yourself.

So yes, not exactly the worst thing ever (still an improvement over the old DLL Hell), but it's nowhere near how friendly a decent package manager is, either.

Comparing to Debian isn't exactly setting the highest bar for usability. FWIW, OS X manages to support native code without requiring runtime downloads or other manual dependency management.

All MS platforms yes, why would MS care about OSX or Linux.

> C# is ..., with a much richer set of libraries...

Citation please. AFAIK, there are significantly more C libraries than C# libraries.

Maybe OP means the C# standard library is richer... I'd believe that.

> C# is a much higher-level language than C, with a much richer set of libraries that can be relied upon to be everywhere. That's why it matters.

Actually this is not true. I've just been tasked with maintaining some C# code that is used as a business tool.

No problem, I thought, I'll just use Xamarin and Mono on my Mac. I made progress, but as soon as the C# code went to use the OleDbConnection stuff (to write to an MS Access database and make Excel files) it threw a DllNotFoundException. So now I have to find a Windows machine to do the rest of my debugging.

What a well-structured and bullet-proof response. Congratulations, you're now revered amongst your peers and are considerably more employable thanks to your insightful observation. High five!

Seriously though: not only is C# a much easier language to learn which includes features and libraries that most developers expect from a modern language, that probably wasn't even what gum_ina_package was "Wow"ing at.

To me, the point is that the barrier to entry for writing apps across all platforms has now been lowered for C# developers - a group of developers, by the way, which is numerous.


Just what my iOS app needed: more shitty Windows devs, and dependency injection, or whatever the latest enterprise bullshit is. Why write 100 to 200 lines of code that actually work when you could make a giant UML diagram, a bunch of flow charts, and write unit tests all day, because everything you've ever heard of needs to be an object with a class hierarchy.

The Linux kernel doesn't have unit tests, DI, TLAs, etc., and it works; every ASP.NET app you've ever seen has loads of them, yet can barely accomplish anything before crashing or doing something weird.

Customers are gonna be fucking pumped when they find out we've added dependency injection and millions of lines of pointless and unnecessary code. Why release memory when you're done with it? Just don't do anything, wave your hands, and voilà, your process is using 200 MB instead of 2 MB, yet it claims to have a 'garbage collector'.

Have you ever considered that some people choose the languages and platforms they work with specifically to avoid the anti-patterns commonly found on the platforms and languages they refuse to work with?

Have you ever actually used C in a real project? I can't seem to get away without using any compiler specific constructions, much less operating system specific call routines.

There isn't even a Boost for C.

Honestly, I'd be surprised if you actually knew C with a statement like that.

Have you ever tried to get C# code deployed on Linux, OSX, and Windows?

You end up at exactly the same point you did with C, except now you have 100-500 MB of garbage to haul around with your application.

Cross platform dev and deployment is hard, Java and C# just push the work somewhere else while adding garbage that gets in the way.

If you want to make GUIs for iOS, Android, Windows, OSX, and Linux, use HTML/CSS/JS.

If you want to process data on iOS, Android, Windows, OSX, and Linux, use C/C++, using only the std C/C++ libs, or a portability library.

> Java and C# just push the work somewhere

You mean someone else, which is the entire point of why you would use any platform/library/framework.

No I mean somewhere else, because for the most part the cross platform stuff doesn't work.

Oh? You have a cross platform path separator builtin? Wow that must be totally worth 200 MB of garbage.

I'm sure the client will be fine with fucking around with the CLR / JVM all day instead of just downloading a 200 KB executable. It's just what they need, now not only do they need to update your application but they have a dependency nightmare too.

What in the world are you talking about? Apps in Xamarin can be as small as 10 MB for a full app; its linker is smart and knows not to bring in stuff it doesn't need. If you don't use the linker, it's around 40 MB.

I'm talking about releasing a cross platform app for Windows, Linux, OSX, Android, and iOS, and the differences between Java/C# and C.

To be fair, I haven't tried to release an app for Windows/OSX from Xamarin, but my understanding is that it requires the .NET or Mono framework, which on most computers is a 100 to 200 MB install.

For Java you're for sure going to have to download the JRE. When you start the app it eats gobs of memory.

Maybe it's changed, but even Xamarin itself requires Mono to be installed. This is the kind of shit I'm talking about: you should download the app, drag it to Applications, done. Or on Windows, click Next a whole bunch of times.

Even the monstrosity that is XCode is a one drag install.

The most basic Mono install currently takes about 3.7 MB of disk space; this includes about 1.7 MB for the JIT and 2 MB for mscorlib.dll.


There's even a text editor that runs on all those platforms!

But what about Linux?

Mono! Lots of ASP.NET projects (like NancyFx or SignalR) actually run on Mono. I think there's some hope that web dev on Windows isn't dead.

I tried KeePass 2 with Mono on Linux and it's a pretty slow and unnatural UX. Much like running Wine. On OSX, KeePass 2 under Mono just froze constantly and looked weird. I've had a similar experience with other cross-platform apps.

Total cross-platform is a pipe dream. Just like Java or more recently HTML5/JS cross-platform mobile apps, they are almost always a half-baked solution compared to proper native OS apps. The UX and performance will never be consistent enough to make it better than just investing in native development (Note I'm talking about software, not simple content consumption apps where HTML5 in browser is fine).

Having primary operating system -> tablet -> phone is a good enough goal. They should focus 100% and make it as stable as possible.

Naturally Mono probably has some quirks, which you need to keep in mind during development. However, you can make successful apps using it. The Unity3D engine uses Mono on Linux, and the games using it run fine (even on my laptop).

Mono makes no pretense of offering a cross-platform GUI lib, and so any app that attempts to use the same GUI on multiple platforms, using WinForms on Linux or GTK on Win32, is going to be ugly. They're up front about this.

How is supporting two desktop operating systems any less realistic than supporting a desktop operating system and at least two separate mobile form factors on three different platforms? Both have the potential to suck if one platform doesn't get enough attention.

KeePass is a bad example. It's a Windows Forms app. If they wanted to properly support Linux, they could continue to share most of their C# code, but write a tailored UI using something like GTK#.

Nobody is truly advocating "write once, run anywhere".

I think Qt does it decently, as far as the desktop platforms are concerned.

I mean look at something like Origin or the Battle.net desktop app.

I would also say that wxWidgets makes it pretty easy, particularly as you have the source. I know there is a mass of #defines in it etc. but I have found it very useful and reliable across the 3 major platforms (Win, Linux, Mac). And it renders natively on them, unless you force wxUniversal (which does its own rendering of controls, like Qt does). I would even argue that Qt does a worse job of making stuff look native, particularly on Mac OSX; wxWidgets controls are native on Mac OSX, as underneath it is Obj-C mm files everywhere.

Miguel de Icaza considers Linux dead (http://tirania.org/blog/archive/2013/Mar-05.html). It shows. Last time I tried to compile latest version of mono on ubuntu (a couple of months ago) it was a painful process of missing dependencies and compiler errors. In the end, I couldn't make it run.

In theory Microsoft COULD support Linux directly; it would be a bit of work, but they would not even need Mono.

Now, from a business perspective, would it make sense? I don't know; Microsoft's business relies heavily on Windows. But hey, anything's possible.

Problem with Mono is, Mono or not, .NET dev without VS kind of sucks, and VS is Windows-bound.

I'm still trying to figure out how/why this makes sense as a goal. Building an application that runs across multiple platforms in congruent spaces makes sense: iOS and Android and Windows Phone. When you move to running the same application across Laptop vs. Phone style platforms, even on modern hardware, you're looking at capabilities and resources that are radically different. I can see wanting to share certain common code, but a library system ala npm, bundler, cpan, etc., would handle that better than sharing a common code base/project. What am I missing?

Think of the set of applications that can also be implemented as responsive web applications.

Now you'll be able to implement them with a native experience, without significantly increasing your costs.

What about games? I've got a load of games on the iPad, why can't I play them on my (hypothetical) touch-enabled MacBook Pro?

Why do I have to pay twice for everything?

This is just what Sony have done with Cross-Play - customers love being able to buy a game once and then be able to play it on their PS3, PS4 and Vita.

The real key here is sharing business logic.

Microsoft apps are written using the MVVM (Model-View-ViewModel) pattern. The idea here is that you would share almost all of your model code, and really only rewrite the view layer (XAML) per platform. You're also allowed to share the exact same views across platforms, but it's unlikely that will be a good experience for users, given the differences: screen sizes, input modalities, etc.

How is this really different from what exists today? You can write in C# for pretty much every MS platform already. If you're smart you're already writing your business logic with as little dependencies as possible on view specific logic so it's more testable, portable, etc.

Not sure if you have tried doing this. In practice, it's quite painful as you need to have a separate visual studio project file for each platform (phone vs tablet).

Further, the Phone implementation of WinRT is a subset of the tablet's, so sticking to the WinRT API alone isn't enough. Complicating matters further, the phone lets you actually use a Win32 subset which is not allowed on tablet.

This appears (at least) to unify everything towards more of a write-once, run-on-all-Windows-platforms world.

The other key is that for people who have invested in other platforms, they know that they can get more bang for their buck on MS by buying one app everywhere. That would be a big deal for me if I ever come home to Windows.

You could say the same for a 4.3 inch phone vs. a 10 inch 2.5k, and, soon, 12 inch 4k Android tablet. And if you want radically different capabilities, phones have SIMs that are assumed to be single-owner while most Android tablets are multi-user. Processors range from 800Mhz single-processor with mediocre bus performance, to quad-core 2.3 Ghz. There really isn't any interactive software you wouldn't run on a mobile device, and if you grok Fragment on Android, you can write for any device geometry from a single code-base.

The true boundary between mobile and desktop computing is numeric computation. You don't want to mine bitcoin or your stock market rocket science on a mobile device, mainly due to burning through the battery in a trice.

It never made sense for things called Windows not to have a unified pool of apps. This is a significant improvement.

What about things like keyboard and mouse vs. touch? Network quality (persistent high-speed connection vs. intermittent limited quality)? Available storage space (you can consider most desktop machines, 1/4 terabyte or better, unlimited in comparison to mobile devices)? Multiple large displays (not on all, I know) vs. a single small display? There's a lot more than numeric computation ability alone to take into consideration.

There are reasons to sit in front of a couple large non-touch displays attached to an uncompromising $5000+ computer. But that's for the <5% that need that and they know who they are.

The other 95% will benefit from being freed from their veal cubicles and looking up at human faces while walking around.

As for mouse vs touch etc. I agree, I think. Microsoft should have made a touch OS that isn't Windows. They might get Windows fully evolved to span that gap about the time Android takes away the enterprise business.

Microsoft has NuGet, which can do that. The common base is stuff for REST interfaces, entities, etc.

You aren't quite seeing the full picture, and you're missing some interesting additional bits. The 'experience' of an app across devices - like WhatsApp portable (and then some) - is feasible with what they put in the keynote and what the lesser Scotts did later. It's never been a better time to be a .NET/C# dev.

They put on display some pretty great tooling for sharing common code across platforms and tuning the experience per UI. With decent cloud integration, there's a smoother on ramp for C# devs to get to wide audience apps with less dev resource than most any other dev env I can think of.
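The shared-core / per-platform-UI split described above can be sketched in plain JavaScript (all names here are hypothetical for illustration, not an actual WinJS or .NET API):

```javascript
// Business logic written once, shared by every target.
const core = {
  formatGreeting(name) {
    return "Hello, " + name + "!";
  }
};

// Each platform contributes only a thin presentation layer on top.
const renderers = {
  phone:   function (msg) { return "[phone] " + msg; },
  tablet:  function (msg) { return "[tablet] " + msg; },
  desktop: function (msg) { return "[desktop] " + msg; }
};

function render(platform, name) {
  return renderers[platform](core.formatGreeting(name));
}

console.log(render("phone", "Ada")); // "[phone] Hello, Ada!"
```

The point is just that the shared code never knows which UI it is feeding; only the thin adapter per device family changes.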

Microsoft has been trying to run Windows on tablets since 2001, if not earlier. To me, this really is to bridge the phone/tablet barrier (previously you had two or even three separate apps, since tablets ran either Windows RT (ARM) or Windows 8 (x86)).

Beware Apple, Google: Microsoft is starting to get it.

What about the internals? Is it still kind of a mess under the hood?

I heard they rebuilt the TCP/IP stack, so it's not a user-land 32-bit driver anymore.

Any ideas on if/when they'll modernize NTFS? Right now it's a race to the bottom for who has the worse FS, HFS+ or NTFS. WinFS, or ZFS on OS X, would be a godsend.

What about the registry? Any idea if they'll make that easier to manage for regular people? Am I still fucked if it corrupts?

What about sunsetting Win32? Or improving the driver model?

They're starting to get it on the UI side, but the reason why I use Mavericks as my personal machine and work on a LAMP stack has less to do with how useful or pretty the start menu is, and more to do with the godawful internals of Windows. Or have we just resigned ourselves to a crappy OS internally that we've all just gotten used to?

No it's rock solid underneath.

The TCP stack hasn't been a 32-bit driver since XP.

NTFS has been modernised already: http://en.wikipedia.org/wiki/ReFS

The registry is fine - it's people pissing around in it that break it.

Win32 - haven't touched it for years. Driver model: not my problem.

Mavericks is a crime compared to Windows 8.1. I use both, regularly. It's virtually impossible to use efficiently with a keyboard unless you have twisted, mutilated hands to hit meta keys galore, and half the apps don't actually survive more than 2 minutes without crashing.

All these points are fine, but there's no reason to flat out lie about the quality of Mavericks...

1) Keyboard shortcuts are fine; if you use OS X all day every day you will think Windows shortcuts are bad. That, and you can edit pretty much every single shortcut in OS X if you are really that offended by them.

2) If your apps are crashing every 2mins (or even more than once a week...) your computer is f'd.

No they're horrible, inconsistent between apps and put emacs to shame.

It is fecked. It's a 2011 MBP with NVIDIA graphics. Total waste of money, that was. A totally unreliable machine.

Not to take away from your other excellent points, but ReFS is not exactly a modernization of NTFS, since it additionally removes quite a few features.[1]

The lack of disk quotas, deduplication, and named streams (and more) mean that it's not exactly a general-purpose filesystem, at least as far as standard Windows Domain deployments are concerned.

I suspect they'll address all of these issues eventually, and look forward to it, but as it stands you unfortunately can't just format everything as ReFS instead of NTFS and expect that everything will work. Unlike with ZFS.

[1] http://en.wikipedia.org/wiki/ReFS

I should've been way more clear about my gripe with the Windows TCP/IP stack. Why the hell does the TCP/IP stack eat itself? No other OS I've ever used has the nuttiness that is

netsh winsock reset

as a regular troubleshooting step when diagnosing network connectivity issues. I was long under the impression that the reason why that kept happening was that the NT kernel didn't have a robust TCP/IP stack of its own and was reimplementing the BSD sockets via a translation layer that was easily hooked into by godawful apps that felt like putting whatever it wanted there.

ReFS wasn't on my radar, and that's my fault.

The registry isn't fine. As of Windows 8 it can still hose a machine by going corrupt. The concept of having a centralized application database for application state and other metadata isn't a bad one, the current implementation absolutely sucks. It goes corrupt and the entire machine is rendered useless. Granted, the current solution is to just back it up and hope none of the backups corrupt either. Corruption doesn't happen from just people manually messing with it, corruption happens from a lot of apps reading and writing from it at the same time. A lot of apps not even bothering with good practices in how to handle it either.

As far as Win32 and the driver model not being your problem... when crappy apps written against 20-year-old APIs and devices using crappy drivers crash your machine, then it absolutely IS your problem.

I can't speak to your use of OSX with a keyboard. However, given my choices, I'll take weird metakey combos over, "Is my TCP/IP stack going to eat itself this week?" Besides, emacs uses nothing but crappy and bizarre meta key combos and it's still a thing. So.

None of the "problems" of which you speak are actually mainstream problems. It's perfectly possible to hose your network stack in any OS. Just don't do that and you'll be fine.

USB drivers, sound drivers, printer drivers, web cam drivers, etc etc, are all using user-mode drivers since Vista. They cannot crash your system. Similarly, graphics card drivers use a special model that means they can auto-restart if they were to crash; which as it happens is exceptionally rare and often resolved by upgrading the driver.

I've used Windows ever since it first got a Registry (95) and it's never corrupted on me, ever.

I lol'd at the "or improving the driver model?", as though there is ANY mainstream OS out there with a better more flexible and stable driver model than Windows has today.

You've basically dug up a load of non-issues, fictional issues and historical issues and portrayed them as relevant issues in the world of Windows today. They aren't.

I used to do technical support for several ISPs before moving into a somehow nuttier profession (PHP development).

netsh winsock reset fixed quite a lot of things.

BTW: Before Microsoft added an automated tool in NetShell to repair corrupted WinSock2, this was the manual procedure: http://www.wikihow.com/Repair-Winsock-and-TCP/IP

Crappy Win32 APIs where bugs are features isn't a mainstream problem? Somehow in your decades of using Windows you have never run into buggy, crash prone applications. Amazing. Even Neo couldn't dodge those bullets.

The registry may never have corrupted on YOU, but that doesn't mean it doesn't suck when it does happen. And it happens often enough that if you're in any sort of support role, you'll see it more often than you ever care to. Oh, it's rare, but it's prone to happen. It's just godawful design (this coming from someone who likes PHP, mind you) that you have this monolithic system database that somehow everyone can write to in various ways, not all of them good.

I can't find any better sources for why I thought that the driver model itself was wrong, but I do remember that a lot of Windows crashes happen because of crappy drivers. I remember there was a specific reason for it, it wasn't fixed even as late as Windows 7, and it's even considered a feature. So, you got me there. Whatever. I can still gripe about having to wait for Windows to install a driver for my keyboard when I use Boot Camp so I can play MWO or Hawken.

The fact you merely once did tech support and then moved to PHP development explains why you hold ridiculous opinions on various things.

I'm sure Winsock reset _did_ solve a lot of problems, for you. But at the same time you probably just created other problems for the poor people affected by your tech support. Whilst Winsock reset may have removed the particular third-party component that was causing the issue (often shitty AV software), it will have removed any others that had a valid reason to be there, such as VPN software. Do some reading about LSPs (layered service providers) in the context of Winsock and educate yourself.

"Crappy Win32 APIs" Citation needed. (Figure of speech on HN, don't worry I'm not expecting you to actually try to find one)

"Somehow in your decades of using Windows you have never run into buggy, crash prone applications." Don't put words in my mouth. Of course I've experienced buggy crash prone applications. Just as I've experienced such things on OSX and Linux. As long as such shitty apps don't bring down the OS or your Shell, all's fine really. Don't blame the OS for shitty third party crap.

"And it happens often enough that if you're in any sort of support role," Starting to see a pattern here. You misdiagnosed a lot of problems when you used to work in tech support, huh? Don't blame you, gets them off the phone doesn't it?

Nobody denies the Registry is stupid by design. But that doesn't make it buggy and broken like you claim. Microsoft acknowledges the Registry should never have happened and they do their best now to try to encourage better practices.

"I can't find any better sources for why I thought that the driver model itself was wrong," Because you won't find any but don't let me stop your desperate web searching to back up bullshit claims.

"but I do remember that a lot of Windows crashes happen because of crappy drivers" Well thanks for that Mr Captain Obvious. Just like shitty drivers on ANY of the big 3 OSes can and do cause crashes? With the other big percentage of Windows crashes being faulty hardware. Fortunately Windows has advanced itself to such a point now where the vast majority of drivers, especially those most likely to be badly written (i.e. all things USB pretty much!) are held as user-mode processes that are unable to crash the system.

"I remember there was a specific reason for it, and it wasn't fixed in as late as Windows 7, and it's even considered a feature." Nothing was "fixed" in Windows 7 concerning the fundamental stability of the driver model in Windows. They make subtle and often completely unnoticed resiliency improvements all the time. NT 6.0 (Vista to laymen) added the capability to automatically restart a kernel-mode graphics driver. Which happened to be quite a valuable feature at the time because Nvidia especially was really struggling to write stable LDDM (aka WDDM now, the L stood for Longhorn) graphics drivers at the time. It took them a good 6 months to sort those issues, during which time they were leaning on the inherent resiliency of the NT 6.0 kernel to keep customers sane.

From your comments one can tell that Windows is really ingrained in your workflow. There's nothing wrong with that, but it's sometimes good to open up a bit and acknowledge the weaknesses of a system, so that you can use this knowledge for future decisions.

So let me give you an example of why the registry is wrong. Let's say you, or a user that you administer, has a problem with a complex GUI application - for some reason it won't start anymore. What's your usual solution in Windows? Right, reinstall the thing and hope the registry gets cleaned in the process. POSIX systems? Delete the config files, either in the home folder or globally per app. OS X even has the format and location of these files standardized.

Then again, why even install an app? A userland application should never need an 'installation' step. That's again how it works in OS X, mostly thanks to being registry-free.

The main benefit of the Windows way is how registry settings can be pushed with the AD - but there's no reason this couldn't be solved without a registry. Containerization is where I see POSIX going in order to solve this.

Wait just a minute. The topic was registry corruption. Not whether the registry is a good or bad idea as a whole.

All I said is that registry corruption is a non-issue. It was a slight problem on Windows 9x, though it never happened to me personally. But it was never an issue at any point on Windows NT-based systems.

I've known for years that the registry in itself is a software design anti-pattern. It's just a giant bag of global mutable state. Of course it is bad by design. But don't let a bad design be misconstrued as something that is also buggy and prone to corruption. Because it isn't.

PS: It's not that Windows is "engrained in my workflow". Full disclosure: I use a MacBook. It's just me defending it against senseless and baseless attacks and accusations here on HN. I notice for example that people shut up when I pointed out that Windows has the most advanced pluggable driver model of any mainstream OS and that it is laughable to suggest it requires improvement to catch up; because they know it is correct.

Well, Taiki's original point was

"What about the registry? Any idea if they'll make that easier to manage for regular people? Am I still fucked if it corrupts?"

Which I also understand as criticism of how it works in the first place. Also, I see 'corruption' not just when the whole registry is rendered unusable (which has never happened to me at least, and I've never read about such a case), but when applications create a state that renders them unusable, and that state persists in the registry such that one either has to reinstall the whole OS or go hack the registry (which even most electronics-store support staff can't do, so without enterprise-level support or deep technical know-how any user is out of luck).

> hose your network stack in any OS

No, this is very very rare on things that aren't Windows.

Lots of people say "registry corruption" when they mean "cruft"; over time apps register shell extensions and various other systemwide bits and pieces that gradually reduce UI responsiveness or break in strange ways.

It's rare on Windows too. Only on Hacker News, on this thread, does it appear to be an issue.


No, it's not as rare as you think. Yes, it's because of the lousy design of the TCP/IP stack in windows.

Yes it is. No it isn't. Yes it is. No it isn't. Repetitive much?

There's nothing wrong with the network stack in Windows. NT has always had a first class network stack and they keep improving it with every release, as is normal for any modern mainstream OS.

PS: That Google link I assume was provided for humour rather than any real substance.

Two can play at that game:



That's resetting settings.

What netsh winsock reset does is tear out DLLs, tear out various registry settings and mucks about with other system files.

Clearly you have no idea what you're talking about.

This describes perfectly what a Winsock reset performs and it doesn't exactly correlate with your FUD: http://support.microsoft.com/kb/299357

So there is no real distinction here between a Winsock reset and the equivalent operations in the other 2 big OSes. The difference is that whilst Windows holds its network configuration in the Registry (ugh), the other two do not. The operations being performed are otherwise pretty much identical.

> Any ideas if/when they'll modernize NTFS? Right now it's a race to the bottom to who has the worse FS, HFS+ or NTFS. WinFS or ZFS on OS X would be a godsend.


You mean like WinFS?

I'll believe it when they ship it.


Retracted. Looks like it's actually going to ship with Win8.1


Yes they are. Windowed Metro apps, and a legitimate Start Menu with decent Metro live-tile integration (and a shutdown button).

I was hoping that they'd have jump lists back on the start menu, but let's see what they come up with...

Google has universal apps? When will they get 'it'?

Chrome, and thereby all Google webapps, run on every major OS.

What about their tablet and phone OS - Android? Is there an app-store for those 'webapps' (not the Chrome app store)?

coming soon TM

so far they've been working with phonegap to allow porting packaged apps to phonegap (to then package for specific platforms)


I think the most significant part of this announcement is how they are dogfooding, Office for 'metro' and for the windows phone are using this Universal targeting.

They've been dogfooding for decades, that's a very core part of the developer culture at MS. (You can say many bad things about them, but they really do dogfooding all the time.)

Kind of... Still, they didn't use .NET, Windows Forms, WPF, and countless data-access layers in their core products (Office and Windows), which probably contributed to the fact that most of these frameworks got obsoleted every few years.

Interesting move in the right direction. I wonder if they'll take it further and let you write iOS and Android apps using Xamarin / Mono?

This would actually provide an easy path for developers to get into the Windows ecosystem while ignoring the whole market share issue. Combine first class integration with an awesome IDE (Visual Studio), and frankly way better tooling than Eclipse this would be a pretty compelling reason for me to use Xamarin, and by extension have a Windows run target for my app.

This just in...Universal apps run on Xbox one.

This is interesting - Xbox one is their most interesting platform right now.

Looking at twitter it seems like XAML is getting a ton of love from MS. Go back 2 years and many were predicting that HTML was going to win the war on Windows; what gives/what's changed?

Disclosure: I've written more XAML than I'd care to.

What happened was that most third-party developers refused to use WinJS, and stuck with XAML:


From my understanding and use of XAML and HTML/JS, the UI performance you get in XAML apps is still better than what's achievable in HTML/JS apps.

One thing I don't see mentioned, do universal apps support C++11? I can understand why it's not as easy to support native code across platforms (i.e. need to do some kind of fat binary thing), but think it's a little sad to see all the work done to support modern C++ app development kind of abandoned. Really annoying if you have to choose between an app being universal or written in modern C++.

Microsoft is one of the biggest supporters of C++11, and C++14 for that matter.


Well, they're certainly catching up, but they have a bit to go before they match clang or gcc: http://cpprocks.com/c1114-compiler-and-library-shootout/

> (i.e. need to do some kind of fat binary thing)

Fat binaries aren't needed to support pointer arithmetic or any other C++11 features in a cross-platform binary. LLVM bitcode supports pointer arithmetic. I also believe .NET bytecode supports pointer arithmetic. I'm not sure what other features you thought would require some kind of fat binary.

WinRT lets you write native Win 8 apps in C++, however it doesn't use LLVM or .NET bytecode. If universal binaries are supposed to be supported across different architectures (like x86, ARM, etc.) I don't know of any other way to support it than fat binaries.

AFAIK they do. WinRT apps can be written in many languages including C++. And this is pure native C++ and not "managed" C++.

Yes I'm aware of that, but can universal apps be written in native code? I don't see any mention of it in their announcement.

I am 100% sure they can be written in native code but don't have any reference to give you for now.

They need a funky catchphrase, how about :

"write once, run anywhere"

"write once, run away"?

Think pure thoughts. :)

(for those who don't get the reference: https://www.youtube.com/watch?v=bGX7N-CoaPo)

I'm resisting saying "write once, debug everywhere". This sounds cool, but the road to this hell has been littered with many failures.

Dammit, there goes my resistance. Which apparently is futile. I'll show myself out.

If this does work it's a cool thing. Microsoft just needs to make C# and the .NET Framework open source and actually cross-platform. Mono is a non-starter for a lot of places.

"write once, test everywhere"

Then fix and start again.

I am wondering whether they can make the Windows Runtime's GUI available to existing desktop applications. There are many professional applications that need a brand-new API for their user interface.

I believe they also had a segment about this for sideloaded apps.

Haven't looked at the APIs, but are the APIs going to scale between phone, tablet and 30" 4K monitors? It's relatively easy to make the same app to be adaptable between phone and tablet, or even laptop, but to make it look good on 30" 4K monitor is a completely different UX, if full screen. Or in high resolutions the apps run in windowed mode?

You can adapt any part of the app. I.e. you can simply migrate the app to be universal across devices, but if a specific part does not look good, then you can change the XAML/code behind for that part.

But do these "universal" apps still run full screen on 30" monitor? If so, then it's not a good UX. The benefit of a large high resolution monitor is to display lots of information at the same time, this frequently implies more than one app/window visible at the same time.

Yes, unless I am missing something. It's up to the developer (same as on the Web with CSS breakpoints) to accommodate the various 'breakpoints' or devices visually. To say it's poor UX is inaccurate; will there be plenty of unoptimized full-screen apps? Sure, but the full-screen functionality itself is not poor UX.

They just announced that you can make your apps run in a window, instead of full screen.

XAML and its Adaptive Design concept allows the layout to rearrange itself to better fit smaller or larger screens. And the programmer can always explicitly use different UIs depending on how many pixels are available to the app.

And everything I wrote is already happening (actually, it's been happening for a while). Just try any well-built W8 app in different screen sizes or snapping configurations. ;)
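The breakpoint idea those comments describe can be sketched in plain JavaScript (the thresholds and layout names below are made up for illustration, not WinJS or XAML API):

```javascript
// Pick a layout from the width the OS hands the app, the same way a
// CSS media query or a XAML adaptive layout would branch.
function pickLayout(widthPx) {
  if (widthPx < 500)  return "phone";   // single column, touch-first
  if (widthPx < 1100) return "tablet";  // two panes, snapped views
  return "desktop";                     // multi-column, windowed or full screen
}

console.log(pickLayout(480));  // "phone"
console.log(pickLayout(2560)); // "desktop"
```

The app's shared logic stays identical; only the branch taken at render time differs per device, which is all "universal" has to mean here.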

You can move any app so it takes up only a slice of your screen

This just in: yes, you can run them windowed.

This seems similar to Google's tablet/phone strategy taken to its natural extreme. It makes me wonder if Google will soon do something similar with the (currently separate) Chrome/Android ecosystems.

I don't think it will -- it's really hard for MS to provide the tooling for making universal apps more or less feasible, even if they all share most of the WinRT APIs. Integrating Chrome/Android will be much harder.

They really need to go both ways with their cross platform offerings.

I can use their technology to develop apps for other platforms, but I cannot use their technology to run those apps. That requires having the other platform.

It would be nice if I could not only develop cross platform apps for iOS, Android, and Windows Phone, but I could also run them on their Windows Phone platform. I would buy a windows phone if it could do that.

Why would they want to help you provide content for Android?

Because I would do it using their tools, so it would also be content for Windows Phone.

Given the current state, why would I bother making content for windows phone? Why use C# if it's a second class citizen on the majority platforms? Why even learn it if the only thing it's good for anymore is Web API?

Windows could be the universal platform but instead they are still functioning like it's something they could monopolize. Those days are over.

Exciting! This is getting progressively closer to an ideal I call PAO (Personal Application Omnipresence [1]), wherein I use a single set of applications on all my devices. I am very happy to see progress in this direction, but there is still much more to be done in the future.

[1] http://tiamat.tsotech.com/pao

Sounds like iWork.

I see this as equivalent to writing one application that targets both iPhone and iPad and even Android with one app targeting tablets and phones. The only really new thing is support for a single application but the design and UI work still needs to be addressed for each screen size.

Some people here have mentioned this already, but I am quite surprised by all the happiness. It's nice, but... I have been writing apps on my Mac targeting ASP.NET, iOS, Android, WP8, and Win8 with a lot of reuse and perfect frontends for two years. I love F#/C# and it works like a charm. Business-wise it is nice that you have to pay only once, but outside that, what in god's name is so nice or special about this? I was hoping this would include a build system on Azure and a unified system using Xamarin, all from VS. This is nothing at all. Wow, I can target those 3 WP8 users from my Win8 codebase app. Pfff. So is it just fanboys, or does almost nothing change, or am I missing something significant even though I have been quite deep in these ecosystems for years now?

Agreed, it's been possible for years, especially with PCL libraries gaining significant traction last year. But these announcements and improvements have made things even easier than before, and to a certain extent have turned what was once somewhat of a "black art" to layman C# devs into an official development track.

Not sure why you've been downvoted for stating facts. So I +1'd.

Probably because my tone and term 'fanboy'; I said it very carefully even as I am not anti MS or whatever; I use F#/C#/.NET every single day and like it. But apparently that was enough to trigger :) Thanks anyway; I was just surprised why suddenly whole HN seemed to hail in the new messiah... Easier is nice, but...

So is this Silverlight but for WinRT apps instead of WPF apps?

OK, and where is the code?

They must live in quite a bubble if "universal" only includes Microsoft platforms.

I guess Google and Apple also live in a bubble? Not really sure what your point is. Is your expectation that Microsoft Universal apps also run on Android and iOS? In web browsers? Would you like them to use a different term?

It's just that a mobile app that runs neither on Android nor iOS (90%+ of all phones) is not exactly universal.

If all it does is run on all Microsoft platforms, from Windows Phone to XBoxOne, it's just a "Microsoft" app. If Microsoft decides to call what the Xbox runs "Windows" they'll become... Windows apps...

Throwing around words like "universal" suggests a real cross-platform API, not "convergence of Windows with Windows Phone". Lots of people would like to see real cross-platform support, not more vendor lock-in.

It's fine if they only target Microsoft products as a platform; calling it Universal was confusing to me. Of course Google and Apple have their bubbles too, but they seem to be aware enough of the universe they live in not to brand themselves as the universe.

So you are aware that Apple also calls apps that run on both iPhone and iPad as Universal apps?

Just like a universal binary on OSX?

Exactly, except Microsoft has taken it to the logical conclusion we all want, supporting all platforms, and sadly Apple hasn't. I have code that runs on both iPhone/iPad and shares most of it with the Mac version, yet I have to build the Mac version as a separate app and submit it to a different store. That sucks.

I bet that Apple is working on this, and Microsoft just got there first. Not to take anything away from Microsoft - this is awesome!

Totally.. see the appearance of TextKit on iOS using tons of NS* classes instead of UI* classes, and UITextAlignment being replaced with the OSX NSTextAlignment, etc.

I doubt it's a very high priority though, given how small the OS X desktop market is in comparison... it'll just be a very slow merging ;)

Who's to say that, once they complete the purchase of Xamarin, universal won't also include OS X, iOS, Android, and even Linux?

Weirdly, PE executables on Windows are Portable Executables that the system model would permit to run under not just Intel but other architectures.

Sadly the other platforms died so the portable executable is really just the stuck executable.

I'm shocked this didn't happen sooner. If you ever wanted to point to a problem with MS vision you can ask yourself why this didn't happen in 2005.

I think you're underestimating the INSANE amount of vision and foresight that would have been required to do this in 2005.

1. The Xbox 360 was released in 2005. This was essentially the first Microsoft console to run 'apps,' and those had to be architected specifically for the console.

2. The iPhone was years away in 2005. Phones ran shitty little apps that had to be architected specifically for the phone.

3. The iPad was years away. Microsoft tablets just ran XP. The idea of the iPad as a "bigger iPhone" was mocked to no end when it was announced. People thought they wanted OS X on a touchscreen. Microsoft would have thought the same thing.

4. Gmail had just been released. The idea of a true "web app" required remarkable vision and foresight by itself. Additionally, web apps had to be architected specifically for the browser (seeing a pattern?).

Even if Microsoft had seen 5 years into the future and predicted all these changes, the platform they would have created in 2005 wouldn't have had anything to run on.

The idea of a centralized "app store" was not new. People had been complaining about Windows not having Linux-like package management since the 90s.

Similarly, the idea of web apps taking over everything was not new. Microsoft was acutely aware of that, since that was Netscape's big idea (also Java was in theory the same threat - write once, run anywhere, make Windows unimportant). Again, all in the 90s. This is the most fundamental answer as to why MS didn't do this stuff before. Not because of a lack of foresight, but because their foresight was good enough to allow them to identify it as a threat to their monopoly.

Touch screen phones/tablets I agree were less obvious predictions, but even if MS did figure that out early on, it wouldn't have changed their motivation to fight back against Netscape and Java and similar technologies.

But now the tables have turned...

Microsoft had an application store in Windows XP. Remember "Find applications"? They had an online store of sorts that listed programs, but nobody bought anything on it because there didn't appear to be any vendors on it, and CD-ROMs were all the rage back then. App stores have only really taken off as Internet speeds have skyrocketed. That was not the case when XP was released; everyone was on dial-up, or barely faster ISDN if they were lucky.

Good points, but it is also worth considering that in 2005, Microsoft had a much stronger position in the smartphone/handheld market with Pocket PC/Windows Mobile. Even looking at how they named their mobile OS makes you think they should have been thinking about cross-platform apps. Those phones felt a lot like tiny PCs, which was also their weakness.

Microsoft had tablet PCs running Windows in 2001. The price point never worked out. I'd say they had the vision, but the focus on backwards compatibility prevented them from bringing in a new paradigm.
