
Which programming language is used for making Windows 10? - donnemartin
https://www.quora.com/Which-programming-language-is-used-for-making-Windows-10/answer/Axel-Rietschin?share=1
======
TravHatesMe
A project of such scale and longevity always seems to pique my interest. I bet
there are some interesting historical artifacts in that codebase. Why is old
code so fascinating?

Decades ago someone as insignificant as you or I wrote this code. I find it
especially interesting to come across comments in code, which allow developers
to express themselves outside the confines of the language. I feel like a
historian in this context, analyzing this person's thoughts and state of mind.
Perhaps the author included elements of emotion or humour. Who was this
person? An insignificant developer, as I am -- did they ever make something
of themselves?

I think the developer's story is rarely told, but it's a story worth telling!

~~~
aerovistae
I do actually think code archaeology may one day be a sub-discipline of the
field, perhaps in another hundred years.

~~~
KineticLensman
In Vinge's "A Deepness in the Sky" [0] one of the characters is a Programmer-
at-Arms, and one of his roles is essentially software archaeology - not for
the fun of it, but to keep the deeply layered software of a starship
functional.

[0]
[https://en.wikipedia.org/wiki/A_Deepness_in_the_Sky](https://en.wikipedia.org/wiki/A_Deepness_in_the_Sky)

~~~
conover
This is a fantastic book. Software is written and rewritten over thousands of
years, layer upon layer, powering interstellar vessels and their “automation”.
It makes you wonder what Windows will look like 100 years from now, if it’s
still around. Will it be easier to just write another layer on top or dig down
into the original source and modify?

~~~
baroffoos
History seems to suggest that it will eventually get bogged down with too much
junk, but it's too hard to rewrite while keeping compatibility, so eventually
some fresh project without the legacy will be able to become the technically
superior product and everyone will move over to that.

~~~
gmueckl
In theory, yes. In practice, one of two solutions will take hold: either the
old system is put into a container on the new system, or the application
software itself gains an emulation layer for the old system interface during
the port. Either way, not a lot will be gained.

------
ralphc
Back in the late '80s, my company had a program that included getting the
Windows 3.0 source code. Looking through the code, I found the line "goto
wearefucked;" and the accompanying label further down. When we got the source
for Windows 3.1 the first thing I did was check that file, but to my
disappointment the label had been changed.

~~~
IWeldMelons
Yeah, well if you ever traced a Win95/98 app and stepped into the OS code, one
of the internal system APIs was called BOZOLIVESHERE.

[https://devblogs.microsoft.com/oldnewthing/20031015-00/?p=42...](https://devblogs.microsoft.com/oldnewthing/20031015-00/?p=42163)

------
sweden
This post is kind of dramatic. Windows is much more than a simple kernel: it
is a collection of drivers, tools, a desktop environment, a network stack,
basic user applications (like Paint), etc, etc.

If you compare it to the entire Ubuntu project, you will see a project of
similar dimensions.

~~~
laumars
I don't know about Ubuntu specifically, but some distros do ship sources with
their installation media and it usually fits on 1 DVD.

When I say "sources" I am talking about the kernel, user land and desktop
applications too.

However, that would be compressed tarballs and doesn't include git history -
the latter being where I suspect most of the disk space is being used in that
screenshot.

~~~
megous
You're off by an order of magnitude.

[https://cdimage.debian.org/mirror/cdimage/archive/9.7.0/sour...](https://cdimage.debian.org/mirror/cdimage/archive/9.7.0/source/iso-dvd/)

Source code compresses quite well (those dozen source DVDs hold roughly 50 GiB
of compressed tarballs, and source text typically compresses around 4:1), so
uncompressed this may be somewhere between 200-250 GiB.

~~~
laumars
My benchmark was SuSE, which was 1 DVD's worth. However, your post now has me
wondering if that didn't contain the complete sources either - just sources
for server components. That said, those 12 DVDs likely cover way more than the
Windows sources would, e.g. multiple different image editors, LibreOffice
(would Windows sources include the sources for MS Office as well?), and
several different databases (MariaDB, PostgreSQL, Redis, etc). So even that
might not be a fair comparison either.

Unfortunately, without access to that source directory, all of this is just
going to be speculation anyway.

I did also make the point about compression too by the way :)

~~~
megous
Right. It's probably not at the level of the full Debian archive, but those
500GB probably include all kinds of weird services and Windows tools and
programs most users never run or have a use for.

~~~
laumars
Oh I'd guarantee it would. And the vast majority of those libraries will exist
for backwards compatibility too. Much like with the CLI user land on Linux
(GNU coreutils et al).

I think the real proof of the pudding is the footprint of a default desktop
install of Windows vs Ubuntu+. There was a time when Windows would literally
consume an order of magnitude more disk space than Linux after a fresh
install, however I think things have since converged in the middle somewhat.

Going back on topic though, that looked like a dev directory for Windows, so
it would likely have contained a .git directory too. That would easily balloon
the disk space used by any (mature) project's source.

\+ I know you were talking about Debian, but I'm going with Ubuntu now because
it's more of a desktop-orientated distro, and frankly it works in Windows'
favour anyway, due to it installing more software by default than Debian
would.

------
vcanales
> You can spend a year (seriously) just drilling down the source tree, more
> than a half million folders containing the code for every component making
> up the OS workstation and server products and all their editions, tools, and
> associated developement kits, and see what’s in there, read the file names
> and try to figure out what does what.

This makes me really want to know how they manage to document everything to
make such a huge project easy to work with.

~~~
Scrantonicity
Usually when you're working on the scale of operating systems, you have people
focusing on particular areas, e.g. kernel teams, networking teams, graphics
teams, etc., and you also have their corresponding test teams. So no one
person usually has an understanding of the entire source code.

~~~
jkuria
Not even Dave Cutler? I thought he single-handedly wrote the first version of
the kernel. At least that is the legend:

[https://en.wikipedia.org/wiki/Dave_Cutler](https://en.wikipedia.org/wiki/Dave_Cutler)

Lots of great stories about him. One is recounted here:

[https://capitalandgrowth.org/articles/930/xbox-co-founder-my...](https://capitalandgrowth.org/articles/930/xbox-co-founder-my-life-in-games-seattle-hacker-ne.html)

~~~
greenyoda
The kernel is a pretty small part of the operating system. Development of
Windows NT started in 1989[1], when RAM sizes of PCs were measured in MB, not
GB, so there wasn't room for a lot of bloat. (NT 3.1 could run on a machine
with 12 MB of RAM.[2]) Also, it wasn't the first kernel Cutler wrote. While
Cutler is undoubtedly a brilliant programmer, I doubt he could keep the
details of all the other parts of the system (graphics, etc.) in his head at
once.

Other well known kernels were also written by very small teams. The first Unix
kernel was written by two people and the first Linux kernel was written by one
person.

[1]
[https://en.wikipedia.org/wiki/Windows_NT#Development](https://en.wikipedia.org/wiki/Windows_NT#Development)

[2]
[https://en.wikipedia.org/wiki/Windows_NT#Hardware_requiremen...](https://en.wikipedia.org/wiki/Windows_NT#Hardware_requirements)

~~~
jdsully
> so there wasn't room for a lot of bloat

That's not how NT was seen at the time. It was quite bloated relative to other
PC OSs. Of course, modern OS features required a bit more overhead than the
crazy world of Windows 3.0.

~~~
greenyoda
I guess I should have said "a lot of bloat by today's standards"...

My first experience with NT was NT 3.51 on a 32 MB machine. If I remember
correctly, it didn't feel much slower than Windows 3.1. But finally having a
PC operating system that didn't crash all the time really made development
much more pleasant and productive.

~~~
santoshalper
32MB was a lot of memory back then. Typical machines at the time had 8. 16 was
extravagant.

~~~
temac
A lot of affordable machines a few years old even had 4M. I remember my dad
upgraded from 4 to 8 and from Win3.11 to Win95. The computer was maybe 1 or 2
years old, hardly more. It was around 1500€ (well, it was 10000 FRF at the
time) when bought, with a 14" screen.

32M was insane, probably out of reach of all but rich homes.

Only a few years later (1 or 2), a classmate told us he had a computer with
128M. That sounded so ridiculous that we thought he was a mythomaniac or
something. Turns out his dad was able to get a powerful workstation from work,
and the first time we visited him we saw that mythical beast, and our minds
were blown.

And right now the private working set of my start menu process is 27M...

------
Bizarro
Anybody interested in the history of NT should read _Showstopper!: The
Breakneck Race to Create Windows NT and the Next Generation at Microsoft_

It focuses on Dave Cutler, the father of NT.

~~~
pickle-wizard
I read that book a few years ago. I've read several books about various
computer projects, and I usually leave with a wish that I could have worked on
that project. After reading Showstopper, my main thought was: thank $deity I
didn't work on that project.

I think it is because in the book Dave Cutler comes off as an asshole.

------
dboreham
There was Prolog in the network stack configuration system. At least there was
way back when.

~~~
jim_lawless
"The Microsoft NT 4.0 kernel network driver contained a small Prolog
interpreter to help puzzle out network interface cards"

Using Prolog in Windows NT Network Configuration

[http://web.archive.org/web/20040603192757/research.microsoft...](http://web.archive.org/web/20040603192757/research.microsoft.com/research/dtg/davidhov/pap.htm)

------
snazz
500 GB strikes me as quite small. The entire Git repository (it sounds like
Microsoft uses Git internally) must be a whole heck of a lot larger than this
if there are over 60,000 commits every few weeks.

Still, it’s a mammoth project, which makes it all the more impressive that it
works (most of the time).

~~~
antiuniverse
Keep in mind that to manage the insane scale, Microsoft internally hides git
behind a custom virtual file system:

[https://vfsforgit.org/](https://vfsforgit.org/)

[https://github.com/Microsoft/VFSForGit](https://github.com/Microsoft/VFSForGit)

~~~
seba_dos1
Whoa, they actually changed its name from GVFS (which clashed with GNOME)? I
didn't expect that. Cool!

------
missjellyfish
The real question is: is it compiled with the same MSVC compiler that ships in
the SDK? If so, they don't even use C99. C89 all the way. That especially
means no variable declarations in the middle of a function body - only at the
top of a block.
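
A minimal sketch of that restriction (illustrative function names, nothing
from the Windows source): C89 wants every declaration at the top of a block,
while C99 lets you declare variables where they are first used.

    /* C89 vs C99 declaration placement - hypothetical example */
    #include <stdio.h>

    static int sum_squares_c89(const int *values, int count)
    {
        int i;          /* C89: declarations must precede the first statement */
        int total;
        total = 0;
        for (i = 0; i < count; i++)
            total += values[i] * values[i];
        return total;
    }

    static int sum_squares_c99(const int *values, int count)
    {
        int total = 0;
        for (int i = 0; i < count; i++)   /* C99: declare where first used */
            total += values[i] * values[i];
        return total;
    }

    int main(void)
    {
        int v[] = {1, 2, 3};
        printf("%d %d\n", sum_squares_c89(v, 3), sum_squares_c99(v, 3));
        return 0;
    }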

~~~
ComputerGuru
While MSVC doesn't have full C99 support, MSVC supported most of the features
that shipped in C99 that make it much less painful to develop in C quite a
number of years before C99 support formally landed in other compilers. MS will
probably never support full C99 because the demand isn't there, but it's
certainly not ANSI C you are expected to write.

~~~
coldcat
MSVC supports C11 because it's a prerequisite for C++17. The limitation is
that you need to compile as C++, so the very few changes/incompatibilities
between C and C++ always bend to the C++ choices. The preprocessor isn't what
people are used to having in C99, so some macros are still not portable.

~~~
ComputerGuru
To the best of my knowledge (i.e. not updated since C++14), C++ is no longer a
strict superset of C as of C99, and there are C99 features that are not
available in C++, so I'm not sure that's the case. But plenty of overlap, for
sure.

Here's a community wiki answer on SO:
[https://stackoverflow.com/posts/47526708/revisions](https://stackoverflow.com/posts/47526708/revisions)

Extracted portion of relevance:

    > The following C99 features are not supported by C++:
    >
    > * restricted pointers
    > * variable length arrays
    > * flexible array members
    > * static and type qualifiers in parameter array declarators
    > * compound literals
    > * designated initializers
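
As a small illustration (my own example, not from the linked answer), two of
those items - designated initializers and a flexible array member - compile as
C99 but are rejected by a strict pre-C++20 C++ compiler:

    /* Hypothetical C99 snippet using features absent from (older) C++ */
    #include <stdio.h>
    #include <stdlib.h>

    struct point { int x, y; };

    struct packet {
        size_t len;
        unsigned char data[];          /* flexible array member: C99 only */
    };

    int main(void)
    {
        struct point p = { .y = 2, .x = 1 };   /* designated initializers */

        struct packet *pkt = malloc(sizeof *pkt + 4);  /* room for 4 payload bytes */
        pkt->len = 4;
        pkt->data[0] = 0xFF;

        printf("point (%d, %d), packet len %zu\n", p.x, p.y, pkt->len);
        free(pkt);
        return 0;
    }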

------
eclipseo76
I'm actually surprised they use Git internally. I would have thought they had
their own custom VCS. It says a lot about the success of Git.

~~~
simplyinfinity
They used to have their own VCS, but moved away fairly recently, 1-2-3 years
ago

~~~
ardy42
> They used to have their own VCS, but moved away fairly recently, 1-2-3 years
> ago

TFS?
[https://en.wikipedia.org/wiki/Team_Foundation_Server](https://en.wikipedia.org/wiki/Team_Foundation_Server)

~~~
simplyinfinity
No, as a sister comment mentioned, I think it was based on Perforce; they
didn't use TFS for this internal project AFAIK.

------
solarkraft
Does anyone happen to know how the Windows shell (CShell?) is built? Winforms?
WPF? UWP?

~~~
ComputerGuru
None of the above. It's GDI, and the widgets are most similar to those
available via MFC (this is talking about the classic shell, so Metro/UWP
aside).

~~~
maxxxxx
It always infuriated me that MS usually wasn't using the tools they gave to
developers for their own stuff. I think it has got a little better in recent
years. Now they seem to be using UWP to some degree. But before that they
didn't use Winforms or MFC, and WPF only for VS 2010 and up.

~~~
userbinator
If you've ever used any other applications written in pure Win32 you would
know why --- the efficiency difference is enormous. They did start using more
managed code in the system starting with Vista, which resulted in all the
complaints about it being slow and bloated. I suspect there's been even more
of it added since then, because the Win10 explorer.exe on a recent quad-core
i7 system with an SSD still manages to respond more slowly to actions like
opening a folder than XP's on a 15-year-old PC with a regular HDD...
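
For reference, here is a minimal sketch of what a "pure Win32" program looks
like (an illustrative toy, not anything from the Windows source): register a
window class, create a window, and paint with GDI, with no framework or
managed runtime in between. The class name and strings are made up.

    /* Minimal Win32/GDI window - hypothetical example, C */
    #include <windows.h>

    static LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
    {
        switch (msg) {
        case WM_PAINT: {
            PAINTSTRUCT ps;
            HDC hdc = BeginPaint(hwnd, &ps);          /* GDI device context */
            TextOutA(hdc, 10, 10, "Hello, GDI", 10);  /* draw text directly */
            EndPaint(hwnd, &ps);
            return 0;
        }
        case WM_DESTROY:
            PostQuitMessage(0);
            return 0;
        }
        return DefWindowProc(hwnd, msg, wParam, lParam);
    }

    int WINAPI WinMain(HINSTANCE hInst, HINSTANCE hPrev, LPSTR cmdLine, int nShow)
    {
        WNDCLASSA wc = {0};
        wc.lpfnWndProc   = WndProc;
        wc.hInstance     = hInst;
        wc.hCursor       = LoadCursor(NULL, IDC_ARROW);
        wc.hbrBackground = (HBRUSH)(COLOR_WINDOW + 1);
        wc.lpszClassName = "MiniWin32";               /* made-up class name */
        RegisterClassA(&wc);

        HWND hwnd = CreateWindowA("MiniWin32", "Pure Win32", WS_OVERLAPPEDWINDOW,
                                  CW_USEDEFAULT, CW_USEDEFAULT, 640, 480,
                                  NULL, NULL, hInst, NULL);
        ShowWindow(hwnd, nShow);

        MSG msg;
        while (GetMessage(&msg, NULL, 0, 0) > 0) {    /* classic message loop */
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }
        return (int)msg.wParam;
    }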

~~~
maxxxxx
"If you've ever used any other applications written in pure Win32 you would
know why"

I have written pure Win32 and I know that...

------
octosphere
[https://www.wired.com/2014/03/msdos-source-code/](https://www.wired.com/2014/03/msdos-source-code/)

------
agumonkey
Remember the MinWin project?
[https://en.wikipedia.org/wiki/MinWin](https://en.wikipedia.org/wiki/MinWin)

I wonder if it helped shrink the code base too.

~~~
jdsully
Minwin was about moving code around. By necessity it required even more code
to support the indirection and compatibility of programs expecting code in
specific DLLs.

------
superasn
Makes me wonder if there exists an OS that is truly written from scratch,
designed only for modern hardware and devoid of all backward compatibility and
bloat (like the JS frameworks ditching support for MSIE).

It would certainly be a very interesting side project, I guess.

~~~
icebraining
Well, TempleOS (by the late Terry Davis) was designed from scratch to support
only multicore x86-64 CPUs:
[https://en.wikipedia.org/wiki/TempleOS](https://en.wikipedia.org/wiki/TempleOS)

------
athaca
(Serious) How do I become a kernel engineer coming from a background of
JavaScript and Swift?

~~~
saagarjha
Learn C and how operating systems work.

------
monochromatic
That’s a little surprising. I would have figured that outside the
kernel/drivers, it’d be all C++.

~~~
solarkraft
I'd have expected much more to be in C#.

~~~
shaklee3
There's no reason to write OS components in a language that will inherently
give you a performance hit before you do any real work.

~~~
pjmlp
Sure, unsafe all the way.

As a note, Midori did power part of Bing during its existence.

And Microsoft security is now advising moving new development to a mix of C#,
Rust and constrained C++ (Core Guidelines).

~~~
shaklee3
Unsafe sounds like a really bad word, but unsafe usually also means
performant. I know that's not always the case, but usually. Rust falls into
the exception to that rule, for the most part, while C# does not. Also, the
people typically working on these codebases, like Linux, are very senior
developers who have their code reviewed by other very senior people. So while
they have definitely had a lot of security bugs related to the language, we
don't have any examples of other languages implementing something anywhere
near this size that didn't have as many bugs.

~~~
pjmlp
According to Google's security team, 68% of exploits in Linux are due to
memory corruption. Source: their keynote at the Linux Kernel Summit 2018.
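
As an aside, the bug class in question is the familiar one below - a sketch of
my own, not anything from the keynote: an unchecked copy into a fixed-size
stack buffer, which a memory-safe language would reject or trap.

    /* Contrived illustration of memory corruption in C */
    #include <stdio.h>
    #include <string.h>

    static void greet(const char *name)
    {
        char buf[8];
        strcpy(buf, name);   /* no bounds check: names longer than 7 bytes overrun buf */
        printf("hello %s\n", buf);
    }

    int main(void)
    {
        greet("ok");                        /* fits */
        greet("definitely-much-too-long");  /* undefined behavior: corrupts the stack */
        return 0;
    }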

Check the DoD security assessment of Multics versus UNIX regarding ease of
security exploits, and how using PL/I prevented the large majority of them.
I'm on mobile now; the document is accessible at the Multics history site.

Or any deployment of High Integrity Computing OSes for that matter.

~~~
shaklee3
You could also say FreeBSD has far, far fewer memory corruption bugs. Is that
the language, the small user base, or something else?

------
tus87
> Windows 10 is the same (evolved) code base as the code base of Windows 8.x,
> 7, Vista, XP, 2000, and NT

Err....it's been NT all along.

~~~
colejohnson66
They’re referencing way back when (before 2000) when it was just NT with a
version number

------
8456523
> You can spend a year (seriously) just drilling down the source tree, more
> than a half million folders

I would be more inclined to switch to Windows if its source tree were
_smaller_.

My guess is that a big reason I prefer MacOS over Windows is that Apple has
been much more willing to drop support for legacy hardware and old
applications to keep the source code more manageable.

~~~
MagicPropmaker
Oh come on. Be honest. I can tell just from the tone of this post that you'd
_never_ switch to Windows.

~~~
8456523
That's not true.

It is difficult for humans to estimate how they spend their time, but my guess
is that I've spent at least 35 hours recently exploring Windows 10: Changing
various settings, installing software, asking Windows-specific questions of
Google Search.

And I'm writing this on Windows. (Except for the Windows-specific questions,
I'm not counting web use in the estimated 35 hours because of course the web
mostly works the same way across the desktop OSes.)

I think computer users vary drastically in how much they value predictability
in the software-based systems (or "environments") they use, and that I value
predictability much more than the average user does.

I realize that it is unreasonable to hope never to be surprised at the
response the system makes to an action of mine. But my response to being
surprised is to try to understand how I could correctly predict similar
responses in the future. For example, I might try to understand the reasoning
of the designer of the part of the system in question. Or I might search for
ways that I might have misinterpreted the situation. And I don't like it when
I never reach an understanding of the surprising response.

The fact that its source code has half a million folders is a sign that
Windows will never stop surprising me, which, all other things being equal,
makes me less hopeful that spending time getting to know Windows will pay off
for me.

 _Life_ of course will never stop surprising me. But to have any hope of
getting anywhere in life or achieving any goal whatsoever, my brain must be
sufficiently reliable and predictable. I see my computer as an extension of my
brain that helps my brain be more reliable in the ways that will help me to
succeed.

If you see your computer or the web site you are interacting with as a
potential friend with agency of its own, then I can see where you might be
offended by my original comment. I see computers and web sites as tools.
Levers, if you will.

~~~
shaklee3
Windows 10 with WSL has enough Ubuntu features built in now that there's
really no reason to use a Mac, since the "it has a Unix shell" argument
doesn't work anymore. Your choice of applications is much larger on Windows.

~~~
pickle-wizard
With the new ConPTY they also fixed one of my last gripes about Windows: that
the console sucks.

With all the changes that Microsoft is making lately, I really wonder how long
until Windows becomes free as in beer. With their revenue from services
growing, they don't need the Windows licensing revenue as much. Making Windows
free would also allow them to make Windows cloud servers cheaper, allowing
them to grow the ecosystem.

~~~
MagicPropmaker
I switched from Apple to Windows 10 when Apple stopped offering decent "Pro"
platforms. I need NVIDIA cards for the CUDA number crunching I do. (The Mac
faithful say "b.b.but you can run an Nvidia externally over Thunderbolt." Ha!)

I even used to be an Apple employee.

Windows 10 is just fine. With a tiny amount of care to choose hardware that is
all "happy path" (I run dual Intel Xeons, with a SuperMicro MoBo, and now dual
Nvidia 2080 Ti for scientific processing), everything works fine. (Also, don't
run third-party virus scanners. Just use what's built in to Windows!)

And things that don't quite work right on Macs, like 30-bit color, etc, work
great.

I run Linux and FreeBSD in VMs. I don't use WSL much; I just build many things
on Windows under PowerShell if I don't need CentOS or FreeBSD. I find that a
lot of FOSS stuff works great on Linux and native Windows and not so well on
MacOS because of "non-standard" choices about directory locations, etc.

It's best to give up your prejudices and give Windows 10 a try. Microsoft
really did solve almost everything.

~~~
pickle-wizard
My $dayjob gave us all Macs for workstations, which is fine and dandy, but I
spend all day in an RDP session as my job is automating Windows. Honestly,
Windows is fine and is getting better. I'm running the Windows 10 Insiders
Preview, so sometimes things get a little funky, but it still pretty much just
works.

