

Windows is not a Microsoft Visual C/C++ Run-Time delivery channel - nkurz
http://blogs.msdn.com/b/oldnewthing/archive/2014/04/11/10516280.aspx

======
mattgreenrocks
A lot of 'real' ( _cough_ ) hackers like to look down on MS, but Raymond Chen
is one of those guys I'd _hate_ to get into a technical argument with: he has
experience, is extremely sharp, and is very sarcastic. Those three attributes
make Old New Thing my favorite MS blog even though I don't write Win32 anymore.

If anything, Windows' level of backward compatibility is a giant cautionary
tale: enable poor behavior from devs, and it will proliferate. You cannot
trust app devs to do the right thing; they need to be forced to, whether by
gatekeepers at app stores or by OS restrictions. It is a tragedy of the commons.
Whether it's inane programs inserting themselves into the systray,
'preloaders' for bloated apps (which slow startup), browser extensions,
Explorer add-ons, or other garbage, app devs still seem to do a fantastic job
of gunking up a Windows install.

This is why it's a bit of a blessing that webapps can't do much; because the
more powerful they become, the more annoying and inane they will be.

~~~
Too
This is why I'm against the commonly praised "Be conservative in what you
send, be liberal in what you accept" principle. Being liberal in what you
accept only gives you compatibility nightmares in the future. Another perfect
example of this is HTML, and it is the reason why every browser has to
reimplement all the bugs of Internet Explorer 6 in order not to "break the
internet". Really, why should a browser try to correct a corrupt document
with trailing tags and open attributes? Beginner developers/designers will
simply think "looks good on my machine, deploy and forget" and then we are
stuck supporting that kind of corruption forever.

~~~
TwoBit
You must be my doppelganger, because I've been saying exactly this for some
time. Internet Explorer caused so much trouble partly because it accepted so
much bad HTML. It would close tags for you, even.

~~~
ygra
The automatic tag closing behaviour is even specified now.

But look at it another way: imagine browsers accepted only valid HTML and
every browser had some bugs in what they consider valid. Writing a document in
a way that it displays _at all_ might be a challenge already then. Add to that
that not everyone is on the latest browser version (that was probably more
true in the 90s as well) and you get all the current pains just with worse
symptoms. I'm not really sure that's better.

~~~
evacchi
I may be wrong here, but isn't implicit tag closing a consequence of HTML
deriving from SGML? Mandatory tag closing is something that came with XML,
AFAIK.

~~~
fhars
Yes, but then nobody implemented the SGML definition of HTML correctly.
According to the standard, browsers should have parsed

    
    
        <a href=foo/bar/baz>qux</a>
    

as

    
    
        <a href="foo">bar</a>baz>qux</a>
    

since it did not only include optional closing tags, but also short tags.
(This is incidentally why several SGML-based HTML validators used to complain
about "end tag for element A which is not open" in the presence of href
attributes without quotes.)

------
rossy
The discussion in the comments is interesting. MinGW, the compiler for VLC,
LibreOffice, and most other FOSS projects on Windows, does exactly what Raymond
says not to do. In fact, the entire purpose of MinGW is to make GCC able to
target msvcrt.dll. Developers and users love this, since they don't have to
distribute and install the CRT with the program. Developers releasing under
the GPL aren't even allowed to distribute the CRT with their program, so they
have to use one already present on the system.

Though Raymond is correct. The MinGW guys should probably "write and ship
their own runtime library," or at least use the msvcrt from ReactOS or Wine.
That should make it possible to statically link to it.

Having said that, Microsoft probably won't change their msvcrt.dll in a way
that breaks MinGW software, since their users will complain that they broke
VLC.

~~~
rwallace
MinGW relies on dynamic linking? I'm surprised at that because when I ran a
test just now, Microsoft C++ compiles _Hello World_ to an 84k executable
whereas MinGW generates a 261k executable. I always assumed MinGW was doing
the right thing and linking statically against (its own version of) the
standard library; if not, where does the extra file size come from?

~~~
aninteger
That's probably because you wrote C++ and are using C++ streams. Try just
using printf. The binary, after stripping, will be much smaller (I think less
than 20k). You can use Dependency Walker to check which DLLs you are linking
against.

~~~
rwallace
Ah, I was using C, but I didn't realize I needed to strip the binary. After
doing that it comes down to 39k, which is smaller than the Microsoft
equivalent, so you're right, it does seem to be using dynamic linking :(

~~~
userbinator
For comparison, VC6 defaults give 40K (statically linked), switching to
dynlink shrinks it to 16K, add a few more switches and it drops to 1.5K.

Is it possible to do that with the latest versions (i.e. linking to their CRT
instead of MSVCRT.DLL), or are they incapable of doing it for some odd reason?
These are the options I used:

/MD /O1 /Os /link /align:4096 /filealign:512 /merge:.rdata=.text
/merge:.data=.text /section:.text,EWR

------
malkia
71, 80, 90, 100, 110, 120 - did I miss any of these? Six different "C" runtime
versions (apart from side-by-side sub-versions) for a compiler released over a
span of 10 years.

It's not that I like the MSVCRT runtime. It's just that I have to target it.
Any popular commercial product with some form of plugin architecture through
DLLs (Autodesk, for example) more or less requires one to compile one's own
plugins with the exact runtime version the main application was compiled with.

It's a bit of a strange moment: one developer cries that OpenSSL shouldn't
have used its own malloc implementation, and then another cries, don't expose
a malloc/free interface (but do expose your_api_malloc, your_api_free), so
that you can target any "C" runtime.

Now, these are two completely different things, but not by much. What if, say,
OpenSSL used the runtime's malloc - which version of MSVCRT.DLL would they
target? Does anyone really expect them to target all these different versions,
with all these different compilers, some of which you can't even find free
versions of through MSDN anymore?

(Now I'm ignoring the fact that you can't easily hook malloc and replace it
with a "clear-zero after alloc" function, but that's just a detail.)

What I'm getting at is that there are too many C runtimes; hell, DirectX was
better!

I only wish MS had somehow made MSVCRT.DLL the one and only DLL for "C"
(C++ would be much harder, but it's doable).

~~~
anon4
Can't you distribute plugins as an archive of compiled but unlinked .obj
files and link them on import against the runtime you use?

~~~
malkia
Even if that were possible, you'd need a linker. Also, recent versions of MSVC
embed a pragma that contains the version number of the MSVCRT DLL, and once a
.c/.cpp file has been compiled with a specific version, that version is
recorded in the .obj file, so if you try to link it against another version
it'll complain. Recently they've added the same for debug/release builds
(NDEBUG, _DEBUG), so you can't mix them - but I think this was done only for
C++, because of the STL.

------
leaveyou
I grew up with Windows & Windows dev tools and always had a pragmatic
appreciation for them, but lately I've become quite bitter and cold towards
Windows and Windows developers, and this happened long before I started using
alternatives. At one point it was very hard to avoid the feeling that
something, somewhere deep in Windows, is rotten. With every new Windows
version from WinXP on, I had the feeling I was being sold a slightly improved
technological improvisation with a slightly better theme applied to it. Visual
Studio and the .NET Framework tried hard, and nearly succeeded, to hide the
messy roots of the problem, but the stench of thousands of leaky abstractions
can still be felt every time you open one of the many doors leading to
Windows' basement. The C/C++ foundation of Windows (the shaky ABI) only makes
it worse, and in my opinion many major problems of modern software can be
traced to unsolved problems in C/C++. I have a dream (a delusion :D) that
despite the accelerating patches applied to both (coincidence?) of these
technologies (C89,99,11,1X; C++98,03,11,17; Win Vista,7,8,8.1,9) they will
both end in the greatest "oh, f*ck it!!" in the history of software and be
abandoned for something much better, which I hope will crystallize out of
these ~70 years of computer science.

------
gilgoomesh
Why can't Windows _officially_ include standard versions of this library? You
know, like Windows already does with .NET versions since Vista, or like every
other OS does with libstdc++ or libc++? Forcing every C/C++ program to bundle
its own MSVCRTXX.dll is pretty silly.

~~~
api
I must confess to statically linking my Windows binaries. It's better than
making the user click through extra installers-within-installers to install
the right MSVC redistributable.

It also seems to paradoxically improve load times. Shrug.

~~~
jevinskie
Yes, resorting to static linking is a pretty good solution for executables.
Pity about the bloat, but it works! How would static linking work with
libraries? Would you get multiple heaps?

~~~
delhanty
Sometimes you do get multiple heaps, but it can be made to work if one is
careful not to allocate on one heap and then free on the other. One example
is a C/C++ COM add-in for Excel, say: instead of freeing the Excel-allocated
memory directly, the add-in DLL invokes AddRef/Release appropriately on the
IUnknown interfaces of the various objects it gets handed by Excel. That can
be automated a bit with ATL and CComPtr. A similar factory / smart pointer
API can be made to work for any DLL that wants to maintain its own heap.

~~~
CamperBob2
_Sometimes you do get multiple heaps, but it can be made to work if one is
careful not to allocate on one heap and then free on the other heap_

Agreed, and it's sad to think how much confusion and grief would have been
saved if they'd just made this a mandatory policy from day 1.

Windows developers regularly jump through an entire maze of hoops -- and make
their users do the same -- just so they can call free() on a pointer they got
from some random DLL, or do other goofy things that they shouldn't be allowed
to do.

------
rwallace
Microsoft C++ supports static linking with the standard library, which you
should be using for release builds. That way, your program will always be
using the exact version of the standard library that it was tested with, and
it's guaranteed not to interfere with anything else on the target system.

~~~
jevinskie
What can you do for the situation where you ship a library instead of an
executable?

My team's product is moving to statically link libstdc++ and libgcc thanks to
the runtime linking exception in GCC. It will allow us to use C++11 on old
systems. We are grateful that libstdc++ is backwards compatible to GCC 3.4
(other libstdc++'s can link against us)!

I'm looking into statically linking musl libc so our product could (in theory)
run on any 2.6/3.0 kernel. =)

~~~
vetinari
glibc itself is not supposed to be linked statically, because it dlopens other
shared libraries - for PAM, for nsswitch, etc. These are system specific (as
configured by the admin), so there is no one size fits all.

You may statically link another libc, but then users may wonder why your app
does not work as it should when, for example, they configure nsswitch and your
app does not respect it.

------
csense
Does Linux suffer from this problem?

 _I think_ the answer is "no" because Linux distros generally recompile the
world with each new major standard library version. If any C standard library
gurus are reading this, feel free to chime in!

~~~
deniska
When the glibc guys changed the memcpy implementation in a way that altered
its behavior for undefined (overlapping) inputs, it broke Flash Player and
Linus got pissed off. AFAIR they reverted the change.
the change.

[https://bugzilla.redhat.com/show_bug.cgi?id=638477#c129](https://bugzilla.redhat.com/show_bug.cgi?id=638477#c129)

~~~
cesarb
IIRC, they later used symbol versioning so that only newly compiled programs
get the real memcpy. Older programs get a fake memcpy which simply calls
memmove, which is what Flash should have been calling instead of memcpy.

------
jevinskie
Could symbol versioning, like ELF has, help the situation? I know that glibc
has made backwards incompatible changes and they up the symbol version when
they do so. I don't know if that handles changes in struct sizes though.

------
userbinator
Sorry, but I'm not going to link to a huge convoluted mess of variously
versioned DLLs just to get standard C library functions. Maybe the situation
is different with C++ (probably due to the lack of a real ABI standard), but
the C library functions shouldn't change, since they were standardised. If an
application breaks because the _internals_ of a library were changed, that's
none other than the application's fault.

~~~
greenyoda
_" If an application breaks because the internals of a library were changed,
that's none other than the application's fault."_

I agree 100%, it's certainly the application's fault, but Microsoft has some
very large corporate customers that it wants to keep happy, so in some cases
it will eat the cost of cleaning up after sloppy developers.

I don't work for Microsoft, but the company I work for (we write expensive
enterprise software) has a similar practice of bailing out customers who get
into trouble by relying on undocumented behaviors... as long as the customer
is important enough that alienating them could cost us significant future
revenue. Being willing to provide support like this can differentiate you from
your competitors. Not to mention that you can also charge customers extra for
this higher level of support.

 _" Sorry, but I'm not going to link to a huge convoluted mess of variously
versioned DLLs just to get standard C library functions."_

You don't really have to do that. If you build your application with Visual
Studio 2013, you link against the version of the runtime library that it uses,
and your application's installer installs the redistributable DLL for that
version (if it's not already installed on the customer's machine). You don't
have to deal with any other versions.

~~~
userbinator
> and your application's installer installs the redistributable DLL for that
> version

I don't want an installer, nor to include the redistributable which is around
2 orders of magnitude larger than the application itself! I want a single tiny
binary, just download and run on any system, and that's what using MSVCRT
allows.

~~~
bananas
What's wrong with an installer? An unattended installer or even a click
through one is no obstacle these days.

~~~
userbinator
\- I write my applications to run anywhere and not require any installation.
There is a huge convenience factor in this. As a corollary, they also don't
require _un_ installation, since they do not "embed" themselves into the
system (registry, config files, additional files elsewhere, etc.).

\- It makes for a bigger download of code and data that will never be used
again, adds one extra step the user has to do before use, and adds an extra
step for me in building the installer.

I can see the advantages of an installer for large applications that need to
be "installed" since they modify various other things in the system, but
that's not the type of applications I work with.

~~~
bananas
The problem with that is that your software is a bad citizen in the Windows
ecosystem. Even a one-shot can be pushed out or run via an MSI package with no
install. And you get the benefit of being able to easily throw it out via GPO,
pull in dependencies like VCRT versions, etc. MSI packages are pretty small as
well - our current VSTO package, which has about 25,000 LoC in it, is only
400k.

~~~
userbinator
> your software is a bad citizen in the windows ecosystem

According to MS who seem to be trying to force the idea that all software
needs to be "installed" and have an overly complex installer/uninstaller. I
care more about the flexibility and convenience for my users than what MS
thinks (which is ultimately heading towards locking users in a walled garden,
so no surprise that "just download and run a self-contained executable
anywhere" is perceived as somewhat of a threat.)

> And you get the benefit of being able to easy throw it out via GPO, pull
> dependencies like VCRT versions etc.

Those are not needed for what I'm doing, so they're not benefits to me. But I
guess if your idea of "pretty small" is 400k, then that's not really a
concern...

------
rbanffy
"This meant not changing the size of the class (because somebody may have
derived from it) and not changing the offsets of any members, and being
careful which virtual methods you call" scares me.

I understand DLL technology has been around for a couple of decades now, and
that it solved many difficult problems when it was introduced, but, still,
this looks insanely fragile.

------
Aardwolf
Why does a program compiled with VS need a specific DLL anyway?

Can't VS compile FOR Windows, with only the DLLs Windows provides by default?

~~~
ygra
Sure, just don't link against the C++ runtime library and you're good. Of
course, you then won't have anything said runtime lib provides. Or link the
runtime statically, as suggested in a few other posts - though your binary
size will increase.

~~~
to3m
Here's an article about it: [http://www.drdobbs.com/avoiding-the-visual-c-
runtime-library...](http://www.drdobbs.com/avoiding-the-visual-c-runtime-
library/184416623)

Without the runtime library, strictly speaking you're no longer programming C
or C++, hence the surprises...

