It's not very user-friendly. Having stand-alone applications and double-clicking them is nice; having to extract an archive and add your own shortcut to the executable isn't.
Distributing directories over the web sucks.
Both of these problems result in installers which encourage other undesirable behaviours (think: shared libraries, using the registry).
Users can't be sure that applications are standalone, meaning end users don't get most of the advantages of bundles because you can't rely on them working.
Without top-down guidance or enforcement, developers will do whatever the hell is convenient for them, without regard to what's good for the end user. The Old New Thing has detailed this principle again and again over the years.
On top of that, many developers don't even like to accept that the shared-dependency issue has, in practice, mostly been solved by not sharing dependencies, and that the reasons for sharing libraries generally aren't compelling anymore. I'm still grumbling that the developers of Haiku think implementing a package manager with dependency resolution is a good idea, years after everyone else figured out that it isn't.
(Note: Linux is an exception. Package management with dependency resolution is necessary because Linux as an operating system is ultimately a web of interweaving dependencies rather than a coherent whole. It's a necessary, if not ideal, workaround for the problem.)
It's true that the resources for a Windows app can all reside in a single directory, but Mac OS wins because it enforced a more consistent standard: app bundles are treated as self-contained applications, and Mac developers were encouraged to follow that example, which has yielded a more consistent user experience when installing and removing apps. Windows developers have had too many installation strategies to choose from, with little help from Microsoft (it once offered a weak setup toolkit on MSDN, but Windows never really had a consistent built-in app-installation API; the OS didn't even provide a simple copyfile()-style install primitive). As a result, every Windows app's install and removal process is entirely different. There's even a market for competing install toolkits on Windows, which sums up the overall user experience there: inconsistent.
I can't double-click a directory to run the program contained inside that directory. The system also doesn't recognize a directory as a "program" at all, it's just a file system directory.
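For contrast, a macOS app bundle is just a directory with a conventional internal layout that the system recognizes as a single launchable object. A minimal sketch of that layout (the app name is a made-up example, and the Info.plist is pared down to the one key that matters here):

```shell
# Sketch of a minimal .app bundle: a plain directory tree that the
# system treats as one double-clickable "program".
mkdir -p MyApp.app/Contents/MacOS
mkdir -p MyApp.app/Contents/Resources

# The executable itself lives inside the bundle.
cat > MyApp.app/Contents/MacOS/MyApp <<'EOF'
#!/bin/sh
echo "Hello from inside the bundle"
EOF
chmod +x MyApp.app/Contents/MacOS/MyApp

# Info.plist tells the launcher which executable to run on double-click.
cat > MyApp.app/Contents/Info.plist <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<plist version="1.0">
<dict>
    <key>CFBundleExecutable</key>
    <string>MyApp</string>
</dict>
</plist>
EOF
```

The point is that nothing here requires an installer: copying the directory installs the app, deleting it removes the app, and the "is this a program?" question is answered by the directory's structure, not by registry entries.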
Windows doesn't have package management. Period. Trying to rename something to look like a "package" is disingenuous.
It is true that a Mac's application bundle is preferable to Windows's entirely undefined method of organizing apps. However, it is just not true that the bundle solves all app installation and removal issues. Any given app can leave files in a number of different locations under /Library. To really remove an app, it is necessary to remove its files under Application Support, Extensions, Preferences, LaunchDaemons, etc. Some apps even place their files in the verboten /System/Library/Extensions, or add files to /usr/local.
All in all, it is still somewhat better than Windows, where it is still possible for apps to dump DLLs directly into C:\Windows\System32. Most Mac apps obey the convention that /System means what it says: it's for Apple-supplied system files and is not to be touched by anything else. Regardless, it's still a pain to remove apps on a Mac, especially since there is still no standardized or centralized method for doing so.
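Since there's no centralized uninstaller, a thorough removal usually means hunting for leftovers by hand. A rough sketch of that hunt as a shell helper (the function name and app name are hypothetical; it scans the usual per-app Library subdirectories and only lists matches, so you can review before deleting anything):

```shell
# Hypothetical helper: list likely leftovers for an app under a Library root.
# Prints candidate files/directories for manual review; deletes nothing.
list_leftovers() {
  root="$1"
  app="$2"
  for sub in "Application Support" Preferences Caches LaunchAgents LaunchDaemons; do
    # -maxdepth 1 keeps this to the top level of each conventional location.
    find "$root/$sub" -maxdepth 1 -iname "*$app*" 2>/dev/null
  done
}

# Typical usage: check both the per-user and the system-wide Library trees.
list_leftovers "$HOME/Library" "ExampleApp"
list_leftovers "/Library" "ExampleApp"
```

This is exactly the kind of per-app guesswork a standardized uninstall mechanism would make unnecessary.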