DLL Hijacking Just Won’t Die (textslashplain.com)
229 points by ingve on Dec 27, 2015 | 94 comments



I've used this behaviour --- non-maliciously --- multiple times in the past to override APIs for logging/tracing/debugging purposes. For example, I have a set of DLLs[1] that you can put in the same directory as the .exe of an application, and it will log all network traffic that the application generates. Very useful and convenient compared to the alternatives. Another use is to work around compatibility problems.

What I find more saddening is the trend to view any behaviour that could potentially be exploited as a vulnerability, regardless of how useful it could be, which just leads to locked-down user-hostile systems where nothing is possible without going through some sort of ridiculously bureaucratic excess of process.

Thus, I think the root cause of this problem is not with the DLL loading behaviour, but with this...

The bad guy just navigates a frame of your browser to the DLL of his choice and, if you’re on Chrome or Microsoft Edge, the DLL is dropped in the Downloads folder without even asking

...entirely unconsented file download. The design of putting all downloads in one place is a small contributing factor.

[1] http://www.netresec.com/?page=Blog&month=2011-01&post=Proxoc...

Edit: we can also exploit this behaviour benevolently by putting a set of DLLs in the Downloads folder that would be loaded by any installers being run from there, which could do things like sandboxing/install logging. (Presumably browsers would be not so brash as to overwrite an existing file of the same name in there!?) It's not so bad after all...


It doesn't need to be the default/transparent behavior, does it? Mechanisms like LD_PRELOAD can be used for the same purpose in a more controlled way.

Not having the current directory in PATH / LD_LIBRARY_PATH is well-known wisdom now, to avoid inadvertent interference. I suspect the Windows behavior is more of an oversight, designed when the environment was less hostile; now they can't change it because of backward compatibility. It's not a vulnerability per se, but a bad choice in retrospect, imho.
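To make the contrast concrete, here's a minimal Python sketch of why LD_PRELOAD is the "more controlled" option: the override is explicit and per-invocation, affecting only the process you launch with the variable set. (`/opt/trace/libtrace.so` is a hypothetical logging shim, not a real library.)

```python
import os
import subprocess

# LD_PRELOAD interposition is opt-in: only the child process launched
# with the variable set sees the override, unlike Windows' implicit
# directory-based DLL search. "/opt/trace/libtrace.so" is hypothetical.
env = dict(os.environ, LD_PRELOAD="/opt/trace/libtrace.so")

# If the shim doesn't exist, the dynamic loader prints a warning to
# stderr and ignores it; the target program still runs normally.
proc = subprocess.run(["/bin/true"], env=env,
                      capture_output=True, text=True)
print(proc.returncode)  # 0
```

The point is the scoping: nothing outside this one subprocess invocation is affected, whereas a DLL dropped next to an .exe silently changes behavior for anyone who runs it.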


It was a design decision from a time when software management was non-existent. "Installing" meant creating a directory in the root of the hard drive and copying all your files there. Shared libraries didn't exist.

But this type of exploit existed even under MS-DOS, in the form of BBS door hacks. Remote access systems would let you run external programs connected to the serial line. But some would leave the CWD in the download directory after an upload, allowing you to send a file with the name of an external program; then, when you activated that program, you had a shell.

The solution then still applies today: Don't run external programs without sanitizing your environment. Quarantine all uploads. Don't allow the remote client to control the filename of uploads.

And it's not like Windows doesn't have an executable bit on files. The shell already prompts you for permission to run downloaded programs. Why is it not clearing execute permission by default then adding it when the file is confirmed safe? (I know from experience that a non-executable ACL will prevent a DLL from loading.)


That was added as part of XP SP2 as an ADS I think.


> I suspect the Windows behavior is more of an oversight

I prefer to have dynamically loaded libraries (DLLs), but never shared locations, other than for OS-level libraries and runtimes (which are rarely installed by applications). Everything I manage myself I prefer to keep in the application directories. This is also the normal behaviour of application installers on Windows.

    \app1\app1.exe 
    \app1\somelib-1.0.dll

    \app2\app2.exe 
    \app2\somelib-1.1.dll
Now it is pretty reasonable that the application searches for dependencies somewhere relative to the directory of the application. This has several benefits, but a few security drawbacks: 1) hijacking is possible since the executable's directory is searched, 2) if there is a security vulnerability in somelib, you have to patch all applications that use it. I much prefer this to the idea of applications installing files to a shared location on my disk, however.


> What I find more saddening is the trend to view any behaviour that could potentially be exploited as a vulnerability, regardless of how useful it could be, which just leads to locked-down user-hostile systems where nothing is possible without going through some sort of ridiculously bureaucratic excess of process.

I'm going to take a viewpoint counter to this. One of the things that constantly frustrates me on my desktop computer is my inability to run untrusted third-party code outside my browser. The desktop was simply not built to do that, and I wish it was. I'm currently judging games made for the 34th Ludum Dare game jam, and the way I do it safely is by creating a completely separate user account in Windows, and playing the games there. This is inconvenient. I'm not going to play the games under my normal account, because I simply don't trust the code, but Windows gives me no other option. I know my paranoia puts me in the minority, because after the jam is over, I uninstall Java and Flash and the other "high-risk" pieces of software, and I'm pretty sure most other people don't do that.

On the other hand, the recent SIP lockdown on OS X and the way it's affected dtrace is not something to be ignored either. But given the choice, I'd rather have a dozen dialogs ask permission rather than one piece of malware slip by uncontested.

Phones may be a nightmare in terms of hackability, but at least I can download a random application without worrying about whether it's CryptoLocker in disguise or something like that.


Dozens of dialogs lead to "dialog blindness", where users blindly accept dialogs to get their thing to work.

I think sudo works because it requires some expertise to use a terminal, even though it's conceptually the same as a Windows UAC dialog, for example.


Right, but I was responding to the viewpoint that not all exploitable holes should be closed. At least with dialogs, a user has the possibility to make an informed choice. With "vulnerable by default", you don't get to make a choice at all.


I think you should use a virtual machine rather than a separate user, for what it's worth.


Why? That sounds like even more work, does it provide any benefits? (The guest would need solid graphics acceleration.)


Local user to root privilege escalation is generally easier than a VM escape. Though both are possible.


Weighing the risks. Cost of protecting against root escalation versus likelihood of a root escalation exploit on a patched Windows computer, and cost of using a different user to play the games against the likelihood that someone typo'd a script and it wipes out my home directory. If I really wanted security I'd unplug from the network and refuse to run third-party code at all.

See Myth II: http://minimaxir.com/2013/06/working-as-intended/

This is the type of scenario I'm protecting against, as well as unsophisticated but malicious actors.


You could get a perfectly working GPU by reimaging a second partition with a fresh install every time, just like the PCs in high school.

Not that that's any less work!


A VM would be less work because you wouldn't have to install then uninstall Flash each time.


Three times a year... versus the time it takes to set up a VM.


> and the way I do it safely is by creating a completely separate user account in Windows

Leaving aside the fact that this is not advisable (the user-to-admin boundary is a lot more porous than the remote boundary), I hope you're at least running under a local non-admin account, because of a certain peculiarity of Microsoft's[1]. Microsoft doesn't regard UAC as a security boundary for admin accounts and won't fix vulnerabilities that enable software to bypass UAC completely. Under the default configuration in Windows 7 and 8 (not sure about 10), all software thus effectively runs as root. Also check what parts of the filesystem games will have access to under that account, because by default only files in a few system folders are protected, while executables outside them can be overwritten.

If you can, I'd suggest using a dedicated machine for testing. If that is not possible and you have compatible hardware, then use virtualization with GPU passthrough, which can speed things up to usable levels. If neither is an option, you could test games on a separate hard disk while having the main one physically disconnected, which would leave your attack surface open only to firmware overwrites, which aren't very common.

[1] http://www.istartedsomething.com/20090611/


> What I find more saddening is the trend to view any behaviour that could potentially be exploited as a vulnerability, regardless of how useful it could be, which just leads to locked-down user-hostile systems where nothing is possible without going through some sort of ridiculously bureaucratic excess of process.

Automatic downloading falls into the category "kind-of useful and convenient, but exploitable". Maybe you want to reconsider your argument?


I found the argument quite clear, as he explicitly says that automatic downloading is the problem here and that it's not happening in Windows, but in Chrome and Edge (just tried the test and my Firefox asked me if I want to download the dll).


Would UWAs be vulnerable? This seems like a good reason to abandon Win32 APIs anyway. If each process can be sandboxed from an executable's perspective, it should reduce the risk. If UWP/UWA apps were the majority, there would be fewer Win32 installs to exploit.

I think this could be a multi-pronged solution. Why not attack both sides? Edge and Chrome should correct their behaviors but Windows should also patch things up with DLL loading to reduce the risk. Maybe a signed manifest could verify the assemblies an app is loading and only authorize execution if it passed a UAC prompt explaining why it is blocked.

As a developer, I appreciate that you can use other DLLs too, but this feels like it should be the exception more than the rule.


Another trick:

libkeepalive: library preloading to make Linux programs support TCP keepalive

http://tldp.org/HOWTO/TCP-Keepalive-HOWTO/addsupport.html


proxocket is cool. Could it decrypt schannel read/writes?


This trick is used by a number of game mods, by way of providing a fake DirectX dll that monkeys with the internal game code and then calls the actual DirectX library.

ENBoost is probably the most prominent at the moment, and is considered nigh mandatory by modders for Skyrim, the Fallout series, etc, because it lets you override the game engine and use free RAM (beyond the game engine's 2GB or 4GB limit) as an extended cache for your video card.


I personally use this trick on Rocksmith 2014 to add custom DLCs


But this should also be possible without the game executable looking up DLLs in its CWD? A mechanism analogous to LD_PRELOAD on Linux should work equally well, or the mod could provide its own binary, load its DLL, and then run the main game binary?

In other words, I don’t understand why “load DLLs from CWD” is vital to the workings of such mods?


Well it's easier to explain to users of the mod at least (dropping files in a folder). Also replacing the binary is problematic because that will cause updating and DRM problems.


With an LD_LIBRARY_PATH they'd have to drop 2 files into the folder: the DLL and a wrapper that sets the library path. I don't see the issue...


Yes of course. But then the users have to run the game with the wrapper file. Not a large inconvenience though. The Linux system is better.


Really bad consequence of the current/same directory being in the library search path. It's similar to why, on Linux/Unix, it's bad practice to include "." in your PATH (or LD_LIBRARY_PATH). I know including same-directory libraries can make things more convenient, but it should be something that an app explicitly needs to want and enable, not something that just happens.


> It's similar to why, on Linux/Unix, it's bad practice to include "." in your PATH (or LD_LIBRARY_PATH).

An even trickier problem on Linux is that an empty component in the LD_LIBRARY_PATH is interpreted as a reference to the cwd. Many tools come with launcher scripts that do something like:

    export LD_LIBRARY_PATH=/usr/local/lib/my-app-1.0:${LD_LIBRARY_PATH}
    /usr/local/bin/my-app-1.0
... and bam, now if $LD_LIBRARY_PATH was empty (which it usually is) you've unintentionally added the cwd to it. This problem is fairly common in practice.

FWIW, the correct way to set the variable is:

    export LD_LIBRARY_PATH=/usr/local/lib/my-app-1.0${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}
(or make sure the binary has a DT_RUNPATH entry that points to the library directory, so you don't have to mess around with environment variables at all.)
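The same guard is easy to express in code. Here's a hedged Python sketch of the safe prepend (the helper name is mine, not any standard API), contrasting it with the naive join that silently adds the cwd:

```python
def prepend_search_path(new_dir, existing):
    """Prepend new_dir to a colon-separated search path without ever
    emitting an empty component, which ld.so reads as the cwd."""
    return new_dir if not existing else new_dir + ":" + existing

# The naive new_dir + ":" + existing form leaves a trailing empty
# component when existing is "", i.e. it silently appends the cwd.
print(prepend_search_path("/usr/local/lib/my-app-1.0", ""))
# -> /usr/local/lib/my-app-1.0   (no trailing colon)
print(prepend_search_path("/usr/local/lib/my-app-1.0", "/opt/lib"))
# -> /usr/local/lib/my-app-1.0:/opt/lib
```

Same logic as the shell `${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}` idiom above: only emit the separator when there's something to separate.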


I wonder if changing this behavior would break anything. I can't imagine too much software relies on it.


IMO, loading by hash and/or signature is a better solution.


Typical result of developers overestimating users. If a browser developer sees that some file suddenly downloaded automatically, he would notice it instantly and delete it if it's something fishy. But a lot of typical users wouldn't notice (the kind that never closes download bar in chrome anyway) and would never clean up their downloads folder. Damn, a lot of them would even run it to find out what it is.


No, I think it has more to do with the way that executables on Windows are 'tagged' as having been downloaded from the internet. When a user opens one of them (such as an installer) they get a prompt if they want to run this code from the internet. Generally, this is considered a good thing.

Browsers, in a stroke of unassailable logic, have decided that the existence of this new security feature should mean they should now start silently downloading executables from websites. After all, what good does it do to have two pop-up warnings that users simply click through?

This is just my impression, I could be wrong about all this.


> they get a prompt

Yeah. About prompts and real users. In my experience, once a user clicks through a given prompt 5 times, he's completely trained to close it automatically without giving it the slightest regard.


In this case, the user clicking through the prompt isn't the issue. They wanted to run the non-malicious installer and so they did. If they read and re-read the prompt, they would still make the same decision.


This is an interesting point because there's no corresponding warning when loading a DLL from the internet zone. hm.


Interesting aside...I downloaded the dll this afternoon just for laughs to see what the fuss was. Since I didn't want to dig around for an installer that would trigger the hook, I just left it in my download dir.

Just now, I decided to mess with Steam again after a long hiatus...guess what popped up when I ran SteamSetup.exe?


Can someone explain all the steps here?

I get that:

1) the browser will silently download a DLL to the Downloads folder

2) the installer does something that causes the DLL to run, compromising the machine.

I get why people don't like (1), but why is (2) actually happening? Are the DLL hijackers exploiting name collisions, is an API being abused, or something else? And is there a robust way to prevent this even if browsers persist in silent downloads?


The linked article [1] has an explanation:

  In simple terms if an application (e.g. Test.exe) loads
  a DLL (e.g. foo.dll) by just the name, Windows follows
  a specific search order depending upon whether
  “SafeDllSearchMode” is enabled or disabled to
  locate the legitimate DLL.
So it looks like the installer is legitimately attempting to load some DLL but either hasn't fully qualified the name or hasn't enabled this flag.

[1]: http://blog.opensecurityresearch.com/2014/01/unsafe-dll-load...
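A toy Python model of that search order makes the trap visible (directory lists and the helper names are illustrative, following the documented order; note that SafeDllSearchMode only demotes the cwd, never the application's own directory):

```python
def dll_search_order(app_dir, cwd, system_dirs, path_dirs, safe_mode=True):
    # SafeDllSearchMode moves the cwd below the system directories,
    # but the application's directory is always searched first.
    if safe_mode:
        return [app_dir] + system_dirs + [cwd] + path_dirs
    return [app_dir, cwd] + system_dirs + path_dirs

def find_dll(name, search_dirs, files):
    # 'files' is a set of paths standing in for the file system
    for d in search_dirs:
        candidate = d + "\\" + name
        if candidate in files:
            return candidate
    return None

# An installer run from Downloads: its app_dir IS Downloads, so a
# planted version.dll wins even with SafeDllSearchMode on.
files = {r"C:\Users\me\Downloads\version.dll",
         r"C:\Windows\System32\version.dll"}
order = dll_search_order(app_dir=r"C:\Users\me\Downloads",
                         cwd=r"C:\Users\me",
                         system_dirs=[r"C:\Windows\System32"],
                         path_dirs=[], safe_mode=True)
print(find_dll("version.dll", order, files))
# -> C:\Users\me\Downloads\version.dll
```

This is exactly why the flag doesn't save the Downloads scenario: the malicious DLL isn't being picked up from the cwd at all, but from the directory the installer itself was started from.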


I was interested, so I read the article. Enabling the "SafeDllSearchMode" flag doesn't help, as it is not the current directory from which the DLL is loaded, but the startup directory from which the installer is run (e.g. the Downloads folder).

The directory from which the application is loaded is still the first directory from which DLLs are loaded with "SafeDllSearchMode" on. Moving current directory down in the search order doesn't help.

edit: The Microsoft Security Research center also had a paper on this [1] and there it lists that "SafeDllSearchMode" also employs a "known DLLs" technique when this flag has been set.

Now the question is whether Microsoft has added this DLL as a known DLL or not. I don't see it in my registry, but it might be defined elsewhere (or I might have overlooked it).

[1] http://blogs.technet.com/b/srd/archive/2014/05/13/load-libra...


The installer tries to load a DLL called version.dll. As the default behavior on Windows is to load from the same folder as the started binary first, it finds the malicious DLL and loads that.


I don't get why an installer would ever load an external dll? The installer should bundle all of its own logic AND all the code of the application it's installing. That's kind of the point of making an installer in the first place. When is it useful to have a dll next to an installer (for non exploit purposes)?


The installer depends on functionality available within Windows. In this case, version.dll is loaded indirectly by Windows or by the installer's dependencies.

The version.dll is a standard Microsoft DLL. I just checked it here on a few systems, and on Windows 7 it doesn't even appear to be signed by Microsoft. Only starting with Windows 8 is it signed.

The version.dll provides very standard functionality, such as the Win32 API call GetFileVersionInfoA. It is the same thing you see in Windows Explorer by right-clicking a binary such as an exe or DLL and displaying the version info in the Properties tab.

It is very commonly used functionality so lots of applications/installers will fall for this.


I see. Then it seems the issue is that you can replace a normally signed OS library with a local unsigned one like that. Having the OS libraries load before the local directory would resolve it, just as GAC assemblies are preferred over locally resolved ones in .NET.



Does anybody know why Chrome and Edge would silently unilaterally download binary code like that?


The common use cases - downloading zip files, photographs, mp3s - tend to reward this behavior, because it lets a user casually just click on a bunch of download links and accumulate them in the Downloads folder. They never have to click on any spooky dialog boxes. Ideally 'dll' and 'exe' would be blacklisted from this...


Well, it's not silent (at least in Chrome): it does show that the file was downloaded, but it doesn't do anything to prevent this.

IMO a good solution is to make the user manually re-confirm that they meant to download a .dll file on Windows, to prevent this kind of thing from happening.


IIRC Chrome will begin downloading an EXE in the background while the prompt is open, so you don't even lose time to the confirmation window.


Obviously it's sketchy for binaries/DLLs/etc., but confirmation dialogs do kind of suck. Obvious use case: it makes it fast to download a lot of files.

Though I've never had a need to download multiple exe / msi / dlls at once, so I wouldn't really mind a warning on those downloads. But the OS already does warn for exes from the internet, which makes it a little redundant.


It seems like the OS should be preventing you from running an executable in your root Downloads folder, maybe copying it into a temp dir before executing it. This could also be used to add security, sandboxing the exe within the temp dir and prompting you before it accesses the larger filesystem.


There's actually a Windows feature intended to mitigate this, that was added sometime around Server 2003, but it's never turned on by default because of backwards compatibility issues. Look under 'Windows Settings\Security Settings\Software Restriction Policies' in a GPO to see it.


Here is a fantastic one on privilege escalation using DLL hijacking: http://www.pentesteracademy.com/video?id=575

Just how malware works today! I even saw this in a Mark Russinovich talk.


The question that needs to be asked here is how to get all installers using MSI. MSI is a secure, declarative format which runs off MS code in a known directory. Because it's declarative, it can also be queried and tracked. I recently packaged an installer using it and it worked really well. WiX has terrific documentation and it was straightforward.


Need to set the registry key HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\CWDIllegalInDllSearch
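For reference, a hedged .reg sketch of that setting (per Microsoft's KB 2264107 documentation, the value 0xFFFFFFFF removes the current working directory from the DLL search path system-wide; test before deploying, since some applications rely on cwd loading):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager]
"CWDIllegalInDllSearch"=dword:ffffffff
```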


That's not enough - that's just removing the current working directory. But unfortunately, the issue here is that DLLs are put into the same directory as the executable to be run.

You can't remove loading DLLs from the same directory as the executable on the OS level because many applications rely on this feature. This can only be fixed on a per-executable basis.


Browsers could protect users from this class of vulnerability by making it so that, if you run a browser-downloaded executable from within the browser, it first moves it to an empty directory and runs it there.


Browsers can't do that. At least not in a general enough way, since it's Windows that executes the files.

Windows could stop trusting code in folders with some flag. That would be much more useful.


Such flag already exists: http://www.howtogeek.com/70012/what-causes-the-file-download...

It's a simple fix for Windows to disable local folder DLL loading based on the Zone.Identifier ADS. There's no use case where an EXE downloaded from the internet should load a DLL from its current folder.

An alternate quick fix is for the browser to redirect all DLL/SYS/OCX/whatever downloads to a DLLs folder inside the Downloads directory, and keep the EXEs in the main folder as before, so as not to confuse users. Downloading DLLs is so rare that the extra hoop is acceptable.


Don't need the extra move step. Just put every download in its own directory. Or give every source (domain? something else?) its own directory. There's something in here that feels like cookies and cross-site conventions.


This sounds like an interesting mitigation, but comes with its own drawbacks:

- you now start worrying about whether it's possible to get a malicious download from your domain that DLL-hijacks non-malicious downloads (better than the current state of the world, but not by a lot)

- less user friendly: who wants to remember what domain they downloaded something from when they go looking for an old download?

Also there's the question of whether you should associate the file with the domain that served it (probably?) or the domain that was in the browser bar when the download was triggered. I think it's unclear, because users will expect the latter, but the former makes more sense for security.

Anyhow I think the right answer is that this is on the OS to provide better defense mechanisms. Maybe just something like the ability for the browser to tag the file with the domain it's from, instead of just "the internet". And then the OS could say "exe from www.foo.com is trying to load dll from www.bar.com, this looks fishy do you want to allow?" or maybe just flat out fail it; this way you get your cross-domain restrictions but less usability problems.

Or better yet, the OSes could solve this problem entirely by having a much stronger whitelist of what DLLs are included from where. why don't we just require programs to specify the full path to any DLL they try to load, instead of doing a search for the file? Or maybe some sort of code signing technique could apply, and programs could specify they want to load X dll, but only if signed by Microsoft or whoever the expected party is, with some optional power-user way to override this?

Or maybe we should just have some sort of warnings when you download a DLL from the internet, and put DLLs in their own download folder or something? That might be the easiest mitigation for a browser to implement.


> if you’re on Chrome or Microsoft Edge, the DLL is dropped in the Downloads folder without even asking

What fucked up joke is this?!


Safari does the same thing. I think FF is the only browser that still defaults to asking you what you want to do when you click on a download link.


Well, Chrome all day long tells me "foo.exe might be an unusual download", and with every version they make it harder to bypass that (also for sites with misconfigured HTTPS, broken cipher suites, and other things which are perfectly fine in an internal environment).

But then it allows silent DLL dropping... jeez.


Best tip I've learned in the past month: when you're at one of those unskippable TLS error warnings in Chrome/Chromium, just type "danger" on your keyboard.

Example: https://subdomain.preloaded-hsts.badssl.com/


Interesting. I'm on 48 (beta) / OS X and can't get this to work, though.


Really? I'm on Chrome 47.0.2526.106 and OS X 10.11, works fine for me, takes me to some 404 page.


I'm on stable OSX and it works for me.


Chrome 47 OS X and it works for me


Too bad one can't donate karma here. You deserve a boatload.


Great! Got one for FF too? :)


Sadly, no. I read the FF source code and couldn't find anything, although I'm far from an expert in that code base, so it might be hiding somewhere.


I think you're right, though I find FF's interface for that particularly inelegant.

In the case of Safari, I'm not sure what downloading a DLL would hurt on a Mac anyway, and an actual executable file would be quarantined.


As someone accustomed to selecting the location with every download, I really don't like the idea of a separate folder just for downloads. The first time I used a browser like that, when I clicked a link and it downloaded the file immediately, it was quite disorienting because I didn't know where it went. The files I download need to go in different places depending on what they are, so it's hugely irritating to have to go to Downloads (which is conveniently somewhere with a nontrivially long path...) and move them to where they should go. This encourages users to put everything they download in one place, which is really bad for organisation.

IMHO the browser should always ask where you want to save something, defaulting to the last location you chose. The extra click/keypress, and having the files downloaded to their intended location, is far better than the multiple actions required to rummage in Downloads and move the files to where they should be afterwards. The workaround, to right-click and "Save As...", also involves an extra action.


I actually wish that it would remember the last place I saved a file to from that particular domain, instead of just the last in general. E.g. sometimes I'm downloading software but sometimes I'm downloading bills which I have a pretty strict folder structure they go into.


You can set FF to always prompt you.


I think the bigger joke is that you can hijack DLLs, and the UAC prompt and signed code verification don't say a word about it.


It's certainly an interesting problem. Suppose that a program that needs administrative permissions depends on lib.dll. Should lib.dll have to be bundled with the application and signed with the same key? Why not statically link at that point, then?

Since the libraries the application might pull in might not even be available before runtime, would you prompt for each load? Maybe prompt for each load that isn't in a system directory -- implying that there isn't prior trust?

You could embed the key the library developer signed the library with but then we have a key revocation problem and updating the library might require updating the application and business critical applications might hold entire departments back from getting security fixes.

I don't think this is really a problem with DLLs or the signature process, but that the old installer tries to load any library in the executable directory.


The constant issue of computer security is perhaps that software can only inspect actions, not intent.

And any action taken can be either legitimate or nefarious, given the context of intent.


At some point, infosec relies on user education. No arguing about this. But where exactly that point is, is open to argument.

Software needs interfaces that reflect intent. If your browser dumps stuff from unrelated places into the same folder, and then the OS uses that unrelated stuff as if it were related, there's a complete failure of the software to reflect the user's intent. This is a blatant fault of the software stack (but of no party in particular).


The classic way of mitigating the issue of intent, of course, is capability-based security. Standard ambient authority models basically leave the door wide open by decontextualizing all action, such that merely riding on someone's session from their end is sufficient.


That has me thinking that an issue is perhaps that with modern personal computing, the system/kernel view of state (or some such) and the user view of state are not lined up.


It's not Microsoft's or the UAC's fault. It's Nullsoft's fault.

NSIS (Nullsoft Scriptable Install System) loads and executes the DLL without any verification.

Also, you're not hijacking DLLs -- you're using a DLL to hijack NSIS (which runs with administrative privileges).


How can NSIS verify the DLL? How can an installer from, say, 2012 know if the copy of shfolder.dll in Windows 10 is valid?

And worse - in some cases, NSIS wasn't even directly loading the DLLs, Windows was doing it behind its back (eg. NSIS called a Win32 function called OleInitialize, which loaded UXTheme.dll by itself).


All valid points. I think it should be possible, at least for Microsoft DLLs, to specify a flag that you are willing to load only those signed by Microsoft. A malicious version.dll probably wouldn't have a Microsoft signature. The signature verification would be done by the OS and thus would be future-proof. Even an OS-local whitelist for exemptions could work, where an admin could add some unsigned DLL hashes that would still load. None of the downloaded malicious DLL files would pass this test.
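The whitelist idea could be as simple as comparing content hashes before loading. A hypothetical Python sketch (the function names and flow are mine, not any real loader API):

```python
import hashlib

def sha256_file(path):
    # Hash the library's bytes; any tampering changes the digest.
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def may_load(path, allowed_hashes):
    # Load only libraries whose content hash an admin has whitelisted;
    # a dropped-in malicious DLL hashes differently and is refused.
    return sha256_file(path) in allowed_hashes
```

A real implementation would live in the loader, not userland, and would need a story for updates (every patched library gets a new hash), which is exactly the revocation/maintenance problem raised above.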


It is NSIS' fault for auto-loading an existing DLL in the working directory if present. And UAC only does its job: to run a command with elevated privileges (so does sudo). UAC does not really care about the executed code.


Doesn't UAC share some blame for telling me the code it's running is signed and unmodified when actually it is not?


> It is the NSIS fault for auto-loading existing dll in the same working directory if presents.

Well, I'd say it's Windows fault for having LoadLibrary do so by default. Unsafe actions should have to be manually enabled, not manually disabled!


AFAIK we can specify an absolute path in the LoadLibrary function. If we don't, yes, LoadLibrary will search relative to the cwd by default -- which is dangerous for a non-installed program. This is okay for an installed program, since you need additional privileges to alter the Program Files directory.


It's relative to the location of the executable, not the current working directory.

The idea on Windows is that an application's directory is a trusted location created specifically for that application. Running executables directly out of a Downloads folder is a violation of the Windows security model, so in that sense the security vulnerability is Chrome's fault, as the "correct" thing to do is to put downloaded files into different folders as older versions of IE did. Of course, that's terrible UX, and considering that this exact problem has been popping up for decades, I'd consider a manifest flag for "don't trust the application directory" long overdue.


> Of course, that's terrible UX

I find the "put everything that gets downloaded in one folder" to be worse UX than being able to choose where each download goes as it means having to move things back out of the downloads folder instead of them being downloaded to where I want them; especially when they're large files I wanted to put on a different drive, so the move turns into a lengthy copy operation.


I think both have their pros and cons: the "where do you want to save this" dialog sucks if you don't care, but the lack of a dialog sucks if you do care and are particular about your organization. Personally I don't find organizing after the fact to be that difficult, but then again I don't typically download huge files.


> It's relative to the location of the executable, not the current working directory.

It actually searches both the location of the executable and the CWD: https://support.microsoft.com/en-us/kb/2389418#mt1


That's a pretty awful security model, and an even worse assumption!



