Excel can't save to paths over 218 characters | 54 points by glennos 1339 days ago | 52 comments

 When the names mean nothing, it's harder to gauge. Here's a believable directory structure that might lead to the problem:

    C:\Users\reasonableusername\Documents\Work\CompuGlobalHyperMegaNet\Non-Project Documents\Administrative\Business Contracts\Fiscal 2013\First Quarter\February\Week Two\TPS Reports Cover Sheet And Memos Template - 3.xls
 Yeah, I have seen it first hand. Admittedly, there was some redundancy and some excess in the folder names, but it was for a relatively large project that was organized with fairly deep levels of sub-projects and tasks. If you happened to be working on one of the sub-projects that had a longer, more verbose title, you would run into this issue.
 The social implications of a size limit can be quite large: a non-tech-savvy team forced to restart work (half of them will lose the file if it was saved in a temporary location) after waiting for someone to explain this additional rule to them.
 Yep, that's the sort of project path structure which was causing me grief. Excel then gave me a warning with a list of 6 possible issues, none of which mentioned the 218 limit.
 This actually happened to me; I blogged about it here.
 And even with tab completion, that path shows why only a masochist with OCD thinks paths that long or deep are needed. People who name their folders "Documents and images from the conference in August with CEO" deserve what they get. Although 260 characters does suggest the Excel team thinks more like me than the rest of (more reasonable) MS.
 > only a masochist with OCD thinks paths that long or deep are needed

I'm surprised you can't imagine working on a path that deep. Path arrangements are frequently thrust upon you in work environments, and if you have a lot of work (or a lot of people sharing the same folder hierarchy), you may need a deep path structure to keep it organized. Attacking people as "masochists" because their paths are difficult to type seems unfair.

As for your example, though: Windows traditionally generates shortened 8.3 aliases for components longer than 8 characters (so "Program Files" becomes "PROGRA~1"), so long single components aren't as problematic as many shorter components.
 I am pretty sure it's more for historical reasons than a solution from MS for long filenames on Win32; it's due to the application writing to the filesystem, not the OS.

Earlier DOS filesystems (and others) were sometimes limited to 8 characters plus 3 for the extension, and Windows just used to shorten filenames when writing to those filesystems. To this day it still causes problems for legacy software, and you might see some PROGRA~1 hanging around, but it's not from the OS: it's from the application (or its outdated filesystem-access API) doing the shortening, to fit the long filename it was given into the 8 characters it thinks the FS can handle.

Windows' File Explorer or file picker doesn't shorten paths or filenames when downloading long filenames from the web, for instance (via Firefox or IE).
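The 8.3 shortening the comments above describe can be sketched roughly in code. This is a toy model only: real Windows filesystems use collision counters and switch to hash-based names after a few duplicates, so the `collision_index` parameter here is an illustrative simplification.

```python
def short_name_83(long_name: str, collision_index: int = 1) -> str:
    """Toy sketch of classic DOS 8.3 short-name generation.

    Real Windows rules are more elaborate (collision handling,
    hash-based names), so treat this as illustration only.
    """
    # Split off the extension, if any
    if "." in long_name:
        base, _, ext = long_name.rpartition(".")
    else:
        base, ext = long_name, ""
    # Drop spaces and other characters invalid in 8.3 names, uppercase the rest
    clean = "".join(c for c in base.upper() if c.isalnum())
    ext_clean = "".join(c for c in ext.upper() if c.isalnum())[:3]
    if len(clean) > 8:
        clean = clean[:6] + "~" + str(collision_index)
    return clean + ("." + ext_clean if ext_clean else "")

print(short_name_83("Program Files"))   # PROGRA~1
print(short_name_83("readme.txt"))      # README.TXT
```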
 Oh, I can imagine working that deep - and the issue is never "if only we had MAXPATHLEN set higher, our business processes would magically be more efficient". Paths that deep that are meant as markers for humans to read are a pathological case - if you don't at minimum put a web server in front with a folder-to-human-readable-name mapping, then all the other benefits of automation are likely to pass you by too.

In short, paths that deep show you have humans trying to do what computers are better at. Schumpeter will be round later to have a word.
 C:\users\myusername - so far so good.
C:\users\myusername\work - good.
C:\users\myusername\work\projectname1\ - still good.

Now check out the root of your favorite SCM for a project with 30+ developers. There's a good chance that you'll run into problems.

/rant

Or wait, let's do it the Unix way! Isn't "etc" obviously "configuration"? And "bin" obviously "programs"? Yes, let's all use short, unintelligible folder names like they used back in the days of teletypes, when we have 24+" monitors. And let's add some inside jokes from Bell Labs or Berkeley on top while we're at it ("spawning daemons").
 You're assuming all paths are created and consumed by humans. Automated processes could easily generate and work with long paths for perfectly-valid reasons. A limit this low causes grief for people that "deserve" better.
 Indeed, I've even managed to run into this issue on Linux, where an OpenEmbedded build would fail because the list of files passed to a command in a shell script would not fit in the argument buffer (about 5K, IIRC). Fixing it would have required recompiling the kernel.
 But that's surely an argument for rewriting the command or the shell script? Yes, it's freaking annoying, but I think having a path > 260 chars, or needing to pass 5000+ bytes of command-line arguments, both count as "code smells". If you are hitting these issues, something else is wrong, and in my experience fixing the other things is much, much more productive.
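The usual rewrite for the argument-buffer problem is batching: invoke the command once per slice of the file list instead of once with everything, which is the same idea xargs implements. A minimal sketch, where the command prefix and the batch size of 500 are placeholders to be tuned against the platform's actual ARG_MAX:

```python
import subprocess

def chunked(items, batch_size):
    """Yield successive slices of `items` no longer than `batch_size`."""
    for i in range(0, len(items), batch_size):
        yield items[i:i + batch_size]

def run_in_batches(cmd_prefix, files, batch_size=500):
    """Invoke a command once per batch so no single argument list
    exceeds the OS limit. `batch_size` is illustrative; pick one
    that keeps each invocation well under ARG_MAX for your
    typical argument lengths.
    """
    for batch in chunked(files, batch_size):
        subprocess.run(list(cmd_prefix) + list(batch), check=True)
```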
 Exactly - I recently had some Visual Studio build problems with a large repository and local node_modules going over the file path limit.
 Most people only type their pathnames occasionally and often use descriptive sentences to avoid confusion when clicking.
 It takes an especially poor quality of programmer to knowingly write this code when the user can select the path - this has been the case since Win 95 shipped, when every programmer worth his salt noticed this without even thinking...
 Or someone who uses maven...
 When I first hard-coded MAX_PATH in my program, I thought the length was reasonable. Now I think 1024 would be better, as a really long file name does happen sometimes. Apple has a limit of 1024, apparently (correct me if I'm wrong).

Solution by Microsoft: http://blogs.msdn.com/b/bclteam/archive/2007/02/13/long-path...
 Setting an arbitrary limit is now and will always be the wrong thing to do in API design, and MAX_PATH is one of the most persistent reminders of this problem. Neither 260 chars on Windows nor 1024 chars on the Mac is an actual limit of the filesystem: both Windows NTFS/FAT32 and Mac HFS+ can support thousands of nested folders.

But the "MAX_PATH" concept persists, and a large number of programs will fail mysteriously when the limit is reached. Even though the filesystem supports vastly more, we are stuck with operating systems that encourage programmers to needlessly limit things, and there's never been a serious push to improve the situation. The \\?\ extended-length paths aren't a serious alternative, since they're not supported by many common Windows APIs (plus, users get confused when \\?\ appears in front of everything). It's completely insane.
 The \\?\ thing is a weird hack. There are other side effects of \\?\; for example, the system will stop removing trailing spaces and dots from path elements. So you could have a filename that fits well within the limit of MAX_PATH but is inaccessible without \\?\ - just end it with a dot or a space.

It's not even always MAX_PATH directly that decides when you need \\?\. IIRC, for a directory you must fit within MAX_PATH minus the length of a slash and an 8.3 filename.

There are other limits. For example, by the time a create gets to the NT APIs, the path must fit in a UNICODE_STRING structure, which has a maximum length of 0xffff/sizeof(WCHAR). Once I was curious about probing the limits, and I created an absolute path that was longer than this limit. I did it with something like this cmd script:

    mkdir a
    :top
    ren a a.tmp
    mkdir a
    move a.tmp a\
    goto :top

Sure enough, I was able to create a dir that would stump just about anyone's recursive-directory-delete code. To delete it on my own filesystem I wrote a similar rename loop to unravel it. If you ever want to play a prank on a Windows programmer who recursively enters directories, show them this.
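The unraveling trick the comment describes can be sketched in Python as well: instead of ever building the full (over-long) absolute path, repeatedly hoist the grandchild directory up one level and remove the now-empty child. The function name and the assumption that each level contains only the next `a` directory are illustrative, matching the cmd script's structure:

```python
import os

def unravel_and_delete(parent, name="a"):
    """Delete a pathologically deep name\\name\\name\\... chain without
    ever constructing its full absolute path, using the same
    rename-loop idea as the cmd script above. Assumes each level
    contains at most one child directory called `name`.
    """
    top = os.path.join(parent, name)
    tmp = os.path.join(parent, name + ".tmp")
    while os.path.isdir(top):
        inner = os.path.join(top, name)
        if os.path.isdir(inner):
            os.rename(inner, tmp)   # hoist the grandchild up beside the child
            os.rmdir(top)           # the child is now empty
            os.rename(tmp, top)     # chain is one level shallower
        else:
            os.rmdir(top)           # last level: just remove it
```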
 I think you can blame C's poor string-processing functions for that. Every time I have to deal with paths in C, I prefer to allocate a static "MAX_PATH" buffer rather than dealing with dynamic memory allocation and string concatenation (did glibc finally include strlcat, or will it take another decade?). Or better yet, I'd prefer to use a third-party lib for that, but I don't know of anything small and simple for the command line.
 > "We do need a solution in the underlying windows API, but this would most likely emerge as new APIs rather than changing the existing ones. We've discussed this at length on the longpath alias at Microsoft (yes, we have a whole alias devoted to the issue!) and there are no plans to change the existing ones, since it would break third party code that depend on MAX_PATH buffers on the stack."And thus, thanks to Microsoft's slavish devotion to being backwards-compatible with brittle software, we get yet another, slightly different, set of APIs in Windows.
 Let me rant on something here. I work in finance; Excel is my everyday. We do a lot of manual monkey work (copy-pasting stuff around a lot), and when we "automate", we use the adamant (can't find a better word) VBA. Now, you see how unhelpful this error message is? OK, try that when you are writing a macro in VBA. You literally start regretting the day you were born. Nothing beats Python stack traces...
 From that article, something I haven't seen before:

    To verify the error message that you receive in Excel 2007, press Ctrl+Shift+I. The following number is displayed in the lower-right corner of this error message dialog box: 100202

It's no stack trace, but it should be helpful in searching for proper error codes vs. generic text.
 Except I am on Excel 2003. The company is slowly moving to 201x, though (can't remember the version; the email subject was "digital toolbox", so I deleted it directly).
 So do you work with a 2003 version of Python?
 Python 2.3 still had stack traces.
 Nice try. Not my choice to work with Excel 2003, though. Also, I hate the ribbon interface.
 And TFS can't handle path lengths longer than 260 characters.I don't want to hear the BS about MAX_PATH. These are really annoying customer-facing flaws that scream "amateur hour".
 Yup. All due to a choice someone made when deciding how to wrap the Win32 filesystem APIs for .NET.

Also, if you think that's bad, how about this: the TFS build system schedules builds on agent machines using a UTC time-of-day format. Seems fine, except that it is utterly broken in the context of daylight saving time. If you schedule builds that you want to run at a particular local time of day, which almost everyone does as it's an exceedingly common use case, then the actual time of day that build runs will be determined by whether or not the build definition was last saved during the current DST period. For example, if you have a bunch of build definitions and daylight saving time has just ended, builds could run either on time or an hour early.
 It's also problematic for enterprises trying to roll out SharePoint. Site addresses tend to be pretty verbose, so I can't see that playing particularly nicely given this limitation. Extra points also for the super-helpful error messages it throws.
 Ah.. the old 218 characters in your path problem! Because of course!
 It's MAX_PATH in Windows (== 260. There's a 32K Unicode-equivalent limit[1]). In Linux it's PATH_MAX, which is (very likely) 4096. That's a bigger number than 260, sure; but it's still a discrete one; instead of, say, ∞.---[1] 'To specify an extended-length path, use the "\\?\" prefix. For example, "\\?\D:\very long path".' http://msdn.microsoft.com/en-us/library/windows/desktop/aa36...
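The \\?\ prefixing rules from the footnote can be captured in a small helper. This is a sketch of the string manipulation only, not a full Win32 wrapper; it assumes the input path is already absolute and normalized, since the extended-length form disables "." and ".." resolution:

```python
def to_extended_path(path: str) -> str:
    r"""Convert an absolute Windows path to the \\?\ extended-length
    form, which lifts the 260-character MAX_PATH limit for the
    Unicode file APIs.

    Sketch only: `path` must already be absolute and normalized,
    because the \\?\ form bypasses the usual path canonicalization.
    """
    if path.startswith("\\\\?\\"):
        return path                          # already extended
    if path.startswith("\\\\"):
        # UNC paths use the \\?\UNC\server\share form
        return "\\\\?\\UNC\\" + path[2:]
    return "\\\\?\\" + path

print(to_extended_path(r"D:\very long path"))     # \\?\D:\very long path
print(to_extended_path(r"\\server\share\file"))   # \\?\UNC\server\share\file
```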
 For reference, the GNU Hurd kernel was designed to avoid fixed limits like PATH_MAX. As a result, they have reaped endless compatibility issues with software that assumes there will be some specific fixed limit: https://www.gnu.org/software/hurd/community/gsoc/project_ide...
 And another proof that GNU Hurd is superior to every existing OS.
 You mean finite, not discrete :-)
 Visual Studio (at least 2010 and 2012) also has this problem, actually. If the path (including filename) to your solution file is more than 218 characters, it refuses to work with it.Welcome to 1995!
 My favorite experience with it was trying to deal with a zip file of "C:\Documents and Settings\" from an old computer that happened to contain multiple files whose paths were 255 characters long or so. You couldn't unzip it in a subdirectory of "C:\" because one of the resulting paths would be too long and cause the built-in zip program to crash/fail entirely. Using 7-Zip was the eventual solution.

On a related topic, do you know how difficult it is to search for files whose paths are above a certain length on a fresh/default Windows setup? I eventually resorted to installing Python on it just to write a script to do it.
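The kind of script the comment above describes is short. A minimal sketch, where the function name and the default limit of 260 (MAX_PATH) are illustrative:

```python
import os

def find_long_paths(root, limit=260):
    """Walk `root` and yield every file or directory whose full path
    is at least `limit` characters long. Sketch of the sort of
    long-path search the comment above resorted to Python for.
    """
    for dirpath, dirnames, filenames in os.walk(root):
        for name in dirnames + filenames:
            full = os.path.join(dirpath, name)
            if len(full) >= limit:
                yield full

# Example usage (path is illustrative):
# for p in find_long_paths(r"C:\Users", limit=255):
#     print(len(p), p)
```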
 MAX_PATH, bane of my existence. I cannot count the times a customer did not find a document in search because Windows decided to throw this limit in our faces. It can pop up at the weirdest places ...
 I'm pretty sure I said this exact same thing when my boss finally figured out that this issue was causing a script we use to fail on the BOMs I was trying to release. It took an embarrassing amount of time for a few engineers to figure that one out.
 The msdn article [1] is a great reference on the strangeness of the Windows path naming conventions.
 This gets even worse when mixed with email. I recently bumped heads with the naming limitations when the names of invoices exceeded the limits and rendered the Excel file useless (it "existed"; it would just error if you tried opening it).

The file name limit is 75 characters, so now my system renames the file when sending it by email to just the invoiced client's name, and retains the properly formatted one in the folder structure. (It's a very detailed name following a "- - .xsl" pattern, as it revolves around certifications and needs to follow certain regulations, and obviously exceeds 75 characters with ease.)
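A generic version of that rename-for-email step can be sketched as a filename truncation that preserves the extension. The function name is hypothetical and the 75-character default is the limit reported in the comment above:

```python
def fit_filename(name: str, limit: int = 75) -> str:
    """Shorten a filename to at most `limit` characters while keeping
    its extension intact. Sketch of the email-rename workaround
    described above; 75 is the limit the comment reports.
    """
    if len(name) <= limit:
        return name
    base, dot, ext = name.rpartition(".")
    if not dot:                       # no extension at all
        return name[:limit]
    keep = limit - len(ext) - 1       # leave room for "." + extension
    return base[:keep] + "." + ext
```

In practice you would feed this the detailed certification-style name and use the result only for the outgoing attachment, keeping the full name on disk.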
 This limitation is obviously put in place to make it easier for the NSA to index your documents
 Can't load them either, I guess. When I tried to open an Excel file directly from Sparrow I would get a "path longer than xxx" message. Saving it to ~/Downloads and then opening it worked just fine.
 No surprise here. Long path names have always been a pain on Windows, from not being able to open files to not being able to restore previous versions, backups, etc. Hitting these limits is very common on file servers.
 From their page, I offer a suggestion:

    Things to Try: Increase the maximum path length.

Jeez.
 It also can't open two documents with the same filename in different directories, which comes up a lot more often in my experience...
 Does this apply to 2013? It is a rank-amateur quality of mistake... and has been for quite a long time before 2007, since we lost the constraints on paths in early Windows/DOS land - with Windows 95, I think...
 Neither can winrar, or was that fixed?
 Excel 2007.
 Not just 2007 from what I can see, and given the last update was before Excel 2013 was released, it could be affected too. Do you use 2013?

--From the article--

    Article ID: 213983 - Last Review: September 18, 2011 - Revision: 6.0
    APPLIES TO
      Microsoft Office Excel 2007
      Microsoft Excel 2002 Standard Edition
      Microsoft Excel 2000 Standard Edition
      Microsoft Excel 2010
