Not everyone is a frontend web developer, or any sort of frontend developer. I develop software to manage network-attached storage, so my display resolution is pretty much irrelevant for testing my software. In fact, for the small amount of UI work that I do, I run it in a VM at 1024x768; at this laptop's low resolution, that VM window would take up pretty much my whole display.
Furthermore, I like to be able to see my code and the final product at the same time. In fact, I prefer to have several columns of code, a terminal, a web browser, and the VM I'm deploying my work to all visible at once. The more I can see, the better. I usually work with three 1920x1080 monitors plus my laptop display, but sometimes I need to use just the laptop when I'm away from my desk. Being able to fit multiple columns of code and/or terminals on my screen at once is important to me.
Heck, these days, phones are shipping with higher resolutions than that; if you develop mobile software, you can't even fit your phone emulator on that display without scaling. The Nexus 10 has a 2560x1600 display on a 10-inch screen; why does a $400 tablet have a far better screen (its smaller dimension has more pixels than the XPS 13's larger one!) than a $1500 laptop?
Aside from the poor resolution, I also don't understand the price. I purchased a gaming laptop (ASUS G75VW) with 16GB of RAM, a 256GB SSD, a 750GB HDD, a GTX 660M, and an i7-3610QM, which runs 1920x1080, for under $1400. This Dell laptop has the specifications of my $400 Chromebook. It seems massively overpriced simply because it's one of the few machines that runs Linux out of the box.
What does an end user's resolution have to do with the desired resolution of a developer's machine? Each is accomplishing different tasks. Should you also program in an uncomfortable chair because many of your users won't be able to afford a quality one?