It's not a bad solution, but I don't think it's the best one. The most obvious inefficiency is that the small (or default) image is always loaded, which means tablets and desktop browsers make an extra image request that is never used. Not too bad on its own, but on a page with a lot of images it could add up.
Another thing I would suggest the team think about is not loading images by browser width alone. If we're tying these libraries to the idea that they improve performance by optimizing which images are loaded - so that you only transfer the necessary number of KB per page view - then browser width is a bit removed from what you actually want. What you really want to measure is the user's network speed, which can be done with libraries like Foresight.js (https://github.com/adamdbradley/foresight.js). Loading large images over a slow network isn't going to be good for performance. By using both browser size and network speed, you can optimize images for mobile devices on 3G versus those on WiFi, or desktops on broadband versus desktops on 56k modems.
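Roughly, something like this (a sketch in TypeScript - to be clear, this is not Foresight.js's actual API; the Network Information API it leans on only exists in some browsers, and the breakpoints and file naming are made up):

    // Pick an image variant from viewport width plus a connection estimate.
    // Assumes the server exposes -small / -medium / -large variants (hypothetical naming).
    type Variant = "small" | "medium" | "large";

    function estimateConnection(): "slow" | "fast" {
      // navigator.connection (Network Information API) is only present in some browsers.
      const conn = (navigator as any).connection;
      const effectiveType: string | undefined = conn?.effectiveType;
      if (effectiveType === "slow-2g" || effectiveType === "2g" || effectiveType === "3g") {
        return "slow";
      }
      return "fast"; // be optimistic when we can't tell
    }

    function pickVariant(viewportWidth: number): Variant {
      if (estimateConnection() === "slow") return "small"; // never ship big files over a slow link
      if (viewportWidth <= 480) return "small";            // made-up breakpoints
      if (viewportWidth <= 1024) return "medium";
      return "large";
    }

    // Swap every tagged image to the chosen variant.
    function upgradeImages(): void {
      const variant = pickVariant(window.innerWidth);
      document.querySelectorAll<HTMLImageElement>("img[data-base-src]").forEach((img) => {
        img.src = `${img.dataset.baseSrc}-${variant}.jpg`;
      });
    }

    upgradeImages();

The point is just that the width check and the speed check compose: a big viewport on a slow link still gets the small file.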
Checking network speed is actually a great idea - we'll investigate it. The reason we use the img src (which, as you noted, means you might load two images) is twofold: first, if your image is the same aspect ratio, you get an immediate load of something before the better image comes in (without which you'd get a really nasty reflow); second, it guarantees you get something that works if JS is disabled or unavailable.
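In other words, the pattern is roughly this (a TypeScript sketch of the general technique; the attribute names are hypothetical, not the library's actual markup):

    // Markup sketch (hypothetical attribute names):
    //   <img src="photo-small.jpg"            <- always loads: the no-JS fallback
    //        data-large-src="photo-large.jpg"
    //        width="800" height="600">        <- same aspect ratio, so the swap causes no reflow

    function upgradeToLarge(img: HTMLImageElement): void {
      const largeSrc = img.dataset.largeSrc;
      if (!largeSrc) return;

      // Preload the large variant, then swap it in. The small image keeps the
      // box filled in the meantime, so nothing reflows around an empty slot.
      const loader = new Image();
      loader.onload = () => {
        img.src = largeSrc;
      };
      loader.src = largeSrc;
    }

    document
      .querySelectorAll<HTMLImageElement>("img[data-large-src]")
      .forEach(upgradeToLarge);

The cost, as noted above, is the extra request for the small image on clients that end up with the large one.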
Please submit pull requests or issues for ways we can make this better; we're all ears.
I haven't personally tried it yet, but it seems the best option out there at the moment is the Capturing polyfill by Mozilla (https://hacks.mozilla.org/2013/03/capturing-improving-perfor...).