I quote: "Warning: This image is oversized (which is a bug) and will not fit onto a standard 703MiB CD. However, you may still test it using a DVD, a USB drive, or a virtual machine."
Tangentially, there are still netinstall (mini.iso) and Lubuntu Alternate images (which let you install a basic command-line system and go from there) with a text-based installer, and those DO fit on a CD.
Not really. Compared to, say, Firefox's upcoming versions, it's barely ahead, with some of the features it's counted ahead on (counted multiple times, at that) merely sitting behind a flag in Firefox (remember that `let` has been available in Firefox for ages, regardless of user settings, as long as the script tag has a suitable type attribute).
Beyond that, it's silly to compare browsers that aren't stable yet.
Not to mention that IE is usually a bit ahead at launch, but with such a long gap (roughly two years) between releases it gets pretty stale, and in general it's the boat anchor that keeps you from using said features without transpiling and/or shims for 4+ years.
Windows has been running on ARM since version 8 (the ARM-specific version was called Windows RT). Since then, Microsoft has announced a “special” version of Windows intended for the new Raspberry Pi 2.
I don't think these Chromebooks are sold in the Google Store in any country.
(The Google Store's selection varies by country, anyway. Where I live, they don't sell any Chromebooks; I might have considered the new Pixel if it weren't for the hassle.)
Early C? The parent accurately described how pointer addition works in C for all char pointers. (Well, other than the “nobody wants that” part, because that's exactly how you skip n bytes of a string.)
Not true (technically). When you add an integer to a char, you get an integer. (If you add a floating-point number, the result is a floating-point number as well.)
You can, of course, use the resulting integer like a char (after all, C's char is a small integer type) to get your gibberish. You can likewise use a floating-point number as a char, because C's implicit conversions are lax.
Not true; you added the byte that happens to be identified in ASCII by !, 33, to the byte that happens to be identified in ASCII by #, 35, and got 68, which is D. This is all perfectly sensible and type safe: two small numbers added to produce another number; you are only confused by the surface syntax. There are plenty of high-level languages that will let you do this, and there are all sorts of reasons to make numbers and the ASCII chars they represent easily interchangeable in source code.
No, I didn't. I actually added the character '!' to '#'. The C language doesn't DISTINGUISH characters from their numerical representations, which is why I say it's weakly typed. It does not have an actual character type; it only has 8-bit numbers.
Of course in C it is perfectly logical. I am pointing out that it is not as strongly typed as other type systems that have richer types.
While I wouldn't generally praise (any of) the browsers of that era, there really wasn't much competition in web browsers on the Mac at the time.
That's the stopgap solution, I think. Looking at the larger picture, it's silly that a projector only offers video-in, instead of video-in/power-out, considering it's already plugged into a wall outlet.
You think the end game is "We'll improve the hardwired connection to the projector so it can backfeed power" rather than "We'll just get rid of the wire"?