Hacker News | exprL's comments

Probably because a Ruby activist wrote a cool script and no-one cared about the implications of a new dependency at all.


I fear many projects seem to take the infrastructure and manpower of large distros for granted these days.


For a user, DNF is mostly just a quicker YUM. It still deals with RPMs and the UI is similar (at least for basic operations).


A couple of years ago.


If that is true, the message on this page is hilarious:

http://cdimage.ubuntu.com/daily-live/current/

I quote: "Warning: This image is oversized (which is a bug) and will not fit onto a standard 703MiB CD. However, you may still test it using a DVD, a USB drive, or a virtual machine."

I guess it's not a high-priority bug then :)


Tangentially, there are still netinstall (mini.iso) and Lubuntu Alternate images (which let you install a basic command-line system and go from there) with a text-based installer, and those DO fit on a CD.


Not really. Compared to, say, Firefox's upcoming versions, it's barely ahead, with Firefox keeping some features (which get counted multiple times) behind a flag (let's remember that `let' has been available in Firefox for ages, regardless of user settings, as long as the script tag has a suitable type attribute).

Beyond that, it's silly to compare browsers that aren't stable yet.


Not to mention that IE is usually a bit ahead at launch, but with such a long gap (roughly two years) between releases it gets pretty stale, and is in general the boat anchor that keeps you from using said features without transpiling and/or shims for 4+ years.


I was under the impression we are going to see regular updates to Spartan similar to what we get from Chrome now.


s/IE/Mobile Safari/


I would warn against using `let` in Firefox today: its implementation still has a lot of bugs[1][2] at this point.

[1] https://bugzilla.mozilla.org/show_bug.cgi?id=950547 [2] https://bugzilla.mozilla.org/show_bug.cgi?id=1023609


Windows has been running on ARM since version 8 (the ARM-specific edition was called Windows RT). More recently, Microsoft announced a “special” version of Windows that's intended for the new Raspberry Pi 2.


If I remember correctly, it's more of a runtime for some .NET programs with access to the Pi's GPIO pins than a full-fledged OS.


I don't think these Chromebooks are sold in the Google Store in any country.

(Google Store selection varies by country, anyway. Where I live, they don't sell any Chromebooks; I might have considered the new Pixel if it weren't for the hassle.)


FWIW, https://store.google.com/category/chromebooks lists a bunch of Chromebooks for sale in the US, including the new Pixel.


The list of Chromebooks here in the UK contains just one: the new Pixel.

Despite all of the pictures showing Acers etc., you can only actually buy one model from Google. It's silly.


Early C? The parent described accurately how pointer addition in C works for all char pointers. (Well, other than the “nobody wants that” part, because that's how you skip n bytes of a string.)


C has pointer arithmetic


Not true (technically). When you add an integer to a char, you get an integer. (If you add a floating point number, the result is such as well.)

You can, of course, use the integer you got like a char (after all, C's char is a small integer type) to get your gibberish. You can also use a floating point number as a char, because C has lax implicit conversions.


If you add '!' to '#' you get 'D', which is nonsense in any high-level language.


Not true; you added the byte that happens to be identified in ASCII by '!', 33, to the byte that happens to be identified in ASCII by '#', 35, and got 68, which is 'D'. This is all perfectly sensible and type-safe, two 8-bit numbers being added to produce another 8-bit number; you are only confused by the surface syntax. Plenty of high-level languages will let you do this; there are all sorts of reasons to make numbers and the ASCII chars they represent easily interchangeable in source code.


No, I didn't. I actually added the character '!' to '#'. The C language doesn't DISTINGUISH characters from their numerical representations. That's why I say it's weakly typed. It does not have an actual character type; it only has 8-bit numbers.

Of course in C it is perfectly logical. I am pointing out that it is not as strongly typed as other type systems that have richer types.


>mostly due to being actually quite good

While I wouldn't generally praise (any of) the browsers of that era, there really wasn't much competition on the Mac at that point when it came to web browsers.


That's the stopgap solution, I think. Looking at the larger picture, it's silly that a projector only offers video-in, instead of video-in/power-out, considering it's already plugged into a wall outlet.


You think the end game is "We'll improve the hardwire connection to the projector so it can backfeed power" rather than "We'll just get rid of the wire"?

