
"Imagine a computer language where 1/0 was a legitimate statement, not caught by the compiler, but always blew up in your executable."

... isn't that all languages?

Of course it is, and there are collections of letters that are pronounceable as words, but that doesn't give them meaning. The equivalent in English would be a spell checker that didn't flag "douberness" and passed it along. Sure, you can pronounce it if you look at it phonetically, but it doesn't mean anything. It is syntactically correct but broken. VHDL has a lot of things that can be written but not actually expressed in hardware.

Sure, I've no doubt it's more common there; that's my understanding as well. The wording above just struck me as if it were meant to be hypothetical, which I found amusing given that it's anything but.

Whether it's detected at compile time or runtime, a statement that evaluates to DIVBYZERO can be handled. Taking the result as an ordinary value that blows up your program, on the other hand...

"All languages" was a bit tongue in cheek, but even in Haskell, (div 1 0) gives a run-time failure.
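For the curious: that failure is a catchable ArithException thrown when the expression is forced, not something the compiler flags. A minimal sketch (the tryDiv helper is a made-up name, just for illustration):

```haskell
import Control.Exception (ArithException (DivideByZero), evaluate, try)

-- `1 `div` 0` type-checks without complaint; the DivideByZero
-- exception only appears when the expression is actually evaluated.
tryDiv :: Integer -> Integer -> IO (Either ArithException Integer)
tryDiv x y = try (evaluate (x `div` y))

main :: IO ()
main = do
  r <- tryDiv 1 0
  case r of
    Left DivideByZero -> putStrLn "caught DivideByZero at run time"
    Left e            -> putStrLn ("other ArithException: " ++ show e)
    Right n           -> print n
```

So it's at least recoverable, unlike a hardware run; but the compiler never objects.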

In this case, a 'run-time failure' would be completely unacceptable, as the 'run-time' environment is your $X000 hardware manufacturing run. Hardware development isn't in the same league as software. It's not even the same sport. Like comparing football to rugby: both played on similar fields, but entirely different games.

First, there exist software environments where errors cost significantly more than a hardware run. Obviously, those environments contain hardware as well, but "cost of a runtime error" is clearly not the only important thing here.

Second, my only point was that the example given was a piss poor example of the difference between hardware and software. Obviously a bad example doesn't disprove the claim it's supposed to support.

Everyone's piling on you because that wasn't the point of the example. Automation grants humans extraordinary powers, as long as humans aren't simply steps within the automatic system.

There's been an awkward growing phase of the technology industry that has led to technicians who don't have any real understanding of the systems they maintain. Compare and contrast Robert De Niro's character in Brazil with the repairmen he has to clean up after. We could be training those poor duct maintenance guys better.

... what?

If you haven't seen Brazil, you can safely ignore that part of the post. But you should see it.

I love Brazil, I'm just not tracking how all of that fits into the above.

The article is about how DevOps is killing the role of the Developer by making the Developer be a SysAdmin.

Chuck points out that abstracting the Developer's work too far away from the system in question means the Developer doesn't really understand the system as a whole. Jeff refers to "purely development roles" and other "pure" roles that aren't necessarily natural boundaries.

The example of VHDL is not about hardware and software, but about learning that you didn't actually know something you thought you knew.

The repairmen in Brazil do not realize (or necessarily care) what they don't know about duct repair. The system allows them to function and even thrive, despite this false confidence in their understanding.

At one point at least, Google was investing in (metaphorically) having De Niro cross-train those guys, instead of umm... Well, the rest of the movie.

I've read this a few times, and it still doesn't really bear on the aside I was making: something was presented as a hypothetical ("Imagine...") that is in fact the overwhelmingly typical case, which amused and confused me in some measure.

Well, it helped that I'd been discussing the topic out of band not that long prior to the original comments...

The initial detail was that VHDL, unlike "software" languages, has very different consequences. Can you imagine a language where (1 / 0) wasn't defined away as a DIVERR, but otherwise managed to remain mostly self-consistent? Where something can be logically / syntactically coherent, but not physically possible?

And if that example didn't hit home for you, so it goes, but there was plenty of detail unrelated to the specific example that I thought was more important / interesting to discuss. :shrug:

Nah, in dependently typed functional programming languages you can prevent this at compile-time.

Yes, "every language" was glib. In any language we could avoid it, actually, by hiding division behind something that gave a Maybe or Option or similar. My point, though, was that his "Imagine..." was actually representative of virtually all of the languages that virtually all of us work in virtually all of the time. It is therefore a poor example of a way in which HW is different.
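A minimal sketch of that hiding, using a hypothetical safeDiv (not a standard library function) whose type makes the zero case explicit:

```haskell
-- Hypothetical safeDiv: the zero case is part of the return type,
-- so callers must handle Nothing instead of crashing at run time.
safeDiv :: Integral a => a -> a -> Maybe a
safeDiv _ 0 = Nothing
safeDiv x y = Just (x `div` y)

main :: IO ()
main = do
  print (safeDiv (10 :: Integer) 2)  -- Just 5
  print (safeDiv (1 :: Integer) 0)   -- Nothing
```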

I went to dependent types specifically because I figured we meant static avoidance without resorting to checked arithmetic (for better performance).
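Short of full dependent types, one can approximate that statically with a non-zero witness type: the check happens once, when the witness is constructed, and the division site itself needs no run-time check. A hedged Haskell sketch (NonZero, nonZero, and divNZ are made-up names for illustration):

```haskell
-- A witness type: the smart constructor below is the only way to
-- build one, so a NonZero value always wraps a non-zero number.
newtype NonZero a = NonZero a

nonZero :: (Eq a, Num a) => a -> Maybe (NonZero a)
nonZero 0 = Nothing
nonZero n = Just (NonZero n)

-- No check needed here: the divisor cannot be zero by construction.
divNZ :: Integral a => a -> NonZero a -> a
divNZ x (NonZero y) = x `div` y

main :: IO ()
main =
  case nonZero (2 :: Integer) of
    Just d  -> print (divNZ 10 d)
    Nothing -> putStrLn "divisor was zero"
```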

Sure, that would be a good reason to go there. I didn't mean to cast aspersions at dependent types. I was just confused/amused at the typical case being cast as a hypothetical.
