
At least in my world view this is a much better definition of DevOps. Folks who make the world run, and through automation can keep a larger portion of the world spinning. It requires someone who can analyze failures, figure out how to predict and mitigate them, and then code automation to do so.

Oddly this is much more like the 'developers' of old. If you sat down at a workstation you needed to know how to be your own system administrator, and you needed to write code.

Automation has enabled a fairly new class of engineer: someone who has no idea how all the pieces fit together, but who can assemble them, with the help of a framework and a toolkit, into useful products. They become experts at debugging the toolkit and framework but have little knowledge of how everything else actually works.

The problem with this new type of coder is that they can write syntactically correct but impossible programs. I didn't understand that until I taught myself VHDL (a hardware description language). VHDL was the first "language" I knew of where you could write syntactically correct "code" that could not be synthesized into hardware. The language's expressiveness exceeded the hardware's capabilities (and sometimes you would need a time machine). Imagine a computer language where 1/0 was a legitimate statement, not caught by the compiler, but one that always blew up in your executable.

So we have folks who can write code that is grossly inefficient or broken on "real" systems.

Google had started a program when I was there to have developers spend time in SRE (their DevOps organization); the idea was to instill in them an understanding of what goes on in the whole stack so they could write better products. The famous 'latency numbers every programmer should know' by Jeff Dean was another such tool. If you want to write performant code, you cannot let yourself get too far away from the systems that are going to run it.




When Flickr did their DevOps talk in 2009, most of the infrastructure engineers I worked with at the time saw the trend in reverse. The people wearing the Developer hat were relying on our team's ability to automate anything, so the Ops team ended up being the team that best understood how the whole system worked.

In 2009, DevOps seemed like it was finally a reasonable answer to Taylorism. Engineers and Programmers and Hardware Technicians and Support Representatives were not cogs in a machine, but humans who could collaborate outside of rigid boundaries. Even at the lowest levels of the support organization, individual workers along the chain could now design their own tiny factories.

From there, it's just a matter of communicating tolerances properly up and down the chain. I am probably over-romanticising these notions, but it certainly felt exciting at the time. Not at all like the "fire your IT department" worldview it turned into.

-----


"Imagine a computer language where 1/0 was a legitimate statement, not caught by the compiler, but always blew up in your executable."

... isn't that all languages?

-----


Of course it is, and there are collections of letters that are pronounceable, but that doesn't give them meaning. The equivalent in English would be a spell checker that didn't flag "douberness" and passed it along. Sure, you can pronounce it if you look at it phonetically, but it doesn't mean anything. It is syntactically correct but broken. VHDL has a lot of things that can be written but not actually expressed in hardware.

-----


Sure, I've no doubt it's more common there - that's very much my understanding. The wording above just struck me as if it were meant to be hypothetical, which I found amusing given that it's nothing of the sort.

-----


Whether it's detected at compile time or runtime, a statement that evaluates to DIVBYZERO can be handled. Taking the result as an ordinary value that blows up your program, on the other hand...
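
For what it's worth, a minimal Haskell sketch of the "can be handled" half - the divide by zero surfaces as an ArithException the caller can catch, rather than coming back as an ordinary value (divOrDefault is just an illustrative name):

    import Control.Exception (ArithException (DivideByZero), catch, evaluate, throwIO)

    -- Force the division and trap the arithmetic exception if it fires.
    divOrDefault :: Integer -> Integer -> Integer -> IO Integer
    divOrDefault def x y = evaluate (x `div` y) `catch` handler
      where
        handler :: ArithException -> IO Integer
        handler DivideByZero = return def   -- handled: fall back to a default
        handler other        = throwIO other

    main :: IO ()
    main = do
      divOrDefault 0 10 2 >>= print  -- prints 5
      divOrDefault 0 1 0  >>= print  -- prints 0 instead of crashing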

-----


"All languages" was a bit tongue in cheek, but even in Haskell, (div 1 0) gives a run-time failure.

-----


In this case, a 'run-time failure' would be completely unacceptable, as the 'run-time' environment is your $X000 hardware manufacturing run. Hardware development isn't in the same league as software. It's not even the same sport. Like comparing football to rugby. Both played on a gridiron, but entirely differently.

-----


First, there exist software environments where errors cost significantly more than a hardware run. Obviously, those environments contain hardware as well, but "cost of a runtime error" is clearly not the only important thing here.

Second, my only point was that the example given was a piss poor example of the difference between hardware and software. Obviously a bad example doesn't disprove the claim it's supposed to support.

-----


Everyone's piling on you because that wasn't the point of the example. Automation grants humans extraordinary powers, as long as humans aren't simply steps within the automatic system.

There's been an awkward growing phase of the technology industry that has led to technicians that don't have any real understanding of the systems they maintain. Compare and contrast Robert DeNiro's character in Brazil with the repairmen he has to clean up after. We could be training those poor duct maintenance guys better.

-----


... what?

-----


If you haven't seen Brazil, you can safely ignore that part of the post. But you should see it.

-----


I love Brazil, I'm just not tracking how all of that fits into the above.

-----


The article is about how DevOps is killing the role of the Developer by making the Developer be a SysAdmin.

Chuck points out that abstracting the Developer's work too far away from the system in question means the Developer doesn't really understand the system as a whole. Jeff refers to "purely development roles" and other "pure" roles that aren't necessarily natural boundaries.

The example of VHDL is not about hardware and software, but about learning that you didn't actually know something you thought you knew.

The repairmen in Brazil do not realize (or necessarily care) what they don't know about duct repair. The system allows them to function and even thrive, despite this false confidence in their understanding.

At one point at least, Google was investing in (metaphorically) having DeNiro cross-train those guys, instead of umm... Well, the rest of the movie.

-----


I've read this a few times and it still doesn't really have any bearing on the aside I was making, which was that something presented as a hypothetical ("Imagine ...") is actually the overwhelmingly typical case, and that amused and confused me in some measure.

-----


Well, it helped that I'd been discussing the topic out of band not that long prior to the original comments...

The initial point was that VHDL, unlike "software" languages, has very different consequences for this kind of mistake. Can you imagine a language where (1 / 0) wasn't defined away as a DIVERR, but otherwise managed to remain mostly self-consistent? Where something can be logically / syntactically coherent, but not physically possible?

And if that example didn't hit home for you, so it goes, but there was plenty of detail unrelated to the specific example that I thought was more important / interesting to discuss. :shrug:

-----


Nah, in dependently typed functional programming languages you can prevent this at compile-time.
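
Not quite full dependent types, but as a rough sketch of the idea in Haskell, GHC's type-level naturals can already push the non-zero check to compile time when the divisor is statically known (divByStatic is just an illustrative name):

    {-# LANGUAGE DataKinds, TypeOperators #-}

    import Data.Proxy (Proxy (..))
    import GHC.TypeLits

    -- Divisor fixed at the type level and constrained to be >= 1, so a zero
    -- divisor is rejected by the type checker rather than at runtime.
    divByStatic :: (KnownNat n, 1 <= n) => Integer -> Proxy n -> Integer
    divByStatic x p = x `div` natVal p

    ok :: Integer
    ok = divByStatic 10 (Proxy :: Proxy 2)     -- compiles, evaluates to 5

    -- bad = divByStatic 10 (Proxy :: Proxy 0) -- rejected: 1 <= 0 cannot hold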

-----


Yes, "every language" was glib. In any language we could avoid it, actually, by hiding division behind something that gave a Maybe or Option or similar. My point, though, was that his "Imagine..." was actually representative of virtually all of the languages that virtually all of us work in virtually all of the time. It is therefore a poor example of a way in which HW is different.

-----


I went to dependent types specifically because I figured we meant static avoidance without resorting to checked arithmetic (better performance).

-----


Sure, that would be a good reason to go there. I didn't mean to cast aspersions at dependent types. I was just confused/amused at the typical case being cast as a hypothetical.

-----



