
In my view, the sophistication is implied by the breadth of expertise required to put the whole thing together. Google Search and the OS landscape are certainly broad and sophisticated. However, their development was accomplished by computer scientists.

In order for Stuxnet to be effective, it was necessary to employ expertise in:

- Uranium enrichment methods and processes

- Capital equipment control systems and their development environments

- Theory of operation of centrifuge machines

- Corporate espionage of some sort

- Organizational management skills that can pull all that together

- and deep understanding of the operating systems referenced above




But do those things really contribute to the sophistication of the software? For example, imagine some code written with no understanding of uranium enrichment:

    const int CENTRIFUGE_RPM = 500;
And then some other code written with a deep understanding of uranium enrichment:

    const int CENTRIFUGE_RPM = 1203;
Can you really say that the second bit of code is more "complex"? Same goes for stolen driver signing keys and some of the other things mentioned in the post.

Other large software projects like operating systems or Google Search involve much more complex software concepts, and I think that is the primary thing that should be measured when discussing the sophistication of software.


>Can you really say that the second bit of code is more "complex"?

Yes.

Complexity in the sense discussed is related to the domain knowledge (including CS knowledge) required for the program to be written and work well.

Else even a trivial BS program could be very complex, just sprinkle it with gotos and unnecessarily convoluted code...


This is such a powerful distinction that I feel it should help us rethink language paradigms. Complexity is not (just) the complications one can impose by construct or the involutions required of one's algorithms; it's the overall real-world system your code addresses.

Simple programs which are coded simply may address complex phenomena to complex ends--perhaps that's even the ideal?


You might enjoy Fred Brooks's essay "No Silver Bullet", where he distinguishes between "Accidental Complexity" (basically, complexity created by software engineers when implementing a solution) and "Essential Complexity" (complexity that arises because software is written to solve some real world problem and the world is complex).


Most people perceive complexity as things they don't understand. In that case, complexity will be relative.


> Most people perceive complexity as things they don't understand.

I don't think this is true. For example, as a math teacher, I couldn't do a very good job predicting precisely how easy or difficult students would find a particular problem. But I could easily predict which problems would be easier and which would be more difficult relative to each other. I could do that even though I personally understood all the problems.


I don't think difficulty is complexity. For example, the Bitcoin mining protocol's complexity stays the same, but the difficulty goes up or down.
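To make that concrete, here's a toy proof-of-work loop (purely illustrative, nothing like the real Bitcoin code; the hash and constants are stand-ins). The algorithm never changes, only the target does:

    #include <cstdint>
    #include <cstdio>
    #include <functional>
    #include <string>

    // Toy stand-in for SHA-256, purely for illustration.
    uint64_t toyHash(const std::string& data, uint64_t nonce) {
        return std::hash<std::string>{}(data + std::to_string(nonce));
    }

    // The "protocol" below never changes; only the target does.
    // A lower target means more attempts on average, i.e. higher difficulty.
    uint64_t mine(const std::string& block, uint64_t target) {
        uint64_t nonce = 0;
        while (toyHash(block, nonce) > target)
            ++nonce;
        return nonce;
    }

    int main() {
        printf("easy nonce: %llu\n", (unsigned long long)mine("block", UINT64_MAX / 16));
        printf("hard nonce: %llu\n", (unsigned long long)mine("block", UINT64_MAX / 65536));
    }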

I'd attribute difficulty to the energy required to resolve a system. For example, lifting weights: the complexity of the action is the same, but the difficulty depends on how much weight you have to lift.


Complexity is difficulty of understanding. In the context of mathematical problems, that is the relevant kind of difficulty.


It seems you’re vastly misusing the words and their contexts here.


Sure, I suppose you'd just need a good definition for complexity. Notions like computational complexity have clear definitions while what I think you're describing might not. Or maybe it would require some thinking and be valid only in some limited regimes of "real world" effects, as you call them.


Something something about simple rules being able to describe complex behaviour. Example: you can describe a flock of birds in motion around an object with 2 or 3 rules.
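Roughly what I mean, as a minimal boids-style sketch (the constants and structure here are invented, not taken from any real simulation):

    #include <vector>

    struct Vec2 { float x = 0, y = 0; };
    struct Bird { Vec2 pos, vel; };

    // One step of the three classic flocking rules: cohesion (steer toward
    // the flock's centre), separation (avoid crowding) and alignment (match
    // neighbours' velocity). A real implementation would double-buffer the
    // flock and only consider nearby neighbours.
    void step(std::vector<Bird>& flock, float dt) {
        if (flock.size() < 2) return;
        float n = float(flock.size() - 1);
        for (Bird& b : flock) {
            Vec2 centre, avoid, match;
            for (const Bird& o : flock) {
                if (&o == &b) continue;
                centre.x += o.pos.x / n;             centre.y += o.pos.y / n;
                avoid.x  += (b.pos.x - o.pos.x) / n; avoid.y  += (b.pos.y - o.pos.y) / n;
                match.x  += o.vel.x / n;             match.y  += o.vel.y / n;
            }
            b.vel.x += 0.01f * (centre.x - b.pos.x) + 0.05f * avoid.x + 0.1f * (match.x - b.vel.x);
            b.vel.y += 0.01f * (centre.y - b.pos.y) + 0.05f * avoid.y + 0.1f * (match.y - b.vel.y);
            b.pos.x += b.vel.x * dt;
            b.pos.y += b.vel.y * dt;
        }
    }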

Complex rules yield stupid results. Example: tax codes in most countries.

Must be a quote but I wasn't able to find a source for it.


The problem with simple rules is the volume of computation. Theoretically you could write a tax code using quantum mechanics, but good luck calculating your tax each year (or before the heat death of the universe).

When systems get too complex to simulate from first principles, we have to resort to inductive reasoning--observe the system and then create rules as we see a need.

Yes the resulting rule set is a mess, like our tax code. But the physical system that the U.S. federal tax code (for example) covers--the United States of America--is mind-bogglingly complex.

We have trouble computationally simulating more than a certain number of neurons... there are billions of neurons in each human brain, and there are hundreds of millions of human brains interacting in the U.S. This does not even get into other physical phenomena like surface water or mineral distribution.

The results are stupid because we are too stupid to understand and analyze the system we're trying to describe and manage.


That something something is actually Agent Based Modeling / Simulation.

Back when I was in academia I used to develop ABMs to represent the behaviour of complex systems with a simple set of rules of agent action and interaction.

The game of Life is the quintessential example of that.
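For anyone who hasn't seen it, the entire rule set fits in a few lines. A minimal sketch of one generation on a wrap-around grid:

    #include <array>

    constexpr int W = 32, H = 32;
    using Grid = std::array<std::array<bool, W>, H>;

    // One Game of Life generation: a live cell survives with 2 or 3
    // neighbours, a dead cell is born with exactly 3. That's the whole
    // "agent" rule set.
    Grid step(const Grid& g) {
        Grid next{};
        for (int y = 0; y < H; ++y)
            for (int x = 0; x < W; ++x) {
                int n = 0;
                for (int dy = -1; dy <= 1; ++dy)
                    for (int dx = -1; dx <= 1; ++dx) {
                        if (dx == 0 && dy == 0) continue;
                        n += g[(y + dy + H) % H][(x + dx + W) % W];
                    }
                next[y][x] = g[y][x] ? (n == 2 || n == 3) : (n == 3);
            }
        return next;
    }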


Stringing together independent pieces doesn't produce a significant rise in complexity.

For instance, the payload which specifically looks for uranium centrifuge hardware is independent of the worm which carries the payload. They can be developed separately, by different people, and combined.

That specific worm could carry a payload to attack anything.

Or, that specific payload could be carried by any worm.

There is next to no coupling between "worm" and "payload".


Agreed. In computer science the smallest lever can move the largest mass and smaller levers are not inherently less sophisticated.


agreed. upvoted.


> const int CENTRIFUGE_RPM = 1203;

As the linked article points out, it wasn't just raising the speed; it was raising it in a subtle enough way to ruin the process while other experts routinely monitored the system.


Most importantly, it worked very successfully. Windows, Google Search, and most of the others mentioned here have had a huge number of problems. The word used was "sophisticated"; I think that also implies some level of near-flawlessness in the end result.


... except it was discovered and widely publicized.


After it had dismantled a whole country's nuclear weapons program...


7 years after deployment


There is no telling. It could still be out there.


It was found, and now everyone is using that code base to iterate on new weapons.


99.99% of the time, Windows works just fine for me.

Stuxnet only needed to work once.


It worked multiple times. And it needed to propagate, undetected, for months until it made its way into the nuclear facility.

It didn’t just work once.


I think vkou might have been talking about the precision of Stuxnet.


The complexity or quality of software code does not necessarily say anything about the complexity of the problem it solves.


so true. upvoted


The sophistication of a piece of code is not merely an attribute of its complexity.

Else a program with tons of accidental complexity (a badly written program by an intern) would be as good as a program with huge essential complexity (a 10-line program that takes tons of domain and/or programming knowledge to write)...


you're right. upvoted


There was a way to make your point politely and be taken seriously. This was not the way.


You mean the parent was being sarcastic? If so, it went over my head.


I think the number of zero-days included in Stuxnet is an important factor in making it sophisticated and complex.


The second piece of code is not more complex, but it is (presumably) a lot more sophisticated.

The fact that I had to prefix that with "(presumably)"—i.e. I can't actually tell using my own expertise—is evidence of that.


Have you written motor control software before? If you haven't, that might be why you can't tell. Whenever hardware is involved, with perhaps sometimes the exception of GPUs and workstation CPUs, I've noticed people's intuitions get a lot less reliable -- it's sort of like looking up the programming abstraction tower: lexical closures with higher-order functions to compute derivatives can seem awfully sophisticated to someone who's never seen anything like it.
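(To be concrete about that last example, here's a minimal sketch: a higher-order function that takes a function and returns a closure approximating its derivative with a central difference.)

    #include <cstdio>
    #include <functional>

    // Returns a closure that approximates f's derivative numerically.
    std::function<double(double)> derivative(std::function<double(double)> f,
                                             double h = 1e-6) {
        return [f, h](double x) { return (f(x + h) - f(x - h)) / (2 * h); };
    }

    int main() {
        auto dsquare = derivative([](double x) { return x * x; });
        printf("%f\n", dsquare(3.0));  // prints roughly 6.0
    }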

Of course if the sophistication is more about what they needed to know in order to break the things (and make that code change), then talking about this subsystem by itself that's either way lower or roughly the same as what they'd need to know to build and operate their own centrifuges. Much less, if they only needed to focus on one part of the process (motor control) that would cause problems (which might just be a brief consultant call with our own nuclear physicists and engineers, I don't know, nuclear science details seem as mysterious to me as high level language details might to impoverished programmers), or about the same, if they knew everything the Iranians knew about the systems (did we ever find out if they got all the blueprints and so forth and built replicas for end-to-end testing?) plus a bit extra on how and where to make it break without easily being detected.

Anyway how sophisticated can they really be when they didn't even use source control? (Old joke... https://news.ycombinator.com/item?id=4052597)


Uh, that’s one small but important component of Stuxnet. The complexity is in the delivery mechanism, and the way it disguised itself, and the way it actually broke the centrifuges.


upvoted thanks


From https://en.wikipedia.org/wiki/Sophistication

> Sophistication has come to mean a few things, but its original definition was "to denature, or simplify". Today it is common as a measure of refinement

So no, it can in many cases even be the precise opposite of complexity.

It actually originally comes from "sophistry", which is an ancient Greek discipline of wisdom and excellence. I would generally associate the word with a high level of complexity that has been expertly reduced and refined to an elegant quality.


The sophists, as you say, were ancient Greek teachers.

But sophistry now means something rather different: using subtle, specious reasoning to deceive.


Typically, different words refer to different things. Most often, words considered synonyms actually refer to slightly different things.


>Can you really say that the second bit of code is more "complex"?

Yes. Take fastInvSqrt(), for example. Cleve Moler learned about this trick from code written by William Kahan and K.C. Ng at Berkeley around 1986.

  float fastInvSqrt(float x) {
    int i = *(int*)&x;                     // reinterpret the float's bits as an integer
    i = 0x5f3759df - (i >> 1);             // "magic constant" bit trick: a first approximation of 1/sqrt(x)
    float y = *(float*)&i;                 // reinterpret the bits back as a float
    return y * (1.5F - 0.5F * x * y * y);  // one Newton-Raphson step refines the estimate
  }
Simple instructions, VERY complex code. Not as complex as this one, though, which took almost 20 years to come about:

  float fastInvSqrt(float x) {
    int i = *(int*)&x;
    i = 0x5f375a86 - (i >> 1);
    float y = *(float*)&i;
    return y * (1.5F - 0.5F * x * y * y);
  }
Chris Lomont says "The new constant 0x5f375a86 appears to perform slightly better than the original one. Since both are approximations, either works well in practice. I would like to find the original author if possible, and see if the method was derived or just guessed and tested."


A model aircraft can be simple, but understanding the principles behind its design can be hard. IMHO, these two pieces of code are extremely simple in terms of logic, instructions and computations, but they are sophisticated; the second is even more sophisticated than the first.

Root of the debate: words are not well-defined.


> But do those things really contribute to the sophistication of the software?

> Can you really say that the second bit of code is more "complex"?

I don't think you should equate complexity with sophistication.


I, personally, would differentiate between complex and sophisticated.

That is just one line of code, sure. But I can't imagine what it took to get that line of code there, and everything that comes with that. How many people were involved, how many PhDs, how many years of experience across a range of fields, and not just any fields but fields like espionage.

My uneducated brain would still put "most sophisticated software ever written" in the hyperbole box, but even then I'm hesitating.


yeah. in order to agree with that "most sophisticated software" claim i think he'd need to compare it to some other candidates for that title.


Hell, sure yes. The complexity is in the data. At the end of the day, it is all 0s and 1s. It is the pattern/effect that matters.


Wouldn't the people who know the physical things just write requirements for those farther on down the chain?

The threat analysts say, we need to destroy Iran's ability to make nuclear weapons. The nuclear weapons specialists say, the part where we can best do that is by somehow breaking their centrifuges. The centrifuge technician they call up says, "well, x RPMs will really ruin those things. And it would be hard to tell if they did it like this..." Then the software guys make the code that ruins the centrifuge, and the red team incorporates it into their fancy worm, with specs on what exactly to look for.

Ultimately, it was kind of a failure in that anyone found out about it. Maybe there were better programs, and because they were better we never heard about them at all. But still it's pretty amazing :)


The key part is that you have to bring all of those together. In hindsight it might be straightforward, but if you had a blank slate, how would you approach the problem of "stop Iran from refining uranium"?


To me the most surprising result would be if it cost less than bombing the nuclear facility. At $100k per bomb, Stuxnet looks affordable, plus all the expertise and other attack vectors you get from piecing it together.


$100K for a bomb? I have no idea what bombs really cost, but if we go with that number, they could have dropped a lot of bombs for the price of Stuxnet. One junior engineer working for a year costs that much. We know that expertise in a lot of fields was involved, and that implies a number of engineers.

I'm going to guess a bomb is cheaper. Of course a bomb has a lot of other disadvantages which is why it wasn't used.


One particularly expensive component of Stuxnet is deniability. Although the commonly accepted theory for Stuxnet's invention is "a state actor", specifically the United States, there's no proof of that at all. And conjecture without proof poses no threat to the US government.

If the government were, on the other hand, to bomb Iranian nuclear facilities, one small mistake in the plan could ruin their chances of deniability, bringing down international condemnation on the US.


>and deep understanding of the operating systems referenced above

I think this understates it; it required a deeper understanding of the vulnerabilities of those operating systems than anyone else in the world had, including the creators of the operating systems.


Well, in the case of Windows, I recall that maintaining backward compatibility with a variety of applications required knowledge of the resource demands of each of those applications, with each of those applications operating in a different domain. Similarly, creating memory allocators is something of a "black art": it's a matter of writing a generically good allocator, but one which doesn't fragment memory under "normal usage patterns", and then you have to learn what those normal usage patterns are, which involves understanding however many applications.

So the question of "sophistication" is both subtle and difficult to call.

Edit: And the production of an algorithm that's a conglomeration of ad-hoc processes might qualify as another sort of sophistication; see "the hardest program I ever wrote":

http://journal.stuffwithstuff.com/2015/09/08/the-hardest-pro...


Most of the bullet points you've listed can be summed up as "business logic." I'm sure the Stuxnet programmers worked with physicists and industrial controls specialists.

Developing software for, say, jet engines requires sophisticated knowledge of jet engines, which is probably about equally complex. But it's manageable because programmers work with engineers who are subject matter experts.


Or, put simply: how do you mess up a centrifuge controlled by SCADA without them knowing? Just change the speed and report another speed. Done.
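Something like this toy loop, conceptually (the names and numbers here are invented for illustration; this is not how the real payload was written):

    #include <cstdio>

    struct Centrifuge {
        float commandedRpm = 1064.0f;  // what the operators asked for
        float actualRpm    = 1064.0f;  // what the motor is really doing
    };

    // Drive the hardware out of range while under attack...
    void controlStep(Centrifuge& c, bool sabotage) {
        c.actualRpm = sabotage ? 1410.0f : c.commandedRpm;
    }

    // ...but always report the expected value, so the monitoring screens look normal.
    float reportedRpm(const Centrifuge& c) {
        return c.commandedRpm;
    }

    int main() {
        Centrifuge c;
        controlStep(c, true);
        printf("actual %.0f rpm, reported %.0f rpm\n", c.actualRpm, reportedRpm(c));
    }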

You don't need to know classical mechanics to use a bike, or know about internal combustion engines to use a car.


Then think about the expertise needed to put together a self-driving car... from sensors to ML...


The breadth of the expertise is: writing the worm, plus the domain knowledge of a nuclear engineer. Period. You could argue that the control software of those centrifuges is as sophisticated as the worm, since it requires knowledge in two separate domains: writing software and nuclear engineering. Same goes for any ERP software, which requires the contribution of software experts and domain experts.



