
I'd argue that Google Search is much more sophisticated than Stuxnet. Windows is much more sophisticated. Linux is more sophisticated than Stuxnet. The list goes on.

We tend to ignore the sophistication of things we are familiar with, and hype those that surprise. But that's not a fair measure of anything.




In my view, the sophistication is implied by the breadth of expertise required to put the whole thing together. Google Search and the OS landscape are for sure broad and sophisticated. However, their development was accomplished by computer scientists.

In order for Stuxnet to be effective, it was necessary to employ expertise in:

- Uranium enrichment methods and processes

- Capital equipment control systems and their development environments

- Theory of operation of centrifuge machines

- Corporate espionage of some sort

- Organizational management skills that can pull all that together

- and deep understanding of the operating systems referenced above


But do those things really contribute to the sophistication of the software? For example, imagine some code written with no understanding of uranium enrichment:

    const int CENTRIFUGE_RPM = 500;
And then some other code written with a deep understanding of uranium enrichment:

    const int CENTRIFUGE_RPM = 1203;
Can you really say that the second bit of code is more "complex"? Same goes for stolen driver signing keys and some of the other things mentioned in the post.

Other large software projects like operating systems or Google Search involve much more complex software concepts, which I think is the primary thing that should be measured when discussing the sophistication of software.


>Can you really say that the second bit of code is more "complex"?

Yes.

Complexity in the sense discussed is related to the domain knowledge (including CS knowledge) required for the program to be written and work well.

Else even a trivial BS program could be very complex, just sprinkle it with gotos and unnecessarily convoluted code...


This is such a powerful distinction that I feel it should help us rethink language paradigms. Complexity is not (just) the complications one can impose by construct or the involutions required of one's algorithms, it's the overall real-world system your code addresses.

Simple programs which are coded simply may address complex phenomena to complex ends--perhaps that's even the ideal?


You might enjoy Fred Brooks's essay "No Silver Bullet", where he distinguishes between "Accidental Complexity" (basically, complexity created by software engineers when implementing a solution) and "Essential Complexity" (complexity that arises because software is written to solve some real world problem and the world is complex).


Most people perceive complexity as things they don't understand. In that case, complexity will be relative.


> Most people perceive complexity as things they don't understand.

I don't think this is true. For example, as a math teacher, I couldn't do a very good job predicting how easy or difficult students would find particular problems. But I could easily predict which problems would be easier and which would be more difficult. I could do that even though I personally understood all the problems.


I don't think difficulty is complexity. For example, the Bitcoin mining protocol's complexity stays the same, but its difficulty goes up and down.

I'd attribute difficulty to the energy required to resolve a system. Take pulling weights: the complexity of the action is the same, but the difficulty depends on how much weight you pull.
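
To make that concrete, here is a toy sketch in C. The hash is a stand-in bit mixer, not real SHA-256, and none of this resembles Bitcoin's actual code; the point is just that the mining loop (the "protocol") never changes, while the target alone sets the difficulty.

    #include <stdint.h>
    #include <stdio.h>

    /* Stand-in for SHA-256: a toy 64-bit bit mixer, NOT a real hash. */
    static uint64_t toy_hash(uint64_t x) {
        x ^= x >> 33; x *= 0xff51afd7ed558ccdULL;
        x ^= x >> 33; x *= 0xc4ceb9fe1a85ec53ULL;
        return x ^ (x >> 33);
    }

    /* The "protocol" is this loop and it never changes; only the target
       does. A lower target means higher difficulty, same complexity. */
    static uint64_t mine(uint64_t target) {
        uint64_t nonce = 0;
        while (toy_hash(++nonce) >= target)
            ;
        return nonce;
    }

    int main(void) {
        printf("easy:   nonce %llu\n", (unsigned long long)mine(1ULL << 56));
        printf("harder: nonce %llu\n", (unsigned long long)mine(1ULL << 44));
        return 0;
    }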


Complexity is difficulty of understanding. In the context of mathematical problems, that is the relevant kind of difficulty.


It seems you’re vastly misusing the words and their contexts here.


Sure, I suppose you'd just need a good definition for complexity. Notions like computational complexity have clear definitions, while what I think you're describing might not. Or maybe it would require some thinking and be valid only in some limited regimes of "real world" effects, as you call them.


Something something about simple rules being able to describe complex behaviour. Example: you can describe a flock of birds in motion around an object with 2 or 3 rules.
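
For the curious, here's a toy boids-style sketch of those rules in C. The weights are invented and a real boids model would also limit each rule to a neighbour radius, so treat it as purely illustrative.

    #include <stddef.h>

    typedef struct { float x, y, vx, vy; } Boid;

    /* Three rules (cohesion, alignment, separation), a few lines each,
       yet together they produce convincing flocking behaviour. */
    void flock_step(Boid *b, size_t n, float dt) {
        if (n < 2) return;
        for (size_t i = 0; i < n; i++) {
            float cx = 0, cy = 0, ax = 0, ay = 0, sx = 0, sy = 0;
            for (size_t j = 0; j < n; j++) {
                if (j == i) continue;
                cx += b[j].x;  cy += b[j].y;   /* cohesion: flock center   */
                ax += b[j].vx; ay += b[j].vy;  /* alignment: match velocity */
                sx += b[i].x - b[j].x;         /* separation: keep apart    */
                sy += b[i].y - b[j].y;
            }
            float m = 1.0f / (float)(n - 1);
            b[i].vx += 0.01f * (cx * m - b[i].x)   /* cohesion   */
                     + 0.10f * (ax * m - b[i].vx)  /* alignment  */
                     + 0.05f * (sx * m);           /* separation */
            b[i].vy += 0.01f * (cy * m - b[i].y)
                     + 0.10f * (ay * m - b[i].vy)
                     + 0.05f * (sy * m);
        }
        for (size_t i = 0; i < n; i++) {       /* integrate positions */
            b[i].x += b[i].vx * dt;
            b[i].y += b[i].vy * dt;
        }
    }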

Complex rules yield stupid results. Example: tax codes in most countries.

Must be a quote but I wasn't able to find a source for it.


The problem with simple rules is the volume of computation. Theoretically you could write a tax code using quantum mechanics, but good luck calculating your tax each year (or before the heat death of the universe).

When systems get too complex to simulate from first principles, we have to resort to inductive reasoning--observe the system and then create rules as we see a need.

Yes the resulting rule set is a mess, like our tax code. But the physical system that the U.S. federal tax code (for example) covers--the United States of America--is mind-bogglingly complex.

We have trouble computationally simulating more than a certain number of neurons... there are billions of neurons in each human brain, and there are hundreds of millions of human brains interacting in the U.S. This does not even get into other physical phenomena like surface water or mineral distribution.

The results are stupid because we are too stupid to understand and analyze the system we're trying to describe and manage.


That something something is actually Agent Based Modeling / Simulation.

Back when I was in academia I used to develop ABMs to represent the behaviour of complex systems with a simple set of rules of agent action and interaction.

The game of Life is the quintessential example of that.
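
To make "simple rules" concrete, here is a minimal Life step in C, assuming a wrapped W x H grid (dimensions are arbitrary). The entire rule set is the last line of the inner loop.

    /* One Game of Life step: a live cell survives with 2 or 3 live
       neighbours; a dead cell becomes live with exactly 3. */
    #define W 64
    #define H 48

    void life_step(const unsigned char in[H][W], unsigned char out[H][W]) {
        for (int y = 0; y < H; y++) {
            for (int x = 0; x < W; x++) {
                int n = 0;
                for (int dy = -1; dy <= 1; dy++)
                    for (int dx = -1; dx <= 1; dx++)
                        if (dx || dy)          /* count the 8 neighbours */
                            n += in[(y + dy + H) % H][(x + dx + W) % W];
                out[y][x] = (n == 3) || (n == 2 && in[y][x]);
            }
        }
    }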


Stringing together independent pieces doesn't produce a significant rise in complexity.

For instance, the payload which specifically looks for uranium centrifuge hardware is independent of the worm which carries the payload. They can be developed separately, by different people, and combined.

That specific worm could carry a payload to attack anything.

Or, that specific payload could be carried by any worm.

There is next to no coupling between "worm" and "payload".
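
A hypothetical sketch of what that low coupling looks like in code (all names invented; nothing here is from Stuxnet): the carrier and the payload only agree on a calling convention.

    /* The carrier knows nothing about what the payload does; it just
       agrees on an interface. Either half can be swapped independently. */
    typedef struct {
        int  (*wants_host)(const char *host_fingerprint); /* target test */
        void (*run)(void);                                /* the effect  */
    } Payload;

    void carrier_visit_host(const char *fingerprint, const Payload *p) {
        /* Propagation logic lives here and never changes per payload. */
        if (p->wants_host(fingerprint))
            p->run();
    }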


Agreed. In computer science the smallest lever can move the largest mass and smaller levers are not inherently less sophisticated.


agreed. upvoted.


> const int CENTRIFUGE_RPM = 1203;

As the linked article points out, it wasn't just raising the speed; it was raising it in a subtle enough way to ruin the process while other experts routinely monitored the system.


Most importantly it worked very successfully. With Windows, Google search, or most of the others mentioned here, they have had a huge number of problems. The word used was "sophisticated." I think that also implies some level of near-flawlessness in the end result.


... except it was discovered and widely publicized.


After it had dismantled a whole country's nuclear weapons program...


7 years after deployment


There is no telling. It could still be out there.


It was found, and now everyone is using that code base to iterate on new weapons.


99.99% of the time, Windows works just fine for me.

Stuxnet only needed to work once.


It worked multiple times. And it needed to propagate, undetected, for months until it made its way into the nuclear facility.

It didn’t just work once.


I think vkou might have been talking about the precision of Stuxnet.


The complexity or quality of a piece of software does not necessarily say anything about the complexity of the problem it solves.


so true. upvoted


The sophistication of a piece of code is not merely an attribute of its complexity.

Otherwise a program with tons of accidental complexity (a badly written program by an intern) would be considered equal to a program with huge essential complexity (a 10-line program that takes tons of domain and/or programming knowledge to write)...


you're right. upvoted


There was a way to make your point politely and be taken seriously. This was not the way.


You mean the parent was being sarcastic? If so, it went over my head.


I think the number of zero-days included in Stuxnet is an important factor in making it sophisticated and complex.


The second piece of code is not more complex, but it is (presumably) a lot more sophisticated.

The fact that I had to prefix that with "(presumably)"—i.e. I can't actually tell using my own expertise—is evidence of that.


Have you written motor control software before? If you haven't, that might be why you can't tell. Whenever hardware is involved, with perhaps sometimes the exception of GPUs and workstation CPUs, I've noticed people's intuitions get a lot less reliable -- it's sort of like looking up the programming abstraction tower: lexical closures with higher-order functions to compute derivatives can seem awfully sophisticated to someone who's never seen anything like it.
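
For anyone who hasn't seen that trick: C has no closures, but even a plain function-pointer version shows the idea. A minimal sketch using a central-difference approximation, with h chosen naively:

    #include <math.h>
    #include <stdio.h>

    /* A higher-order function: takes a function f and approximates
       f'(x) with a central difference. */
    double deriv_at(double (*f)(double), double x) {
        const double h = 1e-6;
        return (f(x + h) - f(x - h)) / (2.0 * h);
    }

    int main(void) {
        printf("d/dx sin at 0 = %f\n", deriv_at(sin, 0.0)); /* ~1.0 */
        return 0;
    }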

Of course, if the sophistication is more about what they needed to know in order to break the things (and make that code change), then for this subsystem by itself that's either far less than or roughly the same as what they'd need to know to build and operate their own centrifuges. Far less, if they only needed to focus on the one part of the process (motor control) that would cause problems; that might amount to a brief consulting call with our own nuclear physicists and engineers (nuclear science details seem as mysterious to me as high-level language details might to programmers who have never seen them). Roughly the same, if they knew everything the Iranians knew about the systems (did we ever find out whether they got all the blueprints and built replicas for end-to-end testing?) plus a bit extra on how and where to make things break without easily being detected.

Anyway how sophisticated can they really be when they didn't even use source control? (Old joke... https://news.ycombinator.com/item?id=4052597)


Uh, that’s one small but important component of Stuxnet. The complexity is in the delivery mechanism, the way it disguised itself, and the way it actually broke the centrifuges.


upvoted thanks


From https://en.wikipedia.org/wiki/Sophistication

> Sophistication has come to mean a few things, but its original definition was "to denature, or simplify". Today it is common as a measure of refinement

So no, it can in many cases even be the precise opposite of complexity.

It actually originally comes from "sophistry", which is an ancient Greek discipline of wisdom and excellence. I would generally associate the word with a high level of complexity that has been expertly reduced and refined to an elegant quality.


The sophists, as you say, were ancient Greek teachers.

But sophistry now means something rather different: using subtle, specious reasoning to deceive.


Typically, different words refer to different things. Most often, words considered synonyms actually refer to slightly different things.


>Can you really say that the second bit of code is more "complex"?

Yes. Take fastInvSqrt() for example. Cleve Moler learned about this trick from code written by William Kahan and K.C. Ng at Berkeley around 1986.

  float fastInvSqrt(float x) {
    int i = *(int*)&x;                     // reinterpret the float's bits as an int
    i = 0x5f3759df - (i >> 1);             // magic constant minus half the bit pattern
    float y = *(float*)&i;                 // reinterpret back: a rough 1/sqrt(x) guess
    return y * (1.5F - 0.5F * x * y * y);  // one Newton-Raphson step refines the guess
  }
Simple instructions, VERY complex code. Not as complex as this one, though, which took almost 20 years to come about:

  float fastInvSqrt(float x) {
    int i = *(int*)&x;
    i = 0x5f375a86 - (i >> 1);
    float y = *(float*)&i;
    return y * (1.5F - 0.5F * x * y * y);
  }
Chris Lomont says "The new constant 0x5f375a86 appears to perform slightly better than the original one. Since both are approximations, either works well in practice. I would like to find the original author if possible, and see if the method was derived or just guessed and tested."


A model aircraft can be simple, but an understanding of the principles for designing it can be hard. IMHO, these two pieces of code are extremely simple in terms of logic, instructions, and computations. But they are sophisticated; the second is even more sophisticated than the first.

Root of the debate: words are not well-defined.


> But do those things really contribute to the sophistication of the software?

> Can you really say that the second bit of code is more "complex"?

I don't think you should equate complexity with sophistication.


I, personally, would differentiate between complex and sophisticated.

That is just one line of code, sure. But I can't imagine what it took to get that line of code there, and everything that comes with that: how many people were involved, the PhDs, the years of experience in a range of fields, and not just any field but fields like espionage.

My uneducated brain would still put "most sophisticated software ever written" in the hyperbole box, but even then I'm hesitating.


yeah. in order to agree with that "most sophisticated software" claim i think he'd need to compare it to some other candidates for that title.


Hell, sure yes. The complexity is in the data. At the end of the day, it is all 0s and 1s. It is the pattern/effect that matters.


Wouldn't the people who know the physical things just write requirements for those farther on down the chain?

The threat analysts say, we need to destroy Iran's ability to make nuclear weapons. The nuclear weapons specialists say, the part where we can best do that is by somehow breaking their centrifuges. The centrifuge technician they call up says, "well, x RPMs will really ruin those things. And it would be hard to tell if they did it like this..." Then the software guys make the code that ruins the centrifuge, and the red team incorporates it into their fancy worm, with specs on what exactly to look for.

Ultimately, it was kind of a failure in that anyone found out about it. Maybe there were better programs, and because they were better we never heard about them at all. But still it's pretty amazing :)


The key part is that you have to bring all of those all together. In hindsight it might be straightforward but if you had a blank slate, how would you approach the problem of "stop Iran from refining Uranium"?


To me the most surprising result would be if it cost less than bombing the nuclear facility. At $100k per bomb, Stuxnet looks affordable, plus you get all the expertise and other attack vectors from piecing it together.


$100K for a bomb? I have no idea what bombs really cost, but if we go with that number, they could have dropped a lot of bombs for what Stuxnet must have cost. One junior engineer working for a year costs that much, and we know that expertise in a lot of fields was involved, which implies a number of engineers.

I'm going to guess a bomb is cheaper. Of course a bomb has a lot of other disadvantages which is why it wasn't used.


One particularly expensive component of Stuxnet is deniability. Although the commonly accepted theory for Stuxnet's origin is "a state actor", specifically the United States, there's no proof of that at all. And conjecture without proof poses no threat to the US government.

If the government were, on the other hand, to bomb Iranian nuclear facilities, one small mistake in the plan could ruin their chances of deniability, bringing down international condemnation on the US.


>and deep understanding of the operating systems referenced above

I think this understates it; it required a deeper understanding of the vulnerabilities of those operating systems than anyone else in the world had, including the creators of the operating systems.


Well, in the case of Windows, I recall that maintaining backward compatibility with a variety of applications required knowledge of the resource demands of each of those applications, with each application operating in a different domain. Similarly, creating memory allocators is something of a "black art": you want a generically good allocator, but one which doesn't fragment memory under "normal usage patterns", and then you have to learn what those normal usage patterns are, which involves understanding however many applications.
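
To illustrate one corner of that black art: when you do know the usage pattern (say, lots of same-sized objects), a fixed-size pool sidesteps fragmentation entirely. A toy sketch, not production code:

    #include <stddef.h>

    /* Toy fixed-size pool: one classic answer to fragmentation when the
       usage pattern is known in advance. Freed slots form a free list. */
    #define SLOT  64
    #define SLOTS 1024

    static unsigned char pool[SLOTS][SLOT];
    static void *freelist = NULL;
    static size_t next_fresh = 0;

    void *pool_alloc(void) {
        if (freelist) {                       /* reuse a freed slot */
            void *p = freelist;
            freelist = *(void **)p;
            return p;
        }
        if (next_fresh < SLOTS)
            return pool[next_fresh++];
        return NULL;                          /* pool exhausted */
    }

    void pool_free(void *p) {
        *(void **)p = freelist;               /* push onto free list */
        freelist = p;
    }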

So the question of "sophistication" is both subtle and difficult to call.

Edit: And the production of an algorithm that's a conglomeration of ad-hoc processes might qualify as another sort of sophistication, see "the hardest program I ever wrote":

http://journal.stuffwithstuff.com/2015/09/08/the-hardest-pro...


Most of the bullet points you've listed can be summed up as "business logic." I'm sure the Stuxnet programmers worked with physicists and industrial controls specialists.

Developing software for, say, jet engines requires sophisticated knowledge of jet engines, which is probably about equally complex. But it's manageable because programmers work with engineers who are subject matter experts.


Or, put simply: how do you mess up a centrifuge controlled by SCADA without them knowing? Just change the speed and report another speed, done.
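
At the pseudocode level the idea really is that small. The names and structure below are entirely invented (real PLC code looks nothing like this); the frequency figures are the ones from public reporting on Stuxnet, roughly 1064 Hz nominal and 1410 Hz during the attack.

    /* Hypothetical sketch: drive one value, report another. */
    static int actual_hz   = 1064;    /* nominal operating frequency */
    static int reported_hz = 1064;

    void sabotage_tick(void) {
        actual_hz = 1410;             /* push the hardware out of spec  */
        reported_hz = 1064;           /* monitoring still sees nominal  */
    }

    int scada_read_hz(void) {         /* what the monitoring layer polls */
        return reported_hz;
    }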

You don't need to know classical mechanics to use a bike, or know about internal combustion engines to use a car.


Then think about the expertise needed to put together a self-driving car... from sensors to ML...


The breadth of the expertise is: writing the worm, plus the domain knowledge of a nuclear engineer. Period. You could argue that the control software of those centrifuges is as sophisticated as the worm, since it requires knowledge in two separate domains: writing software and nuclear engineering. The same goes for any ERP software, which requires the contribution of software experts and domain experts.


I think Windows and Stuxnet are sophisticated in different ways.

Windows has to cover a huge area and a lot of "known" unknowns and be able to recover (somewhat) reliably. Stuff breaks, you get weird error messages, that driver for your Wi-Fi never really worked right, but at the end of the day you have a computer that works pretty well, and that's quite remarkable. The same is of course true of Linux and other operating systems.

Stuxnet is a hyper-specialized piece of software (malware) that cannot fail or it loses its purpose. The authors clearly knew they had to have multiple fallbacks for every step of the process, but I find it very impressive that it reached its end goal successfully and without being discovered. A lot of software (including malware) breaks because of regular software bugs, environments that differ from the expected, interference by the user; the list goes on. For Stuxnet to have avoided all of those, that is quite sophisticated.


I agree.

It's the most sophisticated piece of malware, that's for sure (at least counting the ones we know of).

But calling it the most sophisticated piece of software is too big of a stretch.

That said, other answers to this question include what we would traditionally consider contenders (like the Linux kernel); it just happens that the submitter decided to submit this specific answer. I don't know whether this was the top answer before it exploded here, but it sure is now.


> It's the most sophisticated piece of malware, that's for sure (at least counting the ones we know of).

Isn't Stuxnet a part of a family of similar nation state malware that would also include Flame and Duqu?


They are a family, as in, all of them were almost certainly created by the same group.

Symantec said that Duqu is "near identical" to Stuxnet. As for Flame, Kaspersky[0] initially said that it bore no resemblance to Stuxnet, and then later discovered that the two even shared a zero day in their early versions.

From my understanding, I don't necessarily consider them different software, more a single piece of software plus forks by the same group for different purposes and with different zero days.

Stuxnet just happened to become the best known of the three, for a number of reasons (most destructive, attacked the most sensitive targets, the one that got out of control and spread outside Iran uncontrollably, the first to be discovered...), so I refer to Stuxnet as the original and to Flame and Duqu more as forks than as completely different pieces of software.

Asking which of the three is the most sophisticated would be like trying to figure out which Linux-based OS is the most sophisticated, except that in this scenario we only have three distros (maybe four with Gauss) and they were all created by the same group. There's really no point in trying to compare their sophistication.

[0] Before people bash me for using Kaspersky as a source: Kaspersky, the Iranian CERT, and a university in Budapest were the ones that initially discovered Flame, and Kaspersky is the group that published the first detailed analysis of Flame.


Off topic, but your defense of referencing Kaspersky makes me wonder why people would see a problem with it. I'm not familiar with the field and don't know who's who.


Only if you're on "Team USA". Looking on from outside, it seems pretty obvious to me that a Russian security company might provide useful insights into US malware operations that a large US security company would be less inclined to report on, or would not report on immediately.

Otherwise it's just your basic mudslinging: both Kaspersky and US security companies are likely to do their governments favours, in particular by selectively not reporting things, both willingly and under pressure. If you're a US citizen working for a US security company and you stumbled upon a US malware operation that appeared to be doing something benign, such as preventing nuclear whatnots, you might be disinclined to report on it for fear of ruining a US mission, and even look past the fact that it used such a risky, dangerous type of software to do it (being a worm/virus; remember that Stuxnet also disrupted and got into places that weren't targets).

Back when Stuxnet was active, I closely followed the story, and the existence of the (airgap-hopping) virus was discovered long before people got any solid ideas about its purpose. When the first reports finally came that the special control software checked for machines running at a frequency only used in either some Finnish industrial plant or these Iranian refineries[0], they did not come from a US security company.

[0] This part is a bit vague sorry. I wish I had sourced/fact-checked this part of the story better, years ago. There was so much going on.


They're a Russian company and semi-recently Trump banned their software from government agencies.

People theorize they're controlled by the Russian government, but I've never come across any evidence that they're anything other than a top-tier security company.

They have made some fairly bold moves in the past though, like cleverly calling out other AV companies that were copying their detections [0], and kind of embarrassing the NSA [1] when an NSA employee took malware/cyber weapons home to a PC running Kaspersky AV, which detected the malware and sent it back to Kaspersky's servers for analysis.

[0]https://www.theregister.co.uk/2010/02/10/kaspersky_malware_d...

[1] https://www.bleepingcomputer.com/news/security/nsa-employee-...


In Kaspersky's defense, they have started making their source code auditable for certain customers. Kaspersky is well aware of how they are perceived as a company, and they are aware that if anyone ever traces any of their activities back to the KGB, it's game over for them. I can't pretend I trust Kaspersky 100%, but I can see why others might.


From [0] (https://www.theregister.co.uk/2010/02/10/kaspersky_malware_d...):

"I've received feedback from people who were just focusing on the question why other anti-virus companies would detect a clean file we uploaded. And I can only repeat as I did in the blog: This could have happened to us as well," Kalkuhl explained.

Well, he clearly says the test was meant to expose the "negative effect of cheap static on-demand tests", not that others copied from them, because this seems to be routine and they do the same.


> They're a Russian company and semi-recently Trump banned their software from government agencies.

I know it's popular to bash Trump, but it was the DHS that banned the software, not Trump:

In a binding directive, acting homeland security secretary Elaine Duke ordered that federal civilian agencies identify Kaspersky Lab software on their networks. After 90 days, unless otherwise directed, they must remove the software, on the grounds that the company has connections to the Russian government and its software poses a security risk.

Which came after the GSA removed them from the list of approved vendors:

The directive comes months after the federal General Services Administration, the agency in charge of government purchasing, removed Kaspersky from its list of approved vendors. In doing so, the GSA suggested a vulnerability exists with Kaspersky that could give the Kremlin backdoor access to the systems the company protects.

https://www.washingtonpost.com/world/national-security/us-to...


I say this without having seen the code base for either, but I'd be surprised if Stuxnet's code base was anywhere near as large or had as many moving pieces. Still, it's incredible to imagine the knowledge base that needed to go into Stuxnet to get things off the ground.

Google Search was originally written by two guys in graduate school and has been refined and rewritten many times since then. I'm sure the code base is complicated, and undoubtedly some of the greatest minds in software engineering and computer science have worked on it. The same goes for Linux, which was written by one guy and grew from there.

On the other hand, Stuxnet isn't something that a few brilliant graduate students could have put together. To even get this thing off the ground, you need people with backgrounds in nuclear physics and/or chemistry, operating system specialists, people with knowledge of industrial equipment, networking experts, an espionage network, and competent management to pull it all together. Plus, you need to keep the whole project secret. Oh, and funding. Lots of funding.

I'd call that sophistication in that you can't even think about starting to tackle this problem if you're just two guys in a garage.


I think of it like this: if everyone who reads these comments on HN got together, we could engineer a very good OS.

I doubt we could come close to solving the problem of "stopping Iran's nuke production without killing anyone or starting a war".


Without any kind of metric for “sophisticated” it’s all subjective anyhow. I like Stuxnet as an example - it’s devious and a true hacker approach, albeit as blackhat as they come.


i think for something to be sophisticated we are looking at how complex it is. this worm does nothing new in that regard (taking advantage of 0days, hiding, covering tracks, etc.); it is no more sophisticated than a regular worm. quora is a fucking joke.


Personally, I think the Stuxnet work is comparable to a Rube Goldberg contraption.

So you could ask: is a Rube Goldberg machine more sophisticated than, say, a computer? Maybe not on a strictly technical level, but again, without a metric it's all about how we feel about it.

Anyways, I thought it was a great writeup that explains at least one aspect of what sophisticated software is, in a language most anyone could understand.


I think for something to be sophisticated we are looking at the metric that differentiates fine wines and cheeses from plebeian non-fine wine and cheeses.

If we can just capture that essence, we will wield the power of sophistication in our hands.


Oh, in that case it's just placebo, price, and primed expectations.


You mean how much money is charged?


Big != sophisticated. I’m not denying that there’s a ton of effort and features that go into Windows, but operating systems are well known, and I’m sure most of the code powering Windows is not all that sophisticated outside of some core components.

It’s sort of like comparing a skyscraper to an iPhone. Sure the skyscraper requires a lot more manual labor, but the iPhone is more sophisticated. It took ~80 years from when the Empire State Building was built to when the iPhone was built. The iPhone is more sophisticated but it’d still take more time and resources to make another Empire State Building.

Sorry if that’s a poor analogy; it’s the best I’ve got right now.


You could debate this all day long for various values of "sophisticated." I think the author just meant some variation on "amazingly devious."


Let's not forget software used for extremely complicated and risky operations like the Mars rovers or the Rosetta mission; developers did some quite amazing things there with very limited hardware resources...


To me the primary difference is that the software you mention performs its tasks in the open with cooperative users.

Stuxnet installed itself without cooperation, hid itself perfectly and still completed its objective flawlessly against a hostile user base.


Windows certainly has more undiscovered Windows/Driver exploits in it than Stuxnet ever had!


There's sophistication of the domain, and of the code. They are independent.

For example, it might take years of research to develop a formula for calculating something, but the final code can be a very simple one-liner.
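
Marsaglia's xorshift generators are a nice example of this: the linear algebra over GF(2) in the paper is where the work went, and the code that falls out is three lines.

    #include <stdint.h>

    /* xorshift32: the shift triple (13, 17, 5) comes from Marsaglia's
       research and gives a full 2^32 - 1 period. Seed must be nonzero. */
    uint32_t xorshift32(uint32_t *state) {
        uint32_t x = *state;
        x ^= x << 13;
        x ^= x >> 17;
        x ^= x << 5;
        return *state = x;
    }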


This comparison is ridiculous. Searching the internet can be imperfect, it can be lossy, and there aren't any real consequences. We are talking about an ad platform after all. If you search for kittens and you get back 345 results or 101 results or 2345 results, does it really matter? No, it has no consequence to anyone.


I agree on the sophistication part, but I think you are missing the resources used to develop this.

The responsible party (or parties) did not have access to the resources, manpower, or infrastructure of Google, or even of an ordinary enterprise.


MariaDB and PostgreSQL. Amazing software that is open source for us to dive in and play with. FoundationDB was a recent treat.


>I'd argue that Google Search is much more sophisticated than Stuxnet. Windows is much more sophisticated.

Not to be rude, but it really doesn't sound like you read the article.

Especially when you claim that Windows is more sophisticated. Stuxnet had to get past all of Windows security, and did so by using not just one or two or three never-before-known flaws, but a bunch of them.


The code base for the International Space Station is probably also VERY complex.


Probably not. Complex things break a lot. You want life-critical code to be as simple as possible, with proven correctness where possible. I bet you won't find a single recursive call in ISS flight code, and you won't find anything without a known upper bound on its running time.
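
That style is codified in, for instance, JPL's "Power of Ten" rules. A minimal sketch of the flavour (an invented example, not actual ISS code): every loop has a hard bound, so worst-case running time is known before launch.

    #include <assert.h>

    #define MAX_SENSORS 32

    /* No recursion, no unbounded loops: the iteration count is capped
       by a compile-time constant, so worst-case time is statically known. */
    int hottest_reading(const int readings[], int n) {
        assert(n > 0 && n <= MAX_SENSORS);   /* bound is enforced, not hoped */
        int max = readings[0];
        for (int i = 1; i < n; i++) {
            if (readings[i] > max)
                max = readings[i];
        }
        return max;
    }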


https://www.nasa.gov/feature/facts-and-figures

It is possible that the code itself is not that complex, but the interaction between all the modules certainly has a high level of complexity.

"In the International Space Station’s U.S. segment alone, more than 1.5 million lines of flight software code run on 44 computers communicating via 100 data networks transferring 400,000 signals (e.g. pressure or temperature measurements, valve positions, etc.)."


Several years later, and even with the code in hand, researchers are not able to put together a complete list of what Stuxnet definitely does.

What I see here is that the word "sophistication" is misunderstood by a lot of people.

Stuxnet took control of multiple layers of complex production environments. There are numerous "0day" kits in the code.

It's not like a search engine or most other organized software projects, because the worm itself has logistical dependencies on those exploits. If it was a US-Israel effort (I think it almost definitely was, but who cares), then consider how much discipline and effort it takes to keep TWO government groups of hackers coordinated enough to keep those exploits fresh, whilst simultaneously building a dependable worm.

Another thing: a lot of the actual machinery and shit isn't very well known, and this is worth mentioning because it's not like you can go spin up an emulator for this shit to test out your massively devastating two-country worm on.

Stuxnet of course made the best of this by using lots of different exploits in different situations, giving it the biggest attack surface it could; that's low-hanging fruit anyway.

I think Stuxnet doesn't impress people because maybe they think it's just a bunch of bugs in old shitty software, but it's so much more than that. It's bugs in software that only a few hundred or maybe a few thousand people have ever seen, much less pentested, on machinery that's rare and sometimes even unique to the location, where the infrastructure of the place is known from rough intel at best, and, oh by the way, your spy hackers need to coordinate with this other group on the other side of the planet.

Start brainstorming how you'd pull it off, and I think it'll become more impressive as you do.

Personally, I think it's the most incredible display of skill and prowess in malware thus far. The years I've spent disassembling, reversing, tracing, filtering, researching... A lifetime of hacking doesn't even knock the dust off of a project like that.


That is an utterly braindead assertion.

I wrote the quoted article about Stuxnet. And I've helped write multiple operating systems.

Your argument is not an argument. It's just a random assertion made with no technical knowledge of either Stuxnet or how to write an operating system.

Stuxnet specifically took advantage of Windows's lack of sophistication in order to replicate.



