A lot of people think open source software is a much better methodology than proprietary, highly protected source code. That's fine; there are a lot of good arguments there. However, it doesn't make sense to throw a bunch of other, barely related insults at a company when all you're really upset about is that their code is not open source. Criticize that; that is what you're upset about (at least as far as this specific blog post is concerned).
Microsoft, SAP, and VMware all have closed source software that is very prevalent in the enterprise, often in entrenched positions just like Oracle. Sure, they aren't exactly all peaches either, but at least they have a decent way of responding to security problems or plain bugs.
They also don't wield their license agreement as a weapon against their customers; they only use it to make sure they get paid.
Oracle thinks it is self-evident that protection of their source code is paramount (i.e., as closed source as possible). Other people disagree both with their priorities and with the very idea of absolutely forbidding any deep analysis of any kind outside of Oracle itself. It still seems like a debate about the degree to which the source code is "closed." For Oracle, it is absolutely closed, while many of their competitors are more lenient (i.e., slightly less "closed").
To be clear, I think Oracle is being silly with their over-sanitized and idealistic views regarding their intellectual property. The other companies you mentioned (Microsoft et al) have much more reasonable approaches and agreements.
But that isn't what it says, at all. It even explicitly says that if you do find a security flaw this way they will still fix it.
But the main point is that you can't do anything by reverse engineering their binaries that they aren't already doing, and doing better than you, because they have the actual source code.
And it's not helpful because of the near-100% false positive rate.
It doesn't make sense for it to be illegal to forbid reverse engineering in a license agreement. Where is that the case? If that is the legal environment anywhere, it would make more sense to just forbid closed source software. It would save a ton of time and effort and achieve the same goal.
And by the way, it has everything to do with open source. If the code was open, you wouldn't need to reverse engineer it. Every security analyst could just review the code directly and search for vulnerabilities. The whole disagreement stems from Oracle (and the author) deciding that they want protected, closed source because they view it as intellectual property, while some of their customers feel they can't depend on that software unless they verify it themselves. Well...you can't fully verify closed source software yourself.
It is really a very simple and fundamental disagreement on this one topic that creates the whole issue. It is completely valid to disagree with Oracle on this, and to tell Oracle you disagree with them. However, rather than violating their agreement, it would make more sense to decide to use an open source solution instead.
How is it so? You cannot find funny vulnerabilities without reverse engineering the binaries.
> It doesn't make sense for it to be illegal to forbid reverse engineering in a license agreement, where is that the case?
France, Switzerland, Russia and many more.
> it would make more sense to just forbid closed source software
How did you make this leap from reverse engineering to closed vs. open source?
> If the code was open, you wouldn't need to reverse engineer it.
Even with the full source available you still have to analyse (read: reverse-engineer) the binaries, especially those widely shipped.
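The point about still needing to analyse the shipped binaries can be made concrete: even with full source, you only trust a binary if you can show it matches what that source builds, which is what reproducible-builds verification does. A minimal, hypothetical sketch of that check (file names here are illustrative assumptions):

```python
# Compare a vendor-shipped binary against one rebuilt from the published
# source. This is only meaningful if the build is bit-for-bit reproducible.
import hashlib

def sha256_of(path):
    """Hash a file in chunks so large binaries don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def matches_build(shipped_path, rebuilt_path):
    """True if the shipped binary is byte-identical to the local rebuild."""
    return sha256_of(shipped_path) == sha256_of(rebuilt_path)
```

If the hashes differ, you are back to inspecting the binary itself, which is exactly the "reverse-engineer the binaries" work described above.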
> rather than violating their agreement
Their agreement is void in many countries where reverse engineering is explicitly allowed (when done for the reasons of security and interoperability).
The only definition of "reverse engineering software" that I use is this -- "Using tools and deep binary analysis to take a compiled binary, and convert it back to source code as close to the original as possible".
It is a very specific definition. I do not mean general "analysis" or vulnerability testing or input manipulation, etc... only attempting to discover source code.
Uhm, no, that's far too narrow. Reverse engineering is any kind of introspection into a device in question, designed for obtaining any degree of understanding of its inner functioning.
What you're talking about is called "decompilation", and it's not even among the most useful reverse engineering techniques.
edit: I just wanted to explain to you which definition I was using earlier, so that you could understand what I meant better.
Your definition is not what anyone else uses, from what I've seen. You don't have to use the same definition, of course, but be aware that unless you clarify what you mean, you're going to be creating a ton of confusion.
However, this is one of the reasons I disagree with Oracle on the matter. There are tools which actually can and do find issues at this low level (even if there are false positives), and running those tools can be part of many reasonable verification efforts. I think static analysis at the bytecode or assembly code level still counts as analyzing the source code, but I think it makes sense to do that in many scenarios.
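As a toy illustration of what "static analysis at the bytecode level" can look like, here is a sketch using Python's standard `dis` module to flag functions whose bytecode references `eval`. Real tools are far more sophisticated (and, as noted, prone to false positives); this only shows the level of analysis meant:

```python
# Toy bytecode-level static check: does a function's compiled bytecode
# load the global name "eval"? No source text is inspected at all.
import dis

def uses_eval(func):
    """Return True if the function's bytecode references the name 'eval'."""
    return any(
        ins.opname in ("LOAD_GLOBAL", "LOAD_NAME") and ins.argval == "eval"
        for ins in dis.get_instructions(func)
    )

def risky(s):
    return eval(s)  # would be flagged by the checker above

def safe(s):
    return s.upper()  # would not be flagged
```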
That's the definition of decompiling, not reverse engineering. The original IBM BIOS was reverse engineered by two teams: one which read the disassembled binary and wrote a written specification, and a second team that took that specification and wrote code.
See the previous reply I made for links to discussion of what is usually meant by (software) "reverse engineering." But, like I said before, there probably is no "universally correct" definition, I am only describing it so that my previous comments can be understood fully.
Even further, if I buy a piece of software and it does not run on my system, I can turn it back into source, modify it, recompile it, and use it as much as I want.
If the original company tries to prevent me from doing this, they commit a crime that can be punished with several months of jail for their CEO, or with 10% of their profits for as long as they continue that practice.
So, decompilation in order to check for security vulnerabilities, or to modify the function of the software for non-interoperability reasons, does not appear to be covered.
disclaimer: I don't live in Europe and am not familiar with the most up-to-date software laws there. This is from the following source: https://en.wikipedia.org/wiki/Decompiler#Legality
This would be one example case, as I posted in my comment.
Interoperability can also cover future possible integration – for example, if someone writes a tool that can read a file format, I may decompile it to implement a tool to read the same file format.
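Once the format's layout has been worked out through such decompilation, writing a compatible reader is usually straightforward binary parsing. A minimal sketch, using an invented format purely for illustration (4-byte magic `b"DEMO"` followed by a little-endian uint32 record count; nothing here corresponds to any real format):

```python
# Hypothetical reader for a made-up file format, to illustrate
# interoperability as the end goal of decompilation.
import struct

def read_header(data: bytes) -> int:
    """Parse the invented header: 4-byte magic, then a uint32 record count."""
    magic, count = struct.unpack_from("<4sI", data, 0)
    if magic != b"DEMO":
        raise ValueError("not a DEMO file")
    return count
```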