shebe asks the simple question: "where does this symbol appear as text?" For C++ codebases that lean heavily on templates and macros, shebe will likely struggle. But I'm curious how it would actually perform, so I'm currently running a search on https://gitlab.com/libeigen/eigen. Will report the results shortly.
I consulted for a large manufacturing firm building an application to track the logical design of a very complex product.
They modeled the parts as objects. No problem.
I was stunned to see the following pattern throughout the code base:
Class of the object
Instance #1 of the class
Instances 2..n of the class
I politely asked why this pattern existed.
The answer was "it's always been that way."
I tracked down the Mechanical Engineer (PhD) who designed the logical parts model. His desk was, in fact, 100 feet away from mine.
I asked him what he intended, regarding the model. He responded "Blueprint, casting mold, and manufactured parts." - which I understood immediately, having studied engineering myself.
After telling him about the misunderstanding of his model by the software team, I asked him what he was going to do about it. He responded "Nothing."
I went back to the software team to explain the misunderstanding and the solution (i.e. blueprint => metaclass, casting mold => class, and manufactured parts => instances). The uniform response was "It is too late to change it now."
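To make the distinction concrete, here is a minimal sketch of the intended three-level model (all names are hypothetical; Java has no first-class metaclasses, so the blueprint level is modeled as an ordinary descriptor object):

    // Blueprint (the metaclass level): describes a kind of part.
    record PartBlueprint(String partNumber, String drawingRevision) {}

    // Casting mold (the class level): tooling derived from one blueprint.
    final class CastingMold {
        private final PartBlueprint blueprint;
        private int serialCounter = 0;

        CastingMold(PartBlueprint blueprint) {
            this.blueprint = blueprint;
        }

        // Manufactured parts (the instance level): instance #1 is nothing special.
        ManufacturedPart cast() {
            return new ManufacturedPart(blueprint, ++serialCounter);
        }
    }

    record ManufacturedPart(PartBlueprint blueprint, int serialNumber) {}

Under a shape like this there is nothing to delineate: instance #1 and instances 2..n are just ManufacturedPart values.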
The result is a broken model that was wrong for more than a decade and may still be deployed. The cost of the associated technical debt is a function of 50+ team members having to delineate instance #1 from instances 2..n for over a decade.
N.B. Most of the software team has a BS (or higher) in computer science.
P.S. Years later, I won't go anywhere near the manufactured product.
There's a reason Java applets got deprecated in every browser. The runtime was inherently insecure. It just doesn't work for the web.
Also, targeting the JVM forces you to accept garbage collection, class-based OO and lots of pointer chasing. It's not a good target for most languages.
Java's pretty good, but wasm is actually a game changer.
The Java runtime isn't any more inherently insecure than the JavaScript runtime, and JavaScript seems to work just fine for the web.
The key reason why applet security failed was that it gave you the entire JDK by default, and so every method in the JDK needed explicit security-checking code in place to restrict access. The model was backwards -- full control by default with selective disabling meant that every new feature in the JDK was a new potential vulnerability.
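As a rough sketch (illustrative names, not actual JDK source), the pattern every sensitive JDK method had to repeat looked like this:

    public class SensitiveOperation {
        public void readFile(String path) {
            // The deny-list hook: checking is opt-in, per call site.
            SecurityManager sm = System.getSecurityManager();
            if (sm != null) {
                sm.checkRead(path); // throws SecurityException if denied
            }
            // Full privileges from here on. Any new JDK API that forgot
            // this boilerplate became a potential sandbox escape.
        }
    }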
Just look up "Java applet sandbox escape". There were tons of ways to do it. Here are some [0]. Then there's the coarse-grained permissions that were essentially useless to begin with.
Yes, I'm familiar with these. Many of the earliest problems were due to bugs in the verifier, and there were several different vendors with their own set of bugs. The bulk of these problems were identified and resolved over 25 years ago.
Most of the later problems are due to the fact that the API attack surface was too large, because of the backwards SecurityManager design. And because it existed, it seems there was little incentive to do something better.
Once the instrumentation API was introduced (Java 5), it made it easier to write agents which could limit access to APIs using an "allow" approach rather than the awful rules imposed by the SecurityManager. Java 9 introduced modules, further hardening the boundaries between trusted and untrusted code. It was at this point the SecurityManager should have been officially deprecated, instead of waiting four more years.
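To illustrate the flipped default (module and package names hypothetical), a module declaration denies access to everything it doesn't explicitly export:

    // module-info.java
    module com.example.trusted {
        exports com.example.trusted.api; // the only package outside code can see
        // every other package is inaccessible, and deep reflection into
        // any package additionally requires an explicit "opens"
    }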
Going back to the earlier comment, the problem isn't due to the runtime being somehow inherently insecure, but instead due to the defective design of the SecurityManager. It hasn't been necessary for providing security for many years.
I'm not too sure, but the main reason MS developed it was that they just wanted Java without licensing it from Sun, so I imagine they made a lot of similar design decisions.
I am a huge, huge fan of wasm. The first time I was able to compile a Qt app to Linux, Windows, Mac, and wasm targets, I was so tickled pink it was embarrassing. Felt like I was truly standing on the shoulders of giants and really appreciated the whole “stack”, if you will.
Running code in a browser isn’t novel. It’s very circular. I actually met someone the other day who thought JavaScript was a subset of Java. The same person was also fluent in PHP.
Wasm is really neat, I really love it. My cynical take on it is that, at the end of the day, it’ll just somehow help ad revenue to find another margin.
Fair. Running in the browser isn't novel, but JS/TS are some of the most popular languages in history, and that almost certainly would never have happened had they not monopolized the browser.
Expanding margins are fine by me. Anticompetitive markets are not. My hope is that wasm helps to break a couple strangleholds over platforms (cough cough iOS cough Android)
Undefined behaviour is defined with respect to the source language, not the execution engine. It means that the language specification does not assign meaning to certain source programs. Machine code (generally) doesn't have undefined behaviour, while a C program could, regardless of what it runs on.
Native code generally doesn't have undefined behaviour. C has undefined behaviour and that's a problem regardless of whether you're compiling to native or wasm.
"English as a programming language" has neither well-defined syntax nor well-defined semantics.
There should be no expectation of a "correct" translation to any programming language.
N.B. Formal languages for specifying requirements and specifications have been in existence for decades and are rarely used.
From what I've observed, people creating software are either reluctant or unable to produce [natural language] requirements and specifications rigorous & precise enough to be translated into correctly working software.
In the theoretical world where a subset of English could be formalized, proven, and compiled, the complexity of the language would reduce my willingness to use it. I find that the draw of AI comes from its "simplicity," and removing that (in favor of correct programs) would be pointless - such a compiler would surely take forever to compile "English" code, and would not be too different from current high-level languages, imo.
Software Engineering is only about 60 years old - i.e., that's how long the term has existed.
At the same point in the history of civil engineering, they didn't even know what a right angle was.
Civil engineers were able to provide much utility before the underlying theory was available. I do wonder about the safety of structures at the time.
> Software Engineering is only about 60 years old - i.e., that's how long the term has existed.
Perhaps as a documented term, but the practice is closer to roughly 75+ years. Still, IMHO there is a difference between those who are Software Engineers and those who claim to be so.
> At the same point in the history of civil engineering, they didn't even know what a right angle was.
I strongly disagree with this premise, as right angles have been well defined since at least ancient Greece (see the Pythagorean theorem[0]).
> Civil engineers were able to provide much utility before the underlying theory was available.
Eschewing the formal title of Civil Engineer and considering those who performed the role before the title existed, I agree. I do humbly suggest that by the point in history when Civil Engineering was officially recognized, a significant amount of the necessary mathematics and materials science was available.
First day of metalwork I managed to catch my hand on a sharp edge and got a cut that was bleeding juuuust enough to warrant a bandaid. I went to the teacher to ask for one. The look of confusion on his face is emblazoned on my memory. After a few minutes of scratching his head and thinking about the situation, he found some packing tape and covered the cut with that. And that was the last injury in that class!