Opa is neat, but the two license options of AGPL or "talk to us for a price" basically ensure it will never achieve anything close to mainstream usage.
Agreed. This 'we wanna make millions with a dual license, like that 0.000001% of open source products do' attitude is definitely the largest barrier here. And it's incredibly short-sighted: just go (L)GPL and do services + support. Big companies WILL buy support, as that's what they do. AGPL is crap and will stop your growth; you have an uphill battle already, so don't do stupid things like that.
Eh, they don't need to just let everyone use it for free. I would say: put it under an open-source license (LGPL, for example), but charge people for downloads, including updates. Nobody likes using software that never updates nowadays, so with low update prices you could get a reasonable amount of revenue per user over time.
If I came across software that was LGPL but charged for downloads, I would very quickly google to find someone who provided updates for free. And if that free distribution channel didn't exist, I would create it.
Edit: the below isn't a very good answer to the issue, but I'll leave it anyway. It boils down to this: to divert responsibility in a big company, you have to show you did all you could, and to show that, you need to show you paid money to 'the head ninja'. That could be MS/Oracle/IBM: very safe. When using open source and it fails, you likewise need to show you did all you could (if it succeeds, the company wants to be in the news using open source to 'cut costs'), so you pay the people who can actually support it (the original devs) large amounts of money. TL;DR: it's covering your ass.
Previous reply:
You haven't worked with big companies before then; they pay for support (and licenses if needed). That's how it works: it's a liability thing. If you pay for something, it's seen as better than free, because free doesn't exist. And that's kind of true: with free open source you need highly skilled employees to maintain it, whereas you could instead buy support from the company that actually built it. TL;DR: big companies will buy your support if they use your open source.
Because big companies are marred by red tape and bureaucratic policies which require them to spend money on support rather than figuring it out themselves.
Most projects eventually encounter thorny issues. If your only recourse is to let someone smart go into a deep dive while everyone waits, a lot of bad things can happen. You might not even have anyone with the wits and drive to fix it, or you might get a terrible hack that becomes a maintenance black hole. Meanwhile, your schedule becomes a question mark, naysayers against the technology form cliques, etc.
Small teams of motivated people can make just about anything work. For everyone else, support makes project schedules more predictable and can avert certain painful scenarios.
I don't work on or speak for the Opa team, but I assume they adopted JS-like syntax for the same reason Dart did -- familiarity is important when it comes to language adoption.
Regardless of whether or not you think this is a good thing or whether programmers worry too much about superficial syntax issues, this is a practical issue you have to worry about if doing PL design for the real world.
Changing syntax for marketing reasons seems bad from my point of view. I like Python-ish syntaxes, as in Haml, CoffeeScript etc., so changing to JS-like syntax was a step back.
Opa reminds me of Curl, MIT's "Lisp with curly braces" programming language for web development. MIT spun out Curl as a company in 1998. Their business model was to charge customers per-byte served!
thanks for pointing out it's AGPL. i was going to mention it as an additional alternative in the switch from PHP to one of Java/Node.js/Python/Ruby.
but AGPL for a language is a deal breaker. i accept AGPL for an application, but not for our language (we make sites for clients that don't want their stuff open source, and we don't want to be back in license-hell).
one tip for opa: pick a license that is common for languages.
besides the license i think the language looks _very_ useful (more compile time checks, pretty syntax, same user created data types on both client and server).
I actually tried to run one of the featured projects on github: OPAcman.
This was my experience:
1. install OPA. Get surprised by the AGPL license because the main page mentions that it can be used to develop closed source apps.
2. git clone OPAcman.
3. run make, fail. Read error message (in color, with a little unicode "flag" symbol right where the error is!)
4. change makefile to use --parser classic, also fails
5. hack off score.opa because --parser classic database directives are irrevocably broken and I can't be arsed to figure out the new syntax, also patch opacman.opa with a stubbed out Score object.
6. Run make, get ocaml errors:
Error: This expression has type Bsl_init_.CR.t = QmlClosureRuntime.t
but an expression was expected of type
QmlFlatServerLib.record = ServerLib.ty_record
If you look at the docs you'll see it's not trying to replace CSS and HTML (thankfully). CSS and HTML are still used. There just appears to be some extra sugar around HTML templating (and it checks your HTML at compile-time).
The 'one unified language' therefore just refers to using a single language on client and server, and as it happens it's a JavaScript like language, so simply in terms of languages used it's not all that different to using JS on the client and server.
The main selling point of this appears to be strong static typing.
Edit: Also yes, glad they are rethinking the licensing.
Opa, however, extends the classical syntax with advanced features specific to the web.
HTML fragments can be inserted directly without quotes:
line = <div id="foo">bar</div>;
I'm not sure this helps readability; it feels like reverse PHP (HTML inside code vs. code inside HTML).
It's much more readable than complex combinations of " and '.
Furthermore, with the static typing, all your structures are checked at compile time: Try it, you'll see how helpful it is.
I think comparing to complex combinations of ' and " is a bit of a straw man, as JavaScript developers use templating languages like Handlebars/Mustache, so that really isn't an issue!
I see why you're doing it the way you are for the static HTML checking though!
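The templating approach mentioned above is easy to illustrate. This is a minimal, hypothetical mustache-style substitution in Python, not real Handlebars; just a sketch of why templating avoids the quote-escaping soup:

```python
import re

def render(template, context):
    # Replace each {{name}} placeholder with the matching context value.
    # Hypothetical helper, far simpler than real Mustache (no sections,
    # no escaping rules), but it shows the idea.
    return re.sub(r"\{\{(\w+)\}\}", lambda m: str(context[m.group(1)]), template)

html = render('<div id="foo">{{label}}</div>', {"label": "bar"})
print(html)  # <div id="foo">bar</div>
```

The markup lives in one plain template string and the data lives in a dict, so neither side needs nested quoting.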
When I looked for sample code, it automatically redirected me to the tour, which then failed to load.
Anyway, the language seems to support all the constructs of the languages it claims to replace - which could easily mean it would be even more confusing than using those languages.
Also, I think ColdFusion is an effort to do a similar thing, and it has about fifteen years of history. Railo seems to be an open source implementation of CFML: http://www.getrailo.org/
Christ on a cracker. Web browsers don't understand Opa. They understand HTML. They understand CSS. They understand JavaScript.
Who do you want on your team? Someone who understands Opa? Or someone who knows what browsers know, in detail?
If I know only Opa, I'm limited to what the Opa developers know about these technologies. By mastering the technologies themselves, I'm limited only by what I can imagine.
Opcodes!? Real programmers carefully etch silicon into a shape that exploits the various physical properties of flowing electrons to perform computations. Logic gates and opcodes are just an ultra-high-level lossy abstraction above this. Kids today...
Sooo... Just as people have finally mostly learned that mixing presentation, business and data logic is mostly a Bad Idea, we get a new language that encourages just that, worse than PHP ever did?
You can separate in this language as you can in PHP and this language is a lot better than PHP ever will be. Not saying it's perfect, but these people didn't just throw a bunch of perl regexes in a basket and call it a programming language.
With Opa you use standard programming language features to abstract presentation, business and data logic. For example you can define one function that gets the data from the database, and passes it to another function that builds up the HTML.
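That layering — one function for data access, another for presentation — can be sketched in a few lines. This is plain Python with made-up names, not Opa code, just to show the separation the comment describes:

```python
def get_scores(db):
    # "data" layer: return plain records, no markup anywhere
    return sorted(db, key=lambda r: r["score"], reverse=True)

def render_scores(rows):
    # "presentation" layer: turn records into HTML
    items = "".join(f"<li>{r['name']}: {r['score']}</li>" for r in rows)
    return f"<ol>{items}</ol>"

db = [{"name": "pacman", "score": 120}, {"name": "ghost", "score": 300}]
page = render_scores(get_scores(db))
print(page)  # <ol><li>ghost: 300</li><li>pacman: 120</li></ol>
```

Because the layers only meet through function arguments, you can swap the renderer (JSON, plain text) without touching the data code — the point being that ordinary functions are enough to keep the concerns apart.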
For a long time I was planning to develop a similar language, so I'm happy they did it. They probably did some things differently than I imagined, but abstracting away the client-server communication is a good thing.
For those complaining that it's not MVC: you can build an MVC framework on top of it.
Opa runs on the client and server and compiles to JS, so it has that advantage over Curl. But it's still a kitchen-sink wrapper around existing technology, from a single vendor. Those seem to never work out that well.
{paragraph
paragraph-left-indent=0.5in,
{text color = "red", font-size = 12pt,
I actually learned and coded some CURL, 10 years ago}
{text color = "green", font-size = 12pt,
That said, I'm happy it is gone; it made my dev life much more complicated instead of easier}}
I don't see how it replaces HTML. At most it manages HTML.
Also a question: can anybody knowledgeable tell us how Opa compares with Meteor? Both of them seem to follow the approach that any piece of code can run in either the server or the client, either with the system choosing automagically or with the code specifying a 'server' or 'client' directive.
There's also a number of problems in it that look like escaping issues. I'm not sure whether it's worth noting them down individually or whether it's a higher-level problem with what they use to generate it (both html and pdf version have problems).
Why only HTML, CSS, JS, PHP? It should be replacing also the recent fads Ruby, Python, Haskell, Clojure and god knows what else. It is high time we have single, elegant solutions. Stacks are so yesterday!
It's a herculean effort. I heard about Opa a year or two ago; I'm amazed they're still at it. Kind of like Haxe. I doubt they will go anywhere, like CoffeeScript, because any language that simply abstracts another is just cruft and doomed to be obsoleted.
By that logic, Ruby and Python are just as doomed because they're ultimately just elaborate abstractions on top of C (or at least their reference implementations are). The implementation language and output target of a programming language are not material to the language itself. C targets a wide array of machine codes, but you'd be remiss in saying it's "just an abstraction on top of x86," and thus doomed to fail. Likewise, Java can target JVM bytecode or Dalvik, but it is not "just an abstraction" on top of either of those. Clojure can target a dizzying array of output languages — JVM, CLR, JavaScript, Scheme, Lua, Python…
It just boggles my mind that because a language compiles to JavaScript, it's "just an abstraction on top of JavaScript" or "just sugar" or whatever. Like if you took the same language with the same semantics but didn't target JavaScript as your object code, the language would become more legitimate. How is that in any way a useful view of the world? Is Clojure also just an abstraction on top of JavaScript because it can target JavaScript?
I would go so far to say that any programming language is an abstraction over another language and/or hardware instructions. Also any API is an abstraction over another API and/or hardware instructions.
EDIT: I realized that technically my first statement is not correct: any Turing-complete language can be compiled to any other Turing-complete language. So it's not an abstraction, but an equivalence.
Agreed, the law of leaky abstractions is pretty brutal. When you're writing serious systems you always have to peek under the hood -- e.g. you have to reason about data structure sizes in the JVM, or check out the code that GCC outputs, or look at the Python interpreter source code. Having a bunch of monolithic magic with its own world view sitting between you and the machine is a big barrier.
I'm skeptical of anything that tries to present a grand unified model. GWT is another example of where this strategy fails IMO.
As noted below, CoffeeScript is a different case that illustrates the point. It seems moderately successful (I haven't used it) precisely because it's a such a modest abstraction. It aims to be a 1-to-1 with JavaScript, and keeps the semantics of JavaScript, rather than imposing its own semantics which are inevitably at odds with what browsers/servers are really doing.
We've heard this all before, though. 40 years ago, machine code was where it was at, and people scoffed at those lazy enough to use an assembler. Then assembly was where it was at, and the abstraction that C provided was leaky and poor, and compiled code could never be as good as hand-written assembly coded directly against the individual processor's instruction set.
The problem with this argument is that it applies just as much in the two above examples as it does in this one, and it's clear that C has really taken off. Making good abstractions is damn hard, and the law of leaky abstractions is brutal, but the success of C, and even Java (a virtual machine on top of a compiler on top of a assembler on top of the machine!) shows that it can be done.
I'm pretty happy I can write code in C and can't imagine worrying about things like different instruction sets, registers, L1/L2 caches, etc. just for a basic "hello world". Perhaps in the future, people will think it equally crazy that we had to worry about details like what database we use, how we serialize data over the wire, manually creating ajax endpoints, etc. Perhaps they'll let the compiler do all this work and only dive in to fiddle with this stuff manually when they need to squeeze the last 10% of performance out of the system, just as C programmers sometimes drop into ASM today.
It doesn't -- C is analogous to the coffeescript case; it's not at all like Opa. It's the minimal portable abstraction over digital computers. Given some C code, it's not hard to predict roughly what instructions it compiles into. It precisely illustrates the point, because there were dozens of more complicated abstractions (languages) that failed. They claimed to do more for you, but what they did conflicted with reality and failed (in that era, usually because it was hard to reason about performance). C is humble and makes few assumptions.
"Abstraction is layering ignorance on top of reality" -- Richard Gabriel
My opinion is that anything which tries to abstract network communication and make it "transparent" is going to run into problems at scale. For some reason there is this endless desire of programmers to extend their type system across the wire. It's been tried so many times.
The goal of Opa appears to be "transparency" as discussed here -- making distributed computing look like single-machine computing. This was also the goal of GWT.
Doesn't the C compiler do all sorts of optimization magic that makes its output sometimes hard to predict? C is also not the only language that succeeded; C++ and Java are doing pretty well, and I've used them quite a lot without delving into the sausage factory of GCC/javac. Scala is doing pretty well too, with its 20-stage compilation process!
I disagree that anything trying to abstract away network communications will fail; I doubt that every engineer at google is a full-stack expert who single-handedly sets up a massive-scale service every day by hand. More likely, they have developed a set of somewhat-reusable abstractions that they can build upon for a variety of services. Hadoop is another example where the whole network thing is abstracted away. Amazon S3/Cloudfront is yet another. Difficult, but not impossible.
Perhaps Opa is trying to be the minimal portable abstraction over a full-stack service. I know GWT tried to be the same thing and failed, but one failure doesn't mean you give up, and I'm glad they're still trying.
It's true that a C compiler does that now, but it didn't 40 years ago. Platforms that start really complex don't have a hope of lasting -- they collapse before they get to the stage of old and hairy.
I think this is an important point -- systems that last have to start really simple.
The Web was laughably simple at first. HTTP 1.0 was roughly: open a TCP connection, send a URL, get the document back, close the connection. It was WAY simpler than many contemporary hypertext solutions.
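That original exchange is simple enough to sketch in a few lines. This Python snippet just builds the raw request bytes and parses a canned response — no real socket, and the Host header shown here was merely a common courtesy in 1.0, not required until 1.1:

```python
def build_request(path, host):
    # An HTTP/1.0 GET is one request line, optional headers, blank line.
    return f"GET {path} HTTP/1.0\r\nHost: {host}\r\n\r\n"

def parse_status(response):
    # First line of the response looks like "HTTP/1.0 200 OK".
    status_line = response.split("\r\n", 1)[0]
    return int(status_line.split()[1])

req = build_request("/index.html", "example.com")
print(req.splitlines()[0])  # GET /index.html HTTP/1.0
print(parse_status("HTTP/1.0 200 OK\r\n\r\n<html></html>"))  # 200
```

Open a TCP connection, send those bytes, read until the server closes the socket — that really was the whole protocol.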
Unix was also WAY simpler than its contemporaries. Linux is old and hairy now, but that's just how things age without falling apart. They have to accommodate many different people's (sometimes broken) mental models under one roof. A system like Opa seems like it can only accommodate the mental model of its creators, and thus won't age gracefully.
C++ despite being huge has some modesty: it respected people's existing C code and didn't "cover it up". Plenty of people write C with classes still and that's actually a feature. Opa seems like it "covers up" what's underneath.
Java is kind of an exception to the adoption curve because it had huge marketing behind it like no other language did. But I think Java 1.0 was still pretty darn simple; it was a very small language. I'll grant that Java did try to cover up the OS. I think this limited its widespread application, but it's admittedly still massively popular for certain things. You couldn't do async I/O in Java for a while, nor could you do things like make a Windows shortcut on the desktop.
Hadoop, being based on the MapReduce abstraction, does indeed pretty successfully allow the programmer to ignore the network. The fairly large restriction of being able to write 2 pure functions -- map and reduce -- is what allows this (it allows retries without affecting correctness, etc.). In a way this proves the point. You can't write arbitrary procedural/stateful code (in Opa or any other language) and distribute it over the network at scale. I'll go as far as to claim that this problem is unsolvable in a fundamental sense, like the halting problem is unsolvable.
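The restriction described above fits in a few lines. This is a toy word count in plain Python (not Hadoop APIs): a pure map function, a shuffle that groups by key, and a pure reduce — purity is what lets a framework rerun either function on any node without affecting correctness:

```python
from itertools import groupby

def map_fn(line):
    # pure: emit (word, 1) pairs for one input line
    return [(w, 1) for w in line.split()]

def reduce_fn(word, values):
    # pure: combine all values for one key
    return (word, sum(values))

lines = ["to be or not to be"]
# shuffle phase: sort pairs so equal keys are adjacent, then group
pairs = sorted(p for line in lines for p in map_fn(line))
result = dict(
    reduce_fn(k, [v for _, v in g])
    for k, g in groupby(pairs, key=lambda p: p[0])
)
print(result)  # {'be': 2, 'not': 1, 'or': 1, 'to': 2}
```

Neither function touches shared state or does I/O, so the framework is free to partition, retry, and reorder — exactly the trade the comment describes: you give up arbitrary stateful code and get distribution in return.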
Now maybe Opa introduces some restrictions in their model that help with distribution that I don't know about, but in general I am skeptical of the "write all your code in this one clean language with our nice model and we'll figure out the rest automatically with our hyper-optimized advanced technology". We've heard it before.
I honestly don't know why as a programmer you would want to ignore the distinction of code running on the client or the server. I'm all for sharing (some) code, as Node.js allows, but it should be obvious in any application whether a code path is running on the client or the server, and you shouldn't need a compiler to figure it out. That kind of coupling is crazy.
IMHO the CoffeeScript analogy is a poor one. CoffeeScript actually abstracts very little from the JS base, being mainly about a lighter-weight, more productive, more enjoyable syntax. Opa is far more ambitious in its abstraction.
Haxe is a complete multi-target compile system that recently put out a rolling compilation server. At present I build for Android and C++ (both Windows and Linux) from the same code base.
I really don't see how this has anything to do with Haxe, or why the two would be joined at the hip for longevity.