Having now read it, I've come to the conclusion that the blog gives the wrong impression about the implications of having a custom language. Readers of the blog post come away with the idea that Wasabi was so full of "o.O" that someone was moved to write a book about that. In reality, the book is simply documentation of the language features, with callouts for weird interactions between VB-isms, ASP.NET-isms, and their language.
You should definitely read the "Brief And Highly Inaccurate History Of Wasabi" that leads the document off. It's actually very easy now to see how they ended up with Wasabi:
1. The ASP->PHP conversion was extremely low-hanging fruit (the conversion involved almost no logic; see the toy sketch after this list).
2. Postprocessing ASP meant PHP always lagged, so they started generating ASP from the same processor.
3. Now that all their FogBugz code hits the preprocessor, it makes sense to add convenience functions to it.
4. Microsoft deprecates ASP. FogBugz needs to target ASP.NET. They can manually port, or upgrade the preprocessor to do that for them. They choose the latter option: now they have their own language.
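To make "almost no logic" concrete, here's a toy sketch of that kind of pass (in C#; purely illustrative and nothing to do with Fog Creek's actual tool): a textual substitution, not a real compiler.

using System;
using System.Text.RegularExpressions;

static class AspToPhpToy
{
    // Rewrite a classic-ASP-style output expression into PHP-style markup with a plain
    // regex substitution: string rewriting, not real compilation.
    static string TranslateLine(string asp) =>
        Regex.Replace(asp, @"<%\s*=\s*(.+?)\s*%>", "<?php echo $1; ?>");

    static void Main()
    {
        // Prints: <p>Total: <?php echo x + y; ?></p>
        Console.WriteLine(TranslateLine("<p>Total: <%= x + y %></p>"));
    }
}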
It's step (3) where they irrevocably commit themselves to a new language. They want things like type inference and nicer loops and some of the kinds of things every Lisp programmer automatically reaches for macros to get. They have this preprocessor. So it's easy to add those things. Now they're not an ASP application anymore.
Quick rant: if this had been a Lisp project, and they'd accomplished this stuff by writing macros, we'd be talking this up as a case study for why Lisp is awesome. But instead because they started from unequivocally terrible languages and added the features with parsers and syntax trees and codegen, the whole project is heresy. Respectfully, I call bullshit.
It's a good case study in market segmentation, though. Lisp works because all of the programmers who would like to do language design as part of their day job gravitate to it. As a result, it has the most advanced language features of any language on the planet. All of the programmers who just want their programming language to be a stable, dependable tool they can use gravitate to other languages (tops among them: Java and Go), and they build some pretty cool products with them because they aren't distracted by improving the language. Wasabi's big failing is that it tried to introduce Lisp-like concepts to programmers who have to work in ASP and PHP in their daily jobs. It's kind of a no-man's land there.
There is inevitably a Law of Conservation of Coding Cost that will get you one way or another. If you won't rewrite your code from scratch (because of all the unrecognized wisdom in the old code), you'll build a compiler from scratch (and discover all the unrecognized wisdom in proven compilers). Or you'll switch to some proven cross-platform solution that covers most of your needs, and the remaining pieces needed to complete the job, when added together, will introduce you to all the unrecognized wisdom embodied in the standard, platform-specific toolchains. Or some other approach will preserve the cost in some other way.
This "never rewrite from scratch" dogma is an example of incomplete wisdom.
The cost to build and maintain Wasabi was surely higher than rewriting as X, but the risk sounds substantially less, and effectively spreads the risk part of technical debt over future years.
A bad rewrite could have killed the company. Historically, rewrites do not have a good delivery record.
A sideways sort-of-DIY ever-expanding transpiler-or-whatever kept the project alive and profitable.
Sometimes an ugly hack is the right answer if it keeps the money coming in.
Also, valuable PR.
I think the discussion could have been nipped in the bud if it was described similarly to your summation - a necessary business decision rather than good software practice.
Here they did not rewrite from scratch; they developed a bridge that allowed them to keep moving forward as the world the old software was based on disappeared. And now they are on the other side with a code-base that works and has been providing cash-flow the whole time.
But did you keep your tests as well (and maybe they were also recompiled)?
A few of the oldest FogBugz tests are in `unitTest.asp` and were written in Wasabi. They got transpiled over to C# like everything else. Some are confirming Wasabi language features (e.g. `testLambda`, `testDictionary`, `testErrorHandling`), so I could remove them.
All of Wasabi's unit tests were deleted with Wasabi.
I'm very curious, would be grateful for an answer! tx
Roslyn is an open-source implementation of the entire C# compiler, with some fantastic design decisions that allow you to use it in pieces or all together. https://github.com/dotnet/roslyn I used the C# generator portion of the platform.
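For anyone wondering what "the C# generator portion" looks like in practice, here's a minimal sketch using Roslyn's SyntaxFactory (my own toy, requiring the Microsoft.CodeAnalysis.CSharp package; it only shows the flavor of the API, not how the actual Wasabi-to-C# translator was structured):

using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.CSharp;
using static Microsoft.CodeAnalysis.CSharp.SyntaxFactory;

class RoslynDemo
{
    static void Main()
    {
        // Build "public static int Add(int x, int y) { return x + y; }" as a syntax tree.
        var addMethod = MethodDeclaration(ParseTypeName("int"), "Add")
            .AddModifiers(Token(SyntaxKind.PublicKeyword), Token(SyntaxKind.StaticKeyword))
            .AddParameterListParameters(
                Parameter(Identifier("x")).WithType(ParseTypeName("int")),
                Parameter(Identifier("y")).WithType(ParseTypeName("int")))
            .WithBody(Block(ReturnStatement(ParseExpression("x + y"))));

        var calculator = ClassDeclaration("Calculator")
            .AddModifiers(Token(SyntaxKind.PublicKeyword))
            .AddMembers(addMethod);

        // Emits nicely indented, human-readable C# source.
        System.Console.WriteLine(calculator.NormalizeWhitespace().ToFullString());
    }
}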
Thanks for posting.
<title>An Example Wasabi Program</title>
<h1>An Example Wasabi Program</h1>
<p>Arithmetic: 1 + 1 = <%= 1 + 1 %></p>
<p>Dynamic Content: x + y = <%= IntRequest("x") + IntRequest("y") %></p>
Before rich client/ajax took off, it was more than adequate for most e-commerce, web publishing, and internal administrative applications. Truth be told, it still is, but it's just soooo 2001.
Public Abstract Class Animal
    Abstract Function Speak() As String
    Overridable Function Eat() As String
    End Function
End Class

Public Class Dog Inherits Animal
    Override Function Speak()
    End Function
    Override Function Eat()
        Return Base.Eat() & " Woof!"
    End Function
End Class
Without that, you'd have to rewrite, in which case, why not rewrite in an existing language?
I suspect that many of the armchair critics can't relate to being constrained by decisions made years ago at the start of a project -- decisions that one might now regret but nonetheless constrain the way the codebase can evolve.
At the point you've decided to build a terrible new internal-only language just to maintain a single bug-tracking app codebase ... perhaps you haven't been doing an accurate cost/benefit analysis?
You've also got several current and former FC'ers on this thread, all? of whom don't like Wasabi as a language, all saying that from a pure cost/benefit perspective, Wasabi paid off.
Agreed! But I would consider that an argument against using macros, not an argument for Wasabi.
Interestingly, since new hires were supposed to be pretty smart and had to pay the cost of learning Wasabi anyway, they could have just taken a page from Paul Graham and gone with Lisp. The learning curve should be the same (or easier, given the amount of material available) and they would have crazy good compilers and libraries from day one.
I think we are missing a piece of the puzzle somewhere.
"Designing Active Server Pages -
Scott Mitchell's Guide to Writing Reusable Code"
I'm not sure when I bought it, but it came out in 2000. Around that time, I'd seen some pretty nasty php, not much asp (but enough VisualBasic to start looking for a pitchfork whenever I saw it mentioned) -- and shortly after I worked as a sys.admin. at a place with a significant investment in Coldfusion (arguably the first php/asp-style language).
I've yet to write any classical ASP, or to do anything more than glance at newer .net and what-not -- but I still got a lot from that book.
It, along with some of the posts I found on the Coldfusion fusebox framework/pattern did a lot for helping me keep the difference between "bad language" and "bad programmer" straight.
Just because almost all php code I've seen is crap doesn't mean all php code is crap. And it didn't have to be in the early days either.
Another fun book that's somewhat related (in my mind anyway) is:
"Program Generators with XML and Java"
by J. Craig Cleaveland (Prentice-Hall, 2001)
(Sometimes one finds great technical books on sale, too!)
I was so sad, thinking about how using a cross-platform Lisp (i.e., LispWorks) would have solved this problem for them by design. FogCreek has some pretty smart folks; it should have been within their reach to use LW and go...
The code they start with (a) works and (b) is fiddly.
To rewrite it in a better language, (b) means they will very likely lose many months to problems with (a).
So, it makes more sense for them to transpile and preserve the fiddly bits than it does for them to rewrite.
That is to say, it seems like they did choose the better technology stack for the job at hand. If you're going that direction, the real problems started when Joel got a job at Microsoft.
This is where I'm curious that they didn't just start writing PHP from that point forward. It was already a cross-platform language. And, if I recall, it had a lot of hype and was gaining traction fast in the late '90s and early 2000s.
Certainly hindsight is 20-20, but I remember a lot of folks betting big on PHP at the time.
Sounds like good decisions were made at the right times.
I disagree. First of all, Wasabi solved a real problem which doesn't exist anymore: customers had limited platform support available and Fog Creek needed to increase the surface area of their product to cover as many of the disparate platforms as possible. Today, if all else fails, people can just fire up an arbitrarily configured VM or container. There is much less pressure to, for example, make something that runs on both ASP.NET and PHP. We are now in the fortunate position to pick just one and go for it.
Second, experimenting with language design should not be reserved for theoreticians and gurus. It should be a viable option for normal CS people in normal companies. And for what it's worth, Wasabi might have become a noteworthy language outside Fog Creek. There was no way to know at the time. In hindsight, it didn't, but very few people have the luxury of designing a language which they know upfront will be huge. For example, Erlang started out being an internal tool at just one company, designed to solve a specific set of problems. Had they decided that doing their own platform was doomed to fail, the world would be poorer for it today.
One might think so, if one is charitable toward the typical developer and their abilities. I worked at a place that had implemented their own query language and integrated it into their core product.
The query language was interpreted, and walked structured text files stored on disk. It worked great for small installations, but as the company grew, acquiring larger clients, clients' data corpora grew and performance fell off a cliff.
The syntax of the language was all over the board. Users would cut and paste little recipes together to get things done, and there was a never-ending stream of support calls asking why this didn't work that way, or why this part couldn't do what that other part did.
Edit: wanted to say that consistent, logical syntax is where someone with experience in language theory and implementation would have made an impact. The language, as it was, was the result of feature accretion. I think someone with that knowledge would have attempted to craft a kernel of functionality and back it with a more robust datastore (SQL).
Edit2: also want to clarify that the developers who did the implementation were actually quite talented; the project was simply not something they were equipped to deal with, and they really weren't given the opportunity to refine or iterate on it.
Edit3: I offer this as an example of one company that implemented their own language. Of course the purpose of this example differs from Wasabi's, and it is not intended to condemn or denigrate Wasabi or Fog Creek.
I guess it was a net win, but it was painful.
When a rider falls off a horse, they have to make a decision: abandon the idea of horse riding ("riding your own horse rarely makes sense"), or apply that as a lesson to your skill set and re-mount. While there is nothing wrong in deciding that, upon reflection, horse riding was not for you, I think it's harmful to go out and announce to the community that riding your own horse rarely makes sense and should be left to the pros. Because what you're left with then is a world where the only riding is done by dressage performers.
(Sorry, that analogy got a bit out of hand, and admittedly I know nothing about horses, but I hope the point is still discernible.)
I'd also say that languages are hard, runtimes are hard, languages and runtimes together are really hard. A decision to go down this path should be carefully considered, and not because a developer on staff read Parr's Antlr book and wants to try it out.
On the other hand, there really wasn't anything approaching an off the shelf solution to FogCreek's business problem, and even with massively popular legacy languages there rarely is (it takes an IBM to first productize a COBOL to Java compiler). FogCreek's strategy worked well enough that customer's were writing checks that didn't bounce, and that's pretty good for a software product.
"The best text editors in the world all have embedded languages. This can be used to the extent that the intended audience can master the language. Of course, use of the language can be made optional, as it is in text editors, so that initiates can use it and no one else has to.
I and many other programmers have fallen into the trap of creating special purpose embedded languages. I fell into it twice. There already exist many languages designed specifically to be embedded languages. You should think twice before creating a new one."
from How To Be A Programmer ( http://samizdat.mines.edu/howto/HowToBeAProgrammer.html#id28... ).
Nobody could know most things at the time. But we can estimate. What are the odds that small, random companies creating languages as side projects will have success? Well, looking at the languages everybody uses, I'd say: not so good.
> For example, Erlang started out being an internal tool at just one company, designed to solve a specific set of problems.
It was just one company, but it was one very large company working on something that they expected to be in operation for decades. So I think that's a bit different than a small company working on a product.
And naming the successes doesn't tell you much about the odds; you also have to count the failures. One can't justify the purchase of a lottery ticket by looking only at the winners.
We had an interesting question: "why are we using two different languages to develop client and server side functionality?"
NodeJS obviously has the better answer, but much later than Wasabi.
I also find this kind of phrasing weird:
> The people who wrote the original Wasabi compiler moved on for one reason or another. Some married partners who lived elsewhere; others went over to work on other products from Fog Creek.
It's like the author of this article goes out of their way to avoid saying that some people left the company, period. It also wouldn't surprise me if some of these defections were caused by Wasabi itself. As a software engineer, you quickly start wondering how wise it is to spend years learning a language that will be of no use once you leave your current company (yet another reason why rolling your own language as a critical part of your product is a terrible idea).
The second half of your comment transitions from weird to mean-spirited, as you begin speculating about people you don't know and their reasons for changing jobs. I'm a little confused as to why you've been voted up so high on the page.
Compilers just aren't that magically hard and difficult. I'll cop to not having written a true compiler yet but I've written a number of interpreters, and I've written all the pieces several times (compile to AST, interpret, serialize back out, just never had the whole shebang needed at once).
If you're reading this, and you're still in a position where you can take a compilers course, take it! It's one of the most brutally pragmatic courses in the whole of computer science and it's a shame how it's withered. (Even if, like me, you'll probably write more interpreters than compilers. And nowadays you really ought to have a good reason not to pick an existing serialization off-the-shelf. But it's still useful stuff.) It's one of those things that is the difference between a wizard and a code monkey.
(If I said that too concisely for your tastes, see: http://steve-yegge.blogspot.com/2007/06/rich-programmer-food... )
The cost of an in-house programming language isn't in writing the compiler. It's training all your new team members in the language. It's documenting the language constructs, including corner cases. It's in not being able to go to Stack Overflow when you have problems. It's in every bug potentially being in either your application code, your compiler, or your runtime libraries, and needing to trace problems across this boundary. It's in integrating with 3rd-party libraries, and in not being able to use tooling developed for an existing mainstream language, and having to add another backend to every other DSL that compiles to a mainstream language.
All that said, I agree that if you're ever in a position to take a compiler course, do it. It's one of the most valuable courses I ever took, and really peels back the mystery on why programming languages are the way they are. It's just that the difference between wisdom and intelligence is in knowing when not to use that brilliant technique you know.
Which is precisely why I've never written a full compiler, even though I've written all the pieces many times.
For instance, instead of writing a parser, could you perhaps get away with just a direct JSON serialization of some AST? Do you really need to emit something, or will an interpreter do? So far I've never been so backed against the wall that I've actually needed a full compiler.
The interesting thing is that the more experience you get, the more alternatives you find to writing your own language. Could you use Ruby or Python as the front-end, much like Rails, Rake, or Bazel? Could you build up a data-structure to express the computation, and then walk that data-structure with the Interpreter pattern? Could you get away with a class library or framework, much like how Sawzall has been replaced by Flume and Go libraries within Google?
In general, you want to use the tool with the least power that actually accomplishes your goals, because every increase in power is usually accompanied by an increase in complexity. There are a bunch of solutions with less power than a full programming language that can still get you most of the way there.
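To make the data-structure alternative concrete, here's a minimal sketch (in C#, with names invented for the example) of the Interpreter pattern over a tiny expression structure: you get "a language" for expressing computation without a parser, a code generator, or any new syntax.

using System;

interface IExpr { int Eval(); }

sealed class Num : IExpr
{
    private readonly int _value;
    public Num(int value) { _value = value; }
    public int Eval() => _value;
}

sealed class Add : IExpr
{
    private readonly IExpr _left, _right;
    public Add(IExpr left, IExpr right) { _left = left; _right = right; }
    public int Eval() => _left.Eval() + _right.Eval();
}

static class InterpreterDemo
{
    static void Main()
    {
        // (1 + 2) + 40, expressed as a data structure rather than parsed from custom syntax.
        IExpr expr = new Add(new Add(new Num(1), new Num(2)), new Num(40));
        Console.WriteLine(expr.Eval()); // 43
    }
}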
(If you're interested in goofing around with Starfighter, you're going to get an opportunity to get handheld through a lot of this stuff.)
I recognize that this would've been a huge leap for anyone in 2005, when 37signals was basically the only company doing small-business SaaS and the vast majority of companies insisted that with any software they buy, they actually buy it and the source code and data sit within the company firewall. Heck, when Heroku came out in 2007 I was like "Who the hell would use this, turning over all of their source code to some unnamed startup?"
But looking at how the industry's evolved, that's pretty much the only way they could've stayed relevant. Many companies don't even have physical servers anymore. That's the way FogBugz did evolve, eventually, but they were late getting there and had to back out all the existing Wasabi code and fixes they made for it to be easily deployable (which was one of their core differentiators, IIRC; they were much easier to set up than Bugzilla or other competitors).
It makes me appreciate how tough the job is for CEOs like Larry Page or Steve Jobs, who have managed to stay at the leading edge of the industry for years. Larry was pretty insane for buying a small mobile phone startup called Android in 2005, but it turned out to be worth billions eventually.
1. The primary product of many companies got too large to deploy on their own server farms, and so they started moving toward AWS etc. for scalable hosting. Once your product is in the cloud, it makes sense to deploy your supporting infrastructure & tooling there as well, because otherwise you're paying the support, hosting, & sysadmin costs for just your non-critical corporate infrastructure.
2. Bandwidth became a non-issue. In the 1990s there was a very measurable difference between 10BaseT internally vs. an ISDN line to your hosting provider. In the 2010s, there's little practical difference between gigabit Ethernet vs. 10M broadband.
3. HTTPS became ubiquitous, taking care of many security risks.
4. AJAX enabled rich UIs, which previously would've required desktop apps.
5. Employees started to blur the line between work and home, leading to demand for work services that could be used, encrypted, from a user's home network. VPNs were a huge PITA to set up. This was a big issue for much of the early 2000s; one of my employers made some clever network software to punch through corporate firewalls with a minimum of configuration.
6. Development speed increased. SaaS companies could push new versions of their product faster, react to customer feedback quicker, and generally deliver better service. Because all customer interactions go through the company's servers (where they can be logged), they have much better information about how people are using their products. Deployed services were left in the dust.
tl;dr: #1-4 made lots of businesses go "Why not?", while #5 and #6 made them go "Yessss."
It's interesting that many of the arguments about why you should not use SaaS businesses now (like privacy and security, and lack of ownership) were relatively minor reasons then. I do kinda wish (in an abstract way) that something like Sandstorm would catch on, but I think they may be early: SaaS just isn't that painful, and until we have a major shake-out where a lot of businesses get taken out because their dependencies go down, it seems unlikely that it will become so. Or the other way this could play out is that a new powerful computing platform comes out that lets you do things that aren't possible with thin clients, and you see a rush back to the client for functionality.
The monthly bills for small purchases of SaaS fits on what could be expensed on a corporate card. By the time IT gets wind, the product has already infiltrated the organization. If there's a very large up front cost, then IT is involved, you need a formal RFP process, lots of people weigh in, those opposed to the purchase can try and block it... As soon as "Put it on the corporate card" became viable, power moved back to the business units.
Granted, it may take a while to convince IT people that this is OK, but fundamentally they have every reason to prefer this over people cheating with SaaS.
I'm not sure I'm interpreting this correctly, but FogBugz On Demand was launched in 2007. Was that really late for SaaS?
They did learn from this, as you can see by the very different paths StackExchange and Trello are on.
The developer tools market changed from one with very few network effects to one with a large network effect around 2010. The drivers for these were GitHub, meetups, forums like Hacker News, and just its general growth - they made coding social. When I started programming professionally in 2000, each company basically decided on a bugtracker and version control system independently, and it didn't matter what every other company did. By 2015, most new companies just use git, they host on GitHub, and if they don't do this, they're at a strong disadvantage when recruiting & training up developers, because that's what much of the workforce uses.
Interestingly, both GitHub and Atlassian resisted taking investment for many years - GitHub was founded in 2007 and took its first investment in 2012, while Atlassian was founded in 2002 and took its first investment in 2010.
The name Wasabi doesn't have an obvious connection to the VBScript that it's based on, which seems to be the cause of people talking about writing a whole new language, etc.
Thistle -> FogBasic -> BC3k -> Wasabi
1. compilers have bugs
2. it really sucks not knowing if a bug is in your code or in your compiler
3. it sucks not having a source-level debugger
I won't categorically reject the idea, for instance I think Facebook writing their HipHop compiler was completely defensible. But you need people with compiler experience, and people who know the pain of working with crappy, undocumented, buggy toolchains to make that decision, not people who once took a compiler course.
> Fun fact: Wasabi has more unit tests than FogBugz.
> It’s fully debuggable in Visual Studio. Breakpoints, variable inspection, single stepping, the whole enchilada.
But the whole post makes it clear that working in Wasabi wasn't nearly as difficult as you think.
* Five, actually, and contributed to a sixth
Online community? None
Transferability of skillset? None, apart from knowing how compilers work. Makes for good nerd conversation, but that's it.
Writing your own toolchain is almost as bad. I've seen multiple talented people leave companies I've worked at when they were forced to build and maintain horrible tools for the in-house ecosystem. Some too-big-for-his-britches second-system-as-a-first-system ass had written them, and everybody else got stuck with it.
As the other commenter noted, this seems like epitome of bad software engineering and I'm surprised employees put up with it if they were any good.
EDIT: I learned to program in assembly, so compilers didn't seem super mysterious to me as they are for someone who learns Java first perhaps.
I think the old business adage about "In-source your core competencies, outsource everything else" applies here. If you derive a big competitive advantage from having a proprietary database or proprietary template, and it generates enough revenue to afford a dedicated team of experts to maintain it, build it. But if you have a bunch of smart & motivated developers who can build a proprietary database, but your product isn't databases or templates and your core differentiator isn't the performance or query patterns you get from building it yourself? Put them to work improving the product, and work with the infrastructure that other firms have built already.
If you have great people building the software, or at least competent ones, and you have competent users, you might succeed, maybe. But that's assuming you have a maintenance plan and a roadmap, which most software companies do not. Maintain software? YOLO! What happens when you have a bunch of morons using and maintaining the software?
In short, computer science in industry is largely practiced as shamanism by people who cannot engineer their way out of a crackerjack box.
According to this link from another response by someone who worked on Wasabi: http://www.tedunangst.com/flak/post/technical-debt-and-tacki...
Some choice quotes:
“Wasabi is 100% backwards-compatible with VBScript.”
“It’s fully debuggable in Visual Studio. Breakpoints, variable inspection, single stepping, the whole enchilada. It’s just .NET under the covers.”
So not really a new language per se, sounds more like they just home-brewed some steroids for VBScript.
Well, I can see how you might struggle there ;-)
Good-natured snark about spelling aside, part of the issue is that writing, documenting and maintaining your own language is only hard if your toolchain sucks.
If you're interested in writing a specialized language to solve a particular problem, take a look at PEG for JS, and either Racket or Common Lisp (the latter if you need native compilation).
I've recently been involved in the design and implementation of an English-like language for the expression of business domain concepts in web apps. It's a great approach if done thoughtfully and professionally.
That's probably the key, actually. The horror stories we hear are of the bad examples. And we all know that shitty tools, weak languages and bad documentation can come out of large software companies as commercial products as well.
Do you think a good compiler course would prepare the student to do a project with the scope and complexity of Wasabi? For one project, I wrote an interpreter for a little domain-specific language, then later reworked that interpreter into an on-the-fly compiler (to Lua, to avoid double interpretation). But that's a long way from writing a compiler for a general-purpose language, that can do global type inference and produce human-readable output in a target language that's fairly different from the original VBScript (if not Wasabi itself).
The trickiest bit of Wasabi is the type inference, which I admit is not "production-ready" (or "good code") because we basically invented it from scratch. If I were to do it now, I would know just enough to realize that I need to read about Hindley-Milner rather than reinvent the wheel.
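For readers who haven't seen the difference: a "from scratch" inferencer tends to be a bag of forward-propagation rules like the toy below (my own illustration, not Wasabi's code), whereas Hindley-Milner gives you a principled unification algorithm that also handles functions, recursion, and polymorphism.

using System;
using System.Collections.Generic;

static class NaiveInference
{
    // Forward-propagate a type for each variable from the literal or variable on the
    // right-hand side of its assignment. No unification, no generics, no functions.
    static Dictionary<string, Type> Infer(IEnumerable<(string Lhs, string Rhs)> assignments)
    {
        var env = new Dictionary<string, Type>();
        foreach (var (lhs, rhs) in assignments)
        {
            if (int.TryParse(rhs, out _)) env[lhs] = typeof(int);
            else if (rhs.StartsWith("\"")) env[lhs] = typeof(string);
            else if (env.TryGetValue(rhs, out var t)) env[lhs] = t; // copy another variable's type
            else env[lhs] = typeof(object); // give up and fall back to late binding
        }
        return env;
    }

    static void Main()
    {
        var env = Infer(new[] { ("i", "1"), ("s", "\"hello\""), ("j", "i") });
        foreach (var kv in env) Console.WriteLine($"{kv.Key} : {kv.Value.Name}"); // e.g. i : Int32, s : String, j : Int32
    }
}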
Producing human-readable output is an exercise in tedium and bookkeeping, not any particular amount of skill or brilliance.
It's fascinating how easily cruelty can be popularized by using the right, nice-sounding words. Coating/mask your bile in a rhetoric popular with a community, indirectly imply some terrible things, perhaps obfuscate anything that could raise any uncomfortable, thoughtful questions, and presto! You'll have the right set-up to manufacture consensus.
The main reason is exactly as the article states, maintainability.
> As time wore on, our technical debt finally began to come due. Compilers like Wasabi and their associated runtime libraries are highly complex pieces of software. We hadn’t open-sourced it, so this meant any investment had to be done by us at the expense of our main revenue-generating products. While we were busy working on exciting new things, Wasabi stagnated. It was a huge dependency that required a full-time developer — not cheap for a company of our size. It occasionally barfed on a piece of code that was completely reasonable to humans. It was slow to compile. Visual Studio wasn’t able to easily edit or attach a debugger to FogBugz. Just documenting it was a chore
It seems like a spectacular success story.
I'd say the actual experience is somewhere in between. Sure it enabled them to support client requests to be cross platform and proved useful for a very long time, but what was the broader opportunity cost? Did supporting this proprietary infrastructure eat up resources and prevent them from exploring other ideas? Probably.
This whole story is a fabulous insight into software development for business over the long-term.
The reality though is that line-for-line, a transpiler is probably not much harder to write than a serious report generation tool. I agree with the commenter upthread, who thinks this is a result of people simply never having tried to write a compiler before.
It might be, but for what many would argue are the wrong reasons, maybe even unnecessary.
> This is an internal language designed as an incremental improvement over VB that gave them cross-platform common codebase.
The problem is that at the time, there were already several cross-platform technologies in existence, many of which were being developed in the open. Utilizing one of these technologies would have allowed FogCreek to focus on what they do best: making software. Instead, they took a proprietary uni-platform language and attempted all on their own to make it cross-platform capable - which led to years of maintainability issues.
> It lasted 10 years
They gained an early advantage of not having to throw out the codebase and start over, yet they bought themselves 10 years of technical debt which continued to pose a burden on the small company. Many would argue that biting the bullet early on and switching to an open, community-driven cross-platform language/environment would have yielded much more return on the initial investment.
> When they transitioned off of it, they did it not with a rewrite, but with mechanical translation
Yes, that is an achievement, but again, for the wrong reasons.
And I feel like when you get to the point where the best arguments you can make against something are isomorphic to the arguments you'd make against mainstream languages in language-war debates, that's a win condition.
I know you will dismiss this as "routine", but it's not...
For a small company, this is an enormous waste of time, money, and energy.
A big company like Google or Microsoft can afford to throw developers by the dozen at internal proprietary languages and not even blink -- but according to the article, FogCreek did blink every time they had to dedicate time to fixing it. It took time, money, and energy away from their core business - making software.
That's a lose condition.
FogCreek should have bitten the bullet and rewritten their application in an open, standardized cross-platform system. They would have been able to spend zero time worrying about the language, and 100% of their time worrying about their application. They could hire engineers off the street and have them produce in days-to-weeks instead of weeks-to-months. They would have saved themselves an enormous amount of time, money, and energy invested in a language that is now dead anyway.
It may have seemed like a good choice back when the decision was made, but in hindsight it appears to have been a very poor, short-sighted choice.
I think you have this backwards. A small company that writes a compiler and loses a few weeks of dev time per year survives for a decade, while spinning up various new products.
In another world, a small company rewrites its only source of revenue. 18 months later, they release the rewrite with zero new features and a chunk of new bugs and promptly die, because who's going to buy a product that spends a year and a half going backwards?
Ah, so you happen to know better than Joel how much resources they had available at the time, how long the rewrite would have taken, how much it would have affected their ability to ship new features?
Fog Creek was a much smaller company back when they wrote Wasabi. Postponing the rewrite until they had more resources to spare was probably a good decision.
Then again, given that they had the codebase already, writing their own transpiler sounds like it was the best option at the time.
FogBugz wasn't written in any of those languages though. And it isn't like Spolsky's "never rewrite from scratch" views are unknown here.
I'm curious if anybody on this thread who has written more than three or four compilers/parsers would agree with you.
Depending on the task, the only solution to some problems is to write a custom/proprietary language (whether it's closed source, of course, is up to the company).
Is a bug tracker one of those problems?
But "bug tracker" is not the problem that was being solved.
The problem was taking a big pile of legacy code and translating it to more than one platform vs rewriting the entire app from scratch in a new language that was cross platform.
It just happened that the legacy code was for a bug tracker, but it could have been for anything.
I don't think so, having done this once, to the great success of the company. They also wrote their own database. The compiler was maintained by a team long after I left the company.
Software is hard. There are more interesting, and hard, problems than pushing the value of a field from one subsystem to the other.
/me deactivates snark mode
I see the issue exactly the other way around.
If you can build a domain specific language that lets you express concepts in a clear way free of boilerplate or conceptual hackery (ORMs, for example) you will wind up with a much lighter maintenance load than the equivalent functionality built on an off the shelf framework.
Of course, there's nothing stopping you from using the two as appropriate. Simple CRUD app? Rails. Need to express a very complex domain in a readable, easily maintained form? Custom language time.
It would work, if management didn't treat programmers as replaceable cogs in a wheel. But the day you seek to make the craft of programming a commodity that could be practiced by anyone, you need an ecosystem whose knowledge is available to everyone. Only then would you get reasonable expertise at affordable prices to finish your projects.
The opposite is to make programmers so special that the knowledge of specific tools is available only to them. This definitely puts programmers in a much better position to negotiate pay and other things at will, because the very existence of the business depends on them.
The fact that a team building the canonical wire- form- fields- to- database- columns application (if there weren't such a thing as "blogs", bug trackers would be the "hello world" of database-backed web apps) found a reason to deploy computer science is, to me, a beacon of hope that we're not all doing something that's going to be automated away in 10 years.
Isn't that the biography of 99% of the open source projects in the big data and distributed processing world? I understand they are open now, but didn't they start as custom tools just for a single company?
It seems like the "error" that Fog Creek made was to not open source Wasabi, though even that seems more like a hindsight has 20/20 vision kind of thing, as open sourcing a project is no small feat, especially to a small software company.
Second, you wrote your comment in a text box rendered by an application notorious for being written in a custom language (in fact, it seems like arc:lisp::wasabi:.net).
Third, do you have evidence to support "nine hundred ninety-nine times out of a thousand", or is that hyperbole? Have you worked on projects where people used custom languages? Your argument would be much more interesting if we could read more about your experience with it.
By completely negating the possibility that any of those people left for any reasons not involving family, it actually seems to INCREASE the probability that Wasabi was more unpopular within FogCreek than Joel would prefer to admit.
Or could I just as easily argue, with the same total lack of grounding, that you're a secret shill for Atlassian trying to poison the well? (You aren't, of course, but you take my meaning.)
1. Original author left because his wife was going to medical school out-of-country and Fog Creek didn't allow remote work at the time.
2. Second author left because his wife was going to medical school out-of-state and Fog Creek didn't allow remote work at the time (see a pattern?). Later came back because Fog Creek offered remote work. Went on to author the blog post we're talking about.
3. Developer left to go work on Stack Exchange (me!)
4. Developer left to go make the world a better place at Khan Academy
5. 2x developer left to go work on Trello
I think that was all of us. People move on in the course of 5+ years. Turns out most of those reasons don't have to do with programming language.
FWIW, I think Wasabi was a bad decision and I'm not going to defend it. But I really don't like these massive assumptions about people's motivations for leaving.
Can I guess at why you think it was a bad decision?
(a) Too incremental to be worth it, given where the .NET ecosystem was heading
(b) FC couldn't commit the resources required to adequately support a whole language, and it's better to commit to a lower common denominator than limp with a poorly supported language
(c) If you're going to create an additional obstacle to on-ramping employees, it had better be something every project in the company takes advantage of --- like, even if you had built FogBugz in OCaml, that would be a problem since the company is not designed to take advantage of OCaml.
(d) Unless you're getting a truly transformative advantage from a custom language, it's not worth it to be out of a "Google your way out of most problems" mainstream sweet spot
(e) No matter how good the language is, using a different language makes you incompatible with toolchain, so edit/test/debug cycles are needlessly painful
I obviously have no idea if Wasabi was a good decision or not, but a workplace where people are allowed to deploy basic computer science to solve problems is (sadly) an attractive stand-out to me.
Let me start by saying that Wasabi as a strategic move was brilliant. If David disagrees there, I'm a bit surprised: FogBugz represented an awful lot of battle-tested low-bug code, and finding a way to preserve it, instead of rewriting it, made one hell of a lot of sense. I'm with you that the general thoughts in this forum that we'd have to be insane to write a compiler are misguided. Wasabi let us cleanly move from VBScript and ASP 3 to .NET without doing a full rewrite, and I'd be proud to work at a place that would make the same decision in the same context with full hindsight today.
That said, I think Wasabi made two technical decisions that I disagreed with at the time and still disagree with in retrospect. First, Wasabi was designed to be cross-platform, but targeted .NET prior to Microsoft open-sourcing everything and Mono actually being a sane server target. At the time, I thought Wasabi should've targeted the JVM, and I still think in retrospect that would've been a much better business decision. I really prefer .NET over Java in general, but I know that it caused us an unbelievable amount of pain back in the day on Unix systems, and I think we could've avoided most of that by targeting the JVM instead. Instead, a significant portion of "Wasabi" work was actually spent maintaining our own fork of Mono that was customized to run FogBugz.
Second, Wasabi worked by compiling to C# as an intermediary language. There was actually an attempt to go straight to IL early on, but it was rejected by most of the team as being a more dangerous option, in the sense that maybe three people on staff spoke IL, whereas pretty much everyone could read C#. I also think this was a mistake: the C# code was not human-readable, made debugging more complicated (VS.NET had something similar to source maps at the time, so it wasn't impossible, but it was very indirect and quirky for reasons I can get into if people are curious), and that decision meant that Wasabi had all of the limitations both of its own compiler, and of Microsoft's C# compiler. IMHO, these limitations are a big part of why the ultimate move away from Wasabi was even necessary in the first place, since they increased both the maintenance and developer burden.
So from my own perspective, I think that Wasabi was a mistake in that, if we were going to go to C#, we should've just got the translation good enough to really go to C# and then ditch Wasabi; and if we weren't, we should've actually owned what we were doing and written a genuine direct-to-IL compiler so we'd have more control over the experience, instead of going through C#. But I still really do genuinely believe that our going to Wasabi was a brilliant strategic decision, and I think Fog Creek would have suffered immeasurably had we not done it.
A better idea would have been to hand-code the generator, though of course that would have been a lot of string manipulation as well as a little extra effort.
Roslyn solves both of those issues for us, but it didn't exist until very recently.
(In my experience, you need to build up an inter-op layer first to make working in C# somewhat sane, but it's usually not hard to identify the necessary helper modules needed. Having the .NET runtime actually is a boon here since the IL is designed for inter-language inter-op.)
Why did you find yourselves maintaining a fork of Mono (versus fixing upstream)? Was it something like forking, although being problematic, had lower impedance than doing the necessary rituals for getting your changes accepted upstream?
Well, I don't think they actually did leave because of Wasabi; the chain of reasoning I described above isn't very sound. But it's easy and obvious, and the author could have avoided it by doing a little less.
Indeed. And then the author goes on to explain the actual reason, so you don't need to guess.
But I can imagine a scenario where an employee leaves due to tech stack woes. The employee may not want to burn bridges during their exit interview by saying, "I'm leaving because this tech stack sucks: it's affecting my long-term career progression (no other companies use Wasabi); management is too slow to adapt despite our repeated kvetching; the 1 full-time guy who knows the language is overworked and doesn't have time for proper training." Instead, the employee just says, "I'm leaving for personal reasons" and everything is wrapped up nicely.
Edit: Glad to hear from another commenter that this wasn't the case at FogCreek. I have known people to leave jobs due to tech stack woes; they didn't tell management the real reasons why they left.
"The people who wrote the original Wasabi compiler moved on for one reason or another. Some married partners who lived elsewhere; others went over to work on other products from Fog Creek."
Unless I'm mistaken, this was all that was written about the reasons people left, which to me does not seem very "bizarre". Sure, he could have included "... and some left for other companies" but there's a difference between omitting the obvious and saying something like:
"Let it be known that all the ones who were truly good programmers, and by extension, real human beings! Only the best stayed at Wasabi. No one who was a good employee left for anything other than a personal reason. No one!"
As GP notes, why would a dev bother inflicting upon themselves the brain damage of learning this language which will never be useful anywhere else? It's like requiring the devs at your company to use keyboards with a unique layout or something.
It's a great example of path dependency however.
BTW: Red Hat has been here too. We converted a huge app from .Net to Java by using some translation software: http://lpeer.blogspot.co.uk/2010/04/switching-from-c-to-java...
The question was which was the greater evil: rewriting their code in PHP or Java, dealing with Mono, sticking with ASP (which was already old-fashioned), or writing your own language.
From Joel's defence post:
We could use .NET, but then I'd have to pay engineers to install Mono for all our Unix customers, and the .NET runtime isn't quite ubiquitous on Windows servers.
The greater evil was definitely writing their own language.
Obviously, hindsight is wonderful, but they had a lot of people immediately point out it was a bad decision and as they say in their blog post, they ended up having to employ a full time language developer. Installing mono doesn't look so expensive now!
So Thistle, the transpiler, was written either summer 2003 or summer 2004 (it's not entirely clear if he employed the intern that summer or the next) and Wasabi came later (so 2005 is probably correct).
> I fixed a crazy number of silly bugs in mono’s class libraries. To be fair, implementing all of .NET is a herculean task, and most of the fixes were easy enough to make. The result of these fixes meant that we had to ship a patched, custom version of mono
This was 2007. Using Mono in 2005 does not sound like it would have gone particularly well.
Joel is great, but this choice baffled me in the past and baffles me today. For the sort of software FogBugz is, they would have had a much simpler life with Java, Python, Ruby, even Perl. Despite all of Joel's insight into "making html sing", he behaved like an accountant building humongous Excel macros "because that's what we know".
Remember the age of Fogbugz. It was initially released in 2000.
MS Windows was by far the dominant operating system. Virtualization was still in its early stages, and mostly at the desktop level. Linux was still growing in the server market but not dominant as it is today.
And what exactly is wrong with the MS ecosystem if you're targeting enterprise? There are still a lot of businesses that work exclusively with Windows servers, with IT managers that don't want the headache of having Linux servers.
Enterprise software tends to be a notch or two below consumer software in the "it just works department", and my experience with deploying Java based enterprise software was pretty negative. In 2000, not a lot of people were using Ruby, Python or Perl for enterprise web apps. It was mostly ASP and JSP back then.
God, don't I half remember it. I was a junior ASP dev at the time, for my sins. Java was hot like the sun and PHP was the default choice for the young and penniless. Perl was mainstream. Python and Ruby were new and rough (they were crap for webdev on shared hosts, with zero support by ISPs, but alpha geeks were already flocking to their ecosystems, Python in particular).
I'm sure part of the reasoning was that FogBugz did not start as a product -- the product back then was CityDesk, which was even more tied into the MS world -- but still, the "server scene" back then was already unix-y, which is why they were pretty soon forced to consider Linux support. I still think it was a shortsighted approach but hey, FogCreek is still alive 15 years later, so I guess it wasn't all that bad.
"In particular, we didn't want to have to tell them to get a Java Virtual Machine up and running, because that's not easy, and they're not really a hundred percent compatible."
You'd be surprised how many applications don't do that too. There's a reason why a lot of people say "enterprise software sucks" -- it's usually because the software makers value new features over improving how things work.
I suspect that this is a function of overestimating the effort on the Java side and underestimating both the demand and the work on the non-Windows side.
They used to be an extremely Windows-centric company.
It's kind of the same tune with mobile software -- "Native feels and runs better than everything else". In the case of enterprise software GUIs, it's particularly true.
They went from writing VBScript to writing Wasabi.
Also, there's a big difference between a rewrite and rewriting but maintaining both versions.
Nowadays we have open-source runtimes, .net running on multiple platforms, and componentised tools like Roslyn. It is easy to forget that the .net tooling from 10+ years ago was much more limited.
Writing your own language is an unusual approach, so descriptions of dealing with that kind of technical debt are rare. I thought this article was valuable and interesting.
I wouldn't worry about this at all. The choice of language on project n has never once negatively affected my work on project n + 1. As a programmer, my job has a lot more to do with solving problems (in the generic sense) than it does the tools I'm using to solve them.
I also found that passage oddly worded. We get it, people don't stick around forever, you don't have to try and hide it like it's some dirty little secret. Also, as a developer, I doubt I would have wanted anything to do with a closed-source internal-only poorly-documented language. You may learn some concepts that transfer but by and large you will have to start from scratch when you leave and you won't have skills people are looking for. Also if you do dive headfirst into Wasabi and love it and then leave you probably will be that annoying fuck at your new company that says shit like "Well in Wasabi this was easy...." or "Wasabi makes this problem non-existent because...." Shut up, no one cares. It's crazy to me to think that a company as small as Fog Creek would attempt something like this but to be fair I was born and learned to develop in a different environment than they did so maybe the tools and languages available back then really just couldn't cut it.
I work with at least 10 languages a week. There is zero chance of us hiring people with years of experience in all those. We want people who have used multiple languages, and someone who worked on compiling one language into another would completely satisfy that itch.
Just like ... Rust.
In any case, Spolsky still firmly believes that Netscape would have been better off today if they had continued work on the (completely unsalvageable) Netscape 4.x. Having extensively reviewed the history, I'd say Joel was dead wrong about this. There's no reason to assume his advice is more trustworthy than that of any other "thought leader," or that it is consistently followed within his own company.
Scroll down to the part "Fear, Uncertain, and Doubt by Joel Spolsky" from September 01 (permalink 404s)
In a parallel world where the cool developers were all attending Wasabiconf, Joel Spolsky Prime wrote a sneering blogpost assuming that DHH Prime was joking about the crazy 'scaffolding' and 'proprietary database mapping' framework he was building BaseCamp on
Hats off to FogCreek for building a successful product, but that doesn't mean that people have to admire or even accept their technical choices without any criticism. I for one consider DHH's criticism to be on mark.
I don't think Joel's conservative approach was all that bad.
He made a judgment call for --his-- company's products at the time. He could afford to, and it appeared to have worked well for his product line. And believe it or not, people wanted to work at Fog Creek.
In 2006, I wouldn't have imagined the explosion of language acceptability that we have today. We see everything from Ruby, Erlang, Scala and other languages widely used -- a great thing that you can use the best language for software development instead of being limited to the "safe enterprise" choices of Java or Microsoft .NET.
It would have been more of a mistake if Fogbugz turned out to be a load of crap, but it's pretty good. I prefer it to JIRA.
Right. I recently listened to a lot of the archive of the StackOverflow podcast. Joel & Jeff periodically discuss Rails, and Joel eventually recognizes Rails as a viable option once it improves. This blog post must have been before that.
One of the best parts is that he pretty obviously has Spolsky dead to rights here, but typically DHH is thought of as an asshole and Spolsky as the software nice guy. But it's possible to spread FUD with nice-sounding words.
1) Maintenance Nightmare
2) No-one likes programming in a proprietary language, as it dead-ends your career
3) Company leavers take vast amounts of knowledge away with them, and it's impossible to hire in replacements for that knowledge
Just because they've been successful doesn't mean they haven't expended unnecessary effort to get there.
Climbing a mountain carrying an extra unnecessary 15 kilos is still successfully climbing a mountain - but carrying an extra 15 kilos whilst doing so is still a bad idea.
Speak to anyone at almost any company and you will get at least one story of some unbelievably bone headed strategic mistake the company is making.
All technical debt decisions should be made based on what the business hopes to get from the debt. Considerations for the alternatives, when the debt is retired, etc. should all play a part. To globally say "this is a terrible idea" like so many in this thread are doing totally ignores these types of factors in favor of an "it's bad computer science" argument, and thus misses the point of technical debt in the first place.
(Then they leave for the next thing, like they always do, and their coworkers are left with a pile of "??????????????")
I still wouldn't want to have to learn this language, would you?
I recently played around with Atari 2600 BASIC (which quite possibly runs worse emulated than on a real system) but found that it had an incredible tracing mechanism I've never seen elsewhere. Also, you could control the speed of execution as the program runs (imagine: running a program at full speed, then slowing down to watch exactly what happens). You won't know until you try.
The way I read this article, creating Wasabi a decade ago was not a mistake, given what they were doing and what was available at the time. Not open-sourcing Wasabi was a mistake, though.
When I learn a new, perhaps hyped-up computer language, I soon run into difficulties, no matter what the merits of the language. The difficulties are lack of tooling: e.g., no debugger - or only a rudimentary one - no static analysis, no refactoring, no advanced code browser.
If the language is successful, these things come with time.
When you develop an in-house language, you'll never get the advanced tooling that makes for a great software development experience. This, for me, was why I was surprised by Joel Spolsky's announcement of an in-house language.
(Although, to be fair, these things didn't really exist for VBScript or for PHP at the time Wasabi came to be.)
Aren't tools like RubyMotion, Appcelerator and their ilk kinda similar to Wasabi in terms of what they do?
You might even make the argument that Coffeescript is somewhat similar to Wasabi.
It's listed from a Google search, but from just clicking around the Fogbugz site I can't even find the page/pricing for on premise installation.
I wonder to what extent the generated C# code depends on C#'s dynamic typing option. I ask because the original VBScript was certainly dynamically typed. So by the end, to what extent was Wasabi statically typed, through explicit type declarations or type inference? And how much did the compiler have to rely on naming conventions such as Hungarian notation?
The type inferencer was an attempt to make it as easy to declare variables and functions as it was in VBScript. Of course, you could explicitly declare types as well.
The one place Wasabi depends on Hungarian notation as a type heuristic is when reading from an ADO RecordSet (which happens quite frequently in FogBugz). The column name determines which type the returned object will be cast into.
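If that's hard to picture, a toy version of such a column-name heuristic might look like the sketch below (the prefixes are my guesses at typical Fog Creek Hungarian conventions, not a dump of Wasabi's actual mapping):

using System;

static class ColumnTypeHeuristic
{
    // Map a Hungarian-style column-name prefix to a CLR type, the way a generated
    // RecordSet accessor might decide what to cast the returned object to.
    static Type Infer(string columnName)
    {
        if (columnName.StartsWith("ix")) return typeof(int);      // index / id columns
        if (columnName.StartsWith("dt")) return typeof(DateTime); // date columns
        if (columnName.StartsWith("f"))  return typeof(bool);     // flag columns
        if (columnName.StartsWith("s"))  return typeof(string);   // string columns
        return typeof(object);                                    // unknown: stay late-bound
    }

    static void Main()
    {
        Console.WriteLine(Infer("ixBug"));    // System.Int32
        Console.WriteLine(Infer("dtOpened")); // System.DateTime
    }
}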
I've been on that end and it really sucks, especially if you don't have the manpower to keep all platforms and versions in sync feature-wise. And there even was a C++ core that was identical across three platforms and yet there still was feature disparity and neglect.
It's funny how they teamed up 2 years later to build Stack Overflow (and again used an MS stack).