Reaching the Limits of Adobe Stupidity (whitequark.org)
178 points by siasia on May 6, 2012 | 49 comments



The title is misleading. There is no indication such a limit exists.


For some perspective, the Flash Player code base is almost 20 years old and must maintain binary compatibility with ancient content (because many websites have binary .swf files without the .fla source files). Flash is likely the world's most installed software, when you consider that most Windows, Mac, and Android devices have it.


Adobe Flash, formerly Macromedia Flash, formerly a vector-based font-authoring tool rejigged into an animation system called FutureSplash.

It's no wonder the thing is so jammed full of fossilized code. Legacy platform support. Legacy plugin support. The whole thing is a museum of bad ideas and shoddy implementations, done by people who have long since left companies that no longer even exist.

It was wildly successful, beating out even its brother Director/Shockwave, which is an admirable achievement, yet it did not come without a cost. Like Windows, its success has saddled it with an enormous base of users and developers to keep content.


This is not as much of a burden as it would first appear. The vast majority of Flash files (except those that rely exclusively on ActionScript for rendering) are based on a relatively small set of primitives/instructions: place object, move object, show frame, etc. When new features were added, typically a new instruction was introduced. Only very rarely was an existing instruction modified, for example PlaceObject2 being extended to include optional event handlers for movie clips.
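
For a rough sense of how simple that core is, here is a minimal sketch in Python of walking the tag stream. The header layout and the tag codes are from the published SWF spec; the helper name and the assumption that data is an already-decompressed body (past the FWS/CWS file header) are mine:

    import struct

    TAG_NAMES = {0: "End", 1: "ShowFrame", 4: "PlaceObject",
                 5: "RemoveObject", 26: "PlaceObject2"}

    def iter_tags(data, pos=0):
        # RECORDHEADER: upper 10 bits = tag code, lower 6 bits = length
        while pos < len(data):
            (code_and_len,) = struct.unpack_from("<H", data, pos)
            pos += 2
            code, length = code_and_len >> 6, code_and_len & 0x3F
            if length == 0x3F:  # long form: real length follows as a 32-bit int
                (length,) = struct.unpack_from("<i", data, pos)
                pos += 4
            yield TAG_NAMES.get(code, code), data[pos:pos + length]
            pos += length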

A big problem with all things Flash-related is that Adobe was rather half-hearted in its commitment to making Flash open, so the documentation was never quite as polished as it could be. Another problem, which is probably the source of the frustration in the original post, is that functionality was added rather than existing code upgraded. That probably meant that the existing code, along with all its warts, was left untouched rather than run the risk of making breaking changes.

The "interesting stuff like jumps past the end of function" were I think attempts at obfuscation since decompiling flash files and actipnscript is not very difficult. Another example was to append actionscript byte-codes at the end of a table used for strings and amend the length of the table to hide the code. The following actionscript byte-codes could then add jump into this hidden code which would hopefully but not successfully confuse the decompilers.


This is just a subjective feeling, but back in the '90s, everyone seems to have been producing code with only one goal in mind: have the compiled output work well enough that it can be shipped.

There was next to no automated testing; there was some theoretical best practice, but at least in the Windows world nobody cared. Heck, back in the day there weren't even multiple accounts on a machine, nor were there file system permissions, for example.

Of course, those old days are long gone by now, but software surviving from back then (like Flash Player) still has this legacy.

As an aside: when it initially came out, Flash Player was called ShockWave Flash (.swf - ring a bell?). Flash was this lighter, simpler (and cheaper) version of the Shockwave authoring system.

At some point Flash superseded it in features, though.

Some of this old software can be rewritten, but most of it is still depended on by independent third parties that require the mess to be bug-for-bug compatible.

Combine that with binary file formats, the need for backwards compatibility, the need for newer software to read older file formats, and, in the case of Flash, the need for newer plugins to execute old, buggy bytecode, because so many compiled .swf files are out there whose source (itself in binary .fla form, by the way) is long lost.

It's so easy to look at Flash Player today and laugh at Adobe, but remember: this piece of software comes from a different age where nobody but the Unix guys knew better (and they knew better than to touch Windows).

It's a good thing that it's slowly dying I guess.

By the way: I'm as guilty as anybody else.

Thankfully though, the age of dial-up is over, so nobody is going to use that Windows dialer I wrote back in those days (despite its bugs, it was deemed good enough to win awards even in dead-tree publications).

Where this does bite me personally though is in the Windows frontend of our product. I wrote that abomination right after finishing high school back in 2001.

Unfortunately it is still in wide use and I have to maintain it, which is ever so painful.

In theory it really needs a rewrite, but by this time all new users are - of course - using the web frontend, so only people who don't care at all about technology are still using the Windows client (still 25% Windows 98, of all things).

So rewriting the client and then pushing an auto update would be incredibly painful for them unless it's pixel-by-pixel identical.

And making this rewrite run on Windows 98? I don't even want to think about it.


I think the real change was that projects started to skip the testing step. When programmers can get new code into production the same day, they need to do their own testing.


Actually, they did change it before in a non-backwards-compatible way at least two times (IIRC), so there should not be much pain in doing that again... At least from the updates/compatibility-preserving point of view, they have it sorted.


Can you point to a link describing this? As far as I remember they were still bundling the old runtime and feeding older code through the older runtimes.


Yes, I think that's correct, but I don't know the details of how it was done. The point was that they have done this before, and the new design isn't actually that old.


None of this explains why the compiler doesn't do some basic optimisations, or why the incorrect code generation hasn't been fixed. As long as it generates bytecode within the legacy spec, it should be perfectly backward compatible.


One can only approach the limits of Adobe's stupidity, not actually reach it.


2 x 10^13 years: Estimated time for a self-assembling system of the complexity of Adobe's software lineup to reach equilibrium, i.e. to balance out new bugs with fixed issues.

http://news.ycombinator.com/item?id=3936320


HN needs more Adobe bashing.


This being HN, I like to think this is more of a calculus joke =D


I have lost faith in Adobe because they cannot get their products working on case-sensitive HFS volumes. The problem has existed for almost a decade, I think. The fact that one cannot install Creative Suite on another volume makes it even worse.


Slightly more sickening is that for some of their apps it can be fixed by simply renaming directories and libraries within the app bundle. I googled around for a bit a few months ago and got PS and InDesign working on a machine with case-sensitive formatting. This sort of laziness just kills me.

One example of the tail wagging the dog: http://forums.adobe.com/message/3311504


This may be somewhat irrelevant, but what is the motivation for having a case-sensitive HFS volume? I agree that Adobe should support it (and separate-volume installation), because these are edge cases that should be accounted for, but I've never heard the reasoning for having a case-sensitive format.


I just always have done it because I want my Mac to act like a normal UNIX system. I suppose it was kind of a principle thing.


This is why I do it. To make my local dev system a bit closer to my (typically Linux) deployment systems.

(I remember wasting a lot of time once on a rookie mistake where I had a Perl script with "use Strict;" in it instead of "use strict;". It worked fine on Mac OS (OS 9, in this case), but failed in at-the-time-inexplicable ways when deployed on Unix. I suspect my habit of insisting on choosing a case-sensitive filesystem stems largely from that… And yeah, doing Perl/CGI webdev on Mac OS 9 was a really bad idea, I discovered.)
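
If you're unsure which behaviour a given volume has, a quick probe settles it. A sketch in Python (my own; create a lowercase file, then look it up under the uppercase name):

    import os, tempfile

    with tempfile.TemporaryDirectory() as d:
        open(os.path.join(d, "probe"), "w").close()
        # On a case-insensitive volume (the HFS+ default) this lookup succeeds.
        sensitive = not os.path.exists(os.path.join(d, "PROBE"))
        print("case-sensitive" if sensitive else "case-insensitive")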


I presume it is partially due to POSIX. Here's an article from MS about that: http://support.microsoft.com/kb/100625

(TL;DR: NTFS itself is case-sensitive even though the Win32 API isn't, because NT was/is designed for POSIX compliance)


I would say the stupidity here is any modern OS still having case-sensitive file systems.


iOS is case-sensitive (and it bit me); way to go, Apple... http://www.enavigo.com/2012/02/12/xcode-ios-simulator-is-cas...


Why am I not surprised this is about bugs in Flash?

It ain't much better on the artist side. The UI revision after the Adobe/Macromedia suit was so bad it was one of the major reasons I left the animation industry. My friends who stayed have AMAZING piles of helper scripts to patch it up and make it halfway usable.


One of your major reasons for leaving an entire industry was a UI revision in ONE of your software tools? Surely you are exaggerating.

What do the amazing piles of helper scripts intended to patch up a UI revision consist of?


No, no, really. Flash was the tool used in ALL the jobs I was getting. And the UI changes from 5 to MX were completely flow-breaking; imagine that a new revision of your favorite text editor made you wait half a second after every third keystroke, and flushed the typeahead buffer every time that happened.

And you HAVE to update to this new tool because the files saved by it are incompatible with the old versions; if you're gonna interoperate with other folks you have to somehow deal with this maddening new behavior.

I was already coming to feel that I would never have the joy of working on my own stories and creations because I didn't have the burning drive and social skills to claw my way to running a show. All that was left of animation to me was the joy of making things come to life. And this UI change made that a death of a thousand cuts instead.

The helper scripts are a lot of macros to automate common usage patterns, and a few plugins to create new palettes. I can't remember the exact details; this was a whirlwind tour of Cartoon Network's working methods when I was staying at their head Flash director's place for a funeral.


I'm surprised nobody mentioned the wonderful

http://adobegripes.tumblr.com/

yet :)


I knew Adobe was bad but I didn't realise they were that bad.


As noted in the article comments, much of the AS weirdness goes back to Macromedia - but Adobe itself has its own share of blunders.


And before that FutureWave.[1]

[1] http://en.wikipedia.org/wiki/FutureSplash_Animator


I still like ActionScript because I like JavaScript, but in many cases I want to declare my types so I can get type checks at compile time instead of being surprised or having to write unit tests to be sure of myself. Also, refactoring large JavaScript or Python projects is my nightmare, because getting them to 'compile' means nothing; so many bugs could still be lurking.


Reminds me of this: https://twitter.com/#!/dozba/status/198081985506328576. Seriously, put the effort in to learn a unit testing framework (QUnit) and a mocking framework (SinonJS) and use them. You'll find that refactoring your large dynamic projects no longer involves blindly 'changing shit'.


Shouldn't the title be "Reaching the Limits of Adobe Flash Stupidity"?

You couldn't say the same about the Photoshop team, IMO.


There is this old rant about the PSD file format, which I was instantly reminded of when reading the article:

http://code.google.com/p/xee/source/browse/XeePhotoshopLoade...


I know some really angry designers that disagree, but fair enough.


As a designer I can attest to this. I think Photoshop is missing some key things for a web designer's use. Although CS6 is looking great.


He's critical of the compiler's optimization and the efficiency of the code, but doesn't Flash have an embedded JIT? That effectively means the bytecode is a pre-compilation form, and transformations that might seem like optimizations could potentially make the JIT's life far more difficult.


How does

    (ternary (false) (integer 15) (integer 15))
help the JIT?


Probably doesn't help it. Probably doesn't hurt it either, though. I'm sure the JIT realizes the whole operation is pointless and just replaces it with what is effectively (integer 15). It did look like there were bits of instructions put in there to make it easier for the JIT to see where certain boundaries/behaviours were in the code, and maybe this is one of them. Or it could be something stupid that just makes it easier for them to manage the code in the Flash compiler. Either way, even if it is a bug, it won't negatively impact the runtime if the JIT is doing anything like a sane job.

Think of the output of the Flash compiler as being like the output of the C preprocessor. Then it makes more sense.
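
Folding a constant-condition ternary like that is also about the cheapest optimization there is; a toy sketch in Python, over a made-up (op, args...) tuple form of the decompiled output above:

    def fold(node):
        op, *args = node
        if op == "ternary":
            cond, then, other = (fold(a) for a in args)
            if cond == ("false",):
                return other            # condition is constant false
            if cond == ("true",):
                return then             # condition is constant true
            return ("ternary", cond, then, other)
        return node

    # The whole construct collapses to ('integer', 15):
    print(fold(("ternary", ("false",), ("integer", 15), ("integer", 15))))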


It still hurts everybody, because it bloats your files with unnecessary opcodes and forces the JIT to sift through it and optimize it away, instead of spending its time on useful optimizations.

It might only seem like a little turd on the sidewalk that you can easily step over, but quite evidently in the case of Flash little things like these have added up, and its performance is now -- in the metaphorical sense -- up to its neck in shit.


> It still hurts everybody, because it bloats your files with unnecessary opcodes

After compression, what do you figure the cost is? I mean, that looks like a lot of text, but as opcodes it's a few bytes at most before compression. Improving video codecs provides way larger gains than fixing things like this.

> forces the JIT to sift through it and optimize it away

That code is already going to be there, isn't going to burn up a measurable amount of CPU, and only imposes a cost once per load. There are better places to work on improving the product.

> It might only seem like a little turd on the sideway that you can easily step over, but quite evidently in the case of flash little things like these have added up and its performance is now -- in the metaphorical sense -- up to its neck in shit.

That's actually far from clear. I think another way to look at it is that Flash has so many other problems that focusing on this BS clearly isn't worth their (or anyone else's) time. Heck, for all we know some of this has been caused by fixes to other performance problems they've been working on.
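
The "few bytes after compression" point is easy to sanity-check, since SWF bodies are zlib-compressed (the CWS variant). A back-of-envelope sketch in Python, with made-up opcode bytes:

    import zlib

    # A thousand copies of a made-up 9-byte dead-code sequence: junk this
    # repetitive nearly vanishes under DEFLATE.
    dead = b"\x12\x9d\x02\x00\x07\x0f\x00\x00\x00" * 1000
    print(len(dead), "->", len(zlib.compress(dead)))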


Wow, is that code terrible. I wonder if there would be a market for a "Flash Compactor" to remove all that dead code and speed things up?


Sure, it's exactly the same as the PHP opcode cacher market.


Now I know the reason why I felt uncomfortable about ActionScript. Flash folks were always trying to convince me that AS3 is a perfectly fine and kickass language. Next time I'll just link this.


Given that you didn't have this information before, it's highly unlikely that this was why you felt uncomfortable. Most likely you just don't like ActionScript. It's OK — you're allowed to have personal taste.


To be fair, there's nothing wrong with the language. The article points out problems in the VM and the compiler.

There's nothing stopping someone from writing, for example, an AS3 compiler for the JVM that interacts with Java libraries in the same way JRuby does.


Well, since he is talking about how bad the specs are (and not the VM), this seems only partly true ;)


Check out Haxe. It is a good, modern language that compiles to a number of targets, including Flash.


Remember kids, optimization doesn't matter!


Also, there are some great and funny comments about this article on reddit: http://www.reddit.com/r/programming/comments/t9qxy/reaching_...



