Hey HN, I haven't done a lot of technical postmortem blogpost-style writing
so I'd welcome any feedback or tips on how to improve. Is it too long, too
short, too technical, not technical enough? Boring? Interesting? Is it
enlightening or does it just come off like content marketing? I literally
have no idea how good I am at this.
I don't think anyone here is going to complain about a post where you make a game work by reverse engineering Bezier splines encoded in XML by Adobe Flash.
However, I wonder why the OP didn't try to keep the vector graphics and use SVG, for instance. That would allow for infinite scalability. It is mentioned that "GPUs don't like vectors", but if the game doesn't change too often, should it really make much of a difference?
I did consider it. Infinite scalability isn't required during runtime; the scene doesn't ever zoom or scale in the games, because that was always so slow in Flash, so the only real benefits would be 1) supporting higher resolutions and 2) reducing file size.
For 1) I decided I'd rather just release an update with larger textures if these ones ever start to look dated. That way I get to keep the runtime code simple. Less code means fewer bugs. I don't want to spend a lot of time fielding support requests from users who hit edge cases in the rasterizer. As for not changing too often, that's true, but taking advantage of that means doing change tracking with dirty-rectangles or similar, which not only adds complexity but also feels like it would make performance less predictable.
And for 2) the game as it stands now is under 50MB so I didn't feel a pressing need to make it smaller, although a tiny executable would be cool in a satisfying, demoscene kind of way.
So, I think I need to elaborate on "GPUs don't like vectors". What the OP meant was "GPUs have literally no support for rendering anything other than pixels on triangles and getting them to efficiently draw Bezier curves and fills is an active area of research". You'd need specific hardware support for rasterizing them, and as far as I'm aware no such hardware exists.
When Adobe hacked "HTML5 support" onto Animate, they did exactly the same thing the OP did. It renders every shape in the FLA to a sprite sheet and then draws it to a canvas tag. If you have knowledge of what will be drawn ahead of time, this is the most reasonable thing you can do.
Even before HTML5 support, the AS3 Starling framework that let you "use the GPU" would pre-render all your vector assets to bitmap textures at runtime. And that was a framework built for Flex developers; if you were accustomed to building things on the timeline, you rendered on the CPU, because that's where all of Flash's very particular rendering logic lived.
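Pre-rendering everything to a sprite sheet means packing many rasterized shapes into one texture. As a rough illustration (not Adobe's or the OP's actual exporter), here's the classic shelf-packing approach, where function names and the interface are my own invention:

```python
# Shelf packing: sort rects tallest-first, fill rows ("shelves") left to
# right, and start a new shelf when the current row is full. A toy
# sketch of sprite-sheet packing, not any real tool's algorithm.

def pack_shelves(sizes, atlas_width):
    """sizes: list of (w, h). Returns (x, y) placements in input order."""
    order = sorted(range(len(sizes)), key=lambda i: -sizes[i][1])  # tallest first
    placements = [None] * len(sizes)
    x = y = shelf_h = 0
    for i in order:
        w, h = sizes[i]
        if x + w > atlas_width:          # shelf full: move down to a new row
            x, y = 0, y + shelf_h
            shelf_h = 0
        placements[i] = (x, y)
        x += w
        shelf_h = max(shelf_h, h)
    return placements
```

Real packers do better (guillotine, max-rects), but shelf packing is often good enough for animation frames of similar heights.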
Ruffle gets around this by tessellating every shape into a GPU-friendly triangle mesh. This lets us render what's technically vector art on the GPU. But as you can imagine, this creates its own problems:
- Flash has very specific stroke-scaling rules. If a stroke is smaller than 1px[0], it will be rounded up to 1px. This is how Flash's "hairline stroke" feature works: it actually asks for a 1/20px[1] stroke, and that gets rounded up to whatever the current scaling factor for that shape is. When your stroke is a polygon mesh, you can't vary the stroke to match Flash without retessellating, so hairline strokes on shapes that stretch don't animate correctly.
- Likewise, any stretch of a stroke that changes the aspect ratio also distorts the stroke, since it's baked into the tessellated mesh. There's a minigolf game that does this to hairlines and it will basically never look right in Ruffle.
- Tessellated vectors lose their smoothness, so we have to sort of guess what scale the art is drawn at and add enough detail polygons for things to render correctly. Most of the time we get it right. However, there are some movies that do crazy things like store all the art at microscopic scale and blow it up. This provides a compression benefit, because it quantizes the points on the art with little visual difference on Flash Player. On Ruffle, however, the art becomes very low-poly.
- All the tessellation work takes a significant amount of time. There are certain movies (notably, a lot of Homestuck) that would hang the browser because of how much ridiculously complicated vector art was being processed before the movie even loaded. We had to limit how much art could be tessellated per frame, and expose that to movies as bytesLoaded, which is why Ruffle movies have preloaders even though we don't support streaming download.
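The "guess the scale and add enough detail" problem comes from flattening curves into line segments against a tolerance measured in some pixel space. A toy sketch of recursive flattening (my own simplified version, not Ruffle's actual tessellator) shows why the tolerance choice matters:

```python
# Recursively flatten a quadratic Bezier (p0, p1, p2) into line segments.
# The curve is "flat enough" when the control point is within `tol` of
# the chord, so `tol` decides the polygon detail. If art is authored at
# microscopic scale and blown up later, a tolerance chosen at the
# original scale yields very low-poly output, the exact failure mode
# described above. Toy sketch, not Ruffle's real code.

def flatten_quadratic(p0, p1, p2, tol=0.25):
    def dist_to_chord(p, a, b):
        ax, ay = b[0] - a[0], b[1] - a[1]
        px, py = p[0] - a[0], p[1] - a[1]
        chord_len = (ax * ax + ay * ay) ** 0.5 or 1.0
        return abs(ax * py - ay * px) / chord_len  # perpendicular distance

    def midpoint(a, b):
        return ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)

    points = [p0]

    def recurse(a, c, b):
        if dist_to_chord(c, a, b) <= tol:
            points.append(b)
            return
        # de Casteljau split at t = 0.5 into two smaller quadratics
        ac, cb = midpoint(a, c), midpoint(c, b)
        m = midpoint(ac, cb)
        recurse(a, ac, m)
        recurse(m, cb, b)

    recurse(p0, p1, p2)
    return points
```

A loose tolerance yields a single chord; tightening it subdivides until the polyline hugs the curve.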
There's another approach to drawing Beziers on GPU: drawing the hull of control points with a procedural texture that calculates the underlying polynomial. This is especially simple for quadratic curves (the ones with one control point), which is what all Flash timeline art[2] is.
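The technique alluded to here is, I believe, the Loop-Blinn approach: assign the curve's triangle the UV coordinates (0,0), (1/2,0), (1,1), let the GPU interpolate them, and a pixel is inside the fill when u² - v ≤ 0. A CPU-side sketch of that per-pixel test (function names are mine):

```python
# Loop-Blinn-style inside test for a quadratic Bezier fill, evaluated
# on the CPU for illustration; on a GPU this runs in the pixel shader.
# Triangle corners P0, P1, P2 carry uv = (0,0), (0.5,0), (1,1). A
# pixel's uv is the barycentric blend, and sign(u^2 - v) says which
# side of the curve it lands on.

def curve_side(b0, b1, b2):
    """b0, b1, b2: barycentric weights of a pixel w.r.t. P0, P1, P2."""
    u = 0.5 * b1 + b2
    v = b2
    return u * u - v   # <= 0 means inside the fill

def on_curve(t):
    """Barycentric weights of the curve point at parameter t."""
    return ((1 - t) ** 2, 2 * t * (1 - t), t ** 2)
```

Points exactly on the curve evaluate to 0, the control-point corner is outside (+0.25), and interior points go negative, so the shader just discards or keeps the fragment based on the sign.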
However, strokes are more complicated. You'd think we could just take the hull of the stroke and draw that as a fill, but you can't. This is because the offset of a Bezier curve is not a Bezier curve. Drawing the stroke in a pixel shader would make sense, except you still need to define a polygon mesh around your stroke with a reasonable texture coordinate system to make the math work. And the polygonal outlines of Bezier curves can get really funky; there's no obvious way to quickly say "here's a curve, now give me the polygon that encloses a 5px stroke around it". Remember how tessellation takes so long that it would hang Ruffle?
[0] I'm not sure if this is virtual or device pixels.
[1] Flash specifies vectors in fixed-point units called twips. That's why the zoom factor on Flash movies was locked to 2000%.
[2] Flash can draw cubics - the two-control-point curves you think of when you think Bezier - but only in response to an AS3 graphics command. We haven't implemented this in Ruffle yet.
> "GPUs have literally no support for rendering anything other than pixels on triangles and getting them to efficiently draw Bezier curves and fills is an active area of research"
Well fuck me, I had no idea. I figured that 'simple 2D vectors' would be beyond piss-easy for modern GPUs. I'd never considered that it wasn't the actual math space that was accelerated, rather the fast memory mapping of everything on presumably comparatively simple geometries. You've just turned my view of the world upside down :(
Yeah, this is something that many many people assume (vectors are "easy" on GPUs) and then are amazed (like I was!) that they aren't even in the function set. My thought was that if you were "accelerating" desktops you'd really want vectors right? But no.
The interesting thing is that graphics cards in the 90s - which were "desktop accelerators" past the low end - specifically supported accelerated 2D primitives, including Bezier curves for the more powerful ones. It was all gradually dropped as cards focused on 3D games specifically.
Have you tried splitting the area covered by each shape into several 8x8 pixel rectangles, then running a compute shader over each one that executes the exact same rasterisation algorithm as Flash did? That's more or less how triangles are rasterised on the GPU anyway.
It's definitely not a simple solution, but might enable you to do runtime rasterisation with a good framerate on the GPU rather than pre-rasterising all the vector art.
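The tile idea in a CPU sketch: test each pixel center in an 8x8 tile against the shape with the even-odd rule, which is the same per-pixel decision a compute-shader thread would make. This is a toy stand-in for Flash's actual rasterization rules, with hypothetical function names:

```python
# Even-odd point-in-polygon coverage for one 8x8 tile, as one
# compute-shader thread per pixel might compute it. `polygon` is a list
# of (x, y) vertices; curves would be pre-flattened or evaluated
# analytically. Toy stand-in for Flash's real rasterizer.

TILE = 8

def inside_even_odd(px, py, polygon):
    crossings = 0
    n = len(polygon)
    for i in range(n):
        (x0, y0), (x1, y1) = polygon[i], polygon[(i + 1) % n]
        if (y0 > py) != (y1 > py):  # edge spans the pixel's scanline
            x_at = x0 + (py - y0) * (x1 - x0) / (y1 - y0)
            if x_at > px:
                crossings += 1
    return crossings % 2 == 1

def rasterize_tile(tile_x, tile_y, polygon):
    """Returns an 8x8 grid of booleans for the tile at (tile_x, tile_y)."""
    return [
        [inside_even_odd(tile_x * TILE + col + 0.5,
                         tile_y * TILE + row + 0.5, polygon)
         for col in range(TILE)]
        for row in range(TILE)
    ]
```

Binning shapes to tiles first means most tiles test against only a few edges, which is where the approach earns its keep.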
Also, as far as I understood, both GNU Gnash and Lightspark were using OpenGL for rendering. So I always expected that the GPU would still bring some sort of acceleration for 2D path rendering?
Flash was my 2nd programming language (after VB6) back when it was still Macromedia. I had a lot of fun making really crappy minigames and learning the basics of coding. This article brought some good nostalgia and plenty of astonishment at the lengths you went to.
Your writing itself is great. A lot of writing linked by HN seems too verbose because people try to sound smart. Yours is succinct while engaging. Right length, just technical enough, and interesting. I really enjoyed this, thanks for writing it!
Also:
> Object-orientation is not fashionable right now in gamedev circles
Object-orientation is kind of the old-school way of doing things, these days it's all about ECS - entity component system. This is a more data-oriented approach that has the potential for much higher performance when you have thousands of objects that need updating in your game. It's similar to how OO languages like Java/C# are going out of style and more (nominally, at least) data-oriented languages such as Go or Rust are in style.
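For anyone who hasn't seen the pattern, here's a minimal, hypothetical ECS sketch (not any real engine's API): entities are just IDs, components live in per-type tables, and a "system" is a loop over one table:

```python
# Minimal ECS sketch: entities are plain integer IDs, components are
# stored in per-type tables keyed by entity, and systems are functions
# that iterate over one table. Hypothetical illustration only.

from itertools import count

_ids = count()

def spawn():
    return next(_ids)

# Component tables (a real engine would use dense arrays for locality).
positions = {}   # entity -> [x, y]
velocities = {}  # entity -> [dx, dy]

def movement_system(dt):
    # Runs over every entity that has BOTH a position and a velocity.
    for entity, vel in velocities.items():
        pos = positions.get(entity)
        if pos is not None:
            pos[0] += vel[0] * dt
            pos[1] += vel[1] * dt

player = spawn()
positions[player] = [0.0, 0.0]
velocities[player] = [1.0, 2.0]

scenery = spawn()
positions[scenery] = [5.0, 5.0]   # no velocity: movement system skips it

movement_system(dt=0.5)
```

The point is that behavior lives in systems, not in a class hierarchy: adding a component to an entity changes what happens to it, with no inheritance involved.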
I watched this talk years ago: CppCon 2014: Mike Acton "Data-Oriented Design and C++"
I was very impressed by the ideas presented.
Java/C# can be used to write a program with "just a bunch of structs + static methods". Isn't that data-oriented enough? Yeah, I know we can also do 72 levels of inheritance... but ignore that for a moment.
Yeah, "OO is going out of fashion" is a common phrase in gamedev, but what's actually meant is "deep class hierarchies defining the behaviour of game objects is going out of fashion". Most ECS code is still written in OO languages and makes heavy use of OO features.
Probably because OO teachers and mentors failed horribly (AFAIK we still don't warn new students about the pitfalls of inheritance?), ECS got popular as a better alternative to shitty OOP code?
Not only that, but due to CPU caches and memory-alignment issues, an ECS has better performance than inheritance-based designs; also because most ECS implementations ditch virtual function calls.
Whether or not an ECS will actually be faster depends entirely on the type of game and access patterns of the inheritance structure. Further, it matters how you are doing inheritance (LTO can eliminate a lot of the pitfalls even if you use virtual function calls).
This is nothing against ECS. I just find the current claims of performance dubious and potentially flat out wrong given modern compiler optimizations and changes in CPU caches (mainly that they got a LOT bigger).
I believe claims of "compilers/caches getting good enough" are the exact reason we have ECS now. It surely depends on the kind of game, but games with any kind of entity that isn't unique should see the performance improvements.
Ideally, compilers should be able to transform classes into an ECS layout when generating code. It shouldn't be hard to have a pass before compilation if we use Clang.
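The cache argument above is really about memory layout: array-of-structs versus struct-of-arrays. A sketch of the difference (illustrative only; in Python neither layout is actually fast, but the access pattern is the same idea as in C++ or Rust):

```python
# Array-of-structs vs struct-of-arrays. An inheritance-style design
# stores whole objects (AoS), so an update drags every field of every
# object through the cache; an ECS-style layout (SoA) keeps each field
# in its own contiguous array, so a system streams only what it needs.
# Illustrative only -- Python makes neither layout genuinely fast.

from array import array

# AoS: one object per entity; updating x also loads the cold fields.
class Entity:
    def __init__(self, x, health):
        self.x = x
        self.health = health  # cold data, pulled into cache anyway

aos = [Entity(float(i), 100) for i in range(4)]
for e in aos:
    e.x += 1.0

# SoA: the movement "system" touches only the contiguous x array.
xs = array("d", (float(i) for i in range(4)))
healths = array("i", [100] * 4)  # untouched, stays out of the hot loop
for i in range(len(xs)):
    xs[i] += 1.0
```

In a compiled language the SoA loop also vectorizes trivially, which is a second reason ECS layouts benchmark well on entity-heavy workloads.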
There's a gem of a hidden narrative here for more junior developers: you don't always need to start over (which fails 9 times out of 10). Instead, be resourceful within your constraints, and be mindful of what your constraints actually are; they don't have to be new and shiny.
It was a pretty insightful article; I didn't know how a Flash game worked internally or how it could be reimplemented! I think it is also relevant for those migrating their projects to newer technologies. I always found this process fun (except when there's a deadline) :)
(Edit: I love your games and used to play a lot when I was a kid)
> Is it too long, too short, too technical, not technical enough? Boring? Interesting? Is it enlightening or does it just come off like content marketing?
Long, but well-organized and not ranty. You can quit after a few sections and still learn interesting things. Technical enough with enough screenshots to keep it from being "wall of text". It is a good way to share some eye candy from your games. [E]nlightening? Yes. [C]ome off like content marketing? Yes, but this is the kind that we want to read -- basically, interesting content marketing. Don't be ashamed that you need to hustle a little bit as a small software shop.
Oh, and you forgot funny. This part made me laugh:
> Flash stores its vector graphics in XML format. You might argue that XML is a poor choice for graphics data, but you weren't a product manager at Macromedia. Behold: ...
I’ll add to the chorus and say that this was a wonderful post and the type of content I hope to find when I come to HN.
I enjoyed how you explained certain technical concepts succinctly, using simple language. For example, as someone who never used Flash, I appreciated your 1,000 ft view of how Flash works. When I'm looking at a new tool or framework, I try to start by understanding the mental model behind it. Yet a simple explanation of it is often difficult to find.
I was also struck by how much technical work was involved here—you made it sound easy. How long did this project take you?
I liked it a lot! It doesn't feel like marketing at all, and I think it's a good balance between technical and non-technical (though I wouldn't mind more details).
Adobe couldn’t figure out how to monetize Flash, other than selling the Flash authoring tool to Flash designers, a dying breed, for a few hundred bucks or bundled in Adobe’s Creative Suite. Adobe tried selling Flash video DRM, which got undercut by Google Widevine, and selling licenses to unlock Flash opcodes for advanced 3D and C++ cross compilation, which was undercut by web technologies like WebAssembly.
Also, I think Flash was more expensive to maintain than Adobe’s other design applications because the Flash Player needed constant security patches. Adobe was usually happy to milk cash cow products, such as Director and Shockwave, transferring maintenance to remote teams in India.
Keep on writing!
I enjoyed reading what you tried and why you ended up with what you did.
It is fine to mention your game, I do not consider that content marketing at all.
It's interesting for sure. I'm left wondering if it would have been less work to make Flash continue to work somehow instead of re-authoring the original games.
I mean, you've probably learned a lot, but you could have learned to author games using a more modern tool than Flash instead, and that would have been useful in the future, right?
Great article! I also still use Flash by using a custom runtime that interprets the output of the "Export as texture atlas" feature. I was wondering if you had attempted to use this feature in order to rasterize the graphics?
Looks like that feature was added in Animate CC, which I don't use, so I haven't tried it.
Looking at the docs, it looks like something I would have at least tried. Might have saved me writing the rasterizer, though I'd still have wanted to atlas everything together into bigger atlases, and make binary animation files.
One downside vs the .fla parsing approach I went with is that even if you can script away the GUI clicking, you'd still have to have Flash open while the build script runs. I quite like having an exporter that is just a CLI program you can run without Flash, though it's not a massive deal.
I'd be interested to see what exactly Animate puts into the animation.json file it generates. The docs don't seem to mention it, and I couldn't find any example ones online.
Your tool looks great, by the way — have you made any games with it?
It is perfectly balanced. It can be read top to bottom in a few minutes.
If you feel like expanding a part, you can write a new post and link it from the original one.
Some other posts are born with all the tiny details, are too long to be read in one session, and I end up missing even the key points because that second session never happens.
I might have missed this in the article, but are you sharing any of your work? I know that a lot of the work was manual, but some of your tooling might be useful for anyone else who wants to go this craftsman route of converting a Flash game.
This was a great read. I distinctly remember playing Hapland as a kid. Armor Games or CrazyMonkeyGames, can't remember which one it was. So this was a great nostalgia trip.
I thought it was great, and I have no interest in games or game development, but I read the whole thing top to bottom and came away thoroughly impressed. Excellent post and work!
Also, the idea of using text/assembly as an intermediate format for binary files is pure genius. I will probably use that occasionally for similar use cases.
> Uploading achievements to Steam is a pain. You can't just define a list and give it to their command-line tools; you have to laboriously click through the slow, confusing miasma of PHP sadness that is the Steam partner site and add them one by one.
> I think if you're a big important game studio you don't have to stand for that and they give you a bulk upload tool, but I'm not one of those so I looked at the HTTP calls it was making, saved my login cookie to a file and wrote my own.
No way. Fucking hell, Valve. Sort your shit out. How many thousands of hours have been wasted because of this?
> Although I developed the game mostly on my Mac, during development Apple invented this thing called “Notarization” where if you run any app on a new version of MacOS, it'll make a network request to Apple to ask if the app's developer pays Apple a yearly fee. If the developer does not pay Apple a yearly fee, MacOS will pop up a dialog strongly implying the app is a virus and refuse to start it.
No way. Fucking hell, Apple. Sort your shit out. How many thousands of dollars have been wasted because of this?
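For anyone curious what "looked at the HTTP calls and wrote my own" might look like in practice: replay the browser's form POST with the saved session cookie. Everything below (the endpoint URL, field names, helper functions) is hypothetical; the real partner-site details would come from watching the network tab:

```python
# Sketch of replaying a partner-site achievement form submission. The
# endpoint and field names here are made up for illustration; the real
# ones come from inspecting the browser's network traffic. Only
# build_payload is pure enough to exercise directly.

import json

def build_payload(app_id, achievement):
    # Shape of one form submission for a single achievement (hypothetical).
    return {
        "appid": str(app_id),
        "name": achievement["api_name"],
        "displayname": achievement["title"],
        "description": achievement["description"],
        "hidden": "1" if achievement.get("hidden") else "0",
    }

def upload_all(app_id, achievements, cookie_file):
    import urllib.request
    cookie = open(cookie_file).read().strip()
    for ach in achievements:
        data = json.dumps(build_payload(app_id, ach)).encode()
        req = urllib.request.Request(
            "https://partner.example.com/apps/achievements/save",  # made up
            data=data,
            headers={"Cookie": cookie, "Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)  # one POST per achievement
```

The fragile part is the cookie expiring and the form fields changing without notice, which is the usual tax on scraping a site that was never meant to be an API.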
>No way. Fucking hell, Valve. Sort your shit out. How many thousands of hours have been wasted because of this?
This is to prevent abuse. Some games will have 1000s of achievements as a feature, because gamers like games with lots of achievements (and because Valve kinda encourages that with some of their store features), so forcing people to do it manually is a way to decrease that kind of activity somewhat.
Valve already capped the number of achievements to 100 because of that kind of abuse, so I'm not sure it justifies having poor UX:
> By default, games are limited to 100 achievements at first. Reach out to us via the "Support" option at the top of this page if you have questions about this.
That's a lazy excuse IMO. There's a lot you can do, like letting small developers have up to 100 achievements, with more requiring talking to partner relations.
An easy solution to this is to give point-based weights to achievements and cap the max total "points" any given game can have, allowing a developer to have many small achievements for their game if they wish. But really, none of this matters because Steam achievements are easily auto-unlocked via SAM.
> you have to laboriously click through the slow, confusing miasma of PHP sadness that is the Steam partner site and add them one by one.
Real question: can this be automated with Selenium? For a long time I was using very low-level Chrome DevTools libraries. Then I switched to Selenium. It is so impressive. It actually works, without weird 1-5% failures that you waste time chasing and then papering over with small sleeps to keep things smooth and working.
I would suggest Playwright or Puppeteer myself. They use the DevTools communication channel for Chrome(ium)/Firefox and work much better, even if you're limited to JS... there's a Deno port of Puppeteer if you want to avoid some of the Node.js bits for (IMHO) a cleaner experience.
I remember creating a simple "catch the falling X" game in Flash 20 years ago. It was basic, but had everything you'd expect in a beginner project: points, multiple game states/screens, a victory condition, basic story (which was just a screen at the beginning, like NES). The thing is, I was a literal child and I also had no clue how to code -- I did everything by following tutorials and gluing together code I copy&pasted&modified. And it worked! And people played it!
They all thought it was pretty shit, but to me it was more important that some random person across the globe genuinely picked up something I wrote, played it, and then wrote up a review of it.
I showed it to my parents and they immediately started pushing me to learn programming.. and well here I am now. Thanks Flash.
It blows my mind that no one has made a tool like that in so long. It had the same quality as Excel, where any non-technical person could just look up a tutorial, right click a bit, and make something usable. Why has the pinnacle of ease of use in game creation been achieved and never replicated since?
That's how I feel about HyperCard. Some apps were so ahead of their time that when they were axed, nothing comparable replaced them.
I believe that this is due to a tragedy of the commons in the software industry. We're all so creative, with so many dreams, but spend the majority of our lives working 40 hour weeks to outcompete each other and make rent. Then when someone wins the internet lottery, they succumb to their ego and do a bunch of stuff in business to hoard even more wealth, rather than working to reform the system which keeps so many others down.
I would very much like to write a programming language, a blogging tool, a web framework, a game, a game development tool, etc etc etc. But I never will. Most of the people reading this never will either.
So I endure. I meditate, I practice stoicism, I go to the gym. But I had to let my dreams die for my own self-preservation because opportunity cost grew to consume my entire psyche. I simply can't think about all of the other things I could be doing while I have to work. And all I do is work.
There doesn't seem to be much help coming in terms of UBI or having N people split a 40 hour workweek or forming artist communes for makers. Sure, we hear the occasional stories. But it's like we're all waiting around for a billionaire to liberate us, rather than creating a scalable cooperative system to sustain all of us right now today.
What I'm saying is that until we look beyond the technical challenges and see that the problem is subjugation, we'll never get free of the lackluster tools, because we'll never have time to make better ones.
"but spend the majority of our lives working 40 hour weeks to outcompete each other and make rent"
THIS.
I am so sick of the competition in most tech jobs / companies to climb the stupid ladder. Everyone just wants more more more and our society reinforces that. I _REALLY_ miss working on teams that didn't have that amount of competition causing mild dysfunction.
Just for some perspective, the 40-hour work week is a relatively new invention that only surfaced with the recent rise of the white-collar workforce. In my parents' day, the 40-hour work week was called "banker's hours", used in a highly derogatory "sure must be nice" tone of voice. Many or most of us are probably descended from farmers who did hard physical labor from sun-up to sun-down six days a week at a bare minimum, with the rare holiday reserved for family gatherings and religious observances. No travel at all unless you could hire someone (or left your kids at home) to feed the pigs or whatever while you were gone.
I'll take 40 hours over that kind of life in a heartbeat, thankyouverymuch. Although I am still trying very hard to make it so that I don't have to do even that much for the rest of my life.
In general, I don't disagree that it's sad that we spend such a huge proportion of our precious, finite time on our planet working. Most work without enjoyment, because that's how you put food on the table.
On the other hand, the person who cracks this nut will enjoy unprecedented fame and their name in the history books for the rest of human civilization. (And no, it's not going to happen no matter how many 140-char political rants there are on the bird site. Very arguably, those are in fact the opposite of progress, no matter which "side" you think you are on.)
And thousands of years ago it would have been feasible to live on 30 hours of work a week. I wouldn't want to give up modernity for that, but it's not as though a tremendous amount of labor was the original work week.
But how many hours does the person who lives in the RV down by the Aqua Park work? They don't need to pay for refuse dumping because they simply dump in the lake which now has Hepatitis fumes so it's not a good idea to jog around there any more. And they have huge piles of bikes.
Fail to compete and you'll be left behind in terms of compensation, performance review rankings, and career growth, the last of which also limits your future job choices. It also puts you nearer the top of the list for layoffs when corporate finances get tight.
But those things shouldn’t bother you if you aren’t interested in the “more more more” society. It’s easy to get a stable programming job in the US that pays $80k and has very little lay-off risk.
You can’t have it both ways complaining about being left behind in comp while not participating in the grind that allows those comps to go higher.
Lower-paid jobs can be worse. That they can't afford to pay you more might mean they are doing something noble, but usually it's a wannabe growth startup being badly run and pushing downward pressure on lower-waged staff to compensate.
But you have a point: at an $80k requirement you could easily freelance to earn that doing work for non-technical people. Help set up Shopify stores, etc., and not need to constantly be on a learning grind.
I don’t think the wannabe growth startup is really typical. A typical lower- but not that low-paid tech job is more like maintaining software tooling that supports mechanical engineers’ testing, or that aids insurance appraisal, or replacing ladder-logic industrial automation with something that has more IoT-type features (and still might be partially ladder logic). Those probably all pay $80k-$100k to someone with a few years’ experience. The downsides are dingy offices that you’ll have to show up in sometimes, and, for the last one, potentially lots of travel to not-too-exciting places.
Participating in the grind however means tossing away millions of employee-hours on paper-pushing jobs that don't really accomplish much, either in terms of personal development or corporate benefit. This is being evidenced almost daily by wave after wave of layoffs for the relatively well-heeled tech industry.
Because for everyone to make what they want to make is, as of current year, a pie in the sky fantasy until scarcity is solved, and perhaps not even then.
There are 8 billion humans; the world doesn't need another hundred thousand programming languages or web frameworks or game development platforms, or even indie games or fiction or music or films, produced every year when the whim strikes someone's fancy. Even if we could, who can even consume that much content?
Being a cog in a machine at least means you're probably doing something of value to someone instead of writing another book that nobody reads.
As of now we have no better solution to coordination failure except to create the AGI that will save/doom us.
You make a good point. I've latched onto some goals (like making games) because I forget that they're a stepping stone to something bigger.
The bigger thing would have been the web framework, or the programming language. But even those aren't quite big enough.
If I'm being honest, the biggest stuff is curing disease, eliminating wealth inequality.. biblical stuff. Truly I want to be an inventor and work to solve the very hardest problems.
So it feels undignified to be fixing someone's computer when I could be helping researchers to like, cure death and stuff.
But you're right, that act of helping someone is the very essence of our humanity and shouldn't be written off as something inconsequential just because it's not glorious. There is glory to be found though, a different kind of glory in that covenant.
And I totally agree about AGI.
Edit: as far as scarcity goes, I think that economics correctly attempts to allocate resources between people with unlimited wants and needs. It just never got as far as the endgame we're in now, where the primary scarce resource is time. That's an artificial scarcity, so we might have a chance to turn things around if we optimize for that instead of material wealth.
I left the tech world to “cure disease” (working on a phage clinical trial, currently treating three patients, and they’re doing super well!)
With that said, I’m consumed with FOMO and opportunity cost every day. I’ve missed the boat on being founder-level or early designer/engineer on so many projects that have gone on to raise Series B it’s mind numbing. Also, we were paid $0 bootstrapping our stuff in the first five years, and now finally we get paid, but $60k a year.
I’m saying this as a warning to all devs and designers out there. Do NOT do this. If you can get into Google or FAANG or whatever, build wealth that way. Maybe you’ll feel like a sell-out or you never scratched your itch or done something good for the world, but then you can also remember you can take a year off in Bali or whatever.
Personal opinion: I think scarcity is mostly a solved problem, in that most of us are working on projects that only exist to make someone else more money, not to fulfill any actual needs of those 8 billion humans.
There are exceptions of course.
But most of us could stop working tomorrow, and aside from the lack of income, the scarcity of resources available to us would not change.
The real problems that cause hunger and poverty in our world have more to do with logistics and politics than with a deficit of resources.
I cannot understand why you would think so, even in a first world nation.
Who would grow your food? Who would make the tools and fertilizer that enable the farmers to grow your food, or the robotics and software that enable those tools and fertilizer to be produced? Who would organize, transport, package, preserve and process the produce so that you can access it?
Apply the same questions to healthcare, education, transport, construction, entertainment, etc.
Some jobs are superfluous and arise out of coordination failure, that's true. Lawyers, administrators, salespeople, and parts of finance and government come to mind. But "most people could stop working" is an unreasonable assertion.
There are about 2M farmers and ranchers in the US. 22M health care professionals. 4M teachers. 2M truckers, 135k rail workers, 40k miners, etc.
Even at our highest employment levels, under half of those in the US are employed. They're just not counted as unemployed because they're either children, in school, retired, or simply not seeking employment.
Yes, I do believe that most jobs are superfluous - that many more people could not have to work without a material impact to the availability of goods.
You forgot to count plumbers, and everyone in the construction industry.
Also the guy who fills the milk bottles. My brother used to work in a milk bottling plant, but he was not filling the bottles by hand. A machine filled the bottles, someone oversaw the machine, and my brother ran the chemical and biological tests to ensure the milk was safe to drink. And there were some additional people they'd call in case the machine broke down (or to build a new machine).
Even if every single position in the US held today were considered essential, there would still be more people in the US who are unemployed than employed.
And every job currently held today is absolutely not essential.
That doesn’t support your point though that we’re anywhere near post scarcity and can support people quitting. Many jobs that were classified as essential during the pandemic are things people don’t have oodles of joy doing (garbage pickup, power plant coal loader, etc).
You can cut out the entire “entertainment” or whatever industry you think isn’t essential, but we still need to convince the people working the essential jobs to do it while everyone else free-rides.
That farmer count is only as small as it is because of the massive society it is built on that allows it to scale. You briefly scratched at this with miners, but you forgot all of the machinery and auto manufacturers, the massive energy industry that powers all of it, etc.
> Yes, I do believe that most jobs are superfluous
I think you just don’t understand the purpose of many jobs. If the accountant for the local grocery store didn’t exist, what do you think would happen?
Even if we somehow "solved scarcity" and robots could automatically produce all the goods and services everyone needs for free, if we don't get past the primitive "value must be provided through work" mentality, there would be no benefit.
Stuck with this "work virtue" mindset, society would find a way to make it so half the people were paid to dig holes and the other half paid to fill them in, just so we could go on with this fixation on the need to work for a living.
Parent has a point from my experience - most products I worked on never really caught on in terms of being reasonably widely used. I don't think I've ever worked on anything I'd call truly essential.
From what I've seen, most programmers are indeed spending their time building experiments that have the primary goal of making money for the person that pays them - if they create true value that's more of a lucky side effect. And lots of those experiments fail.
My thinking is this - less than half of the population in the US holds a traditional job. 150M or so. And a vast majority of those are not jobs related to "essential" work. They're largely part of the retail and administrative workforce.
And so, if a majority of less than half of the US population is working unnecessary jobs (i.e. only valuable to our economy)... how much of what we do is really valuable to us as a species?
Which is a roundabout way to say that scarcity is a mostly if not completely solved problem; working to survive is largely no longer necessary, just customary.
There's a big leap there though from "non-traditional" to "not necessary". A colleague of mine is working on a project involving ensuring that farmers can track their feed and illness statistics correctly to improve their yields. It's not a traditional job, but it's absolutely necessary to ensure that (for example) the dairy industry can provide enough milk. (And moreover, to reduce the number of people who need to be working on calculating those statistics by hand.)
Likewise, you're also assuming that jobs that meet needs are more essential than jobs that meet wants, but the point of post-scarcity is that not only our needs are met, but our wants are as well. Picard can make himself a cup of earl grey whenever he wants, as opposed to sustaining himself with the recommended nutri-pack. One of my most fulfilling jobs was working for a t-shirt printing company - in theory, nothing we did was at all essential, but it was clearly very important to a lot of people's lives. Some of the most fascinating orders were memorial t-shirts being printed for people's funerals. Not essential at all, but I would argue very valuable to us as a species.
> A colleague of mine is working on a project involving ensuring that farmers can track their feed and illness statistics correctly to improve their yields.
I doubt that's an example of a job/project that is just pure administrative burden. Even your example of a t-shirt printing company is more than just pure administrative burden - people need t-shirts, and people like their t-shirts to have art of some form. While art may be a luxury, it is also pleasant and required to some extent.
However, there are some roles that are pure administrative burden, and I tend to think the GP is correct that these roles are a lot more prevalent than we want to admit. I think the 80/20 rule could be applicable here - only 20% of the population produces 80% of the wealth (read: essential or luxury items) while the other 80% of the world are tripping over themselves in bureaucracy to maybe produce the remaining 20%.
>But it's like we're all waiting around for a billionaire to liberate us
Only in Silicon Valley, and that's probably a big part of the problem. In some of the highest paying, most creative occupations on the planet people are drooling at the feet of billionaires to save them while they slave away M-F.
Most people throughout history didn't look at the rich with anything but contempt. In Silicon Valley they're held up as geniuses on every subject. Look at how a subsection of the industry treated Elon Musk taking the axe to Twitter (politics aside).
You’re confusing “billionaires” with “a few select billionaires that built disruptive businesses from the ground up”. Nobody in Silicon Valley is “drooling at the feet” of the Walton children.
> Then when someone wins the internet lottery, they succumb to their ego and do a bunch of stuff in business to hoard even more wealth, rather than working to reform the system which keeps so many others down.
It doesn't even have to come down to greed though.
If some meme stock speculation 1000x'd, my family could live off the gains in perpetuity and I could spend all my time working in a soup kitchen, doing good.
But if I instead figured out how to turn that into even more money, I could start a new soup kitchen and help even more people, doing even more good.
But if I instead figured out how to turn that into even more money, ...
It's altruism if GP stops the recursion at some point and actually uses the money to help people and do good.
It's effective altruism if they use the money in a smart way, that reasonably maximizes the good it does for others, instead of doing something stupid or just extremely suboptimal.
“ Then when someone wins the internet lottery, they succumb to their ego and do a bunch of stuff in business to hoard even more wealth, rather than working to reform the system which keeps so many others down.”
I'd add Visual Basic to your "glue bits together" list. After working on C++ & FORTRAN professionally, Visual Basic was like a breath of fresh air. I remember thinking "how could programming be this simple". It felt like cheating.
Ofc, there's an evil undercurrent to the "any non-technical person can use it". We ended up having to support a tool that some "business analyst" had written. My eyes bled.
I remember how the last VisualBasic programmer left the first company I worked for.
On one hand, he was ignorant of everything we thought was mandatory for development: version control, code style, code structure. We, young C++/Python programmers, despised him.
But nothing we had (Qt, 2007-2008 web tech, etc.) made it possible to spit out a business UI in an hour.
To be honest, I still don't see anything comparable.
> To be honest, I still don't see anything comparable.
I think it's just around the corner for those types of applications to come back. And surprisingly, they will come hand in hand with note-taking outliners, not your run-of-the-mill no-code or low-code tools.
See for example Obsidian canvas, released this week. I see it as functionally similar to Visual Basic GUI editor, except simpler and easier to edit; and its layout is stored as JSON in a .canvas file. The open-source Logseq is adding similar visual layout templates together with data-recovery primitives.
They just need to add the core interactive data widgets (text field, text area, buttons) and expose the cards to a programming language in an end-user-accessible way, and you'll have the power of a visual language, except web-integrated and with a really simple way to add data (the outliner editor).
.NET with WinForms could do that just fine in that timeframe (and today - it's still supported), and many VB programmers ended up there. Delphi was another popular contender that's still around.
One more thing that changed is the platform situation. Everyone is looking into building cross-platform tools, and MS tools do not really cover this front.
Thus the abundance of those browser-wrappers such as Electron. The html/js stack is far from trivial but it really is truly portable.
So the new Visual Basic has to be different, and I don't think newer MS tools cover this niche.
What I loved about VB was how the basic UI was handled seamlessly and yet you could really dig in on optimizing user flow on top of it. It wasn’t pretty, but it was functional, and you could spend your time making sure it felt good.
I can still get there with modern tools, but the defaults / conventions in VB are simply superior to anything I’ve worked with since.
I feel like Python is starting to get a lot of the same bad rep because non-techies can learn it. It's actually a great language if you want to build stuff really fast. Not everything needs to crunch a billion operations per second and even if you want that, there are several good libraries for that.
Yep. Python has the potential to support an environment like Visual Basic. But there is no such project, as far as I know. All GUI frameworks are more complex, and there is no nice drag & drop GUI editor built into an easy Python IDE. At least I don't know of one. Maybe if you use PyCharm, select some framework, and find the right PyCharm plugins for it, you would have something similar, but it would still be much more difficult to use, and there is no prebuilt, ready-to-use package for such a thing.
I'm in the process of making something like VB6 (yazz.com), but using JavaScript instead of Basic for scripting. I would say that if you look at Retool, Bubble, and many other low-code tools, you will see that they are doing the same.
The problem is that the approach itself is kinda flawed. The moment you have to support high-DPI or translate the app to a different language (with text labels of different size), that whole trick of just WYSIWYG dragging and dropping widgets in a visual designer breaks down.
It isn't just the WYSIWYG element though, developing the UI elements in code is not that hard.
But imagine doing it in Tkinter and Python, probably the fewest steps to get a basic UI: there is so much to manage.
In VB you double-clicked a button in the designer and got a place to type code. All the stuff about event loops and not blocking the thread was dealt with for you. In Python, if you try the same thing, the app freezes continuously. Then you want to do simple everyday things like a date picker or progress bar, and find you need to implement them yourself?
Then having built your app in Tkinter and Python, you want to share it with some colleagues...so where is the 'compile button'?
It is easier to implement a machine learning model in Python than it is to make it into a standalone app with a file importer and a 'go' button!
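To make the comparison concrete, here's a minimal sketch (the function names and timings are mine, not from any real VB port) of the bookkeeping Tkinter requires for a progress bar that fills without freezing the window: scheduling each step with `root.after()` so control keeps returning to the event loop, which is exactly what VB's designer handled for you.

```python
import tkinter as tk
from tkinter import ttk

def next_value(current, step=5, maximum=100):
    """Pure helper: advance the progress value, capped at maximum."""
    return min(current + step, maximum)

def start_demo():
    """Open a window with a progress bar that fills without freezing."""
    root = tk.Tk()
    bar = ttk.Progressbar(root, maximum=100)
    bar.pack(padx=20, pady=20)
    state = {"value": 0}

    def tick():
        # Do one small unit of "work", then hand control back to the
        # event loop. Looping with time.sleep() here instead is what
        # freezes naive ports of VB-style code.
        state["value"] = next_value(state["value"])
        bar["value"] = state["value"]
        if state["value"] < 100:
            root.after(50, tick)  # re-schedule ourselves in 50 ms

    root.after(50, tick)
    root.mainloop()
```

Calling `start_demo()` opens the window; the point is that even this toy needs explicit event-loop plumbing that VB made invisible.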
The way translation is implemented today is horrifying.
Likewise for high-DPI. Why are they still using some linear scaling of icons instead of a super-resolution GAN and some caching?
Why can't an OS translate a single-language app on the fly using state-of-the-art translation neural nets and using visual attention to capture context? It's almost 2023 ffs, and people are still using some gettext nonsense?
The problem is that on some platforms, text doesn't quite scale linearly as you increase the point size. Techniques like subpixel antialiasing with pixel-snapping produce shapes that are more readable on low-res screens, but it also causes distortion to letter shapes, and with multiple letters, depending on the exact text and its location, it can add up to amounts significant enough to cause overflow.
It's a problem on any platform that wants more readable text on low resolution. The only platform I know of that chose precise text layout over contrast regardless of the resolution is macOS. On Windows, ClearType still behaves as I described above. On Linux, it depends on FreeType settings, which you don't know in advance, so you have to assume that it will happen.
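A toy calculation makes the accumulation effect above concrete (the 8.6 px advance width is invented for illustration; real hinting is far more nuanced than rounding every advance):

```python
# Illustrative only: per-glyph pixel snapping drifts from the
# designed width, and the drift grows with the length of the text.

IDEAL_ADVANCE = 8.6  # hypothetical fractional glyph advance, in px

def layout_width(n_glyphs, snap):
    """Total width of a run of identical glyphs, snapped or not."""
    advance = round(IDEAL_ADVANCE) if snap else IDEAL_ADVANCE
    return advance * n_glyphs

# 40 glyphs: about 344 px as designed, but 360 px once each advance
# is snapped to the pixel grid. That 16 px of drift is enough to
# overflow a tightly sized label.
```

With 0.4 px of distortion per glyph, a few dozen characters already overflow by a visible margin, which is the effect described above.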
I don't really know Delphi/Lazarus well enough... but I suppose it is similar to the following: for VB there is WinForms and .NET, which as someone above pointed out is 'much more powerful', but somehow not what people want!
I spent a huge amount of time coding in VB5 and VB6 in the 1990s and early 2000s, during high school and my first years of college, for hobby projects. It definitely felt like the most fun language, and I was far more productive than in C/C++/Java/Pascal. With some work, areas that needed to be optimized could be made pretty fast when compiled. Everything was integrated, and it had a good debugger.
I didn't get the hate. Maybe software engineering folks don't want coding to be fun and productive.
Yep, I was going to say the same thing. I write small internal tools for corporates, and my team in 2020 has a fraction of the productivity of my team in the 90s. Lots of tools, and the complexity is nuts these days.
it's a generalized trend. Python is the new BASIC, but Python is SO much more complicated than BASIC. Web development was easy, now it's impossibly complicated, etc etc...
Rebol, Red, tcl/tk are good and easy programming languages, but kinda niche and obscure. I have no idea why computing is getting more complicated instead of easier.
I only disagree on web development. It was never easy. The web was always a mess, and nothing really worked. Today it's a mess, but you can make it work if you know what you are doing.
But yes, it looks like the web was a trendsetter that pushed things into becoming harder without any good reason. Or maybe the web was just following the trend of the toolmakers all going out of business in the early 00s (thanks largely to Microsoft).
Ironically - given the article we're commenting on - a lot of the early problems with the web were solved, and I hesitate to say it, solved well, by Flash. Its insecurity is what really pushed us away from it (well, and that it was a closed-source Adobe project).
I miss the days when JQuery was the only JS library a person needed to know to do professional web dev. I stuck with web dev through the peak of Struts, but the constant changing of frameworks with (imo) minimal benefit to user experience pushed me to things besides web dev.
A lot of large companies did a tremendous amount of back office using VB6 applications. I remember visiting Disney and seeing it all over the place in the late 90s.
There was nothing particularly easier about VB6 compared to the later VB.NET. VB (5/6) was, effectively, a scripting language designed around the COM object model. .NET originally started its life as "COM 2.0", with VB support consistently a part of the plan, and gradual feature creep turning it into what it became eventually.
> It blows my mind that no one has made a tool like that in so long. […] Why has the pinnacle of ease of use in game creation been achieved and never replicated since?
FWIW it never went away¹ and targets standards-based runtimes these days. I'm sure there are many kids every month having the same kind of epiphanies with it!
That being said, because Adobe Animate is marketed as a tool for professionals rather than kids/enthusiasts, orders of magnitude more kids are having the kind of experience you had with Minecraft and Roblox instead.
There is a small but major difference with Flash vs Minecraft and Roblox. Flash let you create something from scratch (or close to it). You aren't just modding a game, you are making a game or a cartoon. It was fantastic. Minecraft you are limited to the world of minecraft.
> There is a small but major difference with Flash vs Minecraft and Roblox. Flash let you create something from scratch (or close to it). […] It was fantastic.
Absolutely fantastic, yes! Roblox does allow you to make a game from scratch — even though the primitives are generally less primitive and the constraints much different — but my point is just that if you're wondering where the spirit of dj_mc_merlin's experiences went, two places where it can be found in spades are in the Roblox and Minecraft communities. (Source: Am a parent experiencing this vicariously via my kids.)
>Roblox does allow you to make a game from scratch
...with the caveat that once Roblox (the company) folds, all the games will be gone.
...and that since Roblox changes the code all the time, you have no idea whether your game will look the same in a year. Or even run.
So yes, while I understand what you mean by the spiritual successor.. Roblox takes everything that made Flash bad (i.e. dependence on a clunky binary that one day may not be supported), and makes it worse.
Source: worked in Roblox when a new shiny material pack was rolled out, which replaced existing materials in existing games, messing up the look of games where materials were used creatively.
> ...with the caveat that once Roblox (the company) folds, all the games will be gone.
True, and I guess Flash is a useful lesson in that sense, although no platforms I jammed with as a kid are around either. It makes me wonder if Godot apps written today will work on the web 20 years from now.
The number of commercial games that used (and actually still use) Flash is much higher than many would believe. Jackbox is probably one that many here have played in the past few years. Some games used it only for UI, like Borderlands.
For context, most of these games use(d) a Flash derivative called Scaleform for their GUIs: https://en.wikipedia.org/wiki/Scaleform_GFx — it's discontinued now but was indeed hugely popular.
A lot of games today use an HTML-powered UI library from Coherent Labs which can still be authored using Adobe Animate (i.e. what the Flash editor turned into): https://coherent-labs.com/
It could also run in its own standalone environment you could redistribute. IIRC you could even export something like an EXE and bundle the flash player with the installer.
You might like Svija — you can easily animate SVGs using Illustrator. It's not as powerful yet as Flash, but that's our goal. Disclaimer, I made it ;-)
I must admit, this isn't the most inspiring landing page for someone like me, who could be your exact target demographic (long-time Illy user and someone who likes to tinker with new web tools). I don't really 'get it' and (this is entirely personal) the design looks quite dated.
I'll bookmark it, share it and keep an eye on it though - Good luck!
I remember when Minecraft was released a bunch of kids had their first Java experience due to the modding scene. Funnily enough I didn't do that since by then I was a C enthusiast and had an irrational hatred towards Java (oh how the turntables..). I wonder if that's still possible now that Microsoft has changed the codebase IIRC?
There are 2 editions of Minecraft. One is the original, now called the Java edition. They definitely still work on it, because despite not running well on Java > 8 for a long time, it now does. But afaik, they're still stuck on LWJGL 2.
The newer one written in C++ was first called "pocket edition" and targeted mobile. It's now called "Bedrock edition" and is also the version that has RTX support.
It'll be hard to move the modding community over to the Bedrock edition. I personally don't have any knowledge of how limited modding is in the latter.
Some of the better years of my life a little over a decade ago were spent on mofunzone playing the stick figure fights with friends. I remember bullet time fighting (matrix clone with swords, guns, slow motion) and there was also an amazing action narrative series called Ray with Southpark themed characters and one of the craziest platformers I ever played called N. I managed to snag some of these a while back and preserve them offline and can still enjoy them using https://ruffle.rs/. We lost so many good things from the flash era.
Wow, hitting a bunch of memories there. Bullet time fighting was amazing, and I had a bunch of friends who were way into N ("way of the ninja") - built a couple of the most popular levels for it. Which also was pretty cool that they were able to do that, come to think of it. I remember one of them dabbling into creating games with Flash as well, and when he got stuck, I encouraged him to reach out to the creator of N to ask how he had solved that. (I think it was collisions with curved floors.) Got a nice message back that explained to him how to do it. Good times.
I really hope all these games are preserved somewhere. There are big movements to preserve old console and PC games, but Flash games, and possibly to a bigger extent mobile games, seem more overlooked.
I think there's a tendency to look back at these games as trash, but it was a pretty wildly experimental time. Indie games as we know them now weren't really a thing yet.
https://scratch.mit.edu/ seems like a social network for kids making basic games and learning to code. They seem to encourage forking and learning from existing projects.
I took a similar path to you. Sometimes I feel the best thing my high school did for me was having Flash, VB6 and HyperStudio installed on the school computers.
Have you tried Processing? It gets close to ActionScript and canvas. We just need to put Processing on a time-framed canvas, with a Figma-like editing experience.
You should look into Roblox. I think a lot of the kids who want to create are doing it there or in Minecraft. There are really some pretty expansive games created there.
I miss AS. I used it to build a gis engine that packed 5D into what superficially looked like 2D - item renderers helped make that happen. It is aggravating to no end to watch people pile on Flash for its faults - they either don't understand what was lost or just don't want to understand the loss. Steve Jobs is in this group.
Flash really was great. It's unfortunate that they killed it. The Miniclip and animated-shorts days were awesome. I used to love finding cool animations on DeviantArt.
I was expecting "Attempt 3" to involve Bluemaxima's Flashpoint, and was surprised not to see it mentioned. Does anyone know why that wouldn't have been an option, or is this essentially what Attempt 1 would have involved?
For anyone not familiar with Flashpoint, it's a project to preserve old flash games and animations and keep them playable on modern platforms. It's open source and includes a huge library (including it looks like Hapland 1-3).
Edit: Reading a bit further down the article, it looks like they were able to make some big improvements by building their own engine, like supporting widescreen and higher FPS, so that sort of answers my question!
I still play games on Flashpoint. Hell, I speedrun and showcase games in Flashpoint during some regular events with my other friends playing ""real"" games. It's not the best UI, but you do have an easy search function to find content you might like.
Gotta be careful to avoid the porn games though. Flashpoint archived some.
It's like Steam for Flash, Shockwave, and Java Applet games. I highly recommend it. Though I do wonder if it's going to survive the transition away from x86...
Flash could do 3D 20 years ago, before WebGL and threeJS, and you didn't have to be a Javascript (or ActionScript) expert to build them.
Nothing really exists that beats Flash, partially because the switch to mobile killed Flash's momentum. It also led to the switch in Internet content from Flash animations/toons/games to Youtube/Patreon/Twitch videos. It was easier to make money with weekly videos than trying to get Adsense bucks by making a game every few months.
Flash was actually pretty great. The big problem with it was the "oh, that thing? it's still around?" attitude of Adobe, which led to massive security problems.
Actionscript was a pretty good language, certainly around the level of what Javascript has become. It was poor-mouthed for the same reasons Javascript was (is?), but now we use Javascript to run 90% of the Web.
I remember back in the early 00s playing with the PHP Flash module. It was a really interesting environment that had a lot of promise. But, alas, it was doomed.
Similar experience using the showcase "AsteroidX", albeit with no "unresponsive" dialog; it just wouldn't load. It looks like the root of that issue is that it requires third-party cookies, which modern browsers block by default, and the error isn't bubbled up to the user. https://www.construct.net/en/free-online-games/asteroidx-650...
> I don't know if there is anything as good for SVG/HTML.
There's not, because the DOM isn't suited for complex animations. People manage to animate SVGs (see GreenSock etc.), but it's a pain in the nether regions.
One college near me still teaches 101 classes through Flash. When presented as an educational tool with disclaimers that the end result is not secure, it's a perfect product to teach with. You can take someone who does not even know how to turn on a computer and by the end of the class they understand what programming is, how to draw on a computer, vector vs bitmap, etc.
Wow! Talk about a love letter at the end of 2022! The dev has identified a gap and applies time and effort to solve it through building it.
Halfway through, the dev complains about the original devs using XML, because it is not efficient: "Hey, I'm not complaining, it makes my job easier."
Well sounds like the original developers of Flash made a good decision. If it makes it easier to parse the content at a later stage, I'm willing to call it a win!
Yup, this is classic Unix style! Having data-centric interoperability among tools also helps you port to new platforms more smoothly, as opposed to throwing away your code or rewriting from scratch.
Examples:
1. Using text formats embedded in XML -- he took advantage of this when reverse engineering
2. Generating text ASM -- he said this aids debuggability, and allows using existing ASM tools that he didn't have to write
3. Generating C++ -- taking advantage of the type system, as mentioned, and a huge array of other tools (profilers and debuggers)
It's data-centric rather than code-centric. It's an interoperable and transparent architecture, not a monolithic one. (Which is not surprising, because Flash itself was made for the web.)
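To illustrate the data-centric point, here's a sketch of how little code it takes to pull geometry out of an XML scene file with a stock parser; the element and attribute names below are invented for illustration, not the actual Flash format:

```python
import xml.etree.ElementTree as ET

# Hypothetical snippet in the spirit of Flash's XML scene format;
# the element and attribute names here are made up.
doc = """
<scene>
  <path id="floor">
    <curve x0="0" y0="10" cx="5" cy="0" x1="10" y1="10"/>
  </path>
</scene>
"""

# Because the data is plain text, any XML library can walk it;
# no Flash runtime is needed to recover the geometry.
curves = [{k: float(v) for k, v in c.attrib.items()}
          for c in ET.fromstring(doc).iter("curve")]
```

That is the reverse-engineering advantage the article benefited from: a text format means the existing tool ecosystem does most of the work.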
> The vintage Flash UI is great. Buttons have edges. Icons look like things. Space is well-used. It's amazing! Using old UIs makes me feel like an archaeologist discovering some sort of forgotten Roman technology. The lost art of UI design. It's neat.
This was one of my favourite parts of the article. And the real piece of the pie comes right after it: there's a screenshot with like thirty icons, and they're different enough that you can tell one from the other.
For years I've had a "Google" folder on my phone which links to all the obligatory things - Drive, Maps, Calendar, Meet, etc.
Ever since they made this move to every icon being a minimal shape with the 4 Google colors, I haven't been able to quickly differentiate between app icons. I have to read the labels half the time now cause they're so non-indicative of the app's purpose.
I find it somehow intriguing how you already pre-destined this imaginary book to be great… That stood out to me for some reason. How about a great title for a shitty book?
Funnily enough, The Lost Art Of UI Design sounds like a potentially very mediocre book to me. One of those stretched-out "self-teaching" design guides that listicles will recommend you. It chooses a couple of very specific rules of thumb, gives way too much importance to them, and somehow manages to water them down to 250 pages. Probably written either by someone who's a skilled writer but barely knows enough about the industry, or an experienced specialist in the field who unfortunately can't write to save their life.
I get (and agree with) this idea in general, but these buttons are still present in any Adobe apps (esp. Photoshop) and they're not going anywhere.
Hell, I don't think they're original to Flash to begin with, more like borrowed from Photoshop when Adobe bought Macromedia (this is purely my speculation; but Photoshop is much older).
It's like the entire industry at one point decided to optimize UIs for the first 30 minutes of usage, and not for the years users spend in apps once they get familiar with them.
"oooh let's not put too many buttons to scare users away", "they got 22 inch screens, let's just waste space for no good reason, it looks modern and airy"
Yes. What we see in UI design today is a consequence of game theory at work. It's there to optimize business. In fairness, you don't want to scare users away. I think, though, that many people equated it all to "clean looking" without realizing that removing micro-interactions is a net negative....
I think touch screens are a bigger factor. You can't have small buttons in a touch-screen UI, because they must be targets for your finger. (OTOH, swipes in a touch UI are a lot easier than mouse drag-and-drop. So the optimal touch UX probably has lots of drag-activated pie menus. And confirmation flows for potentially destructive actions, to cope with the lower accuracy of touch interactions.)
Yeah, if my memory serves, a lot of UI design started to get airier around the time smartphones became more popular. Even on desktops, which I think may just be because large buttons and text became trendy, and although some may deny it, I think a lot of UI design is just following trends. It takes a good designer to not just blindly copy things because they look cool.
The problem is these "designers" bring those touch-optimised UIs to desktop. And so you end up with hamburger menus on 4K screens, tall narrow alert dialogs on macOS, etc.
One of the early attempts to balance those two goals was an option in the settings to switch between Beginner/Intermediate/Advanced mode. Never really see that anymore.
That's the discipline of data-driven business. If 20 people use your app and despise it and are depressed by it, that's better than 19 people using it and loving it. No sentimentality allowed.
How could anyone possibly measure that an app is despised and depresses people (or vice versa)?
Usage can't be it, because I use a ton of software that I despise and get depressed by, but only because I get paid to do so, or the app is so niche there isn't a viable replacement.
> How could anyone possibly measure that an app is despised and depresses people (or vice versa)?
You'd have to actually talk to your users. But that requires genuine human investment, so it doesn't scale. It doesn't come in the form of a third-party SaaS with a pretty dashboard, generous free plan, and integration via single <script> tag.
In short: they could measure this, they just don't bother to.
UI is now optimized for PowerPoint. The #1 goal of all UI work is so the designer and/or product manager can get an "oh, that looks nice!" showing off screenshots to an interviewer or to someone higher-up who'll help them get a promotion. Trendy (you don't want to look outdated, do you?) and pretty are what matter, not usability.
Working with Flash was like working with future technology, back in the day. You could build amazing things with it, things that were not possible using standard browser APIs. In fact, Flash led the way and was the prototype for what browsers can do today.
If you cared about what you built, with clever hacks and bitwise performance tricks the Flash runtime could run your code efficiently. I remember developing an app which used Box2D, camera-based gesture control with sound effects and background music, all running simultaneously, reaching 60 FPS. In other projects we used software-based 3D (à la Papervision3D); Adobe dropped the ball, and Molehill/GPU-accelerated 2D/3D arrived too late. Perhaps it's not common knowledge, but we could develop true cross-platform apps, compiling for different targets (SWF, IPA for iOS and .app/.exe).
That letter from Steve Jobs destroyed it all, many talented developers left the Flash world at that time. It was a bit depressing to see all the (unjustified) mainstream hate for the Flash platform that started to appear at that time, which felt bad as there were many of us that put a lot of time, care and effort in creating amazing stuff with it. Thanks Flash!
> That letter from Steve Jobs destroyed it all, many talented developers left the Flash world at that time. It was a bit depressing to see all the (unjustified) mainstream hate for the Flash platform that started to appear at that time,
You might not have seen it but the hate for Flash was well-established by then, solely due to Adobe’s gross mismanagement of their platform. Flash was what crashed and took out your entire browser session. Flash was what made drive-by compromises so easy and common. Flash was what popped up annoying “you should manually update” prompts because they sat out the automatic update trend for nearly two decades. Flash was also what made your entire computer visibly slow down because it was so inefficient - they weren’t an option on phones due to the UI differences but performance and memory alone would have ruled them out in any case.
Flash was also painful to develop for. Yes, you could do things which browsers couldn’t do back then, but it came at the cost of paying Adobe thousands of dollars for shoddy, unsupported tools[1]. And while there were a handful of ways in which AS3 was ahead of JavaScript, there were many areas where it was behind, and unlike the browsers it stayed there. It was also really common to find bugs in Adobe’s libraries, which meant you had to build a complete replacement or wait years (or forever) for them to fix it.
People made cool things despite that but I attribute almost all of the blame for Flash’s demise to Adobe. They could have cut their executive bonus pool by 10% to hire a QA team and they’d have had a much better chance at long term success. Instead, they let the platform stagnate to the point where one of the early Chrome selling points was that when Flash crashed you’d only lose that page!
1. I bought a Flash license once. I naively filed bug reports for reproducible crashes in the player and IDE (the debugger was especially unstable), as well as compatibility and correctness issues with their various libraries. Each time, I got no reply before the next major release and then I got a generic “please pay $1,500 for the upgrade and let us know if we fixed it” message.
At the time it was unfortunately true both that Apple needed idle CPUs for battery life and that Adobe had no intention, and probably no ability, to make that happen with Flash.
No way! This is the creator of Hapland... As a kid playing flash games on websites all day like Addicting Games, Armor Games, etc. you don't really think much about the people behind these projects. Then time goes on and they cement themselves in your mind as a myth.
The fact that I now get to read and understand articles by this legend, and work in the same industry vertical as him... what a privilege!
Thanks for your work. Hugely inspirational to my career.
I still believe that Flash conceptually was a brilliant idea done terribly. It was slow, buggy, underdeveloped, and a security nightmare.
But the idea of having a single file that could bundle code, audio and graphics, that was dead easy to build, and that could run independent of specific browsers and operating systems?
That is something magical. It is harder to build that kind of content today, and near impossible to distribute it at the scale it reached 20 years ago.
Theoretically you could run a Flash file on a Linux system via Konqueror browser and it would be identical to that on Internet Explorer on Windows 98. This is something we have lost.
It is a shame that Adobe/Macromedia treated it like absolute trash. A free/open solution that had the same good properties would have been brilliant but it never happened. Instead we have the whole HTML5+ stack. I mean it is cool and has done some amazing things for the web, but it is also a massive pain to manage. One browser update and boom, something has broken - time to get digging on what happened! Now it works on Chrome but not Firefox!? Oh dear.
I don't miss Flash itself, I mean to be charitable, it was a mediocre product at best. But that vision implemented in a sensible fashion could have been wonderful. Could have allowed for the decentralized web to continue on a bit longer than it has.
> I still believe that Flash conceptually was a brilliant idea done terribly. It was slow, buggy, under developed functionality and a security nightmare.
Relative to what though? There was nothing else with its ubiquity, ease of use and deployment, and breadth of functionality.
It pioneered and enabled so many things that we take for granted on the web today.
Personally, as someone who used it extensively, I don't feel it was particularly buggy, especially relative to the alternatives or developing in the browser. It could have performance issues, if developers did bad things or pushed it too hard, but you could say that for just about anything.
As far as security, it was the single most ubiquitous runtime on the web, and thus was a prime target for hackers. Yes, it had security issues (although every web runtime did), but it's not like the rest of the web / browsers were particularly secure themselves.
fyi, I worked for Macromedia and work for Adobe now, but these are my personal opinions.
Maybe I am just remembering it differently, as this was more than a decade ago. But it seemed like almost every week, heck almost every day, another zero-day would be discovered. Yes, this was the peak time of zero-day exploits, as every system was being hammered from all sides, but the Flash name came up far more often than anything else I can recall, except maybe general Windows XP/7 issues.
As for performance, that was a major issue personally. I recall a time when I had nested items a few layers deep so that I could have a character with movable joints, and the system was buckling under the workload. Going from fully rigged models running smooth as butter in 3DS Max to the same system crumbling under some basic vector graphics left a bad taste for me. It just always felt like it was dragging its feet compared with what these machines could achieve.
I don't recall any specific bugs, only the feeling of them. Sorry I am so light on specifics. One minor bug I do recall: as you rotated objects, there must have been a small rounding issue that meant items would slowly shrink as you rotated them multiple times while going about your process. It would take hundreds of such movements before it became noticeable. It was a funny little thing, but it would lead to some projects becoming a bit frustrating.
I say this as someone who absolutely loved Flash. I wish it had kept on going regardless of those issues. It just felt like it needed a Windows Vista to Windows 7 moment, where it was taken to task and the team got behind it to refine the project into something much leaner that still had the same functionality. Instead it felt like it was treated as that thing that would get barely any attention until it died from atrophy. That mobile wasn't tackled properly is probably the thing that ended up killing it so quickly.
As someone else said here, Flash felt like a vision of the future. Where computers were tools of creation made easy not just consumption.
It was a pioneer in so many things and I wish it was still a part of the web. It made interactive media distribution so dang easy! Build it in Flash, add a single line of HTML and BOOM - multi-platform audio visual experience that could scale like a champ!
Yeah, video kind of sucked; early YouTube performance is a good example, as it would bring my G4 Mini to its knees! But for everything else it was such a visionary idea that we have lost.
On a tangent.
This is something that I feel a lot of modern technology has forgotten. That vision of not just new tech, but tech that is enabling even to those that have no interest in the base technology. Flash had that vision, not just of creation but of the ease with which it could be spread. HyperCard had that vision. Heck, BASIC had that vision. As Steve Jobs always pushed in the early 80s, the computer is a bicycle for the mind. Nowadays at times it can feel like the sedation of the mind.
Every time I see another link here on HN about Ruby, GCC, Clang, some JS framework, Tensor-something: I get it, in those fields it is all very cool and useful. But it is also very much for those that work in the weeds. Great for those that do that, but it also feels so lacking in that "Wow!" factor that I think brought a lot of us into this field. I hope we rediscover that one day, and soon.
About a decade ago I worked with someone who was on the early Flash team.
One of the tradeoffs of the time was that they had to work very hard to crunch down the binary size of both Flash itself and Flash files, because people everywhere were extremely bandwidth-limited: dialup was the most common way of connecting to the internet, and not even 56k was necessarily ubiquitous. And of course, compiler technology hadn't come as far as it has nowadays.
The weight of all the decisions that made Flash able to succeed eventually worked against it, but for its time, it was an extremely good implementation that made the web as a whole more interactive.
I'm happy it's been superseded, but I think that it was the right product for its time.
I think what Flash did well was its neat component-based framework. It was easy to understand, easy to modify, and easy to build your own set of controls on top of.
The runtime engine sucked. It was too slow and therefore power-hungry.
But the framework itself was much better than today's Android, React, and others.
That performance was so heavily tied to screen resolution was funny to see. It could crunch a reasonable amount of vector information, but the rasterisation would burn through processor cycles.
There were some projects when I was running on a G3 iMac where I would drop the render screen down really small to see how it would perform on a more powerful machine. It felt like if they even managed to get decent GPU acceleration possible then they could have solved a lot of these issues.
I'm not offering an assessment of the overall sentiment, but it's been a very long time since I've had to think twice about browser interoperability, and it would be an exceedingly rare occurrence for a browser update to cause me any problem.
It is a funny thing. This was during the big push for smartphones and apps, with viable streaming changing tastes and social media coming to dominate. That combo simply knocked it off its pedestal in no time.
The technical legacy of Flash made it a poor fit for touch screens. Apps filled that void with astounding speed.
This part is just a vague opinion I am riffing on: with mass-available video, an ocean of video content may have changed tastes enough that the demand for interactive content diminished somewhat. Social media also killed off the dedicated website for a lot of places that used this technology, and with it the demand for this stuff.
Once Google via Chrome closed the Flash hole, I do wonder if they will ever let anyone else try to get in on that space. Time will tell.
Flash is great and tons of devs still use it. I also use Flash in my custom game engine (the one powering The End is Nigh and the upcoming Mewgenics), with a different approach from the dev in this article (seeing as these are new games and not ports of existing Flash games): I load SWF files as the resource files for art and animation in my game, and render them as vectors. Flash puts a lot of information in those files, and I parse just enough ActionScript bytecode to hook and trigger C++ functions from my ActionScript parser. This lets us use Flash almost exactly the way we used it for making Flash games back in the mid 2000s, with all of the interesting workflow tricks and hacks that made it so nice, while writing all the actual gameplay code in a real language and rendering with OpenGL so it's actually fast.
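Roughly, the hook idea is a registry mapping ActionScript call names (pulled out of the parsed bytecode) to native handlers. A minimal sketch of that pattern; all the names here are hypothetical illustrations, not the actual engine code:

```cpp
#include <cassert>
#include <functional>
#include <string>
#include <unordered_map>
#include <vector>

// A call as it might come out of a bytecode parser: the function name
// from an ABC call op plus its stringified arguments. (Simulated here;
// a real engine would walk the SWF's actual bytecode.)
struct ParsedCall {
    std::string name;               // e.g. "playSound"
    std::vector<std::string> args;  // arguments recovered from the bytecode
};

class ScriptHooks {
public:
    using Handler = std::function<void(const std::vector<std::string>&)>;

    // Gameplay code registers a C++ handler for an ActionScript name.
    void registerHook(const std::string& name, Handler fn) {
        hooks_[name] = std::move(fn);
    }

    // Dispatch one call parsed out of a frame's script.
    // Returns false for calls nothing has hooked.
    bool dispatch(const ParsedCall& call) {
        auto it = hooks_.find(call.name);
        if (it == hooks_.end()) return false;
        it->second(call.args);
        return true;
    }

private:
    std::unordered_map<std::string, ScriptHooks::Handler> hooks_;
};
```

Usage is then just `hooks.registerHook("playSound", ...)` at startup and `hooks.dispatch(call)` as the timeline plays, keeping the authoring workflow in Flash and the logic in C++.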
I recently used Ruffle [2] to get some Flash applications [0] working in the Pro version of my web browser [1], which is specifically designed to be remotely accessible and embeddable in an iframe. To run Ruffle on pages that require it, I utilize the Chrome Remote Debugging Protocol [3], similar to how a Chrome extension content script operates. Ruffle itself relies on WebAssembly and runs smoothly. It's been exciting to see the audio and video functionality of these old games restored and being able to play them again.
They did mention it. They use it for some of their older games, but it seems like Ruffle isn't yet feature-complete with ActionScript 3 and thus can't be used to run their newer games.
While Ruffle's AS3 support is still lacking a lot of features, as of a couple of months ago it's been able to play some simple games that require it. The build used on the author's site is from 2021, and I just checked that the latest build is able to play several more AS3 games hosted there.
That's great news! I've been meaning to write a cron script or something to fetch the latest Ruffle every week or so, so thanks for reminding me to do that. Thanks also for your work on Ruffle, it's really great.
I'm waiting for it to develop further. I'm very excited, especially because I still play Crystal Saga every day. I don't know why, but it's got something that I haven't been able to find in any other game.
But sadly, I haven't been able to make Crystal Saga work using Ruffle
MMOs are pretty much the last game Ruffle or any in-browser emulator will get to support - not just because it's likely some of the most complex piece of code you can find, but also because MMOs are likely to use sockets, which AFAIK can't really be accessed in modern browsers at all.
It sounds like adding those features to Ruffle would have been three orders of magnitude easier than writing a Flash player from scratch.
I understand contributing to OSS is a pain (I often end up with half implemented features in my own branch and never manage to merge them upstream) but he could have saved himself some trouble.
It seems to me like a very different skillset required to reimplement a runtime like the SWF player vs hacking together an alternative FLA compiler that's just good enough to work on your own games.
Ruffle still doesn't support Actionscript 3.0, I suspect the task is not that straightforward.
You mention that while using Flash you found it missing basic features, could you give some examples?
One of my long-term projects is to build a Flash clone (Flash-the-authoring-tool). So I'd love to hear from people who are still using Flash, to see what features / pain points are most important.
* You can't set keyboard shortcuts for some common actions like changing the brush mode between "paint behind", "paint inside", etc. You have to click the button every time.
* No way to import or export any standard vector graphics formats.
* No easy way to just add custom properties to objects on the stage.
* No outliner. Unless you put exactly one object on each layer, it's very hard to find and select objects on the stage that aren't large and obvious. Invisible objects? Nightmare!
* You can name frames, but there's no menu to quickly select one to jump to. In a 500+ frame timeline, you've just gotta manually scroll around to find the frame you're looking for, even if you've named it.
* You can't set an alpha component for the grid to make it semitransparent.
These are all missing in Flash CS6, the last thing that was called "Flash". I don't know if any were added in Adobe Animate because I don't like to rent software.
There's definitely room for a Flash authoring replacement. Every now again I look at all the programs I can find in the space and try them out, but nothing reaches Flash yet, in my opinion, despite its limitations. Best of luck with yours!
Have you tried writing any custom plugins/editor panels to use in the editor? The process is pretty arcane at first, but eventually you are just writing html+javascript and have access to a very powerful editor API.
When I was working with Animate a few years ago I wrote a plugin to navigate around the timeline to find tagged frames and export things.
Yes! In fact, out of those problems I listed, I always thought the timeline navigation (and maybe the outliner) would be good candidates for a .jsfl plugin, I just never got round to doing it. Good to hear it's doable!
I really enjoyed this, thanks for writing the article. Just the idea of bits of Actionscript on Frames and figuring that out would have made me quit never mind everything else. I miss Flash. I used to love the Flex Builder/SWC setup as I used to be really productive with it.
> Although I developed the game mostly on my Mac, during development Apple invented this thing called “Notarization” where if you run any app on a new version of MacOS, it'll make a network request to Apple to ask if the app's developer pays Apple a yearly fee. If the developer does not pay Apple a yearly fee, MacOS will pop up a dialog strongly implying the app is a virus and refuse to start it.
> For this reason, Windows will be the first and maybe only release platform for this game.
this seems like the wrong answer to the (arguably legitimate) concern posed. the first and maybe only release platform for this game, based on the reasoning, should have been Linux. It's not like Microsoft can't decide to arbitrarily force exe's to phone home for "security" reasons, but good luck getting the Linux kernel, or any distro, to do that.
I'm 100% convinced that Linux will be the way all software will be preserved in the future; Proton will take care of windows exe's and VM's will take care of Macs, and a deterministic OS like NixOS will let you define exactly what any piece of software needs to build and run, forever. The incentives just line up.
If Proton is accepted as the solution to Linux gaming what is the reason for a developer to build for Linux over Windows?
Funnily enough there are parallels with that to why Apple didn’t want Flash on iOS. If you could make it in Flash and ship to both why code an iOS app.
Same thing, zero reason to make a native Linux build while that works fine.
Initially, Apple actually looked at an iOS version of Flash. This was around 2009/2010 IIRC. There were special dev iPhones that ran Flash. Adobe didn’t want to optimize it because they thought the iPhone would fail. Apple thought the performance was garbage and the communication mostly ended. I got to play with one for a few days and the performance was terrible. Even simple games lagged and the touch points were a disaster.
I was working heavily in Flash around this time. I remember Apple saying the performance of Flash was too bad to consider using it on the iPhone, and around 3 months later the Mac version of Flash magically ran 4 times better, when for years and years it had always run a lot slower than on Windows and Adobe pretended there was nothing they could do about it.
It's a good point. Turns out that the Windows API is more stable than whatever Linux offers; I have multiple games now that have native Linux versions on Steam but whose Windows versions running under Proton are simply less buggy (and also synchronize cloud saves with my pure-Windows machine).
A native Linux build is still better performance than Proton, so with the popularity of Steam Deck it is worth the effort recompiling for Linux for higher performance games.
Also the fact that macOS doesn't actually block unnotarized software... many programs are unnotarized and just need to give the user the one-step instruction to bypass it.
Also the fact that you only need to pay the £99 fee once; after the executable is signed, notarised and stapled, that's it. I think there might be a network request if the executable hasn't been stapled, but it's only to check that it has been notarised in the past, it doesn't check for an active Developer Subscription. At least this is my understanding of the situation, I would be happy to be corrected if any of that is wrong.
Also - you have to sign code on Windows too! If you don't then Edge will try as hard as possible to stop the user opening it, and AV scanners can get very curious about what you're doing. Maybe Steam doesn't care if your game is signed?
I didn't quite follow the logic around notarization either. Steam imposes more requirements and takes a slice of revenue! Notarization is a lot cheaper and faster than that. But I guess the idea is that Steam gives you features, whereas notarization is just a tax.
Yes, and I also never felt the macOS warning about it "strongly suggested it was a virus." That was pretty much hyperbole. And I could also argue that Apple expecting devs to pay a nominal fee to get their app officially code-signed creates a barrier to entry for bad actors who are likely to not want (or be unable to afford) to pay such a fee, especially when that simultaneously gives control to Apple to block the malicious code en-masse once it is discovered.
> In the end, I settled for a bit of a hack. My exporter reads the ActionScript from each frame and applies a bunch of regexes to attempt to turn it into C++.
Haha, I lost it at this part. Brilliant read!
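For flavor, a regex-based translation pass could look something like this toy sketch. This is illustrative only, nothing like the author's real exporter; the two patterns are hypothetical and handle only trivial statements:

```cpp
#include <regex>
#include <string>

// Toy regex-based ActionScript-to-C++ translator. Real frame scripts
// are far more varied; these two rules just show the shape of the hack.
std::string translateFrameScript(std::string src) {
    // "var x:int = 5;"  ->  "int x = 5;"
    src = std::regex_replace(
        src, std::regex(R"(var\s+(\w+)\s*:\s*int)"), "int $1");
    // "gotoAndPlay(...)"  ->  "timeline.gotoAndPlay(...)",
    // retargeting the global timeline call at some engine object.
    src = std::regex_replace(
        src, std::regex(R"(\bgotoAndPlay\()"), "timeline.gotoAndPlay(");
    return src;
}
```

The charm (and fragility) of the approach is exactly this: it works only because the author controls every script being translated.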
How much time did it take you to build the whole thing?
Git log tells me I started in November 2019, so 3 years, but the pace of work definitely varied a lot over that time. Sometimes I was putting full days of work into it, sometimes stolen moments here and there between paying work, and some long stretches of just not working on it at all.
Many 2D illustrators, industrial designers, and graphic designers considered it one of the best 2D drawing tools ever made.
One of my roommates in the heyday of Flash was an industrial designer who was paid by an Italian business to design aftermarket wheels for cars, which he did in Flash. Eventually he figured out a Flash > Studio Max workflow he liked and finished projects in 3D that way.
That guy could draw anything in Flash and preferred it over Illustrator, Photoshop, and Fireworks.
Funnily enough that was actually the original goal of Flash - back when it was called SmartSketch. The idea was to have a vector drawing tool for pen tablets. But it sold absolutely horribly.
Jonathan Gay pivoted it to animation shortly thereafter and rebranded it as FutureSplash, and then got bought out by Macromedia.
Does anyone know of a modern browser that still supports Flash?
I'm stuck having to use old browsers, which I'm not very happy about.
Ruffle doesn't work for the game that I play (Crystal Saga).
I use flash occasionally to run some older code. I've been running this setup for a few years now on my Intel Mac because it won't work with Apple's new M chips. There are a couple of things you will need. First, download Firefox 84.0 here: https://ftp.mozilla.org/pub/firefox/releases/84.0/mac/en-US/. Next, you need the Flash Player itself. Adobe, as far as I know, removed the links to download it and I don't have anywhere I would trust to download. After you have the Flash Player, install Firefox 84 and disable the auto-updates. Now install Flash and everything should work. Good luck!
No such thing as a modern browser that supports Flash. But BlueMaxima's Flashpoint is a great tool that lets you run old web games in Flash and other dead formats: https://bluemaxima.org/flashpoint/
Thanks for the suggestion! I tried it, but it doesn't support playing MMO games, which Crystal Saga is, so I'll keep on searching. Don't get me wrong, the developers do provide us with a client, but you can get better performance using other browsers, which is always better.
Flash was shuttered due to being a constant, huge security risk. So no modern browsers support it and there aren't any plugins for it other than Ruffle.
Web itself is a constant huge security risk - Playstations 3, 4 and 5 were all hacked through Webkit, for example. The reason to kill off Flash was just because Apple wanted it gone, and Adobe didn't want to make them angry.
Flash and Actionscript were to interactive and animated apps what Python is to generic coding today. It brought together ease of use and powerful functionality, accessible to mere mortals.
Flash is fantastic and I wish it got the respect it deserves from tech circles. It got an entire generation into animation and software engineering, and had really robust and approachable tooling which is yet to be replicated. I think back to the gaming experiences made in Flash and Shockwave as far back as the mid 90s, and nothing today even comes close. Back then, you could load up a webpage on a Pentium II and play a really robust game made by a teenager in their spare time, and that game will be stable and display extremely good performance. Compare that with the complexity and barriers that come with mobile game development today, or the jank and poor performance of JS based games.
TL;DR: It's been decades and there's still no suitable replacement for Flash, and the web is poorer for it.
Flash was conceptually fantastic, and today we can do everything Flash did, and more, with HTML and WebGL. Unfortunately we don't have the authoring tool to do so. The Flash IDE became Adobe Animate, but it has relatively restricted output and capabilities compared to Flash before.
Unfortunately also Flash was full of bugs, FULL. I mean the player, the IDE, everything. And Macromedia and consequently Adobe were increasingly desperate in monetizing the player by attaching downloads to it, like Adobe Reader or anti-virus software. Or adding nonsense features like half-baked 3D gaming to it.
slowly lowers flamethrower indeed, you're right, HTML and WebGL with JavaScript or webasm can do anything flash can..
They can't do it in the cycle count that Flash could (I remember seeing 3D vector demos running at 640x480 pixels on my 133 MHz Pentium), but they will do it, and probably look almost identical across two different versions of the same brand of browser..
But the real point is the authoring tools, and there simply aren't any.. We're stuck in the early 80s with a f*cking text-editor for GUI work..
Geez, there were authoring tools for making nice GUI applications in the Windows 3.11 era.. But modern web development has yet to reach that level of maturity.
This is admittedly a few years ago, and the scene has evolved _a bit_ since then, but I remember seeing really simple JS games (think tetris, or a very simple sidescroller) being heartily applauded on sites like reddit and HN. We're unironically cheering approaching parity with a toolset from almost 20 years ago! Things could be so much better.
Back when it was still Dynamic HTML and I was in middle school, I copied the page source to put those into my homepage. I even figured out how to edit levels in the Mario clone.
> today we can do everything Flash did, and more, with HTML, and WebGL.
There's one big piece missing: we don't have an animation/gaming interface that's as easy and intuitive as Flash.
Flash didn't become ubiquitous because it was powerful, but because it was easy for any 13 year old to pick up and do something nice in a couple of days without reading any documentation or doing any course. Good luck getting anything to work in WebGL without any web programming experience.
Developers love saying "But you can do everything that was in Flash directly in Javascript today!!"
I wish they could understand that what they're saying would be equivalent to switching out Excel for a Python IDE and saying "but you can do calculations here too!"
I don't deny that it had lots of issues. In addition to what you mentioned, it was an accessibility nightmare, and had tons of security problems. All that said, it's difficult to argue with results.
Adobe never cared about Flash beyond it being the web's video player.
Honestly the whole thing just began to rot from a tool perspective since their purchase.
Their management is too PDF-brained to invest in quality work on their tools. As insane as it sounds, they seem to consider their creative tools an afterthought and B2B PDF solutions their main business. I can’t find any other way to explain their actions.
> and today we can do everything Flash did, and more, with HTML, and WebGL
This is a myth: theoretically true, but practically impossible. Flash was so much more than a browser plugin. With HTML5 it is so much harder to maintain a framerate, while in Flash you almost didn't need to care about that.
Flash did skip frames when it couldn't keep up, and this is what any animation framework would do as well for HTML5, or video in video players and so on.
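The frame-skipping behavior described above is essentially a fixed-timestep accumulator: the animation advances at its authored rate, and when rendering falls behind, several frames are consumed per render instead of slowing playback down. A minimal sketch, assuming the classic 24 fps timeline; the names are hypothetical:

```cpp
// Sketch of frame skipping via a fixed-timestep accumulator.
struct Timeline {
    double accumulator = 0.0;  // real time not yet simulated, in seconds
    int frame = 0;             // current animation frame
    static constexpr double kStep = 1.0 / 24.0;  // authored frame rate

    // Called once per rendered frame with the real elapsed time.
    // Returns how many animation frames were advanced
    // (more than 1 means frames were skipped on screen).
    int tick(double elapsed) {
        accumulator += elapsed;
        int advanced = 0;
        while (accumulator >= kStep) {  // catch up if rendering fell behind
            accumulator -= kStep;
            ++frame;
            ++advanced;
        }
        return advanced;
    }
};
```

HTML5 animation libraries and video players do the same thing; the difference with Flash was mainly that the runtime did it for you.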
We don't have a well established (in people's minds that is) framework and IDE for it that did all Flash did. But actually I do believe we have everything in place in terms of APIs and capabilities, and in fact even more, because... have you seen some of the WebGL animations? PixiJS? The others?
I doubt it. WebGL + the rest of the web stack is way more powerful than Flash EVER was.
Mindshare matters. Without an established IDE and people knowing about it... it's not happening. Another issue is that Flash-like content was notoriously mobile-unfriendly. So we do less of those types of UIs now. Not that we can't... but it's harder.
rogual: I'm sure you would be able to manage this yourself, but if time is a factor I'd be happy to assist in converting the final result to a WASM build playable in a browser.
Shouldn't be difficult to just plop in SDL (which is trivially portable to WASM) to handle the drawing. Found a basic example here[1] that uses STB + SDL. Should be rather straightforward.
I recently picked up a copy of Flash MX to use on an old PowerBook, highly recommend others do the same. As many have commented, today’s authoring tools are poor substitutes (someone should do a list of the superior abandonware versions of SaaS products).
This linked issue is - quote - "a non-exhaustive, basic checklist of where we are aiming". As in, a checklist for MVP-level support. And indeed, we have pretty much reached the MVP level, with several simple games (and some nontrivial libraries like box2d or mochicrypt) working.
From this point on, it's hard to make a simple TODO checklist. We could analyze the entire Flash API surface and make a tracking issue for each available class, but that's a lot of bookkeeping and it doesn't really say that much about real support (as, assuming the Pareto principle, 80% of fancier features are used by only 20% of games).
What's the state of the art on converting Flash to HTML5? You'd need an Actionscript to Javascript transpiler, export the assets, and handle tweening / timeline keyframes? And probably handle Flash specific events? I don't recall if Flash has any flash-specific effects you'd need to recreate.
Like the sibling comment says, while there are alternative flash players, they are either unmaintained (Shumway since 2016) or not feature complete yet (Ruffle). Since performance was stated as the main goal, a bespoke solution that only supports what the games actually use is probably the best performing.
I wish Flash would just be fully open-sourced for niche users to maintain and use it AS IS. I know it supposedly has a lot of vulnerabilities but there are use cases where the pros matter more and this con doesn't matter too much.
Interesting read. And it doesn't end up with Electron, which is "Flash for the Desktop".
Notarization
Why am I not surprised? I come from Linux, and the weird rules for App Bundles and Notarization on the Mac are a pain. And badly documented. And the tools suck! Which entitlements does my app need? How do I change library paths? Compiling applications on a Mac is easy (with Homebrew); shipping App Bundles to ordinary users is complex and difficult if you're not a macOS developer using Xcode.
It can build, sign and notarize Mac apps (and Windows apps, and Linux apps) with auto update and can do so from any platform. So you can ship to everyone from Linux if you want. It also does auto update, generates a download page that detects your OS and it includes control over low level OS-specific things like setting entitlements, custom deb dependencies, Windows manifest extensions etc.
Notarization as a process is actually OK in my view. Yes it sucks to have to pay, especially for hobbyist stuff. Still, process-wise it's pretty lightweight and it seriously beats having a virus scanner intercept and rummage through all your file IO all the time. Windows has a lot of problems with aggressive AV engines mucking up programs and macOS doesn't, because notarization is in effect an ahead of time virus scanner that moves the work off the end user laptop. Also these days the notarization protocol is open and documented.
From the article: "GPUs don't really like drawing vector graphics. They like big batches of textured triangles. So, I needed to rasterize these vectors."
The author certainly wasn't the first person to have this problem. What is available for fast display of vector graphics animations with GPU acceleration? Anything better or higher level than just doing huge numbers of calls to OpenGL/Vulkan/Metal/etc.?
It's an area of active research. One very promising project is vello [1], which is essentially a software renderer on the GPU (it requires GPU compute support). Preliminary results are looking good. Better-than-anything-else-out-there good.
> GPUs don't really like drawing vector graphics. They like big batches of textured triangles. So, I needed to rasterize these vectors.
I don't follow. In my mind, triangles are natively defined in vector space. I didn't think it was impossible to convert vectors to GPU-friendly triangles. Rasterization is the easy way out and will (probably) help performance, but it will fall apart as screen resolutions increase.
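For what it's worth, the conversion the commenter describes is possible: the standard approach is to flatten each curve into line segments at some tolerance, then triangulate the resulting polygons for the GPU. A minimal sketch of the flattening step, in Python for clarity (the function name and tolerance heuristic are mine, not from the article):

```python
def flatten_quadratic(p0, c, p1, tolerance=0.25):
    """Recursively subdivide a quadratic Bezier until it is flat
    enough, returning a list of points approximating the curve."""
    # Distance from the control point to the chord midpoint is a
    # cheap flatness estimate.
    mx, my = (p0[0] + p1[0]) / 2, (p0[1] + p1[1]) / 2
    if abs(c[0] - mx) + abs(c[1] - my) <= tolerance:
        return [p0, p1]
    # de Casteljau split at t = 0.5 into two half-curves.
    l = ((p0[0] + c[0]) / 2, (p0[1] + c[1]) / 2)
    r = ((c[0] + p1[0]) / 2, (c[1] + p1[1]) / 2)
    m = ((l[0] + r[0]) / 2, (l[1] + r[1]) / 2)
    left = flatten_quadratic(p0, l, m, tolerance)
    right = flatten_quadratic(m, r, p1, tolerance)
    return left[:-1] + right  # drop the duplicated midpoint
```

The resulting polyline can be fed to any polygon triangulator (ear clipping, etc.), but note that the tolerance has to be chosen relative to screen resolution, which is exactly the commenter's point about things falling apart as resolutions increase.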
This is an excellent and very informative story! Special thanks for the part about how the Beziers are encoded in XML; I needed this once but gave up and took another route.
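For anyone else who needs this: in the XFL format (the XML inside a .fla), each Edge element carries an edges attribute that is a tiny path language: "!x y" is a move-to, "|x y" a line-to, and "[cx cy ax ay" a quadratic curve-to, with coordinates in twips (1/20 of a pixel). This sketch is based on my reading of the format, not the article's code, and it skips the "#"-prefixed hex fixed-point coordinate form XFL can also emit, so verify against your own files:

```python
import re

# One token per command character or signed number.
_TOKEN = re.compile(r'[!|\[\]/]|-?\d+(?:\.\d+)?')

def parse_edges(edges, twips=20.0):
    """Parse an XFL <Edge edges="..."> string into path commands.

    Returns a list like [('move', x, y), ('line', x, y),
    ('quad', cx, cy, x, y)] with coordinates converted from
    twips to pixels.
    """
    tokens = _TOKEN.findall(edges)
    path, i = [], 0
    while i < len(tokens):
        cmd = tokens[i]
        if cmd == '!':
            x, y = tokens[i + 1:i + 3]
            path.append(('move', float(x) / twips, float(y) / twips))
            i += 3
        elif cmd in '|/':  # straight segment
            x, y = tokens[i + 1:i + 3]
            path.append(('line', float(x) / twips, float(y) / twips))
            i += 3
        elif cmd in '[]':  # quadratic: control point, then anchor
            cx, cy, x, y = tokens[i + 1:i + 5]
            path.append(('quad', float(cx) / twips, float(cy) / twips,
                         float(x) / twips, float(y) / twips))
            i += 5
        else:
            i += 1  # skip anything unrecognized
    return path
```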
I recall how I rasterized sprites in my Flash games, I just did it on-the-go right after the game was loaded. I created a Vector.<BitmapData> which stored sprites’ frames, and then drew those BitmapData to the main BitmapData by copyPixels.
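That pre-rasterize-once, blit-forever pattern translates directly outside Flash too. A toy sketch of the idea, with plain Python lists standing in for BitmapData and row copies playing the role of copyPixels (all names are mine, purely illustrative):

```python
def rasterize_frame(frame_index, w, h):
    """Stand-in for drawing a vector frame once; here it just
    fills the bitmap with the frame index."""
    return [[frame_index] * w for _ in range(h)]

def build_cache(frame_count, w, h):
    # Equivalent of the Vector.<BitmapData> of pre-drawn frames.
    return [rasterize_frame(i, w, h) for i in range(frame_count)]

def blit(dst, src, dx, dy):
    """Equivalent of copyPixels: copy src into dst at (dx, dy)."""
    for row, src_row in enumerate(src):
        dst[dy + row][dx:dx + len(src_row)] = src_row
```

The expensive vector drawing happens exactly once per frame at load time; after that, every screen update is a cheap memory copy.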
It's interesting how far you got with the AIR route and what bugs you encountered. It's been Harman, not Adobe, behind AIR since 2019, and AIR 50 was recently released. If you look for a Steam game named Shapik, it's a good example of a Flash game that was ported to Steam via AIR. I use AIR for my mobile ActionScript 3 games as well, and I'm now thinking about going to Steam too.
Thanks! I had all of Hapland 1 (and possibly 2?) working in AIR before I ditched it. I can't remember an exact list of bugs, I think I was just getting graphical glitches and the normal timeline bugs you sometimes get in Flash. Sometimes the AIR tools would fail with Java tracebacks too. Performance was also not where I wanted it, and I wanted full control of the scene code for save states, so the reasons to switch were just mounting up.
Great article, thank you for sharing this experience. The part I found most interesting is the section about the aspect ratio and interpolating between the 16:9 and 16:10 views relative to the original, although I'm having some trouble fully understanding the implementation. Why is the game interpolating between them based on the game's aspect ratio?
I can't know ahead of time the exact aspect ratio the player will run the game at. 16:9 and 16:10 are the most common on today's monitors, but the game might be running in a window, or fullscreen on an unusual monitor.
If I just made the game at a fixed 16:9 ratio, it'd leave empty space at the edges of any window that isn't exactly 16:9. Same if I chose 16:10 or any fixed aspect ratio.
By supporting a range of ratios between 16:9 and 16:10, I maximize the number of window sizes that the game will display nicely in without any empty space (black bars) at the sides.
The best way, I thought, to support a range of aspect ratios was to define two rectangles in the scene, having the minimum and maximum ratios supported, and interpolate the camera rectangle between them.
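My own reconstruction of that scheme, not the game's actual code: clamp the window's aspect ratio into the [16:10, 16:9] range, compute how far along the range it sits, and lerp between the two authored camera rectangles.

```python
def camera_rect(window_w, window_h,
                rect_wide, rect_tall,
                ar_wide=16/9, ar_tall=16/10):
    """Interpolate between two authored camera rectangles.

    rect_wide is the 16:9 framing and rect_tall the 16:10 framing,
    each given as (x, y, w, h). Ratios outside the supported range
    are clamped, so black bars appear only in extreme windows.
    """
    ar = window_w / window_h
    ar = max(ar_tall, min(ar_wide, ar))       # clamp into range
    t = (ar - ar_tall) / (ar_wide - ar_tall)  # 0 = 16:10, 1 = 16:9
    return tuple(a + (b - a) * t
                 for a, b in zip(rect_tall, rect_wide))
```

A 1920x1080 window gets exactly the 16:9 framing, a 1920x1200 window exactly the 16:10 framing, and anything in between gets a blend, so no window in that range shows empty space.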
I really enjoyed this kind-of post-mortem! Please write more, I'd love to hear about your other projects!
As a side note: in my opinion, the closest replacement for Flash is currently the Construct 3 engine; you should definitely take a look at it! (I'm not associated with the Construct 3 team (Scirra) in any way; I'm just a happy user of their product.)
Real talk: I'm still shipping an Adobe AIR app (Flash on the desktop) to a major healthcare provider (just did a build yesterday).
Samsung subsidiary Harman took over the support contract from Adobe a couple of years ago, and I think they have one or two souls running support for the AIR SDK. They're doing... okay.
After years of searching for an Electron alternative (2016-2019) I broke down and wrote an Electron + Vue application. I had really really wanted Wails to work, but it simply has too many sharp corners for an enterprise desktop application that needs to run 24/7.
The corporate customer basically stonewalled and refuses to let the AIR app die. We're pulling up the ladder with the Apple Silicon cutover and hoping they bleed out on bugs. What a mess.
I'm commenting because I too "still use Flash" in a way. It's a secret shame :X
> GPUs don't really like drawing vector graphics. They like big batches of textured triangles. So, I needed to rasterize these vectors.
I remember in the distant past of like 15 years ago when it was still common wisdom to avoid vector images on web pages, or drawing anything using fancy CSS features, because the resource cost to the client was too high and it'd hurt performance.
I think the proliferation of runtime-rendered vector icons, CSS drawing, and CSS animation is under-appreciated when it comes to finding things to blame for why the web performs like dogshit, even on supercomputers. Sure, it's probably mostly JavaScript's fault, but I think those very likely play a non-negligible role.
I saw the headline and I was ready to talk trash... then I opened the blog post and wow. What an incredible job for a single person. Hats off on the different approaches you took. Excellent work! I loved the post and congrats on launching it on Steam!
I was a huge fan of Hapland back in the day! My friends and I were obsessed. It actually inspired me to make my own Flash games, which is how I learned programming. So nostalgic to see screenshots of that game.
Oh please don't say that. The sRGB transfer function may be somehow roughly correlated with how we perceive lightness, but it's in no way a perceptual space.
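To make the distinction concrete: the sRGB transfer function is roughly a gamma-2.2 curve chosen for encoding efficiency, while CIELAB's L* is the curve actually fitted to perceived lightness. These are the standard published formulas for both (nothing specific to the article):

```python
def srgb_to_linear(c):
    """Invert the sRGB transfer function (c in 0..1)."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def lightness(y):
    """CIELAB L* (0..100) from linear luminance y (0..1)."""
    return 116 * y ** (1 / 3) - 16 if y > (6 / 29) ** 3 else y * (24389 / 27)

# Encoded sRGB 0.5 is only ~21% linear luminance, which lands
# near L* ~53: close-ish at this point, but the two curves drift
# apart elsewhere, so sRGB is not a perceptual space.
```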
Ahh Hapland. I managed to beat 1 and 2 without a solution. Never beat 3. Didn't know how to get started. Once you get started things start to get rolling.
Really awesome to see this.
I've done some conversion work for Flash to PhaserJS for educational games. We had to just rewrite everything as we didn't even have the original source code. Luckily Phaser animations can get a lot of the "walk across and open a door" stuff done easily.
I was also excited to see Hapland as I'd forgotten the name and so was never able to Google to find the games again!
You know the IT media is full of hacks when they all studiously ignore the fact that the reason Flash was ultimately abandoned is that it would allow app developers to bypass the app stores of Apple and Google.
The two will never create anything to match Flash.
Even Microsoft abandoned Silverlight for the same reason, and it's why Mozilla abandoned Shumway: they don't want to bite Google's hand.
Great timing for this article! I just used Flash CS5 this week to make my first animation. I’ll admit that I’m rough with it (made a few games with it many years ago, but that is all, stuck with Actionscript and static images). Some of the logic hasn’t stuck with me yet, namely with the timeline and movieclips, and I get frustrated. That said, I’m having a great time with it.
Screw Apple! I'm moving away from it. The system is just not suited for any kind of systems development. Too much undocumented and unexposed crap in the OS. If you want to do anything with AV, it's near impossible because of the restrictions, and it's not like they invented any of it. They're a bunch of lazy people who don't want anyone messing around with their OS. They will never learn or change. Better to stick with Windows or Linux (which has greatly improved for AV in recent years; it's just the fragmentation that causes a lot of issues, and that might eventually get consolidated or streamlined. That's another issue by itself!! :))
And all this notarization and getting a certificate every year is just too much to continue development for a small bunch of users who don't even know or care about what exists.
There are some good folks among the users, but they can be counted on one hand. All the other Mac or Apple users don't even have a clue about what's under the hood. It just has to look good, with oohs and aaahs... and that's pretty much it. Of course, as long as it works the way it's supposed to (and that's how 99% of its dum-dum users use it), they'll continue to remain a trillion-dollar company... iOS is another story by itself...
> I like to use binary formats where possible for end-user software; I don't believe in making customers' computers chew through reams of XML.
I feel an urge to object here! This is a false dichotomy; there are efficient middle-ground solutions between XML and custom binary formats, e.g. Protobuf and Cap'n Proto.
Having said that, I loved the trick of using an assembler to write it.
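As a concrete illustration of the gap (purely illustrative; the element names are made up, and the article's actual formats are surely different): the same point list as XML versus a packed binary record.

```python
import struct
import xml.etree.ElementTree as ET

points = [(130.5, -20.0), (642.05, -20.0), (642.05, 400.0)]

# XML: verbose and self-describing, but slow to parse.
root = ET.Element('shape')
for x, y in points:
    ET.SubElement(root, 'point', x=str(x), y=str(y))
xml_bytes = ET.tostring(root)

# Binary: a count followed by packed little-endian float32 pairs.
bin_bytes = struct.pack('<I', len(points))
for x, y in points:
    bin_bytes += struct.pack('<ff', x, y)

# The binary form is a fraction of the size and needs no parser
# at load time, which is presumably the appeal for game data.
```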
I absolutely adore Flash and it was my first introduction to programming. I greatly miss the feeling of the web when Flash was king and there was a pot luck of animators and game makers sharing neat things they made. I myself made a game but never actually published it. Always regretted not putting it up publicly before Flash went EoL.
We killed a big chunk of the internet when browsers stopped supporting Flash. It's even worse than that: that part of the internet was not built upon; instead, many things had to be recreated from scratch without Flash. I'm not saying Flash was good, just pointing out the incredible loss.
I still use Flash too. I even have the old Flash browser plugin that uses /tmp for storing flash*... files. I just don't have it set to auto-play. It's always click-to-play.
Additionally I use a flash created "alarm clock" from the early 2000s almost every day for various tasks.
I really miss all my Flash games. Some of them have been ported to other platforms, for sure, but most are gone. And don't get me started on all those weird Shockwave emulators; most of them can't run half of the games I used to play.
You can drag&drop lots of older (ActionScript 1 & 2) Flash-Binaries into https://ruffle.rs/demo/
Lots of them "just work". Also lots of SWF-Archives on archive.org to download and try.
I would invest more time into a tool that ports your game to Unity. You already have "code" attached to "sprites" with "actions" happening based on some triggers and entity states.
Glad to see someone trying to reverse engineer it somewhat. There have been a lot of approaches to cloning it: Shumway, Ruffle, cross-compiling through Haxe and OpenFL, but none is complete yet. And open-source AIR can only be a dream.
Ah, Flash. Simulating a bouncing ball using keyframe-driven ActionScript that I only sort of understood was my “hello world” moment. Love to see it being used like this. If only I could find those old .fla files…
> Flash stores its vector graphics in XML format. You might argue that XML is a poor choice for graphics data, but you weren't a product manager at Macromedia.
It reminds me that the show BoJack Horseman was created in Adobe Flash [1]. The creator calls himself an "Old School Flash Hack," in the same vibe as the OP.
I don't know whether they still used Flash for the latest seasons; maybe they did, since they just export to video. Migrating to another tool may have been a pain and/or changed some of the illustration aesthetics. I'm not sure a project like that can migrate cleanly to Adobe Animate [2], the Flash successor.
I believe Flash was a tool with a friendly UI/UX for casual animators but also many advanced features for pros; that's why it was so popular and loved.
That would be the easy part. From the post they are already stored as such, but the complexity would be to render the vector graphics in C++ at 60FPS. There is no simple solution to this, and rolling it yourself is quite the task too.
If you look at some of the modern approaches to animation in C++ (like Lottie or Rive), their C++ clients are pretty poorly supported in my opinion, mostly in terms of anti-aliasing quality in native clients compared to their web equivalents.
Bank UI authors: Management wanted a pretty UI, so we made a semi-attractive chalice out of Roman dichroic glass. If it breaks, bad guys take your money.
But if you had read TFA you would have realized it talks about creating a new player that can understand swf files so those don’t have to be rewritten, it’s not about using the old, ugly, insecure Flash player from Adobe.