I still use Flash (foon.uk)
1131 points by rogual 11 months ago | 395 comments

Hey HN, I haven't done a lot of technical postmortem blogpost-style writing so I'd welcome any feedback or tips on how to improve. Is it too long, too short, too technical, not technical enough? Boring? Interesting? Is it enlightening or does it just come off like content marketing? I literally have no idea how good I am at this.

I don't think anyone here is going to complain about a post where you make a game work by reverse engineering Bezier splines encoded in XML by Adobe Flash.

If that's not "Hacker" News, what is?

Indeed! ;-)

However, I wonder why the OP didn't try to keep the vector graphics and use SVG, for instance? That would allow for infinite scalability. It is mentioned that "GPUs don't like vectors", but if the game doesn't change too often it shouldn't make a lot of difference?

I did consider it. Infinite scalability isn't required during runtime; the scene doesn't ever zoom or scale in the games, because that was always so slow in Flash, so the only real benefits would be 1) supporting higher resolutions and 2) reducing file size.

For 1) I decided I'd rather just release an update with larger textures if these ones ever start to look dated. That way I get to keep the runtime code simple. Less code means fewer bugs. I don't want to spend a lot of time fielding support requests from users who hit edge cases in the rasterizer. As for not changing too often, that's true, but taking advantage of that means doing change tracking with dirty-rectangles or similar, which not only adds complexity but also feels like it would make performance less predictable.

And for 2) the game as it stands now is under 50MB so I didn't feel a pressing need to make it smaller, although a tiny executable would be cool in a satisfying, demoscene kind of way.

Ah, thanks. That could maybe go into the article as well? Although it's already pretty thorough as it is.

So, I think I need to elaborate on "GPUs don't like vectors". What the OP meant was "GPUs have literally no support for rendering anything other than pixels on triangles and getting them to efficiently draw Bezier curves and fills is an active area of research". You'd need specific hardware support for rasterizing them, and as far as I'm aware no such hardware exists.

When Adobe hacked "HTML5 support" onto Animate, they did exactly the same thing the OP did. It renders every shape in the FLA to a sprite sheet and then draws it to a canvas tag. If you have knowledge ahead of time of what will be drawn, this is the most reasonable thing you can do.

Even before HTML5 support, the AS3 Starling framework that let you "use the GPU" would pre-render all your vector assets to bitmap textures at runtime. And that was a framework built for Flex developers; if you were accustomed to building things on the timeline, you rendered on CPU, because that's where all of Flash's very particular rendering logic has to live.

Ruffle gets around this by tessellating every shape into a GPU-friendly triangle mesh. This lets us render stuff that's technically vectors on the GPU. But as you can imagine, this creates its own problems:

- Flash has very specific stroke-scaling rules. If a stroke is smaller than 1px[0], it will be rounded up to 1px. This is how Flash's "hairline stroke" feature works: it actually asks for a 1/20px[1] stroke, and that gets rounded up to whatever the current scaling factor for that shape is. When your stroke is a polygon mesh, you can't vary the stroke to match Flash without retessellating, so hairline strokes on shapes that stretch don't animate correctly.

- Likewise, any stretch of a stroke that changes the aspect ratio also distorts the stroke, since it's baked into the tessellated mesh. There's a minigolf game that does this to hairlines and it will basically never look right in Ruffle.

- Tessellated vectors lose their smoothness, so we have to sort of guess what scale the art is drawn at and add enough detail polygons for things to render correctly. Most of the time we get it right. However, there are some movies that do crazy things like store all the art at microscopic scale and blow it up. This provides a compression benefit, because it quantizes the points on the art with little visual difference on Flash Player. On Ruffle, however, the art becomes very low-poly.

- All the tessellation work takes a significant amount of time. There are certain movies (notably, a lot of Homestuck) that would hang the browser because of how much ridiculously complicated vector art was being processed before the movie even loaded. We had to actually limit how much art could be tessellated per frame, and expose that to movies as bytesLoaded, which is why Ruffle movies have preloaders even though we don't support streaming download.
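The rounding rule from the first bullet can be sketched as a toy model (my reading of the behaviour described above, not the Player's actual code, and glossing over the virtual-vs-device-pixel question from footnote [0]):

```python
TWIPS_PER_PX = 20  # Flash stores widths in twips, 1/20 of a pixel

def effective_stroke_px(width_twips: int, scale: float) -> float:
    # Scale the stroke with the shape, then round anything under
    # 1px up to 1px -- the rule that hairline strokes rely on.
    scaled = (width_twips / TWIPS_PER_PX) * scale
    return max(scaled, 1.0)

# A hairline is authored as 1 twip (1/20 px), so it stays 1px across
# a wide range of scales:
print(effective_stroke_px(1, 1.0))   # 1.0
print(effective_stroke_px(1, 10.0))  # 1.0
# A normal 2px (40-twip) stroke scales with the shape:
print(effective_stroke_px(40, 3.0))  # 6.0
```

This is exactly the behaviour a baked triangle mesh can't reproduce: the mesh has one fixed width, while the rule above re-evaluates per scale.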

There's another approach to drawing Beziers on GPU: drawing the hull of control points with a procedural texture that calculates the underlying polynomial. This is especially simple for quadratic curves (the ones with one control point), which is what all Flash timeline art[2] is.
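The standard form of this trick is Loop and Blinn's: give the hull triangle's vertices texture coordinates (0,0), (1/2,0) and (1,1), and have the fragment shader keep or discard each pixel based on the sign of the interpolated u² − v. A sketch of just the math, in Python standing in for the shader:

```python
def uv_at(b0: float, b1: float, b2: float):
    """Interpolate the per-vertex coords (0,0), (0.5,0), (1,1) with
    barycentric weights b0, b1, b2, as the GPU's rasterizer would."""
    u = 0.5 * b1 + 1.0 * b2
    v = 1.0 * b2
    return u, v

def curve_side(u: float, v: float) -> float:
    # Zero on the curve, negative on the filled side, positive outside.
    return u * u - v

# A point on the curve at parameter t has barycentric weights
# ((1-t)^2, 2t(1-t), t^2), and the implicit evaluates to exactly 0 there:
for t in (0.1, 0.5, 0.9):
    u, v = uv_at((1 - t) ** 2, 2 * t * (1 - t), t ** 2)
    assert abs(curve_side(u, v)) < 1e-12

print(curve_side(*uv_at(0.0, 1.0, 0.0)))  # 0.25: the control point is outside
print(curve_side(*uv_at(0.5, 0.0, 0.5)))  # -0.25: the chord midpoint is inside
```

So a single triangle per curve segment gives a pixel-exact fill boundary, no tessellation needed, which is why this approach is attractive for fills.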

However, strokes are more complicated. You'd think we could just take the hull of the stroke and draw that as a fill, but you can't. This is because the offset of a Bezier curve is not a Bezier curve. Drawing the stroke in a pixel shader would make sense, except you still need to define a polygon mesh around your stroke with a reasonable texture coordinate system to make the math work. And the polygonal outlines of Bezier curves can get really funky; there's no obvious way to quickly say "here's a curve, now give me the polygon that encloses a 5px stroke around it". Remember how tessellation takes so long that it would hang Ruffle?
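The "offset of a Bezier is not a Bezier" point is easy to check numerically. Take the parabola y = x² (every quadratic Bezier traces an arc of some parabola), push each point out along its unit normal, and ask whether the result is still a translated parabola, i.e. whether y′ − x′² is constant:

```python
import math

def offset_point(x: float, d: float):
    # Offset the parabola y = x^2 by distance d along its unit normal.
    s = math.hypot(1.0, 2.0 * x)       # length of the tangent (1, 2x)
    nx, ny = -2.0 * x / s, 1.0 / s     # unit normal
    return x + d * nx, x * x + d * ny

# If the offset curve were just a shifted parabola, y' - x'^2 would be
# the same constant at every point. Instead it drifts:
for t in (0.0, 0.5, 1.0):
    x, y = offset_point(t, 0.5)
    print(round(y - x * x, 3))  # prints 0.5, then 0.582, then 0.918
```

Since the residue isn't constant, no quadratic (or any polynomial of fixed degree) reproduces the offset exactly; stroke outlines can only be approximated.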

[0] I'm not sure if this is virtual or device pixels.

[1] Flash specifies vectors in fixed-point units called twips. That's why the zoom factor on Flash movies was locked to 2000%.

[2] Flash can draw cubics - the two-control-point curves you think of when you think Bezier - but only in response to an AS3 graphics command. We haven't implemented this in Ruffle yet.
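Footnote [1] is worth a tiny bit of arithmetic: because coordinates are 20ths of a pixel, 2000% is exactly the zoom at which one twip maps to one screen pixel, so zooming further couldn't reveal any new detail. A one-liner to make that concrete:

```python
TWIPS_PER_PX = 20  # Flash's fixed-point unit: 1 twip = 1/20 px

def twips_to_screen_px(twips: int, zoom_percent: float) -> float:
    return twips * (zoom_percent / 100.0) / TWIPS_PER_PX

# At the 2000% zoom cap, the smallest representable step (1 twip)
# is exactly one screen pixel:
print(twips_to_screen_px(1, 2000.0))  # 1.0
```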

> "GPUs have literally no support for rendering anything other than pixels on triangles and getting them to efficiently draw Bezier curves and fills is an active area of research"

Well fuck me, I had no idea. I figured that 'simple 2D vectors' would be beyond piss-easy for modern GPUs. I'd never considered that it wasn't the actual math space that was accelerated, rather the fast memory mapping of everything on presumably comparatively simple geometries. You've just turned my view of the world upside down :(

Yeah, this is something that many many people assume (vectors are "easy" on GPUs) and then are amazed (like I was!) that they aren't even in the function set. My thought was that if you were "accelerating" desktops you'd really want vectors right? But no.

The interesting thing is that graphics cards in the 90s - which were "desktop accelerators" past the low end - specifically supported accelerated 2D primitives, including Bezier curves for the more powerful ones. It was all gradually dropped as cards focused on 3D games specifically.

> they aren't even in the function set.

They are, but on Nvidia only... https://developer.nvidia.com/gpu-accelerated-path-rendering

Rendering vectors on the GPU is possible, just quite tricky.

But signed distance fields work well and easily on GPUs.

Here's a recent article and discussion about it:

Vector graphics on GPU (gasiulis.name)

235 points by mpweiher 4 months ago | 95 comments



Have you tried splitting the area covered by each shape into several 8x8 pixel rectangles, then running a compute shader over each one that executes the exact same rasterisation algorithm as Flash did? That's more or less how triangles are rasterised on the GPU anyway.

It's definitely not a simple solution, but might enable you to do runtime rasterisation with a good framerate on the GPU rather than pre-rasterising all the vector art.
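A minimal sketch of that idea, with Python standing in for the compute shader and a hypothetical `inside` predicate standing in for Flash's fill rule: every 8x8 tile evaluates the same per-pixel test over its pixel centres, independently of every other tile, just as each GPU workgroup would.

```python
TILE = 8

def rasterize_tile(tx: int, ty: int, inside) -> list:
    """Evaluate a point-in-shape predicate at every pixel centre of
    one 8x8 tile, the way a single compute-shader workgroup would."""
    return [[1 if inside(tx * TILE + x + 0.5, ty * TILE + y + 0.5) else 0
             for x in range(TILE)]
            for y in range(TILE)]

# Toy shape: a disc of radius 8 centred at the origin. The tile at
# tile-coordinates (0, 0) is only partially covered.
tile = rasterize_tile(0, 0, lambda x, y: x * x + y * y < 64.0)
covered = sum(map(sum, tile))
print(0 < covered < 64)  # True: a partially covered tile
```

The hard part this glosses over is binning: deciding which curve segments each tile needs to test, which is its own chunk of preprocessing.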

I thought that there was some "GPU accelerated path rendering" already 10 years ago? https://developer.nvidia.com/nv-path-rendering https://developer.nvidia.com/gpu-accelerated-path-rendering

(OK, those links are for Nvidia, but if they can do it I guess others can too?) (also see https://www.researchgate.net/publication/262357352_GPU-accel... )

Riffing off of other recent HN posts, I'm wondering if signed distance fields might be a contender for 2d strokes.

I've seen some font rendering work that has already embraced them for high perf rasterization.

Still, may be hard to get flash equivalency.
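As a sketch of why SDFs suit strokes: the stroke is just the distance field of the centreline thresholded at half the stroke width. For a line segment that distance is exact (curves need an approximate distance, which is where it gets harder):

```python
import math

def segment_sdf(px, py, ax, ay, bx, by) -> float:
    """Distance from point (px, py) to the segment (ax, ay)-(bx, by)."""
    abx, aby = bx - ax, by - ay
    apx, apy = px - ax, py - ay
    t = max(0.0, min(1.0, (apx * abx + apy * aby) / (abx * abx + aby * aby)))
    return math.hypot(apx - t * abx, apy - t * aby)

def stroke_coverage(dist: float, half_width: float) -> float:
    # 1 inside the stroke, 0 outside, with a ~1px smooth edge:
    # the kind of test a fragment shader would run per pixel.
    return max(0.0, min(1.0, half_width + 0.5 - dist))

# A 4px-wide stroke along the segment (0,0)-(10,0):
print(stroke_coverage(segment_sdf(5, 1, 0, 0, 10, 0), 2.0))  # 1.0 (inside)
print(stroke_coverage(segment_sdf(5, 5, 0, 0, 10, 0), 2.0))  # 0.0 (outside)
```

Notice the stroke width is a shader parameter here, not baked geometry, which is exactly what the hairline-scaling problem above wants.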

Also, as far as I understood, both GNU Gnash and Lightspark were using OpenGL for rendering. So I always expected that the GPU would still bring some sort of acceleration for 2D path rendering?

Thanks for this! Very informative!

Flash was my 2nd programming language (after VB6) back when it was still Macromedia. I had a lot of fun making really crappy minigames and learning the basics of coding. This article brought some good nostalgia and plenty of astonishment at the lengths you went to.

Your writing itself is great. A lot of writing linked by HN seems too verbose because people try to sound smart. Yours is succinct while engaging. Right length, just technical enough, and interesting. I really enjoyed this, thanks for writing it!


> Object-orientation is not fashionable right now in gamedev circles

Can you elaborate on this?

Object-orientation is kind of the old-school way of doing things; these days it's all about ECS (entity component system). This is a more data-oriented approach that has the potential for much higher performance when you have thousands of objects that need updating in your game. It's similar to how OO languages like Java/C# are going out of style and more (nominally, at least) data-oriented languages such as Go or Rust are in style.
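For the unfamiliar, a minimal sketch of the idea (not any particular engine's API): components live in parallel arrays, and a "system" is a tight loop over contiguous data rather than a virtual update() call per object.

```python
from dataclasses import dataclass, field

@dataclass
class World:
    # Component data stored "structure of arrays" style: one flat,
    # cache-friendly list per field instead of one object per entity.
    pos_x: list = field(default_factory=list)
    pos_y: list = field(default_factory=list)
    vel_x: list = field(default_factory=list)
    vel_y: list = field(default_factory=list)

    def spawn(self, x, y, vx, vy) -> int:
        self.pos_x.append(x); self.pos_y.append(y)
        self.vel_x.append(vx); self.vel_y.append(vy)
        return len(self.pos_x) - 1  # the entity is just an index

def movement_system(w: World, dt: float) -> None:
    # One loop over all entities with position + velocity.
    for i in range(len(w.pos_x)):
        w.pos_x[i] += w.vel_x[i] * dt
        w.pos_y[i] += w.vel_y[i] * dt

w = World()
w.spawn(0.0, 0.0, 1.0, 2.0)
movement_system(w, 0.5)
print(w.pos_x[0], w.pos_y[0])  # 0.5 1.0
```

Real ECS libraries add component registration, queries, and sparse storage on top, but the performance argument is all in that flat loop.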

Is ECS similar to Data Oriented Programming?

I watched this talk years ago: CppCon 2014: Mike Acton "Data-Oriented Design and C++"

I was very impressed by the ideas presented.

Java/C# can be used to write a program with "just a bunch of structs + static methods". Isn't that data-oriented enough? Yeah, I know we can also do 72 levels of inheritance... but ignore that for a moment.

Unity isn't going anywhere, and programming to interfaces is just as doable in Java and C#.

Unity's been working on an ECS option for a while now

Written in C#.

Yeah, "OO is going out of fashion" is a common phrase in gamedev, but what's actually meant is "deep class hierarchies defining the behaviour of game objects is going out of fashion". Most ECS code is still written in OO languages and makes heavy use of OO features.

Probably because OO teachers and mentors failed horribly (AFAIK we still don't warn new students about the pitfalls of inheritance?), so ECS got popular as a better alternative to shitty OOP code?


Not only that, but due to CPU caches and memory alignment issues, ECS have better performance than inheritance; also because most ECS systems ditch virtual function calls.

> ECS have better performance than inheritance

Can have.

Whether or not an ECS will actually be faster depends entirely on the type of game and access patterns of the inheritance structure. Further, it matters how you are doing inheritance (LTO can eliminate a lot of the pitfalls even if you use virtual function calls).

This is nothing against ECS. I just find the current claims of performance dubious and potentially flat out wrong given modern compiler optimizations and changes in CPU caches (mainly that they got a LOT bigger).

I believe claims of "compilers getting fast/big enough" are the exact reason we have ECS now. It surely depends on the kind of game, but games with any kind of entity that isn't unique should feel the performance improvements.

Ideally, compilers should be able to transform classes into ECS layout when generating code. Shouldn't be hard to have a pass before compilation, if we use clang.

There's a gem of a hidden narrative here for more junior developers that you don't always need to start over (which 9/10 fails) but instead be resourceful within your constraints and to be mindful of what your constraints actually are, which don't have to be new and shiny.

It was a pretty insightful article, I didn't know how a flash game worked internally and how it could be reimplemented! I think it is also relevant for those migrating their projects to newer technologies, I always found this process fun(except there is a deadline) :)

(Edit: I love your games and used to play a lot when I was a kid)

Worth noting that there is a spec for the custom XML file, it's known as FXG. S starts a cubic Bezier curve in absolute coordinates: https://web.archive.org/web/20110611002637/http://opensource...

Your writing is clear and concise. You don't indulge in run-on sentences, or hyperbole.

It's an impressive piece of writing. You're definitely good at this.

Your blog post was excellent. Thank you for sharing.

    Is it too long, too short, too technical, not technical enough? Boring? Interesting? Is it enlightening or does it just come off like content marketing
Long, but well-organized and not ranty. You can quit after a few sections and still learn interesting things. Technical enough with enough screenshots to keep it from being "wall of text". It is a good way to share some eye candy from your games. [E]nlightening? Yes. [C]ome off like content marketing? Yes, but this is the kind that we want to read -- basically, interesting content marketing. Don't be ashamed that you need to hustle a little bit as a small software shop.

Oh, and you forgot funny. This part made me laugh:

    Flash stores its vector graphics in XML format. You might argue that XML is a poor choice for graphics data, but you weren't a product manager at Macromedia. Behold: ...

I’ll add to the chorus and say that this was a wonderful post and the type of content I hope to find when I come to HN.

I enjoyed how you explained certain technical concepts succinctly and in simple language. For example, as someone who has never used Flash, I appreciated your 1000 ft view of how Flash works. When I’m looking at a new tool or framework, I try to start by understanding the mental model behind it. Yet a simple explanation of it is often difficult to find.

I was also struck by how much technical work was involved here—you made it sound easy. How long did this project take you?

> you explained the 1000 ft view of how Flash works

I second that. I find the better someone knows a subject, the harder it is for them to provide this perspective, making it all the more impressive.

I liked it a lot! It doesn't feel like marketing at all, and I think it's a good balance between technical and non-technical (though I wouldn't mind more details).

Keep writing!

As someone who worked on Macromedia’s and then Adobe’s Flash Player team, I thank you for sharing your experience preserving your Flash content!

Did you investigate Haxe or OpenFL?

Can you comment on why Adobe let Flash die? Did it just become unprofitable compared to their other projects?

Or did Apple give them a nice deal in exchange? ;)

I guess we will never know hehe

Adobe couldn’t figure out how to monetize Flash, other than selling the Flash authoring tool to Flash designers, a dying breed, for a few hundred bucks or bundled in Adobe’s Creative Suite. Adobe tried selling Flash video DRM, which got undercut by Google Widevine, and selling licenses to unlock Flash opcodes for advanced 3D and C++ cross compilation, which was undercut by web technologies like WebAssembly.

Also, I think Flash was more expensive to maintain than Adobe’s other design applications because the Flash Player needed constant security patches. Adobe was usually happy to milk cash cow products, such as Director and Shockwave, transferring maintenance to remote teams in India.

Keep on writing! I enjoyed reading what you tried and why you ended up with what you did. It is fine to mention your game, I do not consider that content marketing at all.

I think you nailed it at every layer, it was a very entertaining, nostalgic and instructive read. Thank you!

It's interesting for sure, I'm left wondering if it was less work to make flash continue to work somehow instead of re-authoring the original games.

I mean you've probably learned a lot, but you could have learned to author games using a more modern tool than flash instead and that would have been useful in the future, right?

This is pretty great writing, I'd love to read more of it.

Just to say: thank you for these games. I loved them when I was in my 20s.

Great article! I also still use Flash by using a custom runtime that interprets the output of the “Export as texture atlas” feature. I was wondering if you had attempted to use this feature in order to rasterize the graphics?



Looks like that feature was added in Animate CC, which I don't use, so I haven't tried it.

Looking at the docs, it looks like something I would have at least tried. Might have saved me writing the rasterizer, though I'd still have wanted to atlas everything together into bigger atlases, and make binary animation files.

One downside vs the .fla parsing approach I went with is that even if you can script away the GUI clicking, you'd still have to have Flash open while the build script runs. I quite like having an exporter that is just a CLI program you can run without Flash, though it's not a massive deal.

I'd be interested to see what exactly Animate puts into the animation.json file it generates. The docs don't seem to mention it, and I couldn't find any example ones online.

Your tool looks great, by the way — have you made any games with it?

It is perfectly balanced. It can be read top to bottom in a few minutes. If you feel like expanding a part, you can write a new post and link it from the original one.

Some other posts are born with all the tiny details, are too long to be read in one session, and I end up missing even the key points because that second session never happens.

I might have missed this in the article but are you sharing any of your work? I know that a lot of the work was manual, but some of your tooling might be useful for anyone else who wants to go this craftsman route of converting a flash game.

I really enjoyed it! It didn't come off as content marketing at all. Well done!!

It was perfect! Full of low-key wisdom and fun observations. Doesn’t feel like “content marketing” in the slightest.

I wish the Hapland series was coming to Mac, but I respect your decision to snub Apple.

This was a great read. I distinctly remember playing Hapland as a kid. Armor Games or Crazymonkeygames, can't remember which one it was. So this was a great nostalgia trip.

I thought it was great, and I have no interest in games or game development, but I read the whole thing top to bottom and came away thoroughly impressed. Excellent post and work!

This was awesome.

Also, the idea of using text/assembly as an intermediate format of binary files is pure genius. I will probably use that occasionally for similar use cases.

Great post! No complaints at all from me. The mix of your thought process + screenshots + examples is perfect.

You are very good at this, i'll follow your site from now on hoping for more publications like that!

It depends on how one looks at it. For example, I didn't read it.

It was great, loved the concise writing style, and it was a cool story.

Thanks! It's actually great to hear you describe it as concise, I was worried it was a bit waffly to be honest.

It was long because there were a lot of steps to describe. But for each, the writing was concise, interesting and entertaining.

Very interesting, and frankly inspiring to see you get this creative.

> Uploading achievements to Steam is a pain. You can't just define a list and give it to their command-line tools; you have to laboriously click through the slow, confusing miasma of PHP sadness that is the Steam partner site and add them one by one.

> I think if you're a big important game studio you don't have to stand for that and they give you a bulk upload tool, but I'm not one of those so I looked at the HTTP calls it was making, saved my login cookie to a file and wrote my own.

No way. Fucking hell, Valve. Sort your shit out. How many thousands of hours have been wasted because of this?

> Although I developed the game mostly on my Mac, during development Apple invented this thing called “Notarization” where if you run any app on a new version of MacOS, it'll make a network request to Apple to ask if the app's developer pays Apple a yearly fee. If the developer does not pay Apple a yearly fee, MacOS will pop up a dialog strongly implying the app is a virus and refuse to start it.

No way. Fucking hell, Apple. Sort your shit out. How many thousands of dollars have been wasted because of this?

>No way. Fucking hell, Valve. Sort your shit out. How many thousands of hours have been wasted because of this?

This is to prevent abuse. Some games will have 1000s of achievements as a feature, because gamers like games with lots of achievements (and because Valve kinda encourages that with some of their store features), so forcing people to do it manually is a way to decrease that kind of activity somewhat.

Valve already capped the number of achievements to 100 because of that kind of abuse, so I'm not sure it justifies having poor UX:

> By default, games are limited to 100 achievements at first. Reach out to us via the "Support" option at the top of this page if you have questions about this.


That's a lazy excuse IMO. There's a lot you can do, like letting small developers have up to 100 achievements, with more requiring talking to partner relations.

An easy solution to this is to give point-based weights to achievements and cap the max total "points" any given game can have, allowing a developer to have many small achievements for their game if they wish. But really, none of this matters because Steam achievements are easily auto-unlocked via SAM.

“Wasted? How could you say that?!” - Apple exec, wiping their eyes with a roll of $100 notes from their extortion program.

The bit about Apple Notarization is incorrect.

You can do it with a free Apple Developer account. You only need to pay $99 if you want to distribute it through the App Store.

I don't believe you are correct on this. A paid account is necessary to obtain the signing cert.

Still much more of a pain than it should be.

No, you can't.

<<you have to laboriously click through the slow, confusing miasma of PHP sadness that is the Steam partner site and add them one by one.>>

Real question: can this be automated with Selenium? For a long time I was using very low level Chrome DevTools libraries. Then I switched to Selenium. It is so impressive. It actually works, without weird 1-5% failures that you waste time chasing, then adding small sleeps to make it smooth and working.

I would suggest playwright or puppeteer myself. It uses the dev communication channel for chrome(ium)/firefox and works much better, even if you're limited to JS... there's a Deno port of puppeteer if you want to avoid some of the node.js bits for (imho) a cleaner experience.

Selenium? Gross. Use puppeteer

I remember creating a simple "catch the falling X" game in Flash 20 years ago. It was basic, but had everything you'd expect in a beginner project: points, multiple game states/screens, a victory condition, basic story (which was just a screen at the beginning, like NES). The thing is, I was a literal child and I also had no clue how to code -- I did everything by following tutorials and gluing together code I copy&pasted&modified. And it worked! And people played it!

They all thought it was pretty shit, but to me it was more important that some random person across the globe genuinely picked up something I wrote, played it, and then wrote up a review of it.

I showed it to my parents and they immediately started pushing me to learn programming.. and well here I am now. Thanks Flash.

It blows my mind that no one has made a tool like that in so long. It had the same quality as Excel, where any non-technical person could just look up a tutorial, right click a bit, and make something usable. Why has the pinnacle of ease of use in game creation been achieved and never replicated since?

That's how I feel about HyperCard. Some apps were so ahead of their time that when they were axed, nothing comparable replaced them.

I believe that this is due to a tragedy of the commons in the software industry. We're all so creative, with so many dreams, but spend the majority of our lives working 40 hour weeks to outcompete each other and make rent. Then when someone wins the internet lottery, they succumb to their ego and do a bunch of stuff in business to hoard even more wealth, rather than working to reform the system which keeps so many others down.

I would very much like to write a programming language, a blogging tool, a web framework, a game, a game development tool, etc etc etc. But I never will. Most of the people reading this never will either.

So I endure. I meditate, I practice stoicism, I go to the gym. But I had to let my dreams die for my own self-preservation because opportunity cost grew to consume my entire psyche. I simply can't think about all of the other things I could be doing while I have to work. And all I do is work.

There doesn't seem to be much help coming in terms of UBI or having N people split a 40 hour workweek or forming artist communes for makers. Sure, we hear the occasional stories. But it's like we're all waiting around for a billionaire to liberate us, rather than creating a scalable cooperative system to sustain all of us right now today.

What I'm saying is that until we look beyond the technical challenges and see that the problem is subjugation, we'll never get free of the lackluster tools, because we'll never have time to make better ones.

"but spend the majority of our lives working 40 hour weeks to outcompete each other and make rent"


I am so sick of the competition in most tech jobs / companies to climb the stupid ladder. Everyone just wants more more more and our society reinforces that. I _REALLY_ miss working on teams that didn't have that amount of competition causing mild dysfunction.

Just for some perspective, the 40-hour work week is a relatively new invention that only surfaced with the recent rise of the white-collar workforce. In my parents' day, the 40-hour work week was called "banker's hours", and this was used in a highly derogatory "sure must be nice" tone of voice. Many or most of us are probably descended from farmers who worked doing hard physical labor from sun-up to sun-down six days a week at a bare minimum, with the rare holiday being reserved for family gatherings and religious observances. No travel at all unless you could hire someone (or left your kids at home) to feed the pigs or whatever while you were gone.

I'll take 40 hours over that kind of life in a heartbeat, thankyouverymuch. Although I am still trying very hard to make it so that I don't have to do even that much for the rest of my life.

In general, I don't disagree that it's sad that we spend such a huge proportion of our precious, finite time on our planet working. Most work without enjoyment, because that's how you put food on the table.

On the other hand, the person who cracks this nut will enjoy unprecedented fame and their name in the history books for the rest of human civilization if they figure it out. (And no, it's not going to happen no matter how many 140-char political rants there are on the bird site. Very arguably, those are in fact the opposite of progress no matter which "side" you think you are on.)

And thousands of years ago it would have been feasible to live on 30 hours of work a week. I wouldn’t want to give up modernity for that, but it’s not as though a tremendous amount of labor was the original work week.

But how many hours does the person who lives in the RV down by the Aqua Park work? They don't need to pay for refuse dumping because they simply dump in the lake which now has Hepatitis fumes so it's not a good idea to jog around there any more. And they have huge piles of bikes.

Can I ask the rough scale of the company you work for?

Do you feel compelled to participate in that competition? If not, why does it bother you?

Fail to compete and you'll be left behind in terms of compensation, performance review rankings, and career growth, the last of which also limits your future job choices. It also puts you nearer the top of the list for layoffs when corporate finances get tight.

But those things shouldn’t bother you if you aren’t interested in the “more more more” society. It’s easy to get a stable programming job in the US that pays $80k and has very little lay-off risk.

You can’t have it both ways complaining about being left behind in comp while not participating in the grind that allows those comps to go higher.

Lower paid jobs can be worse. That they can’t afford to pay you more might mean they are doing something noble, but usually it is a wannabe growth startup being badly run and pushing down pressure on lower-waged staff to compensate.

But you have a point: at an 80k requirement you could easily freelance to earn that doing work for non-technical people. Help set up Shopify stores etc., and not need to constantly be on a learning grind.

I don’t think the wannabe growth startup is really typical. A typical lower (but not that low) paid tech job is more like maintaining software tooling that supports mechanical engineers' testing, or aids insurance appraisal, or replacing ladder logic industrial automation with something that has more IoT-type features (and still might be partially ladder logic). All of those probably pay 80k-100k to someone with a few years' experience. The downsides are dingy offices that you’ll have to show up in sometimes and, for the last one, potentially lots of travel to not-too-exciting places.

Participating in the grind however means tossing away millions of employee-hours on paper-pushing jobs that don't really accomplish much, either in terms of personal development or corporate benefit. This is being evidenced almost daily by wave after wave of layoffs for the relatively well-heeled tech industry.

You must live with a lot of fear. I am sorry.

Why? I'm in a good place, thanks to awareness of risks and preparedness. Save your pity for those who fail to heed my words.

I don't, but it bothers me because the people who win it get to decide what I work on and they are too often fundamentally incompetent.

Because for everyone to make what they want to make is, as of current year, a pie in the sky fantasy until scarcity is solved, and perhaps not even then.

There are 8 billion humans; the world doesn't need another hundred thousand programming languages or web frameworks or game development platforms, or even indie games or fiction or music or films, produced every year when the whim strikes someone's fancy. Even if we could, who can even consume that much content?

Being a cog in a machine at least means you're probably doing something of value to someone instead of writing another book that nobody reads.

As of now we have no better solution to coordination failure except to create the AGI that will save/doom us.

You make a good point. I've latched onto some goals (like making games) because I forget that they're a stepping stone to something bigger.

The bigger thing would have been the web framework, or the programming language. But even those aren't quite big enough.

If I'm being honest, the biggest stuff is curing disease, eliminating wealth inequality.. biblical stuff. Truly I want to be an inventor and work to solve the very hardest problems.

So it feels undignified to be fixing someone's computer when I could be helping researchers to like, cure death and stuff.

But you're right, that act of helping someone is the very essence of our humanity and shouldn't be written off as something inconsequential just because it's not glorious. There is glory to be found though, a different kind of glory in that covenant.

And I totally agree about AGI.

Edit: as far as scarcity goes, I think that economics correctly attempts to allocate resources between people with unlimited wants and needs. It just never got as far as the endgame we're in now, where the primary scarce resource is time. That's an artificial scarcity, so we might have a chance to turn things around if we optimize for that instead of material wealth.


I left the tech world to “cure disease” (working on a phage clinical trial, currently treating three patients, and they’re doing super well!)

With that said, I’m consumed with FOMO and opportunity cost every day. I’ve missed the boat on being founder-level or early designer/engineer on so many projects that have gone on to raise Series B it’s mind numbing. Also, we were paid $0 bootstrapping our stuff in the first five years, and now finally we get paid, but $60k a year.

I'm saying this as a warning to all devs and designers out there. Do NOT do this. If you can get into Google or FAANG or whatever, build wealth that way. Maybe you'll feel like a sell-out, or that you never scratched your itch or did something good for the world, but then you can also remember you can take a year off in Bali or whatever.

Personal opinion: I think scarcity is mostly a solved problem, in that most of us are working on projects that only exist to make someone else more money, not to fulfill any actual needs of those 8 billion humans.

There are exceptions of course.

But most of us could stop working tomorrow, and aside from the lack of income, the scarcity of resources available to us would not change.

The real problems that cause hunger/poverty in our world have more to do with logistics and politics than with a deficit of resources.

I cannot understand why you would think so, even in a first world nation.

Who would grow your food? Who would make the tools and fertilizer that enable the farmers to grow your food? The robotics and software that enable those tools and fertilizer to be produced? Who would organize, transport, package, preserve and process the produce so that you can access it?

Apply the same questions to healthcare, education, transport, construction, entertainment, etc.

Some jobs are superfluous and arise out of coordination failure, that's true. Lawyers, administrators, salespeople, and parts of finance and the government come to mind. But "most people could stop working" is an unreasonable assertion.

> Who would grow your food?

>> There are exceptions of course.

There are about 2M farmers and ranchers in the US. 22M health care professionals. 4M teachers. 2M truckers, 135k rail workers, 40k miners, etc.

Even at our highest employment levels, under half of the people in the US are employed. They're just not counted as unemployed because they're either children, in school, retired, or simply not seeking employment.

Yes, I do believe that most jobs are superfluous - that many more people could go without working, with no material impact on the availability of goods.

You forgot to count plumbers, and everyone in the construction industry.

Also the guy who fills the milk bottles. My brother used to work in a milk bottling plant, but he was not filling the bottles by hand. A machine filled the bottles, someone oversaw the machine, and my brother ran the chemical and biological tests to ensure the milk was safe to drink. And there were additional people they could call in case the machine broke (or to build a new machine).

You are forgetting to count a lot of people.

Even if every single position in the US held today were considered essential, there would still be more people in the US who are unemployed than employed.

And by no means is every job currently held today essential.

That doesn’t support your point though that we’re anywhere near post scarcity and can support people quitting. Many jobs that were classified as essential during the pandemic are things people don’t have oodles of joy doing (garbage pickup, power plant coal loader, etc).

You can cut out the entire “entertainment” or whatever industry you think isn’t essential, but we still need to convince the people working the essential jobs to do it while everyone else free-rides.

That farmer count is only as small as it is because of the massive society it is built on, which allows it to scale. You briefly scratched at this with miners, but you forgot all of the machinery and auto manufacturers, the massive energy industry that powers all of it, etc.

> Yes, I do believe that most jobs are superfluous

I think you just don’t understand the purpose of many jobs. If the accountant for the local grocery store didn’t exist, what do you think would happen?

If your society focuses primarily on money, bad things.

If your society focuses on supporting a population's needs, not much.

Even if we somehow "solved scarcity" and robots could automatically produce all the goods and services everyone needs for free, if we don't get past the primitive "value must be provided through work" mentality, there would be no benefit.

Stuck with this "work virtue" mindset, society would find a way to make it so half the people were paid to dig holes and the other half paid to fill them in, just so we could go on with this fixation on the need to work for a living.

Parent has a point from my experience - most products I worked on never really caught on in terms of being reasonably widely used. I don't think I've ever worked on anything I'd call truly essential.

From what I've seen, most programmers are indeed spending their time building experiments that have the primary goal of making money for the person that pays them - if they create true value that's more of a lucky side effect. And lots of those experiments fail.

If most of us - "humans", not the "HN subset" - stopped working, we'd be in deep trouble.

Who will maintain buildings, infrastructure, essential services, government, etc.?

I think we could work towards long term fewer hours and more people who don’t need to work though.

My thinking is this - less than half of the population in the US holds a traditional job. 150M or so. And a vast majority of those are not jobs related to "essential" work. They're largely part of the retail and administrative workforce.

And so, if a majority of less than half of the US population is working unnecessary jobs (i.e. only valuable to our economy)... how much of what we do is really valuable to us as a species?

Which is a roundabout way to say that scarcity is a mostly if not completely solved problem; working to survive is largely no longer necessary, just customary.

There's a big leap there though from "non-traditional" to "not necessary". A colleague of mine is working on a project involving ensuring that farmers can track their feed and illness statistics correctly to improve their yields. It's not a traditional job, but it's absolutely necessary to ensure that (for example) the dairy industry can provide enough milk. (And moreover, to reduce the number of people who need to be working on calculating those statistics by hand.)

Likewise, you're also assuming that jobs that meet needs are more essential than jobs that meet wants, but the point of post-scarcity is that not only our needs are met, but our wants are as well. Picard can make himself a cup of earl grey whenever he wants, as opposed to sustaining himself with the recommended nutri-pack. One of my most fulfilling jobs was working for a t-shirt printing company - in theory, nothing we did was at all essential, but it was clearly very important to a lot of people's lives. Some of the most fascinating orders were memorial t-shirts being printed for people's funerals. Not essential at all, but I would argue very valuable to us as a species.

> A colleague of mine is working on a project involving ensuring that farmers can track their feed and illness statistics correctly to improve their yields.

I doubt that's an example of a job/project that is just pure administrative burden. Even your example of a t-shirt printing company is more than just pure administrative burden - people need t-shirts, and people like their t-shirts to have art of some form. While art may be a luxury, it is also pleasant and required to some extent.

However, there are some roles that are pure administrative burden, and I tend to think the GP is correct that these roles are a lot more prevalent than we want to admit. I think the 80/20 rule could be applicable here: only 20% of the population produces 80% of the wealth (read: essential or luxury items), while the other 80% are stumbling over themselves in bureaucracy to maybe produce the remaining 20%.

>But it's like we're all waiting around for a billionaire to liberate us

Only in Silicon Valley, and that's probably a big part of the problem. In some of the highest paying, most creative occupations on the planet people are drooling at the feet of billionaires to save them while they slave away M-F.

Most people throughout history didn't look at the rich with anything but contempt. In Silicon Valley they're held up as geniuses on every subject. Look at how a subsection of the industry treated Elon Musk taking the axe to Twitter (politics aside).

You’re confusing “billionaires” with “a few select billionaires that built disruptive businesses from the ground up”. Nobody in Silicon Valley is “drooling at the feet” of the Walton children.

> Most people throughout history didn't look at the rich with anything but contempt.

Citation needed. I don't think wealth was the main cause of contempt.

> Then when someone wins the internet lottery, they succumb to their ego and do a bunch of stuff in business to hoard even more wealth, rather than working to reform the system which keeps so many others down.

It doesn't even have to come down to greed though.

If some meme stock speculation 1000x'd, my family could live off the gains in perpetuity and I could spend all my time working in a soup kitchen, doing good.

But if I instead figured out how to turn that into even more money, I could start a new soup kitchen and help even more people, doing even more good.

But if I instead figured out how to turn that into even more money, ...

Is this effective altruism?

It's altruism if GP stops the recursion at some point and actually uses the money to help people and do good.

It's effective altruism if they use the money in a smart way, that reasonably maximizes the good it does for others, instead of doing something stupid or just extremely suboptimal.

“Then when someone wins the internet lottery, they succumb to their ego and do a bunch of stuff in business to hoard even more wealth, rather than working to reform the system which keeps so many others down.”

Well said

I'd add Visual Basic to your "glue bits together" list. After working on C++ & FORTRAN professionally, Visual Basic was like a breath of fresh air. I remember thinking "how could programming be this simple". It felt like cheating.

Ofc, there's an evil undercurrent to the "any non-technical person can use it". We ended up having to support a tool that some "business analyst" had written. My eyes bled.

I remember how the last VisualBasic programmer left the first company I worked for.

On one hand, he was ignorant of everything we thought was mandatory for development: version control, code style, code structure. We, young C++/Python programmers, despised him.

But nothing we had (Qt, 2007-2008 web tech, etc.) made it possible to spit out a business UI in an hour.

To be honest, I still don't see anything comparable.

> To be honest, I still don't see anything comparable.

I think it's just around the corner for those types of applications to come back. And surprisingly, they will come hand in hand with note-taking outliners, not your run-of-the-mill no-code or low-code tools.

See for example Obsidian canvas, released this week. I see it as functionally similar to Visual Basic GUI editor, except simpler and easier to edit; and its layout is stored as JSON in a .canvas file. The open-source Logseq is adding similar visual layout templates together with data-recovery primitives.

They just need to add the core interactive data widgets (text field, text area, buttons) and expose the cards to a programming language in an end-user accessible way, and you'll have the power of a visual language, except web-integrated and with a really simple way to add data (the outliner editor).

[1] https://obsidian.md/canvas

.NET with WinForms could do that just fine in that timeframe (and today - it's still supported), and many VB programmers ended up there. Delphi was another popular contender that's still around.

One more thing that changed is the platform situation. Everyone is looking into building cross-platform tools, and MS tools do not really cover this front.

Thus the abundance of browser wrappers such as Electron. The HTML/JS stack is far from trivial, but it really is truly portable.

So the new visualbasic has to be different, and I don't think newer ms tools cover this niche.

What I loved about VB was how the basic UI was handled seamlessly and yet you could really dig in on optimizing user flow on top of it. It wasn’t pretty, but it was functional, and you could spend your time making sure it felt good.

I can still get there with modern tools, but the defaults / conventions in VB are simply superior to anything I’ve worked with since.

I feel like Python is starting to get a lot of the same bad rep because non-techies can learn it. It's actually a great language if you want to build stuff really fast. Not everything needs to crunch a billion operations per second and even if you want that, there are several good libraries for that.

Yep. Python has the potential to support some environment like Visual Basic. But there is no such project, as far as I know. All GUI frameworks are more complex, and there is no nice drag-and-drop GUI editor built into an easy Python IDE. At least I don't know of one. Maybe if you use PyCharm, select some framework, and find the right PyCharm plugins for it, you would have something similar, but still much more difficult to use, and there is no prebuilt, ready-to-use package for such a thing.

I have lost count of the number of 'I miss VB6/Flash/Delphi, they were so much more productive' threads on hn!

Yet in this community of all the people who could make it happen, we can't make it happen! What are we missing? Time? Money? Co-ordination?

I'm in the process of making something like VB6 (yazz.com), but which is using Javascript instead of Basic for scripting. I would say that if you look at retool, bubble and many other low code tools then you will see that they are doing the same

The problem is that the approach itself is kinda flawed. The moment you have to support high-DPI or translate the app to a different language (with text labels of different size), that whole trick of just WYSIWYG dragging and dropping widgets in a visual designer breaks down.

It isn't just the WYSIWYG element though, developing the UI elements in code is not that hard.

But imagine doing it in Tkinter and Python, probably the fewest steps to get a basic UI; there is so much to manage.

In VB you double-clicked a button in the designer and got a place to type code. All the stuff about event loops and not blocking the thread was dealt with for you. In Python, if you try the same thing, the app freezes continuously. Then you want to do simple everyday things like a date picker or a progress bar, and find you need to implement them yourself????

Then having built your app in Tkinter and Python, you want to share it with some colleagues...so where is the 'compile button'?

It is easier to implement a machine learning model in Python than it is to make it into a standalone app with a file importer and a 'go' button!
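To make the event-loop point concrete, here is a toy single-threaded event loop in Python (a sketch of the concept, not Tkinter's actual implementation): while a slow handler runs, nothing else in the queue — redraws included — gets a turn, which is exactly why a naive blocking loop in a button callback freezes the window.

```python
import time
from collections import deque

# Toy single-threaded event loop (a sketch, not Tkinter's real one).
# Handlers run to completion one at a time; a slow handler blocks
# everything queued behind it, including "redraws".
class Loop:
    def __init__(self):
        self.queue = deque()

    def post(self, fn):
        self.queue.append(fn)

    def run(self):
        while self.queue:
            self.queue.popleft()()  # run the next handler to completion

loop = Loop()
events = []

def slow_click():
    time.sleep(0.05)  # simulate blocking work inside a button handler
    events.append("click handled")

loop.post(lambda: events.append("redraw 1"))
loop.post(slow_click)
loop.post(lambda: events.append("redraw 2"))  # stuck until the click finishes
loop.run()

print(events)  # ['redraw 1', 'click handled', 'redraw 2']
```

In real Tkinter the fix is the same idea VB handled for you automatically: keep handlers short, or reschedule work in small slices with `root.after(...)` so the loop keeps turning.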

> translate the app

The way translation is implemented today is horrifying.

Likewise for high-DPI. Why are they still using some linear scaling of icons instead of a super-resolution GAN and some caching?

Why can't an OS translate a single-language app on the fly using state-of-the-art translation neural nets and using visual attention to capture context? It's almost 2023 ffs, and people are still using some gettext nonsense?

HiDPI isn't a problem but variable (dynamic) size is.

The problem is that on some platforms, text doesn't quite scale linearly as you increase the point size. Techniques like subpixel antialiasing with pixel-snapping produce shapes that are more readable on low-res screens, but it also causes distortion to letter shapes, and with multiple letters, depending on the exact text and its location, it can add up to amounts significant enough to cause overflow.
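A toy model of that accumulation, with made-up glyph advance widths (not taken from any real font): snapping each glyph's advance to a whole pixel, as pixel-snapped rasterizers effectively do, drifts away from the linearly scaled width by a fraction of a pixel per glyph.

```python
# Toy model of pixel-snapped text layout; the advance widths are
# invented for illustration, not measured from a real font.
def line_width(advances, scale, snap):
    x = 0.0
    for adv in advances:
        step = adv * scale
        # Snap each glyph's advance to the whole-pixel grid when asked.
        x += round(step) if snap else step
    return x

advances = [6.4] * 40  # 40 glyphs, each 6.4px wide at this size

exact = line_width(advances, 1.0, snap=False)   # ~256.0
snapped = line_width(advances, 1.0, snap=True)  # 240.0 (6.4 snaps to 6)

print(exact, snapped)
```

A 16-pixel drift over one 40-character line is easily enough to overflow (or underflow) a label sized for the linearly scaled width — the kind of distortion described above.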

It's a problem on older GUI platforms with HiDPI support, but a new WYSIWYG GUI platform should be able to implement it correctly.

It's a problem on any platform that wants more readable text on low resolution. The only platform I know of that chose precise text layout over contrast regardless of the resolution is macOS. On Windows, ClearType still behaves as I described above. On Linux, it depends on FreeType settings, which you don't know in advance, so you have to assume that it will happen.

For Delphi, there is Lazarus, which is absolutely great. However, it's somehow really niche? Maybe because Object Pascal is somewhat outdated now?

I don't really know Delphi/Lazarus well enough... but I suppose it is similar to the following... for VB there is WinForms and .NET, which as someone above pointed out is 'much more powerful', but somehow not what people want!

I spent a huge amount of time coding in VB5 and VB6 in the 1990s and early 2000s, during high school and my first years of college, on hobby projects. Definitely felt like the most fun language, and I was far more productive than in C/C++/Java/Pascal. With some work, areas that needed to be optimized could be pretty fast when compiled. Everything was integrated, with a good debugger.

I didn't get the hate. Maybe software engineering folks don't want coding to be fun and productive.

Yep I was going to say the same thing. I write small internal tools for corporates and my team in 2020 is a fraction of the productivity of my team in the 90s. Lots of tools, the complexity is nuts these days.

Totally agree. And as a result, it has become very hard to find young developers who understand GUI usability and standards.

The redpill in me says MSFT created dotnet to kill VB6. That tool made it too easy (and cheap) to create value through automation.

It's a generalized trend. Python is the new BASIC, but Python is SO much more complicated than BASIC. Web development was easy, now it's impossibly complicated, etc etc...

Rebol, Red, and Tcl/Tk are good and easy programming languages, but kinda niche and obscure. I have no idea why computing is getting more complicated instead of easier.

I only disagree on web development. It was never easy. The web was always a mess, and nothing really worked. Today it's a mess, but you can make it work if you know what you are doing.

But yes, it looks like the web was a trendsetter that pushed things into becoming harder without any good reason. Or maybe the web was just following the trend of the toolmakers all going out of business in the early 00s (thanks in large part to Microsoft).

Ironically - given the article we're commenting on - a lot of the early problems with the web were solved, and I hesitate to say it, solved well by Flash. Its insecurity is what really pushed us away from it (well, that and it being a closed-source Adobe project).

I miss the days when jQuery was the only JS library a person needed to know to do professional web dev. I stuck with web dev through the peak of Struts, but the constant changing of frameworks with (imo) minimal benefit to user experience pushed me to things besides web dev.

A lot of large companies did a tremendous amount of back office using VB6 applications. I remember visiting Disney and seeing it all over the place in the late 90s.

To be fair, WinForms is the spiritual successor to VB6, a lot more powerful, and just as productive.

There was nothing particularly easier about VB6 compared to the later VB.NET. VB (5/6) was, effectively, a scripting language designed around the COM object model. .NET originally started its life as "COM 2.0", with VB support consistently a part of the plan, and gradual feature creep turning it into what it became eventually.

> It blows my mind that no one has made a tool like that in so long. […] Why has the pinnacle of ease of use in game creation been achieved and never replicated since?

FWIW it never went away¹ and targets standards-based runtimes these days. I'm sure there are many kids every month having the same kind of epiphanies with it!

That being said, because Adobe Animate is marketed as a tool for professionals rather than kids/enthusiasts, orders of magnitude more kids are having the kind of experience you had with Minecraft and Roblox instead.

¹ https://www.adobe.com/products/animate.html

There is a small but major difference with Flash vs Minecraft and Roblox. Flash let you create something from scratch (or close to it). You aren't just modding a game, you are making a game or a cartoon. It was fantastic. Minecraft you are limited to the world of minecraft.

> There is a small but major difference with Flash vs Minecraft and Roblox. Flash let you create something from scratch (or close to it). […] It was fantastic.

Absolutely fantastic, yes! Roblox does allow you to make a game from scratch — even though the primitives are generally less primitive and the constraints much different — but my point is just that if you're wondering where the spirit of dj_mc_merlin's experiences went, two places where it can be found in spades are in the Roblox and Minecraft communities. (Source: Am a parent experiencing this vicariously via my kids.)

>Roblox does allow you to make a game from scratch

...with the caveat that once Roblox (the company) folds, all the games will be gone.

...and that since Roblox changes the code all the time, you have no idea whether your game will look the same in a year. Or even run.

So yes, while I understand what you mean by the spiritual successor.. Roblox takes everything that made Flash bad (i.e. dependence on a clunky binary that one day may not be supported), and makes it worse.

Source: worked in Roblox when a new shiny material pack was rolled out, which replaced existing materials in existing games, messing up the look of games where materials were used creatively.

> ...with the caveat that once Roblox (the company) folds, all the games will be gone.

True, and I guess Flash is a useful lesson in that sense, although no platforms I jammed with as a kid are around either. It makes me wonder if Godot apps written today will work on the web 20 years from now.

The number of commercial games that used (and still use, actually) Flash is much higher than many would believe. Jackbox is probably one that many here have played in the past few years. Some games used it only for UI, like Borderlands.

For context, most of these games use(d) a Flash derivative called Scaleform for their GUIs: https://en.wikipedia.org/wiki/Scaleform_GFx — it's discontinued now but was indeed hugely popular.

A lot of games today use an HTML-powered UI library from Coherent Labs which can still be authored using Adobe Animate (i.e. what the Flash editor turned into): https://coherent-labs.com/

World of Tanks used Flash-based tech for game UIs as well.

PS At least that was the case 9-10 years ago when I was involved.

Scaleform - it came with the BigWorld engine.

Did it? I remember it was something from Adobe; I suspected Adobe AIR.

Anyway, the client UI team definitely used ActionScript, which I had to read from time to time as a server-side dev.

It could also run in its own standalone environment you could redistribute. IIRC you could even export something like an EXE and bundle the flash player with the installer.

"FWIW it never went away¹ and targets standards-based runtimes these days. "

I just tried Adobe Animate recently and nothing worked out of the box or like flash used to.

I basically had to tinker, until even basic animations somewhat worked.

And I even have quite some experience with EaselJS, the target framework, but that didn't help much.

The power of flash was the ease of use - that is gone.

You might like Svija — you can easily animate SVGs using Illustrator. It's not as powerful yet as Flash, but that's our goal. Disclaimer, I made it ;-)

You need to work on your self-promotion a bit more - both here ( :) ) and on your website:


I must admit, this isn't the most inspiring landing page for someone like me, who could be your exact target demographic (long-time Illy user and someone who likes to tinker with new web tools). I don't really 'get it' and (this is entirely personal) the design looks quite dated.

I'll bookmark it, share it and keep an eye on it though - Good luck!

Gorgeous home page. What does Svija do?

I remember when Minecraft was released a bunch of kids had their first Java experience due to the modding scene. Funnily enough I didn't do that since by then I was a C enthusiast and had an irrational hatred towards Java (oh how the turntables..). I wonder if that's still possible now that Microsoft has changed the codebase IIRC?

There are 2 editions of Minecraft. One is the original, now called the Java Edition. They definitely still work on it; despite not running well on Java > 8 for a long time, it now does. But afaik, they're still stuck on LWJGL 2.

The newer one written in C++ was first called "pocket edition" and targeted mobile. It's now called "Bedrock edition" and is also the version that has RTX support.

It'll be hard to move the modding community over to the Bedrock edition. I personally don't have any knowledge how limited modding is in the latter one.

Definitely still possible, various modding frameworks like Forge or Spigot for Java Minecraft still thrive.

It's still possible for the java edition.

Some of the better years of my life a little over a decade ago were spent on mofunzone playing the stick figure fights with friends. I remember bullet time fighting (matrix clone with swords, guns, slow motion) and there was also an amazing action narrative series called Ray with Southpark themed characters and one of the craziest platformers I ever played called N. I managed to snag some of these a while back and preserve them offline and can still enjoy them using https://ruffle.rs/. We lost so many good things from the flash era.

N is one of my favorite platformers of all time. Happily, it's still around and doesn't require any Flash hacks! https://www.thewayoftheninja.org/n.html

Do you know if there's been any progress on preserving the older Macromedia games? I remember those starting to get flaky even in the Flash heyday.

Not to my knowledge, I only got introduced to flash ~2007, I've mostly been going by stuff I can recall and hunting for them on wayback machine.

Wow, hitting a bunch of memories there. Bullet time fighting was amazing, and I had a bunch of friends who were way into N ("way of the ninja") - built a couple of the most popular levels for it. Which also was pretty cool that they were able to do that, come to think of it. I remember one of them dabbling into creating games with Flash as well, and when he got stuck, I encouraged him to reach out to the creator of N to ask how he had solved that. (I think it was collisions with curved floors.) Got a nice message back that explained to him how to do it. Good times.

I really hope all these games are preserved somewhere. There are big movements to preserve old console and PC games, but Flash, and possibly to a bigger extent, mobile games seem more overlooked.

I think there's a tendency to look back at these games as trash, but it was a pretty wildly experimental time. Indie games as we know them now weren't really a thing yet.

Flashpoint has a nice UI and I managed to find many old favorites.


My time spent creating "stick death" videos in Flash as a kid is probably one of the main reasons I'm a software engineer today.

FYI N has some console and PC updated versions. One being N++ https://store.steampowered.com/app/230270/N_NPLUSPLUS/

https://scratch.mit.edu/ seems like a social network for kids making basic games and learning to code. They seem to encourage forking and learning from existing projects.

This brings so many memories. One could use decompilers to see how other games were made. It was fantastic for learning.

I am surprised Adobe did not manage to get Flash / Adobe Animate adapted to html5/js without people jumping ship.

Wick Editor is the closest modern thing to traditional Flash, i.e. Flash 8.

I've made a fully animated pixel platformer tech demo type thing in it, it's somewhat capable.

I took a similar path to you. Sometimes I feel the best thing my high school did for me was having Flash, VB6 and HyperStudio installed on the school computers.

Have you tried Processing? It gets close to ActionScript and canvas. We just need to put Processing on a time-framed canvas, with a Figma-like editing experience.


You should look into roblox. I think a lot of the kids who want to create are doing it there or in minecraft. There are really some pretty expansive games created there.

Had an awesome experience learning programming on AS also! Simple projects/games but great language to learn. Can involve so many key concepts.

I miss AS. I used it to build a gis engine that packed 5D into what superficially looked like 2D - item renderers helped make that happen. It is aggravating to no end to watch people pile on Flash for its faults - they either don't understand what was lost or just don't want to understand the loss. Steve Jobs is in this group.

Flash really was great. It's unfortunate that they killed it. The Miniclip and animated-shorts days were awesome. I used to love finding cool animations on DeviantArt.

I made a game / synthesizer combo in AS3 for college. Good times. I wouldn’t even know where or how to begin doing that today in JS

Try Roblox, I guess.

And to be honest Unity is really easy at the tutorial level. Shipping a real game is the hard part.

The Scratch set of tools still enables kids to do a lot of this stuff.

I was expecting "Attempt 3" to involve Bluemaxima's Flashpoint, and was surprised not to see it mentioned. Does anyone know why that wouldn't have been an option, or is this essentially what Attempt 1 would have involved?

For anyone not familiar with Flashpoint, it's a project to preserve old flash games and animations and keep them playable on modern platforms. It's open source and includes a huge library (including it looks like Hapland 1-3).


Edit: Reading a bit further down the article, it looks like they were able to make some big improvements by building their own engine like supporting wide screen and higher FPS so that sort of answers my question!

I still play games on Flashpoint. Hell, I speedrun and showcase games in Flashpoint during some regular events with my other friends playing ""real"" games. It's not the best UI, but you do have an easy search function to find content you might like.

Gotta be careful to avoid the porn games though. Flashpoint archived some.

Flashpoint is not a Flash emulator; it's a preservation effort that has archived tons of Flash content from the web.

Ruffle is the flash emulator https://ruffle.rs/

It's like Steam for Flash, Shockwave, and Java Applet games. I highly recommend it. Though I do wonder if it's going to survive the transition away from x86...

I still haven't seen anything beat Flash with its sweet 1-2 combo of vector drawing, animation and programming tool

Or maybe it's just me reminiscing

Flash could do 3D 20 years ago, before WebGL and three.js, and you didn't have to be a JavaScript (or ActionScript) expert to build 3D scenes.

Nothing really exists that beats Flash, partially because the switch to mobile killed Flash's momentum. It also led to the switch in Internet content from Flash animations/toons/games to Youtube/Patreon/Twitch videos. It was easier to make money with weekly videos than trying to get Adsense bucks by making a game every few months.

Flash was actually pretty great. The big problem with it was the "oh, that thing? it's still around?" attitude of Adobe, which led to massive security problems.

ActionScript was a pretty good language, certainly around the level of what JavaScript has become. It was poor-mouthed for the same reasons JavaScript was (is?), but now we use JavaScript to run 90% of the Web.

I remember back in the early 00s playing with the PHP Flash module. It was a really interesting environment that had a lot of promise. But, alas, it was doomed.

ActionScript 3 was actually pretty great. It was essentially a strongly typed JavaScript long before TypeScript came out.

I was always surprised that it didn't get more traction. Though TypeScript is leaps and bounds beyond where AS3 ever was.

It might not yet do everything, but we're having a go at a modern web-based animation tool with Construct Animate, currently in open beta: https://www.construct.net/en/blogs/construct-official-blog-1...

It can export videos, GIFs, image sequences, can use modern JavaScript coding, and has a surprisingly strong block-based visual alternative to coding.

I'm sorry this isn't super actionable, but in case it's a helpful data point, I tried to check this out and wasn't able to. To repro:

1. Clicked on your link

2. Moused over "Construct 3" in the header, clicked "Showcase."

3. Clicked on a game in the showcase at random ("Bunnicula in Rescuing Harold").

4. Page loaded with a graphic & a play button.

5. I tried clicking play & nothing happened.

6. Shortly after, Brave (Chrome) showed its "Page unresponsive" dialog.

Similar experience with the showcase "AsteroidX", albeit with no "unresponsive" dialog; it just wouldn't load. It looks like the root of that issue is that it requires third-party cookies to load, but modern browsers block those by default and the error isn't bubbled up to the user. https://www.construct.net/en/free-online-games/asteroidx-650...

If only web games were this smooth during my childhood :). Great stuff!

The Flash Builder was really good for complex vector manipulation. I don't know if there is anything as good for SVG/HTML.

> I don't know if there is anything as good for SVG/HTML.

There's not because the DOM isn't suited for complex animations. People manage to animate SVGs (see Greensock etc.), but it's a pain in the nether regions

One college near me still teaches 101 classes through Flash. When presented as an educational tool with disclaimers that the end result is not secure, it's a perfect product to teach with. You can take someone who does not even know how to turn on a computer and by the end of the class they understand what programming is, how to draw on a computer, vector vs bitmap, etc.

Scratch has some nice features: it has a sound editor, vector drawing, and primitive animation... but I agree, the Flash UX was the best.


I still use it to introduce students to coding. ActionScript is amazing for this.

Wow! Talk about a love letter at the end of 2022! The dev identified a gap and applied time and effort to solve it by building the thing himself.

Halfway through, the dev comments on the original devs' use of XML, since it is not efficient: "Hey, I'm not complaining, it makes my job easier."

Well sounds like the original developers of Flash made a good decision. If it makes it easier to parse the content at a later stage, I'm willing to call it a win!

Yup, this is classic Unix style! Having data-centric interoperability among tools also helps you port to new platforms more smoothly, as opposed to throwing away your code or rewriting from scratch.


1. Using text formats embedded in XML -- he took advantage of this when reverse engineering

2. Generating text ASM -- he said this aids debuggability, and allows using existing ASM tools that he didn't have to write

3. Generating C++ -- taking advantage of the type system, as mentioned, and a huge array of other tools (profilers and debuggers)

It's data-centric rather than code-centric. It's an interoperable and transparent architecture, not a monolithic one. (Which is not surprising, because Flash itself was made for the web.)

Related: The Internet Was Designed with a Narrow Waist https://www.oilshell.org/blog/2022/02/diagrams.html

A Sketch of the Biggest Idea in Software Architecture https://www.oilshell.org/blog/2022/03/backlog-arch.html

> The vintage Flash UI is great. Buttons have edges. Icons look like things. Space is well-used. It's amazing! Using old UIs makes me feel like an archaeologist discovering some sort of forgotten Roman technology. The lost art of UI design. It's neat.

This was one of my favourite parts of the article. And the real payoff comes right after it: there's a screenshot with like thirty icons, and they're different enough that you can tell one from the other.

I am still clicking on Gmail by accident because I think it's Maps. The icons are horrible.

For years I've had a "Google" folder on my phone which links to all the obligatory things - Drive, Maps, Calendar, Meet, etc.

Ever since they made this move to every icon being a minimal shape with the 4 Google colors, I haven't been able to quickly differentiate between app icons. I have to read the labels half the time now because they're so non-indicative of the app's purpose.

Classic HTML gmail https://support.google.com/mail/answer/15049?hl=en (final link on that page)

> The lost art of UI design.

This sounds like a great title for a really good book.

I find it somehow intriguing how you already pre-destined this imaginary book to be great… That stood out to me for some reason. How about a great title for a shitty book?

Funnily enough, The Lost Art Of UI Design sounds like a potentially very mediocre book to me. One of those stretched-out "self-teaching" design guides that listicles will recommend you. It chooses a couple of very specific rules of thumb, gives way too much importance to them, and somehow manages to water them down to 250 pages. Probably written either by someone who's a skilled writer but barely knows enough about the industry, or an experienced specialist in the field who unfortunately can't write to save their life.

Ah… soon the publishers will just have ChatGPT write it.

I get (and agree with) this idea in general, but these buttons are still present in all Adobe apps (esp. Photoshop) and they're not going anywhere.

Hell, I don't think they're original to Flash to begin with, more like borrowed from Photoshop when Adobe bought Macromedia (this is purely my speculation; but Photoshop is much older).

It's like the entire industry at one point decided to optimize UIs for the first 30 minutes of usage and not the years users spend in apps once they get familiar with them.

"oooh let's not put too many buttons to scare users away", "they got 22 inch screens, let's just waste space for no good reason, it looks modern and airy"

Yes. What we see in UI design today is a consequence of game theory at work. It's to optimize business. In fairness you don't want to scare users away. I think though many people equated it all to "clean looking", without realizing removing micro-interactions is a net negative....

I think touch screens are a bigger factor. You can't have small buttons in a touch-screen UI, because they must be targets for your finger. (OTOH, swipes in a touch UI are a lot easier than mouse drag-and-drop. So the optimal touch UX probably has lots of drag-activated pie menus. And confirmation flows for potentially destructive actions, to cope with the lower accuracy of touch interactions.)

Yeah, if my memory serves, a lot of UI design started to get more airy around the time smartphones became more popular. Even on desktops, which may just be because large buttons and text became trendy; although some may deny it, I think a lot of UI design is just following trends. It takes a good designer to not just blindly copy things because they look cool.

Large buttons and text were trendy already in the 20th century. There's only so small you can go on an 800x600 14" screen.

The crucial difference between then and now is the amount of whitespace.

The problem is these "designers" bring those touch-optimised UIs to desktop. And so you end up with hamburger menus on 4K screens, tall narrow alert dialogs on macOS, etc.

Yes, and that is one of the reasons why your touchscreen UI must be radically different from your mouse and keyboard UI.

One of the early attempts to balance those two goals was an option in the settings to switch between Beginner/Intermediate/Advanced mode. Never really see that anymore.

That's the discipline of data-driven business. If 20 people use your app and despise it and are depressed by it, that's better than 19 people using it and loving it. No sentimentality allowed.

How could anyone possibly measure that an app is despised and depresses people (or vice versa)?

Usage can't be it, because I use a ton of software that I despise and get depressed by, but only because I get paid to do so, or the app is so niche there isn't a viable replacement.

> How could anyone possibly measure that an app is despised and depresses people (or vice versa)?

You'd have to actually talk to your users. But that requires genuine human investment, so it doesn't scale. It doesn't come in the form of a third-party SaaS with a pretty dashboard, generous free plan, and integration via single <script> tag.

In short: they could measure this, they just don't bother to.

They don’t. They just measure that the more “modern” design gets 5% more users, so clearly it must be better.

UI is now optimized for PowerPoint. The #1 goal of all UI work is so the designer and/or product manager can get an "oh, that looks nice!" showing off screenshots to an interviewer or to someone higher-up who'll help them get a promotion. Trendy (you don't want to look outdated, do you?) and pretty are what matter, not usability.

I seriously think that's what's going on.

Spot on. And I would add "let's make every GUI either web based or mobile based just to further annoy PC users".

Working with Flash was like working with future technologies, back in the day. You could build amazing things with it, things that were not possible using standard browser APIs. In fact, Flash led the way and was the prototype for what browsers can do today.

If you cared about what you built, then with clever hacks and bitwise performance tricks the Flash runtime could run your code efficiently. I remember developing an app which used Box2D, camera-based gesture control with sound effects and background music, all running simultaneously, reaching 60 FPS. In other projects we used software-based 3D (à la Papervision3D); Adobe dropped the ball and Molehill/GPU-accelerated 2D/3D arrived too late. Perhaps it's not common knowledge, but we could develop true cross-platform apps, compiling for different targets (SWF, IPA for iOS and .app/.exe).

AS3 was a good language, and definitely reminds me of TS. Here’s a piece of code from 15 years ago: https://github.com/PureMVC/puremvc-as3-standard-framework/bl...

That letter from Steve Jobs destroyed it all, many talented developers left the Flash world at that time. It was a bit depressing to see all the (unjustified) mainstream hate for the Flash platform that started to appear at that time, which felt bad as there were many of us that put a lot of time, care and effort in creating amazing stuff with it. Thanks Flash!

> That letter from Steve Jobs destroyed it all, many talented developers left the Flash world at that time. It was a bit depressing to see all the (unjustified) mainstream hate for the Flash platform that started to appear at that time,

You might not have seen it, but the hate for Flash was well-established by then, solely due to Adobe's gross mismanagement of their platform. Flash was what crashed and took out your entire browser session. Flash was what made drive-by compromises so easy and common. Flash was what popped up annoying "you should manually update" prompts because Adobe sat out the automatic update trend for nearly two decades. Flash was also what made your entire computer visibly slow down because it was so inefficient - it wasn't an option on phones due to the UI differences, but performance and memory alone would have ruled it out in any case.

Flash was also painful to develop for. Yes, you could do things which browsers couldn't do back then, but it came at the cost of having to pay Adobe thousands of dollars for shoddy, unsupported tools[1], and while there were a handful of ways in which AS3 was ahead of JavaScript, there were many areas where it was behind, and unlike browsers it stayed there. It was really common to find bugs in Adobe's libraries, which meant you had to build a complete replacement or wait years (or forever) for them to fix it.

People made cool things despite that but I attribute almost all of the blame for Flash’s demise to Adobe. They could have cut their executive bonus pool by 10% to hire a QA team and they’d have had a much better chance at long term success. Instead, they let the platform stagnate to the point where one of the early Chrome selling points was that when Flash crashed you’d only lose that page!

1. I bought a Flash license once. I naively filed bug reports for reproducible crashes in the player and IDE (the debugger was especially unstable), as well as compatibility and correctness issues with their various libraries. Each time, I got no reply before the next major release and then I got a generic “please pay $1,500 for the upgrade and let us know if we fixed it” message.

At the time it was unfortunately true both that Apple needed idle CPUs for battery life, and that Adobe had no intention, and probably no ability, to make that happen with Flash.

No way! This is the creator of Hapland... As a kid playing Flash games all day on websites like Addicting Games, Armor Games, etc., you don't really think much about the people behind these projects. Then time goes on and they cement themselves in your mind as a myth.

The fact that I now get to read and understand articles by this legend, and work in the same industry vertical as him... what a privilege!

Thanks for your work. Hugely inspirational to my career.

Hah, I'm surprised more people haven't played the Hapland games. I was more wowed by foon putting out an article.

I still believe that Flash conceptually was a brilliant idea done terribly. It was slow, buggy, underdeveloped, and a security nightmare.

But the idea of a single file that could bundle code, audio, and graphics, that was dead easy to build, and that could run independent of specific browsers and operating systems?

That is something magical. It is harder to build that kind of content today, and near impossible to deploy it at that scale, than it was 20 years ago.

Theoretically you could run a Flash file on a Linux system via the Konqueror browser and it would be identical to running it in Internet Explorer on Windows 98. This is something we have lost.

It is a shame that Adobe/Macromedia treated it like absolute trash. A free/open solution that had the same good properties would have been brilliant but it never happened. Instead we have the whole HTML5+ stack. I mean it is cool and has done some amazing things for the web, but it is also a massive pain to manage. One browser update and boom, something has broken - time to get digging on what happened! Now it works on Chrome but not Firefox!? Oh dear.

I don't miss Flash itself, I mean to be charitable, it was a mediocre product at best. But that vision implemented in a sensible fashion could have been wonderful. Could have allowed for the decentralized web to continue on a bit longer than it has.

> I still believe that Flash conceptually was a brilliant idea done terribly. It was slow, buggy, underdeveloped, and a security nightmare.

Relative to what though? There was nothing else with its ubiquity, ease of use and deployment, and breadth of functionality.

It pioneered and enabled so many things that we take for granted on the web today.

Personally, as someone who used it extensively, I don't feel it was particularly buggy, especially relative to the alternatives or developing in the browser. It could have performance issues, if developers did bad things or pushed it too hard, but you could say that for just about anything.

As far as security, it was the single most ubiquitous runtime on the web, and thus was a prime target for hackers. Yes, it had security issues (although every web runtime did), but it's not like the rest of the web / browsers were particularly secure themselves.

FYI: I worked for Macromedia and work for Adobe now, but these are my personal opinions.

It was very slow and buggy on Android, at least.

I see you work for the Dark side. j/k;)

Maybe I am just remembering it differently, as this was more than a decade ago. But it seemed like almost every week, heck almost every day, another zero day would be discovered. Yes, this was the peak time of zero-day exploits, as every system was being hammered from all sides, but the Flash name came up far more often than anything else I recall, except for maybe general Windows XP/7 issues.

As for performance, that was a major issue personally. I recall a time when I had nested items a few layers deep so that I could have a character with movable joints and the system was buckling under this workload. Going from fully rigged models running smooth as butter on 3DS Max to the same system crumbling under some basic vector graphics left a bad taste for me. It just always felt like it was dragging its feet compared with what these machines could achieve.

I don't recall any specific bugs, only the feeling of them. Sorry I am so light on specifics. One minor bug I do recall: as you rotated objects, there must have been a small rounding issue, because items would slowly shrink as you rotated them multiple times while going about your process. It would take hundreds of such movements before it became noticeable; it was a funny little thing, but it would lead to some projects becoming a bit frustrating.
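The shrink-on-repeated-rotation behaviour is a textbook case of cumulative floating-point error. Here's a hypothetical reconstruction in Python (not Flash's actual code), alongside the usual fix of rebuilding the transform from a stored angle each frame:

```python
import math

# Hypothetical illustration of the shrink-on-repeated-rotation bug described
# above (not Flash's actual code): composing many small incremental rotations
# lets rounding error accumulate, so a point's distance from the origin
# slowly drifts away from 1.0.
def rotate_incrementally(steps, angle_deg=1.0):
    c = math.cos(math.radians(angle_deg))
    s = math.sin(math.radians(angle_deg))
    x, y = 1.0, 0.0
    for _ in range(steps):
        x, y = x * c - y * s, x * s + y * c
    return math.hypot(x, y)  # exactly 1.0 in exact arithmetic

# The usual fix: store the object's total angle and rebuild the transform
# from it each frame instead of composing incremental rotations.
def rotate_from_angle(steps, angle_deg=1.0):
    a = math.radians(angle_deg) * steps
    return math.hypot(math.cos(a), math.sin(a))

drift = abs(rotate_incrementally(1_000_000) - 1.0)
print(f"radius drift after 1M incremental rotations: {drift:.2e}")
```

With 64-bit floats the per-step drift is tiny; Flash stored coordinates in much coarser units (twips), which would plausibly make the effect visible after only hundreds of movements, as described.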

I say this as someone who absolutely loved Flash. I wish it had kept on going regardless of those issues. It just felt like it needed a Windows Vista to Windows 7 moment, where it was taken to task and a team got behind it to refine the project into something much leaner that still had the same functionality. Instead it felt like it was treated as that thing that would get barely any attention until it died from atrophy. That mobile was never taken seriously is probably the thing that ended up killing it so quickly.

As someone else said here, Flash felt like a vision of the future. Where computers were tools of creation made easy not just consumption.

It was a pioneer in so many things and I wish it was still a part of the web. It made interactive media distribution so dang easy! Build it in Flash, add a single line of HTML and BOOM - multi-platform audio visual experience that could scale like a champ!

Yeah, video kind of sucked; early YouTube performance is a good example of that, as it would bring my G4 Mini to its knees! But for everything else it was such a visionary idea that we have lost.

On a tangent.

This is something that I feel a lot of modern technology has forgotten. That vision of not just new tech, but tech that is enabling even to those that have no interest in the base technology. Flash had that vision, not just of creation but of the ease with which it could be spread. HyperCard had that vision. Heck, BASIC had that vision. As Steve Jobs always pushed in the early '80s, the computer is a bicycle for the mind. Nowadays at times it can feel like the sedation of the mind.

Every time I see here on HN another link about Ruby, GCC, Clang, some JS framework, Tensor... something - I get it, in those fields it is all very cool and useful. But it is also very much for those that work in the weeds. Great for those that do that, but it also feels so lacking in that "Wow!" factor that I think brought a lot of us into this field. I hope we rediscover that one day, and soon.

About a decade ago I worked with someone who was on the early Flash team.

One of the tradeoffs of the time was that they had to work very hard to both crunch the binary size down for Flash itself, as well as Flash files, because people everywhere were extremely bandwidth-limited - when dialup was the most common method for connecting to the internet and not even 56k was necessarily ubiquitous at that. And of course, compiler technology hadn't come as far as it has nowadays.

The weight of all the decisions that made Flash able to succeed eventually worked against it, but for its time, it was an extremely good implementation that made the web as a whole more interactive.

I'm happy it's been superseded, but I think that it was the right product for its time.

I think what Flash did well was its neat component-based framework. It was easy to understand, easy to modify, and easy to build your own set of controls on top of. The runtime engine sucked. It was too slow and therefore power-hungry. But the framework itself was much better than today's Android, React, and others.

That performance was so heavily tied to screen resolution was funny to see. It could crunch a reasonable amount of vector information, but the rasterisation would burn through processor cycles.

There were some projects when I was running on a G3 iMac where I would drop the render screen down really small to see how it would perform on a more powerful machine. It felt like if they had managed to get decent GPU acceleration working, they could have solved a lot of these issues.

I'm not offering an assessment of the overall sentiment, but it's been a very long time since I've had to think twice about browser interoperability, and it would be an exceedingly rare occurrence for a browser update to cause me any problem.

That is fair. Credit to the Chrome and Firefox teams that this has become less of an issue over the last few years.

As opposed to Tcl, which was a terrible idea done brilliantly!

Flash really was that easy for so many people. Why isn’t there a true replacement?

It is a funny thing. This was during the big push for everything Smart phones, apps and viable streaming changing tastes and the domination of social media. That combo simply knocked it off its pedestal in no time.

The technical legacy of Flash made it a poor fit for touch screens. Apps filled that void with astounding speed.

This part is just a vague opinion I am riffing on - with mass-available video, an ocean of video content may have changed tastes enough that the demand for interactive content diminished somewhat. Social media also killed off the dedicated website for a lot of places that used this technology, and thus the demand for this stuff.

Once Google via Chrome closed the Flash hole, I do wonder if they will ever let anyone else try to get in on that space. Time will tell.

I mean what’s missing here is a 2D game maker / interactive app maker that anyone can use and trivially deploy to the web, no?

Same question. Why would you bury (kill) a thing without a true replacement?

It's true.

Unity kind of took its place in the game space.

Web pages - nothing. They're all using some boring non visual JS library.

I feel like WebAssembly fits that description, in both the good and bad ways.

Flash is great and tons of devs still use Flash. I also use Flash in my custom game engine (the one powering The End is Nigh and the upcoming Mewgenics), with a different approach to what the dev in this article took (seeing as these are new games and not ports of existing Flash games): my approach is to load SWF files as the resource files for art and animation in my game, and render them as vectors. Flash puts a lot of information in those files, and I parse just enough ActionScript bytecode that I can hook and trigger C++ functions from my ActionScript parser. This lets us use Flash almost exactly the way we used it for making Flash games back in the mid 2000s, with all of the interesting workflow tricks and hacks that made it so nice, while being able to write all the actual gameplay code in a real language and render it with OpenGL so it's actually fast.

Would love to see a blog post about this! With Flash now discontinued, it's interesting to see how it's being kept alive.

Surprising there is no mention of the wasm/rust-implemented flash player https://ruffle.rs/ in the article.

I recently used Ruffle [2] to get some Flash applications [0] working in the Pro version of my web browser [1], which is specifically designed to be remotely accessible and embeddable in an iframe. To run Ruffle on pages that require it, I utilize the Chrome Remote Debugging Protocol [3], similar to how a Chrome extension content script operates. Ruffle itself relies on WebAssembly and runs smoothly. It's been exciting to see the audio and video functionality of these old games restored and being able to play them again.


[0]: https://github.com/ruffle-rs/ruffle/wiki/Test-SWFs

[1]: https://github.com/crisdosyago/BrowserBox#bb-pro-vs-regular-...

[2]: https://github.com/ruffle-rs/ruffle

[3]: https://chromedevtools.github.io/devtools-protocol/tot/
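For reference, the shape of the CDP command involved looks roughly like this. The sketch only builds the JSON payload; the websocket plumbing to the browser is omitted. Page.addScriptToEvaluateOnNewDocument is a real CDP method, but the loader snippet and ids here are illustrative, not BrowserBox's actual code:

```python
import json

# Sketch of how one might ask Chrome (over the DevTools protocol) to run a
# loader script in every new document, content-script style. Only the message
# construction is shown; sending it over the DevTools websocket is omitted.
def make_inject_command(command_id, script_source):
    return json.dumps({
        "id": command_id,  # CDP correlates responses to requests by this id
        "method": "Page.addScriptToEvaluateOnNewDocument",
        "params": {"source": script_source},
    })

# Illustrative loader that would pull in a self-hosted Ruffle build.
ruffle_loader = (
    'var s = document.createElement("script");'
    's.src = "/ruffle/ruffle.js";'
    'document.head.appendChild(s);'
)
print(make_inject_command(1, ruffle_loader))
```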

"oldest ones, including the original Haplands, are AS2 Flash, which runs pretty well thanks to Ruffle."

They did mention it. They use it for some of their older games, but it seems like Ruffle isn't yet feature-complete with ActionScript 3 and thus cannot be used to run their newer games.

While Ruffle's AS3 support is still lacking a lot of features, since a couple months ago it's been able to play some simple games that require it. The build used on author's site is from 2021, and I just checked that the latest build is able to play several more AS3 games hosted there.

(note: I'm a Ruffle dev)

That's great news! I've been meaning to write a cron script or something to fetch the latest Ruffle every week or so, so thanks for reminding me to do that. Thanks also for your work on Ruffle, it's really great.

I'm waiting for it to develop further, and I'm very excited, especially because I still play Crystal Saga every day. I don't know why, but it's got something that I haven't been able to find in any other games.

But sadly, I haven't been able to make Crystal Saga work using Ruffle

MMOs are pretty much the last games Ruffle or any in-browser emulator will get to support - not just because they're likely some of the most complex pieces of code you can find, but also because MMOs are likely to use sockets, which AFAIK can't really be accessed in modern browsers at all.

It sounds like adding those features to Ruffle would have been three orders of magnitude easier than writing a Flash player from scratch.

I understand contributing to OSS is a pain (I often end up with half-implemented features in my own branch and never manage to merge them upstream), but he could have saved himself some trouble.

Ruffle plays compiled Flash SWF files.

The tool that the author made reads Flash FLA files and exports data from them.

SWF files contain binary data. The ActionScript code has been compiled into bytecode. The vector data is probably also represented in a binary format.

FLA files, as pointed out in the OP blog post, contain XML data.

It sounds like OP is doing something quite different from what Ruffle does.

Ruffle tries to be a player for SWF files.

OP wanted to export data from the FLA XML to other formats, so that he could build executable games.
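Worth noting for anyone who wants to poke at this: CS5-and-later .fla files are the XFL format in a ZIP wrapper, with the document description in DOMDocument.xml at the archive root. A minimal sketch of getting at that XML (the toy archive built here is a stand-in; a real .fla contains many more files and a much richer schema):

```python
import io
import xml.etree.ElementTree as ET
import zipfile

def read_fla_manifest(fla_bytes):
    """CS5+ .fla files are ZIP archives (XFL format); the document/timeline
    description lives in DOMDocument.xml at the archive root."""
    with zipfile.ZipFile(io.BytesIO(fla_bytes)) as zf:
        xml_text = zf.read("DOMDocument.xml").decode("utf-8")
    return ET.fromstring(xml_text)

# Toy stand-in archive, just to make the sketch runnable end to end.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("DOMDocument.xml", "<DOMDocument><timelines/></DOMDocument>")

root = read_fla_manifest(buf.getvalue())
print(root.tag)  # DOMDocument
```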

It seems to me like a very different skillset required to reimplement a runtime like the SWF player vs hacking together an alternative FLA compiler that's just good enough to work on your own games.

Ruffle still doesn't support ActionScript 3.0; I suspect the task is not that straightforward.
