[dupe] Emulating a PlayStation 1 (PSX) Entirely with C# and .NET (hanselman.com)
60 points by LyalinDotCom 28 days ago | 35 comments

> We often hear Full Stack in the context of a complete distributed web application, but for many the stack goes down to the metal. This emulator literally boots up from the real BIOS of a Playstation and emulates the MIPS R3000A, a BUS to connect components, the CPU, the CD-ROM, and display.

As I say probably too often, the modern usage of "full stack" basically equates to "iceberg tip".

"Full stack developer" is short for "Full stack web developer". Outside of web development it's rarely used.

For other areas, we usually talk about "Systems programming" or "Application programming". I suppose the full stack there would be to do both?

It's just that outside web programming, very few developers/programmers I know would refer to themselves as being able to program the "full stack", presumably meaning everything from the ISA level up to database programming, GUI programming, network programming, and so on, although there are plenty of people who can do all of these.

Even the omission of the word "web", as you have pointed out, has a whiff of arrogance, as if web programming is all that's worth working on these days. But the web programming world, IMHO, is full of this kind of CV-inflating jargon, lingo and buzzwords, whether it's "full stack", "Agile", "devops", etc.

Obviously, there is an element of old-man "get off my lawn" in what I'm saying. I am almost certainly projecting here, but it still rubs me the wrong way.

And I hate the term too. I’ve seen too many “full stack” Microsoft developers who couldn’t troubleshoot the simplest database or IIS issues, let alone figure out why their web app wasn’t communicating with a back-end dependency.

Even if they don’t have the authorization to fix it, they should at least be able to tell the ops folks what they need.

>As I say probably too often, the modern usage of "full stack" basically equates to "iceberg tip".

I'd never heard it prior to five or ten years ago.

It's always been a pretty ambiguous term IMO. I think it may have been coined to make some genuine distinction, but just how broadly was it intended to apply? The question is moot because it quickly spread to mean very different things. You need a lot of context to make a good guess at what it refers to.

I've only ever heard the term relating to web or mobile applications programming, and more often than not, where the application is cloud hosted. See the following for an example:


Important bits almost always left out: provisioning of hypervisors, network switching/trunking/routing, security, embedded on the backend, etc.

Some copies of my CV say "full stack developer from the transistors upwards", which is partly a joke and partly a reference to actually having written everything from web applications to IC design.

(Should I link my professional life to my HN account in my bio? Is that the "done thing"?)

You can do - but then I found myself being discussed and doxed on a forum once, over a very uninteresting, uncontroversial comment I'd made. So it's up to you...

My background in Electrical Engineering has really helped me as a developer. Too often we ignore the computer in computer science. If you are unfamiliar with how you get from silicon to transistors, to logic circuits, to CPUs, it's worth reading up on it.

I've had the same experience. Understanding how the thing you are programming actually works can really help, at least in certain domains.

Of course, that's silly. Good C# can run at near-native speed given all the work happening in the runtime/JITter, etc.

The keyword being "near". It's like all the "Java is fast" articles out there --- in theory, you can write microbenchmarks that show your language is fast, but in practice, everyone knows what it's really like.

Of course, if you have a powerful enough CPU and enough RAM, you could emulate any system at a decent speed in any language[1], but I'm willing to bet that if you ran this C# emulator on the best hardware from ~2000 (which is roughly when the first PSX emulators appeared) and compared it to the (native) ones that were around at that time, you'd see a huge difference in performance. I remember a 1GHz Pentium III with 256MB of RAM would've been sufficient for those.

[1] The same "fast enough" justification being used to praise writing applications in JavaScript and bundling them with an entire web browser: https://en.wikipedia.org/wiki/Wirth%27s_law

So long as you write fast C#, C# is pretty fast. But you need to write C# that effectively compiles (JITs) to the same assembly as a C or C++ program would. That mostly means staying within a subset of C# that looks a lot like C.

It's really hard to do because you can't easily see when you pay for an abstraction. (Is this foreach slower than a for loop? Does it allocate an enumerator? Can it unroll? Will this call allocate a heap array for its params? Will this call be devirtualized?)
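Two of those pitfalls (enumerator allocation and params arrays) can be made concrete with a small sketch. This is hypothetical example code, not from the emulator, and exact JIT behavior varies by runtime version:

```csharp
using System;
using System.Collections.Generic;

public class AllocationPitfalls
{
    // foreach over a concrete List<T> uses List<T>.Enumerator, a struct:
    // it stays on the stack, so no heap allocation.
    public static int SumList(List<int> xs)
    {
        int sum = 0;
        foreach (int x in xs) sum += x;
        return sum;
    }

    // The same loop through the interface boxes the enumerator:
    // an IEnumerator<int> object is allocated on the heap per call.
    public static int SumEnumerable(IEnumerable<int> xs)
    {
        int sum = 0;
        foreach (int x in xs) sum += x;
        return sum;
    }

    // Every call to a params method allocates a fresh array
    // unless the caller passes an existing one.
    public static int Max(params int[] xs)
    {
        int max = int.MinValue;
        foreach (int x in xs) if (x > max) max = x;
        return max;
    }

    public static void Main()
    {
        var xs = new List<int> { 1, 2, 3 };
        Console.WriteLine(SumList(xs));        // 6, no allocation in the loop
        Console.WriteLine(SumEnumerable(xs));  // 6, hidden enumerator allocation
        Console.WriteLine(Max(1, 2, 3));       // 3, hidden int[3] allocation
    }
}
```

All three calls compute the same kind of thing, but only the first is allocation-free, and nothing at the call site tells you that.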

The "Burst compiler" in Unity takes it a step further: it restricts the language to those things that are actually always fast. https://docs.unity3d.com/Packages/com.unity.burst@0.2/manual...

I really wish the C# team could do something similar: add a fast AOT subset to C#.

I would have thought the problem with using C# for a game wouldn’t be so much the execution speed (which should be pretty close to native) as the pauses required for garbage collection.

Yes, that's one of the problems. So you want zero garbage collection, which means zero garbage creation, which in turn means not allocating on the heap in the game loop. That's rather easy to avoid doing explicitly (just don't use "new" on class types, for example; use pooling and structs). But C# makes it tricky, because a lot of the BCL types allocate under the hood. Sometimes even the compiler inserts allocations through syntactic sugar (parameter arrays for varargs). These are pitfalls that the Burst subset tries to solve.
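A hypothetical sketch of the pooling-and-structs approach (none of this is from the emulator's code): all memory is allocated once up front, and the per-frame code only reuses slots in place.

```csharp
using System;

// Value type: lives inline in the array below, never as its own heap object.
public struct Particle
{
    public float X, Y, VelX, VelY;
    public bool Alive;
}

// A fixed-size pool allocated once at startup; the game loop only reuses slots.
public class ParticlePool
{
    private readonly Particle[] _particles;

    public ParticlePool(int capacity) => _particles = new Particle[capacity];

    public int AliveCount
    {
        get { int c = 0; foreach (var p in _particles) if (p.Alive) c++; return c; }
    }

    public void Spawn(float x, float y, float vx, float vy)
    {
        for (int i = 0; i < _particles.Length; i++)
        {
            if (!_particles[i].Alive)
            {
                // "new" on a struct writes into the array slot in place: no heap allocation.
                _particles[i] = new Particle { X = x, Y = y, VelX = vx, VelY = vy, Alive = true };
                return;
            }
        }
        // Pool exhausted: drop the particle rather than allocate more memory.
    }

    public void Update(float dt)
    {
        for (int i = 0; i < _particles.Length; i++)
        {
            if (!_particles[i].Alive) continue;
            _particles[i].X += _particles[i].VelX * dt;
            _particles[i].Y += _particles[i].VelY * dt;
        }
    }

    public static void Main()
    {
        var pool = new ParticlePool(1024); // one allocation, outside the game loop
        pool.Spawn(0, 0, 1, 0);
        pool.Update(1f / 60f);
        Console.WriteLine(pool.AliveCount); // 1
    }
}
```

After startup, Spawn and Update produce zero garbage, so the collector has nothing to pause for.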

Unity is C# and there are many games which are fast. If you don't want GC to slow you down - just don't add garbage.

You just don't create garbage. Object pools and stack variables.

We'll see. Unity promised the world with IL2CPP as well.

You'd be surprised. Many people underestimate Java/C# performance. Also, for Java in games, there aren't that many points of reference, because the only popular Java game I am aware of is Minecraft.

However, C# has better potential for performance than Java right now, because it has value types and supports pointers in unsafe mode, which are very useful for high performance code. Java is catching up in this area, with Project Valhalla and Project Panama it should get there at some point too.

Years ago I saw an essay where someone took some Java code and translated it to C#, then performed a bunch of optimizations using value types and stack-based structs. It was about 10 times faster than the Java version, because it was no longer hitting the heap.
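The kind of transformation described might look like this (a hypothetical illustration, not the essay's actual code): replacing an array of heap-allocated objects with an array of structs.

```csharp
using System;

public class PointClass { public double X, Y; }   // reference type: one heap object per point
public struct PointStruct { public double X, Y; } // value type: stored inline in the array

public class ValueTypeDemo
{
    const int N = 1_000_000;

    public static double SumClasses()
    {
        var pts = new PointClass[N];
        for (int i = 0; i < N; i++)
            pts[i] = new PointClass { X = i, Y = i }; // N separate heap allocations
        double sum = 0;
        foreach (var p in pts) sum += p.X + p.Y;      // pointer chase per element
        return sum;
    }

    public static double SumStructs()
    {
        var pts = new PointStruct[N];                 // one allocation for the whole array
        for (int i = 0; i < N; i++) { pts[i].X = i; pts[i].Y = i; }
        double sum = 0;
        for (int i = 0; i < N; i++) sum += pts[i].X + pts[i].Y; // contiguous, cache-friendly
        return sum;
    }

    public static void Main()
    {
        // Same result, very different heap and cache behavior.
        Console.WriteLine(SumClasses() == SumStructs()); // True
    }
}
```

The struct version does one allocation instead of a million, and the data is laid out contiguously, which is where most of the speedup in such rewrites tends to come from.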

There is a Minecraft clone running on Mono. I have yet to test it though.


I think your core point is somewhat valuable, but does it really matter whether any modern software could run on such outdated hardware? Why would I care if my software couldn’t run on a 20-year-old machine? The new Raspberry Pi 4s have better specs than the machine you gave as an example.

Also, sometimes “fast enough” is just that - fast enough. It is not necessarily true in every case, but sometimes it is.

Yes it does. People are much more likely to run your software on a battery powered device these days. This can have a massive impact on battery life.

In my experience, for nontrivial (but hot) code, Java’s performance is within half an order of magnitude of native C++. Sure, calling it “near-native” is a bit of a stretch outside of certain specific cases, but in the general sense Java is not slow, and “Java is fast” is becoming more and more true with time. FWIW, your comparison with obsolete hardware isn’t really relevant: the language and runtime at the time were much poorer than they are today, and there’s no way we’d be able to run today’s C# on those.

> ... but in practice, everyone knows what it's really like.

Well, at least we know how a NIC driver written in C# performs. I was pleasantly surprised that it was very performant.

Not that you would want to replace C/Rust for that task, but anyway, it is on par with Go for this particular task: https://github.com/ixy-languages/ixy-languages

You could run this emulator against other "native emulators" (I understand your meaning but surely you see the humor there) and compare and contrast the CPU/memory usage today. I expect you'll be surprised - basing this on my experience developing rendering engines in C# and C++, though I cannot speak for Java. In any case you don't need a time machine to test your hypothesis.

> 1GHz Pentium III with 256MB of RAM would've been sufficient for those

I remember running games on Bleem (yes, I actually bought Bleem) on significantly less. I know it ran on my family's 166MHz Pentium with 64MB of RAM, though probably not at full speed. I am fairly confident it ran at full speed on my 200-300MHz P2 laptop though.

The main difference is that those old emulators JITed MIPS machine code to x86. This project uses an interpreter.
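For contrast, an interpreter's inner loop looks roughly like this. This is a toy sketch, not the project's actual code: a real R3000 core decodes dozens of opcodes and handles delay slots, coprocessors, and exceptions. The per-instruction fetch/decode/dispatch overhead shown here is exactly what a JIT pays once instead of on every execution.

```csharp
using System;

// A toy fetch-decode-execute loop in the style of a MIPS R3000 interpreter.
public class TinyMipsInterpreter
{
    private readonly uint[] _regs = new uint[32];
    private readonly uint[] _mem;
    private uint _pc;

    public TinyMipsInterpreter(uint[] program) { _mem = program; }

    public uint Reg(int i) => _regs[i];

    public void Run(int maxInstructions)
    {
        for (int n = 0; n < maxInstructions && _pc / 4 < _mem.Length; n++)
        {
            uint instr = _mem[_pc / 4];          // fetch
            _pc += 4;

            uint op  = instr >> 26;              // decode the fixed bit fields
            uint rs  = (instr >> 21) & 0x1F;
            uint rt  = (instr >> 16) & 0x1F;
            uint imm = instr & 0xFFFF;

            switch (op)                          // dispatch: cost paid per instruction
            {
                case 0x09: // ADDIU rt, rs, imm (immediate is sign-extended)
                    _regs[rt] = _regs[rs] + (uint)(short)imm;
                    break;
                case 0x0D: // ORI rt, rs, imm (immediate is zero-extended)
                    _regs[rt] = _regs[rs] | imm;
                    break;
                default:
                    throw new InvalidOperationException($"unhandled opcode 0x{op:X2}");
            }
            _regs[0] = 0;                        // $zero is hardwired to 0
        }
    }

    public static void Main()
    {
        // ADDIU $1, $0, 5  then  ADDIU $2, $1, 7
        var prog = new uint[] { 0x24010005, 0x24220007 };
        var cpu = new TinyMipsInterpreter(prog);
        cpu.Run(2);
        Console.WriteLine(cpu.Reg(2)); // 12
    }
}
```

A JIT instead translates each block of MIPS code to host machine code once, so the decode and switch disappear from the hot path entirely.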

Have you tried it, or do you follow the proof-by-assumption paradigm?

Hoping SH upstreams a PR for the shift to .NET Core... Looks like it's using WinForms... Would be interesting to see the .NET Core port, and then a shift from WinForms to Eto.Forms or something else cross-platform.

Wow, thanks for the pointer to Eto.Forms - cross-platform Windows Forms including Linux, OSX, and iOS (but not yet Android) is pretty awesome to hear about.

Yeah, there's actually quite a few different cross platform projects that support basic to more complex forms. As a fan of Material Design and "The UI formerly known as Metro" I wish more of them would adopt one or the other as a baseline though.

I just am most familiar with Eto off the top of my head... mostly because of PabloDraw.

Maybe this should be a link to the actual project rather than someone else's article telling us that this project exists (even though I really do enjoy reading Hanselman's blog)?

It was posted last week already: https://news.ycombinator.com/item?id=20917742

Okay, then this post really does feel like a dupe, since it isn't a link to the original author's work.
