Xenia - An Xbox 360 emulator (github.com)
159 points by devbug 1375 days ago | 73 comments



Author here. Yeah, the readme is meant to scare off people. Even with it I still get people on IRC or email asking me if they can play game X or where they can get a copy. The note about no downloads is for all those who download 360 emulators from shady YouTube links and such - there's a surprising number of fakes packed with malware and viruses floating around.

Status of the project is that it's coming along well after a long period of inactivity. Many games get to their title screens, though they don't draw much yet. I'm really hoping AMD gets its shit together and releases Mantle soon, as that will make emulating the GPU-related things significantly easier.


How many C&Ds have you gotten from Microsoft's well-funded lawyers yet?

Hopefully you are not in the US, but they are pretty well embedded in every country at this point.


Totally off-topic, but I think that's a pretty cool name. I've been looking to marine life for name inspiration myself.


I was just looking at the readme and it says R9 cards and up are required for Mantle. Mantle only requires a GCN card, and HD 77xx-79xx cards work fine.


A bit off-topic, but why do you use .jp for the domain?


All the other TLDs (xenia.com, etc) are squatted/taken. And .jp is cool :)


Interestingly, Xenia is also a Daihatsu car brand, at least in Indonesia. http://www.daihatsu.co.id/products/highlight/xenia


Maybe upload some screenshots, or videos?


That's a great project, and I'm going to dive into the source pretty soon! However, I'm not sure I really like the tone used in the README.md... :/


As others have said, the word "emulator" draws a distinct crowd that is otherwise (almost) completely non-technical. Having frequented a lot of developer forums for both, I'd say it's analogous to many of the sorts of threads you get about Android development on XDA.


The amount of traffic this is going to get is probably a bit nuts, so the tone is pretty well deserved, imo.

If your feelings are hurt by someone not wanting to get hammered with idiotic requests/e-mails about a very clear alpha 'project', you should maybe reconsider your ISP payments.


Agreed, a note warning people not to trust anything claiming to be a finished/working version of this has its place. But I can't help but wonder how those people end up there anyway. Linkbait? First lure a lot of people in, hope some of them will help, then be rude to the rest and tell them to get lost? Hmm, I don't know...


It sounds more like a Linus-style approach of being productive rather than caring about being PC. Worrying about the feelings of Joe Schmoes who will just blindly expect a perfect product is the last thing I would want for a hobby. He seems to know his stuff, so I think talent will attract talent.


Makes sense... can you imagine how many 12-year-olds are gonna find this as a result of googling "Xbox 360 emulator" and send him poorly spelled emails about how to install it on their parents' iPad?


That's rather neat, even though it can't run the games.

I can understand the tone in the README.md because a lot of 'gamers', even the technically oriented ones, expect a lot from the word 'emulator'.


Yeah, a lot of people would be submitting issues that they can't run a game without those disclaimers up front.


Sadly, they probably still do even with the disclaimer. That's just my experience from dealing with an audience similar to emulators' (mobile apps) and making a disclaimer the top line of your summary.


The "I'm doing this and I don't give a damn about you" attitude I think has at least some merit. If the lead programmer is good, its better if he/she can say no. If there is a correlation between being an asshole and being successful I haven't seen it.


Well put.


I agree. I can't stand the tone. OSDev.org does a good job of managing people's expectations without insulting anyone. It's possible to be clear and concise without resorting to abusive language.


The developer made some interesting posts about how the emulator will work on his blog a couple years back:

http://www.noxa.org/blog/


Great link, thanks.


One thing I've wondered, and this is somewhat unrelated, is why the Xbox One doesn't run 360 games. One of the core features of its OS is the fact that everything runs in a modified Hyper-V environment. So why couldn't MS have emulated the 360 on the hypervisor?! It will be cool to see this project progress and hopefully run on an Xbox One dev kit.


Unlike most of the replies to your question, I don't think the big problem for Microsoft was emulating PowerPC on an x86 platform. The real problem with emulating both the PS3 and Xbox 360 is that they were the first consoles with incredibly complex GPUs. The GameCube and PS2 already have chips that are very hard to emulate, and you need a high-end PC to do it. The Xbox 360 GPU is much more complex than what's in the GameCube - it's two generations ahead, and those are two very complex generations of GPU functionality you have to emulate on other hardware. The Xbox 360 GPU can be seen as the very first chip in the generation of GPGPUs to which today's GPUs still belong. It really was a beast of a machine in 2005.

Different GPUs work very differently internally, and the 360 GPU is probably vastly different internally from what's in the Xbox One, despite both being made by the same vendor. The amount of GPU-specific hardware optimization, the enormous EDRAM bandwidth, and the other things that need to be reproduced for emulation to work is staggering. And that ignores the performance characteristics of the Xbox 360 CPU - it outperforms even the highest-end Haswells today in very specific scenarios (and Microsoft has no hope of emulating that on the weak Xbone CPU). Props to the OP for starting such a daunting project.

If you look at Dolphin's system requirements, the Xbox One doesn't quite have the power to even emulate a GameCube - a console that, like the Xbox 360, has an IBM PowerPC CPU and an ATI/AMD GPU, but from 2001 instead of 2005. It could never hope to emulate an Xbox 360. It makes me wonder what kind of regular PC hardware will eventually be needed to run something like Halo 4.


> The real problem with emulating both the PS3 and Xbox 360 is that they were the first consoles with an incredibly complex GPU

I kind of agree and disagree...

I agree with the idea that the GPU is a big problem, but I don't think implementing hardware features like EDRAM, memexport, half-float textures, etc. is especially problematic compared to the general case of the unified memory architecture...

Most of those unique features, like the EDRAM and memexport, were relevant to hardware-specific optimisations at the time - modern GPUs can provide equivalent functionality by ignoring the performance characteristics and relying on raw horsepower. EDRAM is a very good example of something you just don't need an equivalent for - you can emulate all of that functionality with regular VRAM just fine.

Memexport and similar functionality relying on the unified memory architecture are a little more tricky. The real problem, I'd imagine, would come from the resulting 'tricks' used for performance and flexibility when feeding the GPU from the CPU side - e.g. being able to memcpy into a vertex buffer or a blob of shader parameters instead of going through the DX-like interface, which was a genuine and useful optimisation. Although IIRC MS put some limits on this by failing your cert if you did anything outside their approved list of workarounds for DirectX performance issues... something which is theoretically very easy to check for in many cases.

I can't really see how to work around that without creating a quite complicated and expensive layer around the memory emulation. For the other features (including memexport) workarounds are possible, if slow. Memexport is only 'easy' because it is an optimisation provided to do something you could already do, just in a much faster way... (and something which is now 'standard' since SM4).

Also, as an aside, I think it's worth remembering that whilst the PPC CPUs had lots of advantages for fast execution, memory read/write performance was abysmal compared to contemporaneous Intel PC CPUs, and many features like branch prediction (always-taken, IIRC) and out-of-order execution were also quite far behind the Intel counterparts of the time. Today that gap is even larger - although it is certainly true that other things have become slower, those were never the bottleneck in my experience... it was almost always memory and the poor size and performance of the cache.


The 360 ran a PowerPC instruction set, while the original Xbox and Xbox One run the x86 architecture. Games would have to be rewritten to run on the Xbox One.

I'd assume this makes it difficult.


You can convert instruction sets on the fly; the technology has been around for a while. One big commercial failure that partly relied on this was Transmeta and their Code Morphing Software: http://en.wikipedia.org/wiki/Code_Morphing_Software

Here is a more general article on the topic:

http://en.wikipedia.org/wiki/Binary_translation


That's the whole point of a VM, right? OS X could run PowerPC apps with Rosetta when Apple went Intel. I just assumed we could do the same, since the Xbox One has a beastly processor compared to the aged 360 one.


PPC OS X apps didn't need any low-level system emulation; they could only talk to hardware via the kernel, and they could only talk to the kernel via system libraries. So at kernel boundaries you could simply translate the PPC system call to x86 (see the sketch at the end of this comment).

Whereas basically all consoles expose low-level system stuff to the game, such as memory-mapped IO, low-level GPU commands, TLBs, DMA, etc. None of that can be emulated without a large amount of overhead.

Then again, at least the 6th and especially the 7th generation consoles have such complex timings that games can't rely on exact cycle counts anyway, so emulating the clocks of the different parts generally isn't needed. And it's possible that the Xbox 360 and PS3 require games to go through their kernel for a lot of hardware access; I don't know.
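A rough sketch of that kernel-boundary idea (every name, syscall number, and register convention here is hypothetical, invented for illustration - this is not Rosetta's or any real ABI):

    /* Hypothetical sketch of translating guest system calls at the kernel
     * boundary: user-space guest code runs translated/interpreted, and only
     * when it issues a syscall does the emulator map it onto a host call. */
    #include <stdint.h>
    #include <stdio.h>

    typedef struct {
        uint32_t gpr[32];              /* emulated PPC general-purpose registers */
    } GuestCpu;

    /* Assumed guest convention: syscall number in r0, arguments in r3..r5. */
    enum { GUEST_SYS_WRITE = 4 };

    static void handle_guest_syscall(GuestCpu *cpu, uint8_t *guest_mem) {
        switch (cpu->gpr[0]) {
        case GUEST_SYS_WRITE: {
            uint32_t fd  = cpu->gpr[3];
            uint32_t buf = cpu->gpr[4];    /* address in emulated guest memory */
            uint32_t len = cpu->gpr[5];
            /* Translate to a host call; the buffer lives in emulated memory. */
            cpu->gpr[3] = (uint32_t)fwrite(guest_mem + buf, 1, len,
                                           fd == 1 ? stdout : stderr);
            break;
        }
        default:
            cpu->gpr[3] = (uint32_t)-1;    /* unknown syscall */
            break;
        }
    }

The point is that the emulator only has to understand the (relatively narrow) system-call interface, not the hardware underneath it - which is exactly what consoles don't give you.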


Rosetta was based on QuickTransit technology: http://en.wikipedia.org/wiki/QuickTransit


Most VMs aren't full architecture emulation. Instead they use traps to intercept some instructions, while the rest are passed through as native instructions. Breaking out of this virtualized environment could potentially pose a security risk, and it was possible to detect that you were running virtualized because the timing of these intercepted calls wasn't as exact as on actual hardware.

Intel and AMD added instructions that reduced the chance that information from one VM could be leaked to another VM. This is the foundation of hypervisors, which are low-level systems designed specifically to give guests a managed interface to the hardware, but importantly these are still virtualized VMs, meaning the virtual machine's architecture is the same as the hardware's. (A little hand-wavey, but correct enough for the sake of discussion.)

Transmeta designed cores that weren't strictly x86, for example, but the technology is more like RISC vs. CISC. By transforming the CISC instructions into equivalent RISC instructions on the fly, the underlying processor can be RISC. This is already true of (almost?) every modern CPU; they execute microcode. Transmeta was one of the first to do this. I'm not sure, but Transmeta may have performed instruction reordering in their pipeline at the microcode level whereas others did this at the opcode level. I'm not aware of any instance where they used this to simultaneously provide multiple architectures on the same silicon, although at a glance it seems plausible. It would have been very expensive to build multiple ISAs into the same core, especially when the demand for such technology is nonexistent. By scrapping the transistors that would have been used to support multiple ISAs, you can use that space for better pipelines, SIMD, or more cores, or simply increase the yield, conserve power, and/or make the processor more efficient.

Any of those options would be better, so I don't believe any of these mythical multi-ISA processors exist.

The bottom line is that for the Xbox One to support Xbox 360 code, they would have to emulate everything and there simply aren't enough CPU cycles to make that happen.

Since I'm on a roll: the biggest disappointment was that the Xbox 360 didn't emulate the PlayStation. Now obviously the Xbox 360 is made by Microsoft and the PS is made by Sony, but the idea isn't so extreme. A company called Connectix [1] created a PS emulator for the Mac. The Mac was using the same ISA as the PS, so the emulator only had to emulate the BIOS and peripherals. Sony took them to court and lost. The pivotal piece was that Microsoft bought Connectix, and a part of that company lives on in the Virtual PC virtualization software now made by Microsoft. Sony apparently bought the PS emulator and killed it, but imagine if that had gone to Microsoft instead. The Xbox 360 uses the same ISA, so in theory it could have also run a 360 version of VGS. Gamers who didn't have a PS2 might have been able to play their PS games on Microsoft hardware. Microsoft would have gotten hardware sales and Sony would have received money for game licenses.

For this generation, Microsoft would have done well for itself by acquiring OnLive or building out its own server-side gaming system, as Sony has done by purchasing Gaikai. This would have given the Xbox One the ability to play Xbox 360 games over a remote-desktop type of link. I think if the public backlash against the online offerings hadn't been so boisterous, we might have seen a service like that at launch instead of the watered-down version they scrambled to produce.

The key future-proofing component of the Xbox One is the ability to run parts of the game in the cloud. This is why the slower CPU cores of the Xbox One shouldn't be seen as limiting. Games can be written to push complex calculations to a server farm while the local cores handle more pedestrian chores. Extending that idea further, we may see Xbox 360 emulation yet. The Xbox One is poised to win the battle this generation if these long-term strategies are given time to mature and be fully realized. The PS4 has some short-term appeal, but the gap between Microsoft and Sony isn't as wide as the gap between those two companies and Nintendo.

[1] http://en.m.wikipedia.org/wiki/Connectix_Virtual_Game_Statio...


Nope - Connectix VGS did emulate the CPU. The PSX and PS2 both used MIPS CPUs, vs. PowerPC in the Mac. It very much was the exact same kind of thing as PCSX or its derivatives.

The only platform where a PSX emulator might have gotten away without full dynarec/interpretation would have been the PSP, and that's unlikely to actually exist for various reasons.


You mean like the POPS emulator that the PSP used to run PlayStation 1 games? (To correct the common mistake: the PSX was a completely different Japanese console, which only had the PlayStation as one of its parts.)


PSX was a codename for the PlayStation. They decided to reuse the name for their failed entertainment center, but the name predates it by almost a decade.


I was unaware of that. I thought they were both PowerPC designs.


Right now nothing is using the cloud for game processing, and the only big game pushing features other than save-game backup is Titanfall, which is getting dedicated servers for each game. There's an interesting discussion here about what Microsoft can do with cloud rendering:

http://arstechnica.com/civis/viewtopic.php?f=22&t=1208703


> Most VMs aren't full architecture emulation.

But many are. E.g. QEMU can emulate PPC on x86.


That is the whole point of an emulator, which is a type of VM. Rosetta was an emulator; products like VMware are not -- they rely on processor features to execute real code in a sandbox.


Binary recompilation is hard, but it's been done (the 360 did this for the original Xbox, which was x86).

The things that kill you are graphics and sound: particularly texture formats (which you don't have the CPU horsepower to convert) and audio (the 360 has a ton of hardware voices, which are difficult to emulate in software).

Personally, I don't think it's impossible. But it'd take the right people a couple of years to make it actually work.


As I understand it, the Xbox One has a 360 API translation layer. There may be some nonportable assembly code around--there almost certainly is--but (speculative) most use of assembly I've seen in the real world tends to be developed alongside a "slow" C/C++ path which may not be so slow on the Xbox One.

It wouldn't be as simple as a pure recompilation, but it's a conceivable amount of work. It'd probably be more work than it's worth for many titles, but I'm surprised at least the big games don't have support.


It's not very hard, actually - you can map the PPC instruction set to the x86 one with a bit of framework around it. More trivially, you can write a C program that performs the same functionality as the original CPU. What this means is that you don't have to rewrite the game at all - you run the original compiled code, just not on the original hardware.

I did this many years ago, when I learned about the existence of the x86 instruction set, as a stepping stone towards understanding/making interpreters, virtual machines and compilers. I recommend anyone do it as a learning exercise.

This stuff is incredibly simple at its core, but there is a common misconception that because it is 'low level' it is somehow hard or complicated...

Once I knew it was just simple instructions, registers and a few flags coupled with a memory model, it was obvious how to do it... You write C functions for the various flavours of ADD, SUB, LEA, MOV, FSTP, ADDPS, etc. By iterating through the stream of bytes and interpreting them the same way the CPU would (this is always described in the CPU manual, in my experience), you can call the right ones in the right sequence. You use some appropriate blob of memory for your registers, flags and other CPU state, and some big array of bytes for your emulated memory...

This is what an emulator is at the simplest level: an interpreter for CPU instructions. (Of course, implementing the instructions may require more than that - e.g. emulating memory, the BIOS, and so on...)
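As a rough illustration of that fetch-decode-execute loop, here is a toy interpreter (the three-byte instruction format and opcodes are invented for brevity - this is not PPC, x86, or anything Xenia does):

    /* Toy interpreter sketch: a made-up 3-byte instruction format
     * (opcode, destination register, source register or immediate),
     * just to show the fetch-decode-execute loop described above. */
    #include <stdint.h>
    #include <string.h>

    enum { OP_HALT = 0, OP_LOADI = 1, OP_ADD = 2, OP_SUB = 3 };

    typedef struct {
        uint32_t reg[8];     /* emulated registers */
        uint32_t pc;         /* program counter into emulated memory */
        uint8_t  mem[256];   /* emulated memory holding the guest code */
    } Cpu;

    static void run(Cpu *cpu) {
        for (;;) {
            /* Fetch */
            uint8_t op = cpu->mem[cpu->pc];
            uint8_t a  = cpu->mem[cpu->pc + 1];
            uint8_t b  = cpu->mem[cpu->pc + 2];
            cpu->pc += 3;
            /* Decode and execute */
            switch (op) {
            case OP_LOADI: cpu->reg[a] = b;            break;
            case OP_ADD:   cpu->reg[a] += cpu->reg[b]; break;
            case OP_SUB:   cpu->reg[a] -= cpu->reg[b]; break;
            case OP_HALT:  return;
            default:       return;   /* unknown opcode: stop */
            }
        }
    }

    int main(void) {
        Cpu cpu = {0};
        /* r0 = 40; r1 = 2; r0 += r1; halt */
        uint8_t program[] = { OP_LOADI, 0, 40,  OP_LOADI, 1, 2,
                              OP_ADD, 0, 1,     OP_HALT, 0, 0 };
        memcpy(cpu.mem, program, sizeof program);
        run(&cpu);
        return cpu.reg[0] == 42 ? 0 : 1;
    }

A real emulator replaces the toy ISA with the target CPU's encodings (and adds MMIO, a BIOS/kernel, timing, and so on), but the core loop really is this simple.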


You'll see a lot of technical answers, all of which are valid; but the overriding reason is that if you're playing your old 360 games, you're not buying new Xbone games. Microsoft takes a loss on every console sold; if you're not regularly purchasing new games at retail prices, they're not interested in serving you.


I don't think Microsoft and Sony think that way. Look at the extraordinary lengths Microsoft and Sony went to in the previous generation to provide back compat -- Microsoft wrote an amazing Xbox emulator, and Sony built a whole expensive mini version of the PS2 into early models of the PS3.

And this generation it seems that Sony is trying to provide PS3 emulation using their cloud gaming system.

Some of the benefits of back compat to platform holders:

1) Encourages customers to upgrade sooner and to choose the compatible platform.

2) Increases the roster of games during the first few years of the new console's life.

3) Extends sales of the previous generation hardware and software. (Software because it can be played on both old and new hardware, and hardware because there's more new software available.)


The PS* cloud gaming system I would imagine has very little to do with the PS4 and everything to do with the PS Vita, televisions, phones, tablets. All of which are desperately in need of something to differentiate them from their competitors.


>And this generation it seems that Sony is trying to provide PS3 emulation using their cloud gaming system.

In other words, it's a way to get you to pay for games you already own but can't play on your new console.


The Xbox 360 used a PowerPC instruction set. It seems unlikely that emulation could be efficient enough to make it work reliably and fast.


It isn't necessary to interpret every instruction one at a time; you can cache translated code segments like a JIT compiler does. It has been done before for fast x86 emulation on a VLIW chipset: http://en.wikipedia.org/wiki/Code_Morphing_Software

More details: http://en.wikipedia.org/wiki/Binary_translation#Dynamic_bina...
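To make the caching idea concrete, here is a hedged sketch (the three-byte toy ISA, the block/cache layout, and all the names are invented for illustration; a real dynamic binary translator emits host machine code rather than the pre-decoded structs used here, but the cache-lookup structure is the same idea):

    /* Sketch of a translation cache: each guest basic block is decoded
     * ("translated") once and the result is reused on later executions. */
    #include <stdint.h>
    #include <string.h>

    enum { OP_ADDI = 0, OP_JNZ = 1, OP_HALT = 2 };   /* toy guest ISA, 3 bytes/insn */

    typedef struct { uint8_t op, reg; int8_t imm; } DecodedOp;

    typedef struct {
        uint32_t  guest_pc;      /* guest address this block starts at */
        DecodedOp ops[16];       /* pre-decoded ("translated") instructions */
        int       count;
        int       valid;
    } Block;

    typedef struct {
        int32_t  reg[4];
        uint32_t pc;
        uint8_t  mem[64];        /* emulated guest memory holding the program */
        Block    cache[8];       /* translation cache, indexed by guest pc */
    } Cpu;

    /* "Translate" (pre-decode) one basic block starting at pc; a block ends at
     * the first branch (JNZ) or HALT. A real translator emits host code here. */
    static void translate_block(Cpu *c, Block *b, uint32_t pc) {
        b->guest_pc = pc; b->count = 0; b->valid = 1;
        for (;;) {
            DecodedOp *o = &b->ops[b->count++];
            o->op = c->mem[pc]; o->reg = c->mem[pc + 1]; o->imm = (int8_t)c->mem[pc + 2];
            pc += 3;
            if (o->op == OP_JNZ || o->op == OP_HALT) return;
        }
    }

    /* Execute a cached block, updating the guest pc; returns 0 on HALT. */
    static int run_block(Cpu *c, const Block *b) {
        for (int i = 0; i < b->count; i++) {
            const DecodedOp *o = &b->ops[i];
            if (o->op == OP_ADDI) {
                c->reg[o->reg] += o->imm;
            } else if (o->op == OP_JNZ) {
                c->pc = c->reg[o->reg] ? (uint32_t)o->imm
                                       : b->guest_pc + 3u * (uint32_t)b->count;
                return 1;
            } else {
                return 0;                 /* OP_HALT */
            }
        }
        return 1;
    }

    int main(void) {
        Cpu c = {0};
        /* Guest program: r0 = 3; loop { r1 += 2; r0 += -1; jnz r0 -> 3 }; halt */
        uint8_t prog[] = { OP_ADDI,0,3,  OP_ADDI,1,2,  OP_ADDI,0,0xFF, /* 0xFF == -1 */
                           OP_JNZ,0,3,   OP_HALT,0,0 };
        memcpy(c.mem, prog, sizeof prog);
        for (;;) {
            Block *b = &c.cache[(c.pc / 3) % 8];
            if (!b->valid || b->guest_pc != c.pc)
                translate_block(&c, b, c.pc);   /* miss: decode this block once */
            if (!run_block(&c, b)) break;       /* hit: run without re-decoding */
        }
        return c.reg[1] == 6 ? 0 : 1;           /* the loop body ran 3 times */
    }

The win is that a hot loop pays the decode/translate cost only once; every later iteration hits the cache and just executes.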


It's not just about translating instructions. Console games are extremely optimized and fine-tuned for the processor they run on. A block of 40 PowerPC instructions may take 40 clock cycles to run on a PowerPC processor, but 120 clock cycles to run on an x86 processor once they're blindly emulated. Programmers (and compilers) will change the type of instructions they use to best fit the situation and the instruction set they're targeting.

Then there's the problem that PowerPC is big-endian and x86 is little-endian, so you potentially add extra processing for network and file-system code (models, textures, sounds, etc.) as well, in addition to any magic numbers[0] that may be used in the codebase (see the sketch after this comment).

While it is possible to emulate, the performance would be abysmal. Just take a look at Game Boy emulators for the PC. They use massive amounts of CPU due to the overhead needed to emulate the processor, graphics, sound, etc. Trying to play a game like Call of Duty or Grand Theft Auto emulated from PPC to x86 would be sluggish at best.

[0]http://en.wikipedia.org/wiki/Magic_number_(programming)
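To make the endianness point concrete, here is a minimal sketch of what a big-endian guest (PPC) costs on a little-endian host (x86): every multi-byte guest load and store has to be byte-swapped (real emulators fold this into generated code or use the host's byte-swap instructions; the helper names here are just for illustration):

    /* Byte-swapping helpers for emulating big-endian guest memory accesses.
     * Assumes the host is little-endian. */
    #include <stdint.h>
    #include <string.h>

    static inline uint32_t bswap32(uint32_t v) {
        return (v >> 24) | ((v >> 8) & 0x0000FF00u) |
               ((v << 8) & 0x00FF0000u) | (v << 24);
    }

    /* Load a 32-bit big-endian value from emulated guest memory. */
    static inline uint32_t guest_load32(const uint8_t *guest_mem, uint32_t addr) {
        uint32_t v;
        memcpy(&v, guest_mem + addr, sizeof v);   /* raw bytes in guest order... */
        return bswap32(v);                        /* ...swapped to host order */
    }

    /* Store a 32-bit value into emulated guest memory in big-endian order. */
    static inline void guest_store32(uint8_t *guest_mem, uint32_t addr, uint32_t v) {
        v = bswap32(v);
        memcpy(guest_mem + addr, &v, sizeof v);
    }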


I'm surprised the 360 used big-endian mode (almost all PPC implementations are bi-endian, after all!), given that the earlier Windows NT PPC release was one of the few to use little-endian for the OS (note that one could change endianness on a per-thread basis! — anyone know if anyone did this on the 360?), and using big-endian makes it inconsistent with everything else in Windows.


I don't know how it is with modern console games, but at least in the old days the developers were relying on all kinds of tricks that made really accurate emulators difficult to write. See for example the story about the "perfect SNES emulator" [1].

[1] http://arstechnica.com/gaming/2011/08/accuracy-takes-power-o...


In reality you could emulate it. The problem is the Xbone doesn't have a powerful enough x86 processor to emulate the PowerPC 360. It was around 2010 that mainstream computers matured enough to easily emulate a PS2, for instance.


> It was around 2010 that mainstream computers matured enough to easily emulate a ps2, for instance.

The PS2 is an incredibly weird architecture, though. Lots of different processors requiring really strict synchronization in weird ways. It's not that the raw horsepower wasn't there for emulation (it has been there since ~2004), but that it's just strange enough that emulating it is brutally difficult.


Yeah, we emulate the GameCube (which was more powerful) much more efficiently.


Not as weird as the PS3.


It would require emulation. And while it's fairly easy to emulate apps compiled for PowerPC to run on x86, it's much harder to do so with games while maintaining performance.


I'm amused by how many video game emulator devs insist that they're not for playing video games. (MAME is another big one.)


And why is that difficult to believe? My previous emulation project (https://code.google.com/p/pspplayer/) was the same kind of exercise: do something no one else had done and learn a lot doing it. As a hobbyist the act of doing something is far more fulfilling than the output.


It's not difficult to believe, it's just funny. Devoting an enormous amount of time and energy to understanding and reproducing a device whose only purpose is to play video games while insisting that you're not interested in video games is a pretty funny thing to do.

I mean, I obsess over odd stuff too, I'd guess most of us do, but it's important to keep a sense of humor about it.


Oh, for sure. I thought you were implying that I only was doing it to play/steal/etc games (as many people assume). I've got a pile of awesome Japanese import games I picked up the last time I was there that I'd love to play, but the region locking on the real consoles prevents me. If I had an emulator, that wouldn't be a problem. Not the only reason I'm doing this, of course (I could just buy a Japanese console), but it's a reason nonetheless :)


Might such a comment be used against you? Just curious.


It's fair use. If a book is not available for sale in the US, I can travel to the UK, buy a copy, bring it back with me, and read it. It may not be what the publisher wanted, but it's a legally purchased good used in a legitimate way.


In at least the current legal framework, this declaration of intent is [supposed] to help avoid lawsuits.

In theory, emulating a system is not, in and of itself, illegal. Copying or unlicensed use of its software is.


Non-piracy emulation projects are the only way to explain NES emulators written in Visual Basic and JavaScript.


Never been curious enough about how something works to try to replicate it?


Can't exactly download/build this on my current machine. What does it actually do so far?


Step 1: read the readme.


No currently supported features, but I assume it actually does...something. Or serves some early purpose towards the development of a 'complete' emulator.


I'm really curious about which new (Windows 8-only) APIs are potentially useful.


There are a bunch related to D3D, and I'm currently using RtlAddGrowableFunctionTable in the JIT to get proper stack walking (previously this was really painful). There are also some kernel primitives I'd like to take advantage of once things are working to do some cool memory tricks and hopefully speed things up.


Time to start the cease and desist countdown.


While there are other ways to screw up, like promoting illegal ROMs, emulation itself is legal thanks to Sony vs. Connectix.


I wish Connectix had stuck around and continued making emulators. Virtual Game Station was awesome.


It was awesome, but EPSXE and others surpassed it years ago :).




