That WoW server blade (augustl.com)
104 points by facorreia on Aug 8, 2014 | 64 comments



Wow, the memories and the pain. It's weird: I spent enough time in Azeroth that I think of it as a 'place', a place with fond memories that I can't go back to now that Blizzard has moved on, and that's oddly painful for me. It was the only game where the social aspects were as fun as, or sometimes more fun than, the gameplay.

That said, I worked at NetApp when WoW first launched, and their architecture at the time was these blade servers talking to Oracle database instances that used EMC SAN boxes for storage. We tried to convince them they would have less downtime if they converted to Oracle over NFS, as Oracle had done in their big data center in Texas.

As an engineer it started me thinking about the whole 'world as a database transaction' model of things. How did that get built, where did latency matter and where did it not? What could you undo and what had to be at-most-once? And then there's the scale of that with respect to localizing transactions when actors (characters) were within scoping distance of database changes. Quite the interesting challenge.
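To make that last idea a little more concrete, here's a toy sketch (entirely made up on my end, not anything from Blizzard's code) of filtering a world change down to just the actors within a hypothetical scoping distance:

    import math

    SCOPE_RADIUS = 80.0  # hypothetical "scoping distance", in yards

    def nearby_actors(change_pos, actors):
        """Return only the actors close enough to be told about a world change."""
        cx, cy = change_pos
        return [a for a in actors
                if math.hypot(a["x"] - cx, a["y"] - cy) <= SCOPE_RADIUS]

    # A dropped item appears at (100, 100); only the first actor is in range,
    # so only that actor's client needs to hear about the change.
    actors = [{"name": "Aldra", "x": 110.0, "y": 95.0},
              {"name": "Borin", "x": 900.0, "y": 40.0}]
    print(nearby_actors((100.0, 100.0), actors))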

Oh and the 'weird custom board' on that blade is a compact flash to IDE adapter. The blades booted from compact flash.

EDIT: Hmm, not a CF adapter; going back and looking again, I think that was the NVRAM card, which allowed them to recover transactions after a server crash.


>'oracle database instances that were using EMC SAN boxes for storage.'

I can only imagine what the bill for that setup looked like. Did they jump all in with a Symmetrix?

>'It was the only game where the social aspects were as fun, or sometimes more fun, than the game play.'

Personally, the social aspects are what eventually drove me to quit MMOs for good (so far) not long into the life of WoW.

It seems to me that a lot of people look to MMOs for escape and a situation where the game world is simultaneously one person's preferred life and another's play time is a recipe for trouble.


>Personally, the social aspects are what eventually drove me to quit MMOs for good (so far) not long into the life of WoW.

Likewise, though for a different reason. I never engage enough, even in a guild or similar, I guess because I usually end up playing MMOs without RL friends jumping in. I find the limited interactions boring enough that I eventually leave, since the gameplay alone isn't enough in any MMO to keep me.


Once you start 10-man raiding with Skype, the social aspect really kicks in. It becomes much more like a LAN party, even if it's with people you don't know very well.


Moved on how? I haven't really played except for the original beta, so I haven't been following WoW news much.


Regarding the social aspects, they've made the game so accessible to casual players that they've eliminated the need to form strong social bonds for 99% of the content that people play.

You can level pretty quickly just by soloing, so you don't need to find a buddy to help. With the Dungeon and Raid Finders, you don't even need to join a guild or have a set of regular friends for tackling harder content.

With the battleground queues, you don't need friends to participate in PVP.

Unless you want to attempt the harder versions of the same Raid content or try to compete more rigorously at PVP, all of the classic reasons to meet and keep friends in WoW are gone.


Very interesting. There's "raging" debate over in some Destiny threads (which shares some MMO traits) regarding the lack of matchmaking for certain raids (the designers require you to form up your own band of friends, because it's intended to be a difficult challenge that a random lot could probably not take on) and whether or not things should be accessible to all or tailored for a more dedicated few. I saw several comparisons to how WoW had watered the experience down, and now I know more. Thanks!


Well, when I use that phrase there are two aspects to it. First, to my taste they oversimplified the game mechanics, even in silly things (like cooking recipes), in order to be more accessible. Second, in the Cataclysm expansion they blew up Azeroth to create more content and deal with some of the aging older content. (For example, it was fun to go in and solo Molten Core with my Paladin; I remembered doing it with 39 other people once a week trying to get Tier 1 raid gear.) To facilitate that, leveling happened much faster and they condensed the starting zones, destroying old content and leaving some stuff that no longer made sense in the new world.

But the killer for me was the pandas and the even more watered-down mechanics. I played for 10 years, making it the most money I had ever spent on any video game (over $1,500). The economist in me finds it interesting that I would probably still be a subscriber if I had a way to play on the pre-panda world, preferably WotLK, though Cata would be acceptable. No doubt, though, there are 5 more people who do subscribe for the same reasons I don't, so Blizzard makes it up in the end. It has made me wonder what it would take to build a replacement, and it has been interesting looking at the alternatives that have popped up since.


There's lots of debate around this topic, but one common opinion is that the new expansions have made leveling far too easy, and that some items you had to grind for days, weeks, or months to get (like the Green Proto-Drake) are now available for enough gold.

The advantage is that the game is a lot more approachable for the novice (free to play up to level 20), but OTOH, it moved a lot of the challenge to raiding and end-game, which a fair number of people have no interest in doing.


They've remodeled Azeroth (the world has changed). Part of the reason was to allow flying mounts, but they also made in-world historical changes.


There's nothing special about this blade; it's a standard "off the shelf" HP blade that anyone can purchase.


I've worked extensively with HP equipment. I've just sent August an email with a ton of extra detail about this blade.

The 'weird custom card' mentioned in ChuckMcM's comment is actually a SmartArray 6i RAID controller. The black slot takes a small stick of RAM to act as cache.

Happy to answer any and all questions about the server :)

Out of interest, there are two CPUs in this server, and they're both AMD Opteron 275s (2.2 GHz, dual core). The 512MB Hynix RAM sticks would have come from the factory, while the 2GB Micron sticks would have been an upgrade.


Thank you for reaching out and helping to complete the story of this box.

Maybe someone from Blizz can even tell them what the server's role was...


Posting the email I sent:

I saw your blog post about your WoW blade, via Hacker News, and couldn't help but want to contribute.

I have worked as a sysadmin and hardware specialist for the past 5-6 years, with a lot of experience with HP equipment. I was wondering if you'd like me to help provide better descriptions of any of the parts of your blade? Extra detail like the fact that the Hynix RAM came in the machine from the factory, but the Micron RAM was added as an upgrade. Your machine probably only ever had those 6 sticks in it, as the next generation of servers used newer DDR2 and wasn't compatible (hence no reason to remove RAM from this blade).

That blade is what is called a 'half-height' blade, and it would have sat vertically in a blade enclosure, i.e., the HP BL25p text would be horizontal. Blade enclosures usually hold up to 16 half-height blades, or 8 full-height blades, but yours is a p-class blade, so it would have been in an enclosure for half-height blades only (8 blades max). The network ports and all other connections are on the blade enclosure. If you'd like a full spec sheet on your blade and the enclosure, you can look at this PDF from HP: http://h10010.www1.hp.com/wwpc/images/ap/BL25p_v7.pdf

Regarding your concerns about the CPUs and heatsink paste, etc: The CPUs will actually be attached to the heatsinks and you should simply be able to lift them out as a unit. This is done so that if a CPU were to fail in the blade you can replace the CPU in just a few minutes. The detail of which CPU you have is more than likely attached to the underside of the heatsink as well, next to the processor, but you can actually tell which one you have from the model number (that sticker that says 392439-B21). It's an HP ProLiant BL25p 275 2.2GHz-1MB Dual Core 2P 2GB Blade Server. This means your CPUs are AMD Opteron 275s, and you have two of them. The '5' in BL25p also indicates an AMD CPU. Intel servers end in a zero.

I can also tell you what most of the bits of hardware on the motherboard do too, if you like. I'm happy to annotate photos. The large green card towards the rear of the server with the little silver heatsink and the empty black slot is the hard drive controller. In this server the model would be a SmartArray 6i. The slot is for a small stick of RAM (128MB for this model) that the controller would have used for cache - it probably never had any installed though.

As general information: the little 'add-in cards' are called daughterboards, and are actually completely normal in servers, especially of this size, partially due to space constraints. The main reason for daughterboards, though, is so you can quickly and easily replace a failed component. Servers are generally designed to be easily and quickly serviceable. I've personally replaced a server motherboard in under 10 minutes (from power off to power back on again). It's generally all tool-less and extremely modular.

The clear magnetised lid on these servers is definitely not standard - Blizzard must have added this when they decided to memorialise the server. The standard lid would be metal and held on with a quick-release lever mechanism. I really like the magnetised approach though :)

I can keep going, but yeah, let me know if you'd like any information or more insight into the server. Glad to see that it's in the hands of someone who obviously cares about the equipment :) It's nice that they made them into collectibles rather than just selling them to a used-equipment vendor.


I would really like to know the software architecture of a WoW realm. (Beyond the basics of... one blade for each continent).

OS, tools, programming languages, how did the different parts of the software (such as, again, continents) communicate between themselves... For example, I was told once dungeons and raids were/are scripted in Lua.


>Beyond the basics of... one blade for each continent.

At launch there were 2 continents, but 4 server blades per realm.

Scripted events are done in Lua. But most raids until (I believe) Cataclysm weren't scripted so much as just mob abilities plus cooldowns.

Note: not a WoW server architect, I just played way too much WoW.


These things I know. WoW server reimplementations are pretty monolithic, and that causes a lot of problems under heavy load. I know of a private server that's working on some "load balancing" to avoid that.

What I mean by this is that currently the only way to have lots of players playing at once in the same realm is to buy very expensive hardware. That, plus the number of DoS attacks you receive if you are popular, means that having a private WoW server online is anything but cheap.


To get an idea of how they grouped zones, this picture shows the expansion zones for Burning Crusade; apparently you could move between them with a few tricks.

https://i.imgur.com/768UM.jpg

I was always more impressed with Asheron's Call's management of areas, as it initially had one zone-free world that loaded land blocks as players accessed them.


Whoa. I'm pretty sure that screenshot was taken with around 1 fps. :)


3.7 FPS (top left).

I've done a lot of reverse engineering of World of Warcraft as I worked on its internals for years. If this is an interesting subject I might write about it. Anything specific people would want to know?


Mahouse, your comment is dead for some reason.

I've mostly learned about the internals of the WoW client itself and how it interacts with the server. I suppose I will eventually write about it when inspiration strikes me, but I'll be happy to answer any more specific questions you have. It's a very broad subject.


It's dead because a moderator removed it; it was too personal, I guess. Is there a mail address I can use to reach you? Thanks.


Of course. On my profile.


>I would really like to know the software architecture of a WoW realm.

>OS, tools, programming languages, how did the different parts of the software communicate between themselves...

I don't know the details of the WoW software architecture but I know there were a lot of weird side-effect bugs in the game.

My favorite weird bug was the one where you couldn't craft a stack of items (like bandages, for example) when you were wearing 5 pieces of your class-specific item set. So rather than crafting a stack of 20 bandages, you'd have to craft them one bandage at a time if you were wearing 5 pieces of your class set. If you took off one piece of gear you could craft stacks.

I spent a bit of time wondering what type of design would cause that sort of bug....
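One purely speculative guess at a design that could do this (nothing confirmed, just a toy model): if completing a craft fires an event that a 5-piece set bonus also hooks, and the set-bonus handler resets the casting state as a side effect, the queued "repeat craft" action gets wiped each time:

    # Toy model, pure speculation about the bug, not Blizzard's actual code.
    class Character:
        def __init__(self, set_pieces):
            self.set_pieces = set_pieces
            self.pending_repeat = None  # the queued "craft next in stack" action

        def recalculate_set_bonus(self):
            # Imagine this path resets casting state as a side effect,
            # which also wipes the queued repeat-craft action.
            self.pending_repeat = None

        def on_spell_finished(self):
            if self.set_pieces >= 5:
                self.recalculate_set_bonus()
            if self.pending_repeat:
                print("crafting next:", self.pending_repeat)
            else:
                print("repeat cancelled, craft one at a time")

    c = Character(set_pieces=5)
    c.pending_repeat = "Heavy Runecloth Bandage"
    c.on_spell_finished()  # the set-bonus recalculation cancels the repeat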


More than likely it would sit on top of an Enterprise Service Bus (ESB) architecture, using XML to pass information between the servers. The general term for this is 'message passing'. Software examples are IBM's WebSphere MQ, or the free RabbitMQ.

There's a lot of detail to go into in this field, suffice to say that you can easily make a career out of knowing how to design efficient ESBs.

Obligatory Wikipedia link for more info: http://en.m.wikipedia.org/wiki/Enterprise_service_bus


This seems inconsistent with the person above who said they worked for NetApp and on the WoW realms in particular.

They said it was more SAN plus a centralised Oracle DB, which is a pretty common approach. The Oracle DB keeps the different blades in sync but also allows trivial data transportation.

XML seems unlikely unless they were storing XML within Oracle but that seems odd/insane for their use case. It might make sense if they were sending data geographically but as far as I know WoW realms are in one location only.


These are actually two separate things. The data would definitely be stored in a database on a SAN, and a NetApp SAN plus Oracle DB would be just fine for that. This represents the database/storage layer of the operation.

The part I'm talking about is the behind-the-scenes communications layer for the application itself - how you pass information between servers in real time with low latency.

Think of things as a stack. You have the top layer, which is the application. It runs on an individual server as a process, or set of processes. You then have middleware, which is software that handles communication between applications, whether on the same box or between servers. Middleware is also responsible for handling communication with the database (possibly Oracle in this case), which is the database or information-storage layer.

Ie: application layer --> middleware --> database.

This middleware is the portion I'm postulating could be using XML, based on my general enterprise experience. It's not stored anywhere; it's simply a transport mechanism for data between applications when you don't need to store it. More than likely it would contain information such as 'this user is entering the battlegrounds, please create a slot for them with these details', or 'this player just caught the boat to another continent, I'm handing them from me to you and here is all their information'.

XML is great for this kind of 'live' data transportation, although I probably prefer JSON these days.
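As a rough illustration of what that kind of message passing could look like (my own sketch, not Blizzard's actual design; the queue name and payload fields are made up), here's the continent-handoff example published to RabbitMQ with Python's pika library:

    import pika

    # Hypothetical handoff message for "player caught the boat to another continent".
    handoff_xml = """<handoff>
      <player id="1234" name="Thrall"/>
      <from continent="Kalimdor"/>
      <to continent="EasternKingdoms"/>
      <state hp="4350" mana="2200" x="-1043.2" y="284.7"/>
    </handoff>"""

    connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    channel = connection.channel()
    channel.queue_declare(queue="continent.eastern_kingdoms.handoff", durable=True)

    # The receiving continent server consumes this and takes ownership of the player.
    channel.basic_publish(
        exchange="",
        routing_key="continent.eastern_kingdoms.handoff",
        body=handoff_xml,
    )
    connection.close()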


I don't understand: why wouldn't you just use the centralised database you already have, as opposed to creating this XML communications middle layer that seems redundant with a centralised data store?

What do you save by doing it that way? It just seems to cost more of everything (resources, manpower, etc.).


There are a few benefits, the main one being decoupling of the application and the back end. This way, rather than your application developers needing to know how to talk to the database, they simply use an API to talk to the middleware, which never changes even if you replace the back-end database software (moving from Oracle to DB2, for instance). Think of it like a translation service, in this regard.

It also helps reduce overhead and latency. Imagine if you had to constantly write the actions of users to a database in real time for 50,000 users, including all the metadata, just so your servers could communicate amongst themselves. The latency involved would be huge, even with today's vastly better equipment. A fast middleware layer greatly helps to cut down on the data that needs to be written to the back-end system, while also improving latency, for what's really not a lot of extra complexity. Two birds with one stone, so to speak.

This approach also helps you scale - you can treat everything as a node, with the middleware data as distinct messages. Messages are created and consumed from queues, so you can have multiple application servers and multiple database servers, with applications publishing messages onto queues for processing and the database servers consuming those messages as they have the resources to do so. Adding more capacity to the system then becomes as simple as creating a new database server and telling the middleware it exists - the extra resources are automatically used.
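A minimal sketch of the consuming side of that pattern (again my own illustration, assuming RabbitMQ and pika; the queue name is hypothetical): adding capacity really is just starting another copy of this worker against the same queue.

    import pika

    def handle_message(channel, method, properties, body):
        # Hypothetical processing step: persist the event to the back-end database.
        print("processing", body)
        channel.basic_ack(delivery_tag=method.delivery_tag)

    connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    channel = connection.channel()
    channel.queue_declare(queue="world.events", durable=True)

    # Each worker holds at most one unacknowledged message, so work spreads
    # evenly across however many consumers happen to be running.
    channel.basic_qos(prefetch_count=1)
    channel.basic_consume(queue="world.events", on_message_callback=handle_message)
    channel.start_consuming()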

Most enterprises work this way. The basic outcome is that you can start treating the components as a service, with easy separation of duty.

There's more detail here: http://en.m.wikipedia.org/wiki/Middleware


It's important to note that I wouldn't recommend an ESB architecture for small scale. You really only want to be looking at this sort of thing when your server count is 20+ and you have separate teams handling your application, databases and middleware.


I've observed some messaging behavior in the game -- selling at a vendor can take variable amounts of time, implying that the buy/sell transactions are via a service and not local to the gameplay server.

I expect they use a hybrid approach - some monolithic, some message-bus.


Oh man, my first year of college I was in a freshman seminar and idly bid on one of these with a fairly low offer (~$200). Twenty minutes later I was the proud owner of a WoW server.

My lack of foresight has led to it being stuck, still in the original shipping box, in my parents' basement.


I envy you greatly. When these were put on sale, a freak snowstorm happened to hit the Northeast, and I, along with everyone else around, had no power for close to a week. I was after one of the more populated servers (Kil'jaeden) and my phone's battery just didn't last long enough to get that final bid in. I still regret not just putting in a higher bid earlier that day.


I just found my winning bid e-mail: I got Quel'Thalas for $212.50 on 2011-10-24. I remember that freak Halloween snowstorm. Somehow it managed to avoid hitting me in the capital region.

I'll probably sell the server at some point but I have no idea what a fair price for it is. Perhaps I'll sell it using a double-blind auction.


You are lucky. I didn't bid because I thought they would go for way higher than these prices. I still regret it.



Out of curiosity, do you remember how long my blog was down? Linode reports 100% uptime and no monitoring shows traces of congestion.


I still wonder how many players those servers could hold, and about other MMO games' networking architectures. For example, I know that in Guild Wars, once you get out of town, you don't see other players. In WoW they also merged many zones so they are "cross-realm".

How does EVE Online work, by the way?

I'm fascinated by this stuff. I wish there was a single-realm MMO game like WoW.

I guess EVE Online is at that single-realm level.

I'd love to see a game using more P2P architectures to let small parties offload the servers.


Patrick Wyatt (worked at Blizzard on WC, StarCraft, Battle.net, then on Guild Wars) blogged a bit about the design of some of these games: http://www.codeofhonor.com/blog/scaling-guild-wars-for-massi...

Worth a read


Less tech-oriented and more design-oriented, Alexander Brazie has blogged a lot about various design decisions he and his team made at the time in WoW.

https://alexanderbrazie.blogspot.co.uk


>'I'm fascinated by this stuff.'

Same, I'd be most interested in seeing how WoW was initially built to mitigate the lag that plagued MMOs at the time. Handling that well was key in my decision to play the game at all since it was critical for PvP play.

>'How does EVE online work by the way?'

Here's a pair of older articles from my bookmarks about engineering of EVE.

1: http://penny-arcade.com/report/article/planning-for-war-how-...

2: http://massively.joystiq.com/2008/09/28/eve-evolved-eve-onli...


IIRC the demand at launch STILL overwhelmed their setup. This is a system that has evolved over the last ten years.

>'I'm fascinated by this stuff.'

Yeah, it's a world apart from your standard website/services model.


Unless it's changed since I last played EVE, the EVE servers all reset every night so EVE is down for a reasonable time each day. In WoW the equivalent happens only once per week.

The WoW 'cross realm' zones are also the "instanced" zones, so while they are cross-realm they actually hold much less data and far fewer people than the single-realm parts. Even the 'cross realm' areas are still restricted to a "battle group" of 5-6(?) realms.


Daily downtime is 11.00 to 11.30 UTC, usually less than this (10-15 minutes most days).


I share your enthusiasm for this. I've done a bit of work on a project with ZeniMax concerning the virtual environment for ESO's servers. Let's just say that the magnitude and scale of their server rooms is enormous, and they build virtual environments on top of that scale to raise capacity thresholds.


EVE, I believe, doles solar systems out to different servers (and shuffles them around depending on load). This results in slowdowns for enormous battles - they're too big for one server to process in real-time, so ticks get further apart.


The effect described in the PDF linked at https://wiki.eveonline.com/en/wiki/Grid_Manipulation suggests that EVE does something more complicated than that.


EVE Online:

- server code is still in Python (single-threaded)

- one CPU thread per solar system

- during big battles with >2000 people, the server slows down time (the tick rate) and the game becomes unplayable
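A rough sketch of that slowdown mechanism (the general technique, not CCP's actual code; the one-second tick budget is a made-up number): when a simulation step blows past the tick budget, the server just lets the tick run long instead of skipping work, so game time advances slower than real time.

    import time

    TICK_BUDGET = 1.0  # hypothetical target: one simulation tick per second

    def run_solar_system(simulate_tick):
        while True:
            start = time.monotonic()
            simulate_tick()
            elapsed = time.monotonic() - start
            if elapsed < TICK_BUDGET:
                # Under budget: sleep off the remainder, the system runs in real time.
                time.sleep(TICK_BUDGET - elapsed)
            else:
                # Over budget (a 2000-player battle): the tick simply runs long,
                # so one second of game time takes 'elapsed' real seconds.
                print("slowdown factor:", elapsed / TICK_BUDGET)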


Oh, so yeah, I'm not missing anything...


What's the definition of a blade? I thought it was a diskless server but this one used to have a hard drive.


>'What's the definition of a blade?'

I'm not sure there's any hard and fast definition.

Generally, a blade is going to share power at least, likely network uplinks as well. Diskless configurations are common, with some sort of shared storage being used, but local disk is finding its way back into blades via SSD.


Making diskless servers work is too complex for most admins, so blades are generally sold with disks. Blades are generally cable-less and share power, cooling, and networking within the chassis.


To me, a blade is a server that requires a blade chassis in order to be used.


4 blades to run an entire realm? That's more impressive than I thought.


That was probably the case back in vanilla (pre-expansion) WoW. With the introduction of cross-realm battlegrounds in 1.12 - a month before the first expansion - followed by the activation of arenas in the first expansion, they most likely expanded their server capacity considerably to accommodate the increased load. I remember seeing news articles back in the day, just a few months before Burning Crusade's launch, that Blizzard had bought a massive number of servers, but the only thing I can find now is this blog post: http://www.rahulsood.com/2006/02/blizzard-going-crazy-for-op...

However, it's important to note that around this time Blizzard was heavily improving their Battle.NET service and launched a download service where you could register your CD keys from their previous games such as Starcraft and Warcraft III and download a fully working ISO that did not require the physical CD to run. The server purchases may have gone towards that instead.

Would love it if Blizzard wasn't so secretive about everything, especially their tech. CCP, makers of Eve Online, have been very forthcoming in talking about their entire stack in their developer blogs and videos - from infrastructure and hardware, to how they profile their code and the software design choices they make.


The market for private WoW servers in the past has been a major concern for Blizzard and I'd imagine has been a large motivator in their decision to closely guard their infrastructure design.


No, they just sold 4 blades per realm for a charity auction.

A few months later the used equipment market was flooded with the same blades, so I imagine they offloaded quite a bit of hardware.


I wonder how many blades per realm they use now with the new continents and whatnot.


[deleted]


That's exactly what he says on the web page. :)


Does anyone know when exactly they changed their specs?


[deleted]


"If your account is less than a year old, please don't submit comments saying that HN is turning into Reddit. (It's a common semi-noob illusion.)"

https://news.ycombinator.com/newsguidelines.html


The parent comment was deleted, but I saw this on Reddit[1] before it popped up here.

[1] http://www.reddit.com/r/wow/comments/2cyu9d/closeups_of_wow_...


You don't think it's possible that the same content can be interesting to audiences of two different sites? Most of the stuff at lobste.rs is reposts from HN, and stuff from HN gets posted to Reddit as well. I see stuff from HN on Ars Technica and CNet occasionally, and vice versa.

It's not a competition. It's the Internet. We're all the same.


Not everyone subscribes to /r/hackernews. I had not seen this post before.



