Hacker News
Why aren't young programmers interested in mainframes? (stackexchange.com)
50 points by siamore on Jan 15, 2013 | 54 comments



I remember being offered a local position that required me to maintain the company mainframe (with RPG II). It was quite interesting because they were doing a lot of things by hand that could simply be achieved with a little scripting (in this case, Python and Perl). Things like daily sales numbers were still added up in some obscure Excel spreadsheet (this is the biggest restaurant franchise operator in the Caribbean). The potential to innovate inside the company was huge. But no, their mainframe guy made it clear that the job was to continue his legacy of writing obscure, job-securing (as he put it) code. I declined all the offers they made, and they are still looking for someone to fill the position (and will be for a long time). It was sad to experience such a situation: a chance to inject some productivity-inducing innovation, shut down by mediocrity. Still, it was fun talking to an old RPG programmer. Even C sounded like an exotic language to him. :)


I'm 28 and I work in mainframes. Or rather, I write compilers for mainframes and I have to say that I disagree with the top voted post.

Mainframes have an amazing CPU architecture that is different enough from x86 to really open your mind. Ever move 2 GB of memory with 1 instruction? (Well, 2 really, if I'm being pedantic.) It's got what's easily the industry's most complicated branch predictor, and runs at well over 5 GHz. If you're interested in that, you should check out the slides from Hot Chips. Oh right, and it's got hardware transactional memory.

Now I'll readily admit that using z/OS through a 3270 terminal is quite painful, and using USS through SSH isn't much better, but thankfully mainframes now run RHEL. That's right, big iron runs Linux. Yes, you'll likely have to recompile your favourite software, but you can run vim, emacs, bash, zsh, and whatever else you want, and you don't even have to deal with EBCDIC.

As for the comment that there's no development on mainframes, well that's just plain wrong. There's a ton of development, to the point where nearly every single financial transaction you make in a day goes through a Z mainframe, and nearly all major retailers use one.

You just can't beat the security and stability. Can you think of another platform where you can hot swap EVERY part? Reportedly, the Z in SystemZ stands for Zero downtime, and that includes OS upgrades and hardware upgrades.

No, you're not going to write PHP, Ruby, or node.js on a mainframe, you're going to be using C/C++, Java, or COBOL (surprisingly good language in some cases). But you're also not falling behind technologically.

This might come off as preachy, but I really do enjoy working on zLinux, it's a really nice environment, and I think that young people might actually enjoy the experience of working on something that's not x86.


I am interested in mainframes. They have some pretty awesome hardware, great reliability parameters, and some seriously interesting modes of operation. You know how we like to throw data at a hadoop cluster and do a lot of big batch map-reduce jobs? A lot of those end up being very similar to the types of batch jobs mainframes are great at. Distributed transactions/shards/etc for our databases? Again, stuff mainframes have been doing for decades.

The problem as I see it is access. Sure, like the SO poster mentioned, Hercules exists, but you can't really run modern mainframe OSes on it (legally and easily that is). Further, you can't really get the same experience, because you don't have the hardware offload or partitioning available to you for real.

Heck, every time I think I should look into these beasts, I can't even find decent technical documentation describing things.

Compare most of the stacks people on this site use - there are complaints if the documentation isn't formatted right, let alone if it is incomplete or wrong...

I regularly see things people are talking about and/or doing here, and think "man, I think mainframes solved that years ago; it sure would be nice to play around with doing this on a Z series". Of course, since I can't just do that, I say whatever and play with it on a few Linux instances, and then I can use it myself - with actual knowledge and some experience.

This is why I think no one is interested in mainframes: we can't play. We can't say, how can I...? We can't cut our chops by building a million and one little tools that grow up into really helpful software for everyone. We can't just whip out our phones and read about it when we are whiling away our time in line at the coffee shop or futz with things on an airplane.

It's that when compared to the availability of software and information of the bazaar, the high priesthood of mainframes with their mysteries and closed access can't provide us with what we need to just build.


I'm not sure what you mean by "decent technical documentation describing things". IBM is very open about publishing their documentation. For instance "Introduction to the New Mainframe: z/OS Basics" is available here: http://www.redbooks.ibm.com/Redbooks.nsf/RedbookAbstracts/sg...

I'm not saying that this makes it easy to get access to, which is still a real problem.


This is a good point - I should clarify: what I mean is that there aren't a lot of tutorials, step-by-steps, etc. by people using the systems... the type of thing I think of as "screwing around guides". The IBM documentation always feels more like a deep reference manual. I'm pretty sure that even if I had access, I couldn't be up and running on a mainframe in an hour like I could with some web stack and a simple app tutorial, and there don't seem to be good blogs on "how I managed to do $HOT_TOPIC_OF_THE_WEEK on my mainframe". I find these really important for understanding various other technologies... not as an end to my understanding, but as an entry point. Conversely, if I finally get some time to look at Clojure seriously, I know how to find resources in a simple way.

I could very well be wrong about this though. Maybe they exist and I just don't know the ecosystem terminology, but that itself is a problem, I can't even figure out how to get into the ecosystem if it exists.


Is there an equivalent to

http://www.debian-administration.org/

for MVS, z/OS, or pretty much any mainframe system (other than S/390 Linux, which is kinda cheating)?


Do I work on a mainframe? What is a mainframe?

I work on System i; the old name was AS/400. Yes, we use RPG, the occasional COBOL is around, and CL is the wrapper. Since RPG has no standards committee, it adapts to the times. The majority of our code is free form, looking more like Pascal/C than anything else. We run web services and the like, all in combinations of C, RPG, Java, embedded SQL, and such. I even have a PHP-based wiki out there and some other oddities. It's nice that one program can easily incorporate modules from multiple languages. Don't even get me started on the ease of maintaining large databases, adding tables, indexes, or views. It is so simple as to not require an army of people to support our large systems.

So why aren't people getting into these systems? The easiest explanation is that it's not "sexy" enough. I love sitting in meetings hearing all about how certain groups do X, Y, and Z, and how the platform I work on does not, even though we usually do, and with far more reliability and less work.

It's all fun in the end; I enjoy coding as much on my Mac as on the i. I work with people daily who interface with us through ODBC, the web, and more. I do find that most people don't think we can do something, including so-called experts on my platform, simply because they don't even bother to do a Google search. My favorite reply to most inquiries is: we might not be doing that today, but by the end of the day I will tell you how we can.

Face it, it's boring to many people. They see green screens and freak. It's only part of our work; most of the development is fully web-facing or similar, provided it suits the business needs. Warehouse and much of retail doesn't need more than fast, efficient access to information and input.

Look at it this way, go see what major medical systems, distributors, and even what the casinos use. You might be surprised.


Occasional i5 programmer here (is it just System i now? Dammit IBM...)

  > The majority of our code is free form
This is not at all our case. We had a contractor once who decided to try his hand at discreetly converting files from fixed format to free, introducing mountains of subtle bugs in the process. Nowadays we're only allowed to use free form if we're making a brand new source file (which is rare).

  > I enjoy coding as much on my Mac as the i.
Can I ask what tools you use to code on the mainframe?

  > They see green screens and freak.
I can confirm this. Our management is so embarrassed of our green screens (which are for internal company use only, customers never even see them!) that we're actually embarking on a massive project to bolt a PHP-based web-facing front-end onto our system, and literally the only rationale is so that we can use CSS to make our screens look more "modern".


My girlfriend works as a retail cashier and she was lamenting her company's switch from an AS/400-based checkout system to a more modern GUI. We're only 20-somethings and she's far from a computer expert, but she recognizes that taking a bit of time to learn keyboard shortcuts and terminal commands ends up being much quicker than using a mouse or even using the tab key to get through forms.

Her company's reason for the switch was making it easier to train new cashiers. I'm expecting they'll see a net decrease in overall cashier speed though.


There's nothing stopping the devs/company from putting in keyboard shortcuts as well. However, most people, when building GUI apps, overlook this step.


I would love to worry about keyboard shortcuts at my job. The problem isn't that I don't know how to do it, it's that they're invisible, so they're the first thing to fall off the to-do list.


I absolutely expect the same response with our transition. We're not going to have any choice but to recreate the IBM 3270[1] keybindings as exactly as we can in a web browser. And guess which browser appears to be the only one that allows web pages to capture the function keys? It's Safari! I suppose it's fortunate that most of the company is still on Windows XP...

[1] http://en.wikipedia.org/wiki/IBM_3270


Would you start off a business with a mainframe?

I understand a lot of big orgs use them - likely because when those orgs started there were no other compelling options. There are more options today, generally more cost-efficient, partially because it's easier and cheaper to find people with experience with modern hardware and software.

Given that I don't see much trend for small businesses to use them, and many small businesses grow into the larger businesses, it seems like there's going to be a continual marginalization of "mainframes" as part of business. They'll never go away entirely, but it'll be harder to find people to work on them. I don't think it's an issue of 'sexy' as much as 'demand'.

If 'sexy' was a huge component in job skills, I couldn't explain so many Java developers. :)


I do the odd bit of mainframe security stuff now and again. You can argue the toss all you like about Hercules, but the fact is that it's not up to scratch for running a proper stack. Saying you can run z/OS on it with CICS and RACF and DB2 and all the rest to learn is fine, but you try finding half of this stuff.

The reason young programmers (and not-so-young programmers) aren't interested in mainframes is pretty simple: it's largely alien to modern computing users. Mainframes are rarely taught in CS degrees, most people go their entire careers without going near a mainframe, and when they do see one, the fundamental concepts (batch rather than interactive processing, security segregated from the main OS, etc.) make them recoil in horror.

It's also a highly specialist field. Mainframes are typically designed for repetitive tasks with extreme reliability, in the same way that GPUs are designed for specialist programming cases. It's unsurprising then that there isn't exactly a rush to fill the niche.


HPC seems a lot more interesting than mainframes. One of my first UNIX accounts was on a CDC Cyber 205 (shortly before it got decommissioned, I'm not that old) and an ETA 10P supercomputer. The hardware solutions Cray and others came up with to things like cooling (inert liquid fluorocarbon?), wiring looms, etc. were amazing. All but the largest mainframes were kind of boring by comparison.

Modern virtualization covers most of the interesting parts of mainframes. There was a lot of weirdness with asymmetric multiprocessing (channel processors for storage, etc.), but a lot of that is fundamentally uninteresting since it is dealing with special cases in an inelegant way. One of the more interesting corners of the mainframe world was Tandem and other redundant/fault-tolerant systems, with multiple processors executing the same instructions in parallel to compare output. I don't know what the state of the art is for that -- obviously in the normal open-systems Internet world you could just do this at the application level with multiple machines, but chip/instruction-level redundancy might get used more in embedded systems. Conceptually really interesting.

The other really interesting hardware areas, to me, are FPGAs/network ASICs, and HSMs. Trusted Computing, specifically enhancements to EFI, Intel TXT, etc. are all really interesting too.


So this is a problem I've been dealing with. I've been working out of school for almost two years at a big financial firm, working on mainframes. I would say I'm relatively lucky compared to the mainframe horror stories you hear, in terms of new software being developed, but yes, ultimately for every "unit" of new work I do, I would reckon there's two units of maintenance.

The one main benefit I've seen working on my team, however, is the proximity to some of our business users and knowledge gained about the business that we conduct. I value that more than some of my colleagues working on an internal toolbar, even though they may be using a more modern technology.

I feel the difficulty that's quickly arising is knowing when to get out before the pigeonhole widens.


For those who might be interested, have a look at http://www.hercules-390.org/


I found the link to Hercules to be the most interesting part of the article. Though the likelihood of me installing the VM and exploring it approaches 0.0, the anthropology of its culture fascinates me.



I played with an IBM mainframe when I was in university (mostly for the free t-shirt), and the reason I don't really want to work on mainframes for a living is because too much of what I was doing revolved around the machine, as opposed to problems.

For example, one problem revolved around creating a file, which required me to specify how many lines and how many columns it was before I could open it.
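The allocation step described above can be mimicked in a toy Python sketch (entirely hypothetical names here - nothing in this snippet is real z/OS or JCL): a "dataset" whose record length and maximum size must be declared before anything can be written to it, roughly the way LRECL-style attributes must be stated up front.

```python
# Toy illustration only: a file object that refuses to exist until you
# declare its fixed record length and maximum number of records up front.
class FixedRecordFile:
    def __init__(self, path, lrecl, max_records):
        self.path = path
        self.lrecl = lrecl              # fixed record length, declared up front
        self.max_records = max_records  # declared maximum size
        self.records = []

    def write(self, text):
        if len(self.records) >= self.max_records:
            raise IOError("dataset full: declared size exceeded")
        if len(text) > self.lrecl:
            raise ValueError(f"record longer than declared length {self.lrecl}")
        self.records.append(text.ljust(self.lrecl))  # pad to fixed width

ds = FixedRecordFile("sales.dat", lrecl=80, max_records=2)
ds.write("HELLO")
print(len(ds.records[0]))  # every record is padded to exactly 80 columns
```

Dynamic files in Python or on Unix need none of this ceremony, which is exactly the contrast the commenter is pointing at.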

I suppose this is why I prefer higher level languages e.g. Python, as I can spend more time thinking about the problem at hand than, say, the details of HTTP.


I too did it for the shirt.

I come across mainframes all the time at work, but I'm an IT auditor and I deal with legacy systems on a daily basis.


I worked at a mainframe facility about 20 years ago in the financial services industry, specifically at a service provider for brokerages. For multiple generations, since about 1975 when the Altair came out, everyone both in and out of the field has been certain they'll all be perma-unemployed in about 5 years, yet here they are, and there's even a shortage.

It's a close analogy to nuclear engineers.

The next thing is it's MOSTLY a single-company ecosystem. The private companies are all "How long did you work for IBM? Never? Then why are you even applying here?". Also, if you think buzzword bingo and crazily inflated requirements are an issue outside mainframes, it's much worse on big iron.

Finally, it tends to be very poorly understood. 20 years ago it was all full-screen editors and IDEs and automated testing and such, but even today all you'll hear about is how 2013 mainframe programmers still have to punch physical card decks. I'm sure there are places that haven't upgraded their mainframe dev tools since MVS/360 in '68, just like you'll find PC places that require devs and/or journalists to use "notepad.exe" as their primary editor.

This closely ties into weird ideas about "power" and an insistence that "the mainframe" is one individual thing from 1960 which has never advanced since then in any manner, and therefore a modern mainframe must be hilariously underpowered compared to a modern iPhone. It's actually pretty funny to watch if you have a window into each world.


Too expensive to keep in your bedroom.

(Mark Crispin /did/ have a DEC 2020 in his spare bedroom. And I was once jokingly offered NBS-10, but I was in college and renting a room at the time.)

I visited the Living Computer Museum here in Seattle recently. It's got some great minis in a raised-floor machine room. I wanted to open up cabinets and get my hands dirty again. (Booting up 4.2 bsd on our 11/780 used to walk the disk drive about an inch /thataway/, and it had to be repositioned periodically. You don't get visceral stuff like that now).


For me it was mainly the life-sucking corporate culture that is referred to. I was very interested in mainframes as a young programmer (still quite recently) and I even did a few projects in school centered around Linux for s/390 and Hercules. When it came time to find a job somewhere away from school I couldn't get close to a mainframe without losing interest because of the (sorry to reuse the same phrase) life-sucking corporate culture I would have to penetrate.


Because once you've programmed a computer interactively with good tools there really is no going back to anything else.

Mainframe development suffers from crufty tools, a batch-oriented workflow with VERY slow turn-around times, an OS that was positively stupid (mainframes were expected to run mission-critical applications in as little as 32 KiB of memory; not a whole lot of room for user-friendly OS smarts) -- all relics from the dark ages of computing. Kids these days are spoiled by interactive consoles (graphical consoles even!), free-form programming languages like C and Python, fancy full-screen editors and free compilers. They don't have time for mainframe shit.

Yes, yes, I know. Modern mainframes can be time-shared and z/Architecture systems run Linux now. But Linux on a System z is not that much different from Linux on your PC or a SPARC-based server; there's nothing inherently "mainframey" about it except the hardware it happens to run on. When people speak of mainframe development they usually mean MVS, JCL, COBOL, and all those horrors.


The fact is that mainframe (emulated at least) COBOL still makes most of the world go around. There are billions of loc in play and maintained. It's out there, cutting your paycheck, processing your loan, or billing your premium.

There are some who claim we're approaching a talent shortage due to the sector's inherent underpaying and uncoolness, while others pooh-pooh such a shortage on the grounds that there are plenty of offshore shops trained and lined up to take on the business for less than the old, local talent. The only thing I could contribute there is that there are some sectors, banks for example, that cannot outsource because of privacy/financial regs, so they are stuck with internal hires.


The true talent problem I see is maybe 20% solved by offshore talent. I say that because with most of these monster systems, it seems the talent comes in understanding the system yes, but more importantly the business that the system is designed for (around...)

There are offshore people on mainframes in the banking industry, but the problem faced is more in explaining why or how to do something business-wise. Once some of the older ranks, and thus their experience and knowledge, retire, there will be fewer people who understand why the system does what it does, with hordes of offshore developers who just pump out code, leading to a crippled process that I imagine will hinder any non-trivial change.


From the people I know working in banking, outsourcing is in fact quite common. They just have very tight access controls and truckloads of legal contracts, and some work has to be done on-site.


The bankers I know work on giant clusters of much more modern things. I can't imagine the banks aren't moving to something more like that for processing nightlies.


So, most of the world is banks?


The accepted answer relies too much on hindsight and experience, something that young programmers definitely don't have available to base decisions upon.

The real answer is: no young programmer today grew up using anything other than a PC or Mac, and no CS graduate today was instructed on a mainframe in college (AFAICT that trend ceased in the late '90s).

I'm a very occasional mainframe programmer myself (though I get to use RPG4 rather than RPG2), and I distinctly remember the inner incredulity I experienced during my job interview that anyone (other than banks) was actually still using these things!


The school I went to (a state university in Michigan) actually taught mainframe programming with RPG and COBOL. It wasn't a standard major program and it wasn't required, but there must be enough of a demand to keep some classes open for a minor focus during a CS degree.


I never considered it because I can't think of a problem I need a mainframe for. If there is something really CPU intensive I would use the cloud (map-reduce XYZ) for it.

Have you ever come across a discussion here on HN where someone said "Let's use a mainframe for that"?

I know there are a few disciplines in science where it is important, but then I would have to have an interest in meteorology etc. to even consider it. So all the rest I associate with legacy stuff, where I heard horror stories from people. Never "Wow, that AS/400 is so cool for X".


I remember using an AS/400 mainframe as a kid (I have fond memories of essentially IM'ing back and forth with my father when I was 4 in his office).

I used one helping him with some database migration/upgrade at the same office when I was in my teens. So I've touched mainframes at least, but now all of that is in the fog of time and memory.

Now, I'm a Ruby developer and I honestly don't know what I'd use a mainframe for. I don't know when I'd deploy one, what an appropriate task is for them, etc.

I get the sense they are good for when you have "big data" stuff to deal with, but it feels more like a 1980's definition of "big data" (MILLIONS of rows, omg!) than datasets in the billions/trillions of rows (or huge datasets like a human genome sequence).

I assume on a technical level, being turing complete machines, that I can do 'anything' with them, but realistically what can I do? Can I run a web server? Do data analysis using Hadoop across a huge dataset? Can I easily setup background tasks and queues? Do they do 64 bit instructions? What are the limitations? Can I run Lisp on them? What is the toolset like? Do they support Unicode? What is their database, and what is it capable of?

And what's the best place to learn about this stuff? I can go to Barnes and Noble and get a book on C, Ruby, Java or PHP... but finding good modern documentation on mainframes? I dunno where to look, or what would be considered 'too old' for documentation. Anything about Ruby from 2002 is ancient, but perhaps that's the new hip stuff when it comes to COBOL?


"it feels more like ... than datasets in the billions/trillions of rows"

LOL, that's a pretty good description of the financial services company I was working at 20 years ago. Right next to my WAN cabinets was a fully automated tape robot holding something like 4 petabytes possible. 60 gigs on each new 3590 tape cartridge, times 100 on a shelf, times 10 shelves per "bookcase", times 2 cases per track segment, times 30 or so segments long (oh yes, I'm well aware that's like 100 yards long, and that was a huge PITA WRT my WAN cabinets) - that's 3.6 petabytes if stuffed full. That's a lot today. But this story is from 20 years ago.

I was not involved in the DB side but supposedly they had a continuous stream of IRS/SEC related inquiries about individual stock transactions from 7 years ago or whatever.

Supposedly the recent tapes store 4 TB per cartridge, so run the math again and ...
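Running the numbers from the comment above (the per-cartridge and shelf figures are the commenter's; the rest is just arithmetic) shows the library comes out in the petabyte range, and what the "recent tapes" version would hold:

```python
# Tape-library capacity from the figures quoted above.
GB = 10**9
PB = 10**15

cartridge = 60 * GB      # ~60 GB per 3590 cartridge
per_shelf = 100          # cartridges on a shelf
shelves_per_case = 10
cases_per_segment = 2
segments = 30            # track segments in the robot

slots = per_shelf * shelves_per_case * cases_per_segment * segments
total = slots * cartridge
print(slots)             # 60,000 cartridge slots
print(total / PB)        # 3.6 petabytes if stuffed full

# Same library loaded with modern ~4 TB cartridges:
print(slots * 4 * 10**12 / PB)  # 240 petabytes
```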


I can give you one good reason: Money

I was a developer at a bank in my youth. I lived in a very rural area and it was one of the only nearby places willing to hire a kid with a half complete CS degree. I dabbled in web development while there and officially made the switch four years later. I make more than five times as much now as I did back then.

Before you say that it's because I was a junior developer, I knew many senior devs there were only making a little more than I was.

For the record, that job was the low point of my career. The developers were terrified of technology. Everything was in COBOL (with occasional bits of assembly), and all data was stored in flat files. I remember one developer proudly proclaiming that she didn't "surf the inter-links." I blew their minds by writing a simple regex once. None of them had even heard of regexes.

A fair number (not a majority, I hope) of companies using mainframes are still on COBOL. The language isn't OO (there's an OO variant, but no one uses it). It doesn't even support functions! Code re-use means copy/paste, and you can forget about metaprogramming and higher order functions.

While it's true that COBOL is Turing complete and anything that can be written in C can be done in COBOL, I'd point out that the same is true of Brainfuck and LOLCODE. You wouldn't write a non-trivial app in either of those languages, nor would expertise in either of them alone make you a good programmer.

I'm sure there are some places using mainframes who are doing cool stuff and aren't terrified of new things, but I still wouldn't want to work there. These days I'm surrounded by really smart people who love learning new things. The fact that I'm also paid an obscene amount of money to do something I love is a plus.


A couple of years ago they actually started a one year COBOL/Mainframe education program here in Stockholm. Mostly backed by banks and insurance companies and from what I have heard many of the people that have applied and graduated have been quite young (for example I read an interview with someone straight out of high school who was about to graduate).


The school I graduated had a required COBOL class in CS/MIS due to pressure from a big insurance company donor to the school.


Everything that was good about mainframes does still exist, and is popular, we just call it "the cloud" now.


Fully agree with this statement. Mainframes had 3270 (or VT) dumb terminals and the cloud has tablets and smartphones, but it is largely the same architecture, with more colors and animations.

It is interesting to see that the applications themselves move very slowly, despite the addition of colors/animations. Corporate apps are still mainly forms-and-database systems. The technologies change, but they still have the same architecture as in the '60s/'70s. Nothing new since mainframes, Lisp, and Smalltalk.


I think the top-voted answer here is totally off.

In c2008, during college, I worked at IBM on mainframes for internships and co-ops. I worked on the Omegamon products on z/OS. Later I worked at Fidelity, where I was minimally involved with mainframe programming.

First, mainframes are awesome from an architectural perspective. As one example: when you run a program, you must (I'm simplifying a bit) specify exactly which files it is allowed to access. Moreover, when you create those files, they aren't allowed to grow infinitely - the OS will enforce a maximum rate at which a file can grow, and a maximum file size.
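The up-front declaration described above can be sketched as a toy Python guard (purely illustrative, with made-up names; this is not how z/OS actually implements it): a job states which files it may touch and how large each may grow, and the runtime refuses anything outside that declaration.

```python
# Toy sketch: a "job" declares its allowed files and size caps up front;
# anything undeclared or over-cap is rejected. Names are hypothetical.
class JobAllocation:
    def __init__(self, allowed):
        self.allowed = allowed                       # {filename: max_bytes}
        self.written = {name: 0 for name in allowed} # bytes written so far

    def write(self, name, data):
        if name not in self.allowed:
            raise PermissionError(f"{name} was not declared for this job")
        if self.written[name] + len(data) > self.allowed[name]:
            raise IOError(f"{name} exceeded its declared maximum size")
        self.written[name] += len(data)  # a real system would also append

job = JobAllocation({"PAYROLL.MASTER": 1024})
job.write("PAYROLL.MASTER", b"x" * 1000)     # fine: under the declared cap
try:
    job.write("PAYROLL.MASTER", b"x" * 100)  # would push past 1024 bytes
except IOError as e:
    print("blocked:", e)
```

The point is only the shape of the model: access and growth limits are part of the job's declaration, not an afterthought.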

Or, mainframes have been doing virtual machines for decades. In fact, the only sane way to run a mainframe is to slice it up and run VMs on it. "zVM" is written in assembly and runs z/OS VMs. Nowadays it also runs RHEL VMs. I hear they're putting Intel CPUs in mainframes to run Windows VMs too.

There's more, but I just want to illustrate that the granularity from a security perspective is fantastic. There are tons of technical details and features that, in my view, would be very interesting to the computer crowd.

So why aren't young people interested?

First, nobody knows what mainframes are, except that they are dinosaurs and old and slow and whatever. Sun used to give universities pallets of computers running Unix - so guess what university students learned! IBM is trying to bring awareness with their Master the Mainframe competition (which is fun!) or sponsoring Senior Project courses throughout the country.

The OP is right that mainframes aren't available to learn on. And they're so different from the concepts we are familiar with that it is difficult to jump in. "Learn z/OS The Hard Way?" "Codecademy: OMVS Edition"

Do realize that every time you swipe your credit card, you are very likely hitting a mainframe in a bank somewhere. Talk about throughput.

The OP is correct in that if you're not working for IBM writing their mainframe products, you're probably at an awful soul-sucking bank writing COBOL. And really, the COBOL itself is probably not as bad as the fact that nobody knows it, and these banks are notoriously bad at documentation or at mentoring young people to bring them up to speed. Honestly, working at a bank sucks. But banks use mainframes, and banks == bad, and mainframes == slow (somehow), so they are a no-go.

If you do work for IBM on their mainframe teams, those guys are honestly the smartest engineers I have ever worked with. They're all, well, old, but they know C and assembly (lots of mainframe software is written in those languages) inside and out. They understand networks. How to increase throughput. If your webapp crashes, nobody cares. If your bank's mainframe has a segfault you are fucked.

But then you are working in a very specific part of a very large company. Many young people work at IBM and are content there! They recruit heavily from my alma mater. But perhaps the types of young folks who are on HN, stackoverflow, etc are not really interested in working at IBM, much less learning about an esoteric technology. It will only keep you employable in a narrow (and soul-sucking) part of the industry.


My experience as a young person who has worked on COBOL and has worked for a very large "soul-sucking" bank is that they fired or forced out all of the older developers who knew how the system worked years ago. No one who's left really understands anything, and most code has been running un-maintained (luckily the older developers were pretty great) for YEARS. Because of that, there's no one capable of mentoring anyone else.

My experience as a web developer, though, has led me to believe that while mainframes do some really neat things, they just aren't necessary anymore. I'm certainly more than a little biased, but I don't see any reason why they shouldn't just transition to cloud or virtual web servers.


> The OP is right that mainframes aren't available to learn on.

We have a mainframe at our university (donated by IBM) which is used exclusively for teaching. As far as I know, this isn't the only one here in Germany. Although the material isn't as omnipresent as tutorials for beautiful HTML pages, one can surely find some. It's probably that most of the material is hard to find and not really sexy.

In case you are interested (only in German, though): http://jedi.informatik.uni-leipzig.de/de/access.html


> Do realize that every time you swipe your credit card, you are very likely hitting a mainframe in a bank somewhere. Talk about throughput.

Why do "mainframe" computers stick around? (I use quotes because I'm not sure what qualifies a machine as a "mainframe"). I'm curious, and not trying to start an argument.

Is it that the more popular operating systems and languages we see on HN/SO aren't up to the task of operating at the scale/responsiveness required by financial institutions? Technical lock-in? If-it-ain't-broke-don't-fix-it?

I'm mildly surprised that these systems haven't been replaced by something more modern (or at least, modern sounding), but I am blissfully unaware of the world here and am likely missing important data that causes these types of institutions to use mainframe hardware/software.


It's partly historical: these applications already exist on mainframes, and it would be very expensive to change, so they don't.

But mainframes have a lot of advantages because they are integrated systems designed for exactly the sort of high-throughput tasks that businesses use them for. Their (single-threaded) processing power is good but not outstanding; where they really shine is parallel I/O performance which lets you execute large numbers of operations per second, or distribute workloads across large numbers of processors with extremely high bandwidth inter-processor communication.

They're also designed to be highly reliable, redundant and modular. You can do things like upgrade and replace processors, RAM, and disk drives without interrupting applications. And speaking of processors and RAM, modern mainframes can scale to >100 processors and terabytes of RAM in one system.

Yes, you can build systems with similar specs and capabilities with clusters of commodity hardware, but then you have the complexity of managing a distributed system. With a mainframe, you have a single beefy machine. You'll also have a support contract with the manufacturer to call when things break, so they can be responsible for doing any complicated maintenance.

On the downside, of course, they're expensive in the "Q: how much does it cost? A: how much do you have?" way that enterprise pricing works.


I work for one of the major banks and all the reasons you listed are true. From talking to coworkers who work with the mainframes, the mainframes still provide fantastic performance that Java and other higher level languages haven't matched on the stack (yet). Also, the applications are so large and so critical to our customers that the process of transitioning to a new stack would be a nightmare for almost zero short-term gain.

It should be noted that the industry is trying to make a push for porting the COBOL applications to Java to help resolve issues regarding an aging workforce, hiring, and documentation.


I am not a mainframe guy, but AFAIK mainframes are very special beasts. In particular, they guarantee very high availability and have special hardware features to support those guarantees. So you don't use them in the first place unless you are more scared of downtime than of hardware costs. And with that comes a specific, very conservative style of software development: if it ain't broke, don't fix it. So a lot of mainframe code is very old and will only be replaced if the company is replaced by a competitor. And therefore compatible hardware will also stick around until the end of business operations.


As well as the other reasons given here, the cost of conversion is much higher than you might otherwise think. Especially for banks, you have to prove that the new software meets with compliance regulations, not only for the financial regulators but also for the auditors (banking auditing is especially tough).

Compliance does not mean just "meets all the unit tests", but also meets specifications like guaranteed uptime, stringent physical security (that alone could rule out many cloud services), tight audit trails, recoverable failures, incorruptible data, and so on.

It opens up a whole can of worms that will dwarf the development cost of the software conversion alone. It's easy to see why many banks stick with their current working system.


As a younger person, I would definitely be interested in mainframes. The problem is not just experience, but I have no idea where to look for those jobs.


When you swipe your credit card, the transaction tends to be batch processed rather than settled in real time, but yeah, it does eventually go through a mainframe. Other than that I agree, gooby. :)
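For readers unfamiliar with the pattern: a swipe is authorized immediately, but the actual money movement is typically aggregated and settled later in a batch job. A minimal sketch of that split in Python (all names and the flow here are illustrative, not any real payment API):

```python
# Illustrative authorize-now, settle-later pattern. A real system would
# persist authorizations durably and settle via the card network; this
# just shows the real-time step vs. the end-of-day batch step.
from collections import defaultdict

authorizations = []  # swipes captured throughout the day

def authorize(merchant_id, amount_cents):
    """Real-time step: approve the swipe and record it for later settlement."""
    authorizations.append((merchant_id, amount_cents))
    return "APPROVED"

def run_settlement_batch():
    """End-of-day step: aggregate the day's authorizations per merchant."""
    totals = defaultdict(int)
    for merchant_id, amount_cents in authorizations:
        totals[merchant_id] += amount_cents
    authorizations.clear()  # this batch window is now settled
    return dict(totals)

authorize("coffee-shop", 450)
authorize("coffee-shop", 300)
authorize("bookstore", 1250)
print(run_settlement_batch())  # {'coffee-shop': 750, 'bookstore': 1250}
```

The batch step is exactly the kind of high-throughput, I/O-heavy job that mainframe shops run overnight.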


I've never seen a mainframe, I don't really even know what one is or why anyone would use one. I'd think that's the main reason, but if you want to go past that, there's pretty much nothing appealing about working on legacy code in ancient languages with people 40 years older than me. I can't imagine COBOL experience looks particularly good on a resume nowadays when you want to leave the mainframe world.


Agreed that I would not love working with COBOL programmers 40 years older than me. However, I for sure would love to work with Lispers or Smalltalkers 40 years older than me.

I wonder whether, 40 years from now, youngsters will love (or not) working with [ruby|java|js|nodejs] programmers.


Because not many people think micro-optimisations are cool.


I remember working on OS/390 and doing COBOL stuff.

I understand how "important" money transactions are. Every time I make a transaction with my VISA or a wire transfer a mainframe is implicated in the process. We get it. It's important.

But what has changed in the last 20 years? Credit cards payments? Not really.

What has changed is that I now have ubiquitous Internet access, things like Google Maps or Open Maps, things like blogs, forums/discussion websites like Hacker News, eBay, Amazon, price comparison sites, flight finders, StackOverflow and Quora and all the others helping to disseminate knowledge, YouTube and Vimeo, online university lessons, on-demand compute instances and storage, and so many other things.

So, sure, I could have decided to work on back-end payment processing. We get it. It's important.

But young programmers much prefer to either work for --or try to build-- companies shaping the future.

And it very much looks like the future ain't built on mainframe computers...



