Interviewing my mother, a mainframe COBOL programmer (medium.com/svenskunganka)
534 points by Svenskunganka on July 14, 2016 | 273 comments



Nice article.

Reminded me of my own experience creating a bank software system for the State Bank of one of the republics that was in the middle of separating from the former USSR. "Perestroika", they called that process.

No mainframes, where would we have gotten those at that time? Only an i386 as a server and PC XTs as terminals. Connected by ropes essentially, but NetBIOS worked over them. Plus 1200 baud or so modems to connect with the other banks in the republic.

But we (a team of two developers) did it in 9 months.

Compiler - the excellent Zortech C++ by Walter Bright (creator of the D language).

Editor - MultiEdit the Great. What we would have done without it, I don't know.

Database - dbVista (later known as dbRaima).

Everything ran on PTDOS (a Russian MS-DOS-like thing).

Plus a multitasking C library providing cooperative multitasking (the same execution model Node.js uses).

And everything in 640k of RAM that "should be enough for anybody".

We were young and nobody had a chance to tell us that this was "too serious", "impossible" and the like.

A lot of manual input by operators, so usability requirements were surprisingly high: http://www.terrainformatica.com/2016/02/ui-usability-solutio...

:)


Nice. In that era we had an even cheaper solution for a multi-user accounting system, the VNA by Advanced Micro Research / the "UnTerminal":

https://books.google.com/books?id=AlnQ5OJS6XgC&pg=PT167&lpg=...

So basically no networking, just multiple video cards and keyboard interfaces in a single 386 PC (it used RS-422 to move the video signals to the screens). In SCO Xenix this worked extremely well. No networking needed.


musical verification - very nice! a completely unexplored area of UI design. it can be very powerful.


Oh, yeah.

The girls were especially happy when we added an option to choose the octave :)

I still sometimes hear the sound of the bank's operation-hall orchestra in my dreams :) Twenty workstations or so at the closing of the operational day ... It was actually not that bad.

The only problem was mapping 10 digits onto the 12-tone chromatic scale, so we added '.' and '-' to round it out.


what you are saying is very interesting. acoustic perception has almost unlimited bandwidth compared to visual. you don't even have to look, just listen for the noise in the background. if the girls learned to distinguish 10 tones by the end of their first day, imagine what could be done with a more sophisticated approach. this acoustic monitoring can be complementary to the standard mode of operation, a sort of UDP protocol - if you miss a note, fuck it, the sky won't fall, but because of the incredible capacity of our brain to distinguish patterns in background noise, this can be very useful in detecting unusual situations that would otherwise go unnoticed.


That area of UI/UX is very interesting indeed.

A common misconception is that UI is only about the visual side of things. User interaction may include all the other senses/informational channels.

Another common misconception is thinking that UI principles are the same on any device that has a CPU. Far from it.

As an example: I participated in the initial designs of the EverNote[.com] application. From the very beginning it was very clear to us that the desktop and mobile versions are very different entities. On mobile you essentially have no way to input your data other than the camera or voice. The size of your finger's touch area on a phone screen is comparable to the area of an average paragraph of text. WYSIWYG text editing in such circumstances is barely feasible. So buttons an inch in size are the only reliable option in the GUI.

GUI and UI in general are very interesting things indeed.


Why not two octaves in the Pentatonic scale?

https://en.wikipedia.org/wiki/Pentatonic_scale


I would _love_ to read a writeup of that project.


Hard to recall actually, much water has flowed under the bridge since that time.

Just one fact: the system survived the monetary reform in the USSR in 1991, when in three days the whole country suddenly changed all its banknotes ( https://en.wikipedia.org/wiki/Monetary_reform_in_the_Soviet_... ). The history of the USSR is just one big social experiment made of things like this.

The number of transactions in those three days was astonishing, as you can imagine. We spent all three days in the bank, worrying mostly about the hardware. 12 keyboards and 3 workstations out of twenty-something were lost in the battle :)

But the craziest thing I saw was on the day after the reform. It was January 26th, and it had been snowing all three days. The plaza in front of the bank's building was covered by snow with banknotes on top of it, surreal. People who were late or had exceeded the exchange quota were just throwing their money away.

But our system did handle it surprisingly well.

My apologies for hijacking the COBOL topic. C++ rulez forever!


Thank you for the details :)


Same!


> No mainframes, where would we have gotten those at that time?

Госбанк started using mainframes in 1972: http://www.computer-museum.ru/histsoft/bank1.htm


Gosbank of the USSR died with the USSR itself. And those IBM clones they were using ... where would you get them at that time? And the software? No one even wanted to think about starting something new with them.

UPDATE: In fact we were extremely lucky not to have even seen that mainframe beauty upfront. The sheer number of daily transactions was enough for us to estimate the payload and architecture.

It was only 3 or so years later that I was invited to see that glassy mainframe kingdom at the Central Bank of the USSR.


My grandmother, who is in her 80s, was one of the US military's first female programmers... and she was a widow... and she was a German expatriate (you can imagine the difficulty of that post-WWII). They programmed with punch cards and drank a lot of coffee.

If it sounds like I am bragging.. I am.. I am very proud of her. What I'm not proud of is not spending enough time with her to extract more details. I did ask her a similar question about being a German widow working with predominantly men, and she basically had the same answer as the article's mother... "no problems whatsoever".

This article is a great reminder that I need to call my grandmother more often :)


My grandfather was instrumental in developing several key products and technologies at IBM in the 60s and 70s. I'd been meaning to call him, and he suddenly died last Saturday. You never know when your opportunity to talk to your grandmother will disappear.


Please do. My grandmother (my last living grandparent for over 20 years) passed away in January, and I am indeed happy that I took the opportunity to learn things from her about her life over the years.


Was her name Grace Hopper?


No it was not. I wish I could provide an official link to her work but I can't seem to google anything. She isn't famous by any means and is exceedingly humble (hence the difficulty of finding details indirectly).

Interestingly, though, she absolutely loves all things sci-fi. She records almost everything the SyFy channel produces and loves Babylon 5, Dr. Who, etc...

She is exceedingly good at board games. Even ones she has never played before, she often wins. I remember bringing Chinese checkers to her once with my wife. She had never played before but she absolutely trashed my wife and me in multiple games.

She grew up in Germany and I have often wondered if that had any effect on her non-traditional-gender hobbies and likes. Or maybe it is just her personality (as I love board games and sci-fi as well).


She wasn't a German expatriate, as far as I can tell!


"We used to have personal desks, but now we have this “pick whatever spot is available” open area. I dislike it a lot."

This fits in line with what nearly everybody I know thinks. Does anyone _actually_ like open-plan offices without dedicated desks? How long until that trend passes?


When we were moved to an open office plan they were quite open about it - the one and only reason for doing so was that they could squeeze more people into a smaller footprint. All the garbage about "facilitating communication" is just that, garbage.


There are two issues here: open-plan offices and hot-desking “pick whatever spot is available”.

While I understand the argument for separate offices, or at least cubicles, I have never worked in such an environment. But I am truly shocked that these programmers are being asked to hot-desk.

They're not like journalists or cops or people who only pop into the office unpredictably. On the other hand, programmers do have terrible personal notions about how their monitor(s), keyboards, etc. should be laid out.


> On the other hand, programmers do have terrible personal notions about how their monitor(s), keyboards, etc. should be laid out.

Not just programmers, trust me. At our company, most people do not care, but a few of them will notice - and get cranky - if you move their keyboard by just a few millimeters.


A friend of mine used to work for a bank which changed premises and introduced hot-desking. They only had seats for 80% of the staff and planned for everyone to work at least one day from home. "I'm a netadmin. I can't do my job properly from home" eventually swayed them into giving him a permanent desk.

But hot-desking was still a complete failure, even excluding the personalisation issues. He said that people would arrive on the floor asking 'Is Bob here today?', to which the only answer was 'I don't know, he could be anywhere on the floor. I'd have no idea where to look for him'.

Some people like open-plan, and some people hate it. But I've never met anyone who liked hot-desking.


Not to mention hot-desking sounds like a fantastic way to spread illness. I would never do it. The guy who is out sick with diphtheria today could have been using the workstation you are currently occupying just yesterday. How would you know?

I rather doubt the cleaning crew is taking the time to sterilize keyboards/mice. I know, I'm a bit of a germaphobe.


When I did hot-desking everybody had their own keyboard and mouse that they either brought home with them or locked in a locker every night.


Don't forget about open-back seating, so everybody can see what you're working on.


I don't particularly mind that people see my monitors. If you work in a decent place, no one would care if you read some news once in a while. If someone spends the day idling, then it's noticed anyway.

Once in a while I get a sensitive or confidential email, but that usually happens when the computer is connected to a big screen in a conference room full of people anyway...

I DO mind not finding anywhere to sit, not knowing where people are when I want to talk to them, not having two or three huge monitors to work on, and not having my Model-M to type on...


Personally I do mind, not because of fear someone catches me procrastinating on Hacker News (most people in my company wouldn't even know what it is) - I simply can't focus with other people close behind me. It's a weird kind of stress.


Wait, are you saying there are still companies attempting to hotdesk, "not finding anywhere to sit?"


Yes? A rather large telecom company, for example.

I guess there are some stools and tables available somewhere, and technically you can sit on the floor I suppose, but people that are late more or less have to go home again.

They are kind of in a downward spiral anyway, so maybe it will sort itself out.


I actually like this, since it stops me from procrastinating, as does working in a team.


or rather, what you are browsing.


When people start refusing to. The last two times I looked for a job, I told recruiters right away that I wouldn't work in an open space. Regardless of whether they actually had one or not, and whether they actually wanted to hire me - every time a recruiter heard that, he remembered it.


I have no issues with it, as long as nobody takes my usual seat.

Which is to say, I don't like hotseating.


The people who spend all day chatting in an attempt to avoid doing actual work seem to like them.


I'm fine with open office as long as you still have a dedicated desk. Otherwise it's annoying and difficult to find people.


I get less productive when I'm always at the exact same desk in the exact same room every day, so I like systems that force me to move around.


When the sun rises in the West, Hell freezes over, and HR goes back to being Personnel.


> Personnel

Wow, that's a much better name for it!


There's a weird effect where people become disgruntled with the current terminology because it's offensive/demeaning and come up with new words to take its place.

Thus Personnel became Human Resources.

But often the new words are, with the passage of time, clearly just as, or more, offensive than their predecessors. I mean, "Human Resources"?!! WTF! Are people really just "resources"? To me, Personnel sounds way friendlier.

Another one is around disability. We would never describe someone today as retarded. Yet really, is disabled any better? I'd rather be retarded (as in, slowed down a little) than disabled (completely stopped).

[close to my heart: we write HR software :)]


This process is called the euphemism treadmill [1].

The Wikipedia article on intellectual disability [2] has a list of historical medical terms. Those now used in an offensive manner are: Cretin, Idiot, Imbecile, Moron, Retard. The same will probably happen (or has already happened in some cases) with Disabled.

The same process probably takes place for many corporate terms. I observed this in my time at Samsung, when the internal organisation for CSR [3] changed its name from POSITIVE_ADJECTIVE Workplace to New Management Committee.

Through most of my career I have been a C++ programmer, with a personal interest in C/Unix programming. For one project I worked on a JavaScript code base with web developers. I saw different terminology used for the same things. When you study a bit of history, or historical code bases, you will also see that terminology has changed. I could say that web developers are reinventing the wheel and the terminology with it, but every generation seems to do this again and again. It is probably related to the same mechanism: it must often be easier to name things anew than to learn their previously assigned names, mostly because you don't know what to look for.

[1] https://en.wikipedia.org/wiki/Euphemism_treadmill

[2] https://en.wikipedia.org/wiki/Intellectual_disability#Termin...

[3] Corporate Social Responsibility


It's no longer Human Resources at my current company. It's now Human Capital Management. They've also had the IT dept move us from logging into the network with traditional usernames (i.e., "bob.smith"), to employee numbers (77332).

I'm waiting for them to drop the "Human." I'd like to be referred to as a generic "Carbon-based labor unit" or some such.


Carbon Organic Generic Worker or COG for short.


7 of 12, Floor 3, Unimatrix 01.


As someone who is disabled and who, when younger, was called retarded, I want to be called disabled. For one, people have a particular way of saying retarded that's very insulting. I was okay with the traditional term at the time, handicapped. That describes things in a far less insulting fashion.

People tended to use retarded as referring to one's mental abilities and assumed that if you have a physical disability, you have a mental one. That was the biggest problem with retarded.


> People tended to use retarded as referring to one's mental abilities

Isn't that what it means? Applying it to something physical just seems wrong.


People often assumed that if you had something physical, you had something mental as well.


Personnel want replaced because it was offensive to the employees they deal with; it was replaced because people in the field wanted more recognition within organizations, and being seen as managing important assets was seen as a way to gain that recognition.


Why was it offensive?


I'm saying it wasn't offensive, the issue was unrelated to that.


Small typo in your original post: I think you said want instead of wasn't. That may have led to some confusion.


I agree about HR! What an annoying term. But, unlike "the r-word", which has a long history of being used as a direct insult and slur meant to be hurtful, "disability" has quite a long history of use for political organizing and as a political identity. So while I would actually describe myself as having some physical impairments (such as not being able to walk much), I would also say I am disabled, as a social and political identity. No matter if I mean something different by it than people might think as they use the word to describe me, because it is a point of struggle. Hope that makes sense. It is super interesting how and why people shift the meanings of words!


I wouldn't mind being called a resource if it were short for "one who is resourceful", but we all know that's not what it means.

Incidentally HR workers refer to themselves as "professionals".


We don't call people 'disabled' any more, we call them 'people with disabilities'. It sounds trite, but it puts the focus back on defining them as people, rather than by their disability. A bit like the difference between 'a black programmer' and 'a programmer who is black'.

The 'retarded' thing fell out of favour because it was heavily used as a pejorative, which doesn't help people who actually are retarded. When I was in neurology over a decade ago, the term had changed to 'developmentally delayed', as in, hadn't hit the mental milestones that a kid of that age should have hit. I loathed that term, because while most parents knew (or were told) to interpret that as 'old-fashioned retarded', to some parents it sounds like the kid is just delayed, and given enough time, will catch up. Some kids do, and at better rates than in ye olde days, but plenty of them don't.


I find it interesting that you chose to analogize "disability" with "black"...


I recently watched a video on social media that was about raising awareness of people with disabilities and encouraging people to use their votes this election cycle to make a difference. It was very moving until the end where they had "#CripTheVote"[0], and I was like "WHAT?" Isn't "crip" short for "cripple"? And isn't "cripple" considered insulting these days? Is this some sort of "take back the word" movement?

[0]https://twitter.com/hashtag/cripthevote


It's a reclaiming of the word.


I thought the idea behind "Human Resources" was that the people who work in HR are supposed to be a resource for humans, not the other way around.


They had to go with "HR" because "Geheime Staatspolizei" was too hard to spell.


Er, no, it's not; it's because that group manages a particular set of resources for the company, to wit, the human ones.


Oh, it never occurred to me that I'm supposed to be offended by "personnel", nor by "HR".


Interesting article! I've sometimes wondered if I should learn more COBOL. I'm 26, and one of my gigs was to reverse-engineer loads of COBOL code whose functionality was being migrated to something modern.

There were a few of these very long scripts, written in the 70s. At the top was "Here be Dragons".

It felt cool after I figured out what interest calculations it was performing; I ended up leaving a comment "Dragon Slayer was Here" at the end of the script.


I don't know man, you can't kill COBOL dragons with a Ruby sword.


you should learn the domain knowledge there while the experts are still around, then migrate it out to Ruby or .NET or whatever


I think most if not all mainframes run Java, so I think the saner thing would be to learn enough that I can port to Java.

There are still younger guys in our banks who write COBOL, but the domain expertise sits with the much older guys. It has lately become common for the older guys who go on pension to come back as "independent consultants". They earn a lot, of course.


eh, modern mainframes even run Linux, so not really a problem there technically

Just find one of the experts who is willing to answer whatever questions you have on the legacy COBOL code


It's a solid skill, and one where a lot of the talent in the industry is going to retire soon. Mainframe skills in general, and COBOL specifically, are still very entrenched in financial services (particularly insurance). I don't know a lot of people under 35 who don't shit themselves writing JCL.


How does one go about learning all the ancillary skills? I mean there's COBOL, sure, but then there's the system architecture of a mainframe, something for which there is precious little documentation and learning material.

Some fiddling around with an (almost certainly illicitly acquired) copy of IBM z/OS on an emulator almost proved to me that no free resources exist.


That's a really good question that I haven't had to think about for a while - I've had some kind of mainframe access my entire career, although I've actively tried to avoid doing that kind of work too frequently. It's just not something I've ever enjoyed.

I'll ask around with my mainframe vets and get back to you. There has to be something out there.


I would be very grateful - a new (old?) way of thinking about computing is very interesting to me.


The COBOL programmer shortage is solvable if they simply pay people enough money. COBOL programmers do not make SV salaries, far from it. Yet the code they write generally handles more critical tasks than most of the stuff SV startups are producing (bank balances, payroll, insurance claims, etc.).

Though I found it to be stifling, I would work with mainframes again if the price was right.


The problem isn't money as such; there are certainly very, very good salaries to be made if you're an experienced COBOL developer (I don't know where you picked up the opposite idea).

The problem is training new developers: you won't get experience developing COBOL (not just the language, but the toolchain, the idioms of the time etc) without developing COBOL, and nobody is going to let you touch the system until you're experienced.

Unlike more recent languages, there isn't a huge range of small/simpler/less important (but still representative) codebases for novices to cut their teeth on.

I think it would take some sort of apprenticeship style setup to train new COBOL devs, but that's really, really hard to get right, and will be a huge drain on the few experienced devs you do have.


It's well known that financial companies are generally among the worst places for programmers to work. Learning COBOL just sets you up for a lot of that.


Finance is a very, very broad industry. Everything from Xero to your local credit union/building society to bulge-bracket investment banks falls into 'finance'. I work for a bulge-bracket IB, and all of the developers enjoy their work because it's what you would expect working at a technology company; maintaining an in-house document store, implementing complex mathematical models, and implementing distributed graph-based pricing systems for real-time data are all things I've had the opportunity to do here, and everyone is paid at market rate or higher for their experience.

It's a great industry to be in if you work in the right area, just like any industry.


Well known by whom? Sure it's Enterprise-y and you'll be in a cubicle. You'll also have a very secure position, reasonable working hours and demands, and work at a place with the financial resources to compensate you at market rate (or above).

There are a lot of places where it is a lot worse to be a programmer than a bank.


I guess it depends.

The pay is ok, the domain is interesting, and the work load is manageable.

When I look at some friends doing game development, they seem to work a lot more for a lot less. They have a cooler office though...

But I think it's a good idea to try to avoid COBOL.


One of my earliest jobs was working with COBOL (I am one of the few programmers coming up in the early 2000s who had COBOL in not one but TWO core classes required at my college). The pay was rather good for my Midwestern location; 80k+ starting base pay was well over double the average pay for college graduates.

I thought it was actually fairly interesting work. COBOL is a very neat language. Like anything, though, at age 22 it's not likely you'll stick around at any job, much less one where you're generally just doing maintenance, source control, and corner-case bug bashing. I moved on, but I'll always remember the job.


Where did you attend college? I also took two COBOL courses at Miami University (Oxford), taught by a professor who preferred to be called "El Supremo". Then, in my senior year, I TA'd them again... thus sitting through a stunning 4 semesters' worth of COBOL during the early aughts.


Didn't go there, but same state. Northern location, private school. Given Ohio, should narrow it down to about 750 different locations. :)


$80k in the early 2000s, in the Midwest, as a new grad, seems astoundingly high. That's $102-112k in today's dollars (2004/2000->2016).


It was high for sure. But others received similar offers. I graduated in 2004, so you have to adjust for that time period where tech was exploding and the economy was recovering post 9/11. Jobs were everywhere.


Oddly enough, the COBOL programmer shortage is solvable if COBOL dies. Banks and financial institutions are literally running COBOL code from the 70s. Hundreds of millions of lines of COBOL built up since the era of antiquity. The faster we move on, the better.


I'd much rather have my bank run on 45-year old COBOL code than 45-day old Go or Rust.


We are still using buildings and bridges that are more than a century old. Why should we stop using programs that work?


One reason would be that these programs crash, require updates to keep up with industry/process changes, and/or will eventually become obsolete in terms of performance or fit with the process.

But the knowledge required to fix or update these programs is almost completely owned by the programmers who developed them, who are in the process of retiring or are unwilling to share knowledge for (justified) fear of losing their jobs.

This creates an unjustifiable dependency for the company, which will have a very hard time finding senior developers to replace the original developers, or even recruiting <modern language> developers to build a new and more modern system. What happens to the company if the original programmers retire, get sick or leave?

Every project that I know of that focused on replacing mainframes running Cobol quickly reached nightmare-like complexity during the analysis phase - especially because the original developers were uncooperative and the scarce documentation was obsolete - and was replanned to allow for at least twice the time (and sometimes twice the work/FTE), which will also increase the cost of replacing the technology by a few (tens of) millions...


> Every project that I know of that focused on replacing mainframes running Cobol quickly reached nightmare-like complexity during the analysis phase - especially because the original developers were uncooperative and the scarce documentation was obsolete

That has been my experience as well. It's also the case that business processes get designed around the COBOL code, not just vice versa. Separating out what are actual business requirements from "what we've always done on the mainframe" can be fiendishly difficult, and not just the original developers but the customers can be extremely uncooperative. It takes management coming in and saying "these are the new business processes, they are compliant with the industry standards, and our new software is going to implement them" to get things to actually happen. How that works for a changeover in something like a bank is beyond my experience, though.


There is nothing wrong with using old infrastructure. However, there is something wrong with treating it as a sacred black box.

A company using ancient COBOL code should have people who understand it, but also the ability to test it, and to talk to it from new code.

Once you get to that point, replacing the COBOL is not so hard -- once you have a reason to do it.


It is much harder to modify old software than it is to modify an old building. I've seen projects to modify some small aspect of our system that took 10+ developers six months to implement, but could have been done in an afternoon by a single dev with an IDE, had the project been written in a statically typed language.


COBOL is statically typed. In fact, one of the good features of COBOL is that all fields (variables) have exactly specified lengths, including numbers. You don't just say "I want an integer"; you have to say exactly how many digits are in the integer, assuming you use PIC fields (decimal) instead of COMP (binary). Non-numeric fields have exact sizes too, like X(20). It is impossible to have a buffer overflow in COBOL.
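
To make that concrete, here is a minimal sketch of such a record layout (field names and values invented; the inline *> comments are the modern style, so this assumes a reasonably current compiler such as GnuCOBOL):

      IDENTIFICATION DIVISION.
      PROGRAM-ID. PICDEMO.
      DATA DIVISION.
      WORKING-STORAGE SECTION.
      01  CUSTOMER-RECORD.
          05  CUST-ID    PIC 9(10) VALUE 123.     *> exactly 10 digits
          05  CUST-NAME  PIC X(20) VALUE "SMITH". *> exactly 20 characters
          05  BALANCE    PIC S9(7)V99 COMP-3
                         VALUE -1234.56.          *> signed packed decimal
      PROCEDURE DIVISION.
     *    MOVE pads or truncates to the declared sizes; there is no
     *    way to write past the end of CUSTOMER-RECORD.
          DISPLAY "ID=" CUST-ID " NAME=" CUST-NAME
          DISPLAY "BALANCE=" BALANCE
          STOP RUN.

Every record therefore has a byte layout fixed at compile time, which is also why these programs map so directly onto fixed-length dataset records.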


A given COBOL module is statically typed, but a typical (in my experience) COBOL application is made up of many modules and there is no type checking between them.


The equivalent would probably be the (literally) Byzantine sewage and water systems from the Middle Ages in very old cities like Istanbul ;)

They're everywhere, they do.. something.., and nobody can replace them all at once. Even if you replace some of them, the remaining part still means your system gets swamped by the old mess.


It is generally true: less 'fancy' work gets you less money, irrespective of how critical your code/work is. It is an irrational element that is not necessarily driven by supply-demand dynamics.


It will take some time before they realize it, though. And I find it hard to believe that any company would pay boatloads of money to someone who has no experience in a language, even if they're an excellent programmer in another language.

I'm pretty sure the problem is worse than just offering money. This is why many large corporations will pay a lot more for consultants and the like rather than risk it with independent contractors, despite the contractors being much cheaper.


> And I find it hard to believe that any company would pay boatloads of money to someone who has no experience in a language, even if they're an excellent programmer in another language.

Top tech employers will routinely hire you (with generous compensation packages) to program primarily in a language you're not experienced in.

Big non-tech companies are simply not set up to realize that developers are assets, not peons.


The developers in a tech-company are working at a profit-centre, generating income.

In a non-tech company, they are working in a cost-centre.

Ironically, a lot of non-tech companies have not yet realized that the services they provide are actually digital, and their profit centers are actually just a sales organisation.


It's possible to have a shortage even if companies are offering the maximum salaries they can afford.


The only users of COBOL left are large corporations - they can definitely pay programmers more. A good software engineer can pick up COBOL skills in a couple of months, six months tops. We are not talking about neurosurgery.


My impression from the article was that the barrier to entry for new hires was less the language and more the domain-specific knowledge of the industry and the institution.

Salient quote from the article: "the time before a new employee can stand on their own feet is 2–3 years"


Exactly. It is not like COBOL is that hard to learn, what is hard to learn is how these old systems work and what all the code is doing.


> We are not talking about neurosurgery.

Most neurosurgeons just follow textbook prescriptions. I don't think this is a good analogy.


> maximum salaries they can afford.

> bank

Uh, you would refuse to do mainframe programming for $1m/year? Seriously?


I meant the maximum amount they judge to be financially prudent. Roughly speaking, an employer will only pay an employee up to the value of that employee's work.


Based on your reply, I'm sure I don't need to explain how markets work?


I don't think so, but I also don't understand the complaint you seem to have.


"Value of the employee's work" is determined by the market. So either you can get someone cheaper or you can't. Either you can run your billion dollar banking operation without them or you can't.


When I said "value of the employee's work," I meant the financial upside to the firm for hiring a specific employee.


There is always a shortage of people willing to cut the grass with scissors for $0.05 an hour. There is always a shortage of people (for any price) capable of running a large corporation successfully. The former you don't need, so you go without. The latter you do need, so you pay what they cost.

If there is a shortage worth complaining about, it implies there is a critical need (a large financial upside -- these articles make it sound like it's an existential requirement for the bank) for hiring an employee that knows Cobol. My point is that if the compensation scaled with that critical need, plenty of people would be picking up Cobol right now, just like plenty of people are honing their business management skills.


You know you have job security when you fuck up so badly the government steps in and you still don't get fired. Most of us can only dream of ever achieving that kind of job security.

Very nice article. Captivating read. I was interested in what kind of 'previous computer experience' your mom had. That era of computing is completely alien to me.


Also comes down to Sweden having a "how do we fix this and make sure it never happens again" culture rather than a US-style blame culture. You can get fired in Sweden if you're grossly negligent or disloyal, but not for a simple mistake that anyone could have made.


This is a very good attitude. In general, the impression I get of Nordic countries is of a more egalitarian social contract with people looking out for each other rather than the culture of personal progress at any cost that we see elsewhere.


I suspect the climate has something to do with it, over hundreds if not thousands of years.


If you have one developer who can make a mistake so bad that the government steps in, you have a process problem, not a developer problem.


My father retired from his job as a COBOL programmer after 40 years. He had only that one job the whole time. He lasted 14 years under communism (pre-1989, Eastern Europe) and 26 years after. That's about as stable as it gets.

As a kid, I remember him coming home with IBM-style punch cards. He was putting the cards into something that looks like a money counting machine. It was pretty fast and noisy; I remember being fascinated by it. His computer room was filled with a huge, 3-rack, communist-bloc-designed computer, with 256KB of RAM and 60MB of HDD. The HDD was as large as a washing machine and took 2 minutes to spin up from a cold start. The actual data discs were interchangeable (so the disk itself was spinning in regular air, not in a vacuum).


> He was putting the cards into something that looks like a money counting machine.

That is/was the punch card reader, with a metallic hair-comb-like contraption (each prong of which was presumably connected to some sort of opto-coupler or a similar sensory component) for sensing the holes in the cards as they passed under it at high speed. As a kid, I used to have a lot of fun watching that in operation at my dad's office.

He used to write code (COBOL and Fortran) by hand on those graph paper-like coding sheets, then hand the sheets over to a 'specialist' card punching machine operator (I remember they used to have special courses for that too) who would punch the cards and so it went... :-)


> Most of us can only dream of ever achieving that kind of job security.

Work on something that nobody else wants to do for 50% of what you could make doing something more fashionable.


>I can only imagine the fat paycheck a 20-year old mainframe programmer would get though, because your age in this case would be invaluable.

I have a sneaking suspicion that this is not the case. Companies that want top talent and are willing to pay for it are not shy about that fact. Seeing a legit job posting for a position that pays double your current salary is a solid incentive to apply.

I'm guessing that the subject of this interview probably makes a less-than-competitive salary. Programmer pay has handily outpaced inflation for decades now. She probably started at around $30,000/yr in 1991; if she managed to get a 3% raise every year, that would only bring her to about $63,000/yr (30,000 × 1.03^25 ≈ 62,800).

Naturally, I'd love to be wrong about this. This woman has a demanding and important job and deserves a generous salary for it.


Also, I guess that when they say COBOL they really mean 'our decades-old spaghetti mess of COBOL', not COBOL itself. I doubt anyone would struggle to 'learn' COBOL, but it would take years to figure out exactly how these systems work.

Therefore there's not much chance you can go to another bank and do the same stuff, surely? Therefore less pay.


Exactly. The complexity here is not in learning Cobol; it's understanding a complex, mission-critical legacy system with no documentation, no support, and little to no methodology or best practices behind it.


A $63,000 salary does not seem too bad for being in Sweden. Sure, it's not as ridiculously high as some salaries are, but the general public would consider it a high compensation.


Yeah you're probably right, that was just me asking the question to myself more or less.

I've been raised to not ask what people make, that it's considered rude - so I have no clue what she makes.


> I've been raised to not ask what people make, that it's considered rude - so I have no clue what she makes.

Sure, but within the same family? While I was a kid this was vital to get some idea of the family budget. :P

Of course, it's simple enough to look it up at Skatteverket...


You're not wrong. COBOL programmers aren't paid well AT ALL.


I made a decent salary as a COBOL developer (more like a maintainer, since nothing new was written). I was a first-year graduate making $80k+ in a Midwestern town. Senior salaries are probably hard to come by, I imagine, though...


I've got an uncle who's a COBOL solutions architect at a large company, and he makes less than 125K. It's a decent living but he doesn't really have the ability to make a move like many of his .NET-based colleagues.


> It's a decent living

It's over double the average family income in the USA.


So?

Developers straight out of college can sometimes make triple the average household income.


The point is to compare yourself to the entirety of society, and not just 'keep up with the joneses' in your own special bubble.


Sure. I feel quite lucky that my particular skillset is richly rewarded.

That doesn't mean there aren't underpaid software engineers. Even when their lowball salary is higher than the average household income.


"Decent living" doesn't mean that you're under or overpaid for your skill set, position and responsibilities. It usually means based on your geographic area as a whole (or objectively whether or not you can sustain yourself).

The median household income in the US is a "decent living" in almost the entire country, even some urban centers. Saying that 2x+ is a "decent living" is completely tone deaf.


Depends on where you live. In SV, sure, but they have the highest cost of living in the world.

In Dallas, a fresh out of college CS grad can probably expect a salary in the lower 40s. And the cost of living is cheap enough you can afford a house on that.


Really? That seems weirdly low, even with a lower COL taken into account. In Southeast Michigan a new grad would expect to start somewhere around 60, and that's enough to buy a decent to huge house depending on where you end up living.


That's a sad salary. Especially when moving a few hours away to San Antonio will yield a 20k+/yr salary increase. Even if housing in Dallas were free, it's still better to move from a financial perspective.


Where in DFW can you purchase a nice home on 40k?


I'm talking about renting, not purchasing.


Well over double (halfway to triple).


For every one of your uncle, there are 10 bodyshop guys billing out at $40-60/hr.


> I have a sneaking suspicion that this is not the case.

I would agree. Between the fact that new companies aren't generally standing up mainframes, and that existing companies are finally getting off them (I know one bank locally that's migrated critical parts of its payments processing to commodity hardware & languages, as well as the most critical government payments processing system), I suspect it will be a shrinking market. I certainly wouldn't bet on an uninterrupted career out of it, put it that way.


She probably would have gotten a big pay bump -- if temporary -- around 1999...


I'd love to have heard some actual quotes from your mother; how did she get into it, what are the best/worst/strangest parts of her day, how does she think the future will look given the lack of young COBOLists, etc.


You are right, I apologize. I'll ask her what her days look like and about the best/worst/strangest parts. I'll add that directly to the post and reply to you when it's done!

EDIT: I've added some Q&A at the bottom, will add more as more questions come in.


The Q&A bit was my favorite part; it sounded more like direct quotes.


Thanks, that's lovely! I agree with stuaxo, it is my favorite part.


Hm, my mom programmed on a mainframe with the pre-Fortran "Michigan Analog Decoder", is that interview-worthy?


Very much so! I'd love to read it...


I found lots of analog companies Googling for it, but no references to Michigan being in the business. So, yeah, I'd be interested in a link about it and/or the story.


K, turns out it was actually Michigan Algorithm Decoder:

https://en.wikipedia.org/wiki/MAD_(programming_language)

So, not quite as old-school as the analog days :-p


I was trying to imagine how an analogue component could be anything but a sensor for a mainframe in a factory or something. A programming language called MAD makes so much more sense. :)


Just googled it myself and found the same -- following up with her about the name.


If it makes an interesting read, then yes :)


It's not that COBOL is particularly good or bad. It's that COBOL-era database technology is painfully obsolete. Flat files, ISAM, and IMS are all pre-SQL, and old shops are likely to still be running some of that stuff. DB2 isn't too bad; it speaks SQL and runs on various platforms including Linux.

With the older systems, the database update logic and the business logic are in the same programs. This is not good.


> Flat files, ISAM, and IMS

Those were all good ideas at the time.

In fact, IMS was such a good idea the hipsters just reinvented about 1% of its functionality and called it MongoDB!


I was explaining NoSQL to an older coworker of mine who bemusedly humored me for 20 minutes or so before explaining she'd started her career off with Pick in the 1970s...


Right. Much NoSQL is a re-invention of the CODASYL DBMS concept from 1969.[1] Everything is explicit links.

[1] https://en.wikipedia.org/wiki/CODASYL



Yes


Funny, cuz that's what I've been saying about IDMS for years:

https://en.m.wikipedia.org/wiki/IDMS


> With the older systems, the database update logic and the business logic are in the same programs. This is not good.

Can you elaborate? Sometimes it seems that business logic must be implemented in SQL, for example, or else you will get a massive performance hit.


The article has an explanation of this point, I thought it was plenty to understand what was going on.


Why would you want to be forced to rewrite business logic just because you're using a different database provider?


The COBOL I worked with was all SQL with DB2. Admittedly we were the most "modernized" part of the company.
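
For readers who have never seen it, embedded SQL in COBOL looks roughly like this - a hedged sketch with an invented table, and it needs the DB2 precompiler and a bound plan rather than compiling standalone:

      IDENTIFICATION DIVISION.
      PROGRAM-ID. RATEUPD.
      DATA DIVISION.
      WORKING-STORAGE SECTION.
     *    SQLCA supplies SQLCODE and related feedback fields.
          EXEC SQL INCLUDE SQLCA END-EXEC.
      77  WS-RATE    PIC S9V9999 COMP-3 VALUE 0.0325.
      PROCEDURE DIVISION.
     *    One set-oriented statement: the filtering and updating run
     *    inside DB2 instead of dragging rows into the program one
     *    READ at a time.
          EXEC SQL
              UPDATE ACCOUNTS
                 SET BALANCE = BALANCE * (1 + :WS-RATE)
               WHERE STATUS = 'OPEN'
          END-EXEC
          IF SQLCODE NOT = 0
              DISPLAY "UPDATE FAILED, SQLCODE=" SQLCODE
          END-IF
          STOP RUN.

This is also one answer to the performance question upthread: the set-oriented statement lets the database do the work instead of a row-at-a-time loop in the application.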


It was interesting, because the interview gave the impression they were quite behind the curve on mainframe technologies - our mainframers work predominantly with DB2 as a back end and mostly migrated off the more primitive stores quite a long time ago.


The banks have had zero incentive to spend money on their systems. From a banker's perspective, it's a cost centre that you need to have, since it's cheaper than people doing manual stuff.

It's only recently that someone realized that, since all services are delivered digitally, the banks would have been outcompeted by Google, Microsoft, Apple, or just about any competent and large enough tech company, were it not for the heavy regulations and a conservative financial world.


> The banks have had zero incentive to spend money on their systems.

Yeah, that's just not true.


As far as I see it, they have to spend shitloads of money sometimes, but there is no incentive to spend money on improving and adding features to the core systems such as the ledgers, as it is very hard to get a return on the money that way, partially because it will not really improve their market share.

The money is spent on systems on top of the core systems (web etc) basically trying to stay on par with the competition. If they spend on the core systems, it's when they try to consolidate and save costs.


DB2 is bad... If you're used to using MSSQL, MySQL, Postgres, or SQLite, the tooling and limitations of DB2 are pretty galling.


> DB2 isn't too bad

DB2 is what Oracle wishes it could be when it grows up.


At the end of the 1980s, I was in a systems administration class for a now-gone minicomputer line. There were two women in it who seemed quite mature, but must have been younger than I am now. One told a story about her introduction to computing that has stuck with me.

She passed an aptitude test to become a programmer: by now I have forgotten whether she worked for a computer manufacturer or a customer the size of a utility company. Whichever it was, the company had a machine room with a glass window opening onto a busy street in Manhattan. In those days, programs mostly resided on punch cards, and data mostly resided on tapes. One would bring one's cards and, I suppose, tapes to the operators. They would hang the tapes, run the cards through the reader, and the tapes would commence to spin back and forth.

If all went well, the program ended with the tapes spinning back onto the reel. In case of an ABEND, everything stopped, and every New Yorker on the sidewalk could tell that the program didn't work. It struck me as being far more public a test than most of us my age and younger have ever had to deal with.


The statements about batch processing are interesting. For example, when you transfer money to someone at a different bank, one way is for the banks to have correspondent accounts with each other, so the transfer is just within accounts at one bank. When one of the correspondent accounts is out of money, they have to do a real inter-bank transfer, probably via the fed or some larger bank.

Bottom line is that there is parked money in these accounts. The banks should have every incentive in the world to minimize this parked money (either to get access to their own money or to avoid paying interest on someone else's). So there should be pressure to make the systems as real-time as possible, not batched.


Banks already do some of that locally and internationally.

In South Africa we have BankServ, which is a local interchange and clearing house. Transactions are still in batches, but the settlement between banks happens nett between the local banks.

Internationally, MasterCard, VISA, Amex, etc. manage the clearing. Banks then have to pay within 2 days via SWIFT to either other banks or their designated international settling bank.

Going back to your suggestion, it could work because the banks could just pay each other when they run out of funds. The challenge becomes fraud and other legal issues, which often mandate settling houses.

The real-time problem is a legacy hardware problem as well as the settling issue. On your last point, money is normally parked at the Central Banks, mostly as a legal requirement. From my experience, banks in healthy financial economies like mine don't keep a lot more than the minimum cash reserve with the Central Bank.


> Transactions are still in batches, but the settlement between banks happens nett between the local banks.

Same thing in Sweden, I believe. Cleared through Bankgirot.

https://en.wikipedia.org/wiki/Bankgirot


Would it make much difference? There will be a transaction in the other direction soon enough.


Yes. All current "direct transfer" solutions more or less require that you keep a rather large sum of money in a common pool (at the 3rd party) to cover the event that you run out of money and have to close the bank - to guarantee that all (or at least most) of the money is actually available when the debts are settled, typically a few times a day.

I don't think real time settlements involving the central banks will happen any time soon, but maybe more frequent batches.


First off, great post.

It's very nostalgic for me because the author is fascinated by the technology of the day. COBOL, hierarchical databases, and batch processing were still the norm in IT when I got into the field. My first programming job was around 1992-93, on a VAX using SQR and PL/SQL for an accounting department. We also had to be able to read COBOL. By contrast, my friend got into the field 5 years ahead of me and was hired into an insurance company. His IT department was just switching from assembly and IMS on an IBM mainframe to the "newfangled" COBOL going against DB2.

I left IBM after a year, stayed in my first programming job for a year, and then started a 5-year stint with startups. When I hit the startup scene I realized how a new norm was developing. Small servers, C/C++, relational databases, and OLTP were the dominant paradigm in smaller organizations.

The author did a nice job describing this period in the history of computing.


If you're interested in knowing how banking works between banks, Planet Money had a podcast on it a few years ago. Apparently bankers met in a parking lot in New York and traded bags of paper checks back in the day.

http://www.npr.org/sections/money/2013/10/04/229224964/episo...


Andrew Turley has given a nice COBOL talk; the title alone - "What We Can Learn from COBOL" - is worth the price of admission:

http://confreaks.tv/videos/goruco2014-what-we-can-learn-from...


Really interesting! Thank you both.

I would love to know what a 'typical day' and the general working environment are like. (A vague question, I know, but sort-of on purpose.)


My mother indirectly used COBOL in the microcomputer era: she used Real World Accounting on 68000-based Radio Shack Model 16s under something called the Ryan McFarland COBOL Operating System (RM/COS). Eventually this became available on Xenix and SCO UNIX on PCs.

My one and so far only COBOL program was an ASCII-graphics game of checkers for the TRS-80 Model 16.


That's interesting because (a) it's the first I've heard of COS and (b) I can't find jack on it. A bunch of ComputerWorld articles and a patent reference it, with a description saying it's a custom OS for COBOL apps. A web site says it's for an abstract C machine, but nothing to confirm that. BitSavers doesn't have anything that stands out to me. Checked under Tandy, as ads indicated it was licensed to them.

I'd be interested in links to guides on this product, or scanned pages that show whether the OS itself was COBOL. I'll put it on Wikipedia's Timeline of Operating Systems list if I get the info. I've gotten quite a few added there so far.

Edit: Some paydirt showing the company, along with at least its compilers, was absorbed by MicroFocus. The tech might still live on somehow.

https://en.m.wikipedia.org/wiki/Digitek

Searching with Digitek, MicroFocus, and Liant still didn't get me jack. This is some phantom of a product.


Search for "rm/cos 68000"

I remember a little about it:

You could set up "partitions", which meant reserving fixed amounts of RAM for different applications.

It ran on the (8 MB I think) external hard drive for the TRS-80 model 16.

An advantage compared with Xenix is that it was smaller (used less of the hard drive).

Also it had just a single directory, but had a form of hierarchy in that you could use any number of '.'s as separators, like usr.bin.ls. The total name could be long. I think OS/360 worked something like this.

I think the last line of the screen was always the command line, and the other lines were for command (or COBOL menu screen) output and worked like a pager.

You could definitely hook up serial terminals.

I found this:

https://books.google.com/books?id=05wAGZQlo9QC&pg=PA694&lpg=...

http://www.1000bit.it/ad/bro/altos/Altos-68000-Systems.pdf

It's the same as "COS990", but for 68K.


Thanks! That last link is helpful! It's actually quite an interesting and capable OS. It has file & software protection mechanisms. I recall UNIX having almost no reliability at the time. Also, coding, compiling, and directly running apps without linking is closer to a LISP machine than a regular OS of the time.

Might give it at least a brief Wikipedia article with these links.


> the first I've heard of COS and (b) I can't find jack on it.

If you can lay your hands on some issues of BYTE and PC Magazine (Ziff Davis) dated between 1985 and 1990, you would find plenty on RM COBOL.


That's another angle of the confusion. Most of the search results are old magazines with basically the same advertisements. They advertise both RM/COBOL and RM/COS. They're different products. There's a bit of RM/COBOL information given it basically worked its way into MicroFocus. It's RM/COS that they never describe past "an operating system specifically for COBOL."

Note: It's going to be irritating if I find it's another DOS variant after all this work haha.


I think you are overgeneralizing IMS performance. "IMS is extremely old and very slow. Searching for data can take hours." Especially when you later point out that they have 10TB of DB2 data and probably more than that in IMS. Not all IMS databases are slow. Not all MySQL / Postgres, etc. databases are fast.


Definitely not! I'm mostly quoting what she said. IBM has a lot of nice tech, but you have to understand that these are very old versions of IMS, and the volume of data they have does not make the case any better. Hierarchical databases are not always slow, but once you want to find stuff at the very bottom of the hierarchy, it starts to take some time.


Hierarchical and network DBMSs can scream, but you have to have a competent DBA curate the so-called "physical" relationships (indexes, record and index pointers, child record locality, block sizes, buffers, what-have-you) that support the logical views defined by the application schema. In either IMS or IDMS (where I have some experience) this can be a lot of work, done right. Measure, tweak, repeat. Those older DBMSs can be finicky, but properly monitored are great performers.

Conversely, a lazy DBA can cost you a lot of money. And lazy DBAs can be hard to identify - their domain is rather arcane after all.


Um, what about indexes?


These older databases often allow the use of indexes as an option. Of course ISAM (Indexed Sequential Access Method) is indexed. But it is also often possible to add indexes to the networked database's internal structures. An alternative is to use hash tables with overflow. In any case you usually must periodically scan through the database reorganizing the indices and/or the location of records, which can take up a lot of time and usually requires taking the database offline. Here's a good read (the first link is a PDF):

https://www.google.com/url?q=http://advsys.net/ghs/publicati...

or the same paper at Google Books:

https://books.google.com/books?id=YqM3pqFVt8IC&pg=PA8&lpg=PA...


Not completely related to the article, but I've often heard that banks argue that COBOL is more reliable and maintainable than any "modern" language.

I've always wondered why this is the case. Is it because people are way more careful when entering code? The article points out that their IDE is connected to the mainframe directly, which makes me believe an error is much more severe. I'm more confident that the compiler and unit tests can catch my mistakes than I am in my own coding skills (which is why I am not always afraid of changing tightly coupled systems). And even if worse comes to worst, I'd still be confident that some system will detect the change and roll back my commit before all goes south.

With COBOL (in such old environments), it seems that things must not break at any point, so developers are a lot more careful with what they do, which makes the code do what it is designed for very well. The problem with such an approach is of course upgradability (I would not want to touch undocumented code that has not been updated for 20 years because some government decided to replace an old "protocol").

Or maybe COBOL is considered more reliable (by banks, etc.) because managers just see the fact that "this system has been running for 15 years" while ignoring the problems that come with it.

But I'd like to hear your opinion about it.


I have some COBOL-on-IBM experience; I did a short stint about 15 years ago for a large grocery store chain. There is no way that COBOL is more reliable and maintainable -- it's effectively equivalent to programming in old-school line-number BASIC. The place I worked at was extremely strict on reviews, process, and testing. Every line was reviewed extensively by a group of people, and if you wrote a SQL query you always accessed the fields in alphabetical order. It was that kind of strictness. Some of this was because the consequences of a mistake are pretty severe (Hawaii doesn't get a shipment of groceries), but most of it was because the technology was so antiquated. What I can do now in a week would take 4 months in that environment.

I was one of the few new hires with actual education and experience in software development. The majority of new hires were people in their mid-thirties who were looking to change careers from trades, etc. The company was specifically looking for smart people that they could train extensively in house. Almost nothing you learn outside of that environment is applicable inside, and vice versa. The best way I can describe it is as a complete parallel evolution of technology. The technology is similar and yet also completely alien.

Connecting to the mainframe didn't mean you were editing the code live. It's exactly like shelling into a remote Linux machine to compile your code. You didn't work locally because there's nothing that is local.


When I worked with COBOL I mostly worked in a similar hosted setting, using only a dumb 3270 terminal to write my code. However, it's not the case universally; it is/was also possible to develop COBOL code on a PC that simulated the mainframe runtime (MicroFocus COBOL, if it's still called that, is what we used).


My father's one(ish)-man business was a PC-based COBOL application that ran from the launch of the IBM PC until he retired in the early-to-mid 2000s. I worked on it for a few years in the 90s (SPF-PC!).


COBOL development had (has?) a completely different development and deployment experience. Sure, you keyed your code directly into the mainframe, but your code and your session were isolated - you, the developer, had limited privileges. Only once your code took the prescribed input and produced the prescribed output was it allowed near "production" data.

That said, the language itself was (is?) so limited in scope and ability that stability was nearly guaranteed. You READ and PRINT using tape or disk and printers, but the programmer didn't worry about what devices were available - some system admin pointed these devices at a job, and READ came from that tape and PRINT went to that printer. The setup and management of the jobs that ran your code was a very manual process. No pointers, no memory management ... nothing low-level at all. Just high-level abstraction.
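
That indirection is visible right in the source. A fragment (names invented, not a complete program) from a FILE-CONTROL paragraph might read:

  FILE-CONTROL.
  *> INFILE and RPTOUT are logical names (DD names on the
  *> mainframe), not devices; the job setup binds them later.
      SELECT TRANS-FILE  ASSIGN TO INFILE.
      SELECT REPORT-FILE ASSIGN TO RPTOUT.

Whether INFILE turns out to be a tape, a disk dataset, or something else is decided entirely by whoever sets up the job, as the JCL comment below describes.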

Then there's hardware reliability. As an example: you can walk inside the mainframe, pull a CPU, and the thing keeps working. Replace that CPU, and the system makes use of it. None of the system's users ever know that it happened.


That job setup uses a language called JCL. And while COBOL reads more or less like English statements, to the point that anyone who has ever programmed procedurally could follow it, JCL is quite inscrutable. One old joke is that nobody ever writes JCL from scratch; they just take what worked last time and change the dataset names.

http://www.jargon.net/jargon/jargonfile/j/JCL.html
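
For the curious, a minimal job looks something like this (program and dataset names invented). The joke is accurate: in practice you copy a deck that worked and edit the DSN= lines:

  //NIGHTLY  JOB (ACCT),'DAILY POSTING',CLASS=A,MSGCLASS=X
  //STEP1    EXEC PGM=POSTTRAN
  //INFILE   DD DSN=PROD.TRANS.DAILY,DISP=SHR
  //RPTOUT   DD SYSOUT=A

The DD statements are what bind a program's logical file names (like the SELECT ... ASSIGN clauses in the COBOL fragment a few comments up) to actual datasets and printers.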


> Then there's hardware reliability. As an example: you can walk inside the mainframe, pull a CPU, and the thing keeps working. Replace that CPU, and the system makes use of it. None of the system's users ever know that it happened.

From what I understand, until Tandem came along and started to seriously eat away at the market in the late 1970s and forced all the other manufacturers to follow, mainframes were not known for reliability.


I'm still looking for data myself on reliability over time. I recall one OS for IBM mainframes bragging about its reliability because it ran six months or so without a reboot. I was thinking, "Wow... some good numbers there..."


> Not completely related to the article, but I've often heard banks argue that COBOL is more reliable and maintainable than any "modern" language.

What I've usually heard is that code that already exists and has been demonstrated in many years of use to work correctly in COBOL is more reliable and less costly to maintain in COBOL than the alternative of replacing working components with something in a modern language when new business requirements required a change.

Having been involved in a big-bang replacement of a large, working COBOL system with a from-scratch .NET solution, I'm not unsympathetic to this argument.

I've never heard the argument (though perhaps it's been made somewhere) that COBOL is, for a clean-slate project, more reliable or maintainable.


It's just IBM marketing :-)

To the extent it's true, it's because the people that have built these systems are experienced and have a lot of business knowledge, and know how to reuse and tweak their systems while not trying to build each new requirement with the latest interesting framework they downloaded 5 minutes ago.

But it's slow to code in, especially if you go outside COBOL's comfort zone.


This is probably in respect to a few things:

1) There's no need to constantly update your version of the language and framework, unlike with, say, Ruby and Ruby on Rails.

2) COBOL is battle-tested, so most of the failure modes are well known, unlike with adopting the latest and greatest.

3) I doubt that mainframes suffer from as many distributed-computing problems.


I also imagine they are fault-tolerant in ways we no longer consider today.


I am sad to see the mainframe platform go away. There are still some things modern platforms could learn from it; my biggest favorite is probably the heavy system instrumentation that comes with it (system traces, SMF and RMF). And WLM is also neat.

I am afraid it became a victim of its own success. IBM and other software companies got so used to huge margins that they stopped innovating enough, in order to protect the revenue.


I remember a professor at Stanford in an early CS class (I'm pretty sure it was Jerry Cain) mentioning COBOL programming and drawing a graph of salary vs. time for a given language that manages to be adopted widely. First, when the language is rare, there's a great ascent; then, as more people gain the knowledge, rates drop to some minimum as the supply broadens. Then, as the language shifts into a legacy context, the graph ramps up again. If I remember right, he claimed that ramp exceeded the initial peak from the language's earliest days. I forget what the context was, but I think it started with a question about which languages paid the most. I recall him quoting a $300/hr figure for COBOL because it's so old-school and the knowledgeable population so small. Then, come to think of it, I think he said the rates essentially drop to zero once the language is all but abandoned and transitions to modern equivalents have run their course.


It's funny what most of us consider "a lot of money". An Uber driver yesterday mentioned overhearing two organ replacement surgeons talking at his gym in Palo Alto. One said, "man, the pay sucks. It's only $13,000,000 a year :-(". And the other one responded, "sheesh, come work for us and we could at least get you up to $19,000,000 a year." Kind of puts things in perspective...


Yah, the free market is an amazing thing, where free individuals freely agree upon rates even as astounding as those figures. The even cooler thing is that there exist institutions that can fit that into their payroll! Maybe cooler still is that there are economies that can even sustain such institutions.

I remember going to a Bay.net talk with Juval Lowy, who beat down on everyone, calling devs code monkeys who get paid peanuts. Haha, the ~100K range is definitely peanuts compared to a seven-figure salary. What a world.


Almost nothing about healthcare in the United States resembles a functioning market, much less one worth marveling at because of all the freely available choices being made.


IANAD, but that sounds impossibly high. Sure they weren't joking or discussing how much their malpractice insurance policies cover?


FMH I wonder what the Surgeons I have met (I am on a transplant list in the UK) would say to that.


It's a funny anecdote, but in practice both supply and demand matter.

JavaScript is both extremely popular and rather well-paid. Why? Almost everyone needs it.


We have the full suite of IBM systems at work (z, i, and p), plus a multitude of other VM systems and Windows servers. They have been talking about retiring the z series for the last twenty years, yet have yet to do so. The reason is probably similar to that of the banks: not that there isn't sufficient talent, but just that it works, works well, and damn if every silly project to move a portion off the z, i, or p doesn't go wrong in some form or another.

The new thing is wrappers: extending the life of the applications and their data while allowing modern interfaces via the web and handheld devices. If anything, it's the best example of code reuse there is.

On another note, I don't know if we have anyone under fifty who knows COBOL well enough to do new work, but a few can do maintenance just fine.


Very interesting article, thank you very much!

I spent about 9 months in total in a mainframe team during my training, but that team mainly took care of hooking the mainframes up to the corporate network, so I did not do any coding except for a small Perl script. (I came close to having to code in REXX, but I dodged that bullet. I played around with REXX at home and did not like it very much.)

But it was very interesting to see all the examples of parallel evolution in the mainframe world, at least where the terminology was concerned. A reboot was an IPL (also used as a verb), the OS kernel was a nucleus (a term I really liked), and they also had a term for network interfaces that I've forgotten.

A strange but interesting world, it's a shame those beasts are so expensive.


Was expecting a Question and A...


Yeah, sorry about that! I'll add some direct Q&A at the bottom of the post!


Yeah, please do! Here's some questions I would ask:

1. How/when did you decide you wanted to be a programmer?

2. What were the challenges you've faced as a female programmer who started in the 90's?

3. You've been working for the same entity, and possibly the same code base, for more than 20 years. Does it ever get old?

4. How scary is it to write code for a bank?

I'm certain I would have 100's more questions to ask, but I'll be satisfied with at least these :-)


There! I've added those to the post at the bottom, with one more quite funny story that I didn't know about myself. Thank you all for asking; there are many questions I couldn't have come up with myself that I'm very happy to have answers to!


About the IMS to DB2 migration: intuitively, I would try to write a translation layer between the two databases. Do you know if this was already attempted, or deemed undoable (or silly) from the beginning?


DB2 is a relational database and IMS is not.

The programs that use IMS probably do lots of things that are simply not done in relational databases for good reason, so if you just translate it you will either get relational-constraint errors or you will configure DB2 to stop enforcing relations, losing most of the advantages DB2 has in the first place.
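
To see why a mechanical translation is hard, compare how the two are actually programmed: IMS is navigated one call at a time through the DL/I interface (CBLTDLI is the standard COBOL entry point), while DB2 is queried declaratively in embedded SQL. A hedged sketch with invented segment, PCB, and table names:

  *> IMS: position-dependent navigation, one segment per call
  MOVE 'GU  ' TO DLI-FUNCTION
  CALL 'CBLTDLI' USING DLI-FUNCTION CUSTOMER-PCB
                       CUSTOMER-SEGMENT CUSTOMER-SSA

  *> DB2: set-oriented; the optimizer picks the access path
  EXEC SQL
      SELECT BALANCE INTO :WS-BALANCE
      FROM CUSTOMER
      WHERE CUST_ID = :WS-CUST-ID
  END-EXEC

A translation layer would have to turn thousands of such call sequences, whose meaning often depends on the database position established by earlier calls, into set-oriented SQL. That's why it is rarely attempted wholesale.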


My last job was at a COBOL shop that used Ordat ti2sql (an emulation layer) to run its HP mainframe code on top of Solaris and DB2. I'm sure similar products exist for other mainframe architectures.

http://www.ordat.com/en/solutions/tisql/


Interesting personal perspective, thanks for posting!

It sounds like other than the very slow database, the biggest impediment in this space is the lack of new talent familiar with the environment.


Talent is only half of the problem. Where I work they did relatively little hiring for a long time. Many of the people I work with have been working here for 30+ years. Now we have a massive, antiquated code base that is poorly documented and the system knowledge is evaporating due to retirements faster than we can transfer it to new hires.


Every so often I do some research to see if the lack of talent is being reflected in salaries, but COBOL jobs always seem to pay less than web development.


It's not just web development, it pays less than almost any development.

I would actually not mind learning COBOL and writing boring bank software, but why bother if I can make more money with C++ and Python, while working on more interesting stuff?

If I'm going to take a pay cut to use an exotic language, I'm going to find a job writing Lisp.


That was my experience too with the whole mainframe ecosystem. There are a few hotshot consultants working for Sirius/Mainline/etc. that make the big bucks, but the guys working for the banks/airlines/etc. mostly seem to be underpaid relative to the shrinking knowledge base/talent pool and how critical their jobs are.

Makes you wonder what would happen if the entire mainframe tech organization at one of these banks demanded to be paid on par with the traders working on the investment side of the bank.


> Makes you wonder what would happen if the entire mainframe tech organization at one of these banks demanded to be paid on par with the traders working on the investment side of the bank.

It's for organizations like this where I really see unions making sense.


Lack of new talent is indeed the biggest problem they are facing, but they've acknowledged that they've gotta upgrade these old systems before it's too late, instead of hoping for new talent to show up. Nordea in question has bought a new system that they want to gradually migrate to, but all their mainframe programmers, including my mother, are saying it's not going to work. The system is too large.


I've worked on a project that migrated an old COBOL system, and it went fine although everyone said it would not.

But it's not easy and not cheap.

The problem for the banks is that they have so many systems, typically thousands, and they are all interconnected.

Enter the 'enterprise architects' who draw PowerPoints with boxes and buses on top of all the old shit, and you have a recipe for a very expensive soup.


Absolutely. In many cases, universities can't afford their own IBM mainframe environment to allow students to learn how to use it. Many folks, such as myself, have had to rely on on-the-job training to learn COBOL and the IBM mainframe itself.


OpenCOBOL is available for learning the language. I do not know it myself; I just checked MacPorts and yes, it is there.

Description: OpenCOBOL is an open-source COBOL compiler.
Homepage: http://www.opencobol.org/
Library Dependencies: gmp, libtool, db44, ncurses, libgnugetopt, libiconv, gettext, mpfr
Platforms: darwin
License: GPL-2+
Maintainers: egall@gwmail.gwu.edu, openmaintainer@macports.org
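
If you want to try it, a complete program is tiny. Assuming the cobc compiler that ships with OpenCOBOL (now GnuCOBOL), this builds with cobc -x -free hello.cob:

  IDENTIFICATION DIVISION.
  PROGRAM-ID. HELLO.
  PROCEDURE DIVISION.
      DISPLAY "HELLO FROM COBOL".
      STOP RUN.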


Is that environment something that could be virtualized, were it not for intellectual property reasons?

Almost seems to be more of a vendor lock-in issue than a mere 'old and scarce platform' issue, albeit that's significant also.


There's already an emulator called Hercules, but you need a license from IBM to run the operating system all of this runs on top of.


Maybe telling IBM about your interest would lead them to license it for free to emulator users. They wouldn't want their platform to die because nobody knows how to use it.


You'd think that, wouldn't you? Instead, they've sued them in the past: http://arstechnica.com/information-technology/2010/04/ibm-br...


That's because they've been a bit naughty though, not because of mere corporate evil. Quoting the article you linked:

-----8<-----

A well-designed System Z emulator that allows users to migrate their own mainframe applications to commodity hardware would obviously pose a serious threat to IBM's mainframe business, but IBM's software licensing terms have historically prevented such a threat from materializing.

...

In many ways, the project arguably benefits IBM by encouraging interest in the mainframe platform. [...] What brought about IBM's change in perspective was an unexpected effort by the TurboHercules company to commercialize the project in some unusual ways.

TurboHercules came up with a bizarre method to circumvent the licensing restrictions and monetize the emulator. IBM allows customers to transfer the operating system license to another machine in the event that their mainframe suffers an outage. Depending on how you choose to interpret that part of the license, it could make it legally permissible to use IBM's mainframe operating system with Hercules in some cases.

Exploiting that loophole in the license, TurboHercules promotes the Hercules emulator as a "disaster recovery" solution that allows mainframe users to continue running their mainframe software on regular PC hardware when their mainframe is inoperable or experiencing technical problems. This has apparently opened up a market for commercial Hercules support with a modest number of potential customers, such as government entities that are required to have redundant failover systems for emergencies, but can't afford to buy a whole additional mainframe.

-----8<-----


Sure, but naughty or not, having IBM be actively hostile to the only cheap way to get access to a mainframe-like environment is disastrous for their skill pool.

So if they didn't like Hercules' efforts, they should have brought out their own, or come to some kind of arrangement with them.


Interesting tidbit: I'm currently replacing a product that IBM refuses to license any longer. The company I work for would have gladly gone on paying $2MM a year in licensing fees, but IBM is literally refusing to take their money. They offer no replacement either.


Even working for a company that had mainframes, getting access to do any kind of learning was very difficult.

They rarely have dedicated training systems, given the cost of running them, and trying to find information from IBM was an exercise in frustration.

Personally, I think that is where IBM made a huge mistake. The lack of low-cost training systems has starved their ecosystem of new coders.


There have been versions of COBOL for PCs since the pre-Windows DOS era. Not free, but neither was anything else in those days. Unix wasn't free either, and neither was most Unix software.


From a technology perspective, the mainframe software described in this interview is clearly an evolutionary dead-end. It was written in an era when every bank had its own programmers and wrote its own proprietary software.

Because of the widespread use of these homegrown systems in the banking industry, however, it appears that management is not yet ready for a total break with the past. The investors that own these companies need to demand change and indicate they are willing to pay the cost.


Where is the interview? I want the interview!


Is it possible to get into COBOL still? I don't know anything about it; most people seem to say it is horrible to use. Would it be career suicide? Will it be totally gone in X years?

Given the chance, I'd probably jump at the opportunity to work with it.

For reference I do iOS development now and I have ZERO experience with COBOL.


I love this article. My own stepmother just retired from a 30+ year career as a COBOL programmer. One of my first gigs was actually doing a bit of COBOL myself - it's an interesting language in that database access is built into the language itself, rather than being a library or add-on.


There are tons of (young) mainframe programmers and experts from India. Banks are facing no such problems. And no, they don't necessarily earn fat paychecks.

And I think you have broken a security rule if your screenshot of ISPF is from her bank. This whole article seems risky for her job.


That's not her ISPF; she's got much nicer colors! That's an image of ISPF that I found.


Awesome story, thanks for sharing! It's important to take a step back and appreciate history.


I'm not sure it's fair to call it history -- it is because of history (what isn't?), but what is described is the way things are right now, eh?


Japan's Mizuho Bank is notorious for struggling with system updates. Big mergers bring big office politics.

They were in the news in 2002, in 2011, and again right now, as they struggle to finally merge the three systems of their pre-merger banks.


  I can only imagine the fat paycheck a 20-year old 
  mainframe programmer would get though, because your age 
  in this case would be invaluable.
As long as you're on the topic of compensation, how much does she make right now?


Not OP, but I'm working at an investment bank in Western Europe as we speak. The mainframe guys next to me are making around 1-2k EUR a day as contractors.


Thank you for writing this! This was a very fascinating read - very insightful.


You can interface COBOL with modern technologies. See, for example, this web framework:

http://www.coboloncogs.org/


This tech stack is basically the same as what most insurance companies in Australia use: IBM mainframes, IMS, DB2, Endevor, etc. There's a lot of this type of work still out there.


Is there really a market for my COBOL knowledge still? It was one of my favorite languages and honestly it is what got me into SQL and other data analysis languages.


Thank you for sharing this. Off topic, but I thought the article was very well written!


I studied Fortran, COBOL and RPG II, later doing RPG II programming on IBM S/36 computers for 4 years until I moved to the PC.


>There’s nothing wrong with the language itself, the problem is that barely anyone knows it — at least not in the context of mainframe programming.

What data did you use to determine that barely anyone knows mainframe COBOL? There are a significant number of companies that still use it, because it just works and they have no need to upgrade to current technologies.


I don't have any data on that point, but I get the sense that this shortage of labor is caused more by an aging workforce than by obscurity. My dad is an engineering manager for a large credit card processor that uses a very similar mainframe setup, and they're constantly putting enormous pressure on their offshore contracting firms to provide young COBOL programmers because they can barely find them here. It's a job that pays people enough that they can retire well if they're good with money, and they're retiring constantly. At 57, he's one of the youngest on the team.


I was involved in this for a couple of years in the mid-90s working for a COBOL vendor helping people migrate from mainframes and minicomputers to what was known as the “open systems” (mostly Unix, with Windows NT rising) world, saving a somewhat jaw-dropping amount of cash in annual licensing and support costs.

Even back then, one of the major complaints was that companies refused to pay for training, hoping they'd find someone who already knew everything they needed. I heard multiple stories about people who told their employer that they were planning to retire, left on schedule after not finding anyone qualified to train, and returned later as a consultant at a significantly higher rate.


Many younger programmers wouldn't want to get into mainframe COBOL programming because of the almost compulsory requirement to provide after-hours support in most of these jobs. Many managers find it easier to let programmers get woken up at night to fix problems than to let them prevent the problems from occurring beforehand during the daytime. Years of getting woken up at 3 in the morning to fix those types of production problems tends to make many people look for other lines of work. The people who do stay employed doing after-hours support are often the ones who deliberately put the problems into the code during the daytime, generating those money-making callouts. US businesses with some India-based programmers will utilize them to fix those overnight problems because of the time zone difference.


The phrase "their offshore contracting firms" there should be enough to make any sensible 20 year old run screaming. Sure, spend a bunch of time learning a dead language, to get paid peanuts and be replaced by an offshore worker within a year, no problem!


Unfortunately, even assuming the outsourcer can find people who really have COBOL experience, there's a world of difference between knowing COBOL and being able to maintain the 30+ years of hacks that many large companies have built up in these systems.

Unfortunately, the alternative (rewriting the mainframe/COBOL systems on a more modern platform) is a risky and costly process...


> being able to maintain the 30+ years of hacks that many large companies will have built up in these systems.

It's not even the hacks. It's that "how the business works" was automated into COBOL 30 or 40 years ago, everyone in the "business" who knew how and why things happened retired or got laid off, and the COBOL programmers are the only ones who remember the business rules.


In college a few years ago I had to work with a system written in FORTRAN 77 in the early 90s that had new features and bug fixes hacked onto it on and off for almost 20 years: one single file with more than 20k lines of code. It worked all right, but understanding it was troublesome.

I lost count of how many times I thought about rewriting the entire beast in C, or even a more recent iteration of FORTRAN.

In the end, it wasn't something I would use for much longer, so I just mapped what all the functions I needed did on a spreadsheet and added a couple of hacks for the next unfortunate person to handle.

Maintaining old codebases full of hacks and without good documentation that can fail without big consequences is boring; but code that handles the kind of data banks have is something straight from hell.


What I believe the parent is referring to is that there is an acknowledged shortage of COBOL programmers. See http://blog.stafflink.ca/recruiting-tips/cobol-and-the-mainf... or just google "COBOL programmer shortage". There are still plenty of COBOL programmers (and jobs), but most are 45+, with many retired or retiring shortly. Why not train replacements? Schools tend not to train students in procedural or mainframe programming, so there's a fairly hefty ramp-up period (2-3 YEARS, with the possibility that the person may leave at any time during or after training). This COBOL shortage tends to be a large reason why many companies are replacing COBOL systems with systems written in Java or .NET.

That said, mainframes and COBOL tend to be used heavily at utilities, railroads, and government entities. One of my coworkers (one of two former COBOL programmers on staff) just identified that Eversource in CT uses COBOL because of their inability to handle a certain type of id.


2-3 years? COBOL is not that difficult.

My first job out of school was at a big consulting firm, and they were teaching new college graduates COBOL in a 6-week bootcamp. And most of them had taken few or no programming courses in school.


Yes, languages are NOT difficult; it's the same concepts everywhere (conditions / loops / boolean logic / etc.).

BUT the hard part was, is, and always will be domain knowledge, with special cases here and there, obscure logic, miscommunication between experts and developers, and the rest.


Few people know it, but it was designed around very human-like descriptions of what's to be done. I'd wager anyone with a basic introduction to programming could quickly become productive in the language. Although you wouldn't guess it if you went to my college in the early 80s.

For reasons known only to my computer science dean, COBOL was a third-year course, after BASIC, Pascal and FORTRAN, in 1985 or so when I was working through the curriculum for a CS degree. Fourth year was BAL (IBM mainframe assembly) followed by a language of your own design.


The classic COBOL example was:

  ADD GIN TO VERMOUTH GIVING MARTINI

Since today that would be...

  martini = gin + vermouth

... perhaps we are now 25% or so more efficient.
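
Though the one-liner hides the ceremony. The complete program around that single ADD is more like the following toy (it should compile with GnuCOBOL via cobc -x -free martini.cob), so the efficiency arithmetic depends on whether you count the boilerplate too:

  IDENTIFICATION DIVISION.
  PROGRAM-ID. BARDEMO.
  DATA DIVISION.
  WORKING-STORAGE SECTION.
  01 GIN      PIC 9(3) VALUE 60.
  01 VERMOUTH PIC 9(3) VALUE 10.
  01 MARTINI  PIC 9(4).
  PROCEDURE DIVISION.
      ADD GIN TO VERMOUTH GIVING MARTINI.
      DISPLAY "MARTINI: " MARTINI.
      STOP RUN.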


That it works and that "barely anyone knows it" are not contradictory.

Fortran (particularly in the context of embedded systems developed in the 70s and 80s) is another "barely known" language that's quite widely used (particularly in historical and maintenance contexts, if not in widespread current development outside certain engineering and scientific domains).


I've seen mainframe users in the UK (e.g. banks) trying to hire new COBOL programmers because (as the article says) all the original ones are retiring.

Unfortunately young people might not find the world of mainframes and COBOL the most enticing career choice :)

I think it will be a huge problem over the coming years as more of the original authors of banking systems retire.


I can understand why the big banks are reluctant to rewrite the big systems now that they have painted themselves so far into the corner, but what I don't understand is why they seemed so uninterested in the massive advances made in the 1970s that would have made COBOL and IMS obsolete even back then.


Reinventing the wheel is not the same as advancing.


I knew someone who just retired from being a mainframe programmer (not just COBOL). She reckons she could have kept working for another 20 years if she wanted.


Nice to see such an interview, especially since it's with a parent.


COBOL is one of the "big 3" compiled languages developed in the late 1950s, the other two being FORTRAN and LISP. All three are still in use.

COBOL's forte was that you instructed the computer in English-like sentences and paragraphs.



