Banks should let COBOL die (thenextweb.com)
79 points by artsandsci 234 days ago | 185 comments



The idea that large corporations are simply going to move on from COBOL is out of touch with reality. It really can't be overstated how deeply old COBOL programs are embedded in these corporations. I worked for one that had been using them since the language itself was created, and while they all could see the writing on the wall, the money to do the change simply wasn't there. Not only would they have to do a code change, they would also have to do a hardware change, because all of this is running on mainframes. Not only does this cost a lot of money, it would also take an enormous amount of time. What mid-level manager is going to stick their neck out and push for this? What exec is going to ignore the guaranteed transactions that a mainframe and a decades-old COBOL program can offer simply to secure a solid future they may not even be around to see?

But as far as training people to handle COBOL, well, that's simple. Look at vocational schools of years past. Why can't you train a random person to do COBOL? They don't need to do object-oriented programming or UX theory. They just need to maintain and occasionally update some ancient program that's been rock-solid for longer than they've been alive. It doesn't take a genius, it just takes very specific training. And there are a lot of people who would jump at the chance for a high-paying, steady job that they just need to train for a year or so at night school.


"The idea that large corporations are simply going to move on from COBOL is out of touch with reality"

Thank you. So much internet armchair analysis falls into this category.


The company I work for has piles of Visual Basic 6 code that will be here for as long as their product is viable (prepaid credit card services). The growth is simply not there to perform a rewrite. It's sad, but it's also a reality.


I clearly remember VB6 being an excellent tool. Quality of code varied widely between developers.


VB6 was incredibly easy to work with, and honestly, the value it provided by being able to link a quickly-made, solid UI with easy, solid code was highly underrated.


Same with Delphi, which had a better underlying language than BASIC, fast compiles, superb version support, and a stunning VCL (Visual Component Library). It took Visual Studio years to catch up with a lot of that.


That also provides a vector for experienced COBOL devs to make _serious_ money as instructors. Banks would fly out trainees anywhere on the planet for any fee if it meant critical systems had the necessary ops guys.

It's also a great proposition for a grad student. They don't have to unlearn much, they're in a perfect position to absorb a new language, and the term 'set for life' can be thrown around liberally.

Wouldn't be surprised if this was already a thing, probably in the form of COBOL-specific consulting services.



"But as far as training people to handle COBOL, well, that's simple. Look at vocational schools of years past. Why can't you train a random person just how to do COBOL? "

You can. Our local community college ties its classes to what industry in the area wants. It has classes on COBOL, RPG, and so on. That's because there's quite a bit of it out here in big companies.


Learning COBOL (or any language) isn't the hard part - it's understanding a massive legacy codebase and business practices that's the hard part.


Yep.


I don't understand why there wouldn't be at least 2 or 3 students, in any graduating class, interested in learning such a language (and everything around it, since COBOL isn't just about the language but about understanding business and computation in another paradigm).

Do all students just care about learning the hype and creating the next flapping bird game?


I was interested in COBOL while doing my MS in CS (coming from a non-CS undergrad), but the problem was tooling and projects. I gave up after trying to track down z/OS, install Hercules and get all that tooling set up. There is more to learning development on these old applications than just COBOL, though; you need JCL and the other applications/tools that go along with it. It just became too much of a hassle.

Creating a new flapping bird game is exciting for students because it can give them an opportunity to create something from nothing. You aren't going to get that opportunity doing COBOL development. You will most likely spend the rest of your career doing maintenance work rather than any greenfield development. There is nothing wrong with that but not everybody likes the fact they can't create something new.


A sometime co-worker, probably in his late 20s, emailed me from the Data Science bootcamp he was attending to ask a question about COBOL. He is not interested himself, but somebody else in the class had heard that it pays well. I gave him what I thought was the email for our shop's last (now retired) COBOL guru, and recommended GnuCOBOL as a place to start.


When I first started getting into development I took a look at it because I heard of the great money you could get if you were an expert in it. From what I could tell at the time, 5-6 years ago, it was both a lot harder to learn than anything more modern, just due to the lack of a community that had put all of its knowledge online, and a risky proposition. The money looked good, but you severely limited the number of places you could work, and if the industry ever decided to get off its ass and change over in even low double-digit percentages, you were likely to be out of work and with skills that didn't really transfer.


A project I led at least ten years ago was all about moving a company away from a COBOL-based system - huge piles of money were an option and the timeline was in the 3-5 year range, and still everyone (myself included) thought there were better options all around.

The driving factor was a suite of new services and features they wanted to implement. We came up with a creative way to run parts of the system as proxies to the COBOL applications backing them, and to decorate the requests and responses with data from the newer systems.

We were done much faster and cheaper; for a project of its scope there were surprisingly few hiccups or post-launch rework projects, and we got the end result they could've spent 10x the money to achieve - and all with way lower risk.

I'm assuming we were in a fortunate position that the general flow of data could be handled from the various source streams, but I wonder why this isn't at least an option for moving forward from some of these ancient systems. It seems like a good first step, at least.
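
To make the proxy-and-decorate approach described above a bit more concrete, here is a minimal Java sketch. It assumes the COBOL system is reachable through some request/response channel; the interfaces and field names (LegacyAccountService, PreferencesService, preferredChannel) are hypothetical illustrations, not anything from the original project.

    import java.util.HashMap;
    import java.util.Map;

    // Hypothetical sketch of a thin proxy in front of a COBOL-backed service that
    // enriches ("decorates") its responses with data from a newer system.
    public class DecoratingProxyExample {

        // Stand-in for whatever channel actually reaches the COBOL application
        // (MQ, a CICS gateway, screen scraping, etc.).
        interface LegacyAccountService {
            Map<String, String> getAccount(String accountId);
        }

        // Stand-in for a newer system holding data the mainframe knows nothing about.
        interface PreferencesService {
            String preferredChannel(String accountId);
        }

        // The proxy: callers see one service; the COBOL system stays untouched.
        static class AccountFacade {
            private final LegacyAccountService legacy;
            private final PreferencesService prefs;

            AccountFacade(LegacyAccountService legacy, PreferencesService prefs) {
                this.legacy = legacy;
                this.prefs = prefs;
            }

            Map<String, String> getAccount(String accountId) {
                Map<String, String> response = new HashMap<>(legacy.getAccount(accountId));
                // Decorate the legacy response with a field from the newer system.
                response.put("preferredChannel", prefs.preferredChannel(accountId));
                return response;
            }
        }

        public static void main(String[] args) {
            LegacyAccountService legacy = id -> Map.of("accountId", id, "balance", "1042.17");
            PreferencesService prefs = id -> "mobile";
            System.out.println(new AccountFacade(legacy, prefs).getAccount("12345"));
        }
    }

The point of the shape is that callers only ever see the facade, so the COBOL side keeps running untouched while newer systems contribute the extra data.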


They already do this. At my old tech school there was almost a rotation of COBOL/RPG instructors back and forth between the schools/businesses. Along with those businesses paying for some of their own to go through the classes.

Other companies have worked together to essentially make boot camps before SV even thought to do so.


Realistically, if, say, one of these big corps/banks with deeply-integrated COBOL systems spent a few years revamping their systems to something more modern and usable today, how much would it cost?

Running mainframes isn't cheap. COBOL programmers aren't cheap. Not being able to improve these systems / innovate with them can cost the business a lot of time/money. And if we're talking COBOL, we're talking at least 25-year-old corps.

How long's the ROI on moving over? How expensive does the maintenance of antiquated systems get? It's easy to say that they're rock-solid and never need to be touched, but is that really true? For example, a lot of banks today have a lot of absurd UX, often because of absurd internals, and I know a lot of that usually comes down to "the system is ancient and nobody really touches it anymore, that's just what it is now".


> Realistically, if, say, one of these big corps/banks with deeply-integrated COBOL systems spent a few years revamping their systems to something more modern and usable today, how much would it cost?

What language would you suggest? Remember that decimal reliability and predictability is important. Plus, let's pick a language likely to have a very long service life because we need to have our investment rewarded.

Frankly, the iSeries machine we have is pretty cheap to run. We bought it 10 years ago for about $15k and have had a maintenance contract that doesn't cost that much (about the same as our Red Hat service contract). We probably will buy another one since they just released a sub $10k one that will more than meet our needs. Never lost any data, easy enough backup that the biz office does it themselves[1], and IBM is pretty quick about any fixes. Much cheaper than any Linux, BSD, or Windows system we have ever had. Plus, I don't mess with it like the others. It just works.

1) we've tested it multiple times, after all, an unrestored backup is not a true backup
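
A small illustration of the decimal point above, since it is the part most often glossed over: binary floating point cannot represent most decimal fractions exactly, while fixed decimal types stay exact. This is roughly what COBOL gets from a declaration like PIC S9(7)V99, and what Java gets from java.math.BigDecimal. It's a toy comparison, not bank code.

    import java.math.BigDecimal;
    import java.math.RoundingMode;

    // Toy comparison: binary floating point drifts on decimal fractions,
    // decimal arithmetic does not.
    public class DecimalExample {
        public static void main(String[] args) {
            // Summing ten cents a thousand times with doubles drifts away from 100.00.
            double d = 0.0;
            for (int i = 0; i < 1000; i++) {
                d += 0.10;
            }
            System.out.println("double sum:     " + d); // e.g. 99.9999999999986

            // The same sum with BigDecimal stays exact.
            BigDecimal b = BigDecimal.ZERO;
            BigDecimal tenCents = new BigDecimal("0.10");
            for (int i = 0; i < 1000; i++) {
                b = b.add(tenCents);
            }
            System.out.println("BigDecimal sum: " + b.setScale(2, RoundingMode.HALF_EVEN)); // 100.00
        }
    }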


I don't know enough about these systems to suggest a language, but I imagine within the "a few years of development" is budgeted some time to figure out what tool to use. Surely you're not telling me that mainframes running COBOL is the best tool for the job. What languages do younger banks use?


"I don't know enough about these systems to suggest a language", so how can you say "Surely you're not telling me that mainframes running COBOL is the best tool for the job."?


Are you telling me I'm not allowed to ask questions?


You weren't asking a question, you were making a statement about how foolish it was that banks are running COBOL. A question would be about the why: what value do people see in the situation that stays their hand at moving to a new technology? Why did banks spend the money and time (which was not insignificant) to modify code for the year 2000 instead of making the switch then, at a natural change point? It would have been the most logical point, yet they didn't do it. What does the mainframe, and even the mini-mainframes, bring that Docker does not? What made COBOL so hard to replace, given later things like PL/1 which were directly supposed to replace it? Where do all the RPG programs fit into this?

Maybe someone should be thinking about a COBOL replacement language instead of the plethora of C / C++ replacement languages.

> I imagine within the "a few years of development" is budgeted some time to figure out what tool to use

It will be a lot more than a few. Things absolutely cannot go wrong or even change in production. One slip-up can result in real money being lost. This whole "go fast and break things" doesn't apply to bank accounts.


These systems can be at the very core of the company's operations, so you're talking not just a revamp of a bunch of COBOL programs and mainframe servers, but a significant portion of the company's workflow would potentially need to be refactored or replaced. The mainframes themselves sit in a box, but many other systems rely on their data and their guaranteed processing. And all of those connections and data structures are custom. Even if you were to figure out some way to completely emulate the system (and that itself would probably take several years), you would then need to integrate your emulated system into other real-time systems. A bank just can't shut down for a day. And if they screw up a few minutes worth of transactions, then things get very bad. So your testing and integration will be slow and methodical and expensive. And even then, you're still emulating a crappy, old system. If you wanted to actually replace the entire system, how would you do it? These things aren't modular. They were custom-built with no intention of ever being parted-out. You would have to completely re-architect the company's infrastructure and workflow.

Or you could not do any of that. You could just accept your situation. Mainframes are essentially a subscription service. For a yearly cost, IBM has offered you guaranteed performance and guaranteed hardware (they fix it immediately if there is a problem). So, all you have to do is pay your subscription and pay some people who like to have a steady, unexciting job and your business will continue to run like clockwork.


I understand what you're saying, but does this logic hold over decades?

I'm obviously not saying banks should just switch today cause it's easy and all, but if the ROI on switching is 10, even 20 years, it still makes sense to do if you're a bank.


It's not 10 years, it's probably closer to 100. Reasons being a) the ridiculously large cost of rewriting and testing the new system on a real-time system that can't be taken down, and b) the ridiculously low cost of maintaining the current systems.


How do newer banks do it? It certainly doesn't take 100 years to bootstrap a bank. I also mathematically can't get behind the concept of reproducing a known state taking longer than the time it's taken to get to it.

I can't find the post now, but some folks from the US govt who frequent HN talked at length about the ongoing replacement of mainframe-type services with newer tech. What I gathered from that is that the cost isn't ridiculously low at all when you take into account the impact it has on the organization's ability to get things done.


I had the fortune to work on COBOL software for a brief period of time at my last job, which was at a financial clearing company. It was a group of 50+ COBOL applications that was the lifeblood of the company, and a bad run could cost the company millions. The software was started in the early 90s, so it was easily 20+ years old when I was working on it. There was very little documentation regarding the process, and the people who worked on the system were gone. I bring this up because when you have software that old you typically aren't going to have the original people who wrote it at the company, nor will you have the people who gathered all of the requirements. All of the knowledge that went into that product was virtually gone.

You are at a known state but you now have to re-create the steps to get to that known state as I'm sure a lot of the domain knowledge around the product is not there anymore. Not only that, you now most likely have other sources consuming the output of the original COBOL application so you have to ensure that the other applications are able to consume the new data and that those don't break either.


> I also mathematically can't get behind the concept of reproducing a known state taking longer than the time it's taken to get to it.

What you have to remember is that when the COBOL code was written, it replaced hundreds, maybe thousands of people doing manual data entry and manipulation, maybe even pen-on-paper. That gives you a fantastic ROI. After that's been done, replacing one computer system with a newer one is completely different, a spectacular case of diminishing returns.


While I'd agree that it's easier to start a new bank, that's not the case here. What you are doing is trying to recreate something that currently exists, and if it goes like almost every job I've had, then there is no budging on any of the features, they want stuff duplicated down to the same bugs, and none of the features are documented. You end up spending half your time just reverse engineering what currently exists before you can even begin work.


I'm not sure about most banks, but some use other industry-specific software. There's one being developed in Uruguay for small banks and financial companies called Bantotal, which is written in GeneXus, a 4GL language or metalanguage that generates whichever language you want (Java, Ruby, .NET, even COBOL and RPG).

http://www.bantotal.com/?lang=en

https://en.wikipedia.org/wiki/GeneXus


Most moves from mainframe to *nix, Windows or others are a port of existing code to a new OS, NOT a complete rewrite.


You're absolutely right. But to start a decades-long project that may or may not work and has no near-term ROI is a hard thing to rally for. The cost of the project would eclipse what is being paid to the mainframe provider and to company personnel. And these companies have a difficult time bringing in top talent, so they don't really have many tech-savvy people to preside over it.

I've seen it attempted first-hand and it was a dismal failure. And it was their third or fourth attempt. And it wasn't even a full replacement! Just a simple data lake to dump things into. Google or Facebook or Amazon could do it, but few companies are like them.


> Realistically, if, say, one of these big corps/banks with deeply-integrated COBOL systems spent a few years revamping their systems to something more modern and usable today, how much would it cost?

Success isn't guaranteed, maybe as low as 50%. When you say "a few years" it intimates you don't appreciate the scale of some of these systems. In the end, retiring legacy systems isn't a technical problem, it's a product owner, product management, and business problem. The business owners have to make a concerted, multi-year effort to document their process as implemented, define the replacement, and manage its (re)implementation. This never happens.


> The business owners have to make a concerted, multi-year effort to document their process as implemented, define the replacement, and manage its (re)implementation. This never happens.

I wouldn't say it never happens, but it certainly takes a couple tries and a lot of time and money.

My employer is close to being off our mainframe(s), but it will probably take at least another 10 years to get rid of the last little dependencies. We've already spent about 15 years (with hundreds of developers) getting to the point where all the major online and batch processes have been reengineered and have very small mainframe dependencies. The mainframe development teams are very small and mostly focused on decommissioning work.

But yeah, it's a massive undertaking. In our case, the business was motivated to pay for it because mainframes just couldn't scale for our main use cases.


Any chances we could read more about this somewhere? Sounds very interesting.


> Any chances we could read more about this somewhere? Sounds very interesting.

Very, very unlikely. The company doesn't even have technical writers anymore for internal projects, let alone technical historians. And with the time it's taken and normal staff turnover, I doubt that there's anyone who really could write about the whole effort based on their own experiences.

Basically, it consisted of building a new central datastore for our content (that was pretty mature by 2005, when I started). Then hundreds of smaller projects to migrate this or that process to read from or update the new datastore, or keep it in sync with the old one. There was also a lot of opportunistic piggybacking of new feature work onto migration projects, or migration work into new feature work. However, I only have the view of a regular developer.


When I say "few years" I mean 10, 15 years? 20? 30? In the context of a bank, which may be around hundreds of years, that's still not a lot. But maybe I'm still way off, hence my question: How long would it take?

I see a lot of people here telling me I don't know what I'm talking about without actually giving an answer as to how long it would take and how much it would cost, which was my question in the first place and I'm being downvoted for asking it. Seriously, WTF HN?


Well, those questions depend on the system in question, and for any particular system, it's probably impossible to say what the cost of replacing it would be, because no one understands everything the system does and how it interacts with external systems. This is what others are alluding to: your question is unanswerable.

What you can do is estimate the cost of implementation in today's dollars. For example, suppose Bank System X has been under development for 40 years with an average of 10 developers per year. That's 400 person-years. Suppose you find cheap labor at $100k per year; the sunk cost in personnel alone is $40M in today's dollars. So 40 people could maybe do it in 10 years.

The above analysis is not satisfying.


> Running mainframes isn't cheap.

Price for performance for certain workloads, as I understand it, can be.

> Not being able to improve these systems / innovate with them can cost the business a lot of time/money.

The banks use other technology for peripheral systems that interact with the core functions all the time (look at the drivers behind AMQP sometime). The core functions, OTOH, are about implementing requirements that haven't changed in centuries; until there is a desire to implement a fundamentally new way of doing things (like blockchain, which financial institutions are exploring and probably aren't using COBOL on mainframes for their experiments), there's nothing to innovate on.


Most banks looking to move off mainframe/COBOL/CICS/etc don't look to rewrite everything themselves. They look to buy packaged software which they will then customise/enhance. I think most of those packaged software offerings are written in Java. (Disclaimer: My employer sells packaged banking software written in Java.)

In fact, if you look at the existing COBOL banking applications, many of them aren't pure inhouse development, but packaged applications like CSC Hogan. Hogan runs on top of z/OS and CICS and is written in COBOL. Hogan shops will have a lot of their own COBOL they add to the Hogan system to implement customer-specific requirements. So for many banks moving off mainframe/COBOL, it is actually a migration from one packaged app to another, but of course they need to rewrite from scratch all their customisations.

In my personal experience, most banks doing this don't try to do it as a big bang. They identify subsets of functionality where they feel the current mainframe system is most limited – e.g. mortgages, mobile banking, whatever – and start by replacing that function only with the new non-mainframe software, and have it integrate with the pre-existing mainframe app for other functions. Then, the idea is, progressively they will migrate more and more functionality to the new system, and eventually the mainframe/COBOL systems can be retired. A piecemeal approach often involves less risk, costs less (in the short term) and is easier to justify to the business since they can see tangible results much easier.
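
A rough sketch of the piecemeal approach described above, in Java since that is what the packaged replacements tend to be written in: a routing layer sends the functions that have already been rebuilt (mortgages, mobile, whatever) to the new platform and everything else to the existing mainframe integration. All names here are hypothetical.

    import java.util.Map;
    import java.util.Set;

    // Hypothetical sketch of piecemeal migration: route rebuilt products to the
    // new platform, everything else to the existing mainframe integration.
    public class PiecemealRoutingExample {

        interface BankingBackend {
            String handle(String product, Map<String, String> request);
        }

        static class MigrationRouter implements BankingBackend {
            private final Set<String> migratedProducts;
            private final BankingBackend newPlatform;
            private final BankingBackend mainframe;

            MigrationRouter(Set<String> migratedProducts,
                            BankingBackend newPlatform,
                            BankingBackend mainframe) {
                this.migratedProducts = migratedProducts;
                this.newPlatform = newPlatform;
                this.mainframe = mainframe;
            }

            @Override
            public String handle(String product, Map<String, String> request) {
                // Over time more products move into the migrated set, until the
                // mainframe path has no callers left and can be retired.
                BankingBackend target = migratedProducts.contains(product) ? newPlatform : mainframe;
                return target.handle(product, request);
            }
        }

        public static void main(String[] args) {
            BankingBackend newPlatform = (p, r) -> "new platform handled " + p;
            BankingBackend mainframe = (p, r) -> "mainframe/COBOL handled " + p;
            BankingBackend router = new MigrationRouter(Set.of("MORTGAGE"), newPlatform, mainframe);

            System.out.println(router.handle("MORTGAGE", Map.of())); // goes to the new platform
            System.out.println(router.handle("SAVINGS", Map.of()));  // still goes to the mainframe
        }
    }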


It cost one large bank in Australia $1 Billion over 5 years to convert their system. That's 2-3% of their annual profit over 5 years to fix the problem, so we can assume the costs of running the current system are probably some small multiple of that. And they would still need to pay to run the new system, and most likely both systems at the same time towards the end of the transition.


> the money to do the change simply wasn't there.

This is a good excuse for a few years. But after decades?

And talking about banks?

---

The whole software industry is frankly pathetic. We are talking about actors with the HIGHEST profits of mankind, and yet "we don't have money for this" starts to look pathetic.

Google, Apple, MS, IBM, Oracle, banks, the defense industry, etc. have no excuse not to invest in basic infrastructure.

This is the same thing as a country that neglects self-investment in fundamentals. If you agree it is smart to invest in roads, the electric grid, etc., then why do we think otherwise when we talk about OUR industry?


It cost Commonwealth Bank of Australia ~$1 billion to convert their old system from COBOL (http://www.reuters.com/article/us-usa-banks-cobol-idUSKBN17C...). That took them 5 years, and they're about a third of the size of most of the larger US banks. So you're asking for 20-30 banks to devote 5% of their operating costs over the next 5 years just to fix the problem in the U.S. Or they can put the problem off for another few years and see what happens.

This isn't a "basic infrastructure" problem, it's a problem where the people writing the software didn't plan for obsolescence and many of the "best practices" we take for granted weren't even in place when this was being written. And it is about tracking real money, so if the system isn't exactly right, they could end up being tied up in court which makes it even tougher.


The whole article clearly shows that it is justified.

The risk is too high to leave things as they are today.

---

BTW, probably some downvoters think I'm advocating moving this to some hipster language like JS or something like that.

I work mostly for enterprise-like customers, have worked on government projects, etc.

So I'm aware of the work here. Also, I have helped move decent-sized code bases, several times, from OLD->NEW.

> ~$1 Billion to convert their old system

If things in AU are like my country, this cost is inflated... For some projects Oracle bid our customers 10x more than we did at the time.

But this is beside the point, anyway.

---

So, how could this be done?

====

The first problem is the need for a solid hardware+OS combination, where Linux is not there.

Solution? Stay on mainframes. It's basic.

The second problem is OLD COBOL + zero understanding of the why, how, where, etc. of the full system. This is the big one.

This is like finding an OLD C codebase, made like 3 years ago. Or even better, an OLD B codebase (or any kind of precursor to C, if it existed at the time...).

And pretend the solution is to STAY WITH THE OLD CODEBASE. For DECADES.

IS THIS MADNESS!

IT IS, IS IT NOT!!!!

-- So, you build a better COBOL. Like CoffeeScript->JS. You start moving the code from OLD to NEW. In the process, you document and build tests. You end up with a better COBOL (or a transpiler) with a proper toolchain.

---- This can be looked at like the progress with another bad language, JS. You start with something terrible, with a huge installed base, yet you start moving towards something else.

In the process, you can do something like asm.js but for COBOL, and voilà, you can transpile other languages.


BTW, I can't imagine that IBM or somebody else has not figured out a technological solution to this.

The main trouble is that customers on the enterprise side of the world have no respect for developers, and fire them after years, or just don't care.

Then, when in trouble, they ask for external contractors that do not have the required knowledge (and maybe not even the skills).

For this to work, it needs to engage the on-site developers, plus maybe contractors or some additional hired hands for the mechanical work. Do it well, and this is fairly cheap for the benefit.

But this requires "doing it well". And that is the "hard" part, because the human factor is the big issue here.


Talk is cheap and rarely includes a full accounting of real-world constraints.

You can tell what a person's _real_ priorities are by what they do. Likewise, you can tell what a society's real priorities are by what they do.


>Talk is cheap and rarely includes a full accounting of real-world constraints.

Of course. But after decades of the same problem, isn't it time to admit that it's time to change?


Everybody always complains about the weather but nobody ever does anything about it.

In other words, even though everybody complains about something, and even though it's theoretically fixable, the cost+benefit isn't always there. That doesn't stop people from complaining or offering conceptual solutions.

If nobody pursues a particular solution, that's strong evidence about the real implementation and problem costs. Not conclusively, of course. But as a passive observer it's usually the best evidence available.

Really, the answer to the COBOL "problem" is obvious and it's what basically anybody familiar with it suggests: a piecemeal and gradual shift according to cost, benefit, and opportunity.


But something has got to give down the road, right? I feel sorry for the future guy who will be forced by circumstances to do the change.


Object oriented COBOL is actually a thing now :)


It has been for a while. It was probably last year that we threw out a bunch of Micro Focus Object COBOL documentation acquired around 1987. Let me add that I know essentially nothing about COBOL.


I get tired of seeing these same sensationalist falsehoods propagated without challenge. So I shall challenge.

>> We did a piece the other day about how learning the ancient programming language COBOL could make you bank.

>> Which can only be maintained by a small group of veterans, that grows thinner every day.

>>There are almost no new COBOL programmers available so as retirees start passing away, then so does the maintenance for software written in the ancient programming language.

This is false because there is in fact tons of COBOL talent around (in the US). The waves of offshoring in the '00s left many thousands of unemployed COBOL developers. I personally know of hundreds in the local market. A prominent financial services and BPO firm offshored mainframe development and laid off many hundreds here. Most got out of software development altogether. This skills shortage simply doesn't exist.


I kind of suspect that when companies complain about lack of talent, what they really mean is a lack of people with the required skill set (God forbid they should pay for training their own employees!) willing to do senior-level work at entry-level wages.


Yeap, it's true.

Me: COBOL listed on LinkedIn... Inflated offers via LinkedIn: zero.


Same for Perl


>> The waves of offshoring in the '00s left many thousands of unemployed COBOL developers

If you were a developer affected by stuff in the '00s, you're probably nearing the end of your career (unless you'd just started). The problem here is the lack of a NEW talent pool.


Why do you think this would be true? For example, I went through a megacorp's COBOL bootcamp in 2003 with 30 other people. They ran two courses a year up until 2008 or so. It's not as if all COBOL training stopped in the '60s.


Age 30 in 2000 is 47 today.

Is 47 really "near the end of your career"?


Nearer to the end than the beginning, perhaps.

I always get a chuckle when some hopeful naif writes like they're going to retire at 50 and be done. Most everyone I know who's tried that "gets bored" and goes back to work or starts consulting after a half year trying to find their feet. Those are also the ones who when asked what happened tell you they've got plenty of good years left and are now planning on retiring at 70.

47 is still building steam in my experience.


Our company has high turnover on the bottom. A steady percentage of new hires are bored, retired people. :)


I guess if one lives in SV, it might be.


> Is 47 really "near the end of your career"?

As a non-lead, non-management code slinger? It's usually well past it.


As someone who has worked in a few investment banks, Perl is a SIGNIFICANTLY larger problem than COBOL.

While verbose, COBOL enforced structure and discipline, and it is very possible for code to be maintained by someone other than the author.

On the other hand, in the '90s and early 2000s, many critical pieces of software were written in Perl. While it is quite suited as a scripting or glue language, I've seen elaborate integrated systems written in cryptic, unstructured and undisciplined Perl, often with proprietary extensions.

And no one learns Perl anymore.

There is no way to maintain these projects, and in at least two projects I've seen, what should have been a straightforward change led to a large-scale retire-and-replace project.


Sometimes the "no one learns Perl anymore" just means that no one is willing to pay a good money for some change that the company wants.


Yep. I know Perl (5, not 6) really well and programmed in it for about 10 years. No one I've seen actually wants to hire for it. I'm fine with that because it wasn't something I loved, much like Transact-SQL programming (I love Objective-C, and it looks like Swift will be more work than love).


That seems true. I have a Perl jobs group on Facebook, and the salaries are a fair bit lower than for Django (which I currently do).


I dread working on Perl code I wrote even just a few years ago. If I've context-switched away from the code for more than just a few months it often looks incomprehensible. It really is a write-only language.


Perl is one of my biggest headaches. Using it in conjunction with an AS/400 to manage an inventory system for tractor parts still gives me uncomfortable dreams to this day. I had to learn it from my stepmother, who was working at the same company and was the one handling most of the database (I just needed my own version for better location tracking in the warehouse).

I would've much preferred to have done that in RPG.


Hell yeah.

Every time I've written Perl with another programmer, I learned something new.


I hope this is the case, as I quite liked doing Perl before I switched to Python. (That said, I can imagine that there is some horrendous Perl out in the wild.)


> Döderlein says that banks have three options when it comes to deciding how to deal with this emerging crisis.

Option 4, which is what banks are actually doing to the best of my knowledge, is to train new hires on how to write COBOL.


Glad to see someone else pointing this out.

The problem with these systems is that they are very old, and thus do not benefit from many of the more modern developments in the field nor do many quality developers learn the language.

The benefit with these systems, though, is that they are very old. With that age comes completeness. They're battle hardened, thoroughly tested, and a known quantity within the institutions that leverage them.

History is littered with companies that attempted to replace the core of their business with one big project written with newer technology, only to fail catastrophically.

During my first gig with a bank, I was appalled to see the ancient technology in play at the heart of the bank's systems, and the retired developers coming in part time on schedules they set for outrageous hourly rates to do maintenance tasks on that system. Over time, though, I came to realize that this was the most reliable and cost effective solution the bank had available.

The third option listed is what I see happening, but at a much slower pace than the article suggests:

> Basically, Döderlein suggests making light-weight add-ons in more current programming languages that only rely on COBOL for the core feature of the old systems. However, the key thing is how the connection to the old system is made.

> Gradually banks will be able to address each and every product need that they have with new platforms that will replace the overly complicated COBOL add-ons. This compartmentalizes the banks’ COBOL-problem and makes it cheaper to fix, as it won’t have to be done all at once.

It's a glacial pace, but it's being attempted. The heart of the old systems, though, were still untouched last I had visibility into those inner workings.


You're correct on all points, in my experience. My previous contracting gig involved writing Windows Server-based code for a UK mortgage lender that interfaced between the internet-based Hometrack valuation service and a Fujitsu ICL clone mainframe running the core COBOL system for the lender. The key point about that system is that it is the core ledger for all the mortgage accounts, which add up to many billions. It's the golden record of all the mortgage accounts and payments. Replacing it with a new, buggy system based on the latest tech could kill the business.


> The benefit with these systems, though, is that they are very old.

See, https://en.m.wikipedia.org/wiki/Lindy_effect


True, but as you admitted, everything is "a known quantity", i.e. more or less frozen.

Which is cool when everything works and nothing has to change. When something breaks or you need to change something... you wish you had things such as clean code, unit tests or even TDD/BDD, code coverage, continuous integration.

Banks are basically sitting on systems they don't fully control. They're kind of an interesting experiment: how long can you control the dragon, i.e. continue operating via patchwork and outside contractors who are in their 60s? :)


Small businesses are doing this as well (where I learned COBOL). Having experienced it, I do not feel there's really anything wrong with COBOL itself. The language is verbose and relatively straightforward, however it requires a different approach from what most people are used to.


Is it true that just for knowing the language you are getting compensated much better? How easy is the job market?


It might not always be the case. Hiring young people and teaching them COBOL (or whatever) without paying large amounts upfront might also be a viable business practice, i.e. think about it as an entry-level job. All depends on requirements, clients, etc.


This is how I learned. The team I was on consisted of several entry-level people who were taught COBOL. After a couple of years or so, most of us moved on to better-compensated positions at other companies.


The issue with rewriting to newer technologies is relearning - the hard way - all the requirements encoded over 40-50 years. Decision-makers are rightly afraid of the $100M mistake. As some posters here said already, organizations have to slowly phase in newer technologies and phase out COBOL ones.

The track record of Big Bang rewrites is not a good one. A few years ago it was either ACM's CACM or IEEE's Computer magazine that profiled several high-profile failures. The financial and time cost overruns were spectacular.

Banking and Insurance are not the only industries using COBOL. If you're in higher ed and your institution uses Banner, you've got COBOL. During every upgrade, our Banner team has to license the latest and greatest COBOL compiler set from (I believe) Micro Focus.

I have to wonder if a split in technology exists between commercial and investment banking.


People who don't know COBOL, IBM Mainframe or the industry write these type of articles.

You want to lose the hardware - there's a solution for that. You want to create modern web services over it - there's a solution for that. You want to code in a modern IDE - there's a solution for that.

If the systems that use IBM mainframes were using Amazon ECS or Azure... the same people would write articles about the lack of reliability of core services like banking and airlines.


PR piece for Auka. A classic example of one of pg's submarines[1].

Auka CEO Döderlein, mentioned a mere 5 times, says there are only 3 options, ignore the problem, big bang rewrite, or bolt on new pieces.

Don't panic! Just call Auka. They have your new piece.

There's no mention of the option we would all pick, a gradual migration, of course.

[1]http://paulgraham.com/submarine.html


Agreed.

And besides that, banks should do what they want to do. That COBOL software they are running is hardly ever a problem, the problems are usually in the new stuff which has been far less battle tested and is connected to a hostile network.


Legacy COBOL systems (hell, a lot of legacy code in general) are a perfect example of the "if it's not broken, don't fix it" mantra. There's an incredible amount of work involved in rewriting legacy code, and when you're talking about doing so in a constantly updating, critical context like financial transactions, it kind of highlights the need to take things slow and carefully. My experience with old legacy systems (20+ year old C in particular) is limited, but it was enough to recognize that just ripping it all out is an awfully ignorant approach compared to a gradual migration of what needs to be replaced.

It's also an incredibly expensive one, especially since it's a recurring situation. No matter how clean and clear you think your code is, or how thoroughly you've commented everything, in a decade or two, someone is probably going to be groaning about it being legacy code.


It's a pity that pg drifts off in the end, assuming that the medium blog would be more serious just because it exists. It's just that the main text content market hadn't hit the internet yet at the time of the writing. Or maybe it was part of his PR campaign to bring HN to life, who knows.


COBOL has been "dying" since the late 70s. Since that time I have seen several languages and platforms come and go. Where I work our core application driving the business is written in COBOL. Estimates are between $500 million to $1 billion dollars to replace it due to the number of systems interacting with it and all the business processes affected by it. For that kind of money the business wants a technology they can hang their hat on for twenty to thirty years. So what are you going to choose? What languages and platforms in widespread use today are going to be around in twenty to thirty years? Here's what I think it will be: C, C++, Java and COBOL Hmmm. COBOL is in that list. So if I think COBOL will still be around in twenty to thirty years and the system is working fine now then what reasons do I provide to the business for spending $500 million to $1 billion for the upgrade? Yeah - I can't think of any either that warrants that kind of expenditure. So COBOL remains.


It's not only banks, I took an internship at an insurance company a couple of years ago that had their entire mainframe written in COBOL.

Not only do they have a hard time finding people to replace their retiring veteran developers, but smaller companies (like this one) that can't afford to pay ridiculous salaries for a top-notch COBOL dev have to settle for the mediocre aging developers that can write COBOL and are on the job market.

These devs are getting paid good money to work on critical systems and aren't skilled enough to properly maintain them. It would be so much cheaper for these companies to pay better devs to do more recent tech. But it's hard for them to get out of that loop. Makes me rethink where I have my money.


I wonder what people will say in twenty years about our NodeJS, AWS, microservices and JavaScript systems.


> I wonder what people will say in twenty years about our NodeJS, AWS, microservices and JavaScript systems.

The same we're saying today about Object Pascal, Business Objects, Cold Fusion or DCOM:

nothing at all.


Here's a $500k-$1mil government contract for custom ColdFusion programming:

https://www.fbo.gov/index?s=opportunity&mode=form&id=65b571a...

Get your bid in!


> The proposed contract listed here is a 100 percent set aside for Women Owned Small Business concerns. The Government will only accept offers from this (sic) small business concerns. All other firms are deemed ineligible to submit offers.

Well what about the other couple-dozen genders that seem to have surfaced in recent years? =)


fun times, coldfusion vs netobjects


Hopefully in twenty years that kind of complexity has been abstracted out of existence.

Microservices may still exist as an organizational concept but I'd be stunned if we're still thinking about things like containers in 20 years.


Every generation of developers thinks their stuff will be replaced at some point, but then they realize it's still used twenty years later. Happened with Y2K, Linux timestamps and a lot of other things.


Nothing critical (banks, military, factories) runs on NodeJS/JavaScript/AWS so probably nothing. Do you think a lot about some old BASIC Accounting Software?

There are probably only two languages which might face a similar problem as COBOL: C and Java. For now they are both actively maintained and used for new projects, so we are fine for at least the next 50 years.


My (limited) experience under a government contractor sadly contradicts this: the primary requirement for many projects is to fit inside of a server rack and "just work" when hooked up to power and the network. None of my work was in NodeJS, but I'm positive that JavaScript will be floating around the backends and frontends of our government's services for a long time.


The question is: what bad things would happen if you pulled this project out of the rack?


In most cases, you'd probably make the employees at whatever agency or department very unproductive for the next few days (or however long it takes to find the server and plug it back in). The kind of projects I'm talking about are usually internal and critical for getting the right data to the right people within an agency.


So it is not critical from a country/world perspective...


At what point does something become critical, in your book?

The kinds of services I'm describing are critical to the normal functioning of government agencies. They're running now, and they'll probably be running for the next 25 years (at which point another contractor will upgrade them to something new and shiny).


When problems with it, or the lack of it, influence the life (life != work) of many people.

- If power grid goes down you are facing riots.

- If systems on your new shiny Boeing go down you are facing deaths of dozens of people.

- If your military systems go down you are facing Russia ;)

If employees of a government agency work less efficiently (is it even possible?) for some period of time (a few days, months?), nothing bad happens. That's why it is possible to change those systems every 25 years. They will be less productive after a software change, and they will face new bugs in the transition period. A lot (but not all!) of the work those agencies do exists only because of some stupid laws (example: in my country you have to have a special permit to cut down a tree on your own land - I know someone who had to start building their house a year later because they couldn't get a permit to get rid of one tree; another one: you cannot inherit some types of parcels if you are not a farmer - you have to either become a farmer [of course virtually - but it takes a few months] or change the parcel's type [it's not always possible, and it's also a few months of fighting with bureaucrats]).


Depends on the government agency, I'd say. For example: unemployment or social security checks going out, VA hospital systems, etc.


NodeJS and Javascript yes, but critical things absolutely run on AWS.


Running on AWS, and doing it correctly so that one of the many regional outages does not cause you an issue, is really hard and is a software problem. On the other hand, using a system of tested, proven code that runs on a system that is redundant by design at the lowest level of component (a mainframe cluster), where the software does not have to have knowledge of the redundancy, is a bit simpler.

I really think people miss this very important point. Redundancy is transparent for 99.9999% of the software you run on the old big iron systems. For something like a modern application on AWS you have to know, understand and code for the infrastructure.


We're talking about two different issues here. First is the architecture/software of a mainframe system, with all of its redundancy and scaling. Second is the issue of hosting and maintaining your own hardware. On the first issue I'll admit I don't know enough about mainframes, perhaps they provide redundancy transparently to the programmer, but I'm a bit skeptical. Perhaps you simply think the redundancy is transparent because you understand the mainframe infrastructure deeply and internalized it? The second issue I'll push back on.

If you just want to host your servers on EC2, the SLA is 99.95% uptime. So that's about 4 hours of downtime due to outage a year. Put your servers in a multi-AZ autoscaling group and you're pretty solid. Bonus points for having autoscaling groups in more than one region. If you don't want to use any of the other services AWS provides, you can simply use them as a basic hosting service and get pretty amazing uptime. I've never used an old mainframe system, can you get greater uptime than that? Including hardware failures, power outages, network outages?


There was one point in my life when I was working for a large mainframe company ;) A bank wanted to move a system to a new site. The uptime on that system was 11 years. Uptime on a mainframe system is pretty much considered 100%

4 hours of downtime a year is just way too much for some systems.

Just have a google for "mainframe uptime".

Yes the software is abstracted from the HW redundancy. You can pull CPUs with running code in systems and things keep working. Really - walk up and pull the CPU out. No impact to running code.

Heck you can run your private cloud on one:

https://www.fool.com/investing/general/2015/01/24/heres-why-...


Of course it depends on your definition of "critical". There was an AWS outage ~2 months ago and banking systems around the world worked pretty well; same with the military, big factories didn't stop because of this outage, planes were still flying, etc.

It was inconvenient for some people (quite frankly I didn't even notice anything - I read about it in my RSS reader a few days later ;)) but nobody died (excluding heart attacks), countries didn't collapse, wars didn't start because of this.


The outage was for S3, and only in the us-east-1 region. Any critical system should have multi-region fail over. Yeah, factories and planes and stuff are critical and wouldn't use AWS, mostly because those are embedded systems that shouldn't even rely on an Internet connection. But as for banking, Capital One is a huge AWS customer for example.

DISCLAIMER: probably should have mentioned this earlier, I'm an AWS employee :)


We're one of the largest JS users in the world, and I'm pretty sure we'd fall into the category of "critical" (in terms of impact, not end-of-the-world).


C, Java, C# and/or VB.NET.


Python seems to be sticking around.

I suspect that in 50 years, Haskell will still be the up-and coming thing.


I don't think Python has entered the kind of domains where they tend to keep the same version of the system around for 50 years: military, finance/banking/insurance/accounting, healthcare, etc.

I mean, it is present there, but most of the time it's part of some glue or automation, it's not part of the core systems.

Those are usually Cobol, C/C++, Java and .NET. Or some proprietary language in case they were crazy enough.


In wholesale finance Python has been deployed for core systems over the last 10 years. For example JP Morgan's Athena platform goes well beyond glue or automation and handles pricing, intraday risk, EOD risk, etrading and STP. Hundreds of millions of dollars have been invested in the build out, and it will be around for decades.


Python, like COBOL and Excel is a good language for people who program as part of their work but are not professional programmers: data analysts, scientists, some finance types, etc.

Compare that to C where buffer overflows are common, Rust where the type system is a high art, Java where thread-based concurrency "just works" so you can get in trouble using it, etc.


Microservices are here to stay, especially in large engineering orgs. NodeJS and Javascript, on the other hand, I'm not so sure about... Many of the engineers I know view this technology as a "cancer" when compared to Java, Go, etc. Their reasoning is that Javascript doesn't offer the safety or maintainability that other languages do.


Microservices are just the new bandwagon name for Sun RPC and DCE.

Every couple of years people decide to redo distributed computing stack.


It's like a technology time bomb. If businesses are struggling to cope with a limited subset of basic programming languages like Cobol and assembler, they are going to have a nightmare dealing with obscure frameworks.

I'm actually surprised that graduates can still enter the web industry without receiving specialist training - that's how graduates are dealt with if they go down the Cobol route and work for one of the big IT multinationals who maintain these sorts of systems.


This old argument again? It seems like every 25 years somebody brings this up....


Every 25 years? COBOL's only been around for about 60. You make it sound as if it were something we've had happen far more frequently.


I quite like the fact that programs written decades ago are still in use. In the web development world there is so much churn. Anything you write now will have been rewritten a year later. It is quite disheartening.

I like this quote from https://teuxdeux.com/purpose

> On that note, why the heck does every app need to change all the time? Can’t something on the Web be more or less finished?

However I have no idea what COBOL is like to work with so I am probably being unrealistic.


One of the design goals for COBOL was that non programmers should be able to understand it with a little training. That would allow non technical managers to have a bit more control over their projects than they would have with Fortran, Algol or PL/1.

I don't know how well that worked out, but other than Hypercard all other languages I am familiar with (and they are many) only make sense to those who can program in them. So though Java is often considered to be the modern COBOL, at least in this regard it isn't.


There's a misconception that COBOL is dead. It's not; there are developers, and new compilers are being built for it. [1]

IBM isn't sitting around waiting for COBOL to die out; it's working on hooks between COBOL and Java and introducing new languages like NodeJS [2]

The problem isn't money, it's risk. No CIO is willing to risk moving to a new system, and that's the problem that needs to be attacked.

[1] https://www.ibm.com/developerworks/community/groups/communit...

[2] https://developer.ibm.com/zsystems/documentation/java/worksh...


I know very little of COBOL but I have often wondered if bank COBOL software could be transpiled (a quick googling shows there are transpilers). That is COBOL -> Some_Other_language and perhaps Some_Other_language -> COBOL.

Then in theory you could write unit tests in some language to test the real COBOL and the transpiled COBOL.
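
One way to make that testing idea concrete is a golden-master comparison: run the original COBOL batch and the transpiled (or rewritten) version over the same input, then diff the outputs. A minimal sketch in Java, assuming both runs have already produced text output files; the file paths here are made up for illustration.

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.List;

    // Golden-master comparison: diff the output of the original COBOL batch run
    // against the output of the transpiled/rewritten run, line by line.
    public class GoldenMasterCompare {
        public static void main(String[] args) throws IOException {
            List<String> original = Files.readAllLines(Path.of("output/original_cobol.txt"));
            List<String> candidate = Files.readAllLines(Path.of("output/transpiled.txt"));

            int mismatches = 0;
            int lines = Math.max(original.size(), candidate.size());
            for (int i = 0; i < lines; i++) {
                String a = i < original.size() ? original.get(i) : "<missing>";
                String b = i < candidate.size() ? candidate.get(i) : "<missing>";
                if (!a.equals(b)) {
                    mismatches++;
                    System.out.printf("line %d differs:%n  original:  %s%n  candidate: %s%n", i + 1, a, b);
                }
            }
            System.out.println(mismatches == 0 ? "outputs match" : mismatches + " mismatching lines");
        }
    }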


This company has been doing it for quite a while with an amazing toolkit for language work:

http://www.semanticdesigns.com

http://semanticdesigns.com/Products/Services/LegacyMigration...

They also have a proprietary, parallel language I'd like to see go FOSS:

http://semanticdesigns.com/Products/Parlanse/index.html?Home...


The problem is, when you have 10M lines of complicated COBOL, transpiling it to JavaScript will make things completely unmaintainable.


So it sounds like COBOL really isn't the problem.

It's that it is 10M lines of complicated, probably poorly written code (in terms of being hard to understand, with few comments and no test coverage).

Otherwise companies successfully transpile far bigger code bases than 10M lines...

IMO the first thing they should be doing is writing automated tests against the existing COBOL. A transpiler might help this process.


You are absolutely right, that's the nuance that's often missed in these discussions. The COBOL applications in question grew organically over decades to encompass business processes for entire industries. There was never a single design document or even goal.

The other thing is how these systems were architected. They were mostly all green-screen apps that built data files that were then fed into extensive (complex, large) nightly batch processes. For example, today when you do a transfer between bank accounts, you may expect the transfer to be immediate. Most often it's shown in a pending status until after the next nightly process. This is why.


Edit: further, the problem with transpiling is that these systems are still under active development. For example, banking systems have to be updated annually for changes in regulations and tax laws. If you translated all your COBOL to Java (which results in truly inscrutable code; COBOL has unusual control flow constructs, even besides GO TO), then as an organization you're faced with either modifying the Java code (this is very, very, very bad) or regenerating from COBOL periodically, which kind of defeats the purpose.
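
For anyone who hasn't seen the nightly batch data files mentioned above: they are typically fixed-width records whose layout is defined by a COBOL copybook, and downstream systems parse them by column position. Here is a hedged sketch of consuming one such record in Java; the 30-byte layout is invented for illustration, real layouts come from the copybook.

    import java.math.BigDecimal;

    // Hypothetical 30-byte record layout, invented for illustration: account (10),
    // amount in cents (11, zero-padded), type (1), value date (8, YYYYMMDD).
    // Real layouts come from the COBOL copybook that defines the file.
    public class FixedWidthRecordExample {

        record Transaction(String account, BigDecimal amount, char type, String valueDate) {}

        static Transaction parse(String line) {
            String account = line.substring(0, 10).trim();
            // Amounts are typically carried as implied-decimal integers (cents here).
            BigDecimal amount = new BigDecimal(line.substring(10, 21)).movePointLeft(2);
            char type = line.charAt(21);
            String valueDate = line.substring(22, 30);
            return new Transaction(account, amount, type, valueDate);
        }

        public static void main(String[] args) {
            String line = "0012345678" + "00000104217" + "D" + "20170615";
            System.out.println(parse(line));
            // Transaction[account=0012345678, amount=1042.17, type=D, valueDate=20170615]
        }
    }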


Are there any good resources on the architecture of these systems? It sounds very different from the application and web programming that I know and it would be interesting to learn about it.


I always assumed it was so the banks could skim off the interest generated for another night.


In my experience these systems are not poorly written - the process around these systems is huge, with reviews upon reviews, and lots of testing.


I work for a large brokerage company that has a deeply integrated system written entirely in COBOL on the IBM mainframe. That sucker is going nowhere, and for good reason: it's incredibly efficient for what it does. As a batch processing system, the mainframe simply can't be beat. We deal with millions of dollars worth of transactions every day that have to be correctly accounted for and reconciled. In order to get the performance required to do all of that, we stick to the mainframe.

However, I do see the appeal of building off the COBOL foundation using more modern languages. Writing a UI in COBOL on CICS is an absolute nightmare to develop and maintain (not to mention my firm's egregious install and deployment process, which is nothing but red tape and TPS reports). I'm working on a project right now where we're migrating some of our reconciliation screens to the web, where the UI communicates with a Java/Spring Boot server that itself interacts directly with the database using DB2 stored procedures.

In this space, I see a need for moving away from COBOL. But for the batch processing? Not a chance.
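
For readers unfamiliar with that kind of setup, here is a rough sketch of the web-to-DB2 bridge using Spring's SimpleJdbcCall. The schema, procedure, and parameter names are invented for illustration, not taken from any real system:

    import java.util.Map;
    import javax.sql.DataSource;
    import org.springframework.jdbc.core.simple.SimpleJdbcCall;
    import org.springframework.stereotype.Repository;

    @Repository
    public class ReconciliationRepository {

        private final SimpleJdbcCall getOpenBreaks;

        public ReconciliationRepository(DataSource dataSource) {
            // Wraps a DB2 stored procedure so the new web UI and the old
            // green screens can keep sharing the same database-side logic.
            this.getOpenBreaks = new SimpleJdbcCall(dataSource)
                    .withSchemaName("RECON")               // invented schema name
                    .withProcedureName("GET_OPEN_BREAKS"); // invented procedure name
        }

        public Map<String, Object> openBreaksFor(String accountId) {
            // SimpleJdbcCall reads parameter metadata from the database,
            // so only the IN parameter value needs to be supplied here.
            return getOpenBreaks.execute(Map.of("P_ACCOUNT_ID", accountId));
        }
    }

The appeal of this shape is that the business rules stay in DB2 where the COBOL batch can also reach them; the Java layer is just a thin, testable front end.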


Say I wanted to learn COBOL, has anyone got any resources they'd recommend?


Programming in COBOL is best learned by signing up as a trainee employee in a company that has been using it for a long long time and that employs a bunch of old programmers who will be more than happy to teach you.

It's a quirky language (omit a single '.' somewhere and you're in for a world of hurt), has its neat parts, and in general will come across as limiting once you have been exposed to other, more expressive languages. But that's probably also the reason it is still around; think of it as Java, but conceived in the gray past. The original intent was to make a programming language that would allow managers to program. That didn't quite work out.


Hey, apologies for randomly hijacking a comment for this question, but over the past years you've been popping up all over HN's comment section with (it seems) a rather eclectic and broad mix of experience, knowledge, and (possibly) talent. I'm almost starting to suspect you're multiple persons (but not really).

Have you ever written about how this came to be? I find myself to be similarly broad in interest/experience, but I'd judge myself much more superficial/limited in this regard because of the breadth of it all. I'm curious how you manage it all.


Hey There. Not really the place for this but I've had an 'interesting' life for want of a better description, I've written lots about it on my blog at http://jacquesmattheij.com/

If you want to take it off HN feel free: jacques@mattheij.com


You've never heard of Jacques Mattheij, inventor of the live webcam?


Nope, perhaps too young for that?

That said, the guy doesn't even have a Wikipedia page! So how should I know?


By being old, obviously.


Micro Focus has Visual COBOL [1], which works in Visual Studio or Eclipse and targets the .NET CLR and the JVM. Trial downloads are available. Or there is GNU COBOL [2], with docs and FAQs [3].

I'm not a COBOL programmer, though, and not planning to be.

[1] https://www.microfocus.com/products/visual-cobol/

[2] https://en.wikipedia.org/wiki/GnuCOBOL

[3] https://open-cobol.sourceforge.io/


Depends on the country.

Are you in the USA?

Edit: Clarification: you can't download a mainframe VM, so you need access to a mainframe for training.


Programming in COBOL does not require a mainframe.


You're right.

I tend to think about what you need to learn to actually work in an IBM mainframe environment, which is more than just learning the language.


I think you could use Hercules.

http://www.hercules-390.org


I wonder what banks would do if it wasn't for those genius suggestions in the article... /s


The decisions to keep these systems around are not about technology -- they are about risk. Every IT lead in every bank knows the options for updating the systems. But they also have risk management concepts attached to their decisions, and because we are talking about banks, there is no such thing as small consequences. Banks are extremely risk-averse, so if the system is working and changes bring risk, you don't do the change. Figure that by the time there really is nobody left to maintain it, somebody else will have taken on the risk, and new products will be available. So there is less risk in waiting for those products than in taking on new development efforts yourself.


I am probably the odd man out here, but I took two semesters of COBOL and really enjoyed it. I have been trying to figure out how to get into COBOL programming, but the job postings are few and far between and seem to want more experience than I have. I honestly think I would be really happy programming in COBOL day in and day out.

As an aside, I think part of the reason I like COBOL is that I am an above-average to very good academic writer. I enjoy writing papers and find it hard to meet page minimums because brevity and clarity are important to me. These characteristics (good, clean, to-the-point writing) seem to translate to COBOL more than to any other language I have used.


Has anyone ever been part of any kind of migration from COBOL to anything else? What happens to the underlying hardware - is it even possible to migrate from a mainframe to the cloud without a complete reengineering?


I worked on a project to convert a city's billing and payment system for their water department from COBOL to JEE.

It was actually pretty straightforward. The key to the project was being able to easily convert the data on the mainframe to standard RDBMS tables in Oracle.

After that it was just a matter of "mapping" the green screens of the system to web pages, i.e. displaying the same data and providing the same actions.

The only time I really needed to grok any COBOL was when there were discrepancies between the old green screens and the new web pages.

After a period of user testing, it was basically: do the final data dump to Oracle, turn on the new system, and turn off the old one. However, there were many other programs running on the mainframe, so the hardware didn't go anywhere.


Nobody uses mainframes anymore (except banks). There are emulators, however, that emulate a mainframe on Windows.


Not true. I recently worked at a major auto parts retailer and all part pricing and availability data was stored in mainframes.

Currently I work for a major insurance company which has multiple data centers with many, many mainframes... Think ~60K square feet worth of mainframes per data center.

Mainframes are pervasive and aren't going anywhere anytime soon.


Banks have no issues hiring ex-offshore contractors who have been doing the work for years. These guys can maneuver themselves through mainframe technologies and show their proficiency with ease.


Banks also have no issues retraining their IT staff to learn COBOL and ascend from the Mostly For Fun Open Systems department to the Really Important Mainframe Systems department.


Compared to COBOL, that IBM PC is a whippersnapper. An IBM 370 would have been a better match.

https://en.wikipedia.org/wiki/File:IBM_System_370-145_und_Ba...

My dad programmed in COBOL for the Army, and he retired in 1975. That's how old COBOL is. Knowing my dad, he probably still has his manuals, though finding them might prove an archaeological exercise.


It's kind of heartening to see programming languages like COBOL, Fortran, and Ada still chugging along, helping folks solve their problems. Instead of trying to replace the language, how about we design modern hardware and software to work around it? Isn't that where virtualization can help? (I'm probably assuming too much with that question, considering how many programs of that era are hardware-specific.)


There is another important problem: the hardware and software these production COBOL applications run on (mainframes) are a burden for vendors like IBM. Unless these mainframe systems are replaced by commodity hardware and software, banks will always have to fear EOLs or the high prices needed to keep development viable for the vendors.


It's not just banks; a good number of industries still run legacy COBOL systems. Timeshare, for one.


I think they actually do, but very slowly. Last time I checked, the number of known COBOL solutions was already surprisingly small. It's mostly people who don't know anything about COBOL, and COBOL fans, who still believe that COBOL is that important.


Maybe so, but that would be an immensely huge undertaking costing a lot of $$$$. Good luck getting them to do that.


I don't know about anyone else, but one reason I wouldn't get into COBOL is that I wouldn't want to be the guy who thought he could learn an archaic language but lost a bank millions through programmer error. I suspect a lot of people simply don't want the kind of responsibility that working with bank mainframes entails, even if they know COBOL.


Rewrite it in Rust/JavaScript


You're going to have a hard time selling that combination.


Is there a proper decimal library for either language?


Meh, who cares about a few cents ;)


I still remember tracking down an odd bug involving taxes and contract calculations. It resulted from a programmer, whom I thankfully couldn't identify, using a single-precision float for calculations that involved unit-of-measure conversions, which then fed into further calculations involving contract terms and weights. The original unit stored was a decimal with 30 total digits and 10 decimal digits. When I finally found the part of the code where decimals suddenly became floats, I was inclined to do violence, given that it was 3AM and I was chasing a couple of hundred thousand pennies across various places. It was deep in the middle of a stored procedure, so the standard excuses don't apply.

I swear there are not enough classes about numbers.


> I swear there are not enough classes about numbers.

Yes. This is one thing that irks me about these fads in programming, they all solve problems that have long ago been solved and then re-introduce all the bugs that were already discovered, fixed and forgotten about ages ago. Old software is a very nice repository of information about edge cases and real world complexity.

A $0.75 accounting error turned out to be named "Markus Hess". Precision matters. If it works good. If it works but you don't know why it works: bad. If it works, but has a small problem and you don't know why: bad. If you run a $1M budget and you're off by $0.75: bad. And so on. You really don't want to store anything to do with money in floats.
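
A tiny, self-contained Java illustration of why floats and money don't mix; the exact float total depends only on IEEE 754 rounding, so it is easy to reproduce yourself:

    import java.math.BigDecimal;
    import java.math.RoundingMode;

    public class MoneyMath {
        public static void main(String[] args) {
            // Binary floating point cannot represent 0.10 exactly, and the
            // error compounds as the running total grows.
            float floatTotal = 0.0f;
            for (int i = 0; i < 1_000_000; i++) {
                floatTotal += 0.10f;            // add ten cents a million times
            }
            System.out.println(floatTotal);      // ends up off by hundreds of dollars

            // Exact decimal arithmetic, the moral equivalent of a COBOL
            // packed-decimal PIC field.
            BigDecimal decimalTotal = BigDecimal.ZERO;
            BigDecimal tenCents = new BigDecimal("0.10");
            for (int i = 0; i < 1_000_000; i++) {
                decimalTotal = decimalTotal.add(tenCents);
            }
            // Prints exactly 100000.00
            System.out.println(decimalTotal.setScale(2, RoundingMode.HALF_EVEN));
        }
    }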


I remember a 'Survey of Programming Languages' class in college. Perhaps an archaeology-of-code class, covering things that have already been done, would help. You are right; I wonder how much Computer Science knowledge is locked away, never to be seen. Perhaps a tax credit for companies that donate their old source code to the Library of Congress.


Lol. Genius. We see the converts and owners here take exception.


Try writing a Rust implementation of Linux...


Wouldn't it be great to have something like COBOL-gRPC, so you could rewrite everything in Rust/Scala/Java/C#, but not all at once? More like the simple things first.

I think that would probably be a good business.


If I were a bank, I'd start a new bank, in Not COBOL, and put all my new customers on that. Then, after 10 years, offer to move people still at the old bank over to the new one to get the new offers.


You forget that the old customers have credit contracts that last 50 years, with business rules implemented in mainframe software that no one understands or knows.

You can't migrate that way. An insurance company can do that (contracts usually last 1 year), but banks can't.


Right, I didn't put in "and the few remaining accounts that can't be moved can stay on the old system".

It was more a suggestion of how to move to a new system at all.

What happens if the bank closes before those 50 years?


How do you message to your existing customers that you're holding their life savings on fragile 1970s tech?

Also, you'll still have to write wrappers for the old tech (iPhone apps which access the COBOL core) in order to maintain feature parity with the other banks.


>How do you message to your existing customers that you're holding their life savings on fragile 1970s tech?

You mean "battle tested for 40+ years, totally solid 1970s tech"?

If anything, it's any crap they'd write now that would be MORE fragile.


You're thinking like an engineer.

You're right, of course, but the public is trained to trust the latest and greatest.

If 'battle tested' was the most salient selling point we'd be on Series 60 or QNX phones.

'We're putting new customers on the latest technology and keeping existing ones on our old systems' is not going to be an easy thing to message to the general public.


> If 'battle tested' was the most salient selling point we'd be on Series 60 or QNX phones.

Actually, there are more Series 40 and Symbian phones sold today than iPhones - 350 million to 200 million (India / rest of Asia, Africa and Latin America are the main buyers). Many people here still miss the old Nokias. Of course, Android is the new Series 60.

http://gadgets.ndtv.com/mobiles/news/nokia-sold-35-million-f...

http://tech.firstpost.com/news-analysis/india-smartphone-shi...


The code is from the 1970s, but a lot of the hardware has been replaced over the years. IBM has been pretty aggressive about maintaining compatibility options because they're fully aware of just how much legacy code their customers need to run. And where hardware hasn't been replaced, maintenance has usually been handled pretty well; those machines were built to last. There's even a company that was still using an IBM 402 accounting machine as of 2012.[0] And there's plenty of military hardware using 70s/80s computer tech.

But I don't see this ever really coming to consumer attention. Customers never directly--or visibly--interact with old mainframes. From their perspective, they're using the newest tech. The mainframe in the background doesn't exist to them; if they even think about it, it's just magic. The closest the public has come to talking about legacy systems was probably the healthcare.gov debacle, when a number of news articles mentioned integration issues as a problem. Those were big issues in themselves, but they were neither the main problem behind the site's development nor ever a major discussion point.

0. http://www.pcworld.com/article/249951/computers/if-it-aint-b...


Here in the U.K. at least, banks' use of legacy systems usually gets a mention when a bank has an outage or ATMs fail.

The canonical example when it comes to UK banking is the RBS outage, which meant that customers could not make payments for three weeks. RBS was hit with a £56m fine for that.

http://www.telegraph.co.uk/finance/newsbysector/epic/rbs/112...


And this right here is the reason that banks can't just sit on their ticking timebombs.

Another commenter says there's no incentive for managers or execs to stick their necks out, but £56m is quite an incentive.


Not really, because the risk of such a fine is ridiculously low. From reading up on the incident, it seems it was a botched software update, rather than specifically the use of legacy code, that caused the outage. RBS was fined because the outages were unacceptable to consumers and regulators and because "the potential for problems had been raised in an internal audit two years before the incident took place in 2012"[0] and nothing was done.

One could argue that legacy code makes that more likely because of unseen factors that predate the people actively working on it. It's hard to avoid a landmine, after all, if you have no clue it's there. But while a rewrite might eliminate that specific risk, it does so by inviting a lot of new ones to replace it. In some ways the risks are probably greater, as you need to figure out possibly decades of business logic and then re-implement everything. From a business-decision perspective, the existence of legacy code is an insufficient argument on its own. After all, the problems inherent in legacy code will surface yet again in another 10-15 years anyhow. The better solution in most cases is to identify specific problems and determine what can be done to hedge against them.

0. https://www.theguardian.com/business/2014/nov/20/royal-bank-...


If you think COBOL is bad, you haven't seen RPG yet.


If you read the article carefully, you'll notice that most banks DO NOT maintain their own software! They license the software, and usually the job of running the system, from a software house. The article is based on an interview with the CEO of Auka, a software house for a peripheral piece of banking software (read: he's selling software licenses). Try to look at this from the bank's point of view.

My first job was in one of these software houses for banking and finance (B&F), one that made both core systems and peripheral software.

A bunch of us programmers tried to make the same pitch to one of the bankers we had on staff. (You need bankers to do the job of product manager in this business.) He looked at us like we were morons and asked a rather pointed question: "OK, say we rewrite the whole software stack in C++ (this was 1998). What NEW banking products can we make in C++ that we CAN'T make in COBOL?" The answer is, of course: none, zip, nada. Last time you changed banks, did you ask whose banking system was being used and what language it was written in? I'm guessing not.

So banks go to the software houses to license a core banking system, and they care about pretty much only two things: 1) Is it solid? (i.e. bug-free; few things evaporate trust in a bank faster than questions about the bank's ability to do basic math right) and 2) price.

There have been a few cases where a new core banking system has been written from scratch, and most have been failures due to being too buggy at launch. Word gets out among the banks and the system is dead. Everyone will wait until it's proven; no one will be the first out of the gate. The bug-free requirement is a showstopper from the get-go.

On to price. The development cost of such a system is in the neighborhood of 250-500 million USD. The only way to get that down is to strip the features back to a much simpler system. The other software houses are just going to sell an existing COBOL system, with features disabled to match yours, and underbid you.

Bankers, unlike most CS majors, actually understand how to calculate the cost of investments! Say 500M USD development cost; a future-value calculation will show that you need a return on investment (ROI) of 20-50M USD EVERY YEAR FROM HERE TO INFINITY to make the investment pay off. (That's PROFITS, not revenue.)
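
To spell out that back-of-the-envelope math: treat the rewrite as a perpetuity, so the extra annual profit required is roughly the development cost times the discount rate. A minimal sketch; the 4% and 10% rates are illustrative assumptions:

    public class RewriteRoi {
        public static void main(String[] args) {
            double developmentCost = 500_000_000d;   // USD, upper end of the estimate

            // Perpetuity: a one-off outlay of C at discount rate r only pays
            // off if it generates extra profit of at least C * r every year,
            // forever.
            for (double rate : new double[] {0.04, 0.10}) {
                System.out.printf("at %.0f%%: %,.0f USD per year%n",
                        rate * 100, developmentCost * rate);
            }
            // Prints 20,000,000 at 4% and 50,000,000 at 10% - the 20-50M range.
        }
    }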

Core banking systems are so feature-stable, and have been so thoroughly debugged over DECADES, that there is very little maintenance to be done. I'd expect that each of these systems has a staff of 10-20 programmers maintaining it (and they spend a lot of time doing other things). How big a staff of maintainers will your new banking system need after launch? Even if it were zero, and you assume you pay each maintenance programmer 250k USD, a staff of 20 still only costs 5M USD a year.

The article's "what can the banks do" section left out a fourth option. Instead of using CS majors to code COBOL, you use B&F majors and give them a 1-2 year education in COBOL programming. The CS majors need 1-2 years of on-the-job experience to reach a B&F bachelor's-level understanding of what a core banking system needs to do anyway. The cost is the same.

One last point: say you do write a new system today, your choice of language will be either Java or C#, and as a database there is no viable alternative to SQL. In 30 years, no fresh CS major will touch Java or C# with a ten-foot pole, and you'll have to do another rewrite to the tune of 100-200M USD (assuming that programming will by then require fewer hands).

Much cheaper to keep on educating non-CS majors to do COBOL.


And we should all switch to Metric, give up fast foods, and call our mothers once a week... but look, it's never that simple because all of these things take effort.



