Reminded me of my own experience of creating a bank software system for the State Bank of one of the republics that was in the middle of separating from the former USSR. "Perestroika", they called that process.
No mainframes; where would we have gotten those at that time? Only an i386 as a server and PC XTs as terminals. Connected essentially by ropes, but NetBIOS worked over them. Plus 1200-baud or so modems to connect with the other banks in the republic.
But we (a team of two developers) did it in 9 months.
Compiler - the excellent Zortech C++ by Walter Bright (creator of the D language).
Editor - MultiEdit the Great. What we would have done without it, I don't know.
Database - dbVista (later known as dbRaima).
Everything ran on PTDOS (a Russian MS-DOS-alike).
Plus a multitasking C library providing cooperative multitasking (the same execution model Node.js uses).
And everything in the 640K of RAM that "should be enough for anybody".
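That cooperative model (tasks voluntarily yielding control, the same idea Node.js later popularized) can be sketched in a few lines of Python. This is just an illustrative analogue, not the original C library; the task names and step counts are invented.

```python
from collections import deque

def task(name, steps, log):
    # Each yield is a voluntary "give up the CPU" point,
    # analogous to the cooperative library's yield call.
    for i in range(steps):
        log.append(f"{name}:{i}")
        yield

def run(tasks):
    # Round-robin scheduler: resume each task until it yields,
    # then move to the next; a task that finishes is dropped.
    queue = deque(tasks)
    while queue:
        t = queue.popleft()
        try:
            next(t)
            queue.append(t)
        except StopIteration:
            pass

log = []
run([task("teller-1", 2, log), task("teller-2", 2, log)])
print(log)  # the two tasks' steps interleave
```

The key property (and the limitation) is that a task that never yields blocks everything else, which is exactly why careful structuring mattered in 640K.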
We were young, and nobody even had a chance to tell us that this was "too serious", "impossible", and the like.
A lot of manual input by operators, so usability requirements were surprisingly high: http://www.terrainformatica.com/2016/02/ui-usability-solutio...
So basically no networking; just multiple video cards and keyboard interfaces in a single 386 PC (it used RS-422 to move the video signals to the screens). Under SCO Xenix this worked extremely well. No networking needed.
The girls were especially happy when we added an option to choose the octave :)
I still sometimes hear the sound of the bank's operations-hall orchestra in my night dreams :) Twenty workstations or so at the closing of the operational day... It was actually not that bad.
The only problem was fitting 10 digits onto a 12-tone chromatic scale. We added '.' and '-' to round it out.
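For the curious: mapping 12 keys onto the 12 semitones of an octave is just equal-temperament arithmetic. A hypothetical Python sketch (the key order, base frequency, and function name are my guesses, not necessarily what we actually used):

```python
# 10 digits plus '.' and '-' give exactly 12 keys, one per semitone.
KEYS = "0123456789.-"

def key_frequency(key, base=440.0, octave=0):
    # Equal temperament: each semitone multiplies the frequency
    # by the twelfth root of 2.
    semitone = KEYS.index(key)
    return base * 2 ** (octave + semitone / 12)

print(round(key_frequency("0"), 1))            # base note
print(round(key_frequency("0", octave=1), 1))  # one octave up, double the frequency
```

With an octave selector, each operator's workstation could chirp in its own register, which is how the "orchestra" came about.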
A common misconception is that UI is only about its visual side. User interaction may include all the other senses/information channels.
Another common misconception is thinking that UI principles are the same on any device that has a CPU. Far from it.
As an example: I participated in the initial designs of the EverNote[.com] application. From the very beginning it was very clear to us that the desktop and mobile versions were very different entities. On mobiles you essentially have no way to input your data other than using the camera or voice.
The size of your finger's touch area on a phone screen is comparable to the area of an average paragraph of text. WYSIWYG text editing in such circumstances is barely feasible. So 1-inch buttons are the only reliable option in the GUI.
GUI and UI in general are very interesting things indeed.
Just one fact: the system survived the 1991 monetary reform in the USSR, when in three days the whole country suddenly changed all its banknotes ( https://en.wikipedia.org/wiki/Monetary_reform_in_the_Soviet_... ). The history of the USSR is just one big social experiment in things like this.
The number of transactions in those three days was astonishing, as you can imagine. We spent all three days in the bank, worrying mostly about the hardware. 12 keyboards and 3 workstations out of twenty-something were lost in the battle :)
But the craziest thing I saw came the day after the reform. It was January 26th, and it had been snowy all three days. The plaza in front of the bank's building was covered with snow, and with banknotes on top of it; surreal. People who were late, or who had exceeded the exchange quota, were simply throwing their money away.
But our system handled it surprisingly well.
My apologies for hijacking the COBOL topic. C++ rulez forever!
Госбанк started using mainframes in 1972: http://www.computer-museum.ru/histsoft/bank1.htm
UPDATE: In fact we were extremely lucky not to have even seen this mainframe beauty up front. The sheer number of daily transactions was enough for us to estimate the payload and architecture.
It was only 3 or so years later that I was invited to see that glassy mainframe kingdom at the Central Bank of the USSR.
If it sounds like I am bragging... I am. I am very proud of her. What I'm not proud of is not spending enough time with her to extract more details. I did ask her a similar question about being a German widow working with predominantly males, and she basically had the same answer as the article's mother: "no problems whatsoever".
This article is a great reminder that I need to call my grandmother more often :)
Interestingly though she absolutely loves all things sci-fi. She records almost everything the SyFy channel produces and loves Babylon 5, Dr. Who, etc...
She is exceedingly good at board games. Even ones she has never played before, she often wins. I remember bringing Chinese checkers to her once with my wife. She had never played before, but she absolutely thrashed my wife and me in multiple games.
She grew up in Germany, and I have often wondered if that had any effect on her non-traditional-gender hobbies and likes. Or maybe it is just her personality (as I love board games and sci-fi as well).
This fits in line with what nearly everybody I know thinks. Does anyone _actually_ like open-plan offices without dedicated desks? How long until that trend passes?
While I understand the argument for separate offices, or at least cubicles, I have never worked in such an environment. But I am truly shocked that these programmers are being asked to hot-desk.
It's not like they're journalists or cops or people who only pop into the office unpredictably. On the other hand, programmers do have strong personal notions of how their monitor(s), keyboards, etc. should be laid out.
But hot-desking was still a complete failure, even excluding the personalisation issues. He said that people would arrive on the floor asking 'Is Bob here today?', to which the only answer was 'I don't know, he could be anywhere on the floor. I'd have no idea where to look for him'.
Some people like open-plan, and some people hate it. But I've never met anyone who liked hot-desking.
I rather doubt the cleaning crew is taking the time to sterilize keyboard/mice. I know, I'm a bit of a germaphobe.
Not just programmers, trust me. At our company, most people do not care, but a few of them will notice - and get cranky - if you move their keyboard by just a few millimeters.
Once in a while I get a sensitive or confidential email, but that usually happens when the computer is connected to a big screen in a conference room full of people anyway...
I DO mind not finding anywhere to sit, not knowing where people are when I want to talk to them, not having two or three huge monitors to work on, and not having my Model-M to type on...
I guess there are some stools and tables available somewhere, and technically you can sit on the floor, I suppose, but people who arrive late more or less have to go home again.
They are kind of in a downward spiral anyway, so maybe it will sort itself out.
Wow, that's a much better name for it!
Thus Personnel became Human Resources.
But often the new words become, with the passage of time, clearly just as offensive as, or more offensive than, their predecessors. I mean, "Human Resources"?! WTF! Are people really just "resources"? To me, Personnel sounds way friendlier.
Another one is around disability. We would never describe someone today as retarded. Yet really, is disabled any better? I'd rather be retarded (as in, slowed down a little) than disabled (completely stopped).
[close to my heart: we write HR software :)]
The Wikipedia article on intellectual disability has a list of historical medical terms. Cretin, Idiot, Imbecile, Moron, and Retard are all used in an offensive manner nowadays. The same will probably happen (or has already happened in some cases) with Disabled.
Probably the same process takes place for many corporate terms. I observed this during my time at Samsung, when the internal organisation for CSR changed its name from POSITIVE_ADJECTIVE Workplace to the New Management Committee.
 Corporate Social Responsibility
I'm waiting for them to drop the "Human." I'd like to be referred to as a generic "Carbon-based labor unit" or some such.
People tended to use retarded as referring to one's mental abilities, and assumed that if you had a physical disability, you had a mental one as well. That was the biggest problem with retarded.
Isn't that what it means? Applying it to something physical just seems wrong.
Incidentally HR workers refer to themselves as "professionals".
The 'retarded' thing fell out of favour because it was heavily used as a pejorative, which doesn't help people who actually are retarded. When I was in neurology over a decade ago, the term had changed to 'developmentally delayed', as in, hadn't hit the mental milestones that a kid of that age should have hit. I loathed that term, because while most parents knew (or were told) to interpret that as 'old-fashioned retarded', to some parents it sounds like the kid is just delayed, and given enough time, will catch up. Some kids do, and at better rates than in ye olde days, but plenty of them don't.
Which is to say, I don't like hotseating.
There were a few of these very long scripts, written in the '70s. At the top was "Here be Dragons".
It felt cool: after I figured out what interest calculations it was performing, I ended up leaving a comment "Dragon Slayer was Here" at the end of the script.
There are still younger guys in our banks who know COBOL, but the domain expertise sits with the much older guys. Lately it's common for the older guys who retire to come back as "independent consultants". They earn a lot, of course.
Just find one of the experts who is willing to answer whatever questions you have on the legacy COBOL code.
Some dabbling with an (almost certainly illicitly acquired) copy of IBM z/OS on an emulator almost proved to me that no free resources exist.
I'll ask around with my mainframe vets and get back to you. There has to be something out there.
Though I found it to be stifling, I would work with mainframes again if the price was right.
The problem is training new developers: you won't get experience developing COBOL (not just the language, but the toolchain, the idioms of the time etc) without developing COBOL, and nobody is going to let you touch the system until you're experienced.
Unlike more recent languages, there isn't a huge range of small/simpler/less important (but still representative) codebases for novices to cut their teeth on.
I think it would take some sort of apprenticeship style setup to train new COBOL devs, but that's really, really hard to get right, and will be a huge drain on the few experienced devs you do have.
It's a great industry to be in if you work in the right area, just like any industry.
There are a lot of places where it is a lot worse to be a programmer than at a bank.
The pay is ok, the domain is interesting, and the work load is manageable.
When I look at some friends doing game development, they seem to work a lot more for a lot less. They have a cooler office though...
But I think it's a good idea to try to avoid COBOL.
I thought it was actually fairly interesting work. COBOL is a very neat language. Like anything, though, at age 22 it's not likely you'll stick around in any job for long, much less one where you're generally just doing maintenance, source control, and corner-case bug bashing. I moved on, but I'll always remember the job.
But the knowledge required to fix or update these programs is almost completely owned by the programmers who developed them, who are in the process of retiring or are unwilling to share knowledge for (justified) fear of losing their jobs.
This creates an unjustifiable dependency for the company, which will have a very hard time finding senior developers to replace the original developers, or even recruiting <modern language> developers to build a new and more modern system. What happens to the company if the original programmers retire, get sick, or leave?
Every project that I know of that focused on replacing mainframes running COBOL quickly reached nightmare-like complexity during the analysis phase - especially because the original developers were uncooperative and the scarce documentation was obsolete - and was replanned to take at least twice the time (and sometimes twice the work/FTE), which also increases the cost of replacing the technology by a few (tens of) millions...
That has been my experience as well. It's also the case that business processes get designed around the COBOL code, not just vice versa. Separating out what are actual business requirements from "what we've always done on the mainframe" can be fiendishly difficult, and not just the original developers but the customers can be extremely uncooperative. It takes management coming in and saying "these are the new business processes, they are compliant with the industry standards, and our new software is going to implement them" to get things to actually happen. How that works for a changeover in something like a bank is beyond my experience, though.
A company using ancient COBOL code should have people who understand it, but also the ability to test it, and to talk to it from new code.
Once you get to that point, replacing the COBOL is not so hard -- once you have a reason to do it.
They're everywhere, they do... something... and nobody can replace them all at once. Even if you replace some of them, the remaining part still means your system is swamped by the old mess.
I'm pretty sure the problem is one that can't be solved just by offering money. This is why many large corporations will pay a lot more for consultants and the like rather than risk it with independent contractors, despite the contractors being much cheaper.
Top tech employers will routinely hire you (with generous compensation packages) to program primarily in a language you're not experienced in.
Big non-tech companies are simply not set up to realize that developers are assets, not peons.
In a non-tech company, they are working in a cost-centre.
Ironically, a lot of non-tech companies have not yet realized that the services they provide are actually digital, and that their profit centers are really just a sales organisation.
Salient quote from the article: "the time before a new employee can stand on their own feet is 2–3 years"
Most neurosurgeons just follow textbook prescriptions. I don't think this is a good analogy.
Uh, you would refuse to do mainframe programming for $1m/year? Seriously?
If there is a shortage worth complaining about, it implies there is a critical need (a large financial upside -- these articles make it sound like it's an existential requirement for the bank) for hiring an employee that knows Cobol. My point is that if the compensation scaled with that critical need, plenty of people would be picking up Cobol right now, just like plenty of people are honing their business management skills.
Very nice article. Captivating read. I was interested about what kind of 'previous computer experience' your mom had. That era of computing is completely alien to me.
As a kid, I remember him coming home with IBM-style punch cards. He would put the cards into something that looked like a money-counting machine. It was pretty fast and noisy; I remember being fascinated by it. His computer room was filled with a huge, 3-rack, communist-bloc-designed computer with 256KB of RAM and a 60MB HDD. The HDD was as large as a washing machine and took 2 minutes to spin up from a cold start. The actual data discs were interchangeable (so the disk itself was spinning in regular air, not in a vacuum).
That is/was the punch card reader, with a metallic hair-comb-like contraption (each prong of which was presumably connected to some sort of opto-coupler or a similar sensing component) for detecting the holes in the cards as they passed under it at high speed. As a kid, I used to have a lot of fun watching it in operation at my dad's office.
He used to write code (COBOL and Fortran) by hand on those graph-paper-like coding sheets, then hand the sheets over to a 'specialist' card-punching machine operator (I remember they had special courses for that, too), who would punch the cards, and so it went... :-)
Work on something that nobody else wants to do for 50% of what you could make doing something more fashionable.
I have a sneaking suspicion that this is not the case. Companies that want top talent and are willing to pay for it are not shy about that fact. Seeing a legit job posting for a position that pays double your current salary is a solid incentive to apply.
I'm guessing that the subject of this interview probably makes a less-than-competitive salary. Programmer pay has handily outpaced inflation for decades now. She probably started at around $30,000/yr in 1991; if she managed to get a 3% raise every year, that would only bring her to around $63,000/yr.
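Back-of-the-envelope, assuming that $30,000 starting figure and steady 3% raises (both guesses on my part):

```python
# Compound 3% annual raises on a $30,000 starting salary,
# applied each year from 1991 through about 2016 (25 raises).
salary = 30_000.0
for year in range(1991, 2016):
    salary *= 1.03
print(round(salary))  # roughly 63,000
```

Compare that with what a mid-career developer at a top tech employer makes today and the gap is stark.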
Naturally, I'd love to be wrong about this. This woman has a demanding and important job and deserves a generous salary for it.
Therefore there's not much chance you can go to another bank and do the same stuff, surely? Therefore less pay.
I've been raised to not ask what people make, that it's considered rude - so I have no clue what she makes.
Sure, but within the same family? While I was a kid this was vital to get some idea of the family budget. :P
Of course, it's simple enough to look it up at Skatteverket...
It's over double the average family income in the USA.
Developers straight out of college can sometimes make triple the average household income.
That doesn't mean there aren't underpaid software engineers. Even when their lowball salary is higher than the average household income.
The median household income in the US is a "decent living" in almost the entire country, even some urban centers. Saying that 2x+ is a "decent living" is completely tone deaf.
In Dallas, a fresh out of college CS grad can probably expect a salary in the lower 40s. And the cost of living is cheap enough you can afford a house on that.
I would agree. Between the fact that new companies aren't generally standing up mainframes, and that existing companies are finally getting off them (I know one bank locally that's migrated critical parts of its payments processing to commodity hardware & languages, and the most critical government payments processing system) I suspect it will be a shrinking market. I certainly wouldn't bet on an uninterrupted career out of it, put it that way.
EDIT: I've added some Q&A at the bottom, will add more as more questions come in.
So, not quite as old-school as the analog days :-p
With the older systems, the database update logic and the business logic are in the same programs. This is not good.
Those were all good ideas at the time.
In fact, IMS was such a good idea that the hipsters just reinvented about 1% of its functionality and called it MongoDB!
Can you elaborate? Sometimes it seems that business logic must be implemented in SQL, for example, or else you will get a massive performance hit.
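A toy illustration of the performance concern, using sqlite3 as a stand-in database (the table, the data, and the 1% interest rule are all invented): the set-based UPDATE applies the business rule in a single statement, while the equivalent application-side loop makes one round trip per row.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [(i, 100.0) for i in range(1000)])

# Business rule "credit 1% interest" expressed in SQL:
# one set-based statement, one round trip to the database.
conn.execute("UPDATE accounts SET balance = balance * 1.01")

# The same rule in application code: one round trip per row,
# which is where the "massive performance hit" tends to come from.
rows = conn.execute("SELECT id, balance FROM accounts").fetchall()
for acct_id, balance in rows:
    conn.execute("UPDATE accounts SET balance = ? WHERE id = ?",
                 (balance * 1.01, acct_id))

total = conn.execute("SELECT SUM(balance) FROM accounts").fetchone()[0]
print(round(total, 2))  # 1000 accounts, 100 each, credited 1% twice
```

Over a real network, with millions of rows, the per-row version is the one that falls over; that's the usual argument for pushing this kind of logic into the database.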
It's only recently that someone realized that, since all services are delivered digitally, the banks would have been outcompeted by Google, Microsoft, Apple, or just about any competent and large enough tech company, were it not for the heavy regulations and a conservative financial world.
Yeah, that's just not true.
The money is spent on systems on top of the core systems (web etc) basically trying to stay on par with the competition. If they spend on the core systems, it's when they try to consolidate and save costs.
DB2 is what Oracle wishes it could be when it grows up.
Bottom line is that there is parked money in these accounts. The banks should have every incentive in the world to minimize this parked money (either to get access to their own money or to avoid paying interest on someone else's). So there should be pressure to make the systems as real-time as possible, not batched.
In South Africa we have BankServ, which is a local interchange and clearing house. Transactions are still in batches, but the settlement between the local banks happens net.
Internationally, MasterCard, VISA, Amex etc. manage the clearing. Banks then have to pay within 2 days via Swift to either other banks or their designated international settling bank.
Going back to your suggestion, it could work because the banks could just pay each other when they run out of funds. The challenge becomes fraud and other legal issues, which often mandate settling houses.
The real-time problem is a legacy hardware problem as well as the settling issue.
On your last point, money is normally parked at the Central Banks, mostly as a legal requirement. From my experience, banks in healthy financial economies like mine don't keep a lot more than the minimum cash reserve with the Central Bank.
Same thing in Sweden, I believe. Cleared through Bankgirot.
I don't think real time settlements involving the central banks will happen any time soon, but maybe more frequent batches.
It's very nostalgic for me because the author is fascinated by the technology of the day. COBOL, hierarchical databases, batch processing were still a norm in IT when I got into the field. My first programming job was around 1992-93 on a Vax using SQR and PL/SQL for an accounting department. We also had to be able to read COBOL. By contrast, my friend got into the field 5 years ahead of me and was hired into an insurance company. His IT department was just switching from assembly and IMS on an IBM mainframe to the "newfangled" COBOL going against DB2.
I left IBM after a year, stayed in my first programming job for a year, and then started a 5-year stint with startups. When I hit the startup scene I realized how a new norm was developing. Small servers, C/C++, relational databases, and OLTP were the dominant paradigm in smaller organizations.
The author did a nice job describing this period in the history of computing.
I would love to know what a 'typical day' and the general working environment are like. (A vague question, I know, but sort-of on purpose.)
What data did you use to determine that barely anyone knows mainframe COBOL? A significant number of companies still use it, because it just works and they have no need to upgrade to current technologies.
Even back then, one of the major complaints was that companies refused to pay for training, hoping they'd find someone who already knew everything they needed. I heard multiple stories about people who told their employer that they were planning to retire, left on schedule after not finding anyone qualified to train, and returned later as a consultant at a significantly higher rate.
Unfortunately, the alternative (rewriting the mainframe/COBOL systems on a more modern platform) is a risky and costly process...
It's not even the hacks. It's that "how the business works" was automated into COBOL 30 or 40 years ago, everyone in the "business" who knew how and why things happened retired or got laid off, and the COBOL programmers are the only ones who remember the business rules.
I lost count of how many times I thought about rewriting the entire beast in C, or even in a more recent iteration of FORTRAN.
In the end, it wasn't something I would be using for much longer, so I just mapped out what all the functions I needed did in a spreadsheet and added a couple of hacks for the next unfortunate person to handle.
Maintaining old codebases that are full of hacks and lack good documentation is merely boring when failures carry no big consequences, but code that handles the kind of data that banks have is something straight from hell.
That said, mainframes and COBOL tend to be used heavily at utilities, railroads, and government entities. One of my coworkers (one of two former COBOL programmers on staff) just identified that Eversource in CT uses COBOL because of their inability to handle a certain type of id.
My first job out of school was at a big consulting firm, where they taught new college graduates COBOL in a 6-week bootcamp. And most of them had taken few or no programming courses in school.
BUT the hard part was, is, and will always be domain knowledge, with special cases here and there, obscure logic, miscommunication between expert and developer, and the rest.
For reasons known only to my computer science dean, COBOL was a third-year course after BASIC, Pascal, and FORTRAN, around 1985 when I was working through the curriculum for a CS degree. Fourth year was BAL (IBM mainframe assembly), followed by a language of your own design.
ADD GIN TO VERMOUTH GIVING MARTINI
Since today that would be...
martini = gin + vermouth
.. perhaps we are now 25% or so more efficient.
Fortran (particularly in the context of embedded systems developed in the 70s and 80s) is another "barely known" language that's quite widely used (particularly in historical and maintenance contexts, if not in widespread current development outside certain engineering and scientific domains).
Unfortunately young people might not find the world of mainframes and COBOL the most enticing career choice :)
I think it will be a huge problem over the coming years as more of the original authors of banking system retire.
My one and so far only COBOL program was an ASCII-graphics game of checkers for the TRS-80 model 16.
I'd be interested in links to guides for this product, or scanned pages that show whether the OS itself was written in COBOL. I'll put it on Wikipedia's Timeline of Operating Systems list if I get the info. I've gotten quite a few added there so far.
Edit: Found some paydirt showing the company, along with at least the compilers, absorbed by MicroFocus. The tech might still live on somehow.
Searching around Digitek, MicroFocus, and Liant still didn't get me jack. This is some phantom of a product.
I remember a little about it:
You could set up "partitions", which meant reserving fixed amounts of RAM for different applications.
It ran on the (8 MB I think) external hard drive for the TRS-80 model 16.
An advantage compared with Xenix was that it was smaller (used less of the hard drive).
Also, it had just a single directory, but it had a form of hierarchy in that you could use any number of '.'s as separators, like usr.bin.ls. The total name could be long. I think OS/360 worked something like this.
I think the last line of the screen was always the command line, and the other lines were for command (or COBOL menu screen) output and worked like a pager.
You could definitely hook up serial terminals.
I found this:
It's the same as "COS990", but for the 68K.
I might give it at least a brief Wikipedia article with these links.
If you can lay your hands on some issues of BYTE and PC Magazine (Ziff Davis) dated between 1985 and 1990, you would find plenty on RM COBOL.
Note: it's going to be irritating if I find out it's another DOS variant after all this work, haha.
Conversely, a lazy DBA can cost you a lot of money. And lazy DBAs can be hard to identify - their domain is rather arcane after all.
or the same paper at Google Books:
I've always wondered why this is the case. Is it because people are way more careful when entering code? The article points out that their IDE is connected directly to the mainframe, which makes me believe an error is much more severe there. I'm more confident that the compiler and unit tests can catch my mistakes than I am in my own coding skills (which is why I am not always afraid of changing tightly coupled systems). And even if worst comes to worst, I'd still be confident that some system would detect the change and roll back my commit before all goes south.
With COBOL (in such old environments), it seems that things must not break at any point, so developers are a lot more careful with what they do, which makes the code do what it was designed for very well. The problem with such an approach is of course upgradability (I would not want to touch undocumented code that has not been updated for 20 years because some government decided to replace an old "protocol").
Or maybe COBOL is considered more reliable (by banks, etc.) because managers just see that "this system has been running for 15 years" while ignoring the problems that come with it.
But I'd like to hear your opinion about it.
I was one of the few new hires with actual education and experience in software development. The majority of new hires were people in their mid-thirties who were looking to change careers from trades, etc. The company was specifically looking for smart people that they could train extensively in house. Almost nothing you learn outside of that environment is applicable inside and vice-versa. The best I can describe it as a complete parallel evolution of technology. The technology is similar and yet also completely alien.
Connecting to the mainframe didn't mean you were editing the code live. It's exactly like shelling into a remote Linux machine to compile your code. You didn't work locally because there's nothing that is local.
That said, the language itself was (is?) so limited in scope and ability that stability was nearly guaranteed. You READ and PRINT using tape or disk and printers, but the programmer didn't worry about what devices were available - some system admin pointed these devices at a job, and READ came from that tape and PRINT went to that printer. The setup and management of the jobs that ran your code was a very manual process. No pointers, no memory management ... nothing low-level at all. Just high-level abstraction.
Then there's hardware reliability. As an example: you can walk inside the mainframe, pull a CPU, and the thing keeps working. Replace that CPU, and the system makes use of it. None of the system's users ever know that it happened.
From what I understand, until Tandem came along and started to seriously eat away at the market in the late 1970s and forced all the other manufacturers to follow, mainframes were not known for reliability.
What I've usually heard is that code that already exists and has been demonstrated in many years of use to work correctly in COBOL is more reliable and less costly to maintain in COBOL than the alternative of replacing working components with something in a modern language when new business requirements required a change.
Having been involved in a big-bang replacement of a large, working COBOL system with a from-scratch .NET solution, I'm not unsympathetic to this argument.
I've never heard the argument (though perhaps it's been made somewhere) that COBOL is, for a clean-slate project, more reliable or maintainable.
To the extent it's true, it's because the people who built these systems are experienced, have a lot of business knowledge, and know how to reuse and tweak their systems without trying to build each new requirement with the latest interesting framework they downloaded 5 minutes ago.
But it's slow to code in, especially if you go outside COBOL's comfort zone.
I am afraid it became a victim of its own success. IBM and the other software companies got used to such huge margins that they stopped innovating enough, in order to protect the revenue.
I remember going to this Bay.net talk with Juval Lowy, who beat down on everyone, calling devs code monkeys who get paid peanuts. Haha, the ~100K range is definitely peanuts compared to a seven-figure salary. What a world.
The new thing is wrappers: extending the life of the applications and their data while allowing modern interfaces via the web and handheld devices. If anything, it's the best example of code reuse there is.
On another note, I don't know if we have anyone under fifty who knows COBOL well enough to do new work, but a few can do maintenance just fine.
I spent about 9 months in total in a mainframe team during my training, but that team mainly took care of hooking the mainframes up to the corporate network, so I did not do any coding except for a small Perl script. (I came close to having to code in REXX, but I dodged that bullet. I played around with REXX at home and did not like it very much.)
But it was very interesting to see all the examples of parallel evolution in the mainframe world, at least where terminology was concerned. A reboot was an IPL (also used as a verb), the OS kernel was a nucleus (a term I really liked), and they also had a term for network interfaces that I've since forgotten.
A strange but interesting world, it's a shame those beasts are so expensive.
1. How/when did you decide you wanted to be a programmer?
2. What were the challenges you've faced as a female programmer who started in the '90s?
3. You've been working for the same entity and possibly the same code base for more than 20 years. Does it ever get old?
4. How scary is it to write code for a bank?
I'm certain I would have hundreds more questions to ask, but I'll be satisfied with at least these :-)
The programs that use IMS probably do lots of things that are just not done in relational databases, for good reason. So if you just translate it, you will either get relational constraint violations, or you will configure DB2 to stop enforcing relations, losing most of the advantages DB2 has in the first place.
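That tradeoff can be sketched in miniature with SQLite standing in for DB2 (the `account`/`posting` tables here are invented for illustration, not from any real bank schema): with constraint enforcement on, a naive load of hierarchical data is rejected; with it off, the load succeeds but you silently accumulate orphan rows.

```python
import sqlite3

# Autocommit mode so the PRAGMA toggles take effect immediately
# (inside an open transaction, SQLite ignores foreign_keys changes).
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("PRAGMA foreign_keys = ON")  # enforce relational integrity
conn.execute("CREATE TABLE account (id INTEGER PRIMARY KEY)")
conn.execute("CREATE TABLE posting (id INTEGER PRIMARY KEY,"
             " account_id INTEGER REFERENCES account(id))")

# A hierarchical IMS extract can contain child segments whose parent
# record is gone; a naive load then violates the foreign key.
rejected = False
try:
    conn.execute("INSERT INTO posting VALUES (1, 999)")  # no account 999
except sqlite3.IntegrityError:
    rejected = True

# Turning enforcement off lets the load through, but the database now
# carries orphan rows, giving up the guarantee DB2 was supposed to add.
conn.execute("PRAGMA foreign_keys = OFF")
conn.execute("INSERT INTO posting VALUES (1, 999)")
orphans = conn.execute(
    "SELECT COUNT(*) FROM posting WHERE account_id NOT IN"
    " (SELECT id FROM account)").fetchone()[0]
print(rejected, orphans)
```

Either outcome illustrates the comment's point: the translation forces you to choose between load failures and disabled integrity checks.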
Because of the widespread use of these homegrown systems in the banking industry, however, it appears that management is not yet ready for a total break with the past. The investors that own these companies need to demand change and indicate they are willing to pay the cost.
It sounds like other than the very slow database, the biggest impediment in this space is the lack of new talent familiar with the environment.
I would actually not mind learning COBOL and writing boring bank software, but why bother if I can make more money with C++ and Python, while working on more interesting stuff?
If I'm going to take a pay cut to use an exotic language, I'm going to find a job writing Lisp.
Makes you wonder what would happen if the entire mainframe tech organization at one of these banks demanded to be paid on par with the traders working on the investment side of the bank.
It's for organizations like this where I really see unions making sense.
But it's not easy and not cheap.
The problem for the banks is that they have so many systems, typically thousands, and they are all interconnected.
Enter the 'enterprise architects' that draw PowerPoints with boxes and buses on top of all the old shit, and you have a recipe for a very expensive soup.
Description: OpenCOBOL is an open-source COBOL compiler.
Library Dependencies: gmp, libtool, db44, ncurses, libgnugetopt, libiconv, gettext, mpfr
Maintainers: firstname.lastname@example.org, email@example.com
Almost seems to be more of a vendor lock-in issue than a mere 'old and scarce platform' issue, albeit that's significant also.
A well-designed System Z emulator that allows users to migrate their own mainframe applications to commodity hardware would obviously pose a serious threat to IBM's mainframe business, but IBM's software licensing terms have historically prevented such a threat from materializing.
In many ways, the project arguably benefits IBM by encouraging interest in the mainframe platform. [...] What brought about IBM's change in perspective was an unexpected effort by the TurboHercules company to commercialize the project in some unusual ways.
TurboHercules came up with a bizarre method to circumvent the licensing restrictions and monetize the emulator. IBM allows customers to transfer the operating system license to another machine in the event that their mainframe suffers an outage. Depending on how you choose to interpret that part of the license, it could make it legally permissible to use IBM's mainframe operating system with Hercules in some cases.
Exploiting that loophole in the license, TurboHercules promotes the Hercules emulator as a "disaster recovery" solution that allows mainframe users to continue running their mainframe software on regular PC hardware when their mainframe is inoperable or experiencing technical problems. This has apparently opened up a market for commercial Hercules support with a modest number of potential customers, such as government entities that are required to have redundant failover systems for emergencies, but can't afford to buy a whole additional mainframe.
So if they don't like Hercules' efforts, they should've brought out their own, or come to some kind of arrangement with them.
They rarely have dedicated training systems given the cost of running them, and trying to find information from IBM was an exercise in frustration.
Personally I think that is where IBM made a huge mistake. The lack of low-cost training systems has starved their ecosystem of new coders.