
Not completely related to the article, but I've often heard that banks argue that COBOL is more reliable and maintainable than any "modern" language.

I've always wondered why that is. Is it because people are much more careful when entering code? The article points out that their IDE is connected to the mainframe directly, which makes me believe an error is much more severe. I'm more confident that the compiler and unit tests can catch my mistakes than I am in my own coding skills (which is why I'm not always afraid of changing tightly coupled systems). And even if worst comes to worst, I'd still be confident that some system will detect the change and roll back my commit before all goes south.

With COBOL (in such old environments), it seems that things must not break at any point, so developers are a lot more careful with what they do, which makes the code do what it is designed for very well. The problem with such an approach is of course upgradability (I would not want to touch undocumented code that has not been updated for 20 years because some government decided to replace an old "protocol").

Or maybe COBOL is considered more reliable (by banks, etc.) because managers just see the fact "this system has been running for 15 years" while ignoring the problems that come with it.

But I'd like to hear your opinion about it.




I have some COBOL on IBM experience; I did a short stint about 15 years ago at a large grocery store chain. There is no way that COBOL is more reliable and maintainable -- it's effectively equivalent to programming in old-school line-number BASIC. The place I worked at was extremely strict on reviews, process, and testing. Every line was reviewed extensively by a group of people, and if you wrote a SQL query you always accessed the fields in alphabetical order. It was that kind of strictness. Some of this was because the consequences of a mistake are pretty severe (Hawaii doesn't get a shipment of groceries), but most of it was because the technology was so antiquated. What I can do now in a week would take four months in that environment.

I was one of the few new hires with actual education and experience in software development. The majority of new hires were people in their mid-thirties who were looking to change careers from trades, etc. The company was specifically looking for smart people that they could train extensively in house. Almost nothing you learn outside of that environment is applicable inside, and vice versa. The best I can describe it is as a complete parallel evolution of technology. The technology is similar and yet also completely alien.

Connecting to the mainframe didn't mean you were editing the code live. It's exactly like shelling into a remote Linux machine to compile your code. You didn't work locally because there's nothing that is local.


When I worked with COBOL I mostly worked in a similar hosted setting, using only a dumb 3270 terminal to write my code. However, that's not universally the case; it is/was also possible to develop COBOL code on a PC that simulated the mainframe runtime (MicroFocus COBOL, if it's still called that, is what we used).


My father's one(ish)-man business was a PC-based COBOL application that ran from the launch of the IBM PC until he retired in the early-to-mid 2000s. I worked on it for a few years in the 90s (SPF-PC!).


COBOL development had (has?) a completely different development and deployment experience. Sure, you keyed your code directly into the mainframe, but your code and your session were isolated - you, the developer, had limited privileges. Only once your code took the prescribed input and produced the prescribed output was it allowed near "production" data.

That said, the language itself was (is?) so limited in scope and ability that stability was nearly guaranteed. You READ and PRINT using tape or disk and printers, but the programmer didn't worry about what devices were available - some system admin pointed these devices at a job, and READ came from that tape and PRINT went to that printer. The setup and management of the jobs that ran your code was a very manual process. No pointers, no memory management ... nothing low-level at all. Just high-level abstraction.
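A minimal sketch of what that abstraction looks like in the language (the program and file names here are made up for illustration): the program only ever names logical files, and which tape, disk, or printer they actually map to is decided outside the program at job-setup time.

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. COPYRPT.
       ENVIRONMENT DIVISION.
       INPUT-OUTPUT SECTION.
       FILE-CONTROL.
      * INFILE and RPTFILE are logical names only; the job that
      * runs this program decides what device or dataset they
      * actually point at.
           SELECT IN-FILE  ASSIGN TO INFILE.
           SELECT RPT-FILE ASSIGN TO RPTFILE.
       DATA DIVISION.
       FILE SECTION.
       FD  IN-FILE.
       01  IN-REC   PIC X(80).
       FD  RPT-FILE.
       01  RPT-REC  PIC X(132).
       PROCEDURE DIVISION.
           OPEN INPUT IN-FILE OUTPUT RPT-FILE.
           READ IN-FILE.
           WRITE RPT-REC FROM IN-REC.
           CLOSE IN-FILE RPT-FILE.
           STOP RUN.
```

Nothing in the program says "tape drive 3" or "the line printer on floor 2" - that binding happens entirely in the job setup.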

Then there's hardware reliability. As an example: you can walk inside the mainframe, pull a CPU, and the thing keeps working. Replace that CPU, and the system makes use of it. None of the system's users ever know that it happened.


That job setup uses a language called JCL. And while COBOL reads more or less like English statements, to the point that anyone who has ever programmed procedurally could follow it, JCL is quite inscrutable. One old joke was that nobody ever writes JCL from scratch; they just take what worked last time and change the dataset names.

http://www.jargon.net/jargon/jargonfile/j/JCL.html
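For illustration, a made-up fragment in the shape being described (the dataset and program names are invented); the DD statements at the bottom are exactly the part people copy from the last job that worked and tweak:

```jcl
//NIGHTLY  JOB (ACCT),'DAILY REPORT'
//STEP1    EXEC PGM=COPYRPT
//* Each DD card binds one of the program's logical file names
//* to a real dataset or output class.
//INFILE   DD DSN=PROD.DAILY.TRANS,DISP=SHR
//RPTFILE  DD SYSOUT=A
```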


> Then there's hardware reliability. As an example: you can walk inside the mainframe, pull a CPU, and the thing keeps working. Replace that CPU, and the system makes use of it. None of the system's users ever know that it happened.

From what I understand, until Tandem came along and started to seriously eat away at the market in the late 1970s and forced all the other manufacturers to follow, mainframes were not known for reliability.


I'm still looking for data myself on reliability over time. I recall one OS for IBM mainframes bragging about its reliability because it ran six months or so without a reboot. I was thinking, "Wow... some good numbers there..."


> Not completely related to the article, but I've often heard that banks argue that COBOL is more reliable and maintainable than any "modern" language.

What I've usually heard is that code that already exists in COBOL, and has been demonstrated over many years of use to work correctly, is more reliable and less costly to maintain than the alternative of replacing working components with something in a modern language when new business requirements force a change.

Having been involved in a big-bang replacement of a large, working COBOL system with a from-scratch .NET solution, I'm not unsympathetic to this argument.

I've never heard the argument (though perhaps it's been made somewhere) that COBOL is, for a clean-slate project, more reliable or maintainable.


It's just IBM marketing :-)

To the extent it's true, it's because the people who have built these systems are experienced and have a lot of business knowledge, and they know how to reuse and tweak their systems rather than trying to build each new requirement with the latest interesting framework they downloaded five minutes ago.

But it's slow to code in, especially if you go outside COBOL's comfort zone.


This is probably down to a few things. 1) There's no need to constantly update your version of the language and framework, unlike with, say, Ruby and Ruby on Rails. 2) COBOL is battle-tested, so most of the failure modes are well known, unlike when adopting the latest and greatest. 3) I doubt that mainframes suffer from as many distributed-computing problems.


I also imagine they are fault-tolerant in ways we no longer consider today.




