Hacker News
The Beauty of COBOL (2018) (devops.com)
100 points by fortran77 27 days ago | 73 comments



I haven't dealt much with COBOL, but I learnt it a while ago. It was one of the easier languages to comprehend quickly. I thought it was well suited to input- and data-processing-focused applications.

I do remember thinking it's odd that a line has exactly 80 characters and putting an asterisk in the 7th or 8th row means the line is a comment.

This is probably not true these days with modern COBOL. But when I saw Python for the first time, and that it enforces similar restrictions with indentation, and then Golang following suit, I had to smile that such old concepts of enforcing style are still seen as useful today.

Often we just have to go back and see what others have done decades ago to get new ideas!


> I do remember thinking it's odd that a line has exactly 80 characters

Maybe you know this now, but for those that don't:

The limitation is due to punch cards - specifically IBM-format punch cards (which was pretty much the standard by the time COBOL was developed). Those punch cards had a layout of 80 columns by 12 rows:

https://en.wikipedia.org/wiki/Punched_card#IBM_80-column_pun...

Each column would represent (more or less) a 12-bit "word" for each character on a line of code - so one card per line of COBOL code.

Now - note that the IBM PC (well, the 5100 and earlier terminals too) had a layout of 80 columns by 24 rows. Why? Because you could (in theory) see a virtual representation of two punch cards, one above the other. Whether this was actually used, or is an apocryphal or retcon'd reasoning, I'm not sure; I've been told both.

There are also parallels with printers - standard printer width was 80 columns (for regular 8.5-inch-wide paper - sans tractor-feed edges), while greenbar was 120 characters wide (using the regular font for most printers); I'm not sure why "40 extra characters" or a "half-card" extra, but I bet it's related to IBM punch cards in some manner.


>> This is probably not true these days with modern COBOL

It still is. Which column (not row) a statement starts in matters, and it's a syntax error to start a statement in the wrong column. Sorry, I don't remember any specific examples because it's been a few years since I last used COBOL.

I remember hearing that this had something to do with how early COBOL programs were encoded on punched cards. Anyway, there was some practical aspect to it that is no longer an issue with modern hardware.


This was the case with Fortran 77 too, and for the same reason (punch cards). But modern Fortran (F90 through F2018) no longer requires formatting by column.


Good point. I think one reason is that COBOL is valued for its consistency, with code written 50 years ago often still running on modern machines (at least that was the received wisdom when I was working at a COBOL shop). Perhaps the same is not the case with FORTRAN?


>>It still is

It is not if you use free source format, which most new COBOL programming is done in. Fixed source format does require the specific columns; free source format lets you start wherever on the line you want and extend the line as long as you like, but comments start with *>


So offside-rule syntax errors, like Python's. But fixed instead of arbitrarily dependent on context. It seems simpler and more modular to be able to look at a line of code and determine whether the whitespace is correct.


> I do remember thinking it's odd that a line has exactly 80 characters and putting an asterisk in the 7th or 8th row means the line is a comment.

You should check out IBM RPG and its various iterations, it's a whole lot worse than that.


As someone who spent serious time in RPG III in a previous lifetime, I would not advise anyone whose welfare I cared about or was responsible for to go look at RPG.

But yes, you are correct.


There is RPGII code I wrote back in the 80's that is still in daily use today. Most of the people who know I wrote the code have passed on, so the support calls have stopped.


Partway through that particular gig, the IBM CE came by and installed COBOL. (For many years, I had ranted about how much I hated COBOL.) So I transitioned all the RPG code to COBOL and what a relief.

So I had to eat a little crow.

Shortly after my gig, they went full scale IBM and built an IT department, as opposed to just a system 34 in accounting.

So no support calls for me.


> As someone who spent serious time in RPG III in a previous lifetime, I would not advise anyone whose welfare I cared about or was responsible for to go look at RPG.

One of my first gigs was working for a manager who loved RPG III. To the point where I was tasked with making a data collection interpreter using the same constructs RPG III is well-known for.

To this day, whenever someone mentions "RPG III" and what that project entailed, I think of the other one[0].

0 - https://en.wikipedia.org/wiki/Rocket-propelled_grenade


Golang doesn’t have semantic indentation.


I don't know why you're being downvoted, this is true.

Generally, extra tabs and spaces do not change semantics; nor do newlines (though when putting several statements on one line, you must write the semicolons explicitly rather than rely on automatic insertion).

Go files run just fine without any indentation whatsoever. People often assume the formatting enforced by the `gofmt` tool is mandatory for the compiler, but most of it is aimed at human beings for readability.

Here's the 'official' blog post about gofmt: https://blog.golang.org/go-fmt-your-code

Edit: downvotes don't change reality ; )

Try it yourself: https://play.golang.org/p/_bXvXryM5Ih — how incredibly ugly, and yet it runs just fine. Please hit format to clean your eyes.


I've seen go code fail to compile because a closing } was on the next line and there wasn't a trailing comma. I'd fully expect a formatter to handle that kind of minutiae for me.


> I've seen go code fail to compile because a closing } was on the next line and there wasn't a trailing comma.

That actually is a syntax error: Go enforces trailing commas (they are not optional) when the closing brace is on its own line. gofmt only formats syntactically correct code.


I get that a formatter should reject syntactically incorrect code, but I question why it's a syntax error in the first place. Coming from using black with Python, I generally leave the trailing comma in, but for the compiler to care seems weird when there's a standard formatter available.


The compiler cares because they want to enforce this standard (and it may make the parser slightly simpler). That there is a mix, or at least a non-obvious break between what the compiler enforces and what the formatter will accept and produce is an inconsistency in go, I suppose, but not necessarily worse than arbitrary decisions made in other languages.


Oh absolutely, do use gofmt, always.

The above example was just to make a point, and perhaps give insights into how the compiler works.

IMHO, Go's 'enforcement' of common formatting is powerful; it has an incredible compound effect which gains momentum as you get acquainted with the language.


For me, COBOL is rather over-represented in the media among the old (mainframe) languages, but I was recently surprised to see a production mainframe LPAR offering an APL2 interface, and it seems the language is still maintained on z/OS[1]. But I have never heard of a business application written in APL.

1 https://www.ibm.com/support/pages/apl2-whats-new


APL was strongly supported by IBM early on, and was used for verification of early IBM systems. When you look at APL in context, it was extremely ahead of its time.


I went to Burning Man with Allen Rose's cousin (of "Gilman and Rose" fame), and got to hear two-fisted tales of him writing CRM applications in an afternoon that were still in production 20 or 30 years later. It's a shame, really; it really is the most productive language I've ever used.


Do a string search for "APL" in this webpage then: https://www.simcorp.com/en/career/product-development

(A friend of mine worked for www.apl.it, which was later bought by simcorp.com - the original company worked on actuarial models for insurers and funds).



Oh man, I always wanted to learn more about COBOL. People like to crap on the language, but this article makes it seem like it's not that bad, actually. I could see myself programming that, for a large sum of money of course.

With that said, real COBOL code is probably much more horrifying when you consider just how much uglier real-world code is compared to small tutorials and review articles. I would really like to read a series of “COBOL War-Time Stories”.


  how much uglier real-world code is 
One of my first projects in COBOL was to complete a rewrite, using structured programming, of a report writer that dated back into the 1960s. If you think COBOL is hard to follow, imagine COBOL code laden with gotos and older than the 1968 standard.


Yeah, that sounds fun. Did that have the evil "alter" statement, which caused even more spaghetti with gotos?


I think ALTER usage was formally disallowed.


> Oh man, I always wanted to learn more about COBOL. People like to crap on the language, but this article makes it seem like it's not that bad, actually.

  IDENTIFICATION DIVISION.
  PROGRAM-ID. WELCOME-TO-COBOL.  
  
  DATA DIVISION.
     WORKING-STORAGE SECTION.
     01 MAKES-PROGRAMMING-FUN PIC 9(9).
     01 ALL-VARIABLES-ARE-GLOBAL PIC 9(9).
  
  PROCEDURE DIVISION.
     A000-FIRST-PARA.
     MOVE 0 TO MAKES-PROGRAMMING-FUN.
     MOVE 1 TO ALL-VARIABLES-ARE-GLOBAL.
  
     IF ALL-VARIABLES-ARE-GLOBAL = MAKES-PROGRAMMING-FUN THEN
        DISPLAY '20 LINES OF CODE TO SAY ENJOY COBOL!'
     ELSE
        DISPLAY 'USE A BETTER LANGUAGE'
     END-IF.
  
  STOP RUN.


> makes it seem like it's not that bad, actually

That’s because it was written by somebody who just learned it. It looks fine when you first learn it, until you actually try to solve real problems with it, and then the weaknesses of its simplicity become painfully obvious.


I wrote a ton of COBOL (mainframe and MicroFocus COBOL on AIX) back in the day in my early IT career (late 90s, early 2000s) around the Y2K conversion. It was a fun language because you could pick up someone else's code and be productive right away. It was hard (but not impossible) to do things in COBOL that weren't straightforward.

One of the big drawbacks we had though, and this may have changed since, was that COBOL wanted every variable and structure to be fixed length and defined up at the top of the program. So it was not really a good fit for the modern world of the web, JSON, etc. where you don't always know the length or structure of your input in advance. The versions of COBOL we used wouldn't have been able to process a blob of JSON and pull out a particular key for example without having the full JSON structure defined beforehand.

I always joke w/my colleagues that my copy of "COBOL For Dummies" on my bookshelf is as good as my 401k as a retirement plan.


> I wrote a ton of COBOL (mainframe and MicroFocus COBOL on AIX) back in the day in my early IT career (late 90s, early 2000s) around the Y2K conversion

Haha, same here! I owe Y2K my first paid programming job.

> It was a fun language

Nope. You lost me there.

COBOL still gives me nightmares. It was the most profoundly unfun language I've ever had the misfortune to be forced to work with.


A good place to start would be use a modern COBOL web framework, such as COGS http://www.coboloncogs.org/HOME.HTM


I wonder if it's possible to write a Lisp-like (such as Clojure or Hy) that compiles to COBOL in an interoperable fashion.


The string manipulation he's doing is already pretty gory, though I think COBOL has better facilities that he's not taking advantage of.


I wouldn't call it beautiful, but COBOL is a fairly capable language. I used Microsoft COBOL on the original IBM PC (twin floppy) in 1985 to produce a realtime multiuser data collection system in a factory. This interfaced to various machines and devices using a custom TSR* routine written in x86 assembler.

COBOL was chosen by the customer as they were running an IBM System 36, and the implementation was capable of calling the assembly code almost directly.

Each compile would take around 20 minutes, which teaches you to be careful with the syntax! When the source code got beyond 64k, we had to build the app in sections, as that was all IBM Edit could handle at the time. As the article shows, COBOL is quite verbose, so 64k isn't that large!

In later years we updated to Microfocus COBOL which was much better than the Microsoft version.

I can't imagine any modern programmer having any problems learning COBOL sufficiently well in a few days. It was my third language after BASIC and assembler.

[*] TSR = Terminate and Stay Resident program, which you loaded into MSDOS first, prior to running the main (COBOL) program. A very poor-man's "multi-tasking".



> Well-written code is a work of art. Always has been, always will be.

The same thing was thought of assembler programming a few decades ago. Now it is rather a lost art. I fear a day will come when machine learning algorithms can write code on their own, and coding itself will become a lost art, as assembler has now.


Assembler isn't lost? It's still very common in hardware development.


There was an option in the old CAPEX optimizing COBOL compilers I used back in the 80s: you could print out the IBM 370 assembly language it generated from your COBOL source -- a look "under the hood", so to speak.


To clarify, COBOL (similarly to SQL) was intended as a UI - it was created for business people, non-programmers.


No, it wasn't. Non-programmers didn't program in COBOL. Non-programmers generally lacked access to IDEs (even basic TSO workflows), compilers, and computer time, let alone JCL and data sets.

COBOL was intended for business cases for which FORTRAN was ill-suited, and COBOL shops generally didn't need science/math implementations.

Even PL/1 and RPG required a degree of programming skill.


Actually, yes it was, in a way. COBOL was intended to do away with programmers and to allow their managers to interact with the computer directly. Programmers of the day worked with machine language and generally took forever to deliver bits and pieces of functionality due to the low level nature of their tools. The idea behind COBOL was that the compiler would take a high level specification (the 'source code') which any competent person should be able to put together rather than that this specification had to be handed to the programmers.

Of course, the end result was that we ended up with just one layer of abstraction and programmers were in more demand than ever before because of their increased productivity, and the 'managers' ended up being programmers themselves.

This then led to the schism between 'systems' programmers (those that understood assembly) and 'applications' programmers, those that only worked in high level languages.

When I started working professionally in IT this division was still quite visible.


> COBOL was intended to do away with programmers and to allow their managers to interact with the computer directly.

This is incorrect. COBOL has as its definition “COmmon Business Oriented Language.” It can be argued that COBOL was the first compiler widely used, but to say it was meant to “do away with programmers” is to ignore that application programmers were the intended beneficiaries of no longer having to work strictly in machine code.


“When I started working professionally in IT this division [the schism between 'systems' programmers and 'applications' programmers] was still quite visible.”

I think you nailed a rather fundamental trait (spectrum) in programmers that carries on, morphing along with times.

From systems to business logic, or from the 'container' (pun not intended) to the 'content', and dare I generalize: from a strictly technical eye to a strictly goal/business mindset (again, this is a spectrum, everything in-between exists, all flavors contribute to make a world).

The cliche would be your low-level C/Rust guy versus some frontend Flash/Js person; but this is oversimplifying (the C person might be very goal-driven with a business approach; the Js person might be a performance expert). Reality speaks more of an engineer's world view, so to speak: from "bottom -> up" (those that assemble elementary pieces, that need to crack the puzzle by understanding each component, like Lego) to "top -> down" (those that rather manipulate large abstract/complex entities as 'black boxes' and rather work on their structure, the graph, dimensions, and shift behavior thusly with surgical cost-effective changes).

Oh just my 2cts I'm probably rambling. I just know that these two extremes on the spectrum are what we battle with for every decision we make, if we are aware enough of both sides. (Is that a blessing or a curse?.. you tell me!)

I think it's why, deep down, we still (and might always) search for 'the next C' (cue Rust, Go, D, depends on use-cases I suppose), or why some people are almost principled advocates of the WebAssembly paradigm in complement or replacement of the Js ecosystem. These are the technical people of the spectrum. The high-level engineers are more about pushing for interoperability, available skill pool, maintainability.

It's just that 'programming' as a skill now pertains to three dozen very different jobs so the general line is much fuzzier than in the 1950-60-70's; but I think it's a product of human intelligence, whether how we think or how we build our tools, our machines. Historical parallels, yadi yada.

The mainframe situation, I think, is even more telling. Every programmer under the Sun has thought of ways to replace COBOL, and there are decent candidates. But the business side of us, of above, of reality, has mainframes still running on COBOL. Because as a 'black box', it works; the benefits of a rewrite have not yet justified their cost.

I know a solid team usually requires at least 1 of each 'sides' of the spectrum, keeps everyone else honest (as engineers in a problem space).


Then there are mathematicians and quants, like me: breastfed on Matlab, weaned on Python, claiming a much deeper understanding of algorithms and code-factoring techniques than front-end jockeys, but still shuddering at memory management (other than by preallocating matrices/numpy arrays), devops, and other practices that are all in a day's work for "man's man programmers".

Granted, we're precisely what PL authors had in mind from time to time (e.g. with SQL): nonprogrammers who can write domain code. But because our training goes into proving big-O complexity, etc., front-end jockeys look at us as if we're capital-P programmers.


I'm guessing that you never worked in a COBOL shop.


Your guess is wrong.


> Non-programmers didn't program in COBOL.

This doesn't contradict the parent comment. They claimed that COBOL was intended to be used by non-programmers, not that it necessarily succeeded.

(It's also possible that it succeeded by its original intention but appeared to fail as the definition of "programmer" changed i.e. if it originally meant someone who understood assembler but it later came to mean someone who spent their time writing programs. I could imagine this being the case, as someone who writes javascript today (or even VBA for Excel!) isn't really comparable to someone who wrote code in the 1950s, but both might be called "programmers".)

Having said all that, for what little it's worth, in a software history course I was taught that COBOL was intended to be written by programmers (in the full-time professional sense) but be readable by managers. I imagine that it did not even succeed at that.


An operating system, MULTICS, was written in PL/1.


That's not really accurate. Grace Hopper created the FLOWMATIC language (which was the immediate ancestor of cobol) because she believed that computers should be programmed using something closer to natural language than machine code, and COBOL was made to be business-oriented but it was not created for business people to use. The concept of a "user" of a computer who was not an expert in computing would have been quite strange at that time (1959).


From the wikipedia page on the subject:

"Representatives enthusiastically described a language that could work in a wide variety of environments, from banking and insurance to utilities and inventory control. They agreed unanimously that more people should be able to program and that the new language should not be restricted by the limitations of contemporary technology. A majority agreed that the language should make maximal use of English, be capable of change, be machine-independent and be easy to use, even at the expense of power."

Emphasis added. The machine code that went before was slow and very expensive to develop, and required detailed specifications to be handed to the programmers in order to be able to implement the software. COBOL was clearly an attempt at increasing the accessibility of programming for a larger audience than those that felt comfortable with machine language.

This succeeded admirably, many people joined the ranks of programmers with COBOL as their first language in a large variety of business applications. The jump in abstraction level between machine language and COBOL was very large.


“I used to be a mathematics professor. At that time I found there were a certain number of students who could not learn mathematics. I then was charged with the job of making it easy for businessmen to use our computers. I found it was not a question of whether they could learn mathematics or not, but whether they would. […] They said, ‘Throw those symbols out — I do not know what they mean, I have not time to learn symbols.’ I suggest a reply to those who would like data processing people to use mathematical symbols that they make them first attempt to teach those symbols to vice-presidents or a colonel or admiral. I assure you that I tried it.” — Hopper c. 1958


My impression is that the design was a bit more nuanced. COBOL's design allows a "business person" to read a program and get a sense of what it does. The user personas for "business persons" are roughly commissioned military officers. A non-comm hands the lieutenant the source. The lieutenant hands the source to the colonel managing the program. And an admiral at codeworks inspection can riffle a deck of cards and find something that could be done better. All that verbosity and rigidity is for reading code.

Programs are meant to be read by humans and only incidentally for computers to execute. SICP.


I still have nightmares about COBOL. "Beauty" and "COBOL" do not mix in the same sentence. Forget this article, it's not a nice language at all. I'm not going to crap all over the language -- I understand the requirements of its time -- but now you'd have to be crazy to want to use it.

It's not even true that COBOL programmers earn a lot of money because it's a niche with few programmers still writing it and banks in a hurry to hire them. That's not my experience: the COBOL programmers I know are mostly older guys earning average or below-average salaries. And dealing with banks and similar institutions, which is a hell of its own.

If you like writing software and enjoy nice programming languages, COBOL is not for you. If you want to earn a lot of money or work on interesting projects, COBOL is not for you. If you're a bank manager, maybe COBOL is for you.


> "Beauty" and "COBOL" do not mix in the same sentence.

The author is falling into the trap of assuming that “easy to learn” = “optimal”. COBOL is extremely easy to learn; you can learn it in a few days. That simplicity comes at a price: you outgrow it _really fast_. You can’t even write a generic sort routine or a linked-list data structure in COBOL; the language just doesn’t allow that sort of reuse. I found it almost comical that the author had so little experience with COBOL that he wrote that it lets you avoid repeating yourself. The painful experience of rewriting sort routines every time you had a different comparison condition, or rewriting linked-list traversal code for every structure with a different schema, was the reason everybody started abandoning COBOL for more flexible languages like C (even Pascal was better).


I won't argue the merits of COBOL. I stopped doing it for a living 35 years ago for a reason, but sort is not a good example. COBOL has generic SORT and MERGE facilities built in as verbs. In an IBM environment (i.e. writing real applications) you'd be hard pressed to build something that has better performance than what's provided out of the box.


Those were only good for working with flat files!


You don't need sorting for VSAM or DB2 or their non-IBM equivalents.


Exactly right; all the Cobol jobs went offshore over the past two decades. There are thousands of former Cobol developers in my area alone. Cobol is a cost center companies seek to minimize.

All this talk about Cobol and nothing about the characteristic that makes it truly remarkable: all memory is statically allocated; there is no dynamic memory allocation. That makes streaming XML processing a true nightmare, but it also makes Cobol applications immune to entire classes of security vulnerabilities and exploits that plague other languages. It also makes performance reliable and easy to reason about, with none of the "departures" common in other runtimes (e.g. thread-pool exhaustion, stack overflow errors, heap exhaustion errors). Really dependable from a day-to-day management perspective.


There's this rather sad thread from a decade back of cobol developers discussing being stranded after the jobs dried up.

https://www.indeed.com/forum/job/Cobol-Developer/other-non-c...


Interesting. The problem with COBOL is that it is verbose, like Java. Type, type, and then type some more. Copy and find all the time. Good luck understanding any of it, as it takes a lot of time to read.

I do not like COBOL or Java.


Compared to Java, COBOL is surprisingly direct and readable. It has some silly bits that require boilerplate to overcome, but not excessively so.


> The problem with COBOL is that it is verbose

My mother used to say, sometimes someone's good point is their bad point, and vice versa.

Verbose languages can be harder to write, but easier to read.

On the other extreme is Perl. I found it the easiest language to express myself in. There are so many ways to do something, and in a compact way. But it's very hard to read.


Compared to the mess that is Javascript ... yikes ... no thanks.


Curious what language you would prefer? Just trying to understand your argument better. Assembly?


Python is relatively low on the kind of boilerplate Java has.

(Or if you want to go more extreme, Haskell can really give you some terse programs.)

I'm not the OP.


As someone who has worked with both in a professional setting, Python has a lot of similarities with COBOL, IMO.


As someone who has also done that, the most important thing in common is that they're programming languages. COBOL is a rigid and "square" language, I'm tempted to call its style "COBOLic".

Any language where the dots at the end of statements are crucial part of the logic flow can't be that similar to Python.


Can you explain how? As someone that has worked with both professionally, I can't see many similarities.


I wonder when it will dawn on the COBOL tool vendors that a big chunk of what makes COBOL unpleasant is the all-upper-case presentation. The program is SHOUTING ALL THE TIME.


Has anyone made a syntax converter for COBOL to make it read like C-style language? At the very least, it would be trivial to make a macro to get rid of the magic column numbers.


COBOL was the original “self documenting language.” And that is why many others followed it.



