I do remember thinking it's odd that a line has exactly 80 characters and that putting an asterisk in the 7th column marks the line as a comment.
This is probably not true these days with modern COBOL. But when I first saw Python, which enforces similar restrictions with indentation, and then Golang following suit, I had to smile that such old concepts of enforced style are still seen as useful today.
Often we just have to go back and see what others have done decades ago to get new ideas!
Maybe you know this now, but for those that don't:
The limitation is due to punch cards - specifically IBM-format punch cards (which was pretty much the standard by the time COBOL was developed). Those punch cards had a layout of 80 columns by 12 rows:
Each column represented (more or less) a 12-bit pattern encoding one character on a line of code - so one card per line of COBOL code.
Now - note that the IBM PC (well, the 5100 and earlier terminals too) had a layout of 80 columns by 24 rows. Why? Because you could (in theory) see a virtual representation of two punch cards, one above the other. Whether this was actually used, or whether the reasoning is apocryphal or retcon'd, I'm not sure; I've been told both.
There are also parallels with printers - standard printer width was 80 columns (for regular 8.5-inch-wide paper, sans tractor-feed edges), while greenbar was 120 characters wide (using the regular font on most printers). I'm not sure why "40 extra characters", or half a card extra, but I bet it's related to IBM punch cards in some manner.
It still is. Which column (not row) a statement starts in matters, and it's a syntax error to start a statement in the wrong column. Sorry I don't remember any specific examples - it's been a few years since I last used COBOL.
I remember hearing that this had something to do with how early COBOL programs were encoded on punched cards. Anyway, there was some practical aspect to it that is no longer an issue with modern hardware.
It is not, if you use free source format, which most new COBOL programming is done in. Fixed source format does require the specific columns; free source format lets you start wherever on the line you want, and extend the line for however long you like, but comments start with *>
You should check out IBM RPG and its various iterations, it's a whole lot worse than that.
But yes, you are correct.
So I had to eat a little crow.
Shortly after my gig, they went full-scale IBM and built an IT department, as opposed to just a System/34 in accounting.
So no support calls for me.
One of my first gigs was working for a manager who loved RPG III. To the point where I was tasked with making a data collection interpreter using the same constructs RPG III is well-known for.
To this day, whenever someone mentions "RPG III" and what that project entailed, I think of the other one.
0 - https://en.wikipedia.org/wiki/Rocket-propelled_grenade
Generally, extra tabs and spaces do not change semantics; nor do newlines (though when putting several statements on one line, you must write the semicolons explicitly rather than rely on automatic insertion).
Go files compile and run just fine without any indentation whatsoever. People often assume the formatting enforced by the `gofmt` tool is mandatory for the compiler, but most of it is aimed at human beings, for readability.
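A quick self-contained sketch of the point (not the playground example, just a toy): this compiles and runs with every line flush-left.

```go
package main

import "fmt"

// Indentation carries no meaning for the Go compiler; it only cares
// about tokens and the automatic semicolon-insertion rules.
func main() {
x := 3
if x > 2 {
fmt.Println("compiles fine without indentation")
}
}
```

Running `gofmt` on this file restores the canonical indentation without changing the program's meaning.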
Here's the 'official' blog post about gofmt: https://blog.golang.org/go-fmt-your-code
Edit: downvotes don't change reality ; )
Try it yourself: https://play.golang.org/p/_bXvXryM5Ih — how incredibly ugly, and yet it runs just fine. Please hit format to clean your eyes.
That actually is a syntax error: Go requires the trailing comma (it is not optional) when the closing brace of a composite literal sits on its own line. gofmt only formats syntactically correct code.
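The reason is Go's automatic semicolon insertion: a newline after the last element of a composite literal gets a semicolon inserted, so the comma before a closing brace on its own line is mandatory. A minimal sketch:

```go
package main

import "fmt"

func main() {
	// Multi-line literal: the comma after the last element is required;
	// deleting it is a syntax error, not a style violation.
	xs := []int{
		1,
		2,
		3,
	}

	// Single-line literal: no trailing comma needed before the brace.
	ys := []int{1, 2, 3}

	fmt.Println(len(xs) + len(ys)) // prints 6
}
```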
The above example was just to make a point, and perhaps give insights into how the compiler works.
IMHO, Go's 'enforcement' of common formatting is powerful; it has an incredible compound effect which gains momentum as you get acquainted with the language.
(A friend of mine worked for www.apl.it which was later bought by simcorp.com - the original company worked on actuarial models for insurances and funds).
With that said, real COBOL code is probably much more horrifying when you consider just how much uglier real-world code is compared to small tutorials and review articles. I would really like to read a series of “COBOL War-Time Stories”.
> how much uglier real-world code is
01 MAKES-PROGRAMMING-FUN    PIC 9(9).
01 ALL-VARIABLES-ARE-GLOBAL PIC 9(9).
MOVE 0 TO MAKES-PROGRAMMING-FUN.
MOVE 1 TO ALL-VARIABLES-ARE-GLOBAL.
IF ALL-VARIABLES-ARE-GLOBAL = MAKES-PROGRAMMING-FUN THEN
    DISPLAY '20 LINES OF CODE TO SAY ENJOY COBOL!'
ELSE
    DISPLAY 'USE A BETTER LANGUAGE'
END-IF.
That’s because it was written by somebody who just learned it. It looks fine while you’re learning it; it’s only when you try to solve real problems with it that the weaknesses of its simplicity become painfully obvious.
One of the big drawbacks we had though, and this may have changed since, was that COBOL wanted every variable and structure to be fixed length and defined up at the top of the program. So it was not really a good fit for the modern world of the web, JSON, etc. where you don't always know the length or structure of your input in advance. The versions of COBOL we used wouldn't have been able to process a blob of JSON and pull out a particular key for example without having the full JSON structure defined beforehand.
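For readers who never used it: a COBOL record is a fixed-width layout known at compile time, closer to slicing columns out of a line than to parsing JSON. A rough sketch of that style in Go (the record name and column widths here are invented for illustration):

```go
package main

import (
	"fmt"
	"strings"
)

// COBOL-style fixed-width record: every field occupies known columns,
// so "parsing" is just slicing. (Field names and widths are made up.)
type Customer struct {
	Name    string // columns 1-20
	Balance string // columns 21-29
}

func parseCustomer(line string) Customer {
	return Customer{
		Name:    strings.TrimSpace(line[0:20]),
		Balance: line[20:29],
	}
}

func main() {
	c := parseCustomer("ADA LOVELACE        000001234")
	fmt.Println(c.Name, c.Balance) // prints: ADA LOVELACE 000001234
}
```

Anything variable-length - a JSON blob whose structure isn't known in advance - breaks this model, which is exactly the mismatch described above.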
I always joke w/my colleagues that my copy of "COBOL For Dummies" on my bookshelf is as good as my 401k as a retirement plan.
Haha, same here! I owe Y2K my first paid programming job.
> It was a fun language
Nope. You lost me there.
COBOL still gives me nightmares. It was the most profoundly unfun language I've ever had the misfortune to be forced to work with.
COBOL was chosen by the customer, as they were running an IBM System/36, and the implementation was capable of calling the assembly code almost directly.
Each compile would take around 20 minutes, which teaches you to be careful with the syntax! When the source code got beyond 64k, we had to build the app in sections, as that was all IBM Edit could handle at once. As the article shows, COBOL is quite verbose, so 64k isn't that large!
In later years we moved to Micro Focus COBOL, which was much better than the Microsoft version.
I can't imagine any modern programmer having any problems learning COBOL sufficiently well in a few days. It was my third language after BASIC and assembler.
[*] TSR = Terminate and Stay Resident program, which you loaded into MSDOS first, prior to running the main (COBOL) program. A very poor-man's "multi-tasking".
The same thing was thought of assembler programming a few decades ago; now it is rather a lost art. I fear the day will come when machine learning algorithms can write code on their own, and coding itself will become a lost art, as assembler is now.
COBOL was intended for business cases for which FORTRAN was ill-suited, and COBOL shops generally didn't need science/math implementations.
Even PL/1 and RPG required a degree of programming skill.
Of course, the end result was that we ended up with just one layer of abstraction and programmers were in more demand than ever before because of their increased productivity, and the 'managers' ended up being programmers themselves.
This then led to the schism between 'systems' programmers (those that understood assembly) and 'applications' programmers, those that only worked in high level languages.
When I started working professionally in IT this division was still quite visible.
This is incorrect. COBOL stands for “COmmon Business Oriented Language.” It can be argued that COBOL was the first widely used compiled language, but to say it was meant to “do away with programmers” is to ignore that application programmers were the intended beneficiaries of no longer having to work strictly in machine code.
I think you nailed a rather fundamental trait (spectrum) in programmers that carries on, morphing along with times.
From systems to business logic, or from the 'container' (pun not intended) to the 'content', and dare I generalize: from a strictly technical eye to a strictly goal/business mindset (again, this is a spectrum, everything in-between exists, all flavors contribute to make a world).
The cliché would be your low-level C/Rust guy versus some frontend Flash/Js person; but this is oversimplifying (the C person might be very goal-driven with a business approach; the Js person might be a performance expert). Reality speaks more to an engineer's worldview, so to speak: from "bottom-up" (those who assemble elementary pieces, who need to crack the puzzle by understanding each component, like Lego) to "top-down" (those who rather manipulate large abstract/complex entities as 'black boxes', working on their structure, the graph, the dimensions, and shifting behavior with surgical, cost-effective changes).
Oh just my 2cts I'm probably rambling. I just know that these two extremes on the spectrum are what we battle with for every decision we make, if we are aware enough of both sides. (Is that a blessing or a curse?.. you tell me!)
I think it's why, deep down, we still (and might always) search for 'the next C' (cue Rust, Go, D, depends on use-cases I suppose), or why some people are almost principled advocates of the WebAssembly paradigm in complement or replacement of the Js ecosystem. These are the technical people of the spectrum. The high-level engineers are more about pushing for interoperability, available skill pool, maintainability.
It's just that 'programming' as a skill now pertains to three dozen very different jobs so the general line is much fuzzier than in the 1950-60-70's; but I think it's a product of human intelligence, whether how we think or how we build our tools, our machines. Historical parallels, yadi yada.
The mainframe situation, I think, is even more telling. Every programmer under the Sun has thought of ways to replace COBOL, and there are decent candidates. But the business side of us, of above, of reality, has mainframes still running on COBOL. Because as a 'black box', it works; the benefits of a rewrite have not yet justified their cost.
I know a solid team usually requires at least 1 of each 'sides' of the spectrum, keeps everyone else honest (as engineers in a problem space).
Granted, we're precisely what PL authors had in mind from time to time (eg with SQL): nonprogrammers who can write domain code. But because our training goes into proving big-O complexity, etc. front-end jockeys look at us as if we're capital-P programmers.
This doesn't contradict the parent comment. They claimed that COBOL was intended to be used by non-programmers, not that it necessarily succeeded.
Having said all that, for what little it's worth, in a software history course I was taught that COBOL was intended to be written by programmers (in the full-time professional sense) but be readable by managers. I imagine that it did not even succeed at that.
"Representatives enthusiastically described a language that could work in a wide variety of environments, from banking and insurance to utilities and inventory control. They agreed unanimously that more people should be able to program and that the new language should not be restricted by the limitations of contemporary technology. A majority agreed that the language should make maximal use of English, be capable of change, be machine-independent and be easy to use, even at the expense of power."
Emphasis added. The machine code that went before was slow and very expensive to develop, and required detailed specifications to be handed to the programmers in order to be able to implement the software. COBOL was clearly an attempt at increasing the accessibility of programming for a larger audience than those that felt comfortable with machine language.
This succeeded admirably, many people joined the ranks of programmers with COBOL as their first language in a large variety of business applications. The jump in abstraction level between machine language and COBOL was very large.
Programs are meant to be read by humans and only incidentally for computers to execute. SICP.
It's not even true that COBOL programmers earn a lot of money because it's a niche, few programmers still write it, and banks are in a hurry to hire them. That's not my experience: the COBOL programmers I know are mostly older guys earning average or below-average salaries, and dealing with banks and similar institutions, which is a hell of its own.
If you like writing software and enjoy nice programming languages, COBOL is not for you. If you want to earn a lot of money or work on interesting projects, COBOL is not for you. If you're a bank manager, maybe COBOL is for you.
The author is falling into the trap of assuming that “easy to learn” = “optimal”. COBOL is extremely easy to learn; you can pick it up in a few days. That simplicity comes at a price: you outgrow it _really fast_. You can’t even write a generic sort routine or a linked-list data structure in COBOL; the language just doesn’t allow that sort of reuse. It struck me as almost comical that the author had so little experience with COBOL that he wrote that it lets you avoid repeating yourself: the painful experience of rewriting sort routines every time you had a different comparison condition, or rewriting linked-list traversal code for every structure with a new schema, was exactly why everybody started abandoning COBOL for more flexible languages like C (even Pascal was better).
All this talk about COBOL and nothing about the characteristic that makes it truly remarkable: all memory is statically allocated; there is no dynamic memory allocation. That makes streaming XML processing a true nightmare, but it also makes COBOL applications immune to entire classes of security vulnerabilities and exploits that plague every other language. It also makes performance reliable and easy to reason about, with none of the "departures" common in other runtimes (e.g. thread-pool exhaustion, stack overflows, heap exhaustion). Really dependable from a day-to-day management perspective.
Do not like COBOL or Java.
My mother used to say, sometimes someone's good point is their bad point, and vice versa.
Verbose languages can be harder to write, but easier to read.
At the other extreme is Perl. I found it the easiest language to express myself in: there are so many ways to do something, and compactly. But it's very hard to read.
(Or if you want to go more extreme, Haskell can really give you some terse programs.)
I'm not the OP.
Any language where the periods at the end of statements are a crucial part of the logic flow can't be that similar to Python.