Hacker News
Ask HN: Do you use an old or 'unfashionable' programming language?
187 points by open-source-ux on Jan 30, 2016 | 327 comments
I couldn't think of a better word than 'unfashionable', but what I mean by this is a programming language that is not new or up-and-coming and doesn't have much traction.

The language may have unique or novel features, it may be a language with a passionate and dedicated band of programmers. But one thing the language does not have is much 'mindshare' amongst programmers: its time in the spotlight has passed. It may still be in active development, or it may be moribund.

Examples of 'unfashionable' languages include: Cobol, Snobol, Icon, Unicon, Forth, Pascal, Eiffel, D, Smalltalk, Basic, etc. (Note: I realize this is subjective to a degree.)

If you use an 'unfashionable' language, what keeps you using it? Is it a unique feature? Is it familiarity or comfort? Is it speed or performance or some other quality? What do you think we could learn from that language when developing programming languages today?




My dad uses and has used BASIC.

He's not a teacher, nor a hobbyist.

He writes bank software. In BASIC. For forty years. At the same company. UBS, Bank of China, and other names all run this massive million(s)-line BASIC codebase, which he almost singlehandedly wrote, on minicomputers[1] powered by OpenVMS. With it, according to my father, they process billions of dollars' worth of transactions and other facets of their business. Every day. Surprisingly for enterprise software, his customers love it. They are only being forced to move to a completely new system because HP went and deprecated their entire minicomputer line.

Given this and other parts of the story-- perhaps better saved for a blog post-- I've always seen him as my 10x engineer.

[1] https://en.wikipedia.org/wiki/Minicomputer, just so we're all on the same page.


> perhaps better saved for a blog post

Please do write it, and submit it to HN! This is the sort of material nearly all of us would love to read.


It'd certainly be interesting to relate a multi-decade software project from the outside :-)

I will have to talk to my father about this... and explain why folks on the internet must hear his stories.

In the interim, would anyone happen to have an example of a good secondhand account of a software project? A basis or, at the least, an inspiration for how to proceed.


Joe Armstrong's paper on the history of Erlang (of which he was one of the authors) is superb (though it's less about corporate culture than about the language): http://cobweb.cs.uga.edu/~maria/classes/4500-Spring-2010/pap...

There's The Mythical Man-Month: http://www.amazon.com/The-Mythical-Man-Month-Engineering-Ann...

Showstopper, the book about the development of Windows NT, is great: http://www.amazon.com/Show-Stopper-Breakneck-Generation-Micr...


Thank you kindly. I knew about The Mythical Man-Month, but the others are completely new to me.


My example is not explicitly software-related, but it's very very close.

https://www.reddit.com/user/36055512/submitted

Start oldest first, bottom of the list, and go upwards for chronological order (identical titles are to different subs, so only comments will be different).

Last I heard, a book was in the works. I'm not sure if it's surfaced yet, but these posts make for a good few hours' reading. They're great.

The secondary suggestion here is that finding a forum that will appreciate the posts and give feedback - note the upvotes and sheer quantity of gold - can be incredibly encouraging. This format also benefits from the fact that there are so many posts on a daily basis that all the readers have plenty to keep busy with while they wait, so each writer can establish a lazy pace of writing and submitting that isn't pressured or hectic.

As for continued inspiration, https://www.reddit.com/r/talesfromtechsupport features some great technical writing. Sort by top/week or top/month to find quality examples and the latest top authors (top/all is also awesome, but changes there are months apart) - just try not to drown in the firehose of cringe-inducing idiocy reporting! xD

The idea here would be that the subreddit's sentiment is grokked, inspiration happens, and a new writer appears :D

(Stop by Reddit's TOS and overview, and do NOT skip installing Reddit Enhancement Suite if you decide you like this particular site.)

Wherever the content ends up - a subreddit like TFTS, another messageboard or forum, or a blog - definitely please do link it. :D


Thank you & any technical blog post I write will certainly get submitted through here!


Not just a software project, more of a hardware one. But a good read, IMO:

The Soul of A New Machine

by Tracy Kidder.

About the race to build a new minicomputer, by a team at Data General.

https://en.m.wikipedia.org/wiki/The_Soul_of_a_New_Machine

Prize winning book.


This is actually my favorite engineering book of all time.

Excellent to bring up!


Cool. My favorite part of the book is this:

[ Many of the engineers state that, "They don't work for the money", meaning they work for the challenge of inventing and creating. The motivational system is akin to the game of pinball, the analogy being that if you win this round, you get to play the game again; that is, build the next generation of computers. ]


That rewrite sounds like a great project to pair with junior developers on.

It has all of the optimal features:

1) You pretty much have to TDD, but it isn't hard to, because you can start by writing tests that the old software passes as your "spec".

2) It actually matters if it works. As a junior dev, one of the hardest parts about giving a shit was that the tasks I was given often didn't matter.

The quality of my code when I was younger scaled exponentially with the challenge and meaningfulness. As is true with many immature youths, I only acted like it mattered when it mattered.

3) A senior engineer would likely enjoy refactoring/rewriting something like this. Focus on getting it right, take time to craft elegant solutions, etc. Plus you get to teach someone.

4) Large company with resources and domain to take a long-term outlook on the problem. UBS isn't going anywhere (I hope), so they are incentivized to take longer to write a solution that lasts another few decades. They can also take the time to train a junior developer without worrying that they'll miss their metrics for the next funding round.

Honestly it sounds like a blast. Sign me up!
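
Point 1 is basically "characterization testing": record what the legacy system currently does and treat that as the spec for the rewrite. A rough sketch in Python (the function and file names here are made up for illustration; the thread doesn't describe any actual test setup):

```python
import json
import subprocess

def run_legacy(amount: str) -> str:
    """Hypothetical: shell out to the old program and capture its answer."""
    return subprocess.run(
        ["./legacy_calc", amount], capture_output=True, text=True
    ).stdout.strip()

def record_golden(inputs, path="golden.json"):
    """Run the legacy system once; its answers become the 'spec'."""
    golden = {i: run_legacy(i) for i in inputs}
    with open(path, "w") as f:
        json.dump(golden, f, indent=2)

def check_rewrite(new_impl, path="golden.json"):
    """The rewrite passes only if it reproduces every recorded answer."""
    with open(path) as f:
        golden = json.load(f)
    return all(new_impl(i) == expected for i, expected in golden.items())
```

The nice property is that you never need to understand the old code to write the first tests; you just need to be able to run it.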


All excellent points; I don't disagree with anything except for the last one.

> Large company with resources and domain to take a long-term outlook on the problem.

My father works for company X that sold this system to banks; UBS and the other big ones didn't have a hand in the development other than testing and specifications.

Additionally, company X and their clients suffer from enterprise-levels of political dysfunction and mismanagement, e.g., "let's fire our internal QA team and replace them with underpaid contractors from <insert third world country>. what could go wrong?", "the new system being developed by our new college hire employees in <another insert third world country> is 10 times slower than the old one? Well we'll just run it on 10 times more expensive hardware!"

I could go on and on about the general shortsightedness, perhaps in a blog post[1]. Suffice it to say: probably not the best idea to get signed on at the moment :-)

[1] The perhaps is saving me from commitment.


To elaborate a bit further, I feel like the best way of stating it is that in large organizations changing or allocating resources to anything (no matter how obvious) needs a sponsor in the form of a person.

That sponsor typically signs on because they really believe the thing needs to get done and/or expect that if it succeeds they will look good.

Rewriting working code is a hard project to find a sponsor for, unless you're lucky enough to have software engineers with clout high up in the hierarchy.


I assume the original system is Business BASIC or something like it. I used that in one of my first dev jobs in the 1990s at an investment bank. All their internal systems were written in Business BASIC. They had moved off of the original vendor's hardware and were running it on PA-RISC HPUX servers.

Anyway -- the problem with developing tests for the old software is that there is no unit testing framework. You'd have to develop that too and unless you really invested in it you would not have stats like code coverage. Mocking out parts of the system would likely also be difficult because the code in such systems is normally spaghetti and not well abstracted.

If you're lucky they have some standard test data that can be used to run through system tests -- but that's at a much higher level than TDD would ideally want. And I know the shop I worked at had nothing like that back in the day. Everything was tested by hand.


Vms basic, most likely.

Main problem is not the lack of frameworks, it's a whole different culture of how to test...

Unfortunately for the banks, they've not managed to maintain their withering infrastructure, for a lot of reasons, and are just now realizing the threat from 'fintechs' that are run by people who actually understand software development.


Not a surprise. There's a lot of that stuff around, on OpenVMS. There are new systems — yes, using BASIC on OpenVMS — being installed, too.

There are new Itanium servers with support added last year, and another (final) generation is planned. In preparation for the end of Itanium, an OpenVMS port to x86-64 is presently underway, and a new OpenVMS release (for Itanium) is scheduled to arrive this March.

https://vmssoftware.com/products.html


I was an OpenVMS sysadmin and C/COBOL/BASIC programmer, writing code mostly for storing tons of data and sending it off to other places. Hopefully though, now that HP has spun off the VMS engineering team and there's talk of an amd64 port coming in 9, they won't have to move.


"He writes bank software. In BASIC. For forty years. At the same company"

I would have gone insane :)


He didn't say his father was still sane ;-)


I had a good laugh at your comment and the parent's because I think I've said "I would have gone insane.", or "Are you insane?" to him more than a few times.

And yet he's more sane than me. He's a funny fellow.


True... No disrespect, but I wouldn't be surprised if the company hasn't actually existed for 20 years already ;))


Man, someone should just build an open source clone of HP's platform and keep the dream alive.

If only there were a drifting entrepreneur nearby interested in a doting customer base of Swiss banks wrapped up in a bow....


There are some products that can generate C++ from VMS BASIC and implement a VMS-like runtime on Linux...


My TI 89 runs BASIC.

...but it also gives me access to great symbolic manipulation tools, which more than makes up for it.


Write a compiler for this dialect of BASIC for Linux.


What dialect of BASIC?


Pascal - still use it today.

I started with Turbo Pascal 3.0, went through several versions of Borland Pascal (DOS and Windows), several versions of Delphi, and am now using FreePascal. I do most of my work on Linux nowadays. Pascal still gets everything done and after 20+ years I am really fluent in it.

Did a little bit of Javascript with Node.js, also looked at Golang. While I did like some aspects of them, there was always the downside of having to learn a whole new ecosystem. Didn't like the dynamic typing of Javascript. Golang looked better for what I do. But in the end I just wrote a small Pascal library to emulate Golang's channels which I really liked, and that was the end of Golang for me.

My main project in Pascal is a search-engine: https://deusu.org

Sourcecode for that is on GitHub: https://github.com/MichaelSchoebel/DeuSu

So yes, you can (still) do actual stuff in Pascal. Even pretty cutting-edge stuff.


Holy cow! I used Pascal at college about 16 years ago. I never thought I would see it used in the wild, especially in 2016. Nicely done!


Some other in-the-wild examples: the WHATWG has a tool called Wattsi that's written in Object Pascal: https://github.com/whatwg/wattsi

Interesting blog post from last year on someone's experiences developing an application in Object Pascal: http://ziotom78.blogspot.com/2015/01/lfi-data-analysis-with-...

Free Pascal itself: http://www.freepascal.org/


I taught Pascal at college about 40 years ago.


This is pretty awesome, and the speed is great.

I really like obscure search engines, because their (comparatively :P) tiny indexes mean they'll return significantly different data from what, say, Google or Bing would return.

If there's one thing I would definitely vote for with projects like this, it would be... an API. In a world where virtually every search engine is built around the idea of controlled, non-open access, a predictable, documented, free-access API would be a breath of fresh air, and you'd get people seriously interested in this even though it's small. (You'd definitely need rate limiting though because there are always the hangers-on with the low IQs. >.>)

You may (or may not, no idea) also be interested in the Common Crawl corpus - http://commoncrawl.org/the-data/get-started/ - which contains a few PB of data collected over the past few years (sorted by date). Access via HTTP is only moderately involved: read the line immediately above "Data Format."

Of potentially much less interest is a 2012 80TB snapshot Archive.org are (or at least were) offering; they seeded their indexer off the Alexa top 1m and boxed up whatever came back. https://blog.archive.org/2012/10/26/80-terabytes-of-archived... It's apparently only available on request (I expect a search engine would definitely qualify, especially considering the data is 4 years old now - although I definitely want to see what's in it if you do decide to chase this - serious :P).

Finally, search for "cheetah project" - the 2nd top-level result is messed up, apparently the Chinese (Unicode) got scrambled. Just a note; you may already be aware.


Non-ASCII characters are indeed messed up. I wrote the HTML parser about 15 years ago. Back then my programming style was so bad that I'm afraid nothing short of a complete rewrite of the parser will be able to fix it.

The Archive.org crawl is way too old to be of any use. And the Common Crawl corpus is only marginally bigger than what I crawl myself. There are currently just over a billion pages in my search-index and I can recrawl those about every 45 days. Crawling right now at about 650 URLs/s and 200mbit/s. Only about 60% of URLs crawled end up in the index. The rest are errors, timeouts, redirections, etc.

I have been thinking about adding an API. Rate-limiting is a must as you have correctly said. I'm already in a constant fight against SEOs who try to scrape the search-results and who apparently don't see anything wrong in making 10+ queries/second.

The API would definitely not be a free-for-all, but sort of like "send me an email and describe what you want to do with it, and if I like it I'll give you an API key". Together with a primer on what not to do, of course. :) If people do more than a certain number of queries/month, then it would also have to be paid.


Ah. 2001 was... the era of HTML4, more or less, so it's still seeing the majority of page content, but... no Unicode. I see.

I figured old data was uninteresting, but TIL that crawling data yourself is not impossible. I mean you have to come from a thousand IPs, browser UAs and cookie states to find everything thanks to the sad state of things, but wow, I didn't realize 200Mbps could do so much. :D

As for the errors et al, that makes me think, I wonder what would happen if you saved that data and folded it into the seed list (with maybe 3 or 4 URLs up the chain leading to that link) for next run.

Agh... so you already have leeches :( all I can suggest is maybe mixing up your HTML, or maybe putting your site behind CloudFlare and, if there's an option(?), setting the attack detection sensitivity all the way up. I have no idea if they can do this.

10 queries/sec... that's... not going to be too great on the hardware :( I mean these people can just crawl the info themselves and get more up-to-date content >.< wow

By "free" API, I don't mean free-for-all "here, have at my bandwidth/hardware" xD - I mean more in the sense that curious shy CompSci types can maybe autogenerate themselves a basic-level API key to experiment with ultra-low-rate data requests here and there, and request a threshold upgrade on their key if they think they have an interesting/justifiable use for the data.

And it would be great if this could be a revenue stream! Have you ever featured the site on HN?

PS. Constructive criticism on design: I would personally alter the page styles a little; it currently looks nothing like what I associate "search engine homepage" with based on what I've learned to expect, and the page style, while very nice, makes me feel like the site is a blog, not a dynamic search engine.


Crawler works from a single IP. User-Agent is fixed to the robot's UA. Cookies are totally ignored. The search-engine works with just 2 servers. Crawler/Indexer and Webserver/queries. Crawler is a root-server with 1gbit/s connection hosted in a datacenter. Webserver sits here at home with 200mbit downstream and 20mbit upstream.

I use the Alexa top-1-million sites as seed-list for the crawler. The errors that do appear during the crawl are either sites that have an outage or more likely simply dead-links. Oh, and URLs that turn out to be blocked by robots.txt. There are a lot of sites out there which block anything but Google and Bing from crawling them.

Cloudflare is not an option for me. It would let Cloudflare know what my users are searching for. VERY big no-no. :)

I can filter out 99% of automated queries. Luckily they are still pretty dumb at the moment and give me enough fixed clues to identify them.

I like your idea of keeping the API free with a very low request-rate. That could work. I would have to find a way that they can't just generate many API-keys though. Using captchas for API-key requests won't stop them from doing that.

I posted a "Show HN" about a year ago. It brought in about 1500 extra visitors and got up to 9th place on the HN homepage that day.

New webdesign is already done. I have a German site too. https://deusu.de which actually gets 90% of traffic. That site already has the new design.


1Gbps for the crawler totally explains it: I can see that doing 600 URL/sec. xD

I didn't think of CloudFlare being able to see the traffic... and wow, I never even processed that aspect of their service. But of course...

How good is Google's "[ ] I'm not a robot" checkbox thingy at weeding out bots? And perhaps you could use multiple captcha systems...? (Or are actual people tasked to do signups?!)

I shudder to think of such an idea, but linking API keys to <popular login-with/connect-with-this-site API> may be an alternative. (One thing that comes to mind is that, if someone authenticates using Reddit - which they can do without releasing any account info - is that you could check their (public, but unfakeable) karma counts and use that as a measure of confidence, in addition to the standard account age metric used everywhere.)

The new design is nice :D

And if it's been a year (!), another Show HN sometime would certainly be fine.


The WHATWG had a spec-compliant HTML parser in Pascal, if you need it. Currently only does UTF-8 but that should be fixable.


The HTML spec's preprocessor is written in FreePascal: https://github.com/whatwg/wattsi


Pretty cool! Ever tried Modula-2 or -3?


Did some baby steps in Modula-2 out of curiosity about 20 years back. The IDE and compiler were horrible compared to Turbo Pascal, so I went right back to that.


My favourite language is FreePascal's dialect of Object Pascal (which is largely derived from the work Borland did with Delphi, which itself is largely derived from Borland's Turbo Pascal).

It has all the benefits of a modern language -- strong typing, generics, operator overloading, objects, metaclasses, dynamic arrays, interfaces, RTTI, dynamic dispatch, method pointers, type helpers, true modules, fast compile times, preprocessor directives, pretty large set of libraries from the community, etc -- and all the benefits of a compiled language -- compiles to pretty small binaries, small memory footprint at runtime, tight control over memory allocation and layout, trivial ability to shell out to assembler when necessary, etc. It builds to pretty much every platform under the sun -- Mac, Windows, Linux, Android, iOS, MSDOS, OS/2, BeOS, Netware, you name it.

Some of its features are features I've rarely seen in 'fashionable' languages. For example the way classes and metaclasses work in FreePascal is great. You can pass classes around by value, constructors can be virtual, even static methods can be virtual.

In many ways it's really a better C++. For example, what I just said about metaclasses. It has native support for strings. The declaration part of a module (unit interface) is separate from its implementation, so it's easy to get an overview of what is exported from a module, but it's in the same file and is a first-class-citizen of the language, unlike C++ header files. The syntax also makes a lot more sense than C-like languages. For example, there's an explicit keyword to introduce a function, and constructors are named rather than using the type name. (These don't sound like interesting features until you're trying to navigate the source and you find that searching for a method becomes far easier, or you're trying to copy/paste a class and you find you don't have to worry about renaming constructors, stuff like that.) It doesn't even have to link in a libc, it has its own RTL. (It can link in a libc if you want to use some C or C++ libraries with it though.)
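
A tiny sketch of that metaclass point, in FreePascal/Delphi-style syntax (the names here are invented for illustration, not from any real codebase):

```pascal
type
  TShape = class
    constructor Create; virtual;    // constructors can be virtual
  end;
  TCircle = class(TShape)
    constructor Create; override;
  end;
  TShapeClass = class of TShape;    // a metaclass ("class-reference") type

constructor TShape.Create;  begin end;
constructor TCircle.Create; begin inherited; end;

// The class itself is passed by value; the caller decides at runtime
// which class gets instantiated.
function MakeShape(Kind: TShapeClass): TShape;
begin
  Result := Kind.Create;  // dispatches to the actual class's constructor
end;

// MakeShape(TCircle) yields a TCircle; MakeShape(TShape) a plain TShape.
```

Doing the same thing in C++ generally requires hand-rolled factory functions or registries, since C++ classes aren't first-class values.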

There's aspects of it that are a bit crufty from having decades of legacy, but it's probably still better than C++ in that regard.

I don't really know why Pascal derivatives haven't taken off more.

At work I use Dart, which has many of the benefits I listed above, but in the form of a GC'ed VM-hosted language. Dart's a much newer language, though, so regardless of its usage it's probably too early to know if it will end up in the "unfashionable" category. Dart has all kinds of cool features like the ".." operator for chaining, the "??=" operator for assignment-if-null, the "with" feature for doing composition rather than inheritance, etc. Also very readable.
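
For anyone who hasn't seen those two Dart operators, a minimal illustration (just a toy `main`, nothing from a real project):

```dart
void main() {
  // '..' cascade: several calls on the same object without repeating it
  final buf = StringBuffer()
    ..write('hello')
    ..write(' world');

  // '??=' assigns only if the variable is currently null
  var retries;
  retries ??= 3;
  retries ??= 99; // no effect; retries is already 3

  print('$buf, retries=$retries'); // hello world, retries=3
}
```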


Could you refer me to a good introductory-to-intermediate resource for learning modern Pascal? I currently work on my projects in C (quite fluent) and am constantly looking for a language that offers proper type safety, generics, and a good module system without being overly complex (e.g., C++). Pascal and Object Pascal seem like a really good fit, but I can't get my hands on a high-quality walk-through of the language implemented by FreePascal and its standard library.


I use a language originally called MUMPS, with the alternate name of M, which is the core language in Cache Object Script.

I use it because it has features that are still not (really) included in more "modern" languages, because it is actively used in niche markets, and because I've used it for 34 years, so I have a lot of experience in it and tools written in it.

It is incredibly terse, which allows for use in tight memory situations, and it is high-level enough that the implementor can optimize persistent storage to get speed gains that other languages can only meet by including very low-level code. It has built-in concurrency, again at a high level which allows the implementor a lot of leeway in making things more efficient.

As a niche language, it has specialized uses in medical computing and in financial computing. Its sparse data structures organized around strings of characters are very well suited to dynamic data (as a NoSQL datastore), and its handling of transaction processing and concurrency allows retractable computation as well as ACID data storage.
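
To give a flavor of that terseness and the string-subscripted sparse storage, here's a made-up fragment (the `^PAT` global and its layout are purely hypothetical):

```mumps
; globals (names starting with ^) are automatically persistent on disk
s ^PAT(42,"name")="SMITH,JOHN"
s ^PAT(42,"labs","2016-01-30","K")=4.1
; commands abbreviate to one letter: s(et), w(rite), f(or)
f i=1:1:3 w i,!
; $ORDER walks the sparse subscripts in sorted order
s d="" f  s d=$o(^PAT(42,"labs",d)) q:d=""  w d,!
```

Only subscripts that have actually been set occupy storage, which is why the same structure doubles as a hierarchical database.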

I am proud to call myself a MUMPSter.


I have used MUMPS and derivatives for most of 35 years. Although my first company produced library information systems, most of the rest of my employers have used it for medical information systems, and for me that was mostly laboratory information systems. I knew one guy who ran a multi-user system for accounting off of a PC-AT!

Spent 4 years using Fortran on a similar application. I sure missed MUMPS when having to compile programs and having errors caused by moving too many spaces into a variable!

I've also dinked around with Forth and Icon/Unicon, which I used to create small programs, one which created Valentine bingo cards and another which I used to transform COBOL screen descriptions from mainframe COBOL to a version which ran on PC's.

Sometimes I do wish for a change, but overall I've enjoyed my career.

Perhaps someone in the future will feel sorry for those who now work with C++/C#/Python/Ruby/etc.

It's provided for me for a long time. Can't complain. And provided me a way to solve problems and get paid for it.


Can't have a mention of MUMPS without the classic:

http://thedailywtf.com/articles/A_Case_of_the_MUMPS


There used to be a company called IDX which sucked in many MIT grads into the world of MUMPS. GE bought them.. do you work for GE?


Or Epic healthcare in Madison; that was my first job out of college, and it continues to grow at an insane rate.

MUMPS was a kinda cool language - I liked it a lot better than VB6, which the GUI was written in. Every line had to start with a command like 'set' or 'do', which could be abbreviated to 's' or 'd', which looked pretty gnarly. In the `for` loop syntax there was a meaningful difference between 1 and 2 spaces after the 'f(or)' :o There was a max file size, and filenames and function names were limited to something like 6 characters, so things were named pretty cryptically. It had a built-in object database and basic array and hash data structures, which were neat though.


I'm there right now.

I think the only reasonable alternative would be something like erlang at this time.

I see M as still used because it is without fail very predictable in resource usage and behavior, and with certain patterns, not prone to fail or crash. It can also be updated at runtime without users having to restart or reconnect in most cases.

I suggest Erlang because it has similar desirable behavior for error isolation and predictability.


I left there a while back, and I recently tried installing GT.M for old times' sake. It's not really any fun without Chronicles, unfortunately.


Ha, good old Cache. I did that for four years out of college. My friends and I couldn't stand the language and couldn't wait to leave it. I realized later that it was ahead of the curve on NoSQL, but I never heard whether InterSystems managed to ride the wave or is mostly hanging on with existing business.


Is this Cache the same Cache object database that used to be advertised in mags like DDJ and CUJ?


That's the one. The database is kind of like a giant arbitrarily deep hash table of strings and numbers. At some point they added objects to the programming language and made a way to load and save them to the database. Voila, object database.


Don't be embarrassed! The European Space Agency uses Caché!


In the past 18 months I've been back working with my father, building web and mobile apps to interact with hardware devices used for data logging in agricultural and environmental research applications.

He's been building these kinds of devices for almost 35 years; he basically built his business (and supported our family) on the back of Motorola 6800-series microprocessors, and now uses the Freescale S08 series, which is descended directly from the 6800 (8-bit, von Neumann architecture).

To this day, he codes everything in Motorola/Freescale assembly language.

In his mind, it's the only way to keep his code adequately lean, efficient and maintainable. He's never been interested in C, as he thinks (from observing younger engineers using it) it just gets in the way.

And while the coding and debugging process seems very painstaking, the results are excellent. His devices sit out in the field taking hourly readings for months or years at a time, never failing and going for over 12 months per battery change.

Initially I thought he was just stuck in the past and that his refusal to embrace more modern hardware platforms and higher-level languages was a disadvantage.

But now I see how reliably and efficiently his devices work, and how inexpensive the components are (the microprocessors we use cost just a few dollars each), I see it as a huge advantage over our competitors, whose products are much more costly to manufacture, much more power-hungry and less reliable.


I have this theory that if Moore's law stagnates for a while, and we can no longer count on hardware getting exponentially cheaper and/or faster, software is going to start competing by using either leaner languages or better techniques. I think a lot of our (mis)use of hardware is simply because computing power is so cheap we can waste it.


Power budgets for software are already a thing that exists. It's not just about stuff getting cheaper and faster -- portable/battery-powered devices are becoming increasingly important.

So you're right, but not only for the reason that you think you're right :-)


I used to code a lot in Fortran 77, but have been programming in Modern Fortran for the past few years (90, 95... I am not sure which standard I am adhering to exactly). Both of these would also be considered unfashionable.

Part of the reason is that it's still the standard in chemical engineering and atmospheric science for large-scale simulation where speed is an issue. It has a long history of compiler optimization and numerical-precision work that predates even ANSI C, and because the language is simpler (than C), it's easier for scientists to code up fast programs. Also, because of its 1-indexing and array notation, it's easier to translate math to Fortran code.

With the recent module system there is a feature like Python's 'from module import *' ('use module') which when people use will make the code hard to follow, but when invoked with 'use module, only: var1, var2, foo, bar', you can make what's imported more explicit.
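
Side by side, the two import styles look like this (module and variable names invented for illustration):

```fortran
module physical_constants
   implicit none
   real, parameter :: pi = 3.141592653589793
   real, parameter :: g  = 9.80665
end module physical_constants

program demo
   ! 'use physical_constants' alone would pull in everything,
   ! making it hard to tell where a name came from.
   ! The explicit form imports only what you name:
   use physical_constants, only: pi
   implicit none
   print *, 'circumference of unit circle:', 2.0 * pi
end program demo
```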

Other than that, I just find it very aesthetically pleasing and makes me very happy to code in it. I have heard they have string processing "as good as C" now, but I haven't used it much.


I've been learning Fortran 90/95 for my computational astrophysics class, and the language is proving itself to be surprisingly modern and capable. There are a lot of quirks coming from C/C++, but it's mostly syntax preferences and nothing huge. It's also very good at most of the problems that I've thrown at it which require high-performance computing of some sort.


Fortran used to be the standard language available everywhere. This led to weird things like text games and compilers being programmed in it. For example: http://jerz.setonhill.edu/intfic/colossal-cave-adventure-sou...


Good article describing Fortran's historical dominance:

http://arstechnica.com/science/2014/05/scientific-computings...


I've been using Fortran 90/95 extensively over the last few years, for numerical simulation of quantum systems. I think there's a lot of misconceptions about Fortran (which is partly due to Fortran 77, with its punchcard format and spaghetti code). Fortran 90 is basically a different language, and is actually reasonably modern. I see it as significantly more suitable for numerical programming than C/C++. Fortran is conceptually much simpler than C. Having multi-dimensional arrays at the core of the language and not having to deal with pointers makes it much harder to shoot yourself in the foot, and yields more robust and easier to read code. Also, Fortran beats any other language (including C) in terms of speed. I would not recommend using it outside of numerical applications, but in its niche, it's pretty much perfect. For more high-level tasks I mostly use Python.


I also worked on a quantum (DFT) code in Fortran 95. I agree on your points, once you get used to it, it's basically isomorphic to C but with better numerical support. Compared to C++ it's verbose but understandable and with little magic, which is an advantage when working on a team, similar to Java. Main disadvantages I found were rubbish string and IO handling, no standard library of algorithms/data structures (lists, sort, etc.), and (in my codebase) there was still a bit too much reliance on global variables. Most of those are effectively fixed as the high-level controlling code is moved to Python and the F95 code is libraryized.


> Most of those are effectively fixed as the high-level controlling code is moved to Python and the F95 code is libraryized.

That's exactly how I think high performance numerical code should be structured. Unfortunately this thinking hasn't quite arrived in the Fortran world yet, where most don't really think about a two-language approach.


First-year students in our maths curriculum here are still taught Fortran 95.

> not having to deal with pointers

There are pointers in Fortran, though. It's the absence of things like pointer arithmetic that makes them much safer to use.


HPF (high performance fortran) is quite nice actually. I was very impressed when we played with it in a parallel programming class I took years ago.


We use a business-basic style language called ProvideX[1] from the mid 80s.

99% of our codebase is GOTOs, GOSUBs, and PERFORMs. It's a mess to deal with.

But at the same time, the language has a database layer built in that (while slow) works well enough for our uses, it has a GUI toolkit that works cross platform, and you can run the same GUI panel via a web browser through a shim that the company created.

That's the one thing I've never seen anywhere else. I can make a program and GUI, and have it run on windows, linux, osx, and the web without any changes. And the "native" programs can run entirely client side, or they can behave as a "thin client" to a server somewhere, again without any changes.

We haven't moved away because moving away would basically mean replacing every single line of code ever written here... But we are slowly replacing it with more modern languages and systems.

[1]https://en.wikipedia.org/wiki/ProvideX


> That's the one thing i've never seen anywhere else

Xojo - http://www.xojo.com

Originally known as RealBasic, it has the ability to output native, cross-platform apps from a single codebase. Interesting history.


> I can make a program and GUI, and have it run on windows, linux, osx, and the web without any changes.

you should check out electron https://github.com/atom/electron


I'm actually a web developer at heart, and most of my work here is web related as we move away from this system, but electron is only one part of the equation.

There is still the database, the "server side" code, and unless we are going with isomorphic JavaScript, the client side code separate from the server side.

It's just taking some adjustments from a truly monolithic codebase where everything is in one language, one codebase, one "environment", to splitting it up into smaller parts and API-ifying everything to allow an easier separation.

It sounds silly, but even choosing an editor is a task, as ProvideX includes its own IDE! (I'm rooting for Atom!)


On a daily basis I work with C and a handful of modern languages. I do reach for Common Lisp and Prolog sometimes, so I'll count those.

Common Lisp: many, many great things about lisp have been incorporated into modern languages, but CL still has a few unique things, and the whole is greater than the sum of the parts. It's less compelling than it was 10 years ago but some problems are expressed more wonderfully in CL than in other languages.

Prolog: when prolog is the right tool then you really can't beat it. You can substitute another logic language if you like, but I'm not aware of major borrowing by modern languages (the odd library doesn't really count).

Things I'm meaning to look into but haven't really used yet:

Forth: I poke at forth every couple of years but always leave off to go do something else. I'm pretty sure I don't want to embrace the whole forth philosophy thing (real forth people are, uh, different), and doing a deep dive to see what I can take with me is something I just haven't made time for.

Snobol: I'm not really interested in Snobol wholesale at this point, but I want to explore how PEGs are used. There are PEG libraries available for other languages and I'd like to see how much benefit there is to be had from working in a modern language with a PEG library.


Although I haven't used Forth much since the time I implemented it on the Atari ST and used it for a commercial product, I still like to think of myself as a Forth person. It continues to influence aspects of my programming style, and can be useful to have known well in surprising circumstances; for example, I once solved a thorny problem in some rather complex XSLT by adapting a technique that's quite normal in Forth, but would be considered downright weird in many other languages. So yes, we are different :-)


> I once solved a thorny problem in some rather complex XSLT by adapting a technique that's quite normal in Forth, but would be considered downright weird in many other languages.

I'd be very curious to know what technique you used here.

I sort of stumbled on Forth recently, and while I'm not sure if I'll do anything with it, I like the philosophy behind it more than any other language I've yet found.


The only language I use these days at work is Tcl, as it's the embedded scripting language in the app we use for post-processing finite element analysis [1] results. Thus I use it for small scripts to make various data extraction and processing faster.

Its main strength is also its main weakness: the whole "everything is a string" concept, even for code. It makes the language both very simple to grasp at first and also very powerful, and it makes lisp-like macros and meta-programming possible. However, it quickly seems to turn into a big ball of mud.

Lack of first-class functions or OO is a major pain point, too. Objects have to be emulated by procs (ie functions), and procs are always in the global scope (there is namespacing but it's quite unwieldy). I wouldn't care that much, except the App's API is an object model, which means even a small script turns into 50 lines of only handle-getting and releasing. Make a small error and forget to release a handle (ie a proc name) and now you have to restart the interpreter (or write a lot of error handling script, which once again for small scripts is a pain). Also, you can't pass objects around by value as they are procs, only by name, which is also a pain.

[1] Numerical method often used by engineers to solve various physical problems modelled by boundary-value PDEs. https://en.wikipedia.org/wiki/Finite_element_method


TCL/EXPECT was a huge help when I needed to automate data extraction from a health clinic practice management system. The available interfaces didn't give all the data, and decoding the custom data formats, file layouts, etc. wasn't going well. Scripting the character-based UI with TCL/EXPECT was pretty cool, including the error handling. If a bit hacky.

I was surprised to find out my temporary solution was still running without a hiccup 5 years after I left. Yes, I was younger then and hadn't learned to appreciate the life expectancy of temporary solutions.


TCL became the standard embedded language for many EDA tools. I really hate it: it's difficult to use and slow. On the other hand, tcltk made it a convenient GUI language.


Can you use 8.6? TclOO makes objects a lot more wieldy: they're still commands, but you get automatic cleanup, destructors, some nice options for inheritance/delegation and so on.

"emulating objects with procs" sounds like you're not taking advantage of any of the previous OO extensions (snit, itcl, xotcl ...), which has to be making life harder than it should be :(.


Unfortunately I'm stuck with the embedded 8.5 interpreter.

And, to be entirely honest, most of the gripes above are likely more due to the poorly designed API than to the language itself.


I use Awk fairly regularly, in the form of actual scripts (not just expressions passed on the command line). Despite its age, it's still extremely powerful and usually very pleasant to read and write. Gawk adds socket support, so writing network scripts is usually painless.

I've also been known to write Perl (5) and Tcl programs, but significantly less often nowadays.


Add me to the list of people that use Awk. It's my go-to language for programs of less than 100 lines. I can do a lot in 100 lines.

I've only used GAWK's network support a few times, mostly to write data collection on a set of boxes and then ship them to a central box for data presentation.

For text data processing it's really hard to beat AWK.
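A tiny illustration of the kind of one-liner this means in practice: summing a column grouped by a key (the input here is made up):

```shell
# Sum the second column grouped by the first (whitespace-separated input).
printf 'alice 3\nbob 5\nalice 4\n' |
  awk '{ total[$1] += $2 } END { for (k in total) print k, total[k] }' |
  sort   # associative-array iteration order is unspecified, so sort the output
```

which prints `alice 7` and `bob 5`.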


Do you have some examples of network scripts written in Gawk? Particularly something that you'd prefer to write in Gawk rather than in a more "general-purpose" language. Would be interesting to see.


Check this out, I think you'll be impressed!

https://www.gnu.org/software/gawk/manual/gawkinet/gawkinet.h...

(If you want it to fully function you'll need bits of code starting here -- https://www.gnu.org/software/gawk/manual/gawkinet/gawkinet.h...)


Hmm...I haven't published many of my Gawk scripts, but here's a simple little IRC bot using Gawk sockets: http://woodruffw.us/snippets#awkbot

The general procedure for using a Gawk socket is just building a string of the form `/inet/tcp/0/example.com/80` (example.com, port 80) and redirecting to/from it as a coprocess with `|&`.


I use MATLAB for 90% of my day to day work, for a combination of reasons -

1. Historically it is what has been used at my firm. We have a lot of code already written in MATLAB, interfaces to internal apis, external data providers etc. Everyone at the firm understands MATLAB code.

2. It really is very good for numerical work - both in terms of speed, and clarity of the code (much better than Python and R for clarity - probably on a par with Julia).

I also used KDB+/Q very heavily in a previous job. It is blazing fast in its domain (financial time series) and enables extremely rapid prototyping. The fact that it is a combined query language / programming language is very appealing for data-focused research. I wouldn't want to write a production system in it though (though I know people who have done!)

When I was studying for my PhD I wrote a lot of Fortran 77 and IDL. Essentially because my supervisor used IDL and had a lot of code written in it, and because we were using an external tool which consumed Fortran 77 files as input.


I also dislike R. Strongly disagree with the argument that MATLAB is more clear than python, however. Have you tried numpy/scipy? Provides excellent performance and very powerful APIs.

I've always found that a great deal of the MATLAB syntax does not gel with my expectations. "./" vs. "./" for instance, or using parenthesis to address array elements.

I would much prefer a truly OO-capable language, as well.


The real killer about Matlab is its one-based indexing. Makes arithmetic index operations look ugly (and unintuitive) as hell.


It's one-based indexing combined with length being defined as the number of entries, IIRC. There was a blog post on here a few months ago that explained how you need one or the other to minimize the need for off-by-one corrections, though I can't find it now.

I'd consider switching to numpy (ignoring organizational issues) if it didn't take 10 times as many characters to describe all the operations. If you're spending your day doing algorithm development, having least squares and every linear algebra operation you can think of as first-class citizens is a huge plus for readability and code cleanliness.


> "./" vs. "./"

Is this a typo? It seems like two instances of the same thing


Ooops -- './' vs. '/.', e.g. right array vs. left array


It doesn't seem like '/.' is an operator. Do you mean './' vs '.\'? That tends to match my intuition.


Although I mainly write code in Python, JavaScript and Go, I have a SaaS web application that was written in the early 2000s using ColdFusion (CFML) which I still support (it started at v5 and now runs on v8). In the mid-to-late nineties (and early 00s) it was one of the best ways to write web applications (it was actually the first application server as far as I know). But today is 2016, and ColdFusion is one of the least fashionable languages I know. I cannot in all honesty say I enjoy working with it, and all signs show that it's a dying language with very few people to mourn it.

Although it started its life as a standalone (C based) web application server and the CFML language, it was rewritten to run on the JVM when Macromedia acquired Allaire, and these days it's owned by Adobe. Essentially it's a glorified JSP tag library. The CFML language itself looks like XML and offers a "script" alternative syntax which until recently didn't even support all the functionality of the tag based syntax. What was good about it was the ability to hook to a database and perform queries very easily. In later versions it even received OOP features. Fancy that.


ColdFusion was a breath of fresh air in the mid-to-late '90s when the predominant technologies were CGI/Perl and ASP/VBScript (pre-.NET). I used to love showing other programmers e.g. "Here's how you create a database connection, execute a query, and iterate through a recordset in ASP (40 lines of opaque nonsense)... and here's how you do it in ColdFusion (cfquery, cfloop)." It enabled us to quickly deliver a lot of intranets and e-commerce sites for our customers, but man oh man did it require a lot of server hardware to run (at the time), and it was expensive as heck!


The compare-to-ASP scenario was my go to "there you go" shtick when trying to demonstrate the platform to ASP developers. And yes. It was expensive, and still is, considering there are free alternatives.


Hate to say it, but you'll want to upgrade to a later version than 8. There are a lot of exploits out for 8.

Oh, that was the other crappy angle for ColdFusion I forgot to mention in my post - the $1500-and-up license fees.


I know. We actually rebuilt it from scratch and the first version will be ready soon. Redone from the ground up and fitting to the year we live in. So there is no real compelling reason to change anything in the ColdFusion version.


To be fair, Lucee (formerly Railo) is a great open-source implementation of CFML.


CFML is currently ranking better than PHP and Go in a recent twitter survey: http://code2015.com

I would attribute that to the passionate community and open source version, Lucee.


I use almost entirely ColdFusion (well, Railo). Never studied computer science so don't know any better. Wouldn't know what to transition to if I wanted to use anything else.


What kind of apps do you build? Framework-driven? If you use fw/1, look at Sinatra (Ruby) or Express (Node.js); ColdBox, look at Rails.

You may wish to look at CF on Wheels - it may be a good on-ramp to Rails.


CMS/CRM-type things. No frameworks.


Try Python. It's nothing like CFML but it has that "AHA" feel to it that reminded me of the same AHA ColdFusion gave me in the nineties.


Thanks. I'll investigate. I have used ColdFusion for 10+ years and as such am comfortable enough in it, and support 50-100 sites built with it. But sometimes it's hard to find tools/tips out there as the community isn't very strong.


The "problem" with CF now is the amount of legacy sites written with it. They will need support and maintenance. To me that is boring work I'd rather not do (with the exclusion of my own legacy business). It's a stable, comfy income, but if you want the excitement and adrenalin (yeah, I'm exaggerating), you might find it in Python, Go, Rust etc.


Just curious, what kind of SaaS application is it?


I was trying to avoid the details, but because you asked: it's a CRM application. Surprisingly (to me) it has a very large user base.


Smalltalk! I got in touch with it in university and I love it. I'm convinced it is the best OO language there is, and generally one of the best general purpose languages (and environments!) ever invented.

I even had the luck to be able to use it on my last job (VisualWorks), but unfortunately I no longer have that job :(


Hear, hear!

I had several Smalltalk jobs (VisualWorks, Squeak), before going back into more mainstream stuff. Still use it for personal projects, though.


Hum, I also played with Smalltalk for the first time some months ago and even installed Squeak, but I can't shake the "what could I do with this thing?" feeling. Lots of games and cartoons moving on the screen. Should I push harder?


Look up Niall Ross's summaries of Smalltalk conferences over the years: http://www.esug.org/data/ReportsFromNiallRoss/. They show interesting stuff that people do with Smalltalk and will help you decide whether Smalltalk is for you.

Personally, Smalltalk rocks!


I tried Smalltalk recently, but I couldn't get into it:

1. The (basically mandatory) UI was keyboard unfriendly

2. They removed multiple inheritance from the language. That's not still Smalltalk, is it?


As far as I know, Smalltalk never had "real" multiple inheritance, but merely an experimental hack implementation around doesNotUnderstand (cf. http://c2.com/cgi/wiki?MultipleInheritanceInSmalltalk ). So I wouldn't agree to the sentiment that it has been removed from the language, but I'm too young to judge how essential it once was. Personally, I don't miss it at all.

Which Smalltalk did you try exactly? There are huge differences between the UIs, and e.g. GNU Smalltalk doesn't have a UI at all.


Didn't it have traits?


Um, Smalltalk doesn't have multiple inheritance, isn't supposed to.


It is the best OO language there is, it is after all the original OO language. Everything else is a poor copy.


Depends on one's perspective. You could also make an argument that Simula was the first.

I'll grant you that Kay coined the term (at least AFAIK)


Simula was not an object oriented language, it was a procedural language with objects; that's a very different thing. Smalltalk took objects and made them primary: everything, even the language itself, is built upon the foundation of objects. True and False for example are not language elements, they are singleton instances of the classes True and False, which are both subclasses of Boolean. What are "if" statements in other languages, including Simula (a superset of ALGOL), are virtual methods on the Boolean hierarchy in Smalltalk.

Simula invented objects and added them as an option to an existing procedural language; Smalltalk took them and built a language out of them and thus coined the term Object Oriented.


I know. But just because something is the first, doesn't mean it (still) is the best ;) But as I already said, I agree; everything that came after was subpar.


I've been using Clarion since back in the late 1980's (when it ran on DOS). Still using it to this day, and it earns me the majority of my revenues in my company.

Wiki link: https://en.wikipedia.org/wiki/Clarion_(programming_language)

The language itself is an amalgamation of COBOL, Pascal and C. But for me, the biggest advantage was the development environment. It was the first IDE that I came across back then that allowed a separation of the back end database from the actual front end code. I loved how easy it was to build applications using dBase, SQL, Btrieve etc. as the back end data store.

Also loved how the business logic was defined in a 'dictionary' so that validation rules etc. could be centrally controlled. Still haven't seen a modern day framework that allowed this much control and flexibility.

Clarion has had its ups and downs, but I think I will be still writing code in it until I retire...


Wow, Clarion. In my younger days, I helped write a Clarion app to help homeowners insurance agencies electronically rate new policies after Florida re-wrote its insurance underwriting rules to force everyone to follow the same formulas. Was pretty popular, though my contributions were fairly minor.

I wonder how many Bernoulli cartridges are still kicking around with that old code...


I use Delphi. The RAD Studio IDE is super featured and it's really easy to create interfaces. A nice feature is that the programs produced contain all dependencies so nothing else needs to be installed when distributing.


I'm curious if you're familiar enough with other more modern languages and platforms to give an opinion on the relative efficiency of Delphi compared to some of the others? As someone who used to build lots of apps in MS Access (which is similar to Delphi but not as good), I find it takes far longer to build things with modern platforms, although you have much more capabilities.


I do a lot of Delphi during the day (medical systems), and also spent a number of years with other languages - C/C++, Objective C, Java, C#, JavaScript, TypeScript, Lua, Perl etc. - and I agree wholeheartedly. Delphi is still the most productive platform imho, so much so that I do most side projects with Delphi. The fact that it is now cross platform adds to the list of 'why I think Delphi is the best'.

Unfortunately there's no one thing that I can point to that makes Delphi the best; it's a lot of little things - the IDE and screen painter, the way you can use pointers, deployment is so easy with a single exe, RAD. The one language I prefer to Delphi is TypeScript; it is Delphi with a lot of functional sugar, but since it's written by the same guy that's probably not a surprise.


Delphi is excellent!


I used to program in Delphi between 1997-1999 and after that I couldn't get even close in terms of productivity with Java. It wasn't the language though, the strongest thing about the Delphi environment was that it was truly comprehensive and you'd just go in and implement your business logic.


Same here. Funny how, 15 years later, there still isn't a competing language/UI builder that can measure up. Xcode + Interface Builder, for example, is a joke compared to Delphi. The first time I used IB, I remember being horrified by how IB requires you to "connect" UI elements to variables, and how it can't even create those variables for you, let alone allow you to add event handlers.


I was so pleased when I could write my Delphi code on iOS/osx and android as well and could abandon the horribleness that is IB.


Bash. I'm obsessed - so much so that I'll solve more complex problems taking 3x time rather than falling back to Python. And then there's bash one-liners, they're the best!


I too love bash. It's always there. It has no dependencies. It works.


It has a dependency on the bash runtime.

And coreutils.

And often moreutils, or various others.


Often with better performance than Python or Perl


The only times I've ever had a bash script run faster than Perl or Python is when it has handed most of the work off to another utility (eg written in C).

The actual bash processing bits seem much slower than Perl or Python.


Isn't most heavy work in bash done by piping data to C based utilities, with bash just being the "glue"?


I don't think that's true with Perl at least, unless something has changed dramatically very recently.


If you use it to glue/pipe bunch of tools written in C it can easily be faster.


That's one recommended approach with Python, e.g. numpy.

The other approach is pure Python and then using PyPy's JIT.


Which often is what you do in Bash with grep, cp, mv, find etc...
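For example (a sketch over hypothetical log files), a typical bash pipeline where all the heavy lifting happens in C programs and the shell is only the glue:

```shell
# Find the five most frequent ERROR lines across all logs; bash only
# wires the pipeline together, find/grep/sort/uniq do the real work in C.
find . -name '*.log' -print0 |
  xargs -0 grep -h 'ERROR' |
  sort | uniq -c | sort -rn |
  head -5
```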


Obfuscated oneliners are where it gets really, really fun though.

I love making these, because they can be edited and iterated directly from the terminal prompt, and you can just paste them from the clipboard to see if you like them and want to keep them.

For example, the URL this oneliner spits out is actually real. The technique works almost nowhere now thanks to HTTPS (it's one of my poorest oneliners), but it's hilarious to show people who don't know about networking.

  $ echo http://google.com/trends | (E=echo\ ;S=sed\ ;read a;x=($($E "$a"|$S's,^\([^:]\+://\)*\([^/]\+\)/\(.*\)$,\1 \2 \3,'));$E"${x[0]}$(i=32;o=0;for x in $(ping -c1 "${x[1]}"|$S's/[^(]\+(//;s/).*//;;s/\./ /g;q');do((o+=(x*(2**(i-=8)))));done;$E$o)/$(for((i=0;i<${#x[2]};i++));do printf "%%%x" "'${x[2]:i:1}";done)")      
  http://3627736174/%74%72%65%6e%64%73
If you use VT100's alternate character sets you might like this character map which shows all the character mappings at once (man console_codes for the idea behind B0UK)

  (((r=255/((${COLUMNS:-$(tput cols)}-16)/18)));C="\e[38;5";Z="\e[0m";A="\e(";for((i=-1;i<r;i++));do X=;for((j=0;c=20,(v=j*r+i),v<256;j++));do((v>r-(\!(i+1)+1)))&&X="$X$C;177m |";((\!(i+1)))&&X="$X$Z 0UKB ddd/ooo/hh"||{(((v>32&&v<127)||v>160))&&printf -v c %x $v;printf -v X "$X $C;160m${A}0\x$c${A}U\x$c${A}K\x$c${A}B\x$c $C;120m%3d$Z/$C;214m%03o$Z/$C;33m%02x" $v $v $v;X=${X//\%/%%};} done;printf "$X\n";done)
Then there's this one I won't tell you about, which #bash suggested I put in the channel bot (!! wow) when I showed them... run it in a 256-color terminal. :P

  E=echo\ ;P=printf\ ;M=$'\e[0m';a(){ $P"\e[38;5;2${z}5;48;5;${1}m %4d $M" $1;};Z=$E$M;for i in 16 93 160;do ((r=(n=\!n)*2-1));for((x=0;x<6;x++,i+=r)) do for((j=0;L=(i+(R=(j/6?17-j:j))*6),z=(R%6>1?3:5),j<12;j++))do a $L;done;$E;done;done;z=5;for i in {0..23};do a $[i+232];((i==11))&&$Z;done;for i in {0..15};do ((i==8||\!i))&&$P'\n%*s' 12;a $i;done;$E


I've written a bidirectional filesystem synchronization tool, and a content management system, in (nearly) pure bash.

I keep expecting the performance to be terrible but it's surprisingly efficient.


The only thing I've found strongly lacking in bash is math. You can `| bc` all you want, but it still seems slow compared to other languages. I typically do all my math in python (typically through pipes... heh), and everything else in bash.


$(( ))?
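i.e. bash's built-in integer arithmetic, which avoids forking a bc process entirely (integer-only, though; for floats you're back to bc or awk). A trivial sketch:

```shell
# Sum 1..100 with pure-bash arithmetic: no external process involved.
sum=0
for (( i = 1; i <= 100; i++ )); do
  (( sum += i ))
done
echo "$sum"   # prints 5050
```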


I've been using Common Lisp for 2 years now. Here in Germany, most of the students I've talked to classify CL as an unfashionable language.

I really like the simplicity of the syntax and the stability of the language standard/specification. There will be no "Today we released v2, all your v1 code will be invalid"


I'm curious since you mention students, so you may be one as well, but is Common Lisp just a hobby language for you, or do you have a productive professional use?


I'm a computer science student and will get my master's degree in October. At the moment Common Lisp is just my hobby language, but also the language in which I invest most of my programming/learning time. I'm able to program in a few mainstream languages, but I want to have one language where I really know the tricks and shortcuts.

Getting a job here, where I would program in CL is nearly impossible. So in the next years I'll probably stay in the academic world and then try to do my own startup or freelance stuff with the language(s) I like.


Perl and C here mostly. C89, that is. Both are widely used languages, just not in the startup space. A lot of our stuff is built on those languages, so it wouldn't make sense to rewrite a lot of it at this point unless there was a business case for it.


My personal favourite combination as well! Don't care about if its old-fashioned or what other people think about Perl and C. To me, a programmer should start learning C then work his/her way up!


So I'm not alone :) Perl and C everyday ;)


Perl and C++ here. You are making me feel modern and up to date :-)


I'm mostly a sysadmin so Perl is a big part of my life whether I like it or not, but not much C.


Me too, Perl most days, C most weeks.


C in my day job.


The code for my startup is in Visual Basic .NET (VB). Before you laugh, it's essentially just C# with a different flavor of syntactic sugar. For my approach to writing code, VB is fine.

My scripting language is Open Object Rexx, and a version of it, Kexx, is the macro language of my favorite editor, KEdit. I have about 150 scripts in Rexx. I do nearly all my typing into KEdit and have about 150 macros in Kexx. Some fine day I may move to Microsoft's PowerShell -- if they write better documentation, then I will move sooner.

If I have some optimization (mathematical programming) to do, then I will use the IBM Optimization Subroutine Library in object files ready for Fortran from the Waterloo Fortran compiler. So, I'll use Waterloo Fortran.

For more in programming languages, I still like a lot that is in PL/I, especially the ideas around tasks, memory management, exceptional condition handling, and scope of names -- all of those deeply interact in nice ways. And I'm totally in love with PL/I structures, much more efficient than objects, a little easier to think about, and nearly as powerful.

For more, I want more from a programming language that (1) admits some automatic transformations that improve some properties and preserve others, (2) can report on some of what is going on in the code, and (3) helps with documentation. For (3), IMHO the documentation is more important than the code, but in programming language design it is treated as just meaningless gibberish and an afterthought. Yes, Knuth's literate programming was a step forward, but that lesson seems to have been forgotten for other languages.


I use Erlang/OTP as my go-to language. The flak about its ugly syntax certainly puts Erlang in the unfashionable camp. But unjustly. I use it because I like single assignment of variables and pattern-matching, and I also think FP is pretty swell. Someone said recently that the Erlang community could be as small as one thousand serious hackers. Now that's relatively small!


Erlang/OTP is becoming fashionable again thanks to Elixir :-).


And LFE, Lisp Flavored Erlang, brings the BEAM and OTP to Lispers! I only play with LFE, so I can't say I use it, but it fits your unfashionable descriptor.


1000? That's insanely small! Especially for the amount of tools there are for it.. is it all developed by a small subset of these 1000? I've read there are about 18-19 million programmers in the world, which also seems like a pretty small number, but it seems like enough to have created everything we have that uses software.


I'm in the EDA industry and use Tcl. It's easily the worst language I've ever used, and I firmly believe that the entire semiconductor industry is being held back years due to the stranglehold of Tcl.

Everything in Tcl is a string. This is just one example of the idiocy of this language.


Even as a fan of rich type systems, I can think of much worse design choices for a language than "everything is a string". "Everything is an object whose class is determined at runtime" comes to mind.

And as a former Tcl dabbler, that's also definitely not the first thing I'd pick on. `upvar` is a far more egregious violator of sanity, and I seem to recall that parsing optional arguments to a function is quite the adventure.


You wanna know something worse? It parses braces inside comments. I spent a day debugging a compilation error because I had a random { in a comment.


I don't think that's true of modern Tcl anymore, though.


The words modern Tcl are an oxymoron.


I admin/dev on an old LDMud, which means the "game" is written in LPC: https://en.wikipedia.org/wiki/LPC_(programming_language)

Perhaps not a terribly interesting addition to your list, since the reason is obvious: I keep using it because it's the language the game is written in. That said, I'll play along and discuss what I like about the language:

One of the fun things about working on an LDMud or in LPC is that the game driver (written in C, compiled) gets loaded and serves as the virtual machine for your game, and the parser/compiler for your game's LPC code; this means that once the game is up and running, as long as you aren't editing some really foundational game objects, you can create/load/destroy/update/unload most of what you use inside the game environment without a recompile/restart of the game. There's very little friction when you can quickly play with the new objects/npcs/environments you're creating as you iterate.

You can see the age/heritage of the game as a glass half full, or half empty. You can be frustrated that some relatively modern concept/module/support doesn't exist, but you also have the opportunity to learn from writing it (and not just as an exercise; for production!) Because it's got such a long legacy, you get plenty of chances to rub up against old code and reason about what constraints may have shaped it, and whether it is worth refactoring. Chances to become conscious of the extent to which decisions you make in the present will someday look as curious and smell as musty as the legacy code you're griping about.

It's very OO, and code re-use is a big deal. There are some really complex inheritance trees, and while a lot of issues you'll run into are the sorts of problems you'd expect, I feel like it's pretty good training for breaking down and reasoning about complexity of any sort. This gets ratcheted up by the interesting concept of allowing a certain kind of object to "shadow" another object at any time. A shadow object will intercept calls to the object it is shadowing, giving it the opportunity to modify how the shadowed object will behave. So, for example, most classes, subclasses, boosts/buffs/curses/etc. are implemented as shadows you apply to a player object, or to some piece of their equipment.


Man, I loved LPMuds back in the day. I remember being so impressed when I (my character) was sitting around in a pub with a friend, and all of a sudden he put on a hockey mask, whipped out a chainsaw, cut off my arm, and started beating me on the head with it. And I actually couldn't do anything that required me using my arm. Apparently he coded that whole thing up while we were sitting in the pub.


I still actively use Pascal/Delphi 5. I wrote a lot of software in it for the company and I'd say the bosses are quite happy about it and couldn't care less that it's not something "mainstream". Looking back I can't imagine that it could have been possible to achieve with VC++ 6 or VB.

Edit: of course it's not just the language, it's the whole Delphi environment with a variety of components and libraries and the development community.


There are a couple languages I write code in that are less popular, notably Tcl and bash shell. Throughout the 2000s I wrote a lot of Perl, but it's now been a while. Most of my development these days is in Go, Python, or Javascript.

Tcl is a language that has its problems, but has some really interesting use-cases. Tcl is the Lua-before-Lua that is particularly great for constructing REPLs and DSLs. It has a forward Polish notation that can make it easy to construct a shell-like DSL or to process other forward Polish notation languages and formats (URLs are FPN!)

I find bash shell to be the most enjoyable language that I work with. It's actually possible to write good, readable software in shell, but it's absolutely a skill that developers choose not to hone. Shell is something that many find necessary and few developers really understand. This leads to a lot of bad software and, consequently, a culture against the language. Objectively, there are good reasons why shell might not be good for large applications, but those arguments could be made for more popular languages as well. However, due to the biases against it and the fact that many developers know "just enough to be dangerous", it's not a good language for teams.
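To make the "readable shell" point concrete, here is a minimal sketch in that style (a hypothetical example, not from any particular project): strict mode, one small well-named function, and consistent quoting.

```shell
#!/usr/bin/env bash
# A small example of defensive, readable shell style.
set -euo pipefail   # exit on errors, unset variables, and failed pipes

# is_valid_port STRING -> succeeds if STRING is a TCP port number (1-65535)
is_valid_port() {
  local p=$1
  [[ $p =~ ^[0-9]+$ ]] && (( p >= 1 && p <= 65535 ))
}

for candidate in 8080 70000 http; do
  if is_valid_port "$candidate"; then
    echo "$candidate: ok"
  else
    echo "$candidate: not a port"
  fi
done
```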


Delphi Developer here. Still using it at work and there is still Development in it.


Delphi was the first language I learned and I always thought it was a teaching language. What are you developing in Delphi?


Full-blown ERP with PostgreSQL. It's super fast, as it is a native Windows application, and that's what our customers like about it. Also, with nice components like DevExpress it's looking really good, has high usability and is easy to develop.


I'd love to see some screenshots of a modern Delphi application if that's possible?


I'm pretty sure Altium Designer, a decent and pricey EDA suite, is written mostly in Delphi.


Yeah, I believe it is too. And its primary scripting language is still DelphiScript. We use it daily at my company.


Have a look here:

https://www.devexpress.com/Products/VCL/

Especially their datagrid is the most advanced and fun thing to work with.


The Delphi product page has examples of UI features available in the VCL and FMX: http://www.embarcadero.com/products/delphi


Last I knew Skype (at least for Windows) was written in Delphi. That knowledge is a few years old so may no longer be true post-Microsoft acquisition.


I develop medical systems for windows with Delphi. Also develop a graphical system in my spare time for the Mac with Delphi


I use VBA for automating the production of pharmaceutical analytics reports, and have for the past 8 years. It is horrible in so many ways, but as far as I've found has the best support for manipulating the MS Office DOM, which is important because all the pharma reps use Excel/PowerPoint for everything and tend to have very specific nit-picky needs regarding their custom reports.


Have you ever investigated writing your code for office automation using C#? It takes a little bit to get it figured out, but being able to get away from VBA is worth it I think.


Or VB.NET, which is also a very nice language.


I still support websites I previously wrote in CFML, aka ColdFusion. It was much nicer to work with than the alternatives on Windows, many years back, and it running atop Java since version 6 often turned out to be handy. It wasn't a bad way to bash out a website, and with some care, you could build decent stuff with it.

Still, it's a loosely-typed (except when it isn't) language that exists in two forms - tag-based and Algol-ish (within <CFSCRIPT> tags) - and it carries a lot of Java-derived complexity in its API while not being as flexible as even JavaScript.

Lately, I've been using Python and Flask to build sites in, which is far nicer.


I've used Yeti[1][2] to write an Android app for personal use (on Nook Touch) that I couldn't force myself to write in Java. It made the task much more fun for me and allowed me to finish after I got a serious case of "author's block" trying to code it in Java. Yeti is ML-like and runs on JVM 1.4+

[1] http://dot.planet.ee/yeti/intro.html

[2] http://mth.github.io/yeti/


Now this is interesting to me! I found Yeti last year when looking for a practical strict ML in which to write a small utility. It didn't fit my requirements at the time due to the JVM dependency but I made a note to try it later for some JVM task. I see that it has had a new release since, which makes me want to try it more still.

Could you share a little more feedback about what using the language was like? Did you like it overall? If you've used Standard ML or OCaml (OCaml-Java), how would you say it compares?


As I wrote above: "It made the task much more fun for me and allowed me to finish after I got a serious case of "author's block" trying to code it in Java." In other words, I did like it! :)

I haven't used SML nor OCaml. [Though I seem to believe I prefer SML over OCaml in terms of visual appeal/readability, FWIW.] I wanted to code for Android, and OCaml-Java requires too new a JVM (from what I understand, Android is stuck with JVM 1.4 [or 1.5?]) IIUC. Before I found Yeti, I tried to use Kawa (a Scheme/LISP for JVM), but it worked too flakily for me, unfortunately. Then I found Yeti; I stumbled into one bug at some point, but the author quickly fixed it, and I didn't have any problems since then! And it's typed, so a plus vs. Scheme.

edit: you can view my code at: https://github.com/akavel/bookshelf/tree/main/src/com/akavel -- but please note I didn't care to make it pretty or documented!

edit 2: also, I liked Yeti's manual & documentation very much.


Thanks for the details and for the source code link!

Funny that you mention Kawa: I've been meaning to try it out along with Yeti. That you had problems with it does not sound very encouraging, but then, it has had a major version release since.

When it comes to ML, I prefer SML to OCaml. I think Yeti, Standard ML and Scheme appeal to the same "small core language" mentality. If you decide to give SML a go, I'd recommend SML/NJ for development and as your REPL — especially while you are getting used to the language since it has better error messages — and MLton for production binaries.


I worked a lot on a legacy system developed in Progress ABL (formerly unknown as Progress 4GL). I am still working on that system, but I am mostly an analyst now and don't really write code (only short scripts to extract data).

We still use it because a core application has been developed in it for the last 20 years, and replacing it with something else is currently unfeasible.

It's a sort of PL/SQL for a non-relational DBMS. It recently (last 5 years?) got OO extension - the product is still used/mantained/extended.

To be honest, I doubt that "we could learn" much from it. I mean "RDBMS were not just a fad, after all" is not something we need to learn, right? (and no, it's not part of the NOSQL family, it is just old and odd). Some other lessons we don't really need to learn now are, off the top of my head:

- I think we don't really need foreign keys, folks

- What about something like a virtual table that only exists in memory but which allows you to optionally revert any changes on it by doing a rollback? (except that if you do that you also rollback any changes to the real tables, of course)

- Views are just a passing fad, also, foreign keys are for sissies - data integrity is done in your application.

- You know what? we could do query optimization at compile time instead of at runtime, it will be easier for us to write the query planner, even if this also mean that if you add an index later you have to recompile everything.

- You keep harping on this concept of foreign keys. Listen, just use a strict name convention for fields and things will sort fall in piec... I mean in place by themselves.

- Function/Procedure signatures are not really hard enforced so... if you have a function with 23 parameters and you call it with 22 (or 24) it will bomb at runtime because the compiler will not warn you.

- Speaking of signatures... we all know that dates are really an epoch plus a real, no? So I think it's sorta ok if you pass a number to a procedure that has a date as input parameter...

Some of the points mentioned above might have been mitigated in the more current version (or will be in the next one), except for the foreign keys, because those will stay like that forever (officially declared in their roadmap 2 months ago).


I write a fair bit of Apex for salesforce. I can't say I enjoy it, but it is the only game in town. My org uses salesforce for a lot of day to day stuff, and there is a lot you can do with users on cheap licenses if you're willing to write custom controllers for it.

I'm not sure if this is so much old or unfashionable, as Apex is pretty similar to java 6, but it is just not what I would call a "fun" language.


I've been working on emulators (in C), but they allow me to run interesting languages like Turbo Modula-2 and Algol 60 for CP/M or MPL for the Motorola Exorciser:

https://github.com/jhallen/cpm

https://github.com/jhallen/exorsim


Anyone use ABAP (language for SAP)? I think you can get paid a lot of money as an ABAP programmer.

https://en.wikipedia.org/wiki/ABAP


you also get paid a lot of money as a truck driver in Syria, there's a reason for that


I did UC4 stuff for a while.

Was pretty fun but making separate applications is often the better idea.


Most of my programming has been done in PL/SQL. It's still the best language for interacting with a relational DB I've seen. I've just started using C#, and when I pick up some more LINQ I think I might like it. I also program in JavaScript, Java and C; it's just that PL/SQL has been the best way to get stuff done for me working with enterprise DB apps since the late 90s.


For the past year I have been working in PL/SQL and it's like being on a different planet. The language is clunky and difficult to work with. The culture is weird (everyone uses TFS for version control, for example, if they use a VCS at all). The tooling is awful. No parser is available, meaning linters are only included in bulky GUI apps (TOAD is the worst, most unusable piece of software that I still use on a daily basis). I have resorted to Emacs and SQL*Plus, but even Emacs has poor support for PL/SQL. No one wants to write open source tools for a proprietary language.

PL/SQL is definitely not sexy, but it's undoubtedly the best way to interact with an Oracle DB and the raw power is amazing. I have accomplished some incredible things with it, but I can't help but feel like everything I do is already out of date and needs to be replaced with something better. Any tips for making my experience a little more pleasant?


There's nothing stopping you from using version control and having an automated build and unit tests. We always have everywhere I've worked before.

Check out Steven Feuerstein's books and websites for more modern ways of working with PL/SQL: http://www.stevenfeuerstein.com/

I've normally worked with PL/SQL Developer, which has some great tools: lint, beautifier, templates, import/export, data generation, etc. There are a few good open source libraries, e.g. pl/json, but you have to look around.

You can always easily call out to Java or C if you want to work with other libraries.


I used to use Emacs and SQL*Plus, but after trying SQL Developer I have stuck with that.


C# is the most under-appreciated language today. It has a lot to love, including async/await, typed generics, good interop with C/C++, type inference, and a strong tendency toward functional style. I think of it a bit like the CLR's version of Scala.


It's not so much "using" as it is "understanding". I've been playing around with INRAC (http://boston.conman.org/2015/11/16-19), the language used to implement Racter (https://en.wikipedia.org/wiki/Racter) which has the most insane flow control I've ever come across in a language. Subroutines (for lack of a better term) do not need to have unique names, and when you call a subroutine, one matching the name is picked at random.

Oh, and you can specify subroutine names with a pattern (and of course, one subroutine whose name matches is picked at random).

The same works for GOTO as well. It's a very non-deterministic language. And it's so moribund that I think only two commercial programs (Racter is one, I think I've come across another) were written in it.
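For flavor, that dispatch model can be caricatured in bash (a rough analogy only, not real INRAC): several routines share a naming pattern, and a call picks one of the matching ones at random.

```shell
#!/usr/bin/env bash
# Rough analogy to INRAC-style dispatch: pick a matching routine at random.
reply_a() { echo "Hello there."; }
reply_b() { echo "Nice weather."; }
reply_c() { echo "Goodbye."; }

# call_random PATTERN -> run one randomly chosen function matching PATTERN
call_random() {
  local matches=() f
  for f in $(compgen -A function); do
    [[ $f == $1 ]] && matches+=("$f")
  done
  "${matches[RANDOM % ${#matches[@]}]}"
}

call_random 'reply_*'   # prints one of the three replies, chosen at random
```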


I started out with Perl and I still occasionally use it today. There's nothing I especially like about Perl that other languages cannot do just as well, or even better, but at the time I started using it, it seemed pretty powerful. It's still a formidable language and a viable means to build a solution for some problems—provided you don't have to collaborate with anyone. It's hard to find people willing to work with Perl these days, at least where I'm based.

I also help maintain some web applications built with VB.NET interfacing with IBM AS/400 and iSeries midrange servers. Which means some RPG coding as well. Can't say I enjoy that too much, pretty dull, but RPG is straightforward enough.

Basically, I work with VB.NET because it's not my code and there would be little advantage to converting it to be in C#. Very much a "if it ain't broke, don't fix it" situation since the clients depend on uninterrupted service.

I mostly work with—and prefer to use—C# and JavaScript these days.


You are the only person in this thread to mention RPG. :-)


I still use Awk.

I picked up Awk for some random 15-123 assignment in sophomore year. I still use it for automating/shell scripting anything that I need to. I could use Python or something hip. But I haven't quite grown out of my fondness for Awk.

I'd add Lisp to the list, but you can't call it unfashionable, unless you also call the Beatles unfashionable.


I am using PHP for my hobby project. It's "old" but obviously not abandoned and I think it's becoming a bit less ridiculed, but you still see the regular "Ew why PHP"-type remarks and other jokes about the language.

I don't use it because of familiarity or comfort (I'm actually quite uncomfortable with it because I know it's easy to fall into traps of horrible practices). The simple answer is that as a kid I loved PHP browser games and envisioned the project I wanted to make for 10 years as a PHP project. I've experimented with using different languages for it, but go back to PHP simply for the sake of nostalgia. I realize this is normally a horrible reason to pick a certain language for a project, but considering it's a pretty quirky long-term passion project with no goal of monetization (or even full completion), I think it's reason enough.


For front-end web projects, I still love using PHP+codeigniter.

However, for all back-end tasks, I now use Python. It has so many features that I wish were in PHP and it works well for long-running tasks.

Most long-running PHP tasks that I've tried to build leak memory at random. The dev list also scares me, and some of the features that have been added to the latest versions are highly questionable.


Bash users represent!

Seriously, Bash is fucking beautiful once you really dig into it. Pipes are unbelievably elegant.
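For example, a word-frequency counter is just four small tools glued together (a stock illustration, nothing project-specific):

```shell
# Split into words, group, count, rank -- one pipeline, no loops.
printf 'the cat and the dog and the bird\n' |
  tr ' ' '\n' |   # one word per line
  sort |          # group identical words together
  uniq -c |       # count each group
  sort -rn        # most frequent first ('the' tops the list here)
```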


Bash is great - we write almost 100% of dokku in it - but most bash looks like shit on a stick. Really hard to follow sometimes, and unless you are using modern bash, also really hard to ensure your own sanity.


I suspect that part of it is unusual syntax--if...fi, anyone?--and part of it is unusual syntax in non-bash things, i.e., Awk looks like moon runes, and to replace ; with \; in sed you need a grand total of SIX backslashes.
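For anyone counting along at home, the backslash tally depends on how many quoting layers the command passes through (illustrative commands of my own, not a specific recipe):

```shell
# sed needs \\ in the replacement text to emit one literal backslash,
# so inside single quotes (no shell escaping) two are enough:
printf 'a;b\n' | sed 's/;/\\;/g'     # prints: a\;b

# Inside double quotes the shell consumes one level, so it takes four:
printf 'a;b\n' | sed "s/;/\\\\;/g"   # prints: a\;b

# Each extra interpreting layer (eval, ssh, make, bash -c ...) multiplies
# the count again, which is how you end up typing six or more.
```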


> if...fi, anyone?

And here I thought everyone wanted ALGOL-like syntax...


As someone who dabbled in C-like languages, it threw me for a little bit.


My comment was a joke based on contrasting two points:

1) "C-like" languages are often described as having "ALGOL-like" syntax. And indeed, C was based on languages significantly influenced by ALGOL-60 (though ALGOL-60 used BEGIN/END for blocks, not curly braces).

2) "IF/FI", "CASE/ESAC" and similar were in ALGOL-68, and Stephen Bourne drew them from there when he wrote the Bourne shell (with which bash was eventually intended to be compatible).


I spent a year maintaining 90k lines of bash. I've written some beautiful bash code and some horrendous bash code. Bash is a pretty poor programming language; it's an amazing user interface.


Elixir has pipes too. One of the great things about it.


I'll have to check it out, then.


There's lots of job openings out there for COBOL coders. Nobody wants to learn it, and those who know it are dying off.


As a young developer, do you know where and how I could go about getting into the COBOL industry?

All the job listings seem to require a decade's worth of experience and lists expertise requirements in technologies that are proprietary and only available on a mainframe. COBOL on its own is not very hard to pick up, but I'm lost as to how to learn about the ancillary technologies. I'm not sure it's so much that nobody wants to learn, but that nobody knows where to learn it.


Contact the companies looking for COBOL coders. They're desperate, and will tell you exactly what you need to learn and probably pay for any training you need.

Sure they'd prefer someone with a decade or five of experience, but seriously: COBOL coders are literally dying off at this point. 17 years ago was the last great hurrah for hiring & training COBOL developers, averting the "Y2K crisis" before calendars ticked over to the year 2000; we've got kids born after that who are entering the software workforce now. About 5 years ago I had someone speak in my C++ class begging students to consider a career in "mainframe" technologies; there was virtually no interest shown, and methinks most of the students didn't even know what he was talking about.

But...the need remains. Those systems are so large & profitable they're not going anywhere, and employers must be quite willing to take on new coders willing to learn, directing & funding their education as needed. If you're interested, contact a hiring company directly and ask - even if you have 0 of 10 years required experience.


I wonder, are there any good tools for translating COBOL into something else? There must be at least some demand? Wouldn't that be a good approach in some cases? (I suppose it requires longer-term thinking, and is sometimes simply not possible in a business context, etc.)


Thanks for the advice. I guess it doesn't hurt me to give it a shot. The worst a recruiter can say to me is no. There seems to be no shortage of available COBOL positions at the major banks and telecoms where I live.


Why would you want to?

There's a lot of smoke and noise made about how companies are desperate for COBOL developers because the old ones are retiring or dying blah blah blah, but I've never seen actual good salaries to back it up. And to add insult to injury, you're unlikely to be able to take the skills you've learned to your next job when you inevitably change jobs.


I suppose you're right. After going through some publicly available COBOL salaries in Canada, it seems the salaries for senior COBOL positions were about what I was able to ask for as a junior frontend developer ($50-60/h CAD). I think I would primarily consider a short stint for the heck of it. Maybe some short-term contracts to see what it's like.


What kind of salaries have you seen?

I've read that most of the companies still using COBOL are banks, insurance companies, etc. They seem like the companies, if any, that could provide really compelling salaries for something with allegedly high demand and increasingly small supply.


Don't. The job requirements are impossible on purpose because the posts are just there to justify getting guest visas.

You're better off going towards other enterprise technology stacks with more action and rewards.


> There's lots of job openings out there for COBOL coders. Nobody wants to learn it, and those who know it are dying off.

A lot of people say this, but all the COBOL jobs I see advertised (not many) offer a very average salary. Demand can't be that high if COBOL jobs are paying less than the typical "frontend developer" jobs.


I get the feeling that Perl qualifies as "unfashionable". To be fair, I haven't stepped up to Perl 6 just yet, so I arrive at that conclusion from a Perl 5 perspective.

Why do I still use it? That's broken into two parts. I started using it because it was the language of choice for web apps circa 1999. I kept using it until recent years because of a large code base I was maintaining.

I continue to use it today because Perl Dancer 2 is a kick-ass framework. It has an extendable DSL, and it really makes short work out of writing web apps.


Have worked with COBOL on a mainframe for a large bank in NYC for the past four years. A very stable and critical application, it's nice to work with a group of established and proven people who value a work life balance. A bit bummed that a lot of the mainframe is locked down or controlled by the mainframe systems folks run by the global tech team. Overall a positive experience with all the business domain exposure, but finding it increasingly hard to remain relevant and move to new opportunities...


I use Tcl for AI in Life Sciences and Hedge Funds.


I wanted to say that I still occasionally use TCL, too! :)


I also use Tcl quite a lot for some pretty sophisticated software. (And it's not embedded! It seems most surviving Tcl programmers are using it as an embedded scripting language.)


I've written larger apps in TCL, but it was painful. I must admit to being guilty of using it solely as an embedded language these days.


I write a lot of C and C++. Considered Java for a recent Web api backend project but went with node in the end due to the richness of available libraries, somewhat quicker development for simpler things, and lower cognitive load from same language on both sides. I still sort of prefer strongly typed languages though. I hate the idea of bugs at runtime that a compiler might have caught.


Ada: it has a better type system than any of the more modern languages, and it catches some common errors at compile time.


I used to make a living programming in Visual BASIC and ASP before the .NET era changed them.

I have a Windows XP Virtual machine with Visual BASIC 6.0 on it and IIS with ASP 3.0 on it.

I wrote a few sample programs in VB 6.0 and ASP 3.0 but I don't have any clients to develop for it. I call it legacy programming to write for older languages.

The problem is the people and businesses using the older technology cannot afford to upgrade to the latest technology and cannot afford to hire a programmer to write software for them.

I also heard that NASA uses older computers like old 8088/8086 PC clones because of the legacy software they have that runs on it that won't run on modern systems. They buy them from eBay because they aren't made anymore.

Ironic that they could use a virtual machine with DOS on it emulating an 8088 to run the software, but I guess they need access to the serial and parallel ports that don't exist on modern systems since USB replaced them.


The main system at the insurance company where I currently work is written in Forte 4GL's TOOL (Transactional Object-Oriented Language).

https://en.wikipedia.org/wiki/Forte_4GL

It's basically an earlier version of Java, so I don't think there's much to learn from it.

To make things worse, the developers of the system in their infinite wisdom developed a custom scripting language that looks like an ill-conceived assembler.

It's apparently cheaper to have some developers support this monster than to rewrite it from scratch or buy an off-the-shelf product (which, to be fair, are horrendously expensive and consultant fees are ridiculous).

I learned the language through mentoring and reading the online documentation. You really get to value communities and stuff like StackOverflow or Github when you don't have them :)


I use PowerBuilder (of Sybase) at my job. We have a database centric stack mainly built with stored procedures and PowerBuilder works pretty well (PowerBuilder datawindows just represent a SELECT query or point to a stored procedure). It's very legacy. The syntax is similar to BASIC, with some SQL database querying built in as a kind of DSL. The built-in string and array libraries are particularly bad.

The PowerBuilder IDE rebuilds the object you're working on whenever you save, so if there are any errors you can't save the file from within the IDE. You can't view or edit a script and its ancestor at the same time. When source control is switched on, you can't save a file without checking it out first - but you can't check it out without closing it and discarding any changes you made. I usually have Notepad++ open alongside PowerBuilder at all times.


I've recently fallen for Standard ML. It's been a steep learning curve due to all the compilers and implementation choices but what a mind bending approach to software development. OCaml seems to be the obvious next step but I'm having fun with toy programs in Standard ML right now.


What do you find mind bending about software development in SML?


Coming from dynamic languages I find Standard ML to offer a much more methodical approach to designing software. The view that SML brings is that your program is a language of its own and the Algebraic Data Types you define are your program's syntax.

I find the type system to be easy to work with (unlike C, C++, Java) and I'm finally seeing the virtues of a statically typed language. Also, having to specify types while defining functions forces me to think more, in a good way. It's a far more methodical approach than I'm used to in Ruby, for example.


Cool, thanks for sharing your viewpoint! I also very much enjoy SML and SML-like languages :)


There's a huge difference between languages that are no longer fashionable, call them old-fashioned, and languages that never had wide following, if any, call them niche languages. Most of the languages you mention were never mainstream production languages, for good reasons.


You are surely joking. Do you really mean to say that Cobol is not a mainstream language? Are you claiming that banks and similar institutions are niches? If that is the case then pretty much all programming languages are niche languages.


I am claiming SNOBOL, Icon and Eiffel aren't, for example.


I use Java, many people here would probably say it is 'unfashionable' but I have never found a task--within reason--that Java couldn't provide a decent solution for. When you understand the JVM and some of the choices that have been made Java is very fun to use.


Not exactly a programming language, but some early mac users would know: 4D - 4th Dimension


Still using it??? I built stuff in it about 89-90 ...


In my professional life, C with occasional forays into ASM (my niche is embedded DSP, which itself is inside a larger embedded system). In my side projects, C, FORTH, and Lua. All at the same time, in the same build.


Maybe not unfashionable, yet, but I still find Awk and its variants very useful and productive. Combined with SQL a lot of data crunching happens without anything fancy.
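A typical example of that kind of crunching (made-up CSV data, standard awk): totalling a column per group before anything touches the database.

```shell
# Sum sales per region from CSV -- classic pre-SQL crunching in awk.
printf 'east,100\nwest,40\neast,25\n' |
  awk -F, '{ total[$1] += $2 }   # accumulate per region
           END { for (r in total) print r, total[r] }' |
  sort   # awk's for-in order is unspecified, so sort the output
```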


Using an 'unfashionable' language is similar to other areas in life where one rejects mainstream or too-modern things, for obvious reasons:

- Be different.
- Use something which is extraordinarily suited or especially made for your task at hand (in other words: do not compromise).
- Maturity and stability are great.

Please add more. The analogies are manifold.

I really like modern re-implementations of well proven languages (Clojure <- Lisp, Factor <- Forth, ...)


As someone fairly new to programming I don't understand what makes a modern implementation of Lisp like Clojure better than its ancestor. I like Scheme because it's ridiculously flexible. You can implement as many libraries of your own as you want with Scheme, which I doubt can easily be done in something like C++. To me Clojure is an overdesigned and rigid version of Lisp. Why is that considered good? Since I am new to all this I might be missing something so I'd love if someone could ELI5 it to me. Thanks.


Naturally, other Lisps have advantages over Clojure, but here's a few advantages of Clojure. You might also want to check out this interview: (http://codequarterly.com/2011/rich-hickey/)

* Enormous reach: access to both the Java and Javascript ecosystems (if you count ClojureScript; I don't know the state of Clojure CLR for .NET).

* Immutability by default. When you want state, its primitives help with concurrency.

* A good set of basic data structures, including lazy sequences.


A close friend still uses Fox Pro to do data analysis sort of work in a trading company. Every time I see him working masterfully, I feel I should learn it someday.


I used FoxPro for about 15 years, until 2011 when I got laid off from that job. At one point I experimented with MS-SQL and came close to sacrificing the CDs on a funeral pyre because it was just. so. horrendously. slow. compared to FoxPro.

VFP had a slick little way of letting you indirectly address variables and pop them into a SQL statement or other VFP code, and they would evaluate at runtime. Later when I was learning Java & Javascript I was quite annoyed that I was not able to do this.
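Bash, incidentally, has a distant cousin of that trick in indirect expansion, where a variable's name is computed at runtime (a contrived analogy of mine, not VFP):

```shell
#!/usr/bin/env bash
# Indirect expansion: pick a variable by a name built at runtime.
east_table="sales_east"
west_table="sales_west"

region="east"
name="${region}_table"          # compute the variable's name as a string
echo "SELECT * FROM ${!name}"   # ${!name} expands the variable it names
```

prints `SELECT * FROM sales_east`.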


I use Lua. It's super lightweight, very easy to get started with, has little ceremony (a concise language), and has good integration with C libraries.


VB6.

You can literally feel the disdain from others when I talk about it.


I still support a few apps in ColdFusion. It wasn't that long ago that I was speaking at CF conferences; I still am the manager of the Houston CF user group. I started my career at a time when it was a super fast option to build a web app (1999).

It's still a solid language, but there just isn't the ecosystem that Rails (and others) have, which is where I'm mostly parked these days.


C and elisp most of the time. I'm considered the weirdo in my classroom, all those cool kids that learned with Visual Studio and fancy C#.


I write assembly (ARM and Intel) almost every day, though I write glue code in C and use Python for autogeneration of very regular code.


Nothing currently, really. However, when I first joined the company I work at (as a Graduate Data Analyst) I wrote a recommendations engine in VBA.

It took about 5 minutes to chug through 3 months of sales data and produce a list of content -> content pairings & weights.

Then I discovered Python, cursed my manager, and the same algorithm ran in <1s on the same data.

