Ask HN: What “old” programming languages will you still be using in 2017?
212 points by _of on Jan 3, 2017 | 493 comments
And what is the purpose?

I'm still using bash, awk, and a bit of FORTRAN for scientific computing.




Many folks have said it here but I'll say it again: Common Lisp.

I have been using it professionally for 5+ years as a full-time employee at various companies. Some big-name ones, some smaller start-up ones. The mean Lisp team size has been around 4, but I did work on a project of 15 Lisp programmers. None of these projects were legacy code. Some were in places you wouldn't expect (embedded, multi-processor systems on custom-designed boards, for example). In every single case, we had no additional trouble hiring a Lisp or Lisp-capable programmer as compared to hiring for any other language, including Python. (In fact, Python was more difficult to hire for because the market is saturated with beginners who claim expertise.)

Lisp is one of those languages where the ratio of long-term benefit and productivity to good first impressions is about as high as it gets. It doesn't look like C or Python or JS, what with all the parentheses, so people brush it off.

Lisp isn't the pinnacle of every great idea to come about in computer science and software engineering, but it is one of the most robust, macroscopically well designed, and most productive languages for translating arbitrary abstract ideas into maintainable, production code. Even if it doesn't look initially very pretty in the eyes of a career Python programmer.


>In every single case, we had no additional trouble hiring a Lisp or Lisp-capable programmer as compared to hiring for any other language, including Python

Doesn't that imply an over-supply?

I did my grad research on a highly specialized topic, involving plenty of math/physics.

Then I worked in industry applying my skills for 4 years.

Then I switched to programming - nothing related to my engineering degrees. This job requires only a BS (and not really that - one of my coworkers has no degree).

The highly specialized job was the one where I had little leverage, lower pay, and a miserable experience. Why? Fewer jobs than supply.

Whereas for programming, the supply is much larger than for the specialized work, but the demand is even larger than the supply.

The geek in me yearns for a job where I can go back to numerical algorithms and physics. But frankly, those jobs tend to suck when you don't have much leverage.


Does it indicate over-supply? Maybe, but it's probably a bit more complicated than that. There is a very large supply of self-proclaimed Python programmers. (There's no official accreditation, of course, so we can only go by self-proclamation.) But this is unlike, say, the supply of raw materials for building a bridge. A lot---and by "a lot", I mean enough to be noticed when hiring---of the "Python programmer supply" is under-skilled. So, in reality, and limited to my own experience, the supply of skilled, knowledgeable Python programmers is much smaller than the total supply, so I'm not sure I'd label it "over-supply".

With that said, though, and especially with huge companies like Google using Python as one of their main languages, the supply of qualified Python programmers is no doubt larger than that of Lisp. But it's a fallacy to believe that the supply of qualified programmers correlates inversely with the time spent searching for them. It's sometimes easier to find needles that stand out in the haystack.

As for software jobs in the domain of math/physics/etc., they're out there, but just harder to seek out. The world hasn't run out of hard science problems to solve, and many companies would go to great lengths to hire brilliant individuals who have both scientific acumen and programming skills.


>As for software jobs in the domain of math/physics/etc., they're out there, but just harder to seek out.

I'm well aware of that. However, my experience is that the working conditions are poor. Some do have a higher salary, but everything else seemed worse.

Also, not entirely related, but it does suck more when you're working on a very challenging problem, and can't fully crack it on the company's desired schedule. You get negative feedback with veiled threats about losing your job. When you're in that situation, and realize that you can get a much easier job with similar pay, you have to ask yourself whether the risk of working a "science" job is worth the rewards.

It was all fun in grad school, where the deadlines were much looser and you're not worrying about saving for family, retirement, etc. But it really sucks if your livelihood is at stake merely because you wanted the thrill of working on a more challenging problem.


I've often thought it's probably like that. I'm an engineer solving complicated problems, but nothing cutting edge. Hours and pay are good.


I would guess, maybe, but if the companies hiring Lispers try to low-ball their pay and work experience too much, the Lispers will just move on to programming a different language somewhere else.

Yes, it might take a little time to get your GitHub up to speed with samples in a different language, and to find employers intelligent enough to see past "X years in Y language" requirements, but I think it will be an easier shift than the one from numerical algorithms and physics to programming.


If you have become a truly skilled industrial programmer, you can very easily go back. There are lots of interesting scientific programming jobs, but your typical Ph.D. or post-doc is only half-trained for them, at best.

There are fewer pure research positions, and unless you are very good there is a leverage problem there because every year dozens of new grads come looking for essentially that job. Something to bear in mind if your desire is primarily to work in matlab/python/R on things that look a bit like your thesis did.


Can you elaborate on the work you did - in particular what problems did you have to solve, and why Common Lisp was used instead of other languages?


Slime + emacs for your environment?


Yep, all the way. Slime + Emacs + Paredit.


Ah interesting. I thought paredit was standard in emacs.


You mention several benefits, how many of those do you feel apply to all "lispy" languages, and how many are specific to Common Lisp [in your opinion]?


I think it applies to Common Lisp specifically. I think languages like Scheme are just "not there" in terms of ecosystems and other things important to the professional programmer. I think debuggers, avenues for professional support, a unified standard, a variety of highly compliant open source implementations, and other things are necessary for any project a business should invest in. Not to mention some sort of package manager.

The one exception is maybe Clojure, but I find Common Lisp to be all around better at the problems I've worked on (namely ones where at some point or another, a native machine code compiler is required). Clojure's async/immutability story seems good for larger distributed systems, though, and interoperation with Java is invaluable to some projects. (Though Common Lisp does have an implementation atop the JVM, called Armed Bear Common Lisp. This is one of the many benefits of an official language standard.) But by using Clojure you are definitely locking yourself into a single implementation.


Are there resources you'd recommend for finding jobs in CL ?


The canonical site is https://lispjobs.wordpress.com/

Googling, LinkedIn searching helps.


+1


- sh (I really don't use bash features in scripting)

- awk

- perl

- sed, if that counts as language.

It's funny that a course I took in university almost 29 years ago, on the use of Unix scripting tools, is one of the most useful learning experiences in my daily work today - even though my job is not really programming. But very often I see colleagues (project managers, architects, etc.) struggle with processing information in ways that involve a lot of repetition and manual work. Scripting solves those things.

(For some things also Excel and VB perhaps counts as an "old" programming language; it is also often quite useful. And I probably will end up doing something in C again this year.)


Any suggestions for a bash scripting tutorial to help me better connect the dots? Over the years I've learned some nifty one-liners, but I'm not really familiar enough with the patterns and art of command-line thinking to solve my own problems in bash. Most of the bash resources I've found online are just loosely organized collections of magic spells. I'd really like to get better at it though.


The resource that has been instrumental for me in moving up from beginner-that-generally-can-work-out-things-with-a-lot-of-random-tries to someone who often has things right on the first try, and if not, has a clue about what is going on, has been the following book:

Classic Shell Scripting, by Arnold Robbins and Nelson H.F. Beebe

Recommend it to anyone with a basic understanding and a will to learn more. It also really doubles as a rich information resource. Cannot recommend it enough if books are your thing.


I would recommend two things: 1. Read the "bash" man page about once a year, and 2. Learn Perl or Python or something for the harder tasks.

It's helpful to learn some more about bash, and there's some good stuff in there. But shell scripts above a few dozen lines tops are a bad idea, and even those "few dozen" lines need a surprising amount of armoring with "set -e" and such.

I don't think this is because bash is bad. I've come to the conclusion that there's a fundamental tension between interactive use and safe programmatic use, and anything really good at the one is not going to be good at the other. In particular, interactive users more-or-less want the shell to accept inputs that are a bit sloppy, because they can interactively recover from the vast bulk of misinterpretations immediately, whereas a program becomes very risky with the same level of sloppiness because it will quite likely just keep going, doing very bad crazy stuff. (Which is why all my shell scripts tend to start with "set -e", which amounts to "If you get confused... STOP, instead of trashing an arbitrary amount of state.")
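A minimal sketch of that "if you get confused... STOP" behavior (the directory name is made up; plain POSIX sh assumed):

```shell
#!/bin/sh
# Both functions try to cd into a directory that does not exist and
# then carry on as if it worked; only the unarmored one keeps going.

run_without_e() {
  sh -c 'cd /no/such/dir 2>/dev/null; echo "still running: would trash state here"'
}

run_with_e() {
  sh -c 'set -e; cd /no/such/dir 2>/dev/null; echo "still running: would trash state here"'
}

run_without_e                                   # the failed cd is silently ignored
run_with_e || echo "stopped at the failed cd"   # set -e aborts before the echo
```

The second function never reaches its echo: `set -e` turns the failed `cd` into an immediate exit instead of letting the script keep going in the wrong directory.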

And again let me emphasize #1. By splitting my world up into interactive use and programmatic use, I've made my peace with shell, and I don't hate it or anything. I use a lot of the features interactively. For instance, I do a lot of little loops on the command line. Often I'll build them up with some "echo"s done to make sure that it's about to do what I think it will do. There are a lot of great ways to save time in interactive bash (or zsh or whatever), and the best part is, in interactive use you can be as crazy efficient as you want and there are no software engineering implications.

Writing bash scripts with every crazy sophisticated trick in the books starts raising software engineering issues fast, and worst of all, some of them are quite invisible ("what if you run this in a directory with a file with a space in it? a newline? a file called '-rf'?"), to say nothing of whether the next person working on the script will know what all the fancy operations do. And I've got a decent-sized collection of 5-line bash scripts lying around doing various things. I just don't let them get much longer.
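For anyone who hasn't been bitten yet, here's a throwaway sketch of two of those invisible issues, the space and the '-rf' filename (GNU rm assumed; all names invented):

```shell
#!/bin/sh
# Two classic shell-script landmines: filenames with spaces and
# filenames that look like options. Everything happens in a temp dir.
set -eu
dir=$(mktemp -d)
cd "$dir"
touch "my file.txt"
touch -- "-rf"

f="my file.txt"
rm $f 2>/dev/null || echo "unquoted: rm saw two args, 'my' and 'file.txt'"
rm -- "$f" && echo "quoted: removed it"

g="-rf"
rm "$g"                      # exits 0 but removed NOTHING: parsed as options
[ -e "./-rf" ] && echo "-rf still exists"
rm -- "$g" && echo "rm -- treated -rf as a filename"
```

The `rm "$g"` line is the nasty one: it succeeds silently while doing nothing, which is exactly the kind of "quite likely just keep going" failure described above.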

One of the most useful things to do is just load up an index in your head of what bash can do, then know you can look it up if you ever want it.

(Substitute your choice of shell for bash throughout.)


And read the bash manual at least ONCE. (It's not the man page.)


But the manpage is useful in itself already: I've been using a lot of

  ${VAR:INDEX}
  ${VAR/PATTERN}
  ${VAR#PATTERN}
etc. in the last few months. It covers a lot of the string-processing needs in shell scripts that you would otherwise have to invoke sed or awk for.
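For reference, a few of those expansions applied to a sample string (bash assumed; the variable and value are arbitrary):

```shell
#!/bin/bash
# Parameter expansions that replace many one-off sed/awk calls.
path="/var/log/app/server.log"

echo "${path:5}"          # substring from offset 5  -> log/app/server.log
echo "${path/log/LOG}"    # replace first match      -> /var/LOG/app/server.log
echo "${path#*/}"         # strip shortest prefix    -> var/log/app/server.log
echo "${path##*/}"        # strip longest prefix     -> server.log
echo "${path%.log}"       # strip suffix             -> /var/log/app/server
```

Note that `${path:5}` and `${path/...}` are bash extensions, while the `#`/`%` forms are plain POSIX sh.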


[Shameless plug] Not too long ago, I wrote a small book that tries to address those kinds of questions, i.e. how to use Bash to solve problems instead of 1-liners. You may find it useful: http://adventuresindatascience.com


The book looks good. Is there a place I can find the table of contents or reviews of your book?


Here's a table of contents: http://robertaboukhalil.com/book-toc/, hope it helps!

I don't have a formal place for reviews but I posted about it on HN previously and the response seemed positive: https://news.ycombinator.com/item?id=10112615.

Also, you can enter "hackernews" to get 25% off


The book starts with: "Following the 2008 US elections, all eyes turned to Nate Silver... "

And goes on from there ;)


Haha I may need to update the intro to reflect more recent events.


Yeah, I wasn't being critical; dunno why people are downvoting. I thought it was funny and interesting!


As always there's a relevant xkcd. https://xkcd.com/519/


Too true. When I was 13 or so, I read the DOS 5.0 manual from cover to cover. If I recall, a lot of it was about batch files (autoexec.bat and stuff). Batch was really my first "programming" experience.

Then when I was 22, after a good computer science education, my first job was working at EA, and the build system involved a bunch of batch files. That knowledge from nearly a decade before came in handy! The CS stuff didn't come up until later.


I studied English lol. My whole career is based on my weekends. My education helps me write eloquent email responses, though.


Eloquent email responses are not to be underestimated; being able to communicate in clear language is a valuable skill.


I find that clear notetaking is amazing in guiding discussions. Being the one who sends out the meeting summary with next actions and clarifying remarks or being able to refer back to meeting notes gives me an amazing power in making decisions or resolving disputes. Just because I'm the one who's writing stuff down, I'm somehow given the authority to say "this is what we decided at the meeting".


Absolutely! Can I ask if you are taking these notes the old-fashioned way with pen and paper, or are you using a program to do this?

I myself have found out that during meetings I prefer to use a pen and paper but it is mostly just notes that I take for myself.


It varies. I'm often tinkering, which is a problem, I know. I've used a notebook, a Surface pen and One Note, a text file, scribbles on the agenda for the meeting (if there was one). It depends if it's a teleconference or not, whether my laptop battery is charged, if I can find a nice pencil, etc. etc. as well.

The summaries always go out as emails, of course, and if I paper note-take, I try to take a picture and upload it to onenote or sharepoint.


We use a record feature in Zoom for our teleconferences that we want to save, but notes are always better because you can review conclusions without the noise of skipping through discussions. It'd be amazing if there was an application that recorded meetings then summarized them.


I am not who you were asking, but I exclusively use pen and paper during meetings. I try to keep my laptop closed and transcribe action items to a personal trello board after the meeting is done.

The Trello part is more recent, but I have notebooks going back to my first internship in 2012 (which I know isn't that long for most here). It's pretty interesting to see old notes I took and the problems I had at the time.


I've found that the more attention I pay to my notes, the better I recall the topic at hand even without referring to the notes I've taken.

As a result I exclusively use pen-and-paper for meetings. I also started using fountain pens about two years ago and the discipline required to write well, especially with vintage flexible nib pens, helps keep me focused on note-taking.


Touch-typing was probably the best thing that elementary school taught me :)


I'm not a native English speaker. I could throw in that the best thing regarding learning I did (in addition to buying a home computer) was to get J.R.R. Tolkien's The Hobbit (which came with the adventure game) which in turn led me to read LOTR and then other books in English, and generally become much more fluent.

Later on, wasting time reading NNTP news at the uni was also actually a good investment for learning the language in a way that is useful at work.

For my kids, the games have again been the same thing, along with Harry Potter books.


That's interesting. Whenever my dad wants to become more comfortable with a new language, he reads the translated editions of Harry Potter, too! He's read "Harry Potter and the Philosopher's Stone" in Dutch, French, Arabic, Mandarin, and Spanish, lol.


The mouse over text is even truer.


>And the ten minutes striking up a conversation with that strange kid in homeroom sometimes matters more than every other part of high school combined

Wow, that is pretty weird. That was true for me in college. I was at home after graduating in January 2002, at the bottom of the dot com bust. It was hard to find a job then.

I took this world music course in college, and I remember I talked to this guy who was also a computer science major, and he told me he was going to EA. Probably 6 months later, I e-mailed him out of the blue, and that got me the internship which led to my first job.

So instead of all the computer science stuff, I should have just relied on connections from the world music course and on my DOS batch file knowledge from age 13 :)


Very true with Perl, although for me it came a bit too late: Larry Wall published the first version of Perl around the time of that course, so it obviously wasn't in the content, and I did the stuff with sh, sed, and awk. Perl has unfortunately always been a bit of an afterthought for me (I have to look at the manual for some simple things where with awk I don't).


Seriously! I learned so much about scripting and networks in my high school programming classes because a group of us just wanted to play Quake 3 and CS 1.6 instead of doing assignments, and we needed to get around whatever software they had in place to monitor whether you were slacking off.


I'd add typewriting. Not having to look at the keyboard (including my Das Keyboard with no key markings) has made life sooooo much easier.

QBasic was nice, but nowhere near as all-encompassing as touch-typing.


I'm not proud of it (and I'm not going to share the code) but I wrote a CMS for a medium-traffic site entirely in bash and standard unix tools like awk/grep.

It should be an absolute clusterfuck but it's been surprisingly reliable and performant, to the point where I have no desire to replace it.


If you can't show the code, can you give any info about the overall design of it? I can't even think how you'd structure a "large" application in shell scripts.


It runs as a CGI, and uses files on disk instead of a database.


This is how many web apps in 1999 ran. As long as you treat your inputs carefully and properly handle file locking, there's nothing inherently bad about that approach.
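On the locking point, a sketch of how the append path of such a file-backed app might guard its flat files, assuming flock(1) from util-linux is available (the store path and helper name are invented):

```shell
#!/bin/bash
# Serialize writes to a flat-file "database" the way a file-backed
# CGI app must, so concurrent requests cannot interleave appends.
store=$(mktemp)

append_entry() {
    # Hold an exclusive lock on fd 9 for the duration of the write.
    {
        flock -x 9
        printf '%s\n' "$1" >&9
    } 9>>"$store"
}

append_entry "first post"
append_entry "second post"
cat "$store"
```

Each CGI process opens the same file on fd 9 and blocks in `flock` until it owns the lock, so appends stay whole even under concurrent requests.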


A lot of cron


SQL. If I had to pick a language that I'll use my entire career that's the one. It has been overused and underused, but always relevant.


I love that I'm at a company where I haven't had to use SQL. I don't know if I could even find a relational database here.

But in a classic meme sense: "It isn't you I hate, SQL, it's what I have to think about when I use you." I don't want to spend my time thinking about database indexes, query plans, or other complicated bits. I just want to get my data out.

Sometimes I struggle, and I'd give anything for a plain inner join, but on the whole I like this tradeoff.


I still teach SQL, via SQLite, as a first language to students, even though in practice I do 90% of my data analysis through the command line or Pandas. But SQL is very direct and clear as a language, and the overhead of SQLite is...well, light. Making an index is just one more line of code that for many datasets, you run once and never touch again.

What do you find simpler in terms of data querying and management?


I love SQL myself. I'm not sure why I love it so much and other people hate it. Sometimes I think that coming up with reasons is just after-the-fact justification (do I really like it because it has a solid mathematical foundation in the relational calculus, or does that just sound like a good reason to say I like it?).

Pandas is an excellent contribution to data analysis, but I gotta tell you, I was absolutely delighted to learn about pandasql and realize I could write selects, group by, order by, and so forth, converting data frame to data frame through sql.

What can I say? I truly encourage people to give SQL a real chance. I'm happy to use ORMs that save me some typing on simple queries, and certainly I use a combination of pandas and SQL; there are some things much better accomplished through a programming language. But I'm willing to drop to SQL pretty quickly if I sense that the programming language or library is really just forcing me to re-learn a new implementation of SQL.


I agree; I was always so-so about SQL until I had to use a lot of it. The syntax isn't amazing, but the relational model works really well once you can think in sets. The declarative nature of it means that if I push stuff from the app layer to the database layer it's usually less buggy and has better performance.


I used to dislike it myself, and in the context of building within a framework like Rails, I'll defer to an ORM. But my appreciation for SQL deepened as I became a more experienced programmer and began valuing explicitness in code. The reason I do most data work in Python is that most of my work is exploratory, and I need a language that is well-suited for gluing together services and non-analysis-specific functions. But when it comes to doing work in which time and consistency and portability are key, SQL is my choice for structured data.


If you start to think in a set orientation, SQL makes a lot more sense. If you try to think procedurally, you will struggle with SQL.


That's interesting. I didn't learn SQL until taking a course on it in grad school. I had some programming background, mainly from math classes and a bit of CS.

The concept of a query really seemed odd; it took some mental adjusting. My programming background was largely in writing procedural code. My brain really wanted to think in terms of methods, functions, inputs, and return values.

It's been so long (18 years since that course) that it is difficult to remember the mindset I was in. It all seems pretty natural now. But you are absolutely right: you do need to start thinking in terms of sets and operations on sets, not in terms of methods with inputs, procedures, and return values.


I was in a similar boat. I studied computer engineering in school but didn't take any database course. My first encounter with SQL was years later as an education reporter. Someone in the newsroom threw out an "Access 97 for Dummies" and I picked it up and realized it was just what I needed to do the kind of analysis I wanted on California schools and test scores (CA.gov is quite good about posting raw data [0]).

I eventually moved from Access to MySQL/SQLite, but if it weren't for Access's interactive query designer, where the tables are presented as lists of columns and you draw/drag lines to declare a JOIN between columns [1], I honestly don't think I would have ever grokked SQL.

I actually think that without SQL, I would have never understood the power and purpose of a JOIN, even though that concept of comparing lists is the foundation of most investigative research. So I justify forcing SQL on to journalism students not merely because it's a useful skill, but because it presents the best vocabulary and grammar for describing the idea of joins.
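For readers who haven't seen the "comparing two lists" idea expressed in SQL, a toy sketch (invented tables and data, run through the sqlite3 command-line shell):

```shell
#!/bin/sh
# A JOIN lines up each row of one list with its match in another.
out=$(sqlite3 :memory: <<'SQL'
CREATE TABLE schools (id INTEGER, name TEXT);
CREATE TABLE scores  (school_id INTEGER, year INTEGER, score REAL);
INSERT INTO schools VALUES (1, 'Lincoln High'), (2, 'Valley Prep');
INSERT INTO scores  VALUES (1, 2016, 81.5), (2, 2016, 74.0);

-- pair each score with the school it belongs to
SELECT s.name, sc.score
FROM schools s
JOIN scores sc ON sc.school_id = s.id
ORDER BY sc.score DESC;
SQL
)
echo "$out"
```

The `ON sc.school_id = s.id` clause is the whole trick: it states which columns the two lists are being compared on.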

Also, SQL's general lack of data-munging functions or conveniences really drives home that data is just characters and numbers, and doesn't magically have meaning or come into existence. That is an extremely difficult concept to convey with Excel -- e.g. how it automagically converts `9-11` into `11-Sep` and confers a datetime context onto what was just plaintext.

[0] http://www.cde.ca.gov/ds/dd/

[1] http://www.opengatesw.net/ms-access-tutorials/Access-Article...


Couldn't agree more. I often see this pattern in programmers: they try to think in cursors instead of sets and end up really fighting the instrument, not using it as intended.


I kind of implied this rather than outright stating it, but I like SQL. The concepts are elegant, and the language is a little kludgy but not obscene. I can never really remember the semantics of left, right, outer, and inner joins, but I usually don't want missing data, so inner all the way.

It's not SQL I dislike, it's that it means I'm using a database, and I've lost more of my life than I care to admit to in planning, designing, and maintaining databases so that they perform under production pressures.


The functionality of SQL is great, but I really wish it looked more like other programming languages. I wonder what SQL would look like if someone designed it today.

It also should integrate more easily with other languages. Using SQL from C# or C++ is a real pain, with tons of casts and string manipulation. Better debugging would also be nice.


> The functionality of SQL is great, but I really wish it looked more like other programming languages.

I think it looks a lot like COBOL :)

Which isn't surprising given it was designed by IBM in the 1970s.


Considering that SQL describes the result and not how to get it (that's the job of the query planner, which is effectively the SQL compiler), maybe it would look like Prolog. More about this at http://stackoverflow.com/questions/2117651/comparing-sql-and...

A SQL-like language in JavaScript is the Mongo shell, which is much more complex than SQL. Every time I use it I wish they had adapted SQL to Mongo instead of writing their own thing. I usually end up writing complex queries in Ruby with Mongoid. They're more compact and easier to understand.

About ActiveRecord/AREL vs SQL, they're more or less the same. Sometimes a query is easier to understand in Ruby, sometimes in SQL. Complex queries are much easier to read in SQL or outright impossible to code in Ruby.

If I could change something about SQL, I'd change all the arbitrary weirdnesses of the data definition language and the non-orthogonal stuff, like HAVING, which is a WHERE in a different context (aggregations vs. rows).
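A small sketch of that WHERE/HAVING distinction (invented data, sqlite3 command-line shell assumed):

```shell
#!/bin/sh
# WHERE filters rows before grouping; HAVING filters the groups
# produced by GROUP BY. Same idea, different stage of the pipeline.
out=$(sqlite3 :memory: <<'SQL'
CREATE TABLE orders (customer TEXT, amount REAL);
INSERT INTO orders VALUES ('ann', 10), ('ann', 30), ('bob', 5), ('bob', 2);

SELECT customer, SUM(amount) AS total
FROM orders
WHERE amount > 4           -- row-level filter: drops bob's 2
GROUP BY customer
HAVING SUM(amount) > 20;   -- group-level filter: drops bob entirely
SQL
)
echo "$out"
```

Only ann survives both filters; swapping the two clauses is a syntax error precisely because they run at different stages.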


Wow. I'm surprised at the negative thoughts about SQL. I didn't realize I was an outlier.

I'm grateful that it doesn't look like other languages. The format it uses is really easy to generate using other languages, which makes it really useful in a lot of what I do.


I found that SQL is actually an excellent language for people with no coding background. Especially in the BI field it's easy to teach people SQL so that they can perform their own (often quite powerful) queries. It's much easier than learning a proper programming language.

A re-design might be nice for programmers, but would likely lock out most people with no programming backgrounds who work on databases.


Maybe simple queries like "SELECT ... WHERE" are intuitive for non-programmers, but once you start joining tables things get pretty ugly. Also, the syntax for UPDATE is totally different from that of INSERT, although they do similar things. Why does UPDATE use key-value pairs while INSERT uses a list of column names followed by a list of values?
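The asymmetry in one screenful (sqlite3 command-line shell assumed; table and values invented):

```shell
#!/bin/sh
# INSERT separates the column list from the value list, which is
# easy to misalign; UPDATE pairs each column with its value.
out=$(sqlite3 :memory: <<'SQL'
CREATE TABLE users (id INTEGER, name TEXT, email TEXT);

-- columns in one list, values in another
INSERT INTO users (id, name, email) VALUES (1, 'ada', 'ada@example.com');

-- key = value pairs, each column next to its value
UPDATE users SET name = 'Ada', email = 'ada@example.org' WHERE id = 1;

SELECT name, email FROM users;
SQL
)
echo "$out"
```

With a dozen columns, the INSERT form is where switched-column bugs hide; the UPDATE form makes that mistake nearly impossible.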

Since Javascript is the best programming language ever (so I hear :-) ) maybe using Javascript syntax and objects would be nice.


I absolutely agree that INSERT statements should use key value pairs. It would eliminate many errors where columns were switched accidentally.


Fellow teacher here: what methods (tools) do you use to teach SQL via SQLite?


I use DB Browser for SQLite as the GUI: http://sqlitebrowser.org/

Like the Firefox SQLite Manager plugin, it's cross-platform and free and fairly easy for anyone to install. DB Browser's interface is substantially cleaner than the Firefox plugin's, though, and the devs are responsive on Github.

DB Browser is great, all things considered, but I've never found a GUI that didn't overcomplicate the kinds of simple interactions I want when teaching databases (i.e. 99% of what I teach are SELECT queries). It also makes a few concessions in order to appeal to the Excel crowd. For example, not committing writes to the database until you do a Cmd-S...which contradicts what I tell my students about how SQL gives you more responsibility through explicit and consistent behavior (e.g. when you delete a row, it's gone forever...so don't ever delete things unless you really need to). On the other hand, some of the concessions are nice, such as the ability to rename columns via the GUI, which is not part of the SQLite spec.

So I have contemplated teaching SQL from the command line. But I've had a hard time imagining how the steep learning curve could justify the potential advantages for the average novice.

Here's a page I wrote up for students on how to install SQLite browser: http://2015.padjo.org/tutorials/sql-admin/getting-started-wi...

For comparison, here's my instructions for FF's SQLite Manager Plugin: http://fall2014.padjo.org/tutorials/databases/getting-around...

(that page also includes how to work with Sequel Pro and MySQL; I soon realized that giving students two choices for learning SQL was a very bad idea and have since stuck to SQLite)

Here's a take-home midterm: http://2015.padjo.org/assignments/midterm-wsj-medicare-walkt...

I've found it better to provide pre-packaged databases for students and basically ignore the administrative part of SQL and databases -- e.g. importing of data, defining a schema, creating indexes, etc. -- until we've learned all we can about working with SELECT and JOIN: http://2016.padjo.org/tutorials/sqlite-data-starterpacks/


Hard to imagine what it's like to work at a company that has no database. But I do take issue with that:

> It isn't you I hate, SQL, it's what I have to think about when I use you.

Do you live at the fringe of performance like that? Nobody spends much time thinking about those things, and the few cases that do require it would require thinking about much more complicated things if you weren't using a database.


It's not so much existing at the fringe of performance, but there have been weeks of my life lost to needing to tune a query or an index. Sure, some of that time is actually building the index I just added. But I don't want to have to figure out a query plan.

If I'm building a simple CRUD app, then I don't need the complexity that SQL provides. If I'm building something bigger, then I probably need to think about the query plans.

I would have loved a tool that I could hand a proposed schema, estimates of the data, and some SQL I want to run that would come back and tell me "That's stupid, because you're doing a join on computed data; add this index, denormalize this field, and remove this clause from your query." but I'm also realistic enough to know that such a tool isn't plausible.


MSSQL has one:

https://technet.microsoft.com/en-us/library/ms176005(v=sql.1...

Postgres has explain and some third parties make query tuners.

Oracle has a host of third-party tuners, SolarWinds being the first that comes to mind.

I don't know if any of these can work off a proposed schema, but you could create a dummy database, run a script that exercises your insert/read routines against it to simulate transactions, and capture that in these tools; they will explain the recommended adjustments. Some of them, like the MSSQL query tuner, will export a script that can make the adjustments.

The biggest thing for DB performance for most CRUD applications is to ensure you eliminate table spools on joins. If you find all of those you are ahead of the game in most instances.
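Table spools are SQL Server vocabulary, but the underlying check (did the join get an index, or is it re-scanning a whole table?) can be sketched with any engine; here it is via sqlite3's EXPLAIN QUERY PLAN, with an invented schema:

```shell
#!/bin/sh
# Show the plan for the same join before and after indexing the
# join column. The exact wording varies by sqlite3 version, but the
# second plan switches from a full SCAN to an indexed SEARCH.
plans=$(sqlite3 :memory: <<'SQL'
CREATE TABLE posts    (id INTEGER, title TEXT);
CREATE TABLE comments (post_id INTEGER, body TEXT);

-- no index: both tables are fully scanned
EXPLAIN QUERY PLAN
SELECT * FROM posts p JOIN comments c ON c.post_id = p.id;

CREATE INDEX idx_comments_post ON comments (post_id);

-- with the index: the inner table becomes an indexed lookup
EXPLAIN QUERY PLAN
SELECT * FROM posts p JOIN comments c ON c.post_id = p.id;
SQL
)
echo "$plans"
```

The Postgres equivalent is EXPLAIN, and the MSSQL tuners mentioned above are automating essentially this comparison.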


The worst of my time was using Oracle, around 2008. Back then SolarWinds was just a consulting company. And I don't really do much relational work anymore. A lot of my stuff ends up being documents with simple keys organized in hierarchical ways, and the few relational things I do are easily handled with that model, and wouldn't justify an actual database.

Why should I have to know what a table spool is and why they're bad?


Dunno, sounds easier than chess.


Surely there's a relational DB somewhere...perhaps indirectly via Saas? Maybe payroll, company books, timesheets, or some other software common to most businesses.


Oh, I'm sure there is. I'm also sure I have zero idea how to access it, request one, or who to even start talking to about it.


Or... in your smartphone maybe (your iPhone / android address book is stored in sqlite iirc)


I'll agree with you there. SQL is the one language that I hate, yet use constantly, and will certainly out-live me.


I used to think I hated SQL until I decided I had had enough of it and I tried to do a project using a no-sql database. Suddenly everything became a lot harder and I was having to write massive amounts of code to do the things that could be done with a few lines of SQL. Even the simplest of tasks just sucked the life out of me.

I soon ported the project over to use Postgres and decided that actually SQL is awesome.


I know exactly what you mean. I've realised that when you're solving a hard problem (complex data logic, extreme performance requirements) with SQL, you end up systematically experimenting and thinking very deeply within a rigidly defined system. With no-sql databases, you're monkeying with obscure APIs that cover up obscure, idiosyncratic, and hard-to-reason about database engines, and patching strange behavior with rickety app-layer code.

Both kinds of work are hard, but with SQL you can usually expect the problem to yield to intense engagement and produce a solution that's complete, robust, and satisfying.


Then you learn about hstore and json store in postgresql and you realize it's the best of both worlds.


My workflow with complex queries, despite the fact that the outcome is always the same: write a SQL query to see if it works as supposed; use a query builder in another language; analyze the code and the query because it's not as fast as I would like; realize the code in the other language would be faster as a stored procedure in SQL; write the stored procedure; repeat.


Yep, no getting away from SQL any time soon.


Lots and lots of C. (I'm mostly an embedded C programmer these days). Some C++ for a bit of higher-level "business logic" that benefits from a class-based approach, but a fairly limited subset of "modern" C++.


Coming from the HFT side, I find C++ surpasses C in a lot of ways for optimization work. Mainly you can use integer template arguments and generic functions to abstract all the boilerplate in a way that is safer than C macros.

For a semi-contrived example, instead of writing a do4Things() and do8Things() to unwind some loops, I can write template<int> doThings() where the int argument is the bound on the loop.

And having things like a universal template<typename Whatever> toString() that operates on enum classes is nice.

The downside is that it's horribly easy to invoke allocations and copy constructors by forgetting an ampersand somewhere, and the std library isn't well suited to avoiding that behavior either. You have to be vigilant on your timings and occasionally callgrind the whole thing.

The other downside is that your colleagues are more likely to "knit a castle" with ridiculous class hierarchies or over-generalization. ( https://www.infoq.com/presentations/Simple-Made-Easy )


Yeah, the nice thing about C++ is that you can generally hide highly optimized portions of code behind nice templates or class interfaces. And with templates you can write libraries that do a lot of logic at compile time, inline a bunch of stuff, and avoid resorting to virtual methods.

But when it comes to using things like custom allocators, etc. it's a nightmare. Or a lot of the compile time "traits".


I have a friend who makes a living writing CUDA kernels as C++ templates. His job will be safe for decades to come because no one will be able to decipher the code. :)


C++ has this weird evolution curve: as you start using features, things improve up to a point, then go downhill, but then go uphill again with the new modern features.

If you limit yourself to using the STL with C++11 stuff, but don't go crazy with inheritance and templating in your classes, it can be nice.


Serious question: How does one get into this C embedded stuff?

I'm addicted to low-level "embedded" type work. But all I hear back from are Django, RoR, $HOT_JS_FRAMEWORK, DevOps teams/recruiters.

Is there some buzzword or special topic to know and do a project in? Any suggestions for projects to display on GitHub, topics to know through and through (domain specific knowledge)?

I sit at work thinking all day about Raspberry Pi sensor projects sitting on my workbench collecting dust due to job+young kids.


Search for companies that need to design sophisticated hardware at very low cost. This pretty much guarantees they need to hack microcontrollers, which are usually programmed in C. One I've worked with in my past life as an embedded developer is TI MSP430 (costs pennies), but there's many others.

Another route to code in C close to the hardware is to find work writing Linux kernel device drivers. Of course, you have to know the Linux kernel fairly well in addition to being a good C programmer to land that sort of job. Companies that need this skill include server manufacturers that design their own network interfaces and such, and perhaps phone manufacturers, or companies that design their own SoC's (Qualcomm, TI, etc). Maybe some IoT startups need this skill as well.

Yet another way to code in C is to work in a place that needs to program DSPs. Usually, this also requires knowledge in video/audio engineering. Companies that need this skill design and sell devices that do some sort of video / audio encoding (like video conferencing devices, etc).

I'm sure there's many others I'm not thinking of. I hope this helps...


+1 for device driver writing. That's how I got my start over a decade ago.


Look by industry. In the financial sector, knowing VHDL/Verilog is becoming popular (given the rise of FPGAs everywhere). In sectors like robotics, defense, aerospace, etc. you'll find a lot of traditional embedded roles.

These aren't "sexy" areas so don't expect to get recruited. Like, mainframe programmers, these positions are in demand but just aren't widely advertised at all.


Being familiar with RTOS's is certainly useful, since there's a decent chance that your future automotive/aero/industrial job uses one.

For a reference of stuff to learn about, go grab the manuals (usually massive) for a recent embedded CPU and see what common topics are foreign to you. Just reading the TOC should prompt some ideas. You may not be applying for one of the lower level "board bring-up" type jobs, but that team will be delivering you libraries that assumes familiarity with IRQs, timers, memory paging, processor modes, DMA, lots of I/O types, etc.


I would get a Teensy 3.2 and start programming bare metal with Arduino. You could also start with Arduino on AVR, but the Teensy gives you a lot more headroom and performance and a very small physical size for very small $.

Arduino is just g++ under the covers, and you can use as much or as little C++ as you want. You can stick to straight C if you like just by not using any C++ features.


Lots and lots of C for me too. A huge pile of legacy business software which is still being extended to do new stuff.


My only real gripe with C is that you have to write your own functions/macros (or use someone else's) to make dealing with strings intuitive. The stdlib is perfectly fine from an algorithmic standpoint, but I always have to look up every string function's manpage, every single time I use them.



Plus if you don't need inheritance, you can just use pointers to structs and pass them to functions. And if you need simple polymorphism in some small places, you can use tagged unions.


This viewpoint is exactly why I like Rust. It has syntactic sugar to make things look like traditional objects, but everything is really a bundle of data with a set of associated static functions. (v-tables only appear with trait objects AFAIK, which are uncommon compared to templates.) And tagged unions are built in, with pattern-matching!


Actual question from a software engineering newcomer: What would you consider "business logic" and where would said programs be running?


Well, I was using the term a little sarcastically since it is (I think) usually applied to multi-tier web service architectures.

In my case I have an embedded system with a touch screen display, running a bunch of tasks to drive peripherals. There is a task that handles communication with the touch screen (serial). Then there is a set of slow tasks that handles states and modes, for example the runtime representation of a modulator or an RF amplifier. This got extended because we are adding "remote control" support over RS-232 and USB and so this runtime representation has to support more of a multiple model-view-controller. This can be done with queues and a lot of functions but it was simpler to represent it as a tree of stateful values that can be updated via remote commands or via the touch screen and have "downstream" updates. So that's what I mean by the "business logic" -- higher-level than, say, a driver and task for talking to a digital potentiometer or a DAC.


Tcl. It works really well as a cross-platform (*nix/Windows) alternative to shell scripting that doesn't suffer from the same problems as the POSIX shell (see http://www.dwheeler.com/essays/fixing-unix-linux-filenames.h...). It builds on the same foundation of everything being text but with a few changes to the substitution rules builds something a lot like a Lisp on top of it.


I found expect (which is Tcl + some extensions, I believe) to be quite useful.

Implementing something like modem initialization - "Send AT command to modem/wait for a reply matching one of several patterns or a timeout/handle result/rinse, lather, repeat until done" - is quite trivial to do with expect.

It's not something I use every day, but it's one of those useful things to keep in your mental toolbox.


Came here to say this. Expect was and remains extremely powerful.

I've been using it for 15 years to automate network configuration deployment. Now that automation is all the rage in the industry it gets dismissed, but I find it is often much better than some of the other tools available.


Can it integrate with Unix command-line filters and tools roughly as well as shell can? Interested.


That depends upon your definition of "integrate".

It can read/write stdin/stdout (just as any scripting language can) so it can be used to write additional filters/tools that don't exist in the standard set that can then be used from shell as with any other tool.

But if you mean using it to create a pipeline out of existing command line tools, then yes, but with a bit more boilerplate than plain shell needs. The built in "exec" command has an almost shell syntax for starting pipelines, and the "open" command supports the same syntax, so it is 98% "as well" in that regard. But to avoid getting stuck in a deadlock trying to write data to a pipeline that is returning filtered data to the script requires turning on Tcl's event driven file IO, which is where the extra boilerplate comes from.


>That depends upon your definition of "integrate".

I meant roughly all that can be done with shell, except maybe somewhat differently (due to Tcl's different syntax or features). The usual things: call existing command-line tools, write scripts in which you use its own syntax/features as well as call / pipe between / orchestrate command line tools, redirect std I/O, etc. - the things that make the shell powerful.

Thanks for the reply. Seems like it is close to equivalent from what you say.


Uh.... yes. At least Tcl can; I haven't used expect in years.

The subject is "tcl pipes".


Did a search for that subject:

https://www.google.com/search?q=tcl+pipes

and this second result was interesting:

http://wiki.tcl.tk/17419

It compares doing the same task (getting a part of a string from the last line of a text file) in Tcl and Python. Seems to conclude that some things are easier in Tcl and some in Python (based on a quick read).


Ditto. Its original purpose as an embedded scripting language for C programs still stands.


Most of my work is in C++, which is almost 35 years old. By far the best choice of language for cross-platform, client application development, especially C++11/C++14, which bring modern language features (e.g. lambdas, futures, etc) with all the original benefits of the language.


Hands down the best thing about C++ for me is that with modern C++ you can write relatively safe clean code while taking advantage of decades of robust libraries.

Yeah, Rust is a big step forward in some respects, but it'll be almost a decade before we see it catch up to C++'s incredible ecosystem. C++ is almost the lingua franca of finance, for instance.


Rust can already take advantage of the C ecosystem quite easily. Rust/C++ usability is a major goal for 2017. I'm hoping that by year end you can have the best of both worlds. The advantages of Rust for new code and the mountain of code that exists for C and for C++.


Well, the C ecosystem has always been easy. The nice thing about C is that generating bindings is a breeze.

I've never been a fan of bindings to C++ libraries. They almost always feel cumbersome and hardly idiomatic. Like it or not, the best way forward is to bite the bullet and slowly natively reimplement the libraries we need in Rust.

I'm starting some work next year on writing some of the basic libraries that someone in the electronic trading world might want in order to start writing applications in Rust (a FIX/FAST engine and a port of the Aeron message transport). I'm also already writing a native Rust port of our internal consolidated market data library. We already have a Go port along with the C++ code and plan on open sourcing these pretty soon. This stuff tends to be highly proprietary and slow to develop, so hopefully we can get some of the more cutting-edge users to pitch in and get more people using the language.


C++ FFI will likely be difficult: unlike C, C++ lacks a stable ABI, since the standard doesn't define one and leaves it up to compilers to implement.


I used to write in a pre-11 style. When I got my first job where everyone was using all the hot new features, my mind was blown and my life quality doubled.


I love that C++ is my primary language right now, after years of being stuck doing Java. I really enjoy working in C++11. With CLion it's super awesome.


CLion really has made C++ fun again. Yeah, there are some bugs you run into here and there with the static analyzer but generally it's fast, really intuitive, and nice to work with.


Too old for any one language to have lasted my entire career. But:

C for (embedded(embedded))* audio DSP engine, with hardware specific intrinsics and occasional drops into inline ASM when those fail. When every cycle counts and Hard Real Time contracts must be met, to me it is anathema for any invisible compiler-generated code to exist - such breaks the required determinism. So no C++. Would write in pure ASM in the old days. Still want to - most times the C preamble/postamble is not needed in event handler functions. But TPTB dictate otherwise. Small talent pool and they want Jr progs to take it over.

FORTH as the most useful interactive hardware/low level software debugging tool for behavioral debugging (vs post mortem breakpoint debugging, which loses the behavioral context). Used to use hardware logic analysers for this but system complexity out-grew them. I should note I am not using a commercial FORTH, rather a freeware core greatly extended and modified by me for this purpose. Used it in several projects with different CPUs and architectures over the years. Can feed it C headers for readable dumps too (translator written in PERL). Key point: does this while the target continues executing. Caveat: do NOT use unless you know and understand every line of code in the engine and its implications for the target system. Given that, it's the sharpest double-edged handle-less debugging tool I've ever found.

* the DSP engine is embedded inside a larger embedded system. All inside the same chip. Makes for interesting times when one cannot touch it directly in main OS land.


Pascal/Delphi

Before there was an internet of lightbulbs and smoke detectors, there was an internet of industrial process things.

And they used windows 2000 and Delphi. They still do.


Delphi/Pascal ran the call center telephony system at a company I worked at, and it ran it well. It was very performant and stable, and ran things for around 14 years.

Spent a while writing and modifying Delphi apps there, I really enjoyed it. I'm actually surprised it isn't used more today, it was definitely "rapid development" at the time.

One thing it was great for was making small, dependency free executables. Similar to today's concept of "Microservices" we often wrote small executable services and applications for tasks and it was very straightforward and simple. You could whip something up in a couple hours that accomplished a lot. Push the .exe file somewhere, add it into the process and forget about it.

Not surprised people are still out there using this stuff.


> I'm actually surprised it isn't used more today,

I'm not at all. Embarcadero has made Delphi a language that is out of reach of the beginner, as their Delphi has been ridiculously priced. Until recently they didn't have a free entry-level Delphi product; it was $199 and was crippled. Now they at least have a free version, but the cost has prevented me several times from actually learning Delphi. There is FreePascal, but Lazarus just seemed clunky, and I don't think it has as many bells and whistles as Delphi has.


>I'm not at all. Embarcadero has made Delphi a language that is out of reach of the beginner as their Delphi has been ridiculously priced.

Where I live, piracy is casual. People get software for free, and they stopped using Delphi because of Embarcadero. When people don't even deign to pirate your software, something's off.

I think there are many who are still on Delphi 6 which is what I used when I was younger. It was simple, effective, tight. Then that Embarcadero thing came out.


I just think their marketing sucks and their IDE is nowhere near Visual Studio, but the price points are nearly the same. Sure, you get the VCL and other proprietary things (such as their FireMonkey app platform, which deploys to Android and iOS), but the cross-platform tooling for .NET has gotten so much better that C# makes more sense for many LOB applications than Delphi does.


My company has a multi-million-LOC application directed at medium to large healthcare providers written in FreePascal. I have worked with a lot of different codebases, and this is by far the best one. I would like to say it is because I have been the code dictator since its inception, but it is probably because things are quite easy to structure using Pascal.


I am working on a 20-year-old codebase, an accounting, warehouse management, time tracking, invoicing and whatnot monster that powers most of the roofing companies in Austria. It is still Delphi 5 + Paradox, developed on Windows XP (I run it from a VM on a Mac), but it works on modern Windows as well. Networking is a bit of a problem, but nothing we can't deal with.

On the other side, I don't think that Delphi/Pascal is dead, and I am starting a new project this month: control software for laboratory equipment. I can't find any other tool that will let me create multiplatform desktop software that interfaces directly with hardware. This new software will be written in Lazarus instead of Delphi.


Unsurprisingly, I'd much recommend getting rid of Paradox. For my projects (several 100KLOCs, most of it DB-related) I relatively easily migrated to Firebird with IBObjects (http://www.ibobjects.com – not affiliated), both of which have been a major success.


Migrating the DB has been on my list for all 9 years I've been working on the project, but I still haven't found a way to do it. The core of the problem is the stupid idea that Paradox tables are actually files, so there are tons of copying and moving files around instead of using Paradox as a DB.

Given the complexity of the project and our budget, it will have to stay.


You make me want see it and really not want to see it at the same time. :)


Don't get me started on the custom file-locking mechanism and the ways it modifies the file system on the remote server.

To be honest, I am not sure if the founding developer was a crazy genius or just crazy.


"Before there was an internet of lightbulbs and smoke detectors"

Hahaha. That was good.

Delphi is used heavily at my company to process millions of transactions a day, in the credit card industry.

I myself am a C# developer though. But I respect Delphi's performance and relatively clean syntax.


That's funny because Anders Hejlsberg was the chief architect of both Delphi and C#.


Hence why I consider it a big error to have created the CLR in the first place.

Now we kind of have .NET Native, but it still isn't 100% done.


Sometimes I fantasise about what would've happened if MS had decided, by some fluke, to push, say, OCaml as their big .Net initiative. Can you imagine MS Visual OCaml#? OCaml for Windows app development ... that would've turned some heads.


You mean instead of C#? They're pushing F# a bit now.


Exactly, instead of creating a Java clone, what if they'd decided to use a well-understood existing functional language with an elegant module and object system from the very beginning....


F# is a great project and we have a passionate community behind it, certainly a good thing to try if you're into functional programming


I'm getting into Elixir for my functional programming fix. I had looked into F# awhile ago but it wasn't for me at the time


Same here but MPW Pascal with 68K Asm.


What, really? In 2017? Explain, please!

You're deploying on 68K Macs?



Ah, interesting. I used MPW a bit, but far more THINK C and CodeWarrior "back in the day." It is interesting to me that the MPW toolchain would still be useful.


Fascinating! And you keep old Macs around to run the toolchain on?


Been using emulators for years - first Classic (or whatever it was called) on OS X, then BasiliskII/SheepShaver, and now mpw:

https://github.com/ksherlock/mpw

Here's the end of a Makefile, very unix-like:

  APPL = ${TYPE}${NAME}
  DATA = DATA${NAME}
  OBJS = ${NAME}.p.o
  LIBS = ../WordSum.asm.o ../LASysLib.a.o ../PasLib.o
  
  LANG = C
  
  MPWC = /usr/local/bin/mpw
  
  AS = Asm
  AFLAGS = 
  
  CC = SC
  CFLAGS = -mc68020 -mc68881 -b2 -Opt all 
  
  PC = Pascal
  PFLAGS = -mc68020 -mc68881
  
  LD = Link
  LDFLAGS = -w -t LApp -c MPWX -m MAIN -sg ${NAME}
  #    Link -w -t TEXT -c RWG1 -rt CODE -m HELPLOOP -sg HELPLOOP
  
  CFMFLAGS =
  # CFM = ${MPW:/mpw=/cfm}
  # CFM = /usr/local/bin/cfm
  CFM = cfm
  
  # TFTPFLAGS = -vt
  TFTPFLAGS =
  # TFTP = ${MPW:/mpw=/tftptool}
  # TFTP = /usr/local/bin/tftptool
  TFTP = tftptool
  
  .SUFFIXES : .p.o .p .c.o .c .asm.o .asm
  
  .phoney : all clean install
  
  all : ${APPL} ${DATA}
  
  .asm.asm.o :
  	${MPWC} $(MPWFLAGS) $(AS) $(AFLAGS) $<
  
  .c.c.o :
  	${MPWC} $(MPWFLAGS) $(CC) $(CFLAGS) $<
  
  .p.p.o :
  	${MPWC} $(MPWFLAGS) $(PC) $(PFLAGS) -r $<
  
  ${NAME} : ${OBJS} ${LIBS}
  	$(MPWC) $(MPWFLAGS) $(LD) $(LDFLAGS) ${OBJS} ${LIBS} \
  	  -o $@
  
  ${APPL} : ${NAME}
  	$(CFM) $(CFMFLAGS) $< $@
  
  mult.asm : mult.py
  	./mult.py >mult.asm
  
  mult : mult.asm.o
  	$(MPWC) $(MPWFLAGS) $(LD) $(LDFLAGS) $< -o $@
  
  ${DATA} : mult
  	$(CFM) $(CFMFLAGS) $< $@
  
  # send DATAFILE first
  install : ${APPL} ${DATA}
  	$(TFTP) $(TFTPFLAGS) ${NODE} ${DATA}
  	$(TFTP) $(TFTPFLAGS) ${NODE} $<
  
  clean :
  	$(RM) -f ${TYPE}${NAME} ${NAME} ${OBJS}


I learned to program with Delphi 2.5 years ago, and since then I've been actively working with it.

My only problem is that I have to maintain legacy codebases without OOP code.


One of the companies that works with us in health care projects do all their Windows products in Delphi and AFAIK they don't plan to change anytime soon.


>And they used windows 2000 and Delphi. They still do.

Ha ha, good one.

There was this interesting thread on Delphi a while ago on HN:

Delphi – why won't it die? (2013) (stevepeacocke.blogspot.com)

https://news.ycombinator.com/item?id=7613543


I like Delphi. I used to play a lot with both Delphi 7 and VB 6 back when I was in high school. The nice thing about both of them was that they made GUI programming really easy, but Pascal was just a nicer language.


There are still a bunch of modern Windows desktop applications written in Delphi and C++Builder – I work on one of them. ^_^

I really like Delphi actually. It's a shame Embarcadero doesn't really seem to care about it.


Delphi and C++ Builder's downfall sadly started when Borland got greedy and started to scare customers with their lack of business focus, especially the spin-off.


I think their biggest mistake was trying to follow Microsoft into .Net instead of taking over the native desktop development niche. A native Delphi 8 with some VB6 import tools would have done wonders for them.


Don't agree.

Their biggest mistakes were focusing on enterprise life-cycle tools, selling the developer tools unit, creating a half-hearted port of Delphi to GNU/Linux based on WINE, and letting all the key developers leave the company.

All these together created the image that most of us should leave Delphi and C++ Builder while it was still affordable to do so.


They have made so many mistakes over the years; Embarcadero, with their predatory pricing, has finally chased me away.


Projects I know I'll need to heavily modify or work on for the foreseeable future, I bring forward into Lazarus. Lazarus has quite a bit of activity in its community.


Lazarus has made huge improvements in the last 5 years; just the macOS version needs more love.

I keep spending tons of time fighting problems, even with mainstream libraries like sqlite.

Still the best tool for multiplatform desktop apps development.


I spent time recently in the webcasts for the new version of Delphi, and from what I saw they're doing some good work. The mobile development for example is enticing.

But, and I asked during the webcasts, who's going to buy into a new project now?

For myself I have a couple of old D7 apps for clients that have been running for a decade now.


Lisp, and specifically Common Lisp, although the latter isn't that old.

It first appeared in 1984 and the ANSI specification was finished in 1994, but it traces its lineage directly back to Lisp 1.5 from 1962. It can trivially run Lisp sources decades older than itself.

It is a very practical, multi-paradigm language with high quality, high performance implementations on many platforms.

Its main issue is that it is not very popular, but that might also be a blessing.


I'll add elisp there.


C (I hate C++)

assembler (various) (I've even learned x86_64 well after being a RISC aficionado, it's actually kinda neat)

sed

bash

LaTeX

I used to use YACC/LEX, but that's been replaced with ANTLR for a while now.

What language am I starting to use? Rust and R.

I wrote something in PERL once but when I woke up the next day I couldn't really read it. So that was that.


Perl makes it easy to write line noise, and only possible to write legible code. That possibility also goes mostly out the window as soon as you use the real strength of the language (regex).


Regexes support whitespace and comments, and can easily be composed. But I agree that an effort needs to be made to write legible Perl code. "Perl Best Practices" is required reading for any professional Perl programmer.


Aye, you know you're starting to reach expert level once you understand the parts of that book which should be ignored.

for OP: Perl is a rich and deep language. It's as or more comptetent as exactly the same things as ruby and python (although pythons maths, and keeping management happy libraries are better, perl's async and systems wrangling )support is better. It is more permissive than you would want in some circumstances, but in my experiencing running a competent perl team with supportive management is a pleasure.


Lisp. For all purposes. It conquered my brain in 1988 and ever since then it's the first thing I turn to whenever it's an option. My favorite Lisp ever was Ralph. My favorite nowadays is Common Lisp. Generally speaking, I like any Lisp better than anything else, but I prefer those that preserve the whole interactive programming-as-teaching culture in all its glory.

Python for when someone needs me to work on Django, and also for helping my younger daughter learn programming, because it's a pretty accessible place to start.

C when I need to be that close to the machine and there's some reason not to use Lisp.


I do scientific computing in academia.

Secretly, it's just fortran all the way down, and LaTeX for document preparation.

Piled on top of that is a fair amount of matlab, simulink, and python.


You know, modern Fortran isn't awful. Fortran 77 code, however, is frighteningly ugly to look at.


IMO, it's more than not awful, it's great! I wish I had picked it up sooner as a mid-level language for mathematical and scientific computing (over C++). Additionally, you can connect Fortran to Python dead simple with `f2py`.


It's too bad some people don't know about it and just copy/paste blindly.

It took me about a week to find out that the author of a model I worked on didn't know about `from copy import copy`, which explains a great deal about why their ODE solver wasn't working.


IMO the ongoing trend of moving everything academic to C++ is just awful. I still don't get why everyone thinks a language-to-rule-them-all is what should be used by default, especially in the age of LLVM. IMO Alan Kay and his research group have always had exactly the right ideas in this regard. I think iPython/Jupyter + Numpy + some kind of JITed Fortran 2008 environment with accelerator support would be a nearly ideal scientific computing environment. Basically a free, open source and blazingly fast (HPC-ready) version of Matlab.


This is how my field is going (biomedical/neurosci). Of course, since numpy is effectively a nice shell on top of LAPACK, my original assertion still holds.


Ocaml (which will turn 21 this year).

It really hits a sweet spot for me:

- fast executables

- fast compile times

- very expressive

- good standard library in 'Core'

- a REPL so you can easily play around with code you have just written

- a type system that's really useful for catching bugs

- imperative programming is relatively easy

I find myself becoming more and more productive. I will have to see if writing a little server for a JSON web API sitting behind nginx is feasible - currently using ruby for that.


MUMPS. Despite its strong resemblance to compiler IR and lack of a type system, it is the standard DB in Health IT in America.


Oh my, just had memories of a much younger me reading these in horror back in the day:

http://thedailywtf.com/articles/A_Case_of_the_MUMPS

http://thedailywtf.com/articles/MUMPS-Madness


MUMPS is standard in places which deal with Intersystems, via Ensemble (integration system) and their Cache database. An intriguing stack of technologies: an old and 100% backward compatible programming language, an object-oriented (non-relational) database with direct access from the high-level language (M; they do not like it when you call it MUMPS), and quite a number of very good software engineers having worked on this system for 30+ years.


I'm going to take a wild guess - EPIC?


Heh, of course the Madisonian would figure it out :)


First started programming in MIIS (relative to MUMPS) in late 1981. And have continued to use MUMPS/M/MSM/DSM/Cache for most of the intervening years with 4 years off for Fortran and a year or two for COBOL.

Most of the work has been with medical information systems, although the first job entailed a library information system. One of the few languages I knew of which could make a PC-AT into a multi-user machine. It also enabled a clinic to move from a PDP to a 386 and to no longer have to relegate financial batch jobs to off hours.

Will probably also continue to use AWK this year.


Oddly enough I just got a recruiter spam message for a MUMPS developer position. I had never heard of it, but I thought to myself "why would I want to develop mumps???"


MUMPS gets a bad rep; it's kind of ugly, but it has some awesome ideas built into it. I'm not convinced EPIC's success is despite the fact it uses MUMPS; I think it's a contributing factor.


OMG the memories! I did work in MUMPS while I was in college in the early 90s. I had completely forgotten about this. Thanks for that :)


Perl 5, I have been using it in Financial Services for the past 12 years. It is still a workhorse when it comes to reporting. Many large banks still use it, but most don't advertise it. I used it at places like Intel back in 1999. It really is a versatile language.


I switched to Python a few years back mainly for Django, but Perl still does a number of things better (regexes, interacting with the filesystem for example).
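To make the regex comparison concrete (a sketch only, with a made-up substitution): Perl's `s///g` is part of the language's syntax and operates on `$_` or a bound variable directly, while Python routes the same edit through the `re` module:

```python
import re

# Perl:   $text =~ s/(\d+)/<$1>/g;
# Python: the same global substitution needs an explicit module call.
text = "order 12, item 7"
result = re.sub(r"(\d+)", r"<\1>", text)
# result == "order <12>, item <7>"
```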


My understanding is that django/sqlalchemy and catalyst/dbix::class are equivalent. The former being better at keeping managers happy, the latter at keeping the right kind of programmers happy ( due to the superior flexibility).


I did try Catalyst but being Perl and "more than one way to do it" caused me a lot of confusion (you could replace the template engine, the ORM and other parts to the point where I didn't know what the framework actually was).

Django was a bit more beginner friendly for someone new to a framework, and the documentation is really good. And the Django admin is one of the best ways to get something up and running quickly, there wasn't a Perl equivalent when I was trying Catalyst.


Mojolicious is a really nice alternative to Catalyst.


Delphi

I'm in the process of "inheriting" a company whose sole product is a multi-million-LOC behemoth written in Delphi. The product itself is a highly customizable/parametrizable control software for business processes/alarm management, built around a rule engine for rules drawn in a graphical editor.


I inherited a Delphi program written in-house for our accounting department. Thankfully it's only about 10,000 lines of code, and the code is quite readable. But I still have to explain to our accountants how little I know about accounting (it has something to do with money, right?).

On the plus side, Delphi is quite easy to read. I got started with no prior knowledge in Delphi or Pascal, and I was able to roughly understand significant parts of the application within a few days.


Reminds me of Gensym G2 back when I studied expert systems.


Siemens Step5 (for maintaining an installed base of S5 PLCs - the most indestructible piece of computing hardware ever designed, IMHO.)

Also, some work in MPASM (Assembly for the Microchip PIC series of microcontrollers - same reason)

Edit: I am not sure just when S5 got introduced, but I've got installation media - on 5.25" floppies - for CP/M in my office. Made in West Germany.

That old.

Luckily, the IDE has also been ported to more contemporary operating systems.


lisp: It excels in "key algorithmic techniques such as recursion and condescension". http://james-iry.blogspot.com/2009/05/brief-incomplete-and-m...


Almost didn't read it thinking someone was going Total, LISP Weenie in there. Glad I did: it's a parody of a bunch of languages with some great laughs. Can't even pick a favorite haha. Thanks.


I will use the open-source Clipper alternative called Harbour.

I started using Clipper in 1986, and some of that 1986 code still runs today. Clipper was a compiler for the Ashton-Tate dBASE II language.

Harbour now allows running old code on Windows, Linux, Mac, Solaris, and a bunch of other less common OSes, both in text mode or using some GUI lib. It will also run on Android and iOS (using Qt).


Wow! I had no idea Clipper was still around. I was using it in 1993 with code blocks (essentially lambdas and first class functions, but not closures).


Holy cow. Used this for a summer job at an agricultural cooperative. Some special linker was a big help, and so were the trade journals (which I would have to track down for this to make sense).

Blast from the past.

Also, the floppies got wiped out on the way to the big demo, 'cause the backpack was up against the car speakers in the back seat on the way there. Maybe '91.


Yes, it is still thriving in some countries. Every month someone asks how to run old Clipper programs on some more recent OS, and porting (if you have source code, or the exes can be decompiled) can be done quite easily. If you still have Clipper code you can give Harbour a try (it is on GitHub).


Perl is one of those languages that nobody likes at my work, but that is impossible to get rid of.

Java is in the same boat, but at least it's still marketable so engineers don't mind working with it as much.


I once worked at a place that basically just used perl server side. In this decade (2010s). The codebase was just one tangled mess.

They told a story of a guy who failed the cultural interview because he told a VP "you can't write a large web app in Perl", and of course they had done just that. With the benefit of hindsight I think that, despite the obvious empirical counterexample, he was on to something. It really was an uphill battle against the language to do things at large scale.


It's not impossible, it's a cultural mismatch. Perl is obscenely permissive, which makes it a really interesting and weird language.

But when you work in a big team and make something complex, you want something that's consistent. All of that permissiveness and flexibility bites you.

Now, with good style guides and linting, you can make it work for complex software with a big team, but that's culturally pretty anti-Perl. We're trying to do that with JavaScript now, which isn't as weird as Perl, but has many of the same excesses that you want to rein in with a style guide.


In my company we primarily use Perl, but it's just on my team for our ETL pipeline. It works really well, especially since we modernized our older code and enforced really tight code standards. To use Perl end to end, well, there are modules for that (such as Dancer), but I can't comment on them because I've never had to use them.


I recommend Dancer, actually Dancer2 now. So easy to write web apps. Maintainable ones too! Most of the basics of app security are covered within the framework. And where there are gaps (like CSRF tokens), it's trivial to add primitives via DSL.


I tend to answer "can you...?" questions by pointing out that you can do just about anything in any Turing-complete language. The more useful questions are whether you should, and if not, why not. It sounds like an annoying hair-splitting technical distinction, but I've found it leads to more useful discussions.


I think developers have moved on to reinventing Perl in JavaScript. Ever since I saw "use strict"; in a .js file my eyes have not stopped rolling. Personally, I'd choose Perl over JavaScript, Python or Ruby any day of the week but I'd lose all my hipster cred.


Java is good if you don't create 10-layer enterprise applications where each component is broken into 17 Maven artifacts.

I use Perl to manage such a monster.


Perl 5, of course!

It's an awesome language, and both the language and the ecosystem are getting better and better all the time.


Seconded. We have a Perl codebase going back about 20 years (Command-line interface to our CMDB, integration with lots of external systems like DNS, generates config for mail servers, DHCP server, routers, ...), and while there are some nasty spots, overall it's pleasant to work with.


Under OpenBSD you have pledge for Perl; have a look at it.


What do you do in Perl?


The largest Perl project I work on is http://www.theregister.co.uk/ which is mainly Perl on the back-end :)


I have written a number of Perl scripts (both CGI and batch) to make up for shortcomings in our ERP system, and to hook up our ERP system to our PDM and financial accounting systems.

Also, I have written a couple of reporting scripts to watch over our Active Directory (look for computers that do not exist any more but still have their AD accounts, stuff like that).

Oh yeah, and a couple of Nagios plugins, too.

For these tasks, there are not many languages that can match Perl in versatility and productivity. It's not the prettiest language, but it's extremely useful.


I don't know about the original poster, but Perl is great for back end development, and there are wrappers for just about every GUI toolkit if you want to go that route.


I use it for command-line utilities, such as a static-site compiler, a blog compiler, and similar.

Beyond that I use it for microservices, implementing REST APIs, and also for powering whole sites. With the right framework/modules you can do almost anything.

For example I re-implemented the API my home lights use to communicate, allowing me to use cron to turn off lights via the perl interface.


Not the OP, but in my company we use it for our ETL pipeline. I've grown to really enjoy it as a language.


I've used it here and there and don't think it's nearly as bad as people say. Perl 6 is a good design, but if the implementation doesn't get sped up, I don't see it taking off.


I've been falling back in love with Forth, specifically SwiftForth (https://www.forth.com/swiftforth/). The full source code to everything comes with the paid version, and you can literally change everything. So much saner and cleaner in implementation than shitstorms like Go, GCC, LLVM, and Java.


What do you use it for? I had tried it some years ago during a vacation, and liked it. Not used it for any production thing.


"Classic" ASP. We still have some crappy legacy PDF generation code in here that no one is willing to invest the time to rewrite. Fortunately that's only an update a year or so.


I know that pain.


I get a call about once a year from somebody who I did a favour site for in 2001. Still running. Still breaking.


From a contractor POV, I love these calls. The hourly rates are high, and the work is generally easy.


*uses PHP and JS*

Those aren't old enough to count, I guess

*googles age of PHP*

Holy crap! PHP is almost 23 years old. Probably still not considered "old" though...


JS is actually 21 now. According to Wikipedia, it was developed under the codename "Mocha" and shipped with Netscape Navigator 2.0, released "May 23, 1995". [1] Personally, I'm pretty sure the version of JS when it was first released and more modern JS share very little other than logic and symbology; I think that can be said for many programming languages.

[1] - https://en.wikipedia.org/wiki/JavaScript


I have a feeling that by "old" the topic creator actually meant less relevant than it once was. Both languages are still very relevant and therefore out of the running. PHP may be waning but it still powers a huge chunk of the internet, just think about all those wordpress blogs.


some of the biggest advancements in PHP recently have come from folks who were not born when PHP was initially created...


Visual Foxpro. The application originated in 1986 or so and the original language was dBase II, but moved to FoxBase in 1987, and FoxPro/Visual FoxPro over the next couple of decades.

I do as much in Python as I can justify, but I still have to work in VFP quite a bit. I personally have code from 1988 that's still running in this application.

Sigh...


Just found this comment, which mirrors my sentiment exactly. Quite a bit of my time is spent on modern infrastructure and languages, but the rest is spent maintaining a VFP application from the early 2000s.

But, I'll give FoxPro one plus, it can integrate quite well with REST APIs. Hell, even modern SQL databases are fine.


I still get paid to program in Pick Basic (https://en.wikipedia.org/wiki/Pick_operating_system) which seems kind of unreal in 2017.


You're not alone. We still use Pick Basic via UniVerse for legacy software. It's not working out so badly, either.


What for?


Line of business application for leasing companies.


  assembly 
  apl (k4)
  forth (bootloader)  
  snobol (spitbol)
  C (small subset)
  sh (Almquist-like)
  sed (not GNU)
  nawk (do not use added functions such as strftime)
  execlineb


what in the world do you still write in snobol / spitbol? I haven't heard anyone even mention that language since 1983.


People mention it here on occasion for how it handles string processing. Experts in such things might want to dive deep in a review of it for the rest of us.


Icon was also produced by the same man, Ralph Griswold. And it has inspired a newer incarnation, Unicon. I imagine each would also be pretty good at string processing.


I occasionally pull out SNOBOL4 as a less-awful awk, mostly for poking around/formatting logs and such. Not a bad language honestly. It's aged very well.


As a hobby this year I'm beginning to experiment with mechanical computer designs, implemented in Lego Technics (yes, really). That's going to involve looking back to Turing machines, Babbage's difference engine, the Pascaline, and various other contraptions. I guess you could consider the means of encoding instructions for such machines to be programming languages, albeit very primitive ones.

Here's a couple of machines that others have built:

https://www.youtube.com/watch?v=FTSAiF9AHN4

https://www.youtube.com/watch?v=i_u3hpYMySk


BLISS. The system programming language for PDP and VAX computers from Digital Equipment Corporation. Think "high-level assembly language". Powerful, elegant, simple, totally unforgiving. The world's most powerful chainsaw, without a chain guard.


What are you using it for?


With that description, I hope street cred


Emacs Lisp. I want to port all my remaining sh, awk, and Perl stuff to it, and I'll have some time to do that in the coming weeks.

Also, m4. I use it together with awk and bmake for a couple of websites. I plan to port the awk scripts to something else though; I keep forgetting awk.


Emacs lisp is just so immensely sticky... once you cross the initial "what's up with this crappy lisp?" threshold, and have a finger caught in its web, you suddenly just want to put your hands all in. And then find more hands to add to the pot.

I expect it to stay around a good, decent while :)


It's such a nice UI paradigm married to a decent lisp (any lisp is better than no lisp :)). Once you get the actual hang of it you can't ever leave it. Just like real coffee vs. instant abominations.


Any recommendations for a good starter guide? I've done some lisp here and there, a bit of Clojure, and some other FP-style things (e.g. Elixir). I'm not scared by parens :). I'm just not sure where to start with elisp and getting into a good groove with it.

Edit: also a long-term Emacs convert, just haven't written any of the elisp myself.


I myself have discovered everything I know along the way, and have read no guides, so unfortunately I don't know any of them. Though C-h t is the unavoidable initiation to Emacs the editor.

Using tools like Org mode, Rmail, Elfeed, etc. gives you many opportunities to write some useful elisp, which is a pedagogical experience.

Emacs is like the mildly beautiful geeky girl you fall in love with: whenever she spots you inclined toward some new cute girl, instead of getting mad at you, she reveals such a wonderful piece of her interior world that you just forget the other one :)


I would suggest starting off by reading the first few chapters of the official elisp manual https://www.gnu.org/software/emacs/manual/elisp.html and experimenting as you go along.


C++98 with a few C++11 features for work (our compiler is not really C++11 compliant at all, but it does have move semantics).

Ada for fun, though I'd say Ada 2012 is a bit like C++11 in terms of “basically a new language”. People do seem to have a perception of Ada as ‘old’, though I'm not sure why (it is, by my account, more ‘modern’ than, say, Go – actually, it's not unlike Go with a nice generics system).


I did Ada 95 back in college. Was really nice for embedded programming -- the built-in concurrency primitives are astounding, and it feels like a much safer systems programming language than C/C++


Good for you -- I am not a fan of most of the latest C++ features; it feels again like a crazy research project. Although I _am_ excited about uniform initialization (finally), despite that fact that there are slight syntax and parsing confusion issues that arise from backward compatibility (as always, in C++).


I really want to write something in 6502 assembly this year. I've read tutorials and played with it, but I've never used it to build a non-trivial program. I've also been wanting to build a NES emulator in common lisp, and this would be the perfect opportunity to pair the two projects.


Have you seen the game called Human Resource Machine?

This is not actually a joke comment -- it teaches assembly language programming. Not 6502 specifically, but the fake architecture looks a lot like 6502. You work up through a set of more and more challenging assignments until you have to write a sort. I really enjoyed it and would consider it a great introduction to basically any assembly language.


I had never heard of Human Resource Machine, but that reminded me of TIS-100, which I highly recommend.


Thanks for the recommendation; I just purchased it. As another commenter said, TIS-100 is also a great game about an assembly language for a fictional computer that I'd highly recommend. I had a lot of fun playing with MIPS assembly in college, and I'd really like to do it again.

