Hacker News
Language Pragmatics Engineering (borretti.me)
106 points by ashpil on June 1, 2023 | hide | past | favorite | 26 comments



I loved this paragraph:

   You observe this with types. Dynamic types feel faster, at the REPL, when you’re coding at 1Hz, because you’re not factoring in the (unseen) cost of future bugs, the cost of refactors you won’t do because you don’t have the confidence to refactor which static types give you, the cost of legacy software that can’t be replaced because it can’t be understood by anyone, the cost of a Python server doing four requests per second while you pay five figures to AWS every month.


It's funny, in a few dynamically typed projects I've noticed people start treating some code as append-only. The fear of getting bitten means reinventing the wheel is a more palatable prospect than diving into the inexplicable horrors that are around the corner.

Of course, that 'a g i l i t y' of dynamic typing is impossible to give up. The devs must churn out code ASAP, maintenance be damned -- mostly because they probably won't be maintaining it.


There are so many causes:

- Average tenure in a dev job being so short means it is detrimental to optimize for the long term.

- Promotion policies in most companies are heavily balanced towards "shipping more" rather than towards "shipping better". Even in companies without formal policies around this, "more" is way more visible than "better". Even more so if you get to be the hero firefighter who rescues the site from downtime, nevermind that it was your code that set the place on fire in the first place.

- Software as an industry has very low capital requirements, which makes it easy to get into. However, that also means competitors can spring up very quickly and many companies respond by constantly churning out features to stay ahead of the competition.

- It is IMO way easier to teach yourself programming with a dynamic language rather than with a statically typed one. Interpreters are certainly not exclusive to dynamically typed languages, but they are definitely more prevalent. With so many devs being autodidacts, it's not a surprise that many of them default to dynamic languages.


I call this the Peter Principle for code: http://h2.jaguarpaw.co.uk/posts/peter-principle/


Except this is bs because modern jit outperforms C and C++ compiled code in some cases. Dynamic types mean the constraints are known later. It doesn’t mean they’re never known. And when they’re known you can partially reduce.


> Dynamic types mean the constraints are known later.

Don't you mean, constraints are found when the program throws an unexpected exception and leaves you with a largely meaningless stack dump :)

I'd rather have the system tell me "you can't do that" much earlier - at compile time, not run time.

I use python quite a bit, but hate the thought of all those never-exercised control paths, which only occur in the most unusual circumstances - and I know silly problems will show up when those paths do get executed :(
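To make that concrete, here's a toy Python sketch (the names are invented for illustration): the bug lives in a rarely-taken branch, so the common path works fine for years, while a static checker like mypy would have flagged it immediately:

```python
def parse_port(raw):
    """Parse a port number, falling back to a default on bad input."""
    try:
        return int(raw)
    except ValueError:
        # Rarely-exercised path: returns a string where callers expect an int.
        return "8080"  # bug: should be the integer 8080

def connect(raw_port):
    port = parse_port(raw_port)
    return port + 1  # fine on the common path

print(connect("80"))  # common path: prints 81

try:
    connect("not-a-number")  # the rare path finally runs...
except TypeError as e:
    print("boom:", e)  # ...and blows up at runtime
```

Nothing complains until the bad input actually arrives in production.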


We're discussing performance. Dynamic types do not mean no type errors while you write code. They mean types can vary at runtime.


You latched on to the performance question, but the paragraph you were responding to was much broader. It's reasonable for someone to think you were actually responding to the full paragraph and not one phrase.


> modern jit outperforms C and C++ compiled code in some cases

I'd love to see some examples of this. Links?


Even granting you're correct on performance, that's one point out of four that you've addressed. It's a bit of a stretch to call BS until you've addressed the bugs, refactoring, and legacy code problems.


> Except this is bs because modern jit outperforms C and C++ compiled code in some cases

In "some cases" doesn't mean the statement is bs. Generally jit is slower than compiled code. There are exceptions to every rule, we use generalizations to reason about the world anyway.


I really hope just using static types alone doesn't give people confidence to refactor. It of course depends what "refactor" really means. Are we talking about a function, a class, a module/library, an entire application? Maybe in the function case static types give more confidence, but in the others the way to de-risk is to have good tests. Static types only help catch one class of errors, whereas refactoring can introduce several error classes depending on the specific case.

In other words, if static types alone are giving confidence to refactor, it is a false confidence except in the most trivial of cases.
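To illustrate with a contrived Python sketch (the example and names are mine): a refactor can keep the exact same type signature while silently changing the semantics, so the type checker stays happy and only a behavioral test catches the regression:

```python
def total_price(prices: list[float], discount: float) -> float:
    """Original: discount is a fraction taken off the total."""
    return sum(prices) * (1 - discount)

def total_price_refactored(prices: list[float], discount: float) -> float:
    """Refactor bug: discount is now subtracted per item.
    Identical type signature, so it still type-checks."""
    return sum(p - discount for p in prices)

assert total_price([10.0, 20.0], 0.5) == 15.0
assert total_price_refactored([10.0, 20.0], 0.5) == 29.0  # same types, wrong math
```

The types constrain the shape of the data, not the meaning of the computation.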


This is a sliding scale, though; types in C and types in Haskell aren't the same. Most statically typed languages sit somewhere between the two extremes, and few veer towards the Haskell end.

As a consequence, a lot of people's primary experience with type systems is limited to "this should return a list of strings", which, like you said, is one small class of problems (though I'd argue it's a huge win compared to not having that guarantee).

Every project needs tests, but like all things, what happens in practice is seldom ideal and thus the more you can offset to (reliable) automated tools the better.


To add to this: A bit similar to Lisp, due to the different compiler plugins, Haskell is more like a family of dialects. One can be pretty vanilla about what the types should protect against, but one can also build arcane type-magic buildings that rely on the newest PL research.

Simple, vanilla Haskell is approximated more and more by the mainstream languages: Optional types, andThen().orElse() and friends and other things - which I am really happy about!

Why not use a more mainstream language then? For me, there are at least 3 hard reasons:

1) The stuff mentioned above is old, battle-tested and deeply embedded into the language and the community - it feels ergonomic to use.

2) IO-being-a-library is a paradigm that produces programs that I love to maintain, even when others wrote them.

3) Haskell has nice interfaces that I miss in mainstream languages, Functor and Monad for example. My prediction is that in around 5 years, mainstream languages will start to offer these kinds of interfaces - starting from "Wait, what else can we do with andThen() and optional chaining and so on?"
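For instance, here's a toy Python sketch of that andThen()/orElse() style (the Maybe class and method names are made up, loosely mimicking Haskell's Maybe and the Optional-style chaining mentioned above):

```python
class Maybe:
    """Toy version of Haskell's Maybe: a value that may be absent."""
    def __init__(self, value, present=True):
        self.value = value
        self.present = present

    def and_then(self, f):
        # Monadic bind: run f only if a value is actually present.
        return f(self.value) if self.present else self

    def or_else(self, default):
        # Collapse back to a plain value, with a fallback for the empty case.
        return self.value if self.present else default

def safe_div(x, y):
    return Maybe(x // y) if y != 0 else Maybe(None, present=False)

# Chain computations; the first failure short-circuits the rest.
print(safe_div(100, 5).and_then(lambda r: safe_div(r, 2)).or_else(0))  # 10
print(safe_div(100, 0).and_then(lambda r: safe_div(r, 2)).or_else(0))  # 0
```

In Haskell this is just `>>=` and `fromMaybe`, and it works uniformly for Maybe, lists, parsers, IO, and so on; that uniformity is the part mainstream languages haven't picked up yet.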


Hardly pragmatic; this page seems to be little more than rah-rah Haskell evangelism (and I like Haskell!).

Making it hard to YOLO your I/O does not seem to be paying off very well for Haskell; the cost in adoption often outweighs the gain in safety. Yes, Django codebases occasionally have bugs due to I/O happening at the wrong time, but that's pretty far down the list of causes of bugs in Django codebases.

Likewise the argument against macros seems to be driven by nothing more than personal taste. (And again, I dislike macros! But you have to actually make the argument why they're bad. Certainly my experience is that checking generated code into source control is worse, not better)


I didn't RTFA, but I wanted to note that in Linguistics, at least, "pragmatics" isn't really about pragmatism.


It isn’t, but the confusion is the author’s, not the commenter’s. Although the author situates the article as discussing the programming languages equivalent of pragmatics in linguistics (explicitly coupling it with syntax and semantics), the author actually focuses on practical, everyday concerns, like how programmers will steer away from using things a language makes difficult to use, or the inefficiencies resulting from bad tooling, or tedium-avoidance by programmers.


Fair enough.


About macros: the author did argue why they're bad. It was actually the final point leading to the conclusion:

    salience bias: measuring what is seen and not what is unseen.
And he did not say all macros are bad, giving the example of Common Lisp macros being good. Why? Because you can easily expand them (the language itself gives you the capability, not just an IDE... but of course with SLIME it's one shortcut away) and see the actual code you will get running.


> And he did not say all macros are bad, giving the example of Common Lisp macros being good. Why? Because you can easily expand them (the language itself gives you the capability, not just an IDE... but of course with SLIME it's one shortcut away) and see the actual code you will get running.

That seems pretty unconvincing. If it's important for macros, why isn't it important for regular code? And yet taking a Haskell function and dropping down to the corresponding Core is no easier than expanding a Rust or OCaml macro.


The salience bias can be seen the opposite way: your code base consists of understandable metaprogramming based on principled tools, and the corresponding write-only generated code is unseen.

The OP might have been bitten by bad macros and bad code generation (early and repeatedly).


Generally in agreement with the article. This bit was confusing to me, though.

> Moving with the arrow keys, erasing text by holding backspace for an eternity feels slow, but it is predictable and reliable. Vim or Emacs-style editing, where you fly through the buffer with keybindings, feels faster, but a single wrong keypress puts your editor in some unpredictable state it can be hard to recover from.

I've been using neovim for about a year now. I don't think I've ever entered a state from a mistyped key that is hard to recover from yet. The worst case scenarios tend to result in something taking comparable time to if a mouse was used.


I agree with everything, apart from metaprogramming. I'd say the issues with metaprogramming are issues with tooling (that you cannot easily inspect / debug the generated code).


This is why I love Go. Go feels like a language that is boring on purpose - few magic shortcuts, and few ways of misusing some special language feature. Just a lot of boring, regular code which is easy to refactor and maintain at scale (IMO).


Various thoughts:

It seems to me that the fundamental definition of a function/method should always be a compile-time-executed function that generates a syntax tree (assuming the compiler dog-foods itself, or can at least parse and interpret itself). Much like Python's metaclasses or Rust's procedural macros, but assumed by default.
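A rough Python sketch of that idea, using the stdlib ast module (the generated function here is just an illustration): the function is built as a syntax tree, inspected as ordinary data, and only then compiled and executed - roughly what a procedural macro does.

```python
import ast

# Build the syntax tree for "def double(x): return x * 2" at
# "compile time", the way a procedural macro or metaclass hook might.
source = "def double(x):\n    return x * 2\n"
tree = ast.parse(source)

# The tree is ordinary data you can walk and inspect before running it.
assert isinstance(tree.body[0], ast.FunctionDef)
assert tree.body[0].name == "double"

# Only now compile and execute the generated code.
namespace = {}
exec(compile(tree, "<generated>", "exec"), namespace)
print(namespace["double"](21))  # 42
```

The point being: if every function definition were conceptually this, metaprogramming stops being a bolted-on special case.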

It's kind of unfortunate that a language with managed effects & capabilities hasn't gone mainstream. Maybe it doesn't have the right ergonomics yet.


I am pretty sure you can create a mess with Haskell too.



