Having had to work in a large Pascal codebase, I don't ever want to use this language again.
No metaprogramming in the language meant we had some parts of the code that were written in Pascal to generate Pascal code before compilation of the main project.
The code was so verbose, and getting the same thing done in alternative languages would easily have taken half as much code.
Refactoring always took twice as long as I thought it would. Just moving semicolons around was enough to break my concentration when I was in the flow.
I think the kicker was identifiers being case-insensitive. That alone would have been enough to drive me crazy. People complain about Nim's case insensitivity a lot, but Nim's implementation is actually good and orders of magnitude better than Pascal's.
Also, hiring a good Pascal programmer was next to impossible for us at the time.
I don't know why anyone would pick Pascal today over Nim, Zig, Rust, Julia, Go etc.
> No metaprogramming in the language meant we had some parts of the code that were written in Pascal to generate Pascal code before compilation of the main project.
Weird... if you are talking about generics, Free Pascal has them and they work just fine, plus it compiles instantly!
Here's a demo:
  program UseGenerics;

  {$mode objfpc}{$H+}

  type
    generic TFakeClass<_GT> = class
      class function gmax(a, b: _GT): _GT;
    end;

    TFakeClassInt = specialize TFakeClass<integer>;
    TFakeClassDouble = specialize TFakeClass<double>;

  class function TFakeClass.gmax(a, b: _GT): _GT;
  begin
    if a > b then
      result := a
    else
      result := b;
  end;

  begin
    { show max of two integers }
    writeln('Integer GMax: ', TFakeClassInt.gmax(23, 56));
    { show max of two doubles }
    writeln('Double GMax: ', TFakeClassDouble.gmax(23.89, 56.5));
  end.
> > No metaprogramming in the language meant we had some parts of the code that were written in Pascal to generate Pascal code before compilation of the main project.
> Weird…if you are talking about Generics
But aren’t they very clearly talking about metaprogramming, which is a completely different thing than generics?
Yes, but this example of matrix multiplication does weird stuff:
  {$MODE DELPHI}
  {$modeswitch advancedrecords}

  type
    TMatrix<T, R, C> = record
      fCoordinates: array[R, C] of Double;
    end;

  function mul<T, R, X, C>(A: TMatrix<T, R, X>; B: TMatrix<T, X, C>): TMatrix<T, R, C>;
  var
    i: R;
    j: X;
    k: C;
  begin
    Writeln('yep');
    Writeln(High(R)); // 3
    Writeln(High(X)); // 4
    Writeln(High(C)); // 3
    for i := Low(R) to High(R) do
      for k := Low(C) to High(C) do begin
        Result.fCoordinates[i, k] := 0;
        for j := Low(X) to High(X) do
          Result.fCoordinates[i, k] += A.fCoordinates[i, j] * B.fCoordinates[j, k]
      end
  end;

  type
    R1 = 1..3;
    R2 = 1..4;

  var
    A: TMatrix<Double, R1, R2>;
    B: TMatrix<Double, R1, R1>;
    C: TMatrix<Double, R2, R1>;

  begin
    mul<Double, R1, R2, R1>(A, B); // should fail because B would have to have as many rows as A has columns--and it doesn't
    mul(A, C) // does not work even with a right-sized C--even though it should
  end.
The second bit is because you lack the explicit specialization - you must always tell the compiler how to specialize a generic. Personally I prefer to use generics in objfpc mode, as you are more explicit that way. However, FPC trunk can allow implicit function specialization if you request it with {$modeswitch implicitfunctionspecialization}. Your code compiles with that.
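For what it's worth, here is a minimal sketch of what the explicit objfpc-mode spelling looks like (GMax is a made-up name; free-standing generic routines outside a class also need a fairly recent FPC, i.e. trunk/3.3.1):

```pascal
program ExplicitSpec;

{$mode objfpc}{$H+}

// A free-standing generic function; FPC trunk (3.3.1+) is needed
// for generic routines that live outside a class.
generic function GMax<T>(a, b: T): T;
begin
  if a > b then
    Result := a
  else
    Result := b;
end;

begin
  // In objfpc mode the specialization is always spelled out:
  Writeln(specialize GMax<Integer>(23, 56));
  // With {$modeswitch implicitfunctionspecialization} (trunk),
  // a plain GMax(23, 56) would compile as well.
end.
```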
The first bit is an interesting case, and I wonder why it happens. It seems the root cause is that it considers the two type specializations equivalent: even ignoring the generic functions, something like "A:=B" is allowed. I trimmed the code down to this:
  {$MODE DELPHI}

  type
    TMatrix<R, C> = record
      X: array[R, C] of Double;
    end;

  type
    R1 = 1..3;
    R2 = 1..4;

  var
    A: TMatrix<R1, R2>;
    B: TMatrix<R1, R1>;

  begin
    A := B;
  end.
This is clearly a bug and not intentional behavior, as doing the specialization by hand shows an error as expected. It also seems related to multidimensional arrays, as with a single element it shows an error too. Note that the other way around (B:=A) doesn't work, so my guess is that at some point the type compatibility comparison only checks whether assigning one array would fit memory-wise into the other.
I'll check against latest trunk later, and if it still happens I'll file a bug report.
The types themselves count as assignable since the values can be compatible, and unless the compiler can tell statically that a value is out of range, it won't complain at compile time (but will at runtime if range checking is enabled).
However, arrays over those ranges are not compatible: if you have A: array[R1] of Double and B: array[R2] of Double, you can't assign one to the other.
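A minimal sketch of both behaviors (my own example; {$R+} turns range checking on):

```pascal
program RangeDemo;

{$mode objfpc}{$R+}   // range checking enabled

type
  R1 = 1..3;
  R2 = 1..4;
var
  i: Integer;
  r: R1;
  A: array[R1] of Double;
  B: array[R2] of Double;
begin
  r := 3;        // fine: statically in range
  i := 4;
  // r := 4;     // rejected at compile time: constant is statically out of range
  // r := i;     // compiles (value not statically known), but raises a
  //             // range-check error at run time under {$R+}
  // A := B;     // rejected at compile time: the array types are not compatible
  Writeln(r);
end.
```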
Free Pascal has had support for generics since 2006, predating even Delphi's. While that isn't as long as Wirth's Pascal, it is still almost two decades, which I wouldn't call recent.
From the point of view of someone who has worked on MLOC delphi projects for seven years:
* The language has evolved. Delphi's Pascal compiler has generics and lambda expressions for both procedures and functions, and it has containers with map and filter functions.
* Design-time: You know how your code will behave without running it, as you are dragging and dropping components. It will draw data from your database and show it in dropdowns and grids. You have fine-tuned control over widget alignment and anchoring.
* Data modules: Having your ORM and data layer as composable components enforces separation of UI and DB operations.
* DevExpress: They make the best grid components by far. I use their web products as well.
For the cases where Delphi is a great choice, I'd steer as far as possible from Julia, Zig, Go, and Nim. Maybe C# comes close.
> I don't know why anyone would pick Pascal today over Nim, Zig, Rust, Julia, Go etc.
When writing desktop apps, it's probably the familiarity with well-known IDEs such as Delphi or Lazarus that makes the difference. Unfortunately, decades have passed and still nothing comes even close to them for rapid GUI development.
Should Lazarus one day support other languages such as the ones you mentioned, we'll probably hear a rumble around the world. However, things in certain fields progress much more quickly, and I wouldn't be surprised at all if before long we could use AI to analyze a drawing and then create the corresponding desktop interface description to be linked with non-GUI code.
Lazarus is a compelling reason to pick Pascal. But I share your pain and have been looking for alternatives for machine-code-compiled RAD GUIs. To add, only modern Delphi supports var declarations placed somewhere other than at the top of a function.
Actually, the PascalABC dialect[1][2] supports using variables that way too. However, I've seen arguments that allowing variable declarations anywhere tends to lead to sloppier coding and unnecessary errors, so Free Pascal/Lazarus has not followed that trend. It also appears that Delphi/Embarcadero may have gone in that direction to synchronize their C++Builder and Delphi products, to make it easier to jump between them.
If I had to do a desktop app I'd go with Java Swing, because of the Java backend libraries and developers. Were I to ignore those factors, I'd go with Gambas for the drag & drop GUI.
When was the last time I coded in Swing/JavaFX? 4 or 5 years ago, perhaps.
The 3rd party libs available for Delphi/FPC are more than enough for my needs.
I hate Python, and I've met others who feel the same.
It's a language which had a reason to exist 25 years ago, but has since been surpassed in every way by other faster, more robust, more compatible languages that don't have such quirky syntax. So a lot like Pascal actually. (To be fair to Pascal, its syntax wasn't anything as inconsistent and frustrating as Python's.)
Python has become the lowest common denominator. It's a practical "no-frills" language that can be picked quickly by a sysadmin, a web developer, a scientist doing number-crunching or a kid automating their homework.
The problem with the “no-frills” proposition is that it’s loaded with frills, complexities, and decades of tech debt. Just installing dependencies for a Python project can feel insurmountable.
IMO JavaScript should be the default for any Python use case. It has the same deceptive veneer of beginner-friendly dynamic behavior, same kinds of footguns; but at least JS has a single package manager, much faster optimized engines, and you’d have to learn it for front-end work anyway so it has long-term dividends.
Having used both Python and JS, the only real benefit the latter has is that it may be faster.
The language has way too many gotchas, and I wish they could shed all its legacy aspects. My experience with Python, along with that of many other folks, is that if we don't know something (API, syntax, etc.), we often just guess and it turns out to be right. Fairly intuitive.
Agree with the other commenter: Using pip + venv tends to solve the majority of packaging problems. In my career I deal with a Python dependency headache once every few years.
> and you’d have to learn it for front-end work anyway so it has long-term dividends.
One of the things I really like about Python is the huge standard library.
Contrary to JavaScript (which I think is a really weird recommendation), where you need to npm install half the world.
It wouldn't matter if those dependencies were stable, but JS has a culture of abandoning libraries and making backward-incompatible changes every week, so upgrading a project after just a few months is a major pain.
I've learned JavaScript and a framework or two, and I hate it with a passion.
I patiently wait for the moment when we can manipulate the DOM from within WebAssembly. Meanwhile, I shamelessly push for Blazor for front-end wherever I can.
Python is absolutely not a "no-frills" language. It is loaded with features. I'd be willing to bet that 99% of Python programmers don't know all the language features (even relatively old and frequently used ones like metaclasses) and that virtually nobody knows the whole stdlib.
> It's a language which had a reason to exist 25 years ago, but has since been surpassed in every way by other faster, more robust, more compatible languages that don't have such quirky syntax.
The same could be said about C++, and yet, here we are. Stroustrup's quip comes to mind: "There are languages people complain about, and languages no one uses". Just look at how much people complain about JS.
I don't know how old people here are, but from the early 2000s till probably the mid-2010s, Python really was a great language with few alternatives - at least when you consider the libraries available for it.
It's not an accident that people began using it heavily for numerical computing. The only viable alternative at the time was MATLAB.
I think this sentiment underestimates the amount of work people do on existing codebases.
The most reviled languages are often previously loved languages that made writing code easier at the cost of reading and understanding it.
Team 1 secretes reams of instant legacy code, but gets money and promotions for shipping it. Team 42 gets handed a steaming pile of manure that is beyond understanding. They then pick up the pitchforks, because it is obviously the language's fault.
So, yeah, it isn't the language so much as the average incentives and behaviors around development.
Hahah. I like Objective-C with ARC (in spite of its complexity and limitations), but Swift is more compact and I'm probably never going back (except maybe for hobby projects like writing something for GNUstep.)
I also like how Objective-C++ can mix in C++ code bases. It's an amazing chimera of a language.
Most professional programming is done in already existing code and new projects often have to respect existing standards. You rarely get to choose what language to use.
I was recently asked to recommend a language for a new team. My boss (who won't be writing or even reading any of the code) suggested Python. His reasoning: it's what the kids are learning in college these days.
So here I am as the team Senior, looking at current and future team members and deciding on whether to be selfish and choose what's good for me, or trying to figure out what makes something good for the team. Is it better to pick something with a large hiring pool? Is it better to pick something with fewer footguns? Is it better to pick something with higher performance? Is C# really ok if we need Mac support? Is Swift ok for one-off glue apps that run on Linux? Is Zig too young? If a team of five has a Java geek, a Rust zealot, a C# fan, and somebody who only knows Python, can we just agree to all learn and use Go? If I know my replacement will undo whatever I choose, does it really matter?
Not always, but most of the languages I have used professionally were my choices.
That means x86 assembly, C, C++, C#, Python, and half of JavaScript. I say half because I enjoyed doing simple things in JS on the front-end a long time ago, but these days I rather dislike JS frameworks; I learned them because I had to after I signed some contracts. I could have stayed 100% with backend work, but now it's too late to complain.
I managed to steer clear of Perl, Objective-C, COBOL, and other languages I don't enjoy. I managed to learn F# and some OCaml, which I would like to use but never got the chance (personal projects excluded).
Yes, I do understand that, having programmed in business environments with legacy code for over a decade.
I've hated the poor decisions my predecessors had to make, the corners that were cut.
I've hated that it's not maintained or documented.
I've hated that it doesn't have tests.
Or the seemingly lost successor, Oberon. An entire graphical OS and compiler were written, and completely documented, in a book: "Project Oberon: The Design of an Operating System and Compiler" [0]
But yes, modern stuff should not be judged by the 1970 edition. I got soured on Pascal by having to use an ISO standard compiler on an embedded system; it was painful in several areas. But I shouldn't judge modern C by pre-ANSI C, or modern C++ by cfront. So, while I still say that ISO Pascal was deeply flawed, I shouldn't hold that against all Pascal versions.
Totally agree. Various people seem to be out of touch with modern developments, misinformed, or disingenuous (really advocates for a competing language). A lot of the discussion surrounding Pascal reads as if Object Pascal (and its dialects), Delphi, Lazarus, or Oxygene (RemObjects) don't exist or never happened.
Python is the Settlers of Catan of programming languages. Most people have a different favorite game they would rather play, but enough people don't hate it that you end up playing it anyways.
I don't hate it, but it's just mediocre. Python is fine for short scripting as long as you don't need dependencies (which still remains rocket science in Python world), at that point it's better to look elsewhere.
Dependencies in Python have been sorted for years. Pip works great for install and pip-tools compile works great for locking. Please stop spreading this untruth.
It is currently true that there are modern apps of interest to me, written in Python, that I can't install because I need some special dependency manager or environment manager, and it looks like I would do serious harm to my system environment if I followed the installation instructions.
I'm not sure I understand you. If the project uses something "special" like poetry, hatch, pdm, etc., you only need to install those for development. If you simply want to use the project, they all should be installable directly with pip, as they all have pep517-compatible backends.
Even then, how can installing those tools harm your system? You can always use pipx and have each tool be installed in its own virtual environment.
My biggest problem with pip-compile is that it's really slow. Pip-compile runs can take upwards of 20 minutes, depending on how many dependencies your project has.
We use it because we don't have any alternative. I'd really love to have a faster tool though.
No, I mean the object file defines the public symbols at the start of the file. Go very deliberately stole that feature. Rob Pike said that using that idea let Go do something faster when compiling, though I don't remember the details.
There is some embellishment in that statement, but people coming from Pascal/Oberon would likely feel much more comfortable using Go, than other C family languages. This would also include Vlang, and to an extent, Odin as well. These two have both been heavily influenced by Go and Pascal.
FWIW Free Pascal's FCL has the fcl-passrc package, which provides units for scanning FPC code, building syntax trees, resolving identifier references, and writing/formatting source code.
Free Pascal comes with pas2js, which "transpiles" Free Pascal code to JavaScript and is written using fcl-passrc, so it should have enough functionality to parse most FPC code.
Neither does Go, but parsing and ASTs are in the standard library, and code generation is in the standard build process. All it requires is an (ugly) comment line in your source code. So you've got decent, standard tooling. No make magic required.
I looked into this abandoned GCC port a while ago and tried to resurrect it, especially after they added Modula-2. Didn't get far.
It of course produces far better code than Free Pascal/Lazarus. It can even beat all the other GCC front ends: as you can see in some GCC benchmarks, the Modula-2 front end produces the smallest and fastest code of all, beating even C++ and C.
Which would be nice for safer systems languages, since we only have Ada/SPARK. And Ada is lost in commercial/military hell, even with SPARK.
But we would really need a Concurrent Pascal or Oberon dialect, similar to Go.
Nowadays I would just add dialects on top of Modula-2, like the different C standards.
Seems abandoned. But Lazarus is doing well, although I doubt there are many Pascal users.
There was a time when Pascal was almost as popular as C. In school we learned Pascal after learning BASIC and before learning C. I think in the US the East Coast preferred Pascal and the West Coast preferred C.
I feel the same. I had learned C and x86 asm before they taught Pascal, and I was annoyed at being abstracted away from absolute control of the hardware even a little. There was no internet, and all the BBSes were long distance, so I had focused on diving into the low-level details of the computer, and Pascal didn't expose them as well. Like you, in retrospect I like a lot of aspects of the language.
These are exactly my thoughts. Maybe Pascal should have been given a bit more attention. While I preferred C (it seemed more "pro" and advanced) and then C++, many of my school mates preferred Pascal and then Delphi.
A good argument can be made that it comes from the machinations of Bell Labs (AT&T) and their promotion of C (and later C++). Debatably, a concerted effort was made to attack and belittle Pascal in the media, to get it out of the way. Many people seem to refer to Pascal with a very odd hostility or unknowingly regurgitate talking points originating from Bell Labs, and as if the language froze in time and stayed the same as it was in the 1970s.
Well, a language that's friendly to newbies will also be used by newbies (also leading to some projects or code that are maybe not that good), so more advanced developers (or those who consider themselves more advanced) will have a tendency to turn their nose up. Something similar is currently going on with Rust vs. Go I suspect...
Just that Go has 10x the developers and projects that Rust has. And most Go projects are commercially supported, while Rust projects are mostly done for free, in the vein of "let's rewrite X in Rust".
Follow the money and you'll find the developers; and where the developers are, there lies the success of a programming language.
If it ever was a Rust vs Go contest, Go has already won, and by a big margin.
I can't explain it very well, but I feel like a lot of weird people enjoy Rust and Scala for all the wrong reasons. It's like they enjoy the pain and want that pain to be inflicted on others, so they push for those languages.
I agree with your premise, not your conclusion. I have resigned myself to Rust eating the software world. It's not backed by anything, yet it creeps in everywhere.
You can stop it, but it's hard work, and if you ever have so much as a hairline crack in the porcelain, Rust will get in there, eventually.
To me, a lot of it was that it was so wordy and had weird syntax. The semicolon as a separator, instead of marking the end of a line or statement, felt just insane to someone who had just (not really, even) understood how an assembler worked.
I think if it had more C-like syntax and was sold as "C, but with hardly any footguns", I would have liked it much better. I don't even know what was Pascal and what was Borland stuff, but I remember being reluctantly very impressed with its "module" system, or whatever it had instead of .obj and .h files.
Pascal was the more restrictive and more verbose language, whereas C had more expressivity and terseness. Most of what could be written in C, could be written in Pascal, minus the hacky stuff.
Yeah, now it only needs a VSCode plugin that does auto-formatting (no need to explicitly type keywords in uppercase), and maybe some package management of sorts.
I have very fond memories of Modula-2. I hacked a lot of cool stuff with it (also with inline assembler) as a teenager. When I eventually switched to C and C++ because I was curious about learning, the overall experience was much worse due to the build system and much slower compile times. We were talking orders of magnitude here.
Pascal pointers are different animals. Since it is a strongly-typed language, juggling between ints and pointers or between pointers of different types is not allowed. Pointer arithmetic on a byte basis is also not straightforward.
That's... not true? You have `Pointer' for generic pointers, and you can use `^' to declare a pointer to any type. You can then use Inc() or plain address arithmetic. And you can use `^' again on the right-hand side to dereference it.
And you can certainly juggle between different types. You can cast things in Pascal.
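A small sketch for the record (assuming FPC; PByte is the standard typed pointer-to-Byte):

```pascal
program PointerDemo;

{$mode objfpc}

var
  buf: array[0..3] of Byte;
  p: PByte;     // typed pointer: Inc() moves in SizeOf(Byte) steps
  gp: Pointer;  // untyped, generic pointer
begin
  buf[0] := 10;
  buf[1] := 20;
  p := @buf[0];
  Inc(p);               // advance one element
  Writeln(p^);          // dereference: prints 20
  gp := @buf[0];
  Writeln(PByte(gp)^);  // cast the generic pointer to a typed one: prints 10
end.
```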
Pascal was a lot like BASIC in that every version added their own (often incompatible) stuff because the official standard language was so limited. Certainly you couldn't drop down to assembly in standard Pascal -- that sounds like something Turbo Pascal added and may be present in modern versions because of the popularity of that.
It’s less to do with limitations of the language and more to do with variations in compilers (each vendor wanting to add their own flair to sell their compiler) and a lack of standards.
You saw the same with C and C++ before they were standardised.
I do like C but there’s no denying that it makes it ridiculously easy to fuck up. Whereas Pascal enforces correctness. Some view that as annoying verbosity but you can’t please all the people all the time.
Dropping into assembly was the killer feature for me. It let you migrate gradually from a very easy-to-understand language to more performant code, without having to understand what a linker does.
I was blown away when a colleague of mine wrote a little Pascal program that rebooted the computer in the lab. He wrote something to some memory address or port - and the computer was running DOS.
My learning history is similar to yours Basic->Pascal->C. I feel it was a lot easier learning C by already knowing Pascal than it would have been coming directly from Basic.
Yes, the UCSD p-System virtual machine -- basically the same idea as the JVM and .NET, but decades earlier: compile to a virtual machine and deploy anywhere you had the VM. It was the basis of the Apple Pascal system, which was practically an operating system unto itself (on the Apple II, it was most famously used for the early dungeon crawler Wizardry).
From GNU, I wish projects like GNUstep + applications were much more easily built, instead of requiring a chain of compilation steps. Also, GNOME was promoted by GNU back in the day, and now it's corporateware too tied to profit/XML/complex settings, so they make lots of money on support while hindering the libre projects that depend on GTK+.
Looks interesting, thanks for the hint; does it also support the Raspi Pico? Seems like there is an accumulation of Pascal offers for embedded in Australia; here is another one: https://www.astrobe.com
>Note that the Raspberry Pi Pico is not supported by Ultibo core as it is based on a microcontroller instead of a microprocessor and is not able to support the features required by Ultibo core.