Ah, those were the days! TP v4, v5, v6... I'd graduated to C by the time 7 came out.
> That almost nobody has the skills these days to implement something like TP is something we could eventually come to regret – or, rather, our grandchildren could. Software development is becoming a semi-skilled industry, and I doubt that the rise of AI will entirely compensate for this.
If anything, the rise (read as: shoving-down-our-throats) of AI/machine learning will hurt the collective ability to code, rather than compensate for it in the least. So many modern developers take pride in using Copilot and related AI/machine-learning tools to assist in coding, all the while ignoring that such usage actively keeps them from improving their own coding skills, in the same way that copying a classmate's test answers might let the copier pass the test without actually teaching them anything.
When that day comes that AI can pump out a complete, working, bug-free application based on natural-language descriptions, we will have lost the ability to code entirely, as any code such a thing generates will never be maintainable by a mere human being.
_Sigh_.
Well, it'll be fun while it lasts. The Machines can pry my makefiles and C compiler from my cold dead fingers.

PS: _GET OFF MY LAWN!_
> "Ah, those were the days! TP v4, v5, v6... i'd graduated to C by the time 7 came out."
Me too, and it's painful to realize how far ahead of its time the IDE was. I'm talking about the TUI version, and yet:
- It already supported a dual monitor setup - one monitor for the IDE, one for the app - live debugging was a charm.
- The debugger had everything current debuggers have, but it was fully IDE integrated - no messin' with GDB or JDWP.
- The context-sensitive help was the best I've ever encountered. I think a large part of it was that it really was context sensitive: it tried to solve the problem at hand, regardless of whether the cause was the IDE or the language.
I've never seen C as an improvement. Macros encourage all-too-clever code that's impossible to parse at first glance. The string handling is horrible. And there are multiple, contradictory ways to express pointers.
The only thing holding Free Pascal/Lazarus back is the documentation. There's no way to work on a small part of it, Wiki style, so it's always going to be an albatross. The auto-generated descriptions of variables used in library function calls just don't work as proper documentation.
On the plus side, it's fast and has all of the linking built in. You can stuff gigabytes of binary in strings and never have to allocate anything. It does object oriented stuff, generics, is strongly typed. It can spit out code for almost any platform, including WASM, Android, etc.
> the rise (read as: shoving-down-our-throats) of AI/machine learning will hurt the collective ability to code
You sound like Socrates lamenting that the invention of writing is hurting people's collective ability to think, since their brains no longer need to hold many things simultaneously.
That's the theory, but in practice, if you are a capable developer you'll spend a lot of time fixing the AI-generated code. The other practical problem is that a lot of novice developers will try to use AI to code, then bother others to fix it when it doesn't work. They will rely on AI and not bother to increase their own skills.
I may be a luddite about it but I'm also not entirely wrong. That said, I'm sure the same was said about code generators and other features introduced by IDEs, like syntax highlighting.
> all the while ignoring that such usage actively keeps them from improving their own coding skills
That's an opinion, not a fact backed by any data.
In fact it feels like "you hurt your hacking skills by using an IDE rather than a simple text editor, or higher level language than assembly, or managing your memory manually rather than using GCs".
Also, maybe coding skills will become more and more irrelevant for most developers the same way knowing how memory or CPUs work is largely irrelevant for all but a tiny part of the industry.
In fact, if AI assistants keep getting better, the best skills to have will be the ability to direct those assistants and oversee their output while meeting business and product needs.
> That's an opinion not a fact backed by any data.
It's a well-demonstrated fact that, for the overwhelmingly large majority of people (mainly those without eidetic memory), copying someone else's answers, rather than coming up with them by applying one's own problem-solving skills, hurts one's ability to advance their own related knowledge/skills. There's a _reason_ we're taught not to cheat in school.
> the same way knowing how memory or CPUs work is largely irrelevant for all but a tiny part of the industry.
This is not true. A large part of the industry just _thinks_ these things aren't relevant to them and that's why we have such terrible software everywhere.
In my industry experience, software is slow because:
- time to market is overwhelmingly more important than performance, so the pressure is to have something that works, not something that works smoothly. You can fix performance and bugs later (Notion is an easy example of such an evolution).
- the industry remains largely understaffed, so finding resources to allocate to performance is hard
- most of the industry is managed by people who are good at neither product nor tech, but think they understand both
The point is that if you understand how the high-level code you write gets translated to what runs on the actual CPU and memory, then you write more performant high-level code by default the first time round. So you can get to market just as quickly. That doesn't mean there aren't more speed or memory optimizations you can make later, but you raise the bar generally.
Knowing how the underlying system (CPU, RAM, network, database) works is the difference between doing a SELECT * FROM some table across the network, via an ORM or a raw SQL query, and filtering the rows locally in program code, versus using a WHERE clause in the SQL. That's a pattern I've fixed at multiple startups.
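To make that concrete, here's a minimal sketch of the two approaches. It uses SQLite so it's self-contained, the table and column names are made up, and error checking is elided; over a real network connection the gap is far worse, because every unfiltered row has to cross the wire before the application throws most of them away.

```c
/* Sketch: app-side filtering vs. letting the database filter. */
#include <stdio.h>
#include <string.h>
#include <sqlite3.h>

int main(void)
{
    sqlite3 *db;
    sqlite3_stmt *st;
    sqlite3_open(":memory:", &db);
    sqlite3_exec(db,
        "CREATE TABLE orders(id INTEGER, country TEXT);"
        "INSERT INTO orders VALUES (1,'DK'),(2,'US'),(3,'DK');",
        NULL, NULL, NULL);

    /* Anti-pattern: SELECT * and filter in application code.
       Every row is fetched and decoded, then mostly discarded. */
    sqlite3_prepare_v2(db, "SELECT id, country FROM orders", -1, &st, NULL);
    while (sqlite3_step(st) == SQLITE_ROW)
        if (strcmp((const char *)sqlite3_column_text(st, 1), "DK") == 0)
            printf("app-side filter: %d\n", sqlite3_column_int(st, 0));
    sqlite3_finalize(st);

    /* Better: let the database do the filtering (and use its indexes). */
    sqlite3_prepare_v2(db,
        "SELECT id FROM orders WHERE country = ?1", -1, &st, NULL);
    sqlite3_bind_text(st, 1, "DK", -1, SQLITE_STATIC);
    while (sqlite3_step(st) == SQLITE_ROW)
        printf("db-side filter:  %d\n", sqlite3_column_int(st, 0));
    sqlite3_finalize(st);
    sqlite3_close(db);
    return 0;
}
```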
I'm not ignoring your request, but I need to think about it a little as my day to day work isn't just pasting together APIs. If the only thing an engineer is allowed to do is paste together a fixed set of APIs then they may well be completely limited.
You mentioning Delphi reminds me of the Miniatur Wunderland in Hamburg, a large model railroad with a day/night cycle, cars driving on the roads and planes "landing" and "taking off", a lot of it controlled with Delphi: https://www.embarcadero.com/case-study/miniatur-wunderland-c...
I don't even want to imagine how that codebase looks. The Delphi code I've come into contact with (I'm not a Delphi dev myself, but I worked close to them) tended to be quite bad by modern standards. The reason was generally similar: not very experienced devs in the 90s needed to get something with a UI off the ground.
It's true. I think the brain drain has had a severe impact on the evolution of the language as it exists in Delphi. However, some implementations, such as Oxygene, appear to evolve rapidly.
Turbo Pascal was by far my favorite programming language in the 1980s. I switched to C reluctantly in college because it was portable to the 32-bit Sun workstations we had, then I was a Linux zealot for a decade or so, so I never jumped on the Delphi (the sequel to TP) train.
In the early 1980s there was a push for “anything but BASIC” for teaching languages. Before TP, in my mind Pascal had the stink of failure, caught between the watching-protons-decay slowness of UCSD Pascal and the inadequacy of ISO Pascal to do anything but solve leetcode problems.
C, on the other hand, accomplished what PL/I had set out to do, in a “worse is better” way. TP added enough of what ISO Pascal was lacking to be fine for systems and applications work like C, but it was saner: it didn’t have the null-terminated strings which made strcpy() + a later return Turing-complete in C.
TP felt just a little more disciplined than C, like a pop version of Ada.
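On the string point above, here's a rough sketch of the difference (my own illustration, not anyone's production code): Turbo Pascal's strings carried their length in a leading byte and assignments silently truncated, whereas C's strcpy() happily writes past the end of whatever buffer you hand it.

```c
/* Contrasting C null-terminated strings with a Turbo Pascal-style
   length-prefixed "ShortString". The struct and helper are mine. */
#include <stdio.h>
#include <string.h>

/* Turbo Pascal's string[255]: one length byte followed by the data. */
typedef struct {
    unsigned char len;
    char data[255];
} ShortString;

static void ss_assign(ShortString *dst, const char *src)
{
    size_t n = strlen(src);
    if (n > 255) n = 255;          /* silently truncates, like TP did */
    dst->len = (unsigned char)n;
    memcpy(dst->data, src, n);
}

int main(void)
{
    char buf[8];
    /* strcpy() trusts that the destination is big enough; one byte too
       many and it overwrites whatever lives after buf on the stack. */
    strcpy(buf, "short");          /* fine */
    /* strcpy(buf, "definitely too long");   <- the classic stack smash */

    ShortString s;
    ss_assign(&s, "definitely too long but merely truncated here");
    printf("%.*s (%u chars)\n", s.len, s.data, (unsigned)s.len);
    return 0;
}
```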
I really do miss the days I was locked in an office with no internet connection with a free coffee machine and could be productive. Things were small, understandable and had documentation on paper.
Yes, the best time to be a programmer. I remember spending several hours at a stretch doing productive programming. Nowadays you cannot spend a few minutes without having to search for some information that is not available locally.
Is it possible that the information you need is available locally via interactive help functionality, but the attention harvesting complex that is the modern web has tricked you into habitually reaching for a browser when calling the help function or reading the code would do?
The overwhelming majority of the time it's not available offline. It's even got me like "Welcome to the hidden offline documentation. Stand by while we download it... <error downloading from server>".
Every time I code I feel like I need to complete side quests. You find one problem, and the solution online requires solving another two problems, and so on. Add to that the fact that our brains can't help getting distracted, and by the time you finish the side quests you need time to remember what brought you there in the first place.
What? You are much more likely today to have shorter interruptions to productivity, because it's so much quicker to search Google or whatever for answers than to get up from your desk and ask the office greybeard, consult paper books, or, worst case, have to actually fiddle around until you just figure out some possibly undocumented behavior in some library you're trying to use.
If you were actually a professional programmer in pre-internet days (as I was) I don't think you would have any nostalgia for it.
I spent a fair bit of my career in that sort of environment, writing embedded software on MSDOS. I have fond memories of that, but I wonder: if I re-created it today, would I be able to cope mentally? It might be the technology equivalent of becoming a monk in remote Tibet, with the same needed commitment and rewards.
A younger self used to look forward to the latest (paper) shareware catalogue arriving by (snail) mail, posting back the order form and waiting for the floppies to arrive. True story :-)
I was indeed around then. I built a fair few non-trivial things on MSDOS and Windows with MASM 6, PDS7, VB3/4 and MSVC++. All offline. Mostly on airgapped networks. That, and embedded stuff with various toolchains.
We were definitely more productive. An order of magnitude more, as long as you picked appropriate tools. For example, I wrote a whole statistical analysis package in a month (VB4), and an ERP package in six months (Access).
Code quality, which is difficult to objectively measure, was not really a problem.
Today I can barely even find a consistent UI toolkit which doesn’t fall to pieces, doesn’t require a server to run and doesn’t pull in 200 meg of untrusted JavaScript.
You still have to figure out things for yourself today. Novel things instead of tedious things like "why doesn't this function work the way the docs say it should."
Again, anyone who was actually a programmer back then and is now today would never have nostalgia for those days. It was not better before.
One of the few places these kinds of jobs still exist is within the universities, in various labs or research cores. Pay is often lower, but quality of life and work often way higher, with much more freedom under responsibility.
The built-in help in Borland IDEs was supreme. Every day I used to choose a random help page, read it, copy the example, and modify it until I understood what it did. That's probably the best programming tutor I ever had, and it's how I learned programming by myself in my teens, without the Internet.
It's funny how Denmark could produce two computer scientists with very distinct opinions on what makes a computer language good (I'm happy at least one of them knows the right answer).
Either Rasmus Lerdorf or Bjarne Stroustrup. Arguably, another Danish programmer contributed to the history of Pascal: Peter Naur (and also Jørn Jensen, but he was more into implementing ALGOL 60 compilers).
Honorable mention to DHH, although he didn't create a programming language. And also to Lars Bak for being the lead developer on V8.
Because, even though I hate it, web tech is the main way to build GUI apps nowadays. The demand for native apps in corporate settings has all but disappeared.
> But what really made TP 7 special was its CP/M heritage. Turbo Pascal came to Windows by way of CP/M and MSDOS, which meant it had a compiler core that was designed to run with tight memory and a comparatively slow CPU. In the days when most compilers had to write a stack of intermediate files on disk, TP just constructed everything in memory. [...] It doesn’t matter much today if your compiler spits out a heap of temporary files, because disk storage is fast and cheap. But in the days of the floppy disk, you had to minimize disk access. Unfortunately, you also had to minimize memory use, because there wasn’t much of that, either.
Of course, the “heap of temporary files” is itself originally an adaptation for low main memory. The genius of Turbo Pascal is exactly that on those early PCs main memory was large enough / mass storage was small enough that for the program sizes in question the architecture of many passes communicating via disk files just wasn’t worth it. As far as I know, that was never true on minicomputers that our compiler tech was born on, and it’s very much not true on PCs now. (Consider: DOS machines had 100s of K to perhaps 4M of RAM and a couple megs to several dozen megs of floppy or disk, amounting to a proportion of perhaps 1:20. The laptop I’m typing this on has 4×10¹² bytes of mass storage and 64G bytes of RAM, a ratio of about 1:60, and I specifically made sure to get an abnormally large amount of RAM.)
Just like porn advanced web technologies, videogames advanced PC hardware. Don't believe anyone who tells you that porn and videogames are useless: they were the recreational equivalent of the Apollo Program.
I mean, I know why I got that much—I really value the ability to load up a ~20GB dataset into Pandas or pipe it through AWK for a one-off (or ten-off) task and just not worry about the particulars. Running `make -j16` and having the system stay usable while the compiler chews through the hundreds of KLOC of (admittedly very... vertical-looking) C and C++ is also appreciated. Also, memory modules are cheap and I feel stupid paying the markup for basically the same ICs soldered down, and once I had the slots available, well, as I said, memory modules are cheap.
More generally, though, screens and syntax trees.
First, a laptop screen that a 19th-century book printer wouldn’t call an atrocity against the art (say 200ppi and above), at 32 bpp, takes ~16 MB for a single framebuffer. At 60 Hz, that means your display panel interface is already pushing 750 MB/s of useful data (and I’m ignoring all the legacy cruft like overscan and vblank). Unsurprisingly, the GPU that does all that, for multiple layers, in floating point (so 96 or 128 bpp), is heavily pipelined. Feeding that pipeline adequately and without heroics means at least a length-3 swapchain, so already you have 48 MB of screen buffers for any full-screen app that you don’t need to wait to redraw after you switch to it (remember when we needed to do that?). If said app is showing something large and scrollable, and you count on being able to see the thing you’re scrolling (remember when you couldn’t? thank touch input for killing that), that’s probably also, what, three, four screens’ worth of prerendered tiles? And now we’re already pushing a hundred megs—just for the pixels. How many browser tabs do you have open? (Personally, I get really irritated when my browser gets an attention deficit and doesn’t immediately show me the tab after I click on it.) Or thousand-page PDF specs?
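As a back-of-envelope check of that arithmetic (the 2560×1600 resolution is my assumption for a ~200ppi laptop panel; the exact figures obviously depend on the screen):

```c
/* Rough framebuffer arithmetic for an assumed 2560x1600 panel at 32 bpp, 60 Hz. */
#include <stdio.h>

int main(void)
{
    const double w = 2560, h = 1600;   /* pixels (assumed panel)    */
    const double bpp = 4;              /* bytes per pixel at 32 bpp */
    const double hz  = 60;             /* refresh rate              */

    double frame = w * h * bpp;
    printf("one frame:        %6.1f MB\n",   frame / 1e6);        /* ~16.4 MB */
    printf("scanout at 60 Hz: %6.1f MB/s\n", frame * hz / 1e6);   /* ~1 GB/s  */
    printf("3-deep swapchain: %6.1f MB\n",   3 * frame / 1e6);    /* ~49 MB   */
    return 0;
}
```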
(I was recently installing an old version of Windows in QEMU, and the installer showed me the installation wizard’s window—with all the text and controls—first and then spent multiple seconds blitting a purely frivolous graphic of a computer with installation disks around it onto the left pane. I didn’t remember computers used to do that! At least until you got around to installing the graphics driver, anyway, I guess.)
Second, I’ve been mulling over a Web browser’s job, and I can’t help but conclude that it’s really bloody awful.
The existence of the mutable DOM means it has to maintain a full-fat syntax tree for all the HTML—a humongous soup of pointers in a 64-bit address space (so in modern browsers they are hand-rolled 32-bit not-really-pointers). On top of that it needs to have an acceleration structure for style recomputation accessed on the critical user interaction path (because :hover), as well as a layout tree with line boxes supporting dynamic Unicode- (even BiDi-) aware line breaking and perhaps even hyphenation on resize, and none of that should fall over if you load up War and Peace in the original Russian+French with the paragraph breaks deleted.
It’s old compiler lore that no syntax tree will be as compact as the source code it was constructed from. Browsers sound like the bottommost circle of syntax tree hell even more than a general GUI has to be. The situation is probably salvageable with a flattened representation like the one Jetpack Compose uses and differential execution[1] used before it. But it’s definitely a lot of work, and to my admittedly cursory knowledge noöne’s really working on it.
(This part inspired by the urge to clutch my pearls that I felt while watching Andreas Kling’s Ladybird videos—there’s a lot of very thicc pointer soup in there, and I don’t think it’s possible to morph it into something saner in a continuous fashion. But then I thought about the problem, and yeah, the problem sucks.)
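For the curious, here is roughly what the "32-bit not-really-pointers" trick looks like, as a toy sketch of my own (no resemblance to any real engine's layout is claimed): nodes live in one arena and refer to each other by 32-bit indices, which halves the per-link overhead on a 64-bit machine and makes the whole tree relocatable.

```c
/* Toy DOM-ish tree using 32-bit indices into an arena instead of pointers. */
#include <stdint.h>
#include <stdio.h>

typedef uint32_t NodeRef;                 /* 4 bytes instead of 8 */
#define NODE_NONE ((NodeRef)0xFFFFFFFFu)

typedef struct {
    NodeRef parent, first_child, next_sibling;
    uint32_t tag;                         /* interned element name */
} Node;

static Node    arena[1 << 16];            /* all nodes live in one slab */
static NodeRef arena_used;

static NodeRef node_new(uint32_t tag, NodeRef parent)
{
    NodeRef r = arena_used++;
    arena[r] = (Node){ parent, NODE_NONE, NODE_NONE, tag };
    if (parent != NODE_NONE) {
        /* push onto the parent's child list */
        arena[r].next_sibling = arena[parent].first_child;
        arena[parent].first_child = r;
    }
    return r;
}

int main(void)
{
    NodeRef html = node_new(1, NODE_NONE);
    NodeRef body = node_new(2, html);
    node_new(3, body);                    /* e.g. a <p> */
    printf("node size: %zu bytes\n", sizeof(Node));   /* 16, vs. 32 with pointers */
    return 0;
}
```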
So all in all, single-digit gigs sound about fair, I think, and that's how much my system typically shows when I'm not deliberately running heavy stuff on it.
(For what it’s worth, I once spent weeks cramming all data required for Unicode normalization into about 20K. It’s certainly possible, but as it turns out, three iterations of fetch-shift-popcount for every character is slow. Not “why do I need to wait for my computer” slow, but definitely “why does my computer’s SSD have to wait for its CPU” slow.)
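Since "fetch-shift-popcount" is doing a lot of work in that sentence, here's a toy one-level version of the general trick as I understand it (real normalization tables are multi-level, and the field widths and ranges here are made up): a bitmap records which code points have an entry at all, and popcounting the bits below a code point gives its index into a densely packed value array.

```c
/* Toy sparse code-point table: bitmap + popcount rank into packed data.
   Uses the GCC/Clang __builtin_popcountll intrinsic. */
#include <stdint.h>
#include <stdio.h>

#define MAX_CP 0x400                       /* toy range: U+0000..U+03FF */
static uint64_t present[MAX_CP / 64];      /* one bit per code point    */
static uint16_t packed[64];                /* data only for set bits    */
static unsigned packed_used;

static void put(uint32_t cp, uint16_t value)   /* build, in ascending cp order */
{
    present[cp >> 6] |= 1ULL << (cp & 63);
    packed[packed_used++] = value;
}

static int get(uint32_t cp, uint16_t *out)
{
    if (cp >= MAX_CP) return 0;
    uint64_t word = present[cp >> 6];                 /* fetch    */
    if (!((word >> (cp & 63)) & 1)) return 0;         /* shift    */
    unsigned rank = __builtin_popcountll(word & ((1ULL << (cp & 63)) - 1));
    for (uint32_t w = 0; w < (cp >> 6); w++)          /* earlier words */
        rank += __builtin_popcountll(present[w]);     /* popcount */
    *out = packed[rank];
    return 1;
}

int main(void)
{
    put(0x41, 100); put(0xC0, 200); put(0x300, 300);
    uint16_t v;
    if (get(0xC0, &v)) printf("U+00C0 -> %u\n", v);   /* prints 200 */
    if (!get(0x42, &v)) puts("U+0042 has no entry");
    return 0;
}
```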
I was around back then and there is a lot of nonsense in this article. The biggest one being overstating Turbo Pascal's role in Windows' adoption. Nobody cared what tools programmers used back then. MS-DOS dominated and Windows dominated because they were the OS that came on IBM compatible PCs. And that platform was by far the dominant hardware platform at the time. All the software users wanted was on MS-DOS and Windows 3.1 was backwards compatible with MS-DOS. Etc. Etc. (And while some of that software was written in Pascal, most of it was not.)
Pascal at that time was like Ruby or something is today. Definitely widely used, but a distant second to the incredibly dominant C. There's no way to explain to folks today how dominant C was back then. If you were a software engineer, learning C was just a given. You may or may not have bothered with BASIC, you may or may not have been comfortable getting down to Assembly, but if you were serious about coding, you knew C.
The new hotness was C++. Object orientation was the AI of its day. TP jumped on it so fast with 5.5 it ended up throwing away that object model for Delphi a few years later.
TP had little impact on Windows, I agree. TP was for a particular niche, small software shops targeting DOS. And it was a big success, it paid for a nice building in Scotts Valley, but it also fuelled the ambitions of a bunch of executives who got derailed, getting into databases, a multi-pronged bet (Paradox, dBASE, Interbase) that didn't really pay off.
Don't forget about the time Borland really thought they could outsmart Microsoft on .NET and gave us the wonderfully horrible Delphi 8. To put it into a metaphor: Delphi 7 was like XP, Delphi 8 was like Vista. People still use D7, while Borland and Embarcadero want to forget they ever tried D8 (and Delphi Prism, which is just rebranded Oxygene).
And at about the same time they also tried doing a native Delphi for Linux, which was so bad they pulled out after 3 versions and never tried Linux again until recently (and even then, you can only cross-compile, you can't run Delphi 12 on Linux or macOS).
And they also tried getting into the VCS sphere with StarTeam (which somehow still received updates well into 2017, despite being made in 1995). I don't know why OpenText decided to buy it in 2023.
Let's not forget about Turbo Prolog and the time they almost did a Turbo Modula-2, but it was actually published by TopSpeed and now lives in Clarion. Ugh.
Borland... Err, Inprise... Err, Embarcadero... Or CodeGear? Or Idera? Who keeps count? Anyway, that switched so many hands it's sad to see. They were doing way too much. A lot of bad decisions made them less and less popular (not like their current greed is helping their case). Oh well. Long live Free Pascal.
One place I worked in the 90s had a bunch of meetings/tutorials to try to sell all the C programmers on Object Oriented Programming in general and C++ in particular -- the software in question was super complicated and massive and IMHO could have benefitted from some judicious application of OOP concepts like separation of concerns -- but it got caught in a rut of just constantly trying to explain OOP to middle-aged programmers who had only ever written and thought in imperative code.
I think this is fundamentally the same toxic thinking that is still with us today.
C offered no actual advantages over Turbo Pascal for DOS/Windows programming. I'm sure one can come up with platforms or releases that weren't supported by TP, since C was the de facto standard. Regardless, they're fundamentally equivalent.
Meanwhile, Turbo Pascal was blazingly fast compared to C. The C linker in particular used to take forever. Turbo Pascal was seen as a toy language. My AP Computer Science exam results were disregarded by colleges because it was in Pascal.
A few years later, I was a C++ snob and looked down on Visual Basic "macro hackers". Meanwhile, at least one true genius electrical engineer I worked with would rapidly prototype software in VB that would take me weeks to replicate in C++.
There's a weird disdain in software development for approachable, efficient tools that create actual business value.
Pascal with its tight type safety made certain things extra hard, like working with images. C on the other hand let you walk all over it. Pascal was very much my preferred language until I learned C - then I never looked back.
> The familiar ‘green tick button’ was an OWL enhancement, which graced so many Windows applications that it eventually became an irritating cliche.
Fun fact: this lived on in Delphi, TP's successor. The TBitBtn component, a button with an image and one of the oldest Delphi components (probably from Delphi 1?), has a Kind property that auto-sets all sorts of glyphs; Kind := bkOk sets it to a green check.
About a year ago the glyph was updated to look a little more modern :) But it lives on. I love occasionally seeing it on apps in the wild.
Many stacks offer hot code reloading. I am not a dedicated frontend developer but I used to work on a Vue 3 frontend for a project and it was actually quite nice to have the dev service watch the files and the browser hot reload the page when I changed code in the editor. It's not all bad.
Was TP for Windows considered TP7? I spent my youth in Turbo Pascal 7 writing graphics-based apps in DOS and never even knew they made a Windows version. Not that my 286 could run Windows.
If memory serves, RAM was the reason I couldn't. Not so easy to upgrade as it is today. I tried it once or twice, using some HDD utility at the time that partitioned off some disk space to act as RAM, like a modern-day swap file... and it did not go well.
I also started learning programming in high school days (2001 - 2004) using TP 7.
Fast, affordable internet connections weren't very common in those days, so I had to rely on internet cafes and books/magazines.
As the name implies, it's indeed TURBO. I mean, to achieve similar snappiness when running Eclipse or VS, you'll need a pretty powerful machine.
The legacy is carried on by the Free Pascal folks (there's the text-based IDE). I only use it for fun/nostalgia purposes. For actual work, well, there's the Lazarus IDE.
FTA: “That almost nobody has the skills these days to implement something like TP”
I don’t think that’s true. Percentage-wise, the number may have dropped, but there are many more programmers now than there were back then.
I'm not even convinced that the percentage is lower today. There are plenty of people in the demo scene who, if incentivized, could write something like it.
You also have to realize that TP 7 is relatively simple compared to today’s development environments. To mention a few items:
- Pascal is a lot easier for an IDE to handle than C
- no messing with character encodings
- the editor may be fast, but does it stay fast when confronted with very large files?
Flashbacks to high school CS back in the 90s. Started with BASIC, then moved on to this. I understand they now use Java to teach kids, and I can't imagine trying to learn to code with that language.
This was the first easy way to write Windows software. Our company had developed software in TPW 1.5 / TP 7 to send pages to Motorola pagers via the dial-up interface provided by most pager providers.
> But who needs a Turbo Pascal, when you have 128Gb of RAM attached to an 5GHz CPU, for a hastily-written, bloated Java program to burn?
I mean, we are in 2024 and should be well past the Java joke. Modern Java is seriously fast, can be efficient as well, and can even be natively compiled with Graal (whereas 20 years ago GCJ was experimental at best). The modern equivalent of the joke should really be Electron.
But back on the topic, I really miss Pascal and Delphi.
There are still aspects where Java is lacking: JVM startup time and memory usage. While you can work around the former with Graal native images, this messes with dynamic loading of classes and with reflection, which means you can’t do it for any random Java application or library.
At least with Java you can select a GC which is more aggressive in giving memory back to the system (Shenandoah and ZGC). With Electron/Node you don't even get that (not to mention the fact Electron apps always spawn multiple processes).
The "Java resource hog" is still a thing...in a few areas.
As an Android dev, I still see complaints (most likely by beginners) saying how Android Studio consumes much RAM and asking if it's possible to replace it with.. VSCode. I still remember doing Android development using Eclipse on a 2 GB PC and it was reasonably okay-ish. Of course it's practically impossible to use Android Studio on such machine. 8 GB is bare minimum.
Well fair point. During my university days, there were 2 prominent Java-based IDEs: Eclipse and Netbeans.
Even after installing some plugins, Eclipse ran faster, and thus became my primary IDE. I kinda miss Eclipse. On the other hand, it seems like most Java devs nowadays have already switched to IntelliJ.
The article ends with an unwarranted jab at "bloated Java apps". If anything, Java nowadays is like Turbo Pascal back then, striking a reasonable compromise between complexity, readability and performance when compared to Electron apps.