One of my favourite computer games was written in Turbo Pascal: ZZT, written by Tim Sweeney of Epic MegaGames. It was a quirky text-mode game with puzzles, shooting, and so on. It had a built-in game editor that came even with the free shareware version, and it even had a little programming language called ZZT-OOP.
ZZT's original source code was lost. Years later, Adrian Siekierka painstakingly reverse-engineered Pascal source code that -- when compiled with the original version of Turbo Pascal -- produced a byte-for-byte identical executable. Amazing! Read more:
One of mine too, a really special bit of software. It was my real introduction to programming as well. I'd typed in BASIC listings from magazines and done dumb stuff on the school BBC Micros before, but ZZT-OOP on my first PC was where I learned to make things I wanted from scratch.
Turbo Pascal is what got me into programming. I remember spending hundreds of German marks on a license for Borland Pascal 7.0 and later Delphi 1.0 and 2.0. I ended up developing my first “commercial” software that I sold for money.
In the DOS era, Turbo Pascal was probably the easiest way to get into programming outside of BASIC. And on Windows 3.1/95, Delphi was eye-opening in how easy GUI programming could be.
In many ways, I feel like we have gone backwards from there. How was it possible for the Turbo Pascal / Delphi compiler to produce a small binary for a fully-featured GUI program when today similarly powerful software is orders of magnitude larger in size?
Even on the machines of that time, changing the code and seeing the results was practically immediate with the first versions of Turbo Pascal, as long as the program and the source fit in RAM. And the interaction was designed to be immediate too. If I remember correctly, a compilation error didn't cause endless screens of output; it just repositioned me to the line where the error was. If the cause was a typo, I was able to edit, recompile and run in a second. Most more recent tools have other ideas, requiring more unnecessary activities from a developer.
> fully-featured GUI program when today similarly powerful software is orders of magnitude larger in size?
That's because the "fully featured" of the 90s would be barely usable today. Or to rephrase: the frameworks and programs of today are not "similarly powerful" to the ones from the mid 90s. Even if you just recompiled the app from 25 years ago with the current version of your framework (theoretically ;), it would gain support for Unicode (codepages were just...), internationalization, accessibility, support for networks (who still knows Novell?),... . Except for SAP, who somehow succeeded at combining the user-hostility of the 90s UIs with the resource consumption of contemporary programs ;).
Turbo Pascal had been the language I used after BASIC, with which I started.
Joe Armstrong once said, "You wanted a banana but what you got was a gorilla holding the banana and the entire jungle." That's a very catchy metaphor, although he was talking about C++-style OOP with bloated classes. Still, it feels like it could be generalized to software bloat, right?
What you're arguing seems to be "you think you just wanted a banana, but it turns out the customers needed various banana sizes, levels of ripeness, even special alternative hypoallergenic banana breeds, so instead you get a gorilla to fetch you the right banana from the jungle for each situation."
I do think that there is truth to that, but to be honest I think that in an ideal world that would account for some of these orders of magnitude, but leave out the majority of them. It is really, really easy to underestimate just how much faster and bigger computers have gotten in the last thirty years.
What I find a more compelling argument for the majority of this increase in software size and CPU usage is that letting the software bloat and slow down to the level that customers tolerate is a way for developers to externalize costs. A lot of developer convenience comes at the cost of the end-user, imo. And even if the developers care, the companies that employ them don't mind saving money that way.
> That's because the "fully featured" of the 90s would be barely usable today.
Saying that in a thread about Turbo Pascal is strange. I mean, Notepad needs about 5 seconds to start on my Windows 10 work computer. And the usability of Windows 10 is ... not.
Depends on what you mean regarding SAP. Reports at least are super easy to write and have better usability than other in-house stuff I saw. UI5 now has components that you can use with React, which give the web experience. SAP GUI is kind of okay, and fast. What do you mean?
Ah, the mark. It's been a while. Maybe this was just in Britain, but the name of the mark was never translated into English when spoken or written; it was always just the Deutschmark.
Similarly, the pound is often referred to as "pound sterling" or "British pound" outside of Britain. I suppose it is because, outside of the home country, "30 marks" and "30 pounds" are ambiguous terms without the prefix/suffix.
Unicode support is cheap memory-wise. The expensive thing is having the fonts. The fonts can live on the hard drive until actually needed.
I suspect the issue is more of the runtime overhead of supporting it over the entire stack. That's a bigger cost that people would only want to pay if they were actually going to use it. So then you're in the situation of having to support unicode and non unicode versions of everything.
Most programs written in Pascal were UI programs (even in client-server model client was typically a specialized program, not browser), so rendering a string would indeed require a font file. You could pre-render and cache in RAM some frequently used glyphs (locale-specific alphabet, digits etc), but hitting HDD every time to render an emoji won’t be fast enough. Modern Unicode was simply not feasible.
If you use an emoji, load it into RAM then. You aren't (generally) going to be using Arabic, Chinese, Japanese, Sanskrit, ancient Egyptian and emoji all in one document on one screen. If you're using Japanese, load the Japanese font; if Chinese, use the Chinese font.
HDD and memory sizes were growing very fast back then. So it would've been feasible even on fairly low-end hardware, starting from the mid-1990s or so. If you could have "multimedia" or "DTP" software on such PC's, modern fonts ought to have been possible. The flip side is that old computers became obsolete very quickly back then, a few years were enough for very real generational changes.
Low-end hardware in 1995 was something like an i386 with 1 MB RAM, meaning that just a modern Unicode font would consume most of the memory available to a program, probably leaving not enough space for the rendering code (which is not small by the standards of that time). In the 2nd half of the 1990s I still maintained a classroom with 15 IBM PC XTs, which were still doing their jobs (our most modern hardware was Pentiums with 16 MB RAM IIRC).
That isn't unicode though. That's a font. You can have bitmap unicode fonts if you want.
Let's put it this way. Say you have a unicode aware library and only ever use the ASCII compatible codes. You aren't using more space for fonts.
If you want to read a Chinese document, yes you would need to then install Chinese fonts. That would take space yes. But it's possible. If you only speak Chinese that's something you have to deal with.
Could you have a font for every Unicode code point at the same time? Probably not, but most people don't need to read most code points most of the time.
Today you can just use Lazarus and recompile. The lowest of the bottom machines are Atom netbooks, Core Duos and Raspberry Pis.
I'd consider using an Atom or RPi B+ the same as a 486 in 1999.
On fitting in RAM, it depends. From 1993 to 1998 the changes were huge.
Yeah, sure, about two DM to the Euro. 20 years, I meant, not 30.
Betcha it doesn't mean the same in Yorkshire as in German, though: "Teuro" comes from Ger. "Teuer", expensive. (Cognate of Eng. "dear", as in "paid dearly".) Because wages halved, numerically, while many claimed prices were about the same in Euro after the change as they'd been in DM before it. Or, if not the same, still way above half, as they "should" have been for wage/price-parity to remain the same.
The main thing I recall about TP was that it came with simple, easy to understand code examples for every function that were organized logically and easy to find and use. So as a teenager who barely understood basic I could teach myself TP without the internet, just using the IDE.
So many modern systems do not have anything even close to this. I wonder if it was inherent in the nature of the simpler x86 DOS-based systems of the time, and whether it's simply impossible, with the modern pace of change, cross-platform needs, and complexity, to have something like that.
Processing (P5) had this: you can select any string of text in its IDE and search for it in the docs, and if it's one of the built-in functions or constants it will open the associated static HTML page that came installed with the software, so no internet nor server required. And despite being offline you can still navigate the docs too. This feels like a lost basic skill in static site generation these days.
It was the only creative coding framework that had complete, offline documentation like that at the time I might add. OpenFrameworks is still mostly autogenerated stubs for example.
IMO it was one of the things that gave Processing an edge in educational contexts over all alternatives. I was pretty sad to see p5.js not fully continue that tradition and require that you go online to read the docs, and that it's not a static website but that text is rendered with javascript when you open it (still complete and with examples though).
It is, but I haven't used it in a long time so didn't want to imply anything about the current implementation, since I don't know if anything changed in the last decade. I hope it's still as good as I remember!
> I wonder if its inherent in the nature of the simpler x86 DOS based systems of the time
I also learned Turbo Pascal without any books (I had a BASIC book, because that came with the computer). But it was way easier. Drawing pixels was as easy as just selecting your (double buffered!) display mode and, well, drawing pixels. Same with reading, e.g., the joystick port or the mouse: you'd just read it.
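For anyone who never saw it, here is a rough sketch of the kind of thing being described, using the BGI Graph unit (the driver path is an assumption about a typical install, not the commenter's setup): pick a graphics mode, then just draw pixels.

program Pixels;
uses Graph, Crt;
var
  Gd, Gm, X: Integer;
begin
  Gd := Detect;
  InitGraph(Gd, Gm, 'C:\TP\BGI');  { path to the .BGI driver files: an assumption }
  if GraphResult <> grOk then Halt(1);
  for X := 0 to 319 do
    PutPixel(X, 100 + Round(50 * Sin(X / 20)), White);  { draw a sine wave, pixel by pixel }
  ReadKey;     { wait for a keypress before leaving graphics mode }
  CloseGraph;
end.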
Same here. I'd been using GFA Basic for a couple of years when I came across Turbo Pascal, and that background really helped - GFA Basic being a version of BASIC with code blocks, variable types including records, functions and procedures, and even references - so I picked up Pascal just by reading the help documentation and studying the numerous very clear and helpful examples included on almost every page.
This was a good couple of years before I discovered the Internet at university, so there was no easy access to learning resources. I'd tried to pick up C a couple of times without much luck before learning TP, and it wasn't until after that that I was able to transfer what I'd learnt from Pascal to C - although again that was aided by another great Borland product, C++ Builder :)
Turbo Pascal was the first "real" programming language I learned past 6809E ASM, structured basic flavors and batch/shell back in the very late 80s. When I went to college for CS in the early 90s, they were still teaching the first year of classes in Borland Pascal so that turned out to be very useful experience.
A few years later, my Pascal background sailed me into a half-decade stint as a custom app dev working with Delphi. That, in turn, led to a job at Borland testing the Kylix and C++Builder IDEs. Good times.
I still miss Delphi. Nowadays it's normal to have IDE plugins to the point it's a dealbreaker if they don't exist. But at the time the idea of a component library and coding environment that was easily extensible in its own language was pure magic. I had so much fun playing with both VCL and the IDE itself.
It's a shame Borland left the grassroots devs behind in their quest for the enterprise market, and more or less killed any growth in community adoption. It was a really cool setup, at least through Delphi 7 or so.
In my country Delphi was the most popular dev environment by a mile. It was really bizarre, with most of the world going for VB 5 and we in Delphi. It always seemed to me like a fantastic environment but for some reason it only took off in a couple of markets (my perception as an outsider).
That's close to the route I took, although I switched to Visual Basic 5 later on no thanks to job opportunities. Still don't feel anything rivals Delphi for GUI development. VCL was genius.
My first exposure to TP was when I pirated it off a warez BBS in the winter of '93 at the age of 14. The raw power I felt when I compiled my own EXE, in contrast to running just a .BAS file, was enthralling! I started modding Renegade BBS and writing door programs. I tried, unsuccessfully, to create worms, Trojans and viruses. It changed my life and set me on a course for where I am in tech today. Moreover, I've reformed my deviant teenage tendencies.
I owe Borland a lot.
That throws me back. Being a teenager without any real understanding of compilers, interpreters etc., being able to create my own EXE file in TP4 felt like having superpowers - like being a real(TM) programmer :)
A few years later at 16, I actually got paid for developing a small app for managing my dad's customers, paid by the company he worked for. Part of that money went into getting a legal version of TP6.
Ah, the good old BBS days. I wrote a door library for Turbo Pascal back in '89 when I was 17 after I learned how to write an interrupt based serial driver, and then released it as shareware. It saw quite a bit of use until the mid-90's when the BBS scene fell off a cliff.
I think doors in BBS terms were essentially side-loaded programs that ran within the BBS framework. Different content, text games, that sort of thing. So the library would be something written to work within that specific BBS software.
I had a similar experience in the same era. In addition to the deviant stuff I also used it to tinker with graphics programming using inline assembly for the “performance critical” (for a 386SX) parts. That experience definitely laid the foundation for what I do today.
I pirated Turbo Pascal back in the day because I was a poor student in a Third World country and there was no way I could afford it. Ditto Turbo C.
After I started working, I had a hand in selecting Borland C++ as our in-house development platform, and we paid Borland a fortune in licensing fees. Which would have gone to Intel or another company if it were not for my, and other folks in my cohort's experience with pirated Borland products as students.
The lesson is that piracy (and to an only slightly lesser extent, free student licences) was an extremely powerful tool to keep a tool market cornered: impossible to compete on price when piracy is so ubiquitous that payment is essentially opt-in.
(I had a mouse driver disc, 5.25, that wasn't pirated, it felt like something from a parallel universe with its machine-printed label)
I really wonder when all this changed. I seem to have this idea, that I share with many others that software houses back in the day explicitly didn't care much about piracy because it got people hooked on their software and in the end enterprise would foot the bill.
I'm not sure how true this was, as I've yet to see any real official sources speaking about this myself but it seems like it was like that. But most of my thoughts about this comes from being in the piracy space back in the days so I really have no idea if this was ever a real thing or not.
Free Pascal still has a look-a-like of the original TP IDE! But that code is bitrotting by their own admission (it still relies extensively on obsolete quirks of the original MS-DOS platform) and it's sad that we don't have a look-a-like version that can work as a general editor in the terminal (like neovim or emacs) and integrate with modern IDE-oriented facilities like the LSP, tree-sitter parsers or the debug adapter protocol. That could even be a game changer for editing code remotely from SSH/terminal connections.
A little over a decade ago some of my friends who were Turbo Pascal fans transitioned to Lazarus. Seemed like the logical successor to Turbo Pascal and Free Pascal. (Personally I transitioned to writing C++ in Emacs as I did not really appreciate the Pascal language.)
> Seemed like the logical successor to Turbo Pascal and Free Pascal.
To be more accurate, Lazarus is a GUI IDE for Free Pascal, not a successor. The same way Emacs can be an IDE for C that you'll compile with GCC.
I know, Delphi effectively replaced Turbo Pascal from the Borland side, even though Delphi could have been "just" an add-on to Turbo. They just didn't take that route.
It's hard to overstate just how much faster Turbo Pascal was than its competitors. So fast that it was hard to believe it wasn't cheating in some way. And then the resulting program was faster too.
A tour-de-force of its day, and it deserved all the acclaim that it got.
In my college graphics class (taught by Jack Bresenham, no less) I asked for permission to use the new Borland Turbo C for my class projects. After a quick recompile for the demo (the PC in the classroom wasn't a 80286) I found my code ran a third of the speed of everyone else's, which had been written in Turbo Pascal v4.
And that's how I learned that mature compilers are better than 1.0 release compilers.
Delphi, its successor, still does that. Here is a link to an article about a real project that contains a bit over a million lines of code. It has a video in it as well, and the comments there also discuss Win64 compilation times:
Indeed, Ctrl-F9 in Turbo Pascal was almost instantaneous, whereas the same in Turbo C++ would let you read through the list of files in the popup window as they were processed.
When I was a kid, a kindly computer store owner (who also made me a great deal on a PC-semi-compatible running MS-DOS 1.25, for approx. a hundred lawns mowed and babies sat) sold me a copy of Turbo Pascal for generic MS-DOS (no PC BIOS assumed) on 8" floppy. He transferred it to the 160KB 5.25" format that my semi-compatible used.
I hope I was appreciative enough at the time, as I am now. That helped bootstrap my career.
Back in the early days, MS-DOS was very much like CP/M in that there were several different 8088 machines. The “PC Compatible” hadn’t quite exploded yet, so MS-DOS (and CP/M 86) compatibility was enough.
But soon, Flight Simulator became the benchmark for compatibility, and I think the final nail was that Lotus 1-2-3 required an actual PC compatible machine. That plus the rise of clone BIOSs pretty much ended the brief era of generic, 8088, non-PC compatible machines, save for niche domains.
An interesting artifact of this era was when Steve Ciarcia of BYTE magazine released his 8088 board, and it was not PC compatible. He made some different design decisions.
I thought it was the BIOS that was the compatibility issue. That's one of the few proprietary parts IBM had when they released their PC. Clones were built on a reverse engineered BIOS that, for a while, wasn't quite right. I vaguely recall 90% PC Compatible computers or something like that.
Halt and Catch Fire covers this. I could be wrong though.
Flight Simulator as a test ... I still remember my boss crossing that out, saying you really cannot put that as a requirement. We are in business, not playing games ...
Missed this and commented separately. Thanks for reminding me of the model. This was my upgrade to the TI 99/4 I drove across the state of CT to spend 995 1980-ish dollars on.
I remember back in the day working for a law firm that had some Wang MS-DOS compatible PCs. Note "MS-DOS" compatible, not PC compatible...they ran DOS without even trying to duplicate the IBM-PC BIOS, as Compaq eventually did.
That was the version of DOS that the Columbia MPC ran, and it was almost 100% compatible. I ran some code based on articles about the video card, and it couldn’t do the 160x100 mode that people wrote about.
So a lot of “clones” may have only been as close as the Columbia was.
I transitioned from Apple Pascal (based on UCSD p-System Pascal) on the Apple ][+ and //e to Turbo Pascal on DOS in the early 80s. Turbo Pascal was a blast: the very quick compiler, good feedback, colorful editor, and use of the WordStar keybindings, which I already knew, made for a great experience.
I recall writing programs large enough to require use of the overlay facility, which essentially let you page in different parts of your program under DOS.
I also recall meeting David Intersimone, a great ambassador of Borland, sometime in the 80s - might have been later 80s - when he visited my university as a guest of the local ACM chapter.
Weirdly, Turbo Pascal was my first high-level programming language and it was on my Macintosh Plus. (I have come to find Turbo Pascal was more of a PC thing.)
In college I used a student loan to buy my first Apple computer, the Macintosh Plus. By chance there was a classified ad in the college newspaper for a copy of Turbo Pascal for the Mac — some professor was selling it for like $40 or something. I snagged it.
It had the manual, thankfully, but very simple tools for Macintosh development. No ResEdit, but the odd R-Maker app that expected you to create a Macintosh resource as a text file and then run it through the tool to create the resource fork.
My first few apps were Turbo Pascal implementations of algorithms from the Computer Recreations column of Scientific American (thankfully requiring little UI — a window and a few buttons typically).
Later I would learn of and move on to THINK Pascal — a much more Mac-centric IDE. Later still I would take the leap to THINK C....
But I will always look back on Turbo Pascal and remember it with fondness. It was a time when I was entering a wild new world and Turbo Pascal was there to hold the door open for me.
And Delphi, where at least he tried to do something with strings hah. Never got used to them, and never really liked Wirth languages, but respect is due of course.
Delphi strings were really well implemented: garbage collected (reference counted), multithread safe, length stored in the first integer, fast, and you could make them as big as your memory could handle. They reduced bad memory pointers a lot compared to the competition at the time. They didn't handle the Unicode transition well, but virtually no language did.
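A conceptual sketch of that layout, assuming the Delphi-style long string where a small header sits just before the character data (field names and exact offsets are illustrative, not the actual RTL declaration):

program StringHeaderSketch;
type
  { conceptual only: the string pointer you hold points at the first character,
    and this header lives in memory just before it }
  TLongStringHeader = packed record
    RefCount: Integer;   { shared reference count; literals are marked so they're never freed }
    StrLength: Integer;  { character count, which is why Length(S) is O(1) }
    { character data follows, kept NUL-terminated for PChar casts }
  end;
begin
  WriteLn(SizeOf(TLongStringHeader));  { 8 bytes of header before the characters }
end.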
On the contrary, I'd say they handled the Unicode transition very well.
At work we had many, many thousands of lines manipulating strings, high and low level, and full project was over 300kLOC plus many heavy dependencies. We spent just a day or two to convert it to be Unicode capable.
Since then we've had hardly any issues, and Unicode is not something we need to think about in the day-to-day.
In high school I thought "structured programming" would cramp my style. I'd spend hours with BASICA printouts spread out over the floor, tracing through chains of GOTOs to find my bugs.
One day my dad brought home Turbo Pascal so I gave it a shot. I wrote Conway's Game of Life, and was shocked when it ran correctly the first time. Never wrote another line of BASICA.
Much later I turned pro and moved on to other languages, but my dad kept messing around with Pascal until he was in his 70s.
> was shocked when it ran correctly the first time
This overall "if it compiles it works" philosophy is quite common among Pascal-family languages, and derived languages like Ada. Though there is also PL/I if you want a language with a loosely Pascalish syntax that relies on the complete opposite design style - the one that was carried over to C and its derivatives.
I doubt that, by modern standards, Pascal particularly "works if it compiles" compared to languages like Java or Go. They were comparing it to writing spaghetti code with GOTOs everywhere in BASICA, so no wonder their code seemed more reliable. Nowadays, the jump in reliability comes from strong and helpful type systems, not structured programming. Ada still seems good there, but Pascal is just an ordinary programming language.
Yeah, the benefits of strong typing have nothing to do with Pascal...
Sigh. Kids nowadays, etc. Look up which was the original language ridiculed for its "belt-and-braces" approach to type safety, in contrast to C, the freewheeling language of Kewl H4xx0rs (although they weren't yet spelled that way).
Now excuse me, there's a cloud I have to go shake my fist at.
I genuinely didn't realize Pascal was recognized as having a strong type system. I guess there is Brian Kernighan's "Why Pascal is Not My Favorite Programming Language," where he complains that there's "no escape" from the type system as in C, where anything can be cast to anything. But I took it to mean that C just had an exceptionally loose and liberal type system.
Yah you have to think about this in the context of really bad C code from the 1990s.
It was really common for people to stuff pointers in ints and ints in pointers and do all kinds of really abusive things with the memory system in C. It was all fun and games until the program died with nary a stack trace. (Stack traces were pretty amazing the first time I saw them!)
There were even big commercial libraries that did really odd stuff. Motif (big early Unix UI framework) did a lot of weird stuff.
Pascal was pretty straightforward and disciplined compared to that, although I'm sure developers who were too clever for their own good found ways to do silly things.
I almost coded some Turbo Pascal earlier this year. I found the source code for one of the (not so great) games I (tried to) write in Turbo Pascal 2.0 as a kid. Had fun porting it to the latest stable Free Pascal. Never used Free Pascal before and was quite impressed. Well documented, stable, amazing cross-platform. Supports everything from creating 16-bit DOS COM-files to various modern 64-bit platforms and JS.
Looking at the perfection of Turbo Pascal always made me feel stupid. At least it's not just me; Wikipedia says:
"Scott MacGregor of Microsoft said that Bill Gates "couldn't understand why our stuff was so slow" compared to Turbo Pascal. "He would bring in poor Greg Whitten [programming director of Microsoft languages] and yell at him for half an hour" because their company was unable to defeat Kahn's small startup, MacGregor recalled.[21] " https://en.wikipedia.org/wiki/Turbo_Pascal
Actually, how Turbo Pascal was faster than the competition was completely obvious to those experienced in software development at that time:
It did everything possible in memory, avoiding reading and writing of temporary files. Microsoft tools did it with file I/O across multiple passes, a traditional solution for coping with limited RAM while applying optimization steps and targeting different processors.
> "Scott MacGregor of Microsoft said that Bill Gates "couldn't understand why our stuff was so slow" compared to Turbo Pascal. "He would bring in poor Greg Whitten [programming director of Microsoft languages] and yell at him for half an hour"
They tried this on a project at my workplace. After six months of money spent on "yelling", they finally started to fix the issues.
I remember it well, including the fonts it bundled .. burned into my retina even today, given the crazy code I wrote using that toolkit. Thanks for the nice memories. I always wonder what happened to TTK .. as well as PC-Write, for which I had a source code license, and which I always tried to pack along with the TTK in my backups for some reason (I was a weird kid) ..
Fond memories of watching my dad sat in front of a bright blue screen covered in inscrutable glyphs. It looked like pure wizardry to me then.
While the original colours are too intense for my taste, I do sometimes switch to the Noctis Azureus theme in VS Code when I find myself thinking of him.
The colour scheme I recall was yellow on black. Version two, I think, but I'm not sure... I copied a copy I found -- no doubt already pirated by whoever left it there -- in the Economics computer lab onto a floppy ("crunchy", actually; a 3.5" diskette) back in ~1990-91. And then went on to implement a multi-variable linear regression algorithm for my exam paper in statistics in it.
The language that taught me what systems programming should be all about: proper security, modularity and good high-level code, with no need for security-compromising shortcuts.
This brings back great memories. I wrote a text based adventure game in turbo pascal for a high school project (97/98) and had a blast. It was a really easy language for my needs and skill level. Started a life long love of code.
Turbo Pascal was a game changer. Prior to this, I'd been learning to code with Computer Innovations C compiler. It required multiple compiler passes involving swapping floppy disks. What a pain. Turbo Pascal was small, compiled fast and produced tiny executables. It was a joy to use. Later, I used Delphi to develop the GUI for an electron microscope. It made GUI development so easy - even developing custom controls for knobs and 2-D sliders.
Is anyone going to mention that that's not at all what Turbo Pascal looked like when it was first released? Not in 1983. The UI was significantly simpler.
I'm doing that on a single-board Z80-based system, and it has to be said that writing pascal is a pleasure on such a machine. 64k of memory, and yet code compiles to real executables "instantly".
I’m one year younger than TP, and the blue screen in TFA is the one I remember when I first used it (1990 or so I guess), so that must have been a later version.
Turbo Pascal was my first real programming language (after BASIC and a bit of 8086 assembly) in my pre-teen years, and I read the manual very thoroughly. It was fascinating to learn, even though much of it was over my head.
I remember sitting at the living room table on many evenings, studying a stack of printed out source code. My task was to contribute to translating a bulletin board system called WWIV.
I went on to study Turbo C, which came with a wonderfully informative manual also. That experience formed the foundation of my programming in the next decades (2~3).
When I first learned Turbo Pascal and later Delphi I thought we reached the peak productivity to create World-Class desktop programs and in the future there will be so many good desktop programs everybody creates because nothing came close to Delphi in my opinion.
Then the web came (and I got confused) and I've never regained such clarity in programming as with Delphi on my little Windows desktop.
I started coding in C.
Things were difficult to me.
Luckily we had a good professor who switched our programming teaching to Pascal. That's when coding started to become clear and simple to me and my classmates.
Niklaus Wirth did an immense amount of work. From Euler to Oberon, passing by Modula, he was a great visionary and a mind that shaped decades.
> "The entire Turbo Pascal 3.02 executable--the compiler and IDE--was 39,731 bytes"
On my computer now, it's smaller than:
- License.txt which comes with "TurboPascal (with DOSBox)" (42,044 bytes)
- sqlanywhere.vim, syntax highlighting file for the SQL dialect of SQL Anywhere (41,929 bytes)
- parse.py, the URL parser in Python's urllib (42,057 bytes)
- doc_testing.html the Rust By Example section on writing tests in comments (42,059 bytes)
- pwd.exe (print working directory) in GitHub Desktop (42,296 bytes)
- WindowsExplorer.admx, the XML policy file describing the available settings in Windows Explorer (42,461 bytes)
Also smaller than half the binaries in /usr/bin on my x86_64 Fedora 39 system[1] and less than half the size of every binary executable in /bin and /usr/bin on my x86_64 Mac[2] in spite of the fact that all the Linux and Mac binaries are dynamically linked to at least libc / libSystem.
It's also less than half the size of Xcode's PNG-compressed Retina application icon (83,056 bytes).
[1] This includes such gems as cat (40,848 bytes), tac (40,808 bytes), and tee (40,968 bytes).
[2] Where even /usr/bin/true weighs in at an astonishing 100,512 bytes, despite its entire text section disassembling to
I was about to say there are a good few tools smaller than that on my system, but I was one hundred percent wrong.
# zsh glob for regular files that are smaller
print -l /usr/bin/*(.L-39731) # Some ~500 results
# Not interpreted or using system libraries
file /usr/bin/*(.L-39731) | egrep -v '(script|dynamically)' # 0 results
Turns out there are only smaller things if you accept big caveats.
It's being used to this day (or until very recently, at least) in some countries, generally running under dosbox because DOS software can no longer run natively in modern PC OS's.
Never wrote a line of code in Turbo Pascal afterward, but taking it in HS (somewhat by accident) in 1995 was what set the trajectory for the rest of my life.
Cool. I should try that. I was a heavy TP user too.
Man, this reminds me of amber screen monitors. I liked them better than the green screen ones, but somehow, the amber ones seemed to be much less popular, at least in areas where I was.
That was a common choice for TUI software running in the 16-color PC textmode palette. I suppose the blue provided a "dark" background with less contrast than actual black. (TUI programs of the era generally had a pure black-and-white mode too, and the overall look there was not unlike that of *nix terminal-based software.)
I suspect it was in rather large measure because white on blue was the default for the DOS prompt in the early-mid eighties, and that in turn I always thought was because those were the colours of IBM's logo.
Similar story here. I started casually at around 12 with C and later C++, in multiple attempts, and failed, but TP brought me the joy of programming. And later I could master all the other languages (including C/C++).
I will never forget how great it was to have a simple IDE with a "run" button/key, and to build simple UI toys with points/lines/rectangles/... (what was it? tortoise? turtle? idk) as a school kid at that age. It was pure magic to bend pixels to my will.
Turbo Pascal is probably the key thing that led me to end up studying CS and becoming a software engineer. I had dabbled in Basic starting around 10, I took a summer camp coding class, which would have been pretty novel in the late 80s I guess.
But then at some point we got a copy of Turbo Pascal (my Dad was an engineer and was always bringing interesting stuff home for me to try) and I started banging on that. In my senior year of HS I got super into it when I took another class. I did the whole year's curriculum in a couple months and then wrote an asteroids clone in class. It kind of drove the teacher nuts cause he said he had nothing left to teach me and he got annoyed when I started sharing the program with other students and they were playing the game instead of working.
The one thing I remember not grokking when using TP was pointers. I wrote my whole asteroids program with fixed size arrays. I don't think it helped that I didn't really have anything great in terms of books and no one around me really understood how to use them well to talk to me about it. A year later when I started learning C/C++ in college I got it quick though. I didn't even switch to CS till after I decided where to attend, I had originally planned on studying Aeronautical Engineering and the whole Turbo Pascal adventure made me change my mind.
I keep wondering if I have a copy of that program sitting on a floppy disk and I could recover it. I don't actually own a computer with a floppy drive though, even though I might have an old internal floppy drive sitting in a box.
When my dad found a box full of old floppies in the back of a drawer, I bought a USB floppy drive and was able to backup all the data in them: some of them contained my very first programs written in Turbo Pascal 3.0 and GW-BASIC.
USB floppy drives are quite cheap to buy, and in my case it was totally worth the money!
One of the best things about Turbo Pascal was having to declare all variables upfront at the beginning of functions.
I really don't know why languages like C, Java or JavaScript started to allow variable definitions everywhere. Needing to declare variables first leads to much cleaner functions.
Yes, those are useful, but I think the requirement to declare upfront in general prevents function that are too long.
When you start to have a screen height full of variable-declaration, you start thinking: "Maybe I should refactor this." Inline variables conceal that a bit.
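For illustration, a minimal sketch of the style being discussed, with every local in one var section at the top of the routine (names and the example routine are mine, not from any particular codebase):

program VarDemo;

function SumOfSquares(N: Integer): LongInt;
var
  I: Integer;       { all locals declared up front... }
  Total: LongInt;   { ...so a screenful of declarations is immediately conspicuous }
begin
  Total := 0;
  for I := 1 to N do
    Total := Total + LongInt(I) * I;
  SumOfSquares := Total;
end;

begin
  WriteLn(SumOfSquares(10));  { prints 385 }
end.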
> I want to have variables that only exist within an `if` block. That is something C99 got right.
C89 allowed you to declare variables at the top of any block, not just functions, so you could declare variables that only exist within in if block. For example:
if (flag) {
    int c;  /* c exists only within this block */
    while ((c = getchar()) != EOF) {
        /* code */
    }
}
I'm currently reading "Assembly Language Step-By-Step" by Jeff Duntemann from 1992. In it, he calls TP "moron friendly" (unlike Assembly). He also describes something like a Sears Catalog fallacy,
"where you go hunting through a veritable Sears catalog of toolbox products looking for something like: SearchDataFileForFirstRecordBeginningWithStringAndDisplayInRed
Basically, this method glues other people's canned procedures into programs..."
30 years ago they had the same complaints we have today.
I had something called Turbo Pascal DiskTutor which if I remember correctly really helped me understand object oriented programming.
I also spent quite a lot of time working on a type of rough wireframe 3d CAD editor in regular Turbo Pascal when I was in the 8th grade. I used equations I got from a little reference book my dad had brought from some garage sale or whatever to figure out how to do rotations and perspective.
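For flavour, here is a rough sketch (not the commenter's code) of the kind of math such a wireframe editor needs: rotate a point around the Y axis, then apply a simple perspective divide to get screen coordinates. The screen centre and viewer distance are arbitrary assumptions.

program Wireframe;
type
  TPoint3D = record
    X, Y, Z: Real;
  end;

procedure RotateY(var P: TPoint3D; Angle: Real);
var
  C, S, NewX: Real;
begin
  C := Cos(Angle);
  S := Sin(Angle);
  NewX := P.X * C + P.Z * S;    { standard rotation around the Y axis }
  P.Z := -P.X * S + P.Z * C;
  P.X := NewX;
end;

procedure Project(const P: TPoint3D; Dist: Real; var SX, SY: Integer);
begin
  { simple perspective divide; larger Dist means weaker perspective }
  SX := Round(P.X * Dist / (P.Z + Dist)) + 160;  { centre of a 320x200 screen }
  SY := Round(P.Y * Dist / (P.Z + Dist)) + 100;
end;

var
  P: TPoint3D;
  SX, SY: Integer;
begin
  P.X := 50; P.Y := 20; P.Z := 0;
  RotateY(P, Pi / 6);           { rotate 30 degrees, then project to the screen }
  Project(P, 256, SX, SY);
  WriteLn(SX, ' ', SY);
end.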
We also used it in AP computer science class which was probably the highlight of high school for me.
I started with TP 3 in high school in the late 80s and then went on to use it my freshman year in college as all our Intro to CS projects were in Pascal. I think 98% of the folks in that class used the mainframe, which involved submitting your job in the terminal and then walking across campus to the printing room and seeing the output. I did all my work in my dorm on my Tandy 1000 SX and when I was done was able to use a pirated copy of an IBM file uploader to get it into my mainframe account and then submit it. Luckily TP 3 still understood old style Pascal comment delimiters as we were graded as much on our comments as we were on working code.
A couple of years later as a senior I used TP 3 using "Turbo Vision" to write a simple learning focused stats package for a professor's book (_Quality Control, 4th Edition_). I found out later that same package was being used for the production line of Jim Beam whiskey.
I also wrote a few shareware programs in TP and put them on Simtel in the early 90s.
It was pretty cool getting random checks in the mail for utilities like "vgalogin", a Novell login frontend in VGA mode with a floating message of the day.
TP 2.x and 3.x had integrated editors that were functionally similar to (inspired by) WordStar, but were not WordStar, and using an external editor was a pain in the ass.
I'd like to get into Free Pascal and Lazarus but I am also trying to get into so many other things like Python. I worked with Procasti to do this forum software: https://github.com/orionblastar/K666 But I forgot what I learned due to my medicine and mental illness which causes a disability.
I've come a long way, but needed a break to get sane again.
All is well. How are you doing? I'm thinking of skipping Windows 11 and going to Debian or Ubuntu. My PIII problem was that the CD-ROM drive was not supported by Linux, so I had to buy a Sony drive.
I got Ubuntu on a laptop and Virtual Box on my Windows 10 PC. I am slowly getting better with it. All I had to do was avoid IWETHEY and investigate on my own.
I still open up TP7 in DOSBox-X to play with "leet code" puzzles. They're mostly imperative and the super-fast compilation time and debugger are impressive even by today's standards.
Other than some Logo experience, I first learned "real programming" using Turbo Pascal. I was in sixth grade and went to a programming summer-camp-of-sorts, held on the grounds of the Techno-da science museum (today called the Mada-Tek). At break time, one of the other kids got a copy of "Ironman Super Off Road" [1], and we would play or watch others try to beat the computer. And at break time there were bread rolls with some filling I think, and every other day or so it was this chocolate-flavored spread.
Man, that was so much fun!... I haven't thought about those times in years; thanks for the trip down memory lane :-)
I was carrying a shoebox with my programs on 5 1/4" floppies up to the 4th floor of my library, where they had dozens of Apple IIe PCs. One of the programs was Turbo Pascal. The first computer science class in 1985 was in Pascal. Funny, I was just as interested in the teachings of Blaise Pascal and Pascal's Wager. Like many of my generation, we wrote an asteroid program where an asterisk was printed in a for loop at a random place between 0 and 79, producing asterisks coming down the screen, and the user had to navigate a caret left-right avoiding the asterisks. This was not part of my boring assignments. I wrote it, shared it with my friends and carried it in my shoebox filled with floppy disks.
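Something like this loose reconstruction of that asterisk toy, presumably (the Crt calls, key choices and timing are my assumptions, not the original shoebox version, and collision detection is left out):

program Dodge;
uses Crt;
var
  Ship, Row: Integer;
  Key: Char;
begin
  Randomize;
  ClrScr;
  Ship := 40;
  for Row := 1 to 23 do
  begin
    GotoXY(Random(80) + 1, Row);  { asterisk at a random column 0..79 }
    Write('*');
    GotoXY(Ship, 24);             { the caret is the player, on the bottom row }
    Write('^');
    if KeyPressed then
    begin
      Key := ReadKey;
      if (Key = 'j') and (Ship > 1) then Dec(Ship);
      if (Key = 'k') and (Ship < 80) then Inc(Ship);
    end;
    Delay(200);
    GotoXY(Ship, 24);
    Write(' ');                   { erase the old caret before the next frame }
  end;
end.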
Happy Birthday Turbo Pascal. (TP means something else in the US).
They taught Pascal in high school, for my AP Computer Science course. We used Turbo Pascal on PS/2 model 30's. There were no hard drives in the machines! The teacher handed out floppies at the start of the class. By that point, I already knew C but it was still fun learning a new language.
One of my favorite vintage PCs is my PS/2 Model 30, and I had no idea it was available in a dual floppy configuration! It seems way too late to have been sold that way, but yup, checks out, that was the bare-bones configuration.
Turbo Pascal! It's been ages, but it's the first language I learned. The Borland IDE was amazing for its time and allowed me to learn the language in no time. My dad learned it too, and even developed some small games with it. Even now he likes to use its successor, Delphi.
I mostly consider languages mere tools, like screwdrivers. But still, there is that warm fuzzy feeling when I remember laying my hands on Turbo Pascal. Compared to other "high level" tools of the time, it was at a different level.
Turbo Pascal is still in the very good hands of Embarcadero Technologies and Delphi and Marco Cantu. One of the best value propositions for app development targeting multiple platforms simultaneously with native code.
The first ever programming environment I ever had contact with. The awesome feeling of compiling your own programs and bending the computer to your will. Fond memories.
Thinking about it, I'm probably a member of the last generation to have the magic experience of computers quitting the realm of science fiction, stuff we only saw in movies, to become a real-world tool. People born after that time surely experience them as trivial, mundane stuff, something they've been used to since forever. For me, to this day, computers still feel a bit "magic", like flying cars, except they actually happened.
My first "big" program was a school project submission done in TP. It had an editor for drawing pixels in a (text) grid .. one line at a time. You could draw polygons, name them and the next time you drew something similar to the earlier ones it would recognise it and suggest the same name. If it couldn't find one, it would ask you for a name to remember.
Was so much fun writing it. The interface was fun to write. You could use it to tell apart isosceles triangles from equilateral triangles, for example.
The machine I used only had two floppy drives and still I recall TP was fast as a Ferrari.
That's a blast from the past. Turbo Pascal 3 was the first "real" language that I learned when I was young. And then there was 5.5, which had a much nicer IDE. Ah the good old days.
Put together a sieve of Eratosthenes from the algorithm as a kid and thought I was hot shit. Something about watching a computer slowly tick through primes was so satisfying.
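Roughly this sort of thing, for anyone who wants to relive it: a minimal Turbo Pascal-style sieve, with an arbitrary limit.

program Sieve;
const
  Limit = 1000;
var
  Composite: array[2..Limit] of Boolean;
  I, J: Integer;
begin
  FillChar(Composite, SizeOf(Composite), 0);  { everything starts off presumed prime }
  for I := 2 to Limit do
    if not Composite[I] then
    begin
      WriteLn(I);              { I survived the crossings-out, so it's prime }
      J := I + I;
      while J <= Limit do
      begin
        Composite[J] := True;  { cross out every multiple of I }
        J := J + I;
      end;
    end;
end.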
While SWAG seems to be a group (https://en-academic.com/dic.nsf/enwiki/9596219) I remember this executable "swag.exe" that ran inside Turbo Pascal and had tens of demos of this (for me, then) advanced coding environment, even with graphics demos.
Pressing F9 would compile and run the selected "demo" super easily.
One of my earliest memories of using a computer, I think.
The interesting thing about Turbo Pascal is that by the time it was TP6 it was just about as expressive as C itself. (Evidenced by the fact that Turbo Pascal for Windows could easily interoperate with the heavily C-based Win16 API.)
My high school started offering its first programming course in 1988. It was in Turbo Pascal, and it was being taught by a math teacher who knew absolutely nothing about programming and was learning out of the book (Borland Turbo Pascal 5.0 [still have it]) at the same rate we were. The classroom was basically a computer room lined with PCs that we all sat in front of. These PCs were equipped with 5 1/4" drives as well as 40 MB hard drives. Being the "hackers" that we were, we downloaded the original version of nethack (early rogue offshoot) off of a BBS with our 1200 baud modems, brought it to school on 5 1/4" disks, loaded it onto the hard drives, and started playing it while the teacher sat at the front of the room with her nose buried in the book, trying to figure out how to learn enough Pascal to offer instruction. Unfortunately, after a while, these mysterious black rectangles started randomly appearing on the screen, and the PCs started experiencing performance issues. Nobody could figure it out. After many months, it was determined that we had infected all of the computers with the Jerusalem virus (in 1988!). Soon after, a school-wide prohibition on installing your own software was implemented (though not followed).
Whenever TP or Delphi get brought up, people wax nostalgic about how amazing they were, and that they are actually unmatched to this day in certain aspects.
I know the adoption of both got hurt by Borland's and Embarcadero's corporate shenanigans. However, FreePascal and Lazarus are reportedly just as good as Borland's products, but without the business issues. Despite that, they seem to barely get used. Why?
I'd guess it's the same thing that makes any niche language difficult to use for real: developer education, tooling and lack of third party libraries. Mainstream languages tend to have these three things in spades, and the language itself is good enough.
FreePascal is pretty good but could be better with more manpower; Lazarus never was as good as Delphi GUI-wise, and hasn't been updated for modern GUI toolkits, so (unfortunately) it's almost obsolete by now.
What do you mean? I've been using Lazarus for many years and i used Delphi 2 and 7 before that (mainly 2 though) and i find Lazarus to be an improvement over these versions (AFAIK Delphi 7 was incredibly popular - among Delphi developers anyway - to the point where it was offered for years after newer versions were made).
IMO the main reason is simply that Pascal has lost its "cool" status - and also there is a ton of misinformation out there about it (i even still see people mentioning Kernighan's article on Pascal about why it isn't good, which not only isn't valid anymore -aside from a couple of cosmetic differences- it also wasn't valid in the 90s or really even when he wrote it - though in his defense he was referring to Standard Pascal, but that hasn't been relevant for literal decades).
> and hasn't been updated for modern GUI toolkits, so (unfortunately) it's almost obsolete by now.
Lazarus is a fully volunteer-developed project, so people work on what they want. If you want support for a modern GUI toolkit you basically need to do it yourself (the maintainers are very accepting of external contributors).
Though chances are your info is a bit out of date. As of right now (i use the trunk version since i contribute to Lazarus, though my contributions are certainly on the "old GUI toolkits" side) there is support for Qt6 (which is modern enough in my book :-P) which seems to be at a decent state. Here[0] is an image with Lazarus compiled using the Qt6 backend with a small 3D model viewer i wrote - running Lazarus itself is a litmus test for a backend as the IDE is quite complex.
Of course the neat bit with Lazarus is that you can also use any other toolkit that is supported - in the same image you can see the same exact model viewer running with the Gtk1 backend :-P
My first language in college was TP. It was such a nice environment to use that when I first encountered gcc and similar UNIX compilers I thought them to be very primitive.
TP was also the first language that I learned in college in '84. I remember the prof always calling it the language of love. We made fun of him a lot for that statement, but I think he got the last laugh...
I used the Turbo Debugger to crack or develop cheats for games, that is until I discovered Softice. It always amazed me that when I set a breakpoint, it would halt the running program, switch to text mode, I could single step for a bit, modify an opcode, then continue running, and the screen would flicker back as if nothing had happened. Even the key combination, I think it was <ctrl> <break>, would halt a lot of programs. This was in the DOS days when programs had a lot more control over the system.
My experience with Pascal was my first experience with a procedural language. Before that I had been writing BASIC. I didn't believe at first that I wouldn't need a goto statement somewhere. I recall later thinking that, compared to BASIC, the resulting code was more satisfying looking. In high school we used the Watcom compiler running on QNX on a 286. It was slower to compile, I remember, than my later experience with Turbo C.
I still have my original Turbo Pascal 7.0 floppy install disks from high school in the early/mid 90s. There was no Googling or ChatGPT'ing for help when you got stuck...I feel like it made me a better thinking programmer back then and that somehow I've lost something with all the answers only one query/chat away.
I bought a copy of Turbo Pascal at the 10x10 booth they had at the West Coast Computer Faire in SF when it first released in 1983? or 1984. I used it to build a TSR app on DOS 2.2 in my first job. Still the fastest dev environment I have ever used relative to anything else at the time; it was amazing to use.
Man - such memories. I used this for the first time in a commercial software product for Geophysical forward modelling of gravity and magnetics. It was such an amazingly fancy system at the time - the first "real" development system I'd ever used.
I learnt to program and to animate sprites with Turbo Pascal!
Later on me and a bunch of friends programmed a Sokoban-clone called “Project S” for MS-DOS: it used the graphics mode “X” for smooth scrolling; we even bought a commercial .mod player library for the sound effects and background music.
One day, a spiffy program called Compas Pascal appeared from Denmark, which Philippe Kahn bought and renamed Borland Turbo Pascal. Turbo Pascal was sort of shocking, since it basically did everything that IBM Pascal did, only it ran in about 33K of memory including the text editor. This was nothing short of astonishing. Even more astonishing was the fact that you could compile a small program in less than one second. It's as if a company you had never heard of introduced a clone of the Buick LeSabre which could go 1,000,000 MPH and drive around the world on so little gasoline that an ant could drink it without getting sick.
I learned Pascal in highschool the last year they taught it before switching to Java! I remember the textbook we used had a Dune themed question/problem to solve which was delightfully nerdy compared to the corporate OOP blandness of Java
This brings back memories.. I remember running Turbo Pascal 4.0 under PC-Ditto (an XT emulator) on an Atari 520ST :-)). It all looked mysterious and raw compared to GEM [1], but it was probably my first _serious_ programming setup.
The Atari was from Germany, and I didn't know German at all, but I knew a bit of English. And so probably that was an important factor in why it was more attractive than whatever Basic or Assembler I had on the host system.
Dear God, how I LOATHED Turbo anything as a beginning student. Yes, the debugger was helpful, but I spent HOURS upon HOURS chasing down non-sensical error messages. This was in the pre-Mozilla days mind you, so good luck finding help. Even the TA was useless. Years later, I learned C++ with gcc. Only debug tool we used was "got here" statements. MUCH less frustration.
I have vowed to smash every copy of Turbo Pascal and Turbo C/C++ that I find.
My High School had a computer lab of ~8 Apple ][s with dual floppy drives and the CP/M board to run UCSD. It worked, but we were definitely limited in the number of computers we could use due to these limitations. I also had access to an HP 9835 also running UCSD, so it was very familiar to me.
But part way through my class we switched from UCSD to Turbo Pascal, which only needed one floppy and just absolutely blazed. It was like a space age rocket ship.
For all who wonder why Turbo Pascal was so fast, here are some insights:
50% is certainly due to the language Pascal itself. Niklaus Wirth designed the language in a way so it can be compiled in a single pass. In general the design of Pascal is in my opinion truly elegant and compared to other programming languages completely underrated. Wirth published a tiny version of his compiler written in Pascal itself in a 1976 book called "Algorithms + Data Structures = Programs".
In the late 70s Anders Hejlsberg took that version and translated it into assembly. He certainly must have changed the code generator, since Wirth's version emitted bytecode for a tiny VM whereas Anders' version produced machine code; however, if you take a closer look, especially at the scanner and parser of Turbo Pascal and Wirth's version, you can see that they are very similar. Back then Anders was not so much a language guy in my opinion but much more an assembly genius. And that resulted in the other 50% of why Turbo Pascal was so fast:
-) The entire compiler (scanner/parser/typechecker/code generator, and later the linker) was written in assembly.
-) The state of the compiler was held as much as possible in CPU registers. If e.g. the parser needed a new token from the token stream, all registers were pushed to the stack and the scanner took over. After the scanner fetched the next token, the registers were restored.
-) The choice of which register held what was also very well thought through. Of course the CPU dictates that to a certain extent, but still there was lots of elegant usage of the "si"/"di" registers in combination with non-repetitive lodsb/stosb instructions.
-) The entire "expression logic" (expression parsing / expression state / code generation for expressions) was kinda object oriented (yes, in assembly) with the "di" register hardwired as the "this" pointer. If the compiler needed to handle two expressions (left expression and right expression), then one was held in the "di" register and the other one in the "si" register. Since the "di" register was hardcoded, you will find lots of "xchg di,si" in the codebase before a "method" (a procedure with the "di" register as a "this" pointer) will be called.
-) Clearly the CPU registers were not enough to hold the entire state of the compiler, so heavy use of global variables was made. Global variables have the advantage of not needing a register in order to access them (e.g. "inc word ptr [$1234]").
-) Parameter passing was done through registers, and where possible stack frames were avoided (too expensive), meaning no local variables (there was still heavy usage of push/pop within a procedure; does that count as a local?).
-) Parameter passing done through registers allowed procedure chaining: instead of "call someOtherProc; retn" at the end of a procedure just "jmp someOtherProc" was used to a great extent.
-) Jump tables everywhere. In general the compiler was quite table driven.
-) Avoiding strings as much as possible, and where needed (parsing identifiers / using filenames), avoiding copying the strings around as much as possible, meaning all strings were held in global variables. The big exception here was of course the copying of identifiers from the global variable into the symbol table.
-) Starting with Turbo Pascal 4.0, hash tables were used as symbol tables, and an arena allocator was used for memory management.
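
To give a feel for that style in Pascal terms, here is a loose sketch of my own (nothing from the real codebase, which was assembly; all names are invented): the scanner keeps its state in globals, so fetching the next character needs no parameters, no result, and no locals.

    { Loose illustrative sketch (invented names, not from the actual
      Turbo Pascal source, which was assembly): all scanner state lives
      in global variables. }
    program ScannerSketch;

    var
      Src: string;        { source text being scanned }
      SrcPos: Integer;    { scan position; the job the SI register did in the real thing }
      CurCh: Char;        { character the parser is currently looking at }

    { No parameters, no result, no local variables. }
    procedure NextCh;
    begin
      if SrcPos <= Length(Src) then
      begin
        CurCh := Src[SrcPos];
        Inc(SrcPos);
      end
      else
        CurCh := #0;      { sentinel for end of input }
    end;

    begin
      Src := 'begin end.';
      SrcPos := 1;
      repeat
        NextCh;
        { a real parser would dispatch on CurCh here }
      until CurCh = #0;
    end.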
I am sure I forgot a lot; I reverse engineered Turbo Pascal back in the late 90s. Most of the above applies to Turbo Pascal 7.0, but much of it had not changed since the earlier versions.
It is a shame that such a wonderful codebase is buried under the "closed source, proprietary software" label. It is clear that today nobody would write a compiler the way Turbo Pascal was written, not even in a high-level language, but the codebase has so many tricks, so many elegant solutions, that it is a real pity it is not open source. Of course the codebase is on the web, just not officially.
Thank you Anders Hejlsberg for such a wonderful piece of software.
All of this is fascinating. I believe single-pass compilation is underrated and too quickly dismissed by a large part of the PL community as anachronistic. I think that's complete nonsense. Just take one look at the massive build infrastructure driving modern monorepos to see how incredibly crucial fast compile times are.
> Jump tables everywhere. In general the compiler was quite table driven.
It would be interesting to see how this approach fares in the face of modern branch prediction on modern CPUs.
Single-pass is a bit of a gimmick. It requires programs to be written sequentially in a strictly "bottom up" way, so that forward references to parts of the program that have yet to be defined are rare enough that they can be marked specially (e.g. as with C program headers).
It's also largely irrelevant if you want optimized code generation, especially across multiple procedures, since that requires reading abstract representations of the code into memory and dealing with them globally, which is the opposite of single-pass compilation.
How often do you need highly optimized codegen during development? Unless I'm working on games, 99.9% of my time (even on highly performance-critical software) is spent evaluating debug builds with zero perf requirements, because I need to implement it correctly first and make sure the tests pass.
I think it's uncontroversial that most fast, statically compiled languages benefit greatly from quick debug builds. It's just that very few of them are designed with this in mind.
I think I have made the argument above that the feasibility of single-pass code gen boils down to how the program is structured, not so much the language design itself. Perhaps current compilers should be reworked to generate code about as quickly as TP did, if optimizations are totally disabled and the code is written to eschew unresolved forward references. But I'm not sure there would be much of a point. And you would still have many language features where going back and performing a second pass over what was previously parsed (and perhaps codegen'd) just can't be avoided.
I'll join the chorus here and say that Turbo Pascal was my stepping stone from beginner tinkering (BASIC on Apple II and DOS) to professional development with C. The dev environment was the perfect mix of "serious and you can make real stuff including compiled executables for distribution" with approachability for a relative beginner.
I miss the Turbo family of IDEs. I worked a lot in Turbo C, and did my graduation project in college with it.
Our teacher made us use vi in the first year. I still could not believe that you could just set a breakpoint and stop the program in the middle of execution to inspect variables. It was like magic to me.
I learned it in school when I was about 14 or 15. My daughter is 8 and she'll learn some programming in her third year at school, but I bet it won't be TP. I deliberately haven't taught her any programming, but if she likes it we'll make a game in Godot.
I tried to view the article. First, I got a certificate error. I tried again, and I got a proxy error. I tried again, and then I got a "WebCommand Error". So, then I tried again, and this time the connection timed out.
Turbo Pascal was my first programming language! I think I was 10 or 11 years old when my school teacher introduced us to it. I still like its simplicity; Python (my main language now) reminds me of it.
I was incredibly lucky that my college way back in 1987 gave all engineering students a PC XT loaded with things like Turbo Pascal. So much better than being stuck on a time sharing system in a lab…
My first paying job! At least for the relatively simple problems for which I was using it (rating auto insurance premiums by insurer and by state), it was a very nice environment.
My uni had two halls of micros, one with 5.0, the other with 5.5. What's the difference? I asked. It turned out 5.5 had some new shit called "Object Oriented Programming"... and a Breakout clone to illustrate the paradigm.
At some point Borland released versions 3 and 5 free as in beer. The binaries should be out there somewhere for those who are curious.
I'm sure young people would be surprised to see what 5.5 provided. The on-screen help was amazing: you could cut and paste useful examples from the help for many functions. And the instantaneous compiling... :)
That's very cool! IIRC I tried to write a very simple Turbo Vision clone (or subset) but I couldn't fully comprehend how the nested widget initialization worked. I mean, I knew the last parameter was a pointer to the next widget, and so on and on, but somehow I couldn't "integrate" everything in my brain :).
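
For anyone who never saw it, the pattern being described can be sketched roughly like this (invented code, not the real Turbo Vision types or API): every constructor takes a pointer to the next widget as its last argument, so the nested calls build a linked list, innermost call first.

    { Invented sketch of the "last parameter points to the next widget"
      pattern -- not the real Turbo Vision API. }
    program WidgetChain;

    type
      PWidget = ^TWidget;
      TWidget = record
        Name: string[16];
        Next: PWidget;
      end;

    function NewWidget(AName: string; ANext: PWidget): PWidget;
    var
      W: PWidget;
    begin
      New(W);
      W^.Name := AName;
      W^.Next := ANext;
      NewWidget := W;
    end;

    var
      Chain: PWidget;
    begin
      { Reads top-down, but the innermost widget is constructed first
        and each outer call links itself in front of it. }
      Chain := NewWidget('statusline',
                 NewWidget('desktop',
                   NewWidget('menubar', nil)));
    end.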
Niklaus Wirth (Pascal inventor) had the rule that compiler speed must never regress. So if you add an optimization (which means the compiler has to do more work), the optimization must "pay for itself" and make the compiler faster.
That philosophy probably seeped into Turbo Pascal to some degree.
That's very interesting. I see a similar attitude in the C# JIT/Roslyn developers, who take this very seriously. I wonder whether this is influenced by Anders, or whether it's just something a lot of compiler developers care about.
You're saying that the reason for the slow compilation speed we see here is a commitment to keeping the compiler fast. That's not a logical sequence of thought.
Also, Pascal is a direct descendant of a very, very old language (ALGOL-58 from 1958). In the 1950s, HLL compilation was at the extreme edge of what computers were capable of doing and a key goal of language design was (or should have been - cough COBOL cough) to make it as efficient as possible.
By the time Ken Thompson was designing languages, hardware had improved a lot and he could make compilation efficiency a lower priority.
Technically, ALGOL-58 led to ALGOL-60, which led to ALGOL X (1966), then to ALGOL W (1966), and then to Pascal (1970). X and W were proposals by Niklaus Wirth and Tony Hoare as successors to ALGOL 60.
I think the question was rather why it took the current compiler 25 seconds to compile a Hello World on a contemporary computer... Turbo Pascal would have definitely done that in a second or less on an 8086 CPU...
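
For reference, the program in question is about as small as programs get; there is nothing in it that should keep a toolchain busy for 25 seconds:

    program Hello;
    begin
      WriteLn('Hello, world')
    end.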
Hand-written, and also optimized for throughput, right? Maybe Pascal syntax was also parsing-friendly... I don't recall.
One thing for sure is that building small projects felt near-instant, if not real-time, to the point that 14-year-old me was completely unaware of what "Compile" meant until years later.
And despite being single-pass, you could call functions and procedures before defining them (their bodies come later in the code) by using Pascal's forward declaration feature.
Because of this feature, the language could also support mutually recursive functions, which some other languages might not have been able to (not sure).
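
For readers who never used it, here is a minimal sketch (invented names) of how "forward" fits single-pass compilation: the compiler learns IsOdd's signature up front, so it can compile calls to it before the body appears, and mutual recursion falls out naturally.

    program ForwardDemo;

    { Declared 'forward': the single-pass compiler now knows the signature
      of IsOdd and can emit calls to it before seeing its body. }
    function IsOdd(N: Integer): Boolean; forward;

    function IsEven(N: Integer): Boolean;
    begin
      if N = 0 then
        IsEven := True
      else
        IsEven := IsOdd(N - 1);   { calls a routine whose body comes later }
    end;

    function IsOdd(N: Integer): Boolean;
    begin
      if N = 0 then
        IsOdd := False
      else
        IsOdd := IsEven(N - 1);   { mutual recursion works the same way }
    end;

    begin
      WriteLn(IsEven(10));        { prints TRUE }
    end.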