I joined Apple in 1987, about the time they started ditching Pascal in favor of C. C++ (in the form of CFront) was just starting to be a thing.
Apple's Pascal had been extended to the point where there were few true differences between it and C, other than
- strings with busted semantics (size being part of the type being a huge mistake, leading to a proliferation of types like Str255, Str32, Str31, Str64, etc). I should add that C's strings were semantically busted, too, and in more dangerous ways. No way to win :-)
- nested procedures (not terribly useful in practice, IMHO)
- an object syntax, used for Object Pascal and MacApp (a complete, though large and somewhat slow app framework).
- some miscellany, like enums and modules
Apple extended Pascal pretty extensively, adding pointer arithmetic, address-of, variant function calls, and a bunch of things I've forgotten. I could write some Pascal, then write some C, and squint and they'd look pretty much the same. Most people shrugged and wrote new code in C if they were able, and then moved to C++ when CFront became usable.
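For flavor, here is a minimal sketch of that kind of extension, written in modern Free Pascal rather than Apple's actual dialect (whose exact syntax differed): an address-of operator and pointer arithmetic on a typed pointer.
program PointerExtensions;
{$MODE DELPHI}
type
  PInt = ^Integer;
var
  buf: array[0..1] of Integer;
  p: PInt;
begin
  p := @buf[0];   { address-of an existing variable }
  p^ := 10;
  Inc(p);         { pointer arithmetic: advance by one Integer }
  p^ := 20;
  WriteLn(buf[0], ' ', buf[1]);   { prints 10 20 }
end.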
I think I misliked Pascal's nested procedures because they weren't true lambdas; once the outer procedure returned, the inner procs were no longer callable (just one contiguous stack, right?). Early on, using C++ lambdas, I found myself making the same mistake in a design for some asynchronous completion stuff. Embarrassing; C++ and Pascal are not JavaScript/Lisp/Scheme/Smalltalk :-)
You can assign a C++ lambda into an object for later use; you just have to be careful about capture by reference and about capturing pointer types... But capture using the copy constructor is handy here, e.g. you can copy a shared_ptr and get reference-counted captures...
It's funny how C++ lambdas lead to very C++-specific issues and uniquely C++-ish solutions. The wackiness of the closure capture mechanism is the best example of this I can think of.
I discovered with C and C++ compilers, if an extension was added that was not in the Standard, nobody would use it - not even the people who requested the extension. It wasn't like the Pascal community, where nobody would use the compiler unless it had a boatload of extensions :-)
I think you're onto something (about C extensions). There was a time, before clang became the default on OS X, when Apple deprecated nested functions (and perhaps other extensions) in their version of gcc, and afaik no one complained. So I may well be one of the very few. Oh well.
I've done that, too. It's clumsy and only does half the job (doesn't provide access to locals). At some point one just gets tired of the workarounds :-)
It's probably a bit late for this reply, but... Another clumsy workaround is to use operator(). Again, standard C++ and this one has access to local variables:
int g()
{
    int x, y;
    struct local {
        int & a_;
        int & b_;
        local(int & a, int & b) : a_(a), b_(b) {}
        int operator()()
        {
            return a_ + b_;
        }
    } nested(x, y);
    x = 99; y = 1;
    return nested();
}
assert(g() == 100);
Nested functions are good if you have no alternative. If you use them as lambdas or as a means of code folding (both of which don't exist in older IDEs and compilers), you will have to keep jumping around in the code in order to follow the logic. That is a very tiring thing to do, especially if you just switched jobs.
A lot of the gotos were used to jump to a common sequence of code followed by a return. With nested functions, the nested function contains the common sequence of code, and you just:
T common() { ...common sequence of code... }
...
return common();
instead of:
goto Lcommon;
...
Lcommon:
... common sequence of code ...
return result;
Got it now, thanks. But, this (the "return common();") could be done even if the function "common" was not nested within the current function, but defined outside of it, right? So what is the benefit of defining "common" as a nested function? Is it because it then has access to, and can use, variables defined in the outer function?
Yes, you can do it that way. But then you'll need a "context pointer" to pass references to the locals it'll need, and the function will need to be located "someplace else", meaning it is not encapsulated.
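Since the thread's subject is Pascal, here is a minimal sketch of that point in standard Pascal rather than C or D: the nested function sees the enclosing routine's locals directly, so no context pointer is needed and the helper stays encapsulated.
function Total: Integer;
var
  a, b: Integer;
  function Common: Integer;   { nested: reads the enclosing locals a and b }
  begin
    Common := a + b;
  end;
begin
  a := 99;
  b := 1;
  Total := Common;   { returns 100 }
end;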
I've done that for years with C and C++. Nested functions are so much nicer and clearer.
Or just have first-class functions and prototype-based inheritance. Way more flexible and powerful than class-based inheritance. See JavaScript 5 as an example, ask Crockford.
Speaking of few differences from C and busted string semantics: my least favorite thing about Pascal is that strings and some types are 1-indexed, but dynamic arrays are 0-indexed.
And an annoying difference between the desktop and mobile compilers for Delphi is that on desktop strings are 1-indexed as they've always been but on mobile strings are 0-indexed:
The Low and High compiler intrinsics make array indexes easy to deal with. You don't need to care about the indexing to iterate over the array. You just go from Low(array) to High(array):
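A small Free Pascal / Delphi-style sketch of that (the array names and values here are made up): the loop bounds come from the array itself, so the indexing base never matters.
program LowHighDemo;
{$MODE DELPHI}
var
  fixed: array[5..9] of Integer;   { deliberately 5-based }
  dyn: array of Integer;           { dynamic arrays are always 0-based }
  i, total: Integer;
begin
  SetLength(dyn, 4);
  for i := Low(fixed) to High(fixed) do   { 5 .. 9 }
    fixed[i] := i;
  for i := Low(dyn) to High(dyn) do       { 0 .. 3 }
    dyn[i] := i;
  total := 0;
  for i := Low(fixed) to High(fixed) do
    total := total + fixed[i];
  WriteLn(total);   { 5+6+7+8+9 = 35 }
end.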
I don't remember if Apple Pascal had those intrinsics or not. But I never saw any code that used them, and I read most of the Pascal code base for MPW and a bunch for the OS.
Yes, this is not a limitation in Free Pascal or Delphi. The ShortString type still exists and is limited to 255 bytes. But the AnsiString, WideString, and UnicodeString types are variable length and can be up to 2 gigabytes:
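A tiny sketch of the difference (Free Pascal / Delphi dialect, names made up): ShortString still carries the old 255-byte limit, while an AnsiString grows on the heap as needed.
program StringTypes;
{$MODE DELPHI}
var
  s: ShortString;   { counted string, at most 255 characters }
  a: AnsiString;    { reference-counted, heap-allocated, up to ~2 GB }
begin
  s := 'still limited to 255 bytes';
  SetLength(a, 100000);   { far beyond the old ShortString limit }
  WriteLn(Length(s), ' ', Length(a));   { 26 100000 }
end.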
> strings with busted semantics (size being part of the type being a huge mistake, leading to a proliferation of types like Str255, Str32, Str31, Str64, etc). I should add that C's strings were semantically busted, too, and in more dangerous ways. No way to win :-)
I think trading some programmer inconvenience for a world with no buffer overruns would have been a good thing, in retrospect!
I'm reminded of a really great interview with Bill Atkinson where he describes (among many other things) how he initially brought Pascal to Apple and the Apple II.
...
My manager at the time said, no, we don't want to do this [Pascal],
people are happy with what they got. I overrode him and went to
Jobs, and Jobs said "Well, I'm not convinced. I think our users are
happy with BASIC and assembly language. But you seem passionate
about it. I'll give you one week to prove me otherwise."
I was on an airplane within two hours down to UC San Diego and I
started porting right away.
...
The other thing that happened then is I had to plug in the disk
routines, and their system was pretty big and that little 13-sector
floppy disk didn't have a lot of capacity. Well, Woz had just come up
with a different way of encoding the data on the disk so that we could
get more data for the same disk size, and we needed the 16-sector disk
routines. And so Woz came down, and I was there... I had never bothered
to get a motel because I slept on the bench when I wasn't working. This
is in the computer science lab at UC San Diego. I was busy, I didn't
have time to go sleep.
But Woz came down, and I got to interact with him and it was really fun
because he was working on installing these 16-sector disk driver
routines, and he'd go 'type type type type type' -- and he didn't type
in assembly language and have it assembled. No, he'd type in 6502
machine code. Hex. -- He'd type in hex, and then, you know, watching
him type and he'd go 'type type type' -- pause -- 'type type type type',
and when he finished I asked him what was the pause? And he said
"forward branch, seven instructions, I had to compute the offset before
I continued". So, he didn't back-patch the offset, he actually looked
at what he was going to be typing, knew how many bytes it would take...
he was brilliant.
This is one of the great quotes in the Walter Isaacson Steve Jobs book:
Next he created for the Apple II a version of Pascal, a high-level programming language. Jobs
had resisted, thinking that BASIC was all the Apple II needed, but he told Atkinson, "Since
you're so passionate about it, I'll give you six days to prove me wrong." He did, and Jobs
respected him ever after.
Not to take away from any of Woz's brilliance, but coding the 6502 in hex was something many of us did every day in the 80's. He's right that branches are a tad tricky, but the instructions you usually use are few enough that you remember the hex codes (and cycle counts) for them, along with the various hardware registers (CIAs, VIC-II and SID mostly, for me) and the bits within those registers.
While I did a little bit of COBOL and Fortran (on digital minicomputers) in high school, UCSD Pascal in grade 12 was the thing that really got me off BASIC on my personal computer (at that time, an Apple ][gs).
IIRC, the line my high school computer science teacher used to justify running UCSD Pascal without paying at the time (around 1988) was that it was out of copyright or something, but now that I think of it, I'm not so sure that was a legit reason.
I wrote a fair amount of Pascal in my 680x0 Mac days, both in MPW (the Macintosh Programmer's Workshop) and THINK Pascal. Back then Modula-2 was available on VAXen and "big" machines, but Pascal was almost "portable" across Mac/PC/VAXen and was amazingly fast, so it was pretty fun.
I eventually moved to C (also using THINK C - see retrospective link below for a sample of those heady times) and never looked back until a couple of weeks ago I set up Lazarus for my kid to play with (there are too many Python GUI development options, and none halfway as good).
Lazarus is _amazing_ (if somewhat odd in today's world), and I really wish we had more IDEs like it instead of all the crappy Electron/web approaches for building desktop apps. It builds fast, tiny, entirely native apps in less than a second, and is an excellent example of how far Pascal went despite falling off the mainstream wagon.
(If anyone knows of anything like it for cross-platform desktop apps, let me know, I'd love to try it out)
>(If anyone knows of anything like it for cross-platform desktop apps, let me know, I'd love to try it out)
Check out Xojo. HN user SyneRyder sometimes comments about it here, and likens it somewhat to Delphi. Cross-platform too. I'm in early stages of trying it out, so won't comment myself on it.
Xojo is a BASIC variant, and IIRC, was called RealBASIC earlier. While I've done some work with BASIC and Visual BASIC in my time, and don't puke out on it (as some people allege they do :), I confess that I prefer the syntax style of the Algol family of languages, such as C, C++, Pascal, D, and Go.
However, the convenience of VB, Delphi, Lazarus and similar drag-and-drop GUI app builder tools is real. Great for quickly prototyping something, and many a time for the final product too.
I tracked RealBasic for a few years before the rebranding. It's nice, and if I were developing desktop apps professionally I'd probably buy it - but for my kids to try out, it's a little expensive :)
I get you, but the lowest version of Xojo, without database support (or maybe even that includes just SQLite support), is free, IIRC, for local use on a PC. And anyway one can do plenty of apps without any DB.
>It builds fast, tiny, entirely native apps in less than a second, and is an excellent example of how far Pascal went despite falling off the mainstream wagon.
D meets some of your criteria too, at least somewhat: small, fast, native apps. Build speed is decent (or better) too, and it is designed for that - it doesn't have C's repeated overhead of pulling in include files. [1] Haven't tried the GUI toolkits for D yet, though. There are some, like DLangUI, GtkD, etc. Not sure if there are any GUI builders like Delphi, Lazarus, VB, Xojo, etc. have. Also, I think I read that D's language design allows D compilers to be the single-pass kind.
[1] I think this is the video I had blogged about a while ago, in which Bjarne Stroustrup talks about the details of that include files issue:
>doesn't have C's repeated overhead of pulling in include files.
Sorry, typo - I meant to say "C++'s repeated ..."
C may have it too, don't remember if so or not right now.
The video mentioned in my above comment has some details on why this is such an issue in C++ - from the source, i.e. Stroustrup, the creator of C++. There's a bit of humor around that fragment of the talk - it is somewhere near the end, IIRC.
I don't know anything like Lazarus, but I do know a language that shares many of the characteristics of Pascal (small, fast, native binaries). That language is Nim and I think there is a good chance we could get something like Lazarus out of it.
I really think Sun and MS made a big mistake with their VM approaches instead of following what was common in the 90s.
So we had to wait 20 years, through the failure of Moore's law, cache-optimization issues, and competition from new languages, for them to come up with .NET Native, CoreRT, and the initial AOT support in Java 9.
> Calling VM's a mistake is surprising considering that today, a crushing majority of code (JVM, .net and arguably, LLVM) runs on a VM.
LLVM isn't a VM.
.NET achieved its position because on Windows many of the APIs became .NET-only or C++ COM. Not many people bothered to use alternatives to Microsoft tooling.
JVM achieved its position due to being free, and the huge amount of money and engineering resources that Sun poured into it, which in a way ended up killing the company.
Yet due to the pressure from Fintech customers, Oracle started to look into AOT compilation, with initial support for Linux x64 available in Java 9; other platforms will be supported later. Most commercial JDKs have had support for AOT since the early days.
> In comparison, .net native and Core RT are tiny dots on the radar.
Yes, but that is where Microsoft wants to go. There is no pure .NET on UWP, other than via the desktop bridge, the transition technology to port Win32 into UWP.
> - Pascal: UCSD Pascal was a VM (p-code)
Quite correct and it ran quite slowly, purely interpreted.
> A few anti-piracy schemes on the Apple ][ used their own VM to obfuscate their code
> - ScummVM was extremely popular to create games
The games from LucasArts were hardly something where performance mattered.
> VM's were a thing in the 90s and they are even more of a thing today.
Given Apple's decisions on their programming languages, Go, D, Rust, Pony, Haskell, OCaml, the change of direction on Dart's design, I am not quite sure.
Your point, which I was replying to, was that VM's are a mistake, not that they were/are slow, or how some VM's became dominant, or guesses about where Microsoft or Oracle want to head.
My response is that 1) VM's are dominant today and 2) for very good reasons.
And because of that, they are not a mistake.
I also speculate they are going to remain dominant for a while because of all their advantages over native approaches, advantages which become even more prominent as hardware keeps getting faster, as it always has.
> Calling VM's a mistake is surprising considering that today, a crushing majority of code (JVM, .net and arguably, LLVM) runs on a VM.
That was more of a self-fulfilling prophecy than a need. There simply was a shortage of reusable compiler backends in the 90s, so when major companies (Sun, IBM, and Microsoft in particular) decided to devote all their resources to the JVM and the CLR, then those became easy targets for language implementors (and low-hanging fruit, too).
Regardless of what you think of them, their success is probably mostly the result of going where the most resources were being spent.
Lazarus looks interesting, but what holds me back is that the language doesn't seem to have evolved. I enjoy functional languages much more these days, and I prefer not to go back to an old language. I would be happy to be proven wrong on this
The language has evolved, but it still falls squarely in the imperative, OO camp.
And besides, you mentioned that you'd prefer to stick with more functional languages. It's hard to prove someone's preference wrong. :)
I mostly share your preference, but I make an exception for Pascal. There's just something about it that draws me in, even though I've only used it for fun, and not professionally.
Just curious: do you use an IDE, and if so, which one?
The thing that drew me in to Object Pascal was getting Delphi back in 1996 (I think it cost $200), and being just amazed at how much I could do out-of-the-box without knowing a thing about Object Pascal. The productivity with IDEs like Delphi/Lazarus is just astounding.
I look at setup instructions for many languages today and just cannot believe what I'm reading. Pages and pages of instructions on putting together a dev environment just to compile a Hello World program. It's just insane to me...
I've used Lazarus when I've had small GUI apps I wanted to make. Other times, I'll just use Turbo Pascal in DOSBox when I'm just trying to have some fun and make simple little games.
Yes, Pascal is a fun language to work in. I've always enjoyed working in it, whether in the form of Turbo Pascal, Delphi or Lazarus (did some commercial work in TP and a bit in Delphi; Lazarus so far just for fun). Probably many features (or non-features [1]) of the language contribute to it. This is just one point: the use of sets, built into the language, is nice and reads naturally for expressing some kinds of relationships and tests (foo in bar, where bar is a set) - e.g. char in vowels, etc.
[1] "Perfection is achieved, not when there is nothing more to add, but when there is nothing left to take away."
- Antoine de Saint-Exupery
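A tiny sketch of the built-in set feature mentioned above (Turbo Pascal / Free Pascal style, names made up):
program SetDemo;
const
  Vowels = ['a', 'e', 'i', 'o', 'u'];   { a set constant of Char }
var
  ch: Char;
begin
  for ch := 'a' to 'f' do
    if ch in Vowels then
      WriteLn(ch, ' is a vowel');
end.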
I really miss Pascal; it was a great and safe language for beginners. As it was extended with Objects and Modules, it was great for development.
But there are good reasons it was surpassed by C. In early Pascal, you got a pointer by allocating memory; you could not get a pointer to an existing variable. You'd be surprised how often that gets in the way when implementing a data structure. Just try to implement the following function in C without using the address-of operator:
struct list { struct list *next; /* ... payload ... */ };   /* assumed node shape */

struct list *head = (void *) 0;

void push_back (struct list *entry) {
    struct list **p = &head;
    while (*p != 0) p = &(*p)->next;
    *p = entry;
}
Pascal got better. But once you've switched to C, the sheer verbosity of Pascal is bothersome. Instead of "{" and "}" Pascal uses "begin" and "end". It uses "procedure" or "function" to introduce a function.
There's no going back, but I wish it was still available for learners. Java is comparable in terms of programmer safety, but has too much ridiculous boilerplate just to write "hello, world".
> Just try to implement the following function in C without using the address-of operator:
Pascal had `var` arguments, i.e. call-by-reference, which can be used for similar purposes. Call-by-reference isn't a complete replacement for C pointers, of course, but it also avoided the whole host of memory safety issues arising from C's more general model and which plague C to this day.
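A minimal sketch of what a var parameter looks like (standard Pascal, names made up): the procedure modifies the caller's variables with no explicit address-of or pointer in sight.
program VarParamDemo;
var
  x, y: Integer;
procedure Swap(var a, b: Integer);   { a and b alias the caller's variables }
var
  t: Integer;
begin
  t := a;
  a := b;
  b := t;
end;
begin
  x := 1;
  y := 2;
  Swap(x, y);
  WriteLn(x, ' ', y);   { prints 2 1 }
end.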
> But once you've switched to C, the sheer verbosity of Pascal is bothersome. Instead of "{" and "}" Pascal uses "begin" and "end". It uses "procedure" or "function" to introduce a function.
That's entirely subjective, I think. As somebody who grew up with Pascal and learned C later, I found the C syntax to be comparatively hard to read. I do favor the Modula-2/Eiffel/Ada approach of having "end" terminators instead of "begin" / "end" blocks (which share the troubles that braces have), though.
Lazarus IDE and Free Pascal Compiler are available if you really want to play with Pascal. It's super easy and fun to do simple GUI programming in it especially.
I don't understand the extreme focus on brevity, though it does seem to be mentioned regularly. In practice I don't notice a difference between "begin" and "{". It's not something I have to think about, and my typing speed exceeds the speed at which I'm able to determine what to type next.
I understand avoiding excessive amounts of boilerplate where standard patterns are constantly repeated for no particular reason, but that's not comparable to individual keywords.
Turbo Pascal was taught at most UK colleges in the 80's. I think most electronic engineers and CS guys who went to school around then were exposed to it, and you would be surprised how many PC/DOS programs in the 80's were written in it.
Turbo Pascal seemed like the perfect language for writing DOS games in the early 90's. Fast, and you didn't need to know low-level things, but you could always patch in assembly routines if you needed to.
Intro to computer science freshman year of high school in 2004 was still using Turbo Pascal, and it was great. Dropping into BGI graphics mode was a great beginner-friendly way to draw shapes and make some cool things. It's what got me into game development in earnest. I wish I could get this same setup to teach people with, but BGI doesn't work in later versions of Windows.
I think BGI graphics will still work in DOSBox. I remember having a setup a couple of years ago where I shared a directory between Windows and DOSBox. I'd edit the code in Windows using Sublime Text, and then pop in to DOSBox to build/run it.
So although it doesn't quite make BGI graphics based apps run directly on Windows...it's still quick and easy enough to have some fun. I did it specifically to give myself a way to make programming fun again at a time when I was feeling a bit burned out from working on massive, overly complex C# and JavaScript code all day at the office.
Turbo Pascal is, or at least until some time ago was, also available from the Borland/Codegear/Embarcadero (hat tip to often useless or destructive M&A) Museum (along with other historical software from them, like Turbo C and so on). I had downloaded TP and TC from the Museum and tried them out a bit again some time ago, for old times' sake. Worked okay, IIRC, but I may have only used them for CLI work, not for BGI graphics. Worth a try though, for BGI, even on Windows, without installing DOSBox.
I wrote a ton of simple graphics programs in Turbo Pascal for fun, many of them related to math equations, generating curves from formulae, etc. Good fun. Still remember functions like HiRes; GotoXy(x, y) and the like :) It was a very productive environment and enabled very high speed of development.
Pascal is a smaller language, and so easier to come to grips with at first glance. The syntax is tailored towards some verbosity which usually appeals to beginners. However, the downside is that you don't have a lot of leverage to do things in Pascal: Python pushes you down a path where you are writing something productive very quickly and can reach into the standard libraries to do many tasks - it makes some things very easy. Pascal requires some time to prepare the solution and encode it in syntax, and it's harder to find what you need, but there is usually code somewhere online that you can adapt. This is a lot to ask of beginners who want to do practical work, though. As of right now Pascal retains a lot of strength in desktop GUI code.
In terms of safety/dangerous code, modern Object Pascal style lets you be as dangerous as C if you want, but the default semantics are much more comfortable, and take you away from the danger zone more often.
Both Python and Pascal are relatively easy to get up and running with, and have pretty solid, standardized toolchains for industry use: in contrast C and C++ leave the build process relatively undefined and varying between compilers and platforms, which has resulted in a huge amount of friction to get any project building on a new machine.
Although I use Python extensively and started to forget about Pascal (ex Delphi guy here), Pascal is a much better beginner language IMO.
Start with Python and you're most likely to be stuck with a dynamic-typing/runtime mindset, and it'll distance you too far from OS-level native land.
OTOH starting with Pascal, you'll learn a great deal of low level (well, relatively) stuff and that will make you appreciate the higher level languages when the time comes and will let you leverage them more efficiently. Also as a bonus, a big bonus I think, Pascal will let you feel at home if you ever need C/C++/D in your career.
Overall, not well. It's statically typed, which can be helpful. And as mentioned, Pascal was used widely in teaching, so one would say well. However! So was Python, in its ABC infancy.
Note that there are multiple Pascals, especially in the late 70s and 80s. Pure, true Pascal was annoying and restrictive [1].
No commercially successful Pascal was pure. The classic was Turbo Pascal, which was disdained by purists but enormously successful. Its inventor went on to work on Delphi (a sort of Pascal which is still popular), C#, and TypeScript.
IMO it's the best language for absolute beginners, if we are talking about classic Pascal without Borland extensions. It's a very small language which can be taught in a few lessons, yet it's powerful enough to learn classic data structures and algorithms.
The main drawback is that Pascal is not commercially successful, so you'll learn it and move on. Python, on the other hand, is widely used, so Python knowledge is useful on its own.
Start from Pascal if you want to learn a lot of languages. I specifically would suggest to dive deep after Pascal and learn some assembly language, maybe x86. Then you'll have good basics and you can learn almost any language you want; C would be good, for example.
The main problem with Python IMO is that it's too high-level, it's dynamic, and it uses GC. All those things are too far away from real machine code, so if you learn Python first, it might be hard to learn low-level programming later - and that knowledge is useful even if you're using JavaScript.
>Pascal got better. But once you've switched to C, the sheer verbosity of Pascal is bothersome. Instead of "{" and "}" Pascal uses "begin" and "end". It uses "procedure" or "function" to introduce a function.
I love both C and Pascal, and I'm a guy who has done a lot of both of them. Hey, it's possible to like more than one language. With that background, I think your above points are minor (IMO of course) and should not be an issue in deciding between the two. One can easily get used to "begin" and "end" instead of braces, or vice versa. Same for the keywords "procedure" and "function". Plus, editor shortcuts in modern editors should be able to handle that, or a keyboard enhancement tool like AutoHotkey, vim's abbr command, or equivalents in other tools.
Better still, use both languages, at different times, as per needs, wants, convenience, etc.
It may seem quaint now, but Apple Pascal was a serious tool. I took AP Computer Science in 1985 and the language taught was UCSD Pascal on the Apple ][+. (In the 80's, C on an Apple ][ was impossible. The only C compiler you could get was for a card that went in the expansion slot that included a Z80 processor.)
When I went to college in 1986, Pascal was the primary language used in all entry-level courses at Virginia Tech. (Turbo Pascal on an IBM PC -- $5 at the student stores, if you brought your own floppy. I'm the weirdo who brought a Mac Plus to school and used Lightspeed/Think Pascal.)
All of the classic Mac APIs used Pascal calling conventions. Pascal continued to be the language used for serious Mac development for a long time.
I can't find any references via Google, but Apple had an internal language called "Clascal", which I was told was "Pascal with classes". Eventually Think Pascal adopted this object-oriented Pascal syntax.
Just today I was thinking about how great it was coding in Lightspeed Pascal, when I was trying to get VS Code to display ligatures. Lightspeed Pascal parsed the AST and auto-formatted all your code for you. Tabs became tab stops, like a word processor. I still miss that; hard to believe today we're still fighting about tabs v. spaces.
A major issue was that C compilers were actually fairly ill-suited for 8-bit processors with limited memory, limited processing power, and at most a floppy disk (if that) for external storage.
Pascal was designed to allow single pass compilation of programs, while the C preprocessor alone was a source of what was immense overhead at that time. On CP/M, Turbo Pascal blew all C compilers out of the water in terms of compilation speed. The one C compiler I had on a ZX Spectrum at the time did not actually implement the full language so that you'd have some memory left over (I remember that the C type cast syntax was replaced with a keyword in order to simplify the parser, for example).
That said, UCSD-Pascal on an Apple II (which we got to use in school as part of our regular computer science course), was not all that convincing as an implementation (even if the system as a whole was fairly impressive). Because it used a bytecode interpreter (even if that bytecode interpreter leveraged type information and was much faster than a bytecode interpreter for a dynamically typed language), execution speed – including that of the development environment, which was written in UCSD-Pascal itself – was still pretty sluggish in comparison to an actually compiled language. (The UCSD p-code in principle also allowed compilation to native code, but at least the system I had access to – that was in the early 80s – didn't support that.) But it was still more pleasant than many other compilers, and that was at a time when cassette recorders were often still the primary external storage medium, so people were more tolerant, when their point of comparison was systems that literally included source code from cassette tape at compile time, because for large programs you couldn't fit both the source code and the object code into memory at the same time.
C was also designed to compile in one pass, hence the need for forward references. The preprocessor was extra overhead, though.
Also, from my recollection, Turbo Pascal was not just faster than C compilers. It was also faster than other Pascal compilers, and pretty much any other compiler regardless of language. That was part of its appeal.
My first job out of college was porting the interpreter from Macintosh Pascal (the predecessor to Lightspeed/THINK Pascal, sold by Apple) from the Macintosh back to the Apple II (as "Apple II Instant Pascal", also sold by Apple). Boy, was that fun. Had to tuck the interpreter into a 4K "fold" of the Apple II memory map -- a block of address space that was normally mapped to ROM could be switched to point to the corresponding RAM in the language card that would otherwise be inaccessible.
[Edit: to be clear, neither of these were the UCSD Pascal referenced in the original article; I had that on my Apple II in college, and demoed a baseball statistics program I wrote in it to the THINK people as part of my job interview.]
They also had a Pascal. Granted, it was targeted at the IIGS rather than the straight II line, but given my personal experiences running Apple Pascal on a machine with only a single floppy drive, it was just as well. In a way Jobs was right: BASIC plus assembly were a better target for the 8-bit Apple II than p-code and the long development turnaround time for Pascal. Worse, the resulting applications were slow - not just in running, but the extra delay of loading the Pascal system from a 5.25" disk rarely made for a good user experience.
That said, to this day, I compare all new languages I learn to object pascal/delphi, and frankly I have yet to find one that seems as polished, productive, well documented, resistant to bugs, etc.
> When I went to college in 1986, Pascal was the primary language used in all entry-level courses at Virginia Tech.
When did Virginia Tech switch from Pascal to C++? I did my undergrad there in CS between 2002 and 2006 and the introduction courses were in C++ by then (though I believe they now have switched to Java and Python).
The only exposure I had to Pascal was implementing a compiler for an undergraduate and graduate level course for a subset of the language's features.
In order to run Apple Pascal on my Apple ][+, I had to buy a "language card". This was bigger than an index card (maybe 3 by 6 inches) and added sixteen whole kilobytes to your computer's RAM, beefing it up to a massive 64K and rendering it capable of running such a system hog as Apple Pascal. I think it was about a hundred bucks in the early 1980s.
Meanwhile, the Apple ][+ could only display 40 columns on screen, where of course by "screen" I mean "television". (You could buy another big card to give you enough memory to display 80 columns at a time, but who had the cash to make another huge purchase like that?). Of course, 40 columns isn't enough to write in a structured programming language with indentation like Pascal, and in fact the Pascal program itself supported logical lines of up to 80 characters.
This issue was resolved as brilliantly as you might expect. You could toggle between looking at the left half of your program (cut off at the 40-character mark) or the right half. I'm not kidding.
I was lucky enough to have a second hand Apple ][ from a "hacker", it had a ][+ rom and was pretty much loaded to the gills with a "z80 softcard" and etc. The slot 3 80-column card also had 16K RAM and was the only full-sized card in the machine. (It also came with full documentation if you wanted to burn a different font into a ROM.) So I had a 64K machine, but slot 0 was empty. Unfortunately I never had docs for Pascal, so I never bothered learning it and instead was mentally mutilated by BASIC.
A full-sized Apple card was a good bit less tall than a PC AT card and had to be angled to fit the case. e.g. [0]
I would think the 10MB Sider HD was a bigger deal. A friend of mine used one to run a BBS in Toronto back in the day. Before that, he had a custom setup that had a pair of 8" floppies, which was pretty uncommon for the Apple ][ platform.
I've been reading a lot about Niklaus Wirth recently. I read an interesting piece about Oberon I found in an HN archive[0] that mentions Oberon usage on Macs. I'm very tempted to buy "The School of Niklaus Wirth: The Art of Simplicity" after reading a few things about him. I wish there were more instances of "computing in a vacuum" like at ETH.
I studied under Wirth at ETH Zurich in the 1980s, so I got to interact with him personally a number of times. His class on compiler construction was particularly interesting. You could see how a philosophy of simplicity and clarity in syntax and semantics would translate to simple and clear compilers.
The downside of this philosophy was that aspects of interacting with a computer that were inherently messy tended to be swept under the rug: The original "file" and "string" concepts as described in early Pascal reports were simply unworkable, and combined with a marked disinterest in language standardization efforts (which I personally suspect stems from Wirth's involvement in Algol 68, although I have no direct evidence) led to each implementation rigging up their own ad hoc solutions.
As a consequence, even TeX, a batch program with no fancy demands on hardware, was not written in strictly "standardized" Pascal, but basically Knuth had to assume the existence of a number of non-standard mechanisms.
As you indicate, the language described in Jensen and Wirth's 1975 "Pascal User Manual" didn't have a way to open files with a name computed at runtime, nor was there a way to close a file. TeX needed to be able to do these things, as well as to have what C programmers would call a "default" case on a switch statement, which was also missing from standard Pascal. Happily, almost all Pascal compilers had these extensions; unhappily, the syntax was not generally agreed upon.
Other than that, TeX actually used a strict subset of standard Pascal; quoting module #3 in Volume B of Knuth's Computers & Typesetting: "Indeed, a conscious effort has been made here to avoid using several idiosyncratic features of standard PASCAL itself, so that most of the code can be translated mechanically into other high-level languages. For example, the `with' and `new' features are not used, nor are pointer types, set types, or enumerated scalar types; there are no `var' parameters, except in the case of files; there are no tag fields on variant records; there are no assignments real:=integer; no procedures are declared local to other procedures."
As you also say, Pascal had virtually useless string types, but TeX worked around that limitation itself, by managing its own string pool in a single big array of characters, with a second array of pointers to where each string started. A string was identified by its index in the latter array. Through careful design and coding, no general garbage collection of strings was necessary, just the ability to forget the most recently made string.
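For illustration, here is a toy version of that scheme in Pascal (this is not Knuth's actual code; the names and sizes are invented): one big character array plus an array of start indices; a string is identified by its index, and forgetting the newest string is just a matter of rolling the two cursors back.
program StringPool;
const
  PoolSize = 1000;
  MaxStrings = 100;
var
  strPool: array[0..PoolSize - 1] of Char;     { all characters, back to back }
  strStart: array[0..MaxStrings] of Integer;   { string s occupies strStart[s] .. strStart[s+1]-1 }
  poolPtr, strPtr: Integer;
function MakeString(const s: string): Integer;   { returns the new string's number }
var
  i: Integer;
begin
  for i := 1 to Length(s) do
  begin
    strPool[poolPtr] := s[i];
    Inc(poolPtr);
  end;
  Inc(strPtr);
  strStart[strPtr] := poolPtr;
  MakeString := strPtr - 1;
end;
procedure FlushString;   { forget the most recently made string }
begin
  Dec(strPtr);
  poolPtr := strStart[strPtr];
end;
begin
  poolPtr := 0;
  strPtr := 0;
  strStart[0] := 0;
  WriteLn(MakeString('hello'));   { 0 }
  WriteLn(MakeString('world'));   { 1 }
  FlushString;                    { 'world' is forgotten; its pool space is reused }
end.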
I've got that book, it is interesting to read. I also spent quite a bit of time looking at the Oberon system and that has been very educational. Oberon is quite like Pascal, but in some ways simplified even more. He wrote a complete operating system and user interface in Oberon and it is just amazing how much he achieved with very few lines of code.
I'm currently attempting to write a C compiler and the complexity is incredible compared to Oberon or Pascal. Sometimes I regret choosing C over Pascal as I'd probably be done now with the compiler, as it is I've only got the parser and preprocessor implemented.
Eiffel is another language that didn't gain a lot of popularity but is very interesting to read about. Bertrand Meyer, the creator of Eiffel is now at ETH Zurich, where Wirth spent most of his career. It's a very nice language that didn't achieve a lot of popularity, but some of the ideas, such as preconditions and postconditions, have been used in other systems.
I've written an Object Pascal (the current incarnation that Delphi/FreePascal uses) compiler/transpiler for Object Pascal->JavaScript and yes, it's great language for compiler writers because the language makes the job of the compiler so easy. You have to really go out of your way to make the compilation slow, and there aren't a lot of undefined behaviors or other gotchas that are hard to navigate. Also, I wrote it in Delphi. :-)
The page isn't rendering properly with ad blocking. The original memo is being served from Storify. Where is it from? The Internet Archive. Here's the original, which reads better directly from the Archive.[1]
I've started with BASIC, some assembly (CALL -151) on my Apple ][ clone (Pravetz 8C), but as soon as I got my hands on an IBM PC/XT (or AT), Turbo Pascal (the 30-40kb turbo.com) was just the right choice. It fit on one disk with plenty of room to spare, while a Microsoft C/C++ compiler and linker each took a whole separate disk.
The best thing I loved were the .TPU files (though I'm not sure whether it was TP4 or TP5 that truly had them). There were no .h files to include or .lib (.a) files to add; it just worked, magically well (with some limitations).
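A minimal sketch of the unit mechanism behind those .TPU files (two files shown together; names made up): the interface section plays the role of a header, but it is compiled once into the unit file instead of being re-parsed by every client.
{ MathUtil.pas }
unit MathUtil;
interface
function Square(x: Integer): Integer;
implementation
function Square(x: Integer): Integer;
begin
  Square := x * x;
end;
end.
{ UseIt.pas }
program UseIt;
uses MathUtil;   { pulls in the compiled unit; no textual include, no .h }
begin
  WriteLn(Square(7));
end.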
I moved to C/C++ later simply because of, well, a stupid reason. I was writing a "File Manager"-like app for DOS (just a single column, not like Norton Commander, FAR or Midnight Commander), and the only function in Turbo Pascal 5.0 to move files could only rename a file within the same folder... Had I known about inline assembly and been braver, I would've stayed in Pascal Land (and I was already familiar with Ralf Brown's Interrupt List)... But hey, this stupid reason moved me to C/C++, as the built-in function there did it... then again, soon after that I started using more and more inline assembly.
I love C/C++ now (especially C), and where I used to be really good at Pascal, I might have some hurdles reading Pascal code today. Delphi was my last stop, and while I like it, I switched to video game development, and Pascal was not much used there (... Age of Wonders, I believe, was written in some form of Pascal, and possibly some other games were too... Also, part of Xoreax's IncrediBuild might've been, especially the part that does the C/C++ header processing; I think it was, since we had issues with it, and while debugging I found something Pascal-ish in there, but I don't remember now).
My first programming language was (obviously) BASIC, but my second was Pascal. I took AP computer programming in high school and it was taught with Pascal on Apple II and IIe computers. My dad later bought me Turbo Pascal for the PC (I remember a yellow box with "+ Objects" on it, so it must have been 5.5 Pro in 1989), and I used it on his machine, but never did much with it other than tinker. I finally got what I viewed as a real programming setup when I got DICE (Dillon's Integrated C Environment) for my Amiga a couple years later. Still didn't do much more than tinker, though, until I got a Linux box a couple years after that and source code for everything was available for poking at.
Anyway, Pascal was very common in education back then and Apple was very common in education...ergo, Apple and Pascal went together a lot of the time.
For those of you still interested in this story, the author of the original memo David Craig commented with clarifications, corrections, and further information. It's a truly epic comment.
Pascal was one of my favorite languages (in the form of Object Pascal). I made a living from it for decades. But Borland and a long succession of companies killed it through mismanagement. It does still exist in the form of Free Pascal and Lazarus, but I've moved on to C#.
Pascal was great for people who didn't have curly braces in their muscle memory. But I suppose I've gone fully over to the dark side now.
Does the p-code compiler self-host? I've been working on an emulator for an imaginary 8-bit machine (AVR instruction set), and have been looking for language options to run on it.
In the last few days, I've gotten FreePascal compiling for it, but I would also like to have languages that I can compile or interpret on the machine itself.
I learned to program with Turbo Pascal on my PC back in the early 90s. Language lite is fun.
And yet, when I clicked on this part of me was really just hoping this was referring to nvidia's Pascal architecture, a hint that maybe they were finally dropping the Radeon line and getting some decent video cards into their machines.
The UCSD Pascal system, which ran on Apple II and many other computers in the late 70s - early 80s, always displayed a menu (for whatever program was running) across the top line of the screen. I wonder if that was the inspiration for the Macintosh menu bar.
When opened w/o Javascript you see only the first paragraph and the timeline at the bottom. I almost skipped over this because I thought there wasn't anything interesting there.
It doesn't matter what year it is, JavaScript provides zero utility on a blog except to give the author analytics.
Why do people hate the idea of documents so much? Imagine if you had to suffer through a different app for every single book you read; that is what a js-mandatory page is.
It looks like in this case it's not so much a blog post as "I had this thread of discoveries on Twitter, how about I try out Storify to just bung it on my blog"[0], which means using Twitter's JavaScript embedding. Obviously even from a narrative perspective a properly-written blog post would be preferable, but this STILL manages to be way better than linking to the twitter.com feed, because Twitter's JS UI is completely unusable for reading threads...
As far as I am aware hacking the browser via malicious Javascript is still a thing. I don't mind selectively enabling it for a particular website, but I absolutely hate how every site I go to serves up Javascript from 10 different URLs.
http://www.tomshardware.com/news/pwn2own-2017-microsoft-edge...
The point of the post was supposed to be: Gee-- it's interesting that the site doesn't look visibly "broken" when viewed w/o Javascript, and that might cause people to just skip it.
It doesn't bother me to turn on Javascript when a site needs it. (If I complained about every site posted to HN that needlessly required Javascript I'd never get anything done...)
I'll turn it on by default as soon as websites stop abusing my CPU, battery, and bandwidth with their gluttonous scripts. On my machine, with sites I visit, the browsing experience is perceptibly better (and the machine stays much cooler on my lap) w/ Javascript selectively enabled.
I know 10 years ago turning off Javascript was still a real concern. I tend to be of the mindset that in 2017 a browser without Javascript is like a browser without a mouse/trackpad - technically viable, but no one does it, so depending on Javascript is a given. Know of any good statistics on this however?
I am part of the 1.1% mentioned somewhere in this thread.
If some page really and truly looks worth it, I do a click or two in uMatrix and enable the damn thing - for the originating domain, and then for external sources if it still doesn't work and really, really, really looks worth it. If it's clear that I shall be coming back, I make the exceptions permanent. But boy do I keep a mental note that the page is sub-standard.
Approach worked well in 1996 (though no uMatrix), and works well in 2017.
Please, everybody, if you haven't read it, stop now, get a copy of "The Humane Interface", spend the weekend reading it, and then come in Monday and apply it.