Some time ago, I attended the memorial service for a skilled painter (not exactly a household name, though), and one of the stories told about him was that he visited the municipal museum, where a newly acquired abstract expressionist painting (I believe by Mark Rothko), consisting of nothing but painted rectangles, had just gone on exhibit.

He studied the painting for some time, and then asked to see the director of the museum, to inform him that the painting was hung upside down! When asked why he would think that, he pointed out that wet paint does not flow upward…

So it is indeed possible for a connoisseur to distinguish interesting details in a painting like this.


Excellent anecdote! Thanks for sharing.

Isn't this more evidence that it's arbitrary to decide something is "bad art" vs "good modern art" (of the pop/avant garde variety)?


I know literally zero working programmers who learned programming the way Dijkstra thought it should be taught — not even Dijkstra himself, as Donald Knuth once gently pointed out.

Practically everybody in my generation started off with BASIC. On the other hand, at some point (when?), this practice stopped, and the newer generations turned out fine starting out with more civilized languages.


To be fair to Dijkstra, he was writing about how he believed university students should be taught. Two years before that cruelty paper was published, I was getting my first exposure ever to computer programming when my parents bought a Commodore 64 that came with a BASIC manual that showed how to make a Pong clone. I was 6 years old.

There's maybe an analogy to riding a bike. If you're aspiring to compete in a grand tour, you probably want power meters, lactate threshold and VO2 max tests in a lab, training that is principled and somewhat scientific in the way it builds toward a goal. If you're 6, your parents just put you on the seat and push you until your balance gets good enough that you can take the training wheels off.


> If you're 6, your parents just put you on the seat and push you until your balance gets good enough that you can take the training wheels off.

Which can take a very long time, because you have training wheels to begin with. If you're about to teach a kid how to ride a bicycle, it's far better to do without them. And the pedals too; learning how to use those is a huge distraction from learning to find and hold your balance.

There are special “kick bikes” for tiny tots to propel by kicking off the ground. And some of those can later be converted into “normal” bikes by attaching a chain drive, but... Feels kludgy, and is usually rather expensive.

If you can find an ordinary bike where you can get the saddle low enough for your kid to reach the ground with their feet, just remove the pedals and let them use that as a “kick bike”. If you can find a (gentle!) slope to practice on, you'll be able to replace the pedals in a matter of a few weeks at most, or probably days.


Yes, Dijkstra was writing about the education of university students. I don't know whether he ever wrote anything about elementary school (or earlier) computer education, but I doubt he'd have approved of it, let alone of a hands-on approach.

Total aside: Training wheels are a thing I remember from my youth but today (at least here) they are barely used at all anymore.

I'm still used to the phrase (taking the training wheels off) but I'm fairly certain my kids will grow up not using it.


The sort of pushbike made for littler kids lets them learn balance and steering before they also have to learn how to pedal and brake.

So half the learning happens on those pushbikes before they move to real bikes.


Consider me naive, but how did Dijkstra think it should be taught? Asking as someone who first learned to code in QBASIC.

Other commenters are completely right to mention his concern for proofs and the "Cruelty of Really Teaching Computer Science", but the most BASIC-specific thing that he was associated with was criticism of the GOTO statement.

https://homepages.cwi.nl/~storm/teaching/reader/Dijkstra68.p...

In original BASIC, the GOTO is a foundational mechanism and a majority of programs would have used it, sometimes extensively. Dijkstra thought for many reasons that this wasn't good style and didn't promote clear thinking. And yes, one consequence of that is that it would be harder to prove programs correct or just to reason about whether they were correct.

Programs that overuse GOTOs (or from the point of view of later structured programming and functional programming advocates, perhaps programs that use GOTOs at all) were stigmatized as "spaghetti code".

https://en.wikipedia.org/wiki/Spaghetti_code
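
To make the contrast concrete, here's a minimal sketch of my own (in C rather than BASIC, names purely illustrative): the same count-to-ten loop written goto-style, roughly the way a line-numbered BASIC program jumps around, and then with structured control flow.

    #include <stdio.h>

    /* goto style: roughly how a line-numbered BASIC program branches */
    void count_goto(void) {
        int i = 1;
    loop:
        if (i > 10) goto done;
        printf("%d\n", i);
        i = i + 1;
        goto loop;
    done:
        return;
    }

    /* structured style: the flow of control is visible in the block structure */
    void count_structured(void) {
        for (int i = 1; i <= 10; i++)
            printf("%d\n", i);
    }

    int main(void) {
        count_goto();
        count_structured();
        return 0;
    }

At ten lines the goto version is still readable; Dijkstra's point was that the style doesn't scale, because the reader has to reconstruct the flow of control instead of reading it off the structure.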

By the way, this concern is not just about aesthetics: some of the ideas that Dijkstra was advocating are arguably those that newer programming languages like Haskell and Rust can use to find bugs in code automatically at compile-time, or to make it harder to write certain bugs at all. The line between Dijkstra's advocacy and these techniques is complicated but I think there is a connection. So partly we might say that Dijkstra was not just concerned with how to make it easier for humans to think clearly about program correctness, but ultimately also about how to make it easier for computers to help humans automatically determine (parts of) program correctness. And it's true that the GOTO style complicates that task.


Kind of ironic that nowadays many people in our generation consider the newer generations to be lacking fundamental education because they never used GOTO based programming languages. I've talked to multiple people who lamented that young programmers have never done assembly or BASIC.

It’s helpful to have a mental model of how the computer works. I don’t know if it’s necessary that one have spent mountains of time building real software using a GOTO/jmp style, but having exposure to it would be nice, rather than hiding it away.

Jeff Duntemann's assembly programming books included a “chapter 0” that I always loved, and which really stuck with me for how creatively it taught those topics.


I mean, CPUs do a bunch of work to make us believe they still operate just like a fast PDP-11, and I would wager that besides compiler experts who work on the backend parts of compilers, not many people have a real feel for modern hardware (obviously besides those who actually work on that given hardware).

So I'm not convinced that even those who think they know how it works know it actually.


We need updated board games that replicate the function of a CPU.

Assembly? Sure, that has some educational value.

BASIC? That’s just nostalgia for poverty.


Not entirely. GOTO can be pretty nice! And even the lack of structs probably has the advantage of helping to prepare you for today's world where column-major is back in style, for performance reasons.
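
About the column-major point: a BASIC without records pushes you toward parallel arrays, which is essentially the structure-of-arrays layout that data-oriented and columnar designs favor today. A rough C sketch of the two layouts (my own illustration, names made up):

    #include <stddef.h>

    #define N 1024

    /* array-of-structs ("row-major"): each record's fields sit together */
    struct particle { float x, y, vx, vy; };
    struct particle aos[N];

    /* structure-of-arrays ("column-major"): one array per field,
       much like parallel arrays in a BASIC dialect without records */
    struct soa {
        float x[N], y[N], vx[N], vy[N];
    } soa;

    /* A pass that only touches x benefits from the SoA layout:
       the x values are contiguous, keeping cache lines (and SIMD lanes) full */
    void shift_x(struct soa *p, float dx) {
        for (size_t i = 0; i < N; i++)
            p->x[i] += dx;
    }

Whether BASIC actually prepared anyone for this is debatable, but the layout is the same idea.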

Dijkstra thought of computer science as a subdomain of mathematics, and thought that hands-on experimentation with actual computers would mostly lead students astray. A program should all be worked out and proven correct before (optionally) feeding it to a computer, and testing and even more so debugging were abhorrent practices.

BASIC, on the other hand, is more aligned with what Seymour Papert later came to call "Constructionism": the student learns by experimentation.


It is the "correct by construction" approach vs the "construct by correction" approach.

Dijkstra was silly because everybody knows that Computer Science is the parent field of mathematics.

Mathematics is the study of all O(1) algorithms.

Computer Science is all other algorithms!


That's how it was with CS at Purdue when I was there at the beginning of the 1990s.

It was Computational Science, not Computer Science, and was in the math department.

We did everything with pen and paper until I got into my 300-level classes and we got access to the NeXT cubes and the IBM 3090.

I ended up switching to networking and the tech track, but it was definitely different...


Ironically, I grew up with limited access to computers, so I wrote many programs on paper first, including a FORTH implementation in assembly language I wrote over summer break with a typewriter, waiting for school to start again so I could actually test it hands on.

“On the Cruelty of Really Teaching Computer Science”[0]

[0] https://en.m.wikipedia.org/wiki/On_the_Cruelty_of_Really_Tea...


https://www.cs.utexas.edu/~EWD/transcriptions/OtherDocs/Hask... - describing why, in 2001, he thought Haskell was a good choice for a first college course in CS.

You can read many of his thoughts here: https://www.cs.utexas.edu/~EWD/welcome.html


He probably thought programming students should be taught Pascal, the academic language he pioneered. It is quite different from BASIC.

Pascal was Wirth, not Dijkstra.

Sorry, misremembering my meager computing history...

I remain convinced that the 1990s rewrite of Inside Macintosh (which can be found here on your site: https://vintageapple.org/inside_r/) was the best documentation Apple ever produced.

Apple really wrote amazing and comprehensive documentation. I wish they would do so today.

I've culled most of my computer books as I've moved, but I'll hold on to my Inside Mac volumes.

Vol. 6 of Inside Mac was too big though. No one is going to curl up on a sofa and read that book.

As programming documentation, they were way better than 99% of the afterthoughts that most companies published.


It mostly makes sense to me. "Advanced CPUs" should probably have been "advanced models" (like the Mac LC) that were missing coprocessors.

And it probably was true that crashes in Microsoft products got patched in the system, while most other app developers did not get that courtesy.


The best base of reference!

1 Ringgit = 1 Ringgit


Another factor might be that Intel MacBook Pros got thinner and thinner. The M1 MBP was quite a bit thicker than its Intel predecessors, and I think the form factor has remained the same since then.


The 2012-2015 MBPs were slimmer than the previous unibodies, but a lot of that was due to dropping the optical drive and spinning hard drive. The thermals were not a particular problem. Where that did become a problem was with the 2016 redesign that introduced the wedge-shaped case. That design would have been locked in 2-3 years earlier. From what I have read, when the case designs were being worked on, Intel was promising Apple that their next generation of chips would be on a smaller process node and would run cooler, so that defined the thermal envelope of the new MBPs. Unfortunately, as we all saw, Intel's production stalled for 5-6 years and the only chips they could produce were power-hungry and hot. That caused problems for those thinner MBPs.

Apple seems to have taken that to heart when they designed the cases for the Apple Silicon MBPs and those have excellent cooling (and more ports).


Yes, but I had about every generation of Intel MBP - it never was as good at it as the M1 MBP.


Unless I misunderstand you, you're citing a completely irrelevant factoid. Employment statistics are based on actual people working, not job ads posted. And the revisions [1] (which sometimes are upward) have nothing to do with ghost jobs, but are due to additional data coming in over time, leading to refinements of the original estimate[2].

[1] https://www.bls.gov/web/empsit/cesnaicsrev.htm [2] https://thedispatch.com/article/jobs-report-revisions-explai...


My implication is that companies are either lying and being corrected by later audits, or the government is choosing not to take data into account until later, when the impact is lessened. It's not a "ghost job" in the modern meaning, but it does seem like pretending there are a lot more jobs than there really are.


It's the other way around, isn't it? MagSafe was removed in the 2016-2019 model years (not sure why; maybe to shave off another bit of thickness?), and then brought back in 2020 to MacBook Pro and 2022 to MacBook Air.

Personally, I practically never use MagSafe, because the convenience of USB C charging cables all over the house outweighs the advantages of MagSafe for me.


One complicating factor in the case of the Intel Macs is that an architectural transition happened after they came out. So they will be able to run less and less new software over the next couple of years, and they lack most AI-enabling hardware acceleration.

That said, they did suffer from some self inflicted hardware limitations, as you hint. One reason I like the MBP is the return of the SD card slot.


As far as I know, Goldman Sachs didn't have previous experience issuing credit cards either, so maybe it was also (partly) incompetence on their part.

