
I've known Eric since around 1985 or so when he came to work at Data I/O. Our friendship has ebbed and flowed over the years, but he's always been special to me.

I have a little anecdote to share that probably only I remember now, but it has always made me laugh and shows what an unusual thinker he was.

When he was first working at Microsoft around 1991 or 92, he was talking to a manager who had a problem. There was a program written in assembler language, about 50k in size or so. The program had a problem that needed to be fixed. The author no longer worked at Microsoft. The manager had repeatedly assigned various programmers the task of fixing it, and they'd all given up after days of effort.

The trouble was the author had used the macro language that came with the assembler to invent his own quirky, weird, undocumented and thoroughly impenetrable language. None of the programmers could figure it out. He was lamenting that he'd have to assign a programmer to recreating it from scratch, which would take months and cost a lot of money.

Eric says "I can fix it".

In 2 hours, he had it all fixed and checked back into the system.

The manager, astonished, asked: "how did you figure out the author's language?"

Eric: "I didn't. I disassembled the binaries using Walter's disassembler [a disassembler turns binary code back into source assembler code]. The problem was then obvious, I fixed it and checked the new source code back in!"

Eric then laughed uproariously, and so did I. Nobody but Eric would have thought of that shortcut to solve the problem. If he couldn't break through the wall, he'd go over it, under it, or around it. He was always like that.

I'll miss him a lot.




Another example of how Eric was not constrained by conventional thinking. While at Microsoft in the 90's, he'd occasionally interview job candidates. He'd hand them his phone, and ask them "if it was your job to make a better phone, what would you do?"

Most would answer "Um, better sound quality? Maybe an address book?"

Eric just couldn't believe those responses. (Remember, this was in the 90's when a phone was just a phone.) He was always boiling over with ideas.


> "Um, better sound quality?"

> Eric just couldn't believe those responses.

If only a phone could be invented that had sound quality as good as we had in the 90s. I can barely understand people anymore. I'm not losing my hearing in my old age (much). In person is fine. Other recordings are fine. Even video conferences are fine. But phone calls sound unaccountably terrible.


Since my company moved from phone-in conference calls to Teams, it has been starkly noticeable just how compressed normal phone-line conversations are. Combined with a couple of our members having piss-poor mobile coverage where they were (but much faster internet), our meetings became a lot higher quality.

On the phone-line side I wonder if that is a degradation due to digital switchovers & compression, a lack of allocated resources, or just pure rose tinted glasses.


VoLTE [0] really improves this, but compatibility and network limitations make it all too rare. It's like night and day when it works, though.

[0] https://en.wikipedia.org/wiki/Voice_over_LTE


I probably make calls with apps more than POTS calls these days. The sample rate and codec on a typical social app is noticeably better than POTS calls.


POTS is not POTS any more. Anything beyond (possibly) the last mile is now digital, with the lowest possible bitrate, if not lower, and old codecs. And that extends to mobile.


Right. So substitute "AMR-NB" for "POTS".


It really should be AMR-WB or even AMR-WB+ by now, and EVS if you are on VoLTE.

EVS is very good at 16-32 kbps, possibly better than even Opus. But generally speaking, social apps use higher bitrates, so you get better quality, although latency tends to suffer a bit.


Wait, you're using a smartphone to phone? How peculiar.


The highest-bitrate EVS modes finally get there, I think, though you'll need sufficient signal strength and networks that support them at both ends of the call.


It's likely due to the phone companies squeezing the last bit of space out of their frequency bands, making it sound worse and worse.


Might be a VoIP thing?


At least in Germany, since we got VoIP the voice quality is absolutely crystal-clear if all devices on the line are digital. It will still sound muffled if you connect e.g. an analog phone to your VoIP router, but a DECT phone over VoIP will sound so bright and clear that it's almost frightening.


My idea at the time was to add a morse code key (a rocker switch, one side was dit the other dah) so you could send messages privately while in a meeting.

Another idea is to make a walkie-talkie mode, where you could communicate with nearby cell phones without needing a cell tower.


Not sure about the US, but Canada had such a mixed cellular & push-to-talk network in the 90s called MIKE, and it was quite popular in some industries (e.g. trades). I don’t think it was actually a walkie-talkie, but the user experience was identical.


In the USA it was Nextel.


Exactly. Those phones were very popular on construction job sites. Have a buddy who used to sell them. He talked me into getting one that supported J2ME. It really wasn’t much, but, in 2002/3, was the aha moment that showed where programmable phones would go.


> My idea at the time was to add a morse code key (a rocker switch, one side was dit the other dah) so you could send messages privately while in a meeting.

When the very first phones had physical buttons, less sophisticated UIs and T9 text entry, I remember being able to send an SMS from my phone without ever taking it out of my pocket. All from muscle memory.


> "if it was your job to make a better phone, what would you do?"

Turn it into a cube, maybe an octagon. I'm sick of rectangles.


half as wide. cinematic 2.3 aspect ratio, maybe more.

Less snarky comment (but also true): it's just jaw-dropping that these pocket PCs do so much. On the down side, the main thing that floors me, just stabs me in the heart, is that they are still almost totally trash at communicating with those around us. Google dropped Project Eddystone, which sucked; it was a totally straightforward, sensible way to broadcast a URL to the world about you. Apple has its file-sharing AirDrop, which Google continues to make floundering attempts at (Beam, then Fast Share/Nearby Share), & never via any open protocols or specs. Amazing personal communicators... totally reliant on completely centralized infrastructure. The couple of Bridgefy/Open Garden alternatives all seem too stack-specific & lacking the layering of protocols & extensibility that allows communication tech to grow over time and become of enduring value.

I forget what it is, but someone was telling me the Bump competitor in Japan is still used: you still have a little physical ritual with your devices that causes them to exchange contact info. That kind of real-world connectivity seems like it ought to be just the start for how tech can augment the real world, versus us needing to start & navigate the digital to connect. We're doing ourselves few favors, entrenching connectivity so reservedly in the all-digital space.


Japan, the land of paper business cards... and the Lovegety! Does anybody remember the Poken, which you could carry on your keyring? Touch two together and there was a kind of lovely glow as they exchanged contact info. Insert the USB side into a laptop to download the contact information. It almost hit critical mass in Japan. Poken parties in Tokyo were fun. Such a pre-covid alt reality...


We got free Poken at a conference. I was a bit of a snob for a day or so, & didn't really know or trust the controlling entity, but I decided to try it. It was really fun! Easy to use.

I think Bump happened a year or two later, which was also fun. But it felt far more intermediated, like it was pulling down much more data, looking at who in the area also bumped at just that instant to triangulate who to put in contact with each other.

I remember back in the 90's wear(able)-comp(uting) world, PANs were going to be a thing. Personal Area Networks. Your skin itself would be a low-baud rate modem, so you could shake hands & transmit your contact information. Definitely pre-covid. ;)

I really think Project Eddystone has many of the answers for how we ought to make ourselves visible to the world. It was mainly focused on commercial entities, and people are scared of the internet, but I think there's still great potential. Alternatively, it seems weird that NFC, which is much more tightly coupled, 1:1, "safe", is so under-utilized. I'm not sure if there are good open standards for data exchange over it. Meanwhile, proprietary AirDrop and Fast Share/Nearby Sharing are some of the few, tightly controlled options for sharing files. All these kinds of tech need some champions to drive good protocols and enable more interesting futures. Sorry, I think I'm repeating myself again. Thanks for the fun comments dcsan.


> they are still almost totally trash at communicating with those around us

Probably because corps don’t want us communicating with each other in any way they can’t know about.


Kids today scan QR codes to link up on Snapchat to keep in touch, so in a sense, we kind of have come full circle.


I'd add a swappable battery, microSD card slot for easy storage upgrade, and of course, a headphone jack...


I’d add a headphone jack that doubles as a serial port, swappable battery, storage expansion, and a swappable, expandable, customizable OS and shell, like so: https://pine64.com/product/pinephone-community-edition-kde-p... (Extra batteries, when in stock: https://pine64.com/product/pinephone-lithium-battery/?v=505a... standalone charger: https://pine64.com/product/pinephone-battery-charger/?v=0446..., shell: https://sr.ht/~mil/Sxmo/)

Plus easier-access hardware kill switches and a faster GPU and RAM: https://shop.puri.sm/shop/librem-5/


I don't know many people who care about the hardware except for the camera. How is it? Can it compete with the current phones from Apple and Google? Does it do all the standard things like include filters, portrait mode, night mode, etc...? Can I search for "dog" and get all my dog photos or search for a city and get all the photos taken there?

And how's the app store? When some random game goes viral and I want to play with my friends, am I going to be able to? Are the apps vetted for security and safety?

Does it integrate with my car? One of the primary uses of my phone is for navigation. Being able to plug it into my car and say "navigate to 123 main street" as I drive is pretty key.

Is their store only online? If they don't have their own retailers can I at least see them at a carrier store or BestBuy? Will Verizon activate the phone on their network? A decade ago they wouldn't approve the Nexus 7 tablet, so this is something I think about.


> I don't know many people who care about the hardware except for the camera.

These phones are not for many people.

> How is it?

In development. The pinephone can take still shots. The Librem 5’s camera isn’t functional yet AFAIK.

> Can it compete with the current phones from Apple and Google?

No. It’s not playing the same game. I would pay money not to have one of those.

> Does it do all the standard things like include filters, portrait mode, night mode, etc...?

AFAIK, no.

> Can I search for "dog" and get all my dog photos or search for a city and get all the photos taken there?

I don’t know of any apps for them that do that at the moment.

> And how's the app store? When some random game goes viral and I want to play with my friends, am I going to be able to?

Probably not. It probably has about 6 orders of magnitude fewer users, and 5–6 orders fewer devs. You can virtualize Android, but I wouldn’t be surprised if it was a much worse experience than Proton on desktop (it doesn’t have Valve behind it. Anti-cheat might get you banned).

> Are the apps vetted for security and safety?

Yes, in some repos. Most of the distros will let you use Debian’s, which IIUC have had z-e-r-o zero malware (not counting systemd ;)) in their decades-long history.

> Does it integrate with my car? One of the primary uses of my phone is for navigation. Being able to plug it into my car and say "navigate to 123 main street" as I drive is pretty key.

I’m confused why it needs to integrate with your car to give directions. Do you mean connect with Bluetooth to use your speakers?

> Is their store only online?

Yep.

> If they don't have their own retailers can I at least see them at a carrier store or BestBuy?

Nope.

> Will Verizon activate the phone on their network? A decade ago they wouldn't approve the Nexus 7 tablet, so this is something I think about.

Nope, not as far as my crystal ball can see. The pine wiki says you can activate a SIM on a burner phone and swap it into the pinephone, and Verizon shouldn’t care.


> Do you mean connect with Bluetooth to use your speakers?

That would work. Integrating like CarPlay or Android Auto would be better, but Bluetooth or aux out works too, although it's a little primitive.

As long as I can do something like say "hey pinephone, navigate to 123 main street" and it does the obvious thing.


How are these doing, sales and developer support wise?


Bring back sidetalking! http://www.sidetalking.com/


Are we celebrating irrelevant interview questions?


It’s not necessarily irrelevant (depending on the role). After interviewing a lot of people, I’ve found you can ask a handful of questions and get a pretty good sense for their work style (outside of technical knowledge).

That can be helpful in hiring.

If you’re looking to hire someone for a technical role on some big hairy project that might not work or seems crazy, it’s helpful to select for people who won’t say “this idea is dumb”.


I don't fully understand why every question needs to be 100% relevant.

If you have two candidates equal in technical aspects, then a point of difference is in problem solving and dealing with the unexpected.


That's a very good question though. It allows assessing how other people approach problem solving, how they think, and how they communicate. A question like this will often tell you more about a candidate than seeing them invert a binary tree.


I disagree. You're asking the candidate to guess what kind of answer you're looking for.

Possible answers, all valid, all telling you absolutely nothing:

* I'd change nothing, it works AS A PHONE just fine. There are better things to work on

* I'd attach it to the side of a rocket so you could use it as a handle. I'd much rather ride a rocket than use the phone

* I'd double the speakers and mic and offset them at 45 degrees so two people could hug cheeks while talking to the same person

* I'd give it a display, add a computer, and have it pocket size and it would annoy you all day long with notifications and distract you from getting anything done. People would stop using it as a phone and mostly only write text messages to each other which would be way less efficient than voice. Oh wait, you said "improve". Yea, ok, bad suggestion


You don't understand. When you get a question like this as an interviewee, they want to see how you arrive at your answer regardless of what the answer is. The answer by itself is less important than what made you pick it in the first place.

For example, possible takeaways from your answers:

* Lacks critical thinking, takes things for granted, doesn't show initiative or not comfortable with it.

* Elevated smartassness, possibly as a defense mechanism or due to inability to read between the lines. Maybe just nervous at the moment.

* Better. At least this solves some specific problem based on one's personal experience. It's possible to discuss that and see what else surfaces.

* Also good. Miraculous even if you'd managed to produce all that in the early 90s. Again, this also opens things up for further discussion.

See the point? It's a meta question.


The fact that you think "leave it alone" is an example of lacking critical thinking skills shows just how subjective this is.

On the assumption that not many people being interviewed by Microsoft in the 90s had an iPhone style plan or a desire to revolutionize cell phones, "leave it alone" is the most realistic, objectively useful answer a candidate might give. Sure, it demonstrates zero creativity, but it's totally unfair to say that person lacks critical thinking skills as their answer is the most rational.

To pass that question, you need to know the interviewer is looking for creativity and be able to make something up on the spot. That's obvious to many of us, but depending on the role I wouldn't fault someone for not being in tune to that question or ready for it.


> That's obvious to many of us, but depending on the role I wouldn't fault someone for not being in tune to that question or ready for it.

The job interview gets debated endlessly, week after week, on HN. There isn't one interview method or question that numerous people wouldn't consider unfair, irrelevant, broken, etc.

Eric was interviewing people for very high paying jobs. A decision has to be made about the candidate based on something. A person who wants such a job should go in eager and ready to convince the employer that they're a great fit for the job, and do what it takes.

And yeah, Eric's question was pretty obvious. It's open ended, and you can interpret it as you like, which is part of the point. But being a smartass and/or complaining about the question isn't likely to leave a favorable impression. A "faster horse" answer isn't likely to, either.


That's not the point. The point is that the judgement of the answers to "how do you improve the phone" is totally subjective. Maybe, as the interviewee, I guess you want some crazy answer like "well, we could build it into your car", but what you really wanted was proof that I wouldn't waste time on this and would instead focus on getting stuff done.

As an aside, IMO smartphones are not an improvement on phones. They are an improvement on PDAs. As phones they've utterly failed. People called each other (as in use a phone) way more before smartphones. So smartphones are not better "phones". They're worse phones and better something else.


Built-in Automatic Teller Machine: a phone that dispenses money!


I was at Microsoft around the time of Eric’s departure (I left a few months before him).

Ironically, the two men arguably most responsible for the success of the Xbox were strongly opposed to the entire “Manhattan” project at the time.

Eric was instrumental in the DirectX runtime design that Microsoft ended up porting to the original Xbox hardware with shared high level components.

The other guy was Dave Cutler, who led the NT and hypervisor groups at the time. Xbox v1 borrowed heavily from Dave’s work on modular, stable-userspace NT. 360 was basically running a full-fledged NT iirc. Dave, like Eric, hated the Xbox project early on.

In today’s terms, it would be like Facebook deciding to make a search engine or mobile OS, so their reactions were hardly surprising. Word was that even BillG was hesitant, but Ballmer really wanted more product lines.

Eric will be dearly missed.


What's 'the entire "Manhattan" project'? I mean, other than the nuke on Hiroshima which I suppose you don't mean?

Remember that there's HN'ers who never worked at Microsoft :-)


"Manhattan" project was the code name for the original XBox.

At the time, Japan dominated video game consoles, with the last western attempt being the disaster that was the 3DO. Thus Microsoft was going to "nuke the Japanese industry" and gain a foothold.

For further irony, the Xbox was in effect a spiritual successor to Sega's Dreamcast. That Microsoft went with such a distasteful name despite partnering with Sega highlights their cultural issues, issues which have continued to prevent Xbox from building relationships with Japanese studios.


"Manhattan" was about DirectX, not the XBox. (That's why DirectX had the weird weird radioactive logos, even before the XBox was an idea.)

It wasn't about consoles but gaming in general. Programming games for DOS required a lot of assembly knowledge (at least, if you wanted something fluid), and deviation from industry norms only increased development cost (think of things like high-res support necessitating VESA code, to say nothing of sound support in DOS).

Devs hated dealing with all this extra work, extra compatibility, extra libraries (old gamers may recall MILES or DMX setup screens in some titles.) The libraries for Windows 3.1 were... not at all up to the task.

Thus, DirectX was created. Not so much to create a console competitor, as to show PCs as superior to consoles overall.

Which, when one thinks about it, may explain why Eric/etc were not big fans of the XBox; in some ways it conceded defeat on a level.


I hope the name was a coincidence - consider that Microsoft uses city names for big project codenames.


No, it was not a coincidence, as Eric testified in his interview with the University of Washington business school.

He actually comes across as a mix of interesting and unpleasant in this interview. It takes me back to the nineties, when he and St. John were routinely pissing off people, including myself.

But in the end people are complicated.

https://www.youtube.com/watch?v=rjZWUweIIEU


Of all the things Microsoft has done, using a vaguely insensitive name is not really that worrying to me.


Chicago I think was windows 95


According to Eric when he told me about this, it was HIS idea to name it that. When he said this, it made sense because it followed his sense of humor (he told me this story literally just this September).


It was a codename for a project that later became DirectX. That's why the early DirectX logo looks similar to radiation warning sign.


That's beautiful. Thanks for sharing. It reminds me of a similar experience I had.

A few years back one of the systems I ended up inheriting stopped working on January 1st. Like most people, I had taken the day off, but my manager called me to fix the issue in prod.

After much research, I found the assembly causing the issue. Similarly, the original dev had left and the source code was lost. So I used ildasm and found out the year was hardcoded. I patched the binary and went home.

Rest in Peace, Eric.


Similarly, the original dev had left and the source code was lost.

This seems surprisingly common, I hear it a lot from my customers.


> So I used ildasm and found out the year was hardcoded.

Deliberate time bomb or honest mistake?


This approach generalizes. When faced with a problem, consider the projections you have available and work in the most suitable one.


As Alan Kay said, "A change in perspective is worth 80 IQ points."


Just look at how some intractable problems in (T)ime become almost trivial in (F)requency. Or the Laplace transform, brilliant: https://en.wikipedia.org/wiki/Laplace_transform
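
For instance (a toy textbook example, nothing specific to the thread): the transform turns differentiation into multiplication by s, so a first-order ODE collapses into algebra.

    \mathcal{L}\{y'(t)\} = s\,Y(s) - y(0)

    y'(t) + a\,y(t) = 0,\quad y(0) = y_0
    \Rightarrow\ s\,Y(s) - y_0 + a\,Y(s) = 0
    \Rightarrow\ Y(s) = \frac{y_0}{s + a}
    \Rightarrow\ y(t) = y_0\,e^{-at}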


plus or minus?


You may be kidding, but I think it is true that pivoting to the wrong perspective can be actively harmful.

Consider, for example, the impact of the anti-vaxxer perspective that "chemicals are bad for the body".


What do you mean by projections? Do you have another example?


More examples:

When debugging a web app, you can debug the code (and add print statements / breakpoints). Or look at the HTTP queries it's making in the inspector. Or debug the server - look at what queries it's receiving and returning. Or experiment with different values in the database.

Last year I was trying to track down a memory leak in some wasm code. The tooling for that sort of thing in wasm was/is quite primitive, so I did some work to compile the module natively then ran valgrind on the resulting executable. That didn't work - the C code executed differently from the wasm and didn't exhibit the bug! So I added print statements randomly through the program and ran it both natively and through wasm, then diffed the text output to track down when the execution traces diverged. (It wasn't a compiler bug, fwiw.)

I wanted more information about the memory leak. So I wrapped all the malloc/free calls with a replacement that stored all outstanding allocations in a list, with the stack trace when malloc was called. When the leak happened, I could ask for the stack trace of all outstanding allocated objects[1]. That was fast and effective, and I added a memory leak check in my CI.
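
For anyone curious, here's a minimal sketch of that kind of tracking allocator in plain C (not the actual code from [1], and simplified to record the call site as file/line rather than a full stack trace):

    /* Hypothetical sketch: wrap malloc/free, keep a list of live
       allocations with their call sites, dump whatever is left. */
    #include <stdio.h>
    #include <stdlib.h>

    typedef struct alloc_rec {
        void *ptr;
        size_t size;
        const char *file;   /* call site via __FILE__ / __LINE__ */
        int line;
        struct alloc_rec *next;
    } alloc_rec;

    static alloc_rec *live;  /* all outstanding allocations */

    void *tracked_malloc(size_t size, const char *file, int line) {
        alloc_rec *rec = malloc(sizeof *rec);
        rec->ptr  = malloc(size);
        rec->size = size;
        rec->file = file;
        rec->line = line;
        rec->next = live;
        live = rec;
        return rec->ptr;
    }

    void tracked_free(void *ptr) {
        for (alloc_rec **p = &live; *p; p = &(*p)->next) {
            if ((*p)->ptr == ptr) {
                alloc_rec *dead = *p;
                *p = dead->next;
                free(dead->ptr);
                free(dead);
                return;
            }
        }
        free(ptr);  /* wasn't tracked; free it anyway */
    }

    void dump_leaks(void) {  /* call at exit, or from a CI check */
        for (alloc_rec *r = live; r; r = r->next)
            fprintf(stderr, "leak: %zu bytes from %s:%d\n",
                    r->size, r->file, r->line);
    }

    #define MALLOC(n) tracked_malloc((n), __FILE__, __LINE__)
    #define FREE(p)   tracked_free(p)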

At one of my first programming jobs we were using visual source safe (oh god, kill it with fire). I ended up losing some changes I'd made in a ~3000 line java class that we'd already deployed to production. I didn't want to do my work over again (or admit the mistake to my boss!). So I ran a decompiler against the class file we deployed to production - but it was huge and unreadable. To find the changes I'd made I took the current version of the java file, then compiled it and decompiled it again. So then I had two ugly files - but I could diff them, and that highlighted immediately the dozen or so changes I'd made. Then I just manually rewrote each change back into the original source file. Importantly, I had complete confidence that I didn't miss any of my changes.

[1] It's not pretty, but the code looks like this if anyone's curious: https://github.com/josephg/mime-to-jmap/blob/2a7429f7b805cb0...


Diffing log output is a great technique, I find, for tracking down weird, hard-to-repro bugs - log out as much as you can in a normal session and then in a buggy session, and diff the two, just as you describe.

I've also used this technique to compare build logs when you have a version of a project which will compile and one which doesn't (e.g. when trying to integrate a second project into an existing one, or battling CI issues). It's especially useful if you aren't an expert in that particular toolchain.

Personally I use VS Code’s diff these days, since they added support for word wrap recently, it’s the easiest to read.


> visual source safe (oh god, kill it with fire)

Did you have merge week too? Often stretching into a second?


No - we had a small team (5 people). We still managed to corrupt the source code database every couple of weeks and needed to reinitialize the whole thing.


We did daily backups to a second machine, keeping the last 3. When it corrupted itself, we copied it all back. I liked SourceSafe. It was kind of cool the way it worked and integrated into all the MS tooling we used at that time. Plus it was decently cheap enough for a small in-house group of devs to afford. Its semantics for usage were very straightforward and easy to learn. BUT it was a pain how it would corrupt itself randomly. That 'one' bug is why no one dares use it today and hates the thing. To this day, because of how easy it was to use, I compare the UX of all other source controls to it.


Map the problem into other domains. In this case, he could work with the original (hard to understand) source code. Or he could use a tool that would translate it (or the binary) into a more sensible language.

In other tasks, consider different data structures or representations that fundamentally represent the same data, but in different ways that offer different guarantees. Representing a graph with a matrix may let you answer some questions very quickly, but it may make others harder to answer, for example.
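
A tiny illustration of that last point (my own toy example, nothing to do with Eric's program): the same four-node graph stored both ways. The matrix gives O(1) edge tests at O(V^2) memory; the list gives cheap neighbor iteration.

    #include <stdbool.h>
    #include <stdio.h>

    #define V 4  /* nodes 0..3, edges: 0->1, 0->2, 1->3, 2->3 */

    /* Adjacency matrix: constant-time edge test. */
    static const bool matrix[V][V] = {
        {false, true,  true,  false},
        {false, false, false, true },
        {false, false, false, true },
        {false, false, false, false},
    };

    /* Adjacency list: compact, iterate a node's neighbors directly. */
    static const int neighbors[V][V] = { {1, 2}, {3}, {3}, {0} };
    static const int degree[V]       = { 2, 1, 1, 0 };

    int main(void) {
        printf("edge 0->2? %s\n", matrix[0][2] ? "yes" : "no");  /* O(1) */

        printf("neighbors of 0:");  /* O(degree), not O(V) */
        for (int i = 0; i < degree[0]; i++)
            printf(" %d", neighbors[0][i]);
        printf("\n");
        return 0;
    }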


It's an eastern mode of thinking. When faced with a problem, walk around the problem and examine all avenues of approach. Much like scaling a mountain: seek the path of least resistance.

Too often we simply reach for our trusty hammer, and it's not always an efficient or usable tool.


How is this an "eastern" mode of thinking? It's a pretty typical way we'd approach solving problems in physics anyway, and this was all at a very western university here in the US.


It may be prevalent in niche industries. But it's largely not a part of historically western culture. And would you have applied that approach to problems prior to becoming a physics expert? Did they teach that in kindergarten?

E.g., in school as kids we learn derivatives, rote routines to solve a type of problem, and we plug and chug, but we are not shown (necessarily) by government education the underlying theories that may create a solution. This/my opinion is of course dependent on how expert an instructor is ...

The best anecdotal example I can give off the net offhand would be: https://www.google.com/amp/ask.metafilter.com/105309/Tao-thi...

Thanks for your feedback

Edit: Also, in the USA we're taught to do our own homework, for example. In Asian cultures students often band together to succeed in classes (I work in higher ed and this is culturally visible among peers in classes). This is another example of this philosophy differing between cultures.


Creative/critical thinking is taught heavily from kindergarten in western schools.

In contrast to eastern education which tends to greatly focus on memorization through repetition (i.e. https://en.m.wikipedia.org/wiki/Rote_learning).

The quote you have included is based on Taoism. Is that common? I was under the impression that Confucianism has the more dominant influence on values and thought.

Anyways, at least that's one thing we can agree on - Taoism's philosophy encourages an open mind and awareness.


> In contrast to eastern education which tends to greatly focus on memorization through repetition..

I wouldn't quite say that is true. People use a variety of techniques to score higher, and there's always a certain percentage who will use such rote learning techniques, be it eastern or western.

I think what's missing in the discussion is the western focus on individualism as compared to east asian communitarianism. Both have their pros and cons and mutual dislikes.

There was a book by two high-level western/eastern women managers at Hewlett-Packard that highlights this dichotomy. It's a great read.


> individualism as compared to east asian communitarianism

Again, that's because of a deep rooting in Confucianism. Also, "Communitarianism" is a western (Christian) philosophy. It may have been borrowed from Confucianism? I'm not sure. A lot of religions and philosophies share these kinds of values, so it's not unique to just one region / culture, IMO.

Anyhow, neither is better or worse, and probably a balance between communitarianism and individualism is best. All cultures can learn from each other.


> Also, "Communitarianism" is a western (Christian) philosophy

Hm probably a better word would be collectivism


> Edit: Also, in the USA we're taught to do our own homework, for example.

I went to a mix of 7 private and government schools from kindergarten through 12th grade in the US, and I don’t remember any of them not having group projects. Some, or most, of the homework is individual, but there’s plenty of group work too.


He's referring to the phenomenon of certain groups of students cheating on homework and justifying it by claiming it's a cultural difference in approach to learning. It's not. It's outright splitting up work between peers, often with the use of illicit solutions manuals, for the purpose of gaming the grading system. It's especially beneficial in classes where the professor just writes exams from homework problems or problems very similar to them.

There isn't anything special or insightful, mystical or different or "eastern" about cooperative learning. I'm thoroughly "western" and had plenty of homework groups in both undergrad and grad school. Nor is there anything special about the cooperative cheating I described: "certain groups" applied to plenty of western peer groups I saw in undergrad especially, though they didn't claim what they were doing was some "cultural" thing, as they knew it was simply cheating.


He means mathematical projections. For example, calculating rotation velocities in a Cartesian projection is harder than using a cylindrical projection.
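
A quick sketch of the kind of thing meant (a standard toy case): for a point rotating about the z-axis at angular rate \omega,

    \text{cylindrical: } \vec{v} = \omega\,\rho\,\hat{\varphi}
    \text{Cartesian: } \vec{v} = (-\omega y,\ \omega x,\ 0)

One term versus components that mix x and y.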


Nope, you're giving him too much credit. He meant projections as in a different view on the problem, not anything more well defined than that. Your manner of explaining it is much better.


I very much meant mathematical projections, but also (by analogy) other kinds of perspective shift where they have enough of the same properties.


Two classic examples (Cartesian vs polar coordinates, frequency vs time domain) are mentioned in other comments.

I would certainly add "looking at values vs looking at differences" and "looking at the generator vs looking at what's generated" (which the top level story is an example of).

It may take a little squinting, but we can see the common mathematical practice of reducing one problem to an example of another as an instance of this. There are also tweakier math examples where treating something as topology or something makes it easy to solve.


Back up and consider if all of the turns you made to get down this particular blind alley actually made sense. “First principles” and all that.


> What do you mean by projections?

@Jtsummers puts it as “Map the problem into other domains”; another interpretation (or a variation on the same) is “look at it from a different angle” or “look at it in a different light”. Do both -- move around with your light source, shine a torch/flashlight at your problem, and different shapes will be “projected” onto the wall behind it. Kind of like, if you are familiar with Hofstadter's _Gödel, Escher, Bach_ -- remember that “G,E,B” cube pictured on the cover of many editions? A more or less abstract “thing” or idea can be “projected” into different practical views depending on perspective. In actual practical work talking about “projections” can of course feel a little highfalutin’; to me it's a bit like always talking about your RDBMS tables or views as “relations”.

But in a more philosophical sense it is of course perfectly valid. And AFAICS it's an age-old idea in Western philosophy; what I described above is essentially precisely what is known as _Plato's Cave_. (Which incidentally is why @tenken’s identifying it as “Eastern” feels quite peculiar to me.)


Gordian Knot[0] recognition in action.

[0] https://en.wikipedia.org/wiki/Gordian_Knot


An absolute monster!

Edit: non native speaker. I meant to say „what an absolute unit!“ to praise his skill. Be gentle to the ignorant please.


In the spirit of gentleness: at HN, comments which don't add substance to the conversation are discouraged. So in this case it's not just word choice that people are reacting to.


Which is ironic given that a GPT-generated article made it to the top recently, IIRC.

I'm glad hn isn't like /r/programmerhumor (i.e. if I argue a point I don't want to explain what a pointer is) but the delusions of grandeur are in equal parts tragic and hilarious sometimes.


I can imagine calling someone a monster in an approving tone, but it's harder to convey accurately in text.


Yeah, in Arabic it's often said in praise... imagine my surprise coming back to -5... had to google the right term quickly.


It's actually correct in English, especially in sporting contexts. Like "this guy is a BEAST", or "that guy is a machine", or "that guy is a monster opponent". But that is very tone-dependent, so if you say it in the wrong tone, or someone interprets it differently, then it can be taken badly.


It is correct in Latin, as well, if it helps. Monstruum = that which is worthy of showing


'Monstrum', yes. It's the root of "demonstrate".


Oh wow


Interesting! I wonder, back in those days, when your day-to-day job involved coding in assembly, I guess reaching for a disassembler on a regular basis should also have been the norm, no?


Not really. But I can tell you that, as a compiler writer, reaching for the disassembler is what you do a lot of!


Of course. And I bet you've memorized a fair share of opcodes as well! ;)


Unintentionally, yes. One day I realized I just knew them. The code generator used to hardcode the opcodes, but I have since slowly migrated to use mnemonics, and the opcodes have faded somewhat in my mind.

Of course, I mean the 16 bit opcodes. The SIMD ones, no freakin' way!

It's like the Wizard of Oz. I know most of the dialog, though I never made any attempt to memorize it. A friend of mine knows all the dialog for all the Star Trek TOS episodes. He just realized one day he knew it.


Thank you for this story. Your remembrance of him is inspiring. Moments like these are what we live for.

Using your tool, hah!


Can you tell us what this giant asm macro program was?


I don't think Eric told me, and if he did, I forgot. In those days asm was often used for efficiency, as compilers weren't that good compared to hand-coded asm. He knew I'd purposely written the disassembler to produce MASM source code, and thought I'd enjoy hearing about the use he put it to (and I did!).

BTW, this anecdote is one of the reasons that I've strongly resisted adding macros to D.


Pretty sure this is also the reason why Anders never added macros to C# either.


FWIW, most large asm programs from back in the day relied heavily on macros just to have some semblance of abstraction.


I know. I did, too. But over time, I used the macros less and less. More than once I've used Eric's technique to figure out exactly wtf some wretched macro was actually doing.

I also purged my own C and C++ code of macro usage.


> Walter's disassembler

> WalterBright

Was it your disassembler?




