I have a little anecdote to share that probably only I remember now, but it has always made me laugh and shows what an unusual thinker he was.
When he was first working at Microsoft around 1991 or '92, he was talking to a manager who had a problem: a program written in assembly language, about 50K in size, that needed a fix. The author no longer worked at Microsoft. The manager had repeatedly assigned various programmers the task of fixing it, and they'd all given up after days of effort.
The trouble was the author had used the macro language that came with the assembler to invent his own quirky, weird, undocumented and thoroughly impenetrable language. None of the programmers could figure it out. He was lamenting that he'd have to assign a programmer to recreating it from scratch, which would take months and cost a lot of money.
Eric says "I can fix it".
In 2 hours, he had it all fixed and checked back into the system.
The manager, astonished, asked: "how did you figure out the author's language?"
Eric: "I didn't. I disassembled the binaries using Walter's disassembler [a disassembler turns binary code back into source assembler code]. The problem was then obvious, I fixed it and checked the new source code back in!"
Eric then laughed uproariously, and so did I. Nobody but Eric would have thought of that shortcut to solve the problem. If he couldn't break through the wall, he'd go over it, under it, or around it. He was always like that.
I'll miss him a lot.
The bulk of candidates would answer, "Um, better sound quality? Maybe an address book?"
Eric just couldn't believe those responses. (Remember, this was in the 90's when a phone was just a phone.) He was always boiling over with ideas.
> Eric just couldn't believe those responses.
If only a phone could be invented that had sound quality as good as we had in the 90s. I can barely understand people anymore. I'm not losing my hearing in my old age (much). In person is fine. Other recordings are fine. Even video conferences are fine. But phone calls sound unaccountably terrible.
On the phone-line side, I wonder if that's degradation due to digital switchovers and compression, a lack of allocated resources, or just pure rose-tinted glasses.
EVS is very good at 16–32 kbps, possibly better than even Opus. But generally speaking, social apps use higher bitrates, so you get better quality, although latency tends to suffer a bit.
Another idea is to make a walkie-talkie mode, where you could communicate with nearby cell phones without needing a cell tower.
When the very first phones had physical buttons, less sophisticated UIs and T9 text entry, I remember being able to send an SMS from my phone without ever taking it out of my pocket. All from muscle memory.
Turn it into a cube, maybe an octagon. I'm sick of rectangles.
Less snarky comment (but also true): it's jaw-dropping that these pocket PCs do so much. On the downside, the main thing that floors me, just stabs me in the heart, is that they are still almost totally trash at communicating with those around us. Google dropped Project Eddystone, which sucked; it was a totally straightforward, sensible way to broadcast a URL to the world about you. Apple has its file sharing, AirDrop, which Google continues to make floundering attempts at (Beam, then Fast Share/Nearby Share), and never via any open protocols or specs. Amazing personal communicators... totally reliant on completely centralized infrastructure. The couple of Bridgefy/Open Garden alternatives all seem too stack-specific, lacking the layering of protocols and extensibility that allows communication tech to grow over time and become of enduring value.
I forget what it's called, but someone was telling me the Bump competitor in Japan is still used: you still have a little physical ritual with your devices that causes them to exchange contact info. That kind of real-world connectivity seems like it ought to be just the start of how tech can augment the real world, versus us needing to open and navigate the digital to connect. We're doing ourselves few favors, entrenching connectivity so exclusively in the digital space.
I think Bump happened a year or two later, which was also fun. But it felt far more intermediated, like it was pulling down much more data, looking at who in the area also bumped at just that instant to triangulate who to put in contact with each other.
I remember back in the 90's wear(able)-comp(uting) world, PANs were going to be a thing. Personal Area Networks. Your skin itself would be a low-baud rate modem, so you could shake hands & transmit your contact information. Definitely pre-covid. ;)
I really think Project Eddystone has many of the answers for how we ought to make ourselves visible to the world. It was mainly focused on commercial entities, and people are scared of the internet, but I think there's still great potential. Alternatively, it seems weird that NFC, which is much more tightly coupled, 1:1, "safe", is so under-utilized. I'm not sure if there are good open standards for data exchange over it. Meanwhile, proprietary AirDrop and Fast Share/Nearby Sharing are some of the few, tightly controlled options for sharing files. All these kinds of tech need champions to drive good protocols and enable more interesting futures. Sorry, I think I'm repeating myself again. Thanks for the fun comments, dcsan.
Probably because corps don’t want us communicating with each other in any way they can’t know about.
Plus easier-access hardware kill switches and a faster GPU and RAM: https://shop.puri.sm/shop/librem-5/
And how's the app store? When some random game goes viral and I want to play with my friends, am I going to be able to? Are the apps vetted for security and safety?
Does it integrate with my car? One of the primary uses of my phone is for navigation. Being able to plug it into my car and say "navigate to 123 main street" as I drive is pretty key.
Is their store only online? If they don't have their own retailers can I at least see them at a carrier store or BestBuy? Will Verizon activate the phone on their network? A decade ago they wouldn't approve the Nexus 7 tablet, so this is something I think about.
These phones are not for many people.
> How is it?
In development. The pinephone can take still shots. The Librem 5’s camera isn’t functional yet AFAIK.
> Can it compete with the current phones from Apple and Google?
No. It’s not playing the same game. I would pay money not to have one of those.
> Does it do all the standard things like include filters, portrait mode, night mode, etc...?
> Can I search for "dog" and get all my dog photos or search for a city and get all the photos taken there?
I don’t know of any apps for them that do that at the moment.
> And how's the app store? When some random game goes viral and I want to play with my friends, am I going to be able to?
Probably not. It probably has about 6 orders of magnitude fewer users, and 5–6 orders fewer devs. You can virtualize Android, but I wouldn’t be surprised if it was a much worse experience than Proton on desktop (it doesn’t have Valve behind it. Anti-cheat might get you banned).
> Are the apps vetted for security and safety?
Yes, in some repos. Most of the distros will let you use Debian's, which IIUC has had zero malware (not counting systemd ;)) in its decades-long history.
> Does it integrate with my car? One of the primary uses of my phone is for navigation. Being able to plug it into my car and say "navigate to 123 main street" as I drive is pretty key.
I’m confused why it needs to integrate with your car to give directions. Do you mean connect over Bluetooth to use your speakers?
> Is their store only online?
> If they don't have their own retailers can I at least see them at a carrier store or BestBuy?
> Will Verizon activate the phone on their network? A decade ago they wouldn't approve the Nexus 7 tablet, so this is something I think about.
Nope, not as far as my crystal ball can see. The pine wiki says you can activate a SIM on a burner phone and swap it into the pinephone, and Verizon shouldn’t care.
That would work. Integrating like CarPlay or Android Auto would be better, but Bluetooth or aux out works too, although it's a little primitive.
As long as I can do something like say "hey pinephone, navigate to 123 main street" and it does the obvious thing.
That can be helpful in hiring.
If you’re looking to hire someone for a technical role on some big hairy project that might not work or seems crazy, it’s helpful to select for people who won’t say “this idea is dumb”.
If you have two candidates equal in technical aspects, then a point of difference is in problem solving and dealing with the unexpected.
Possible answers, all valid, all telling you absolutely nothing:
* I'd change nothing, it works as a phone just fine. There are better things to work on
* I'd attach it to the side of a rocket so you could use it as a handle. I'd much rather ride a rocket than use the phone
* I'd double the speakers and mic and offset them at 45 degrees so two people could hug cheeks while talking to the same person
* I'd give it a display, add a computer, and have it pocket size and it would annoy you all day long with notifications and distract you from getting anything done. People would stop using it as a phone and mostly only write text messages to each other which would be way less efficient than voice. Oh wait, you said "improve". Yea, ok, bad suggestion
For example, possible takeaways from your answers:
* Lacks critical thinking, takes things for granted, doesn't show initiative or not comfortable with it.
* Elevated smartassness, possibly as a defense mechanism or due to inability to read between the lines. Maybe just nervous at the moment.
* Better. At least this solves some specific problem based on one's personal experience. It's possible to discuss that and see what else surfaces.
* Also good. Miraculous even if you'd managed to produce all that in the early 90s. Again, this also opens things up for further discussion.
See the point? It's a meta question.
On the assumption that not many people being interviewed by Microsoft in the 90s had an iPhone style plan or a desire to revolutionize cell phones, "leave it alone" is the most realistic, objectively useful answer a candidate might give. Sure, it demonstrates zero creativity, but it's totally unfair to say that person lacks critical thinking skills as their answer is the most rational.
To pass that question, you need to know the interviewer is looking for creativity and be able to make something up on the spot. That's obvious to many of us, but depending on the role I wouldn't fault someone for not being in tune to that question or ready for it.
The job interview gets debated endlessly, week after week, on HN. There isn't a single interview method or question that numerous people wouldn't consider unfair, irrelevant, broken, etc.
Eric was interviewing people for very high paying jobs. A decision has to be made about the candidate based on something. A person who wants such a job should go in eager and ready to convince the employer that they're a great fit for the job, and do what it takes.
And yeah, Eric's question was pretty obvious. It's open ended, and you can interpret it as you like, which is part of the point. But being a smartass and/or complaining about the question isn't likely to leave a favorable impression. A "faster horse" answer isn't likely to, either.
As an aside, IMO smartphones are not an improvement on phones. They are an improvement on PDAs. As phones they've utterly failed. People called each other (as in use a phone) way more before smartphones. So smartphones are not better "phones". They're worse phones and better something else.
Ironically, the two men arguably most responsible for the success of the Xbox were strongly opposed to the entire “Manhattan” project at the time.
Eric was instrumental in the DirectX runtime design that Microsoft ended up porting to the original Xbox hardware with shared high level components.
The other guy was Dave Cutler, who led the NT and hypervisor groups at the time. Xbox v1 borrowed heavily from Dave’s work on modular, stable-userspace NT. 360 was basically running a full-fledged NT iirc. Dave, like Eric, hated the Xbox project early on.
In today’s terms, it would be like Facebook deciding to make a search engine or mobile OS, so their reactions were hardly surprising. Word was that even BillG was hesitant, but Ballmer really wanted more product lines.
Eric will be dearly missed.
Remember that there's HN'ers who never worked at Microsoft :-)
At the time, Japan dominated video consoles, the last Western attempt being the disaster that was the 3DO. Thus Microsoft was going to "nuke the Japanese industry" and gain a foothold.
For further irony, the Xbox was in effect a spiritual successor to Sega's Dreamcast. That Microsoft went with such a distasteful name despite partnering with Sega highlights their cultural issues, issues which have continued to prevent Xbox from building relationships with Japanese studios.
It wasn't about consoles but gaming in general. Programming games for DOS required a lot of assembly knowledge (at least if you wanted something fluid), and deviation from industry norms only increased development cost (think of things like high-res support necessitating VESA code, to say nothing of sound support in DOS).
Devs hated dealing with all this extra work, extra compatibility, extra libraries (old gamers may recall MILES or DMX setup screens in some titles.) The libraries for Windows 3.1 were... not at all up to the task.
Thus, DirectX was created. Not so much to create a console competitor, as to show PCs as superior to consoles overall.
Which, when one thinks about it, may explain why Eric/etc were not big fans of the XBox; in some ways it conceded defeat on a level.
He actually comes across as a mix of interesting and unpleasant in this interview. It takes me back to the nineties, when he and St. John were routinely pissing off people, including myself.
But in the end people are complicated.
A few years back one of the systems I ended up inheriting stopped working on January 1st. As most people, I had taken the day off, but my manager called me to fix the issue in prod.
After much research, I found the assembly causing the issue. Similarly, the original dev had left and the source code was lost. So I used ildasm and found that the year was hardcoded. I patched the binary and went home.
Rest in Peace, Eric.
This seems surprisingly common, I hear it a lot from my customers.
Deliberate time bomb or honest mistake?
Consider, for example, the impact of anti-vaxxers' perspective that "chemicals are bad for the body".
When debugging a web app, you can debug the code (and add print statements / breakpoints). Or look at the HTTP requests it's making in the inspector. Or debug the server and look at what queries it's receiving and returning. Or experiment with different values in the database.
Last year I was trying to track down a memory leak in some wasm code. The tooling for that sort of thing in wasm was/is quite primitive, so I did some work to compile the module natively then ran valgrind on the resulting executable. That didn't work - the C code executed differently from the wasm and didn't exhibit the bug! So I added print statements randomly through the program and ran it both natively and through wasm, then diffed the text output to track down when the execution traces diverged. (It wasn't a compiler bug, fwiw.)
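The diff-the-traces trick above can be sketched in a few lines. This is a toy illustration, not the original code: the "traces" here are hypothetical log lines standing in for the print-statement output of the native and wasm runs, and we simply find the first index where the two runs diverge.

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

/* Hypothetical execution traces captured via print statements:
 * one from the native build, one from the wasm build. */
static const char *native_trace[] = {"alloc 0x10", "parse header", "alloc 0x20", "free 0x10"};
static const char *wasm_trace[]   = {"alloc 0x10", "parse header", "alloc 0x20", "alloc 0x30"};

/* Return the index of the first line where the traces differ, or -1 if
 * they match for the full compared length. */
static int first_divergence(const char **a, const char **b, int n) {
    for (int i = 0; i < n; i++)
        if (strcmp(a[i], b[i]) != 0)
            return i;
    return -1;
}
```

In practice you'd dump each trace to a file and run `diff` on the pair; the first hunk points straight at where execution diverged.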
I wanted more information about the memory leak. So I wrapped all the malloc/free calls with a replacement that stored all outstanding allocations in a list, with the stack trace when malloc was called. When the leak happened, I could ask for the stack trace of all outstanding allocated objects. That was fast and effective, and I added a memory leak check in my CI.
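A minimal sketch of that malloc/free wrapper idea: the original stored full stack traces, but here the `__FILE__`/`__LINE__` call site stands in for them, and a linked list tracks every outstanding allocation.

```c
#include <stdio.h>
#include <stdlib.h>

/* Record of one outstanding allocation. */
typedef struct alloc_rec {
    void *ptr;
    size_t size;
    const char *file;   /* call site (stand-in for a full stack trace) */
    int line;
    struct alloc_rec *next;
} alloc_rec;

static alloc_rec *live = NULL;  /* list of allocations not yet freed */

static void *tracked_malloc(size_t size, const char *file, int line) {
    void *p = malloc(size);
    if (!p) return NULL;
    alloc_rec *r = malloc(sizeof *r);
    r->ptr = p; r->size = size; r->file = file; r->line = line;
    r->next = live; live = r;
    return p;
}

static void tracked_free(void *p) {
    for (alloc_rec **cur = &live; *cur; cur = &(*cur)->next) {
        if ((*cur)->ptr == p) {
            alloc_rec *dead = *cur;
            *cur = dead->next;
            free(dead);
            free(p);
            return;
        }
    }
    fprintf(stderr, "free of untracked pointer %p\n", p);
}

/* Route all malloc/free calls through the trackers, capturing call sites. */
#define malloc(n) tracked_malloc((n), __FILE__, __LINE__)
#define free(p)   tracked_free(p)

/* Print every allocation that was never freed; return how many there are. */
static size_t report_leaks(void) {
    size_t n = 0;
    for (alloc_rec *r = live; r; r = r->next, n++)
        fprintf(stderr, "leak: %zu bytes from %s:%d\n", r->size, r->file, r->line);
    return n;
}
```

When the leak shows up, `report_leaks()` tells you exactly which call sites still hold memory, which is the same check you can wire into CI.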
At one of my first programming jobs we were using visual source safe (oh god, kill it with fire). I ended up losing some changes I'd made in a ~3000 line java class that we'd already deployed to production. I didn't want to do my work over again (or admit the mistake to my boss!). So I ran a decompiler against the class file we deployed to production - but it was huge and unreadable. To find the changes I'd made I took the current version of the java file, then compiled it and decompiled it again. So then I had two ugly files - but I could diff them, and that highlighted immediately the dozen or so changes I'd made. Then I just manually rewrote each change back into the original source file. Importantly, I had complete confidence that I didn't miss any of my changes.
It's not pretty, but the code looks like this if anyone's curious: https://github.com/josephg/mime-to-jmap/blob/2a7429f7b805cb0...
I've also used this technique to compare build logs when you have one version of a project that will compile and one that doesn't (e.g. when trying to integrate a second project into an existing one, or battling CI issues). It's especially useful if you aren't an expert in that particular toolchain.
Personally I use VS Code’s diff these days, since they added support for word wrap recently, it’s the easiest to read.
Did you have merge week too? Often stretching into a second?
In other tasks, consider different data structures or representations that fundamentally represent the same data, but in different ways that offer different guarantees. Representing a graph with a matrix may let you answer some questions very quickly, but it may make others harder to answer, for example.
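For instance, here is the same small directed graph in both representations. This is a toy sketch (edges and sizes are made up): the matrix answers "is there an edge u→v?" in O(1) but takes O(n) to enumerate neighbors, while the adjacency list enumerates neighbors in O(degree) but needs a scan to test for a single edge.

```c
#include <stdbool.h>

#define N 4  /* number of vertices in the toy graph */

/* Adjacency matrix: constant-time edge lookup. */
static bool matrix[N][N];

/* Adjacency list: per-vertex neighbor arrays with counts. */
static int adj[N][N];
static int degree[N];

static void add_edge(int u, int v) {
    matrix[u][v] = true;
    adj[u][degree[u]++] = v;
}

/* Build the example graph 0->1, 1->2, 2->0, 2->3. */
static void build(void) {
    add_edge(0, 1);
    add_edge(1, 2);
    add_edge(2, 0);
    add_edge(2, 3);
}
```

Same data, different guarantees: which structure you pick should follow from which question you need answered most often.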
Too often we simply reach for our trusty hammer, and it's not always an efficient or usable tool.
E.g., in school as kids we learn derivatives as rote routines to solve a type of problem, and we plug and chug, but we are not (necessarily) shown by government education the underlying theories that may create a solution. This opinion of course depends on how expert an instructor is...
The best anecdotal example I can give off the net offhand would be:
Thanks for your feedback
Also, in the USA we're taught "do your own homework," for example. In Asian cultures, students often band together to succeed in classes (I work in higher ed, and this is culturally visible among peers in classes). This is another example of this philosophy differing between cultures.
In contrast to eastern education which tends to greatly focus on memorization through repetition.. (i.e https://en.m.wikipedia.org/wiki/Rote_learning)
The quote you have included is based on Taoism. Is that common? I was under the impression that Confucianism has the more dominant influence on values and thought.
Anyways, at least that's one thing we can agree on - Taoism's philosophy encourages an open mind and awareness.
Wouldn't quite say that is true. People use a variety of techniques to get scored higher and there's always a certain percentage who will use such rote learning techniques, be it eastern or western.
I think what's missing in the discussion is the Western focus on individualism as compared to East Asian communitarianism. Both have their pros and cons and mutual dislikes.
There was a book by two high-level Western/Eastern women managers at Hewlett-Packard that highlights this dichotomy. It's a great read.
Again, that's because of a deep rooting in Confucianism. Also, "Communitarianism" is a western (Christian) philosophy. It may have been borrowed from Confucianism? I'm not sure. A lot of religions and philosophies share these kinds of values, so it's not unique to just one region / culture, IMO.
Anyhow, neither is better or worse, and probably a balance between communitarianism and individualism is best. All cultures can learn from each other.
Hm probably a better word would be collectivism
I went to a mix of 7 private and government schools between K through 12th grades in the US, and I don’t remember any of them not having group projects. Some, or most of the homework is individual, but there’s plenty of group work too.
There isn't anything special or insightful, mystical or different or "eastern" about cooperative learning. I'm thoroughly "western" and had plenty of homework groups in both undergrad and grad school. Nor is there anything special about the cooperative cheating I described: "certain groups" applied to plenty of western peer groups I saw in undergrad especially, though they didn't claim what they were doing was some "cultural" thing, as they knew it was simply cheating.
I would certainly add "looking at values vs looking at differences" and "looking at the generator vs looking at what's generated" (which the top level story is an example of).
It may take a little squinting, but we can see the common mathematical practice of reducing one problem to an instance of another as an example of this. There are also trickier math examples where recasting a problem as, say, topology makes it easy to solve.
@Jtsummers puts it as “Map the problem into other domains”; another interpretation (or a variation on the same) is “look at it from a different angle” or “look at it in a different light”. Do both -- move around with your light source, shine a torch/flashlight at your problem, and different shapes will be “projected” onto the wall behind it. Kind of like, if you are familiar with Hofstadter's _Gödel, Escher, Bach_ -- remember that “G,E,B” cube pictured on the cover of many editions? A more or less abstract “thing” or idea can be “projected” into different practical views depending on perspective. In actual practical work talking about “projections” can of course feel a little highfalutin’; to me it's a bit like always talking about your RDBMS tables or views as “relations”.
But in a more philosophical sense it is of course perfectly valid. And AFAICS it's an age-old idea in Western philosophy; what I described above is essentially precisely what is known as _Plato's Cave_. (Which incidentally is why @tenken’s identifying it as “Eastern” feels quite peculiar to me.)
Edit: non-native speaker. I meant to say "what an absolute unit!" to praise his skill. Be gentle to the ignorant, please.
I'm glad hn isn't like /r/programmerhumor (i.e. if I argue a point I don't want to explain what a pointer is) but the delusions of grandeur are in equal parts tragic and hilarious sometimes.
Of course, I mean the 16 bit opcodes. The SIMD ones, no freakin' way!
It's like the Wizard of Oz. I know most of the dialog, though I never made any attempt to memorize it. A friend of mine knows all the dialog for all the Star Trek TOS episodes. He just realized one day he knew it.
Using your tool, hah!
BTW, this anecdote is one of the reasons that I've strongly resisted adding macros to D.
I also purged my own C and C++ code of macro usage.
Was it your disassembler?