I wrote Finger and developed the supporting database to provide this
information in traditional human terms -- real names and places. Because
I preferred to talk face to face rather than through the computer or
telephone, I put in the feature that tells how long the terminal had been
idle, so that I could assess the likelihood that I would find them there
if I walked down the hall.
The program was an instant hit. Some people asked for the Plan file
feature so that they could explain their absence or how they could be
reached at odd times, so I added it. I found it interesting that this
feature evolved into a forum for social commentary and amusing […]
I wonder what the world would be like today if Unix-like systems had gone more mainstream. Maybe ISPs would include a Unix account on their servers where you could set up your .plan file and things like 'finger' and 'talk' and regular email would fill the space, in a completely distributed and open manner, that Facebook and other social networks currently do.
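For context, the finger service this comment alludes to is almost trivially simple at the wire level: a client connects to TCP port 79 and sends a username followed by CRLF, and the server replies with plain text (typically including the user's `.plan` file). A minimal client sketch in Python, with the host and username purely as placeholders:

```python
import socket

def finger(user: str, host: str, port: int = 79, timeout: float = 10.0) -> str:
    """Send a finger query: the username plus CRLF, then read the whole reply."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(user.encode("ascii") + b"\r\n")
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:  # server closes the connection when done
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")

# Hypothetical usage (example.org is a placeholder, not a real finger host):
# print(finger("joe", "example.org"))
```

The whole protocol is "send a line, read until close", which is part of why it was so easy for features like `.plan` files to grow around it.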
Around 20 years or so ago, many did. At least, the ISPs that I used back then certainly did. It wasn't extremely common but it wasn't all that unusual either.
I think what killed these services was the potential for abuse and spam. If you have your name and details attached to an e-mail account, it's super easy for marketers to scrape that, and then spam you if you meet certain criteria.
Centralised services (with all my distaste towards them) prevent that, because they block spammers, and limit access to such data quite nicely.
For a similar reason I think many people stick with Gmail: because it's centralised, it can use ML to filter out spam much better than any other mail service. It's also more secure; Google's budget for maintaining the security of its services is far higher than that of any other provider.
It may have been true that Gmail did a much better job than most other services when it first launched but I don't think it's been true for several years now.
Gmail by contrast is only a hosted/cloud solution, and comes with a security solution as part of the service.
I can compare Symantec Security Cloud sitting in front of an Exchange deployment to a Gmail deployment, and say that hands down Gmail is far and away better, with exactly zero phishing emails making it through, compared to the numerous attacks that got past Symantec Security Cloud to our Exchange users.
Comparing Exchange to Gmail is like comparing kiwis to radishes, in my view.
For me, GMail is worse than useless as a spam filter. I don't know if I have especially unusual email or what, but so much ends up in the spam bin that I need to wade through it all anyway, so it costs me time.
At least, it did. I've given up on it.
As I've said elsewhere, as an engineer at heart, a model-maker and problem solver, it's like sandpaper on the brain when people act in ways that mystify me, and for which I have no effective, working model.
I downvoted you because this type of commentary is the least useful thing you can read on HN.
Nobody learned anything useful from knowing that some guy with a GMail account isn't pleased with the spam filter. What was this supposed to achieve? Some long side-thread with people sharing anecdotes about whether or not GMail worked for them? This is the sort of thing that gets made fun of at n-gate.com.
> Please don't comment about the voting on comments. It never does any good, and it makes boring reading.
In contrast to your comment, I've always found the personal experiences of HN users to be incredibly valuable data points, which is why I shared my own. I'll re-evaluate that.
WRT the guidelines, I'm painfully aware of that one. While I appreciate its wisdom, it means that one can never learn from the otherwise anonymous downvotes, so to my mind it's worth taking the risk. In this case I really have learned something.
Again, thank you.
I didn't, because I'm already aware through personal experience that gmail's spam filter is much too strict, consigning legitimate emails to the spam folder despite the fact that any idiot could tell you they were real.
But in general, I don't see why one user's comment that the spam filter is terrible should be viewed as less "useful" than another user's comment that the spam filter is great.
Maybe it is, but no amount of people showing up in the thread asserting that it sucks will make for interesting reading. What would make them interesting is an educated guess about why it's bad.
I don't know of any other system that would then have allowed me to learn to read and write Cyrillic and, if we'd had the right hardware, even listen to a native speaker.
The core feature of a modern social network was invented something like 100,000 years ago. Doing this stuff on a computer is just a straightforward application. Email is mail. Chat/talk is talk. Finger is a bulletin board. All of those showed up within a few years of serious multi-user systems.
Just because something has a history of semantic value to humans doesn’t mean that translating it into new technologies isn’t novel.
The fact that there are real inventions worth celebrating doesn't mean we should celebrate every obvious use of those inventions.
Given that it could have become a variant of Facebook, much as MySpace could have, and Friendster before it, this shows there is a lot of real innovation in things that seem obvious only in hindsight.
I’d argue there are many, many more “obvious” ways to port existing social conventions to the digital domain that haven’t yet been done.
"This new protocol allows anyone in the world to transfer information semantically and graphically with anyone else in the world"
"Big deal... that's just another form of vocalization that animals have been doing for millions of years!"
With the decline of traditional multiuser systems, it might not be common knowledge anymore that e.g. ~joe is an alias for joe's home directory, so prefixing usernames with ~ was already an established pattern. This is still seen in URLs, but is somewhat rare now.
And of course "cd ~", and "cd ~user1" are still everyday commands on pretty much every computer
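The same tilde-expansion convention is mirrored in most languages' standard libraries; a small Python sketch using `os.path.expanduser` (the `/tmp/demo-home` path is made up for the demo; on POSIX systems `expanduser` consults the `HOME` environment variable for a bare `~`, and the passwd database for `~user`):

```python
import os
import os.path

# "~" means the current user's home directory; "~name" means that
# user's home directory. Set HOME to a made-up path for a repeatable demo.
os.environ["HOME"] = "/tmp/demo-home"

print(os.path.expanduser("~"))        # -> /tmp/demo-home
print(os.path.expanduser("~/notes"))  # -> /tmp/demo-home/notes
print(os.path.expanduser("no/tilde")) # unchanged: no leading ~
```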
There were all sorts of fun things you could finger around the 'net. My favorite was a VAX at McMurdo Station in Antarctica. The latency was several seconds. I posted about this years later on Slashdot, and a former sysadmin chided me, saying that all of the finger requests were chewing up their minuscule satellite-provided bandwidth.
Bus error - core dumped
For example: http://www.ram.org/computing/plan/plan.html
Robert T. Morris is a YC partner now, I understand.
> Robert Morris is a professor of computer science at MIT... In 1988 his discovery of buffer overflow [sic] first brought the Internet to the attention of the general public.
“Thank you for applying for a job here. We took a look at your code examples and don’t think you have enough experience with algorithms to meet the high bar here at $BIGCO.”
On one hand, some things feel as high-level as C, such as the many procs starting with "proc oops". On the other hand, things like "proc netloc" seem to have a chunk of assembly embedded in the middle. The use of symbols in the code (e.g., "←", which seems to be the assignment operator) is also interesting.
Even though I don't fully understand it, I find old code like this absolutely fascinating. I also feel the same way about old electronics/electrical manuals.
Looks like they had their own character set
Fun story: VLSI Technology (a chip maker, later absorbed by Philips in 1999) wrote a bunch of its internal CAD systems in SAIL. It was stuff like routing, layout, and simulation. They had a group maintaining the SAIL tools as well.
Oh they were also big into ClearCase, and had a whole team administering a policy layer for that. Not optimal.
I wonder if that was a regular problem for him?
HN discussion is here.
[…] and it will stream the job logs to you. Most of the code in […]
I had just closed twitter when I read that. Indeed, the times are not gentle on the internet.