
The very first line of the article states that this is a retrospective of the August '23 incident, hence the downvotes.


I don't know about this instance in particular, but the vertical scale in similar maps is often exaggerated to make it easier to differentiate the different floors.

At the cost of distorting elements with a vertical dimension, it means that all the wireframe layouts don't end up overlaying each other.
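A minimal sketch of the technique (my own toy example, assuming matplotlib; the floor outline, storey height, and exaggeration factor are all made up):

    import matplotlib.pyplot as plt

    # Toy rectangular floor plate, repeated for each storey.
    xs = [0, 10, 10, 0, 0]
    ys = [0, 0, 6, 6, 0]

    FLOOR_HEIGHT = 3.0   # metres, a plausible true storey height (assumed)
    EXAGGERATION = 4.0   # stretch the vertical axis so wireframes separate

    ax = plt.figure().add_subplot(projection="3d")
    for storey in range(5):
        z = storey * FLOOR_HEIGHT * EXAGGERATION
        ax.plot(xs, ys, zs=z)   # same outline, exaggerated vertical spacing
    plt.show()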


Forgotten not in the sense of lost knowledge, but more that the individual is not known in proportion to the importance of his work, or at least not as consistently as his peers.

While specialists in his field know his work and his name (though not even everyone in software does), the public do not.

While your parents and friends see the dramatised exploits of Turing in films like The Imitation Game, or his face on the currency, the same cannot be said for Church.

Every field has its public heroes, usually related to the stature of their work - Fleming and Jenner, Marconi, Ford, Bell. Turing.

Anyone will at least recognise these names, but not so for Church.


Ok, and if I asked a random member of the public for the name of a mathematician (excepting Turing, for clarity) what name do you think they would come up with? Pythagoras? Euler? Erdős?

I think the reality is that only a very small number of scientists, mathematicians, and similar are household names.


People's conception of Turing is massively skewed by the ending. His persona is now defined by his sexuality and treatment more than his contributions to maths and computer science. Andrew Hodges's book is great. I had the good fortune to go to his author tour the year it came out; he was doing the compsci departments of the UK and it was a really nice seminar.

Benedict Cumbersome was a good actor, but it's important to remember that van Gogh didn't actually look like Kirk Douglas.


I like that you called Benedict 'Cumbersome'. He is indeed cumbersome in so many roles.


I read a review in the Guardian [1] which used a series of malapropisms for his name, and riffed on the concept.

[1] https://www.theguardian.com/tv-and-radio/article/2024/may/29...


Also, his contribution to the war might be one of the biggest reasons he is so well known.


Right - For every Newton, who (rightly) gets credit for his immense contributions, there are people like Euler, who are relatively unknown outside the field in spite of significant contributions[1].

[1] Massive in the case of Euler obviously.


Agreed - what I tried to highlight through a series about people who have contributed significantly but are not so well known outside the fields they impacted is that there is always cooperation to a large extent and others involved - rarely a lone individual, as the Turing movie and much of the press in the UK like to portray. And many people who should be known get lost in the complexity of history. It's worth bringing this to the attention of a wider audience - people are genuinely inquisitive. Plus, as another example: on an intro course to AI, 45 out of 52 bachelor students had never heard of Church!


Isn't that just because they haven't made a blockbuster feature film about him yet?


No free software, no support. You don't have to merge it upstream right away, but publish it for others to study and use as permitted by the license.


It's not so funny, because this article was written at the time freenode was burning to the ground, and the article suggests that it would be very easy for everyone to move to Libera if it turned out to be the better option.

And that it worked so well is the point of the article!

IRC is highly resilient because it's simple to set up, the clients and servers are free software and can be endlessly and independently configured to talk to each other, and it's light on resources - server, client, and bandwidth alike.

I'm on terrible, legacy satellite internet (i.e. not Starlink, with 700ms-1.2s round-trip times), and IRC is the only chat system that works fine.
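As a taste of how simple the protocol is, here's a minimal client in raw socket code (a sketch only; the network and nick are placeholders, and real use should prefer TLS on port 6697):

    import socket

    HOST, PORT = "irc.libera.chat", 6667   # placeholder network, plaintext port
    NICK = "hn_demo_93412"                 # placeholder nick

    s = socket.create_connection((HOST, PORT))
    s.sendall(f"NICK {NICK}\r\nUSER {NICK} 0 * :demo\r\n".encode())

    buf = b""
    while True:
        buf += s.recv(4096)
        while b"\r\n" in buf:
            line, buf = buf.split(b"\r\n", 1)
            if line.startswith(b"PING"):   # answering PING is the only hard requirement
                s.sendall(b"PONG" + line[4:] + b"\r\n")
            print(line.decode(errors="replace"))

Line-oriented text over TCP - which is also why high latency doesn't bother it.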


IRC is so lightweight that a single-core, sub-500MHz system can host 10,000 or more connections, and did back in the day.
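That's plausible because an ircd is mostly an event loop: idle connections cost only a file descriptor and a little memory, and the CPU wakes only for sockets with data. A sketch of the pattern (Python's selectors module standing in for the select()/poll() loops old ircds used; port and buffer size arbitrary):

    import selectors
    import socket

    sel = selectors.DefaultSelector()
    srv = socket.socket()
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("", 6667))
    srv.listen()
    srv.setblocking(False)
    sel.register(srv, selectors.EVENT_READ)

    while True:
        for key, _ in sel.select():        # sleeps until some socket is ready
            if key.fileobj is srv:
                conn, _ = srv.accept()     # new client: just register it
                conn.setblocking(False)
                sel.register(conn, selectors.EVENT_READ)
            else:
                data = key.fileobj.recv(4096)
                if data:
                    pass                   # a real ircd would parse and route here
                else:
                    sel.unregister(key.fileobj)   # disconnect: clean up
                    key.fileobj.close()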


Hah, the general issue with IRC was (even without DDoS) the bandwidth.

I went to Monash University in Australia, starting in 1994. We had an EFnet server (the biggest, or one of the biggest). I was just a student so I don't know the specifics, but it was only allowed to run from 6pm to 6am because otherwise it used most of the University's bandwidth, if not the entire country's:

At the time, Australia's internet capacity overseas was a SINGLE 1.5Mbit link. Telstra did buy a 45Mbit link in late 1995, and then another in early 1996.

Still, that's insane, the country had less than 100Mbps.

I remember working for a web design company in the mid-late 90s and we had servers at a datacenter in Melbourne. I remember downloading Netscape Navigator (1.1n!) and being blown away by how fast it downloaded, and then realizing for the duration of the download I was using something in the order of 5% of the country's international bandwidth.

Reference: https://en.wikipedia.org/wiki/Internet_in_Australia


In the fall of 96 or 97? I was getting some freshmen set up with computers. I was on the football team, so we were on campus (WUSTL) 7 days before school started and almost no one was around. We were trying to download Navigator from Tucows (I think) and I remembered wuarchive still existed and was on campus. We had 100 meg in the buildings and a fiber backbone. We pulled NN at 5.5Mb/s... it finished so fast we downloaded it a second time to make sure. And I said "We won't see speeds like that for 7-10 years." I was pretty correct. That's when I realized that live video would be doable, but not for a while.


> Still, that's insane, the country had less than 100Mbps.

It's easy to take for granted just how much Fucking God Damn Technology we have mundane access to today.

That 14700K or 7800X3D CPU? That's more powerful than entire national supercomputers back in the day.

A 64 gigabyte stick of DDR5 RAM? Bill Gates once said 640 kilobytes is enough for everyone. And you probably needed multiple sticks.

A 20TB hard drive? With helium? Do you understand the bandwidth of a station wagon full of hard drives barreling down the freeway in the 1980s?! (Numbers below.) Wait, nobody probably even knows what a station wagon is anymore...

1gbit/s internet? 10gbit/s ethernet? We sang the melody of dial-up modems conducting international diplomacy better than any politicians.

360Hz liquid crystal display monitor with billions of colors and millions of pixels? We used to completely evacuate the air out of glass or ceramic tubes and fire fucking lasers with them to show monochrome pictures in neon orange or green.

Modern, minimalist user interfaces? To hell with them, Windows 95 was the pinnacle of human engineered interface design.

...Wait, did we actually devolve? Maybe it was a mistake to make sand think after all.
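For the station wagon quip, the arithmetic still checks out (every number below is an assumption except the 20TB drive mentioned above):

    # "Never underestimate the bandwidth of a station wagon full of tapes
    # hurtling down the highway." - Tanenbaum
    drives = 500          # assumed: a wagon crammed with 3.5" drives
    tb_per_drive = 20     # the 20TB helium drive from above
    trip_hours = 3        # assumed freeway trip

    bits = drives * tb_per_drive * 1e12 * 8
    print(f"{bits / (trip_hours * 3600) / 1e12:.1f} Tbps")   # ~7.4 Tbps

Terrible latency, of course.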


> Bill Gates once said 640 kilobytes is enough for everyone.

Gates has always denied saying this, and no one has ever produced the original quote. It's more like something IBM would have said about the PC; they're the ones who created that limit.


Not lasers, but something even better. Randall Munroe once described CRTs as "desktop particle accelerators".


And here we are working on systems with dual 400Gbps and sending data out of storage clusters at 5 TiB/s.


> Still, that's insane, the country had less than 100Mbps.

It's not 'insane'. Bandwidth goes a long way when you aren't downloading 100Mb JS frameworks and watching cat videos all day.


Remember the Twisted.Dal.Net server? :) Its admin did some magickery on his server and hosted 50k local clients!! :) And that was in 2001 or so... That's crazy :)


Oops, when I saw how old the article was I just skimmed it.

But yeah it worked out really well with Libera.


It's a surprise, I'm a UK citizen living in a UK dependency and will shortly be returning to Great Britain, and I can't access it.

Huh!


Same in the OT of BMU.


You can do it in a virtual machine, and people frequently do. The software can neither detect nor escape it.

Unrelated (seriously), there's also OARC, the Online Amateur Radio Club. It's on Discord (boo, proprietary), but it's got some of the most exciting projects and a really young crowd. I'd highly recommend it.

https://www.oarc.uk/


Yes - the CFAA, as a criminal statute, requires "mens rea", the intention to commit a crime. Very few laws (involuntary manslaughter is one exception) lack this requirement.

If you break someone's shit by accident, they can still sue you, and people should probably sue HP, but you can't try to have a prosecutor bring them to a criminal trial under the CFAA.


The problem is that not all the secrets are so easily extracted - sometimes the design/software is the secret.

If you published the full design of a nuclear weapon but kept the launch codes secret, it's great that no one can fire ours, but people could implement the design on their own with different codes.

To use a more realistic example, consider air defence missile systems used to shoot down incoming missiles and drones. The secrets here aren't keys, they're software-driven behaviours: how radar tracks are classified as hostile or noise, when to commit a missile and how to target it. The code also implicitly contains the conditions under which the radar tracking is less effective and more evadable, and how the system tries to mitigate this, etc.

When you take away all the secret behaviours, you quickly end up with not much more than the drivers connecting the hardware to the logic, which isn't a lot of code and isn't what's driving the funding. How you set a missile tube's tilt and direction is trivial stuff, mostly reused from old platforms that were funded decades ago.

Additionally, there is some talk about how this is kind of like security through obscurity. When it comes to things like weapons and other high-tech capabilities, security isn't only security against being owned, it's security against your opponent closing the technological gap. Unfair wars, where you have a significant tech lead on your opponent, mean your population bleeds less.


> sometimes the design/software is the secret

Decreasingly the case, and if it still is, we might be at the point where I can say "poorly architected". A random 19-year-old developer in the military with access to the secrets would be a big security hole too.

That's what we should be comparing with regards to open-source vs not open-source - in either case access to weaponry would of course be heavily gated.

Some of the other arguments you make here are no different than conventional arguments against open-source:

> the real code would still be private

Precisely the point: get the crowd to optimize the low-risk parts, build communities around those libraries and frameworks, and recruit from them

> opponents might catch up

To remain an industry leader it's actually better to get everyone playing your game rather than trying to compete and stay ahead in a wild west scenario. You want to capture it really, so you can control it and be the leader in it. Open-source is one great way to do that (browser vendors come to mind).


You can say "poorly architected" all you want, but it's true. Military capability is frequently determined by software-implemented behaviours, not data you can plug into a generic public framework.

>19 year old developer with access would be a big security hole

It's true, they are. That's why militaries and defence companies go to great lengths to vet their staff and why even within vetted staff, sensitive material is compartmentalised to minimise the risk from any given individual. Even despite that, military secrets are still leaked on an all too regular basis.


> That's why militaries and defence companies go to great lengths to vet their staff

What a joke, no they don't. They establish security internally by gating access, not trusting everyone because they've been "pre-vetted".


Gating access is compartmentalisation. If you're being brought onto, say, missile development, you absolutely will have to submit to both vetting (knowing who you are prior to access) and compartmentalisation (permitting access only to your relevant secrets throughout).

I'm not saying that just because you have some kind of clearance you will get access to everything, but it's part of the preconditions to your own relevant access.


Yes, those security clearances are the real gates, not anything in your JavaScript codebase, and that's the point - there's already clearance in place within the military, there should be nothing in a codebase that can bypass a security clearance requirement anyway.

A very secure codebase is designed so that all the sensitive parts are separated from the parts general users (and developers) have access to - it shouldn't all be mixed together such that sensitive parts about missiles are exposed to login APIs etc., as you seem to be saying.
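A toy sketch of that separation (every name here is hypothetical): the open tree ships only a stable interface, and the sensitive behaviour lives in a separately controlled module that is installed solely on cleared systems and never appears in the public repo.

    from abc import ABC, abstractmethod
    import importlib

    # Open-source side: interface only, no sensitive logic.
    class TrackClassifier(ABC):
        @abstractmethod
        def is_hostile(self, track: dict) -> bool: ...

    def load_classifier(module_name: str) -> TrackClassifier:
        # The classified implementation (e.g. a hypothetical "classified_impl"
        # package) exists only on vetted systems; importing it elsewhere fails.
        mod = importlib.import_module(module_name)
        return mod.Classifier()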

It may even be lower risk to open-source than not, as the public is more likely to find and fix actual security quirks that a private contractor might miss (or might even be paid, as a spy, to purposely leave vulnerable).

There's also the community/recruitment aspect. AI/LLM companies are cleverly open-sourcing major parts of their work while keeping the only important part that makes them valuable private - it's a win-win as they keep their secrets yet provide for and stimulate a developer community.


I don't think that's quite how it worked.

The colour was created by different phosphors on the inside of the screen; there was nothing different about the electron beams. The number and pitch of these phosphor dots determined the resolution of the screen.

The shadow masking was to prevent the beam for, say, the red phosphors from sweeping over blue/green subpixels when moving from one pixel to another, since it wasn't practical to turn the beam off and then back on once the steering coils had been changed. Steering was continuous, so without shadow masks you would illuminate the neighbouring subpixels you pass on the way.

You could have done it all with a single beam, if you really wanted, but it's not very practical - you'd need to sweep more slowly since you could only illuminate one subpixel at a time, and it'd take much longer to steer, illuminate at the right level, and move on to the next subpixel with the right beam power selected.


> I don't think that's quite how it worked.

I think he described exactly what you did (in fewer words) in the second half of his post ("Your typical color TV ...").

The first half discusses that some projection screens had 3 different tubes. ("Very few broadcast TVs ...")

Either that or he edited his post.


My objection is:

> Your typical color TV effectively created "pixels" by their use of shadow masks and three separate beams.

It's true that almost all colour TVs are pixelated and have shadow masks and three separate beams, but it gets the cause and consequence the wrong way around.

The pixels are created by the phosphor dots, not the mask/tube design. It's possible to use this design without three tubes or the screen, strictly, but the design suggests the optional addition of a shadow mask and additional tubes as a natural, later progression for improved performance.


> It's possible to use this design without three tubes or the screen, strictly,

Doing it without a shadow mask or an aperture grille seems pretty dang hard. Even ignoring the alignment concerns, it requires a much, much higher bandwidth.

The shadow mask and aperture grille were the only approaches that ever really worked for color CRTs.

Phosphors and some kind of masking is how you make different colors on a single tube. Indeed, in most manufacturing processes the shadow mask or aperture grille was offset 3 times and used to pattern the phosphor coatings.

> the optional addition of a shadow mask and additional tubes as a natural, later progression for improved performance.

There's just one tube - multiple guns/beams, though.

