
I'll be honest, I'm kind of enjoying 'Aviation Week' here on HN. This is the kind of stuff that got me interested in math and science to begin with. Too bad my vision is so awful.



I think it's more like War Machinery Porn Week, really. Never really felt fully comfortable about this stuff myself. But for whatever burred tooth in the cogs of fate got us here, we'd be reading of (and marvelling at) the derring-do of people history has taught us to despise.

Nevertheless, here we are. And here, most importantly, at least from my perspective, I am. And - or perhaps, "and so" - I enjoyed this one.


It's really important to separate the tools from the intent IMHO.

Just as one can use a butcher knife for its intended purpose or use it to kill, it's possible to design military equipment to "kill people and break things" without necessarily meaning to wage aggressive warfare.

It's true that military tools are often misused, but that wouldn't change in the real world simply by not having them; pacifist countries are sheep in a field full of wolves.

Just look at Ukraine, and then compare to nations that, e.g., gave up nuclear weapons and then suffered "regime change". The list includes Libya, parts of Ukraine, and effectively Iraq (which was very close to a nuke circa 1991). There's a reason Iran and North Korea want the bomb, and that reason is that the deterrent value is real, not imagined.

Rather, it's the same as an underlying principle behind the Second Amendment push (no one cares more about my defense than I do), scaled up to the geopolitical level.

So just as I think it's possible to appreciate the craftsmanship and design that goes into a well-made katana even if you don't intend to run the sword through someone's guts, I think it's possible to appreciate at a technical level some of the technology used in military gear without feeling like it means you support war. ;)


One could also argue that the SR-71 was a tool that prevented death rather than facilitated it -- by delivering intelligence. It was a reconnaissance aircraft and despite being operated by the Air Force the payload of the Blackbird was never weaponized.


An interceptor variant reached the prototype stage. At one point, quite a few were on order.

     http://en.wikipedia.org/wiki/Lockheed_YF-12


It's really important to separate the tools from the intent IMHO.

First thing I'll say: I'm not sure whether I've reached a conclusion regarding that statement.

But I've been leaning rather more strongly toward the idea that things can be classified as beneficial vs. otherwise, though sometimes counterintuitively.

One of the more interesting hypotheticals I've run across recently is the Paperclip Maximizer example: http://wiki.lesswrong.com/wiki/Paperclip_maximizer

The paperclip maximizer is the canonical thought experiment showing how an artificial general intelligence, even one designed competently and without malice, could ultimately destroy humanity. The thought experiment assumes an AI with a stable structure of goals or values, and shows that AIs with apparently innocuous values could pose an existential threat.

The idea: any simple-minded optimization behavior which doesn't take human values into consideration can, taken to its logical extreme, prove hazardous.
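To make that concrete, here's a toy sketch (my own, not from the linked wiki; every name and number is hypothetical) of the difference between an objective with no term for human values and one with an explicit constraint:

    # Toy illustration of "simple-minded" optimization: the naive version
    # maximizes a single metric (paperclips) and treats everything else,
    # including resources humans care about, as free raw material.

    def naive_maximizer(resources: float, clips_per_unit: float) -> float:
        """Convert every available unit of resources into paperclips."""
        return resources * clips_per_unit  # no term for human values at all

    def constrained_maximizer(resources: float, clips_per_unit: float,
                              reserved_for_humans: float) -> float:
        """Same objective, with an explicit carve-out for human values."""
        usable = max(0.0, resources - reserved_for_humans)
        return usable * clips_per_unit

    # The naive optimizer happily consumes 100% of available resources.
    # The hazard is not malice, just an objective with nothing else in it.
    print(naive_maximizer(1.0, 1e6))              # 1000000.0
    print(constrained_maximizer(1.0, 1e6, 0.99))  # ~10000.0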

It was posted to HN a few years ago, though it didn't trigger much discussion at the time:

https://news.ycombinator.com/item?id=1747413


It's a great thought experiment. I think I even read it when it came across HN. But it seems to me to be an argument about why we keep humans "in the loop", as it were, and less about what tools you give the humans.

It does open some questions about how "innovative" one might want to be when developing a weapon, even for defensive uses.

It's sort of unfortunate that nuclear physics made the gains it did, when it did. A big part of the reason the U.S. ended up making the big push for the bomb, at a time when it needed all the resources it could get put into things like logistics for shipping materiel, was the fear that Germany might get it first. I.e., "if someone is going to get the bomb, we'd better do it before they do".

I suppose the Cold War would have ensured proliferation one way or another, but WWII certainly did not help the cause of non-proliferation.

But either way, war or defense (whatever you call it) can never be a simple-minded optimized anything. It is almost the very highest level of human holistic competition. So I'm not worried about the technology (as long as we don't make it self-aware, of course), I'm worried about the people.


Especially considering that human beings can, very easily, become totally divorced from healthy, "normal" human values.

It's not enough to merely keep humans in the picture, but to keep healthy, undamaged humans in the picture.

Let's say, perhaps, that after a particularly costly war, the only humans left are a mixture of mentally unstable, angry, ambitious, victory-driven amputees, with intense biases imbued upon them by surviving particularly horrific and violent combat. These people, in a warped attempt to say "never again", optimize an artificially intelligent, fully automated child-rearing Skinner box [1] to mold children into their own image, as the natural and perfect outcome which produces a society averse to violent warfare. The result is that every child that emerges from the Skinner box is an angry, warped sociopath, missing limbs, who rationalizes even trivial behavior with an arbitrary moral high ground of extreme polar ideology.

But wait... aren't humans... technically classifiable as self-assembling intelligent constructs, spewed forth from the bald nothingness of space and time by mere coincidence? What if WE are the beast we fear?

Oh... oh god.

[1] https://en.wikipedia.org/wiki/B._F._Skinner


Organizational behavior and group decision-making are among the more fascinating fields I've encountered.


it seems to me to be an argument about why we keep humans "in the loop"

That's helpful but not sufficient.

• It doesn't address the criticality of appropriate feedback controls and limits. People and algorithms both make bad decisions.

• A given group of humans might not act in the best interests of all humans, or even a specified larger group, or even themselves.

Re: how innovative we want to be even with defensive weapons. Body armor is generally far less dangerous than an RPG or assault rifle. But defensive technologies such as antibiotics can, if misused, lead to larger downstream threats (through antibiotic resistance). The areas of unintended consequences, moral and morale hazard, and the like, make for fascinating study.

Re: the bomb. Yes, the Germans and Japanese were both conducting nuclear research (though I believe the Japanese project was limited and/or curtailed). Another interesting speculation I've seen is of what might have happened had the US not used the atomic bomb on Hiroshima and Nagasaki: it's possible that the true horrors of the weapon wouldn't have been realized and that the next military action (the Korean War) might have gone nuclear.

The places where I've been spending time are a bit more nuanced, though: sustainability in light of finite resources and/or maximum flows. Is saving lives really an unalloyed good? What of technology in general, which might increase technological risks (if only of failure)? Was the Green Revolution a good thing? Humans have been something of a paperclip maximizer, except that our paperclips are humans. In the long run even that may not work out for us. Systems need negative feedback loops.
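On that last point, here's a minimal sketch of what a negative feedback loop buys you, assuming nothing beyond textbook growth models (the rates and capacity are arbitrary): the same growth rate either explodes without bound or settles at a carrying capacity, depending on whether a damping term is present.

    # Positive feedback only: dx = r * x, which grows without bound.
    def step_unbounded(x: float, r: float = 0.1) -> float:
        return x + r * x

    # Same growth plus negative feedback: dx = r * x * (1 - x / K).
    # The (1 - x / K) term damps growth as x approaches the capacity K.
    def step_logistic(x: float, r: float = 0.1, K: float = 1000.0) -> float:
        return x + r * x * (1.0 - x / K)

    a = b = 10.0
    for _ in range(200):
        a, b = step_unbounded(a), step_logistic(b)
    print(round(a), round(b))  # a has exploded (~1.9e9); b sits near K = 1000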


That's a good point. People on HN are sometimes a little quick to put on their "it's just a really cool piece of technology!" blinders and ignore the gruesome, wasteful military industrial complex that brought them these "badass fuckin' guns" which do, in fact, kill real humans in real life.


Nobody is really ignoring the reality of it. These are weapons of war. Not everyone reading HN shares the same outlook on the morality of war. I am personally an anti-war veteran, but that's a personal outlook that comes from some careful deliberation after observing how and why we wage war, and being fairly dissatisfied with both.


For whatever it is worth, while obviously a part of the military-industrial complex, the SR-71 was built to spy, not kill.

The YF-12, however, was built to kill.


The beauty of technology is its ability to overcome obstacles and act as a solution to a problem. I do not find that the morality of the problem can tarnish it.


Really? Not ever? I hate to go all Godwin on this, but it's the most extreme counterpoint I can think of. Gas chambers...


Need to exterminate roaches quickly? Gas chamber your house. Save yourself from numerous diseases... I mean c'mon, you couldn't think of any legit use of poison gas technology?


The Internet exists ONLY because of War Machinery.

IMO, the whole electronics industry (TV, computers, chips) advanced as much as it did due to the War Machinery.


Maybe the NSA is trying to give us a smooth transition back into Cold War mode with the Russians.


I will always regret not becoming a military pilot when I was young enough. When I was a kid, nobody ever told me how to pursue that career and I didn't have a clue. In my 30s, I became an Army officer and had a great time but always wish I could have gone the flight route.

My son is almost 2 and you can bet that I'll tell him what he needs to do to achieve this. [1]

[1] - To be a USAF pilot, your best bet is to bust your ass in high school, be both a student and an athlete, and secure one of the coveted congressional nominations to the U.S. Air Force Academy, where you must bust your ass and hope for a flight slot. It's a similar route for the Navy and USMC: the U.S. Naval Academy.

The process to become a U.S. Army helicopter or fixed-wing pilot is a lot easier. Pursue a slot as an Aviation warrant officer. Successfully complete basic training, WOCS, and flight school. You don't need a college degree but it helps. The key is age: the doors close at around 32, so you have to start early.


Currently, the US Air Force is shifting its focus away from pilot culture and toward intelligence. In other words, you'll start seeing more colonels and generals with an intelligence background instead of a pilot background. There's a running joke in the USAF that intelligence drops the bombs and pilots are just bus drivers.


I don't think it has to be that hard...I bought a small plane from a Navy pilot (an instructor for F-18s at the time). I asked him how he got the job, and he said he joined on a whim, after being recruited at the mall. The deal he got when he signed up was he'd get to be a pilot as long as he didn't wash out of the program.

(I also spent an evening at the bar w/ a class of his students. I was expecting super gung-ho jock types, but they were actually very laid back and a little bit dorky. Interesting experience for a boy raised by hippies in the Oregon woods.)


At least you don't have to be born with perfect vision like in the old days; with laser eye surgery, now anyone can be a pilot!


Don't forget: get pilot's license at 17, and for at least senior year of high school, attend a military academy prep school.


What comes across particularly well to me in this piece has nothing to do with aviation, but with job satisfaction. This guy got to do an amazing job, one he had dreamed of since childhood. It was intense and took astonishing levels of focus, effort, and a dedication to be the best he could. His reward for this was the flying, the very job itself. Best of all, when it was over he was ready for it to end. I hope I can look back on my career so beautifully.


Little 'diversions' like this one are great and make HN more fun to visit, without depleting its overall quality of content.


If I didn't wear glasses, I would have joined the Air Force.



