
This dude seems able to demonstrate "superhuman" abilities in scientific settings based solely on mental fortitude and control: http://www.icemanwimhof.com/innerfire

By the definition you gave above:

* Bombing of Dresden: Terrorism

* Nuking of Hiroshima: Terrorism

* America's Drone strikes: Terrorism

Context and motive are everything. You cannot boil it down to an easily digestible sound-bite to base your opinion on.

I'm not claiming to give a formal definition of terrorism, I'm just saying that if such a definition exists it should definitely include the Paris attacks, otherwise it would be meaningless.

Regardless of that, the events that you mentioned are massive, coordinated, state-level actions. I just can't accept labeling those as terrorism without the word losing all meaning. Yes, those events are horrible, but not every horrible thing that involves killing civilians is terrorism, even when it's brutal mass murder. There are other words too, like genocide, war crimes, massacre, assassination, etc.

Yes you can.

Hiroshima was in the midst of a full-blown state-vs-state war, in which the US was not the aggressor against Japan.

While it is true that the atomic bomb was hoped to be as useful psychologically as it was physically, there is no true comparison here, and this is arguing for the sake of making a point.

> there is no true comparison here

You're right, the Hiroshima attack in itself was worse in terms of the death and suffering of innocents than all Islamic terrorism of the past 20 years put together.

True. Ironically, it's generally understood by Americans that this was done to break Japan's will to continue fighting. It worked. Everything we do in war is to break the will of the enemy to continue.

Thus unless we expect one side to nobly surrender, we ought to expect the most extreme atrocities to occur in war.

When you think about it, it's silly to draw an arbitrary distinction of humane combat, since if we really wanted to be humane we'd have our leaders compete in chess boxing or something that would minimize the destruction of property and loss of life.

Don't forget shooting at redcoats from behind the cover of a tree during the revolutionary war.

Seems like a very long article based on very little substance and a lot of faux-pondering.

(Disclaimer: Yes, I bought an SC ship and want to see the game, and genre, succeed. I also regard anything Mr. Smart says with a dose of cynicism)


Yeah, at least the critique in the Escapist magazine had actual sources for its claims. Other outlets are just jumping on the bandwagon.


> If they think that being fully automated is the main reason why people are interested in them they are mistaken. [Citation Needed]


Sorry. I didn't keep detailed records of the comments I've been reading about Let's Encrypt over the last half year or so. :)

But I've noticed that most people were happy about it being free and having no restrictions, while the automatic part mainly just drew "how is that going to work?" and "will it screw up my server?" comments.


I think that is the techie HN echo chamber effect.

I have spoken to lots of people about the idea of Let's Encrypt (business people with small websites; yay, more anecdata!), and the "automatic" bit was very much a plus point.


Would you happen to know if these business people run their own web server or use a hosting service?


The ones I know of run their own servers for small things like WordPress blogs, small Magento e-commerce installations, etc.

To be brutally honest, they would be better off with a hosting service to take care of things since SSL is not the only issue they have and they lack full-time techies.


https://plus.google.com/+BensonLeung/posts/jGP5249NppF is his method (not sure if you've seen this). I am not sure whether you can emulate it with a multimeter.


Interesting that the Pixel can tell it's a bad cable. Makes me wonder if it will still charge from it.


"Russian Ships Near Data Cables Are Too Close for U.S. Comfort"

Because they might spot the US sub upgrading the taps on the cables.


> People don't get that the wolves are gone because we killed them, on purpose.

The implication there is that we had good reasons to kill them, which is very reasonable; but do those reasons still exist in the modern world, such that they would preclude reintroducing them?


The main conflict is the same as it ever was. Many farmers hate them because wolves kill livestock. Though, perhaps we can more easily afford the losses now.

I found The Economist's 2012 special on the reintroduction of the wolf to be quite informative: http://www.economist.com/news/christmas/21568656-after-mille...


This is one reason why lynx are often suggested as a first big predator to reintroduce. They keep to forests, while sheep graze on open ground. Sheep can be protected with fences (keeping them from spreading too thin), livestock guardian dogs, and active shepherding. Farmers just resent the effort.


This architecture, and the ferocity with which they implemented it (no service calls!), seems very alien, but they obviously think the benefits outweigh the problems.

Does anyone else do this?


I've built a couple of systems using this method in my early days of attempting to smash large systems down into smaller ones (around 2006-2011), explicitly using ESI, and then Varnish's ESI support, as the layer of orchestration and integration.

There used to be some real drawbacks to this, especially as Varnish wasn't really happy with potentially hundreds of ESI calls per HTTP request, yet that was what we were doing: a call to a JSON API that returned a collection of things would produce a document containing ESI includes pointing to each thing, and each thing could itself contain ESI includes to hydrate itself, and so on.
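As a rough illustration of that hydration pattern (a minimal sketch; the endpoint paths and function names are made up here, not from the actual system), the collection response contains no item data, only ESI stubs, and Varnish fetches each fragment with a separate backend request, provided ESI processing is enabled for the response (beresp.do_esi in VCL):

    # Minimal sketch of the ESI hydration pattern described above.
    # Endpoint paths (/thing/<id>) are illustrative only.

    def render_collection(thing_ids):
        # The collection endpoint returns only ESI stubs; Varnish expands
        # each <esi:include> via a separate backend fetch and can cache
        # every fragment independently.
        stubs = "\n".join(
            '<esi:include src="/thing/%d"/>' % thing_id
            for thing_id in thing_ids
        )
        return '<div class="things">\n%s\n</div>' % stubs

    def render_thing(thing_id, name):
        # The fragment endpoint that hydrates each stub. It could itself
        # contain further <esi:include> tags, nesting the pattern.
        return '<div class="thing" id="thing-%d">%s</div>' % (thing_id, name)

    print(render_collection([1, 2, 3]))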

One of the interesting bugs was that if you had a lot of ESI includes, only the first 10 or 20 succeeded and the rest did not. We raised bugs for this at the time (Varnish 2.1.4) and they were all resolved by Varnish 3.

Varnish is great at this now, and this approach can and does work, but it still requires discipline to avoid other pitfalls (circular references, how to handle errors if you are serving something other than text/html, how to keep from straining Varnish now that you're forcing it to orchestrate and manage connections for all subresources).

But yeah, it works.

However... I personally moved away from building this kind of thing as it really wasn't an elegant solution.

This approach made it more difficult to add things like tracing, soak testing of new endpoints, and many other things that really increase in importance after you've gone through the first iterations of many simple v1 services.


Nexus 5 : 137.9 x 69.2 x 8.6 mm (front face 9,542.68 sq mm)

Nexus 5X: 147.0 x 72.6 x 7.9 mm (front face 10,672.2 sq mm)

The front face will be larger. The only way it is smaller is the reduction in thickness.
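Checking that arithmetic (throwaway Python, dimensions as quoted above):

    # Front-face area = height x width, in sq mm.
    nexus_5  = 137.9 * 69.2   # 9,542.68 sq mm
    nexus_5x = 147.0 * 72.6   # 10,672.2 sq mm
    print(nexus_5, nexus_5x)  # the 5X face is ~12% larger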

[EDIT] My mistake, thought you were talking about the Nexus 5


I would argue that this is only the case if they do not follow Semantic Versioning, which any self-respecting Ruby gem should. (Ignoring the fact that SO MANY production gems still use 0.x.x versions...)


Semantic Versioning is not a magical cure-all. It's still up to the author to make the right choice about which number to bump.

We have a node/gulp-driven build with, ultimately, about 500 dependencies. We don't store node_modules with the code; we run npm install with each build. This leads to periodic failures because some sub-sub-sub-dependency bumped its patch number when it should have bumped its minor number, causing unwanted build-time behavior. Even if our package.json lists specific versions of top-level dependencies instead of ranges, this will happen. It's aggravating.

Projects exist to bundle up known-good node_modules and restore at build time to avoid this. NPM offers the concepts of shrinkwrapping and peer dependencies so you can tell consumers which specific version of each runtime dependency they have to use as well. IMO if SemVer actually worked in practice, none of this would be necessary.
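For what it's worth, a sketch of the shrinkwrap approach (the sub-dependency name below is made up for illustration): running `npm shrinkwrap` after a known-good `npm install` writes an npm-shrinkwrap.json that pins the entire tree, sub-dependencies included, so a later build can't silently pick up a bad patch release:

    {
      "name": "our-build",
      "version": "1.0.0",
      "dependencies": {
        "gulp": {
          "version": "3.9.0",
          "dependencies": {
            "some-sub-dep": {
              "version": "1.2.3"
            }
          }
        }
      }
    }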

Some of this is an indictment of node/npm - maybe Ruby is just better - but I still think SemVer is overrated.


