Tell HN: Sci-Hub's TLS certificate has started failing
244 points by AlbertoGP 4 months ago | 145 comments
Sci-Hub started failing earlier today with a SEC_ERROR_REVOKED_CERTIFICATE error, which affects all alternative Web addresses. Don't know about .onion.

Is this some kind of planned maintenance, or a sign of further trouble?




This type of thing is my number one objection to Certificate Authorities.

In fact, it's my objection to computational illiteracy being considered acceptable among users in general. Devs and agencies cannot be trusted not to screw with things. If the average Joe cannot understand what is going on behind the curtain, they aren't free.

Freedom is a scary thing to many groups, and unfortunately, more and more we are seeing the pendulum swing further and further away from the Internet's original intent: to facilitate the fast and open communication of information. I want to say free and open, but unfortunately I have trouble being able to maintain that level of idealism anymore.


I don't know how the US Postal Service works, but I still can receive and send sensitive personal information through the US Mail without any real concern that it will fall into the wrong hands. For a variety of reasons, digital communications have never felt as secure. I think a lot of that is due to the lack of legal consequences in the digital realm relative to those in the physical realm (i.e. snail mail.)


You are on the right track. Snail mail, letters, and correspondence were first-class citizens in terms of protection by the Fourth Amendment. The postal service (at least as far as I am aware) is one of the few third parties you can share that type of sensitive information with and have it, by the letter of the law, still be protected by an expectation of privacy.

(If there is a lawyer in the house who could let us know otherwise, that would be awesome).

The Supreme Court, however, has been reluctant to apply any interpretive oomph to the idea that electronic message sending represents the same type of "private correspondence" that a snail mail letter represents.

In fact, if anything it has generally leaned in the opposite direction. The Supreme Court has ruled that sharing personal or private information with a third party nullifies your expectation of privacy and the due-process protection of your information. I believe this began in the early days of telephony, with what became the third-party doctrine.

"But that is only from the Government's point of view!", I hear from the gallery...

Yes... Unfortunately it does set a societal precedent through the institution, however.

I mean, if as an arbitrary business, I can hand your information to the government and they can make use of it, then surely I am free to share data I have about you with other people/businesses?

Once that becomes legally acceptable, and people are willing to pay me for as much info as I can give, suddenly the economic incentive is to collect and sell as much information as possible. Note that this isn't a tech problem, but a social one.

As long as we don't take a stand by making law to cover the issues of electronic activity being considered "protected correspondence" with a reasonable expectation of privacy, we will continue to see these blatant invasions into our personal affairs by business and government alike.

The thing that will hold back that lawmaking though is that there are some VERY deep pockets that would see incredibly vast revenue streams dry up by passing something like that.

One could see places like Alphabet, Microsoft, Facebook, and the other Silicon Valley darlings doing everything in their power to convince both the government and the populace that doing something like that would be a terrible idea.

And they would be right to a degree. Many "free" services would have to switch to a subscription model or something similar, and it could mean major cutbacks for many tech companies that haven't matured enough to diversify away from an ad-supported business model.

I can't speak for anyone else, but to me, that is a small price to pay to establish a right to the privacy of the exercise of our wills in the electronic realm. Thinking in the Internet age is truly an inspiring thing to behold. However, the Net that enables this capacity for collective thought is just a tool. We will get out of the Net what we as people put into it. To me, the Net has always been about empowering and uplifting every person by putting the collective knowledge and wisdom of humanity at each person's fingertips before anything else. One should always come away from the Net having found something, but at the same time, one has the right to use the Net and not have anything TAKEN. A "Right to Lurk" as it were.

If I had to choose something as the basis for a new Constitutional Amendment, it would be something that would explicitly codify the expansion of the legal "expectation of privacy" to encompass all electronic forms of communication, commerce, and assembly; protecting the aforementioned from search and seizure by the government without due process. It wouldn't do anything for SciHub sadly, but it would be a step back in the right direction in terms of curbing some of the more demonstrably harmful ills the Net has facilitated in our society.

P.S. Sorry for the mind dump. It felt great. If you are still reading, you're awesome.


> the Internet's original intent: to facilitate the fast and open communication of information

Where'd you get that from?

I think the Internet's original intent was to do it "because we can". Everything else came afterward.


The Internet was a United States Department of Defense project to create a communications network that could still function despite damage. It had nothing to do with fast, nor open, communication. That was just a by-product during the 90s.


There were other internetworking projects at the same time as ARPANET. The DoD's tech was better, and to people like the IETF, "better tech" is all that matters in picking a standard. But most of the people driving the adoption of internetworking among large corporations and organizations weren't DoD people, or even people with a defence mindset. They were just sysadmins, librarians, and communications engineers trying to freely peer their networks with other networks, adopting whatever standards came along that would allow them to do that. Such folk worked on their own standards (see e.g. MIT's Chaosnet) but chose to switch over to the [IETF standardization of the] ARPANET stack when it became clear that's where the future was.


Precisely this. The Internet was born out of the dreams and efforts of many nerdy people who had no reason to do it other than it could be done, and nobody had done it before.

That reasoning drove the majority of computer technology progress during the 1980s and 1990s. It wasn't until after the birth of the World Wide Web that mainstream businesses started to really look at monetizing this new market, and in a symbiotic way hackers and nerds and geeks started crowing about altruistic philosophies like decentralization and how information "wants" to be free.


To answer both of your points, fast was implied by virtue of it being a communication network.

The open part can be pedantically removed in the case of ARPAnet, but I've not met anyone who confuses the Internet with ARPAnet. The Internet, as it came to be called in the '90s with the rise of the World Wide Web, WAS at its core 'open'. Pretty much everyone I've met who was around and working on the ARPAnet saw it as a foregone conclusion that it (a network based on the lessons learned through ARPAnet) would be going public in one way or another.


> but I've not met anyone who confuses the Internet with ARPAnet.

Now you have! :) Unless you don't count online interaction.


In particular, to guarantee a devastating response to a Soviet first nuclear strike. As I recall.


You're referring to ARPANET, which is not "The Internet".


File a defect report against the book of Genesis and assign it to the original developer, God himself.

Why are you surprised that humans are being humans?

Why can't you trust in the fact that most people are good most of the time?

Internet tools should be like paper and pencil: opinionless. A pencil maker doesn't get to control what gets written with the pencil, and a social media platform maker doesn't get to control what gets said on the platform. Only when legally required may the pencil be seized; only when legally required may a social media post be taken down.

If, with the protection and power of the USA, we can't stand by freedom of expression in the marketplace of ideas, we are doomed to get authoritarian overlords.

This attitude of "I don't like this free and open internet because my ideas are losing" is a very dangerous power grab.


Well said. I blame second-coming Jobs (the iMac and iPod era) and the Microsoft of the same period for a lot of this, as they competed with each other.

Usability became more important than flexibility. And intuitive operation prioritized over ease of learning.

There's a lot to be said for a harder to use computer with a learning curve, but which affords you more power to be a creator instead of a consumer at the end of the curve.

I'd go so far as to say that's better for us (as in, all humans).


This is really silly to me.

Making computers accessible seems like a completely reasonable, sound priority. Yes, computer literacy is something we all need to work towards, but we'll never be in a world where the average person understands PKI, and saying that we should limit accessibility until they do is absurd.


That's not what I'm suggesting. What I am suggesting is that we REALLY need to get away from teaching software products, and start teaching how to use computers.

Example:

1. Teaching fundamental abstractions before basing education on one set or another.

2. Teaching fundamental program sets/basics of toolchains (think scripts, text editors, and an intro to program compilation).

3. Teaching fundamental protocols.

4. Teaching about infrastructure.

5. Teaching how to do X in Windows/Mac/Linux.

IN THAT ORDER. Notice that that curriculum, while it would likely have to choose one OS or another based on circumstances, focuses on what you can DO with a computer, and lays a foundation through which the neophyte user can begin to understand what a computer and the NET really are.

The NET isn't a pretty screen. It isn't one company's search engine, it's the means by which info goes from HERE to THERE. A computer isn't some mere calculator. It is an extension of our minds (and should be civically treated as such, but that's another post).

I can die happy if within my lifetime, my occupation in the tech industry becomes "unskilled" labor. For I will have contributed to finding a way to elevate mankind to a new level.


> but we'll never be in a world where the average person understands PKI

2000 years ago: "we'll never be in a world where the average person can read and write English"


I think my point stands even with a loose definition of 'never'.


Yes, apparently all we need to do is wait 2000 years for people to understand PKI.


Then let's re-evaluate the situation in 2000 years. Until then, PKI is still useful without the average person understanding it, just like languages are useful without everyone understanding them.

I doubt this is even the point of GP.


Yeah, understanding the internal combustion engine should not be a prerequisite for riding a bus.


Most people have at least a vague idea of how a combustion engine works, don't they?


Kids ride buses; I'm pretty sure they don't understand the marvels of engineering they're benefiting from.

My point is: it should be possible to use something without fully understanding the minutiae of how it works. We call this “user interface design”.

You should be allowed to live in a house without a full understanding of the architectural details that prevent it from falling down.


I'm sure no single person understands every detail of the diesel engine; arguably nobody so far can solve the Navier-Stokes equations exactly. But even children understand that a car needs fuel, and they might know that it has an engine that can break if given the wrong fuel. They could, at least. The fact that there are those who don't only corroborates that there are folks who don't know... what was the actual problem again? Getting owned by having their traffic captured.



> Usability became more important than flexibility. And intuitive operation prioritized over ease of learning.

Why is this problematic? Why is more people having access to computing a bad thing? This is a dangerous, and (dare I say) elitist opinion. We're seeing positive impact, every day, thanks to the ubiquitously available computing resources.

If you'd like to use a computer with a learning curve, use a computer with a learning curve. Don't drag the whole world with you.


The trade-off is worth some thought. It takes a lot of effort to cater to intuition. If we put some of that effort into giving the user access to the tools, it should produce a better result in the long run. The simple version of how a combustion engine works is inspiring and helps us appreciate the bus.

It might be a bit too deep an argument, but where do you think intuition comes from? If we design perceived reality after existing intuition, we get a giant feedback loop that calls for ever more unrealistic representations.

Maybe an analogy can be had with electronics being mostly parallel processes, while we can barely figure out how to implement parallelism in higher-level languages. Our intuition likes the one-thing-at-a-time approach.


I think the idea is that we should cater to intuition, and give everyone the necessary tools to understand and tweak things, but understanding shouldn't be a requirement. To average people, appreciating the bus brings much less utility than being able to take the bus. Should we not have the bus just because people can't appreciate the combustion engine?

But then, of course, intuition is more useful for some people. For those people, no one is stopping you from digging into the engine itself. This is really the price discrimination idea applied to utility: offering a cheaper, easier to use interface (taking the bus) means now the bus is useful not only to the engineers, but the passengers as well.

The situation with computing is not really that much different. If you want to learn the internals of computers, Gentoo exists; feel free to use it. But should everyone use Gentoo? Not really. I'm probably more proficient in Linux than the average developer, but I don't see the need to use Gentoo myself either.

That is not to say we can't do better either. The signature design of SQLAlchemy leaks abstractions (but in a good way). The average dashboard of a car today is way more complicated than the dashboard of a car 70 years ago. Yet that hasn't stopped car ownership from growing. Maybe there's a lesson for us somewhere there as well.


The problem is that with the popularity of tablets and phones, the desktop and laptop market is shrinking. I wonder how much of the recent hardware price increases you can really blame on cryptocurrencies, versus having root on your computer becoming a niche that is priced as such.


More importantly, the ability to write and run your own code easily, of which learning is only part of the process. From that perspective, Apple was closed from almost the beginning, while the PC and various other micros of the time (ZX, C64, etc.) started out open.


The Apple ][ wasn't closed, you could hack away on it just as much as any other 8bit machine of the day. I think (correct me if I'm wrong) it was from the Mac onward life got a bit more challenging.


I found the early Mac OS versions (6, 7, 8, and 9) incredibly frustrating for this reason. PC users had DOS and QBasic, but other than HyperCard, the Mac didn't seem to offer a way into the internals to learn about coding. Stuck in userland.


It’s a sign of trouble, but I’m not sure it’s really “further” trouble, all it takes is for them to get a cert from Let’s Encrypt and call it a day. I’m surprised they weren’t using LE to begin with actually - since LE is available, why would you ever pay for another CA (excluding EV certificates)?


This is a much bigger deal than people are giving it credit for.

At any point in history, have CAs revoked certs solely to censor a target website?

Maybe the answer is yes. I don't know. But this is a rude wake-up call for me and everyone else who tried to force the world into this shape.

We've all been shouting "You have to use TLS! It's fundamental security 101. If you're not using https, your site is probably broken. And there's no reason not to do it, since it's so easy."

Surprise: Now nobody trusts http, and those that control https can revoke their trust based on arbitrary human morals rather than solid technical reasons.

I was a pentester for years and not once did anybody mention this threat anywhere. It's blindingly obvious in hindsight, but it was too easy not to think about it.

Let's Encrypt is in the exact same position. Why do we trust them? Think about it -- they're under US jurisdiction and subject to US laws. The government could compel them to revoke certs.

We're lucky that it's just a minor annoyance. Picture a world where no major browser renders http at all, and the only way to get a site online is to have a trusted cert.

This is not far from reality: If the Magic Leap turns out not to be vaporware, they're going to be launching a DRM-powered internet that can't be adblocked. And that means we'll all be subject to government whims far more than we'd like to admit.


It's true HTTPS traffic can be blocked (at least partially) by revoking certificates, but is this worse than the censorship opportunities offered by unauthenticated HTTP?

Whether the traffic is authenticated or not, ISPs block sites when instructed by government orders. And that is much harder to work around than a revoked TLS cert (although a revoked cert does mean "totally blocked" for almost all users).

These are not purely technical challenges, but rather political issues that must be addressed as such. We will always lose to our governments if we focus only on technical solutions to censorship.

I'd like to add that the USA government, for the most part, does not accept an absolute right to private communication. The possibility of ubiquitous end-to-end encrypted communications with tools like TLS and WhatsApp is not something we should take for granted. The USA matters here because we are the powerful nation with the strongest legal and cultural commitment to freedom of speech.


> At any point in history, have CAs revoked certs solely to censor a target website?

Revoking certs from US CAs is probably the new DNS seizure, post- encryption and TLD expansion.

Although hopefully it'll die a quick death when they realize it's ineffective.


I do not see how censorship at the CA level is worse than the ISP/ DNS level. We should not discourage people from using TLS because it allows for one of many other methods to censor.

SEC takedowns have happened for years without relying on TLS.


ISP/DNS takedowns affect only customers of a single provider. Compelling every US provider is tedious and would still not affect people in other jurisdictions. CA-level takedowns affect everyone.


A takedown against the hosting ISP would affect everyone, and DNS takedowns can involve changing the authoritative record.


The site can still move to a different jurisdiction and mirrors would still work. As evidenced by various takedown attempts against Sci-Hub.


A site can move to a different CA and mirrors would still work.


All the various DNS alternatives people have pitched over the years pretty directly address the centralization issue you are getting at here. Often by casting themselves as decentralized.

Really the issue is that decentralization isn't very compatible with convenience and people generally place a lot more value on convenience than things like Sci-Hub.


Self-signed certificates with trust on first use (or out-of-band verification), as in SSH, work around most of this.
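The SSH-style model can be sketched in a few lines. This is a minimal illustration, not a hardened implementation: the pin-file path is hypothetical, and hashing the raw DER certificate is just one way to pin.

```python
import hashlib
import os

def fingerprint(der_cert: bytes) -> str:
    """SHA-256 fingerprint of a DER-encoded certificate."""
    return hashlib.sha256(der_cert).hexdigest()

def check_tofu(der_cert: bytes, pin_file: str) -> bool:
    """Trust on first use: pin the certificate's fingerprint the first
    time we see it, and require an exact match on every later visit."""
    fp = fingerprint(der_cert)
    if not os.path.exists(pin_file):
        with open(pin_file, "w") as f:   # first contact: remember it
            f.write(fp)
        return True
    with open(pin_file) as f:            # later contact: must match the pin
        return f.read() == fp

# In real use, der_cert would come from an established connection,
# e.g. ssl.SSLSocket.getpeercert(binary_form=True).
```

A pin mismatch then means either the server rotated its certificate or someone is in the middle; like SSH, the scheme can't tell you which, which is the main usability cost of dropping CAs.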


Correct me if I'm wrong, but the signed server-identifying cert is swapped in TLS before the connection is encrypted, no?

So it's not technically infeasible to have networking gear drop any connection which doesn't chain back to a government-approved root?


> Correct me if I'm wrong, but the signed server-identifying cert is swapped in TLS before the connection is encrypted, no?

Only on old TLS versions. TLS 1.3 changed it so the server certificate is also encrypted.


What about SNI?


So it's not technically infeasible to have networking gear drop any connection which doesn't chain back to a government-approved root?

Yes, and that is a very scary thought. China is doing something similar already.


You could use a VPN. If you live in a place like China that blocks VPNs, you are screwed, agreed.


There are plenty of reasons to pay for a certificate. Wildcard certificates only came out last month on LE, and people might still be wary of switching their primary site over so quickly. Additionally, there are still a few cases I can think of where a custom certificate might be needed.

For instance, I recently consolidated my personal projects and site onto one server. I needed a single certificate that'd cover two domains. Digicert combined two of my orders into one certificate with two wildcard SANs. You wouldn't be able to do that with LE.

Edit: I was under the impression you couldn't do multiple wildcard SANs in LE but according to some forum posts it's fully possible as long as validation passes.


You can definitely get certs from LetsEncrypt that have SANs for multiple unrelated domains. I haven’t tried out wildcard certificates at all yet, but I would be surprised if it didn’t allow combining those features…


I've got two wildcard domains that are completely unrelated from LetsEncrypt.


It worked for me as soon as wildcards went live.


Besides having a fancy name in the URL bar I can't think of any.


Using TLS on a server which doesn't have outbound/inbound access to the Internet/LetsEncrypt servers.

Sharing a (wildcard) certificate between multiple servers.


That's a distribution problem. You don't need to be accessible from the outside internet to use an LE certificate. You can request a cert from another machine, then copy it over. (Either manually or automatically, that is up to you.)


...and then LE also revokes their certificate.


At this point you could go back to HTTP. And for LE: close down the company if that happens.


But then you have the issue of browsers marking your website as insecure, ISPs taking it upon themselves to enforce censorship, etc.


Yes. But at least as a company, you will still have your dignity. To not become a government pet.


Really? A "government pet"?


better name?


That is a really far-fetched scenario which I would never imagine happening. LE are not Cloudflare; I am sure they would not betray their neutral mission without actual legal force.


Unfortunately it's not far-fetched. I wish it were. It's just an unpleasant thought we've decided not to think about.

LE are a US corp, subject to US laws, and at the whim of US court orders. The implications of that are worth gaming out.

Our laws are (arguably) mostly fair. But there are cases where they're not.

Picture a world where an administration rises to power in the US on a platform that seems insane, but everybody endorses it anyway. And this platform just so happens to be against the sort of thing you're trying to do on the internet.

Sci-hub are stealing. We as a community are mostly fine with that. We don't consider it stealing, because the moral good outweighs the bad by a hundred to one.

But this isn't about Sci-hub. This is about a world in which we're free to do as we please, because we have the ability to decide what we want to do. If we want to make a site that can make information available, and someone else doesn't like that information or feels that they own it, what do you do? You have no power.

And when you blindly put your faith in institutions like Let's Encrypt on their platform of openness and trust, you set yourself up for a shift: One day you wake up and find out you were mistaken, and we were all mistaken to push this centralized model in the name of security and convenience.

And of course, that's the fundamental truth, isn't it? Liberties have always been eroded by pushing security and convenience.

We should think carefully about who we trust, and why.


Sorry this is a pet peeve of mine, but I don't think sci-hub is stealing, it's infringing copyright. And this is one of those instances where I think what words we use to describe what sci-hub is doing matters.


> without actual legal force.

That's precisely the main concern here, given the nature of SciHub.


If you're on AWS it's easy enough to just use their CA with ALB


Only one of the domain/wildcard pairs listed in Sci-Hub's certificate actually works for me right now. It's probable the certificate was revoked not because some publisher threatened legal action against Comodo, but because Sci-Hub no longer controls some of the domains listed in their certificate. If this is the case then they would be able to get a new certificate for free with those names removed.

I wouldn't go rushing to blame Comodo for kowtowing to publishers' demands until Alexandra tells us that's what actually happened...
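Since the actual certificate isn't shown here, a rough sketch of how wildcard SAN matching works; the domain names below are placeholders, not Sci-Hub's real SAN list, and this simplifies the full RFC 6125 rules to the common single-label leftmost wildcard:

```python
def san_matches(san: str, hostname: str) -> bool:
    """Match one SAN entry against a hostname. A leading '*' label
    covers exactly one leftmost label (simplified RFC 6125 rule)."""
    san_labels = san.lower().split(".")
    host_labels = hostname.lower().split(".")
    if len(san_labels) != len(host_labels):
        return False
    if san_labels[0] == "*":              # wildcard: skip the first label
        san_labels, host_labels = san_labels[1:], host_labels[1:]
    return san_labels == host_labels

def covered(sans: list, hostname: str) -> bool:
    """True if any SAN entry on the certificate covers the hostname."""
    return any(san_matches(s, hostname) for s in sans)

# Hypothetical SAN list in the style of a multi-domain certificate:
sans = ["sci-hub.example", "*.sci-hub.example", "mirror.example"]
```

Note that a wildcard entry does not cover the bare domain itself, which is why certificates typically list both `example.com` and `*.example.com`; losing control of some listed names would invalidate the whole certificate for revocation purposes even though others still match.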


This is exactly what I said would happen when Google started making all of us use HTTPS.


You're not the only one. I hypothesised about this before in previous SciHub discussions, and SciHub isn't the only site that is/will be affected. Security is the ostensible benefit, and it's the one they advertise the most; easier censorship and centralised access control is the other---something which a lot of the pro-(traditional)-HTTPS advocates don't advertise. Make HTTPS mandatory (so no more HTTP), make it near impossible to bypass in browsers, and you have created a very effective censorship system controlled by the CAs and whatever interests they serve.

Slowly ostracising and forcing "compliance" of those who don't toe the line is easier than before. They want you to be obedient sheep, living in an illusion of security and safety while continuing to mindlessly consume under their control.

I will resist the urge to post that memorable Franklin quote.


Let's not pretend the real reason Google is so gung-ho about HTTPS everywhere isn't to prevent the ISPs from muscling in on its businesses:

* ads on page

* seeing all internet traffic to enhance targeting


You make it sound like this is a bad thing. Regardless of whether it also benefits Google's business model, both of these things benefit users and website owners. The only thing that benefits from injecting ads into a page or deeper ISP tracking is some ISP executive's bonus.


Challenge: Next person that downvotes me, also leave a comment explaining why.


But then you have the other issue of your ISP having the ability to meddle with what it's sending you and silently injecting or removing content from the page. Really, I don't see a great solution either way.


This reads like a conspiracy theory. Who are the pro HTTPS everywhere people that are _also_ secretly trying to further censorship? This would have to be a fairly vast conspiracy, but, you don't provide any evidence.


It's a simple fact. It doesn't need to be planned or coordinated. The fact that root CAs are at the top, modulo self-signed certs exchanged person to person IRL, is certainly no secret. The conspiracy would be that they might use that fact to censor or listen, whoever they are. The conspiracy would be not to tell anyone beforehand, because otherwise it wouldn't be a conspiracy anymore. What kind of exhaustive evidence do you expect for the claim that they are not saying something? And maybe you are right, this is extremely hypothetical, they would never take action to invalidate certs... Hey, wait a second!!!

The real conspiracy would be to know a better system and not tell anyone. Which is like sponsoring scientific discoveries and then hiding those behind a pay wall. Oh wait again.


Who are "they"? Google? Mozilla? Microsoft? Apple? The 100s of CAs? And, are they all working together? And there hasn't been a single whistleblower to expose this?

What do you mean "listen"? CAs can invalidate certs at will, but, they have no mechanism to listen on communications (unless you give them your private key, but, then anyone you give your private key to can eavesdrop).

The CA system has lots of issues. It would be a pretty big conspiracy if there were thousands of people that knew of something better and said nothing. But, there is absolutely no proof that anyone has any idea how to do better than the CA system. Do you know of such a system or have any evidence that someone else does? Google, despite their flaws w.r.t. privacy, has done quite a bit of work to improve the CA situation - Certificate Transparency, for example.


> Who are "they"?

A court seizing a domain thereby invalidating a cert.

> What do you mean "listen"?

If I accept a custom cert, but don't validate the key, I might as well use none, basically. That's the extent of my knowledge, I don't know what Certificate Transparency is doing, for example.

> The CA system has lots of issues. It would be a pretty big conspiracy if there were thousands of people that knew of something better and said nothing. But, there is absolutely no proof that anyone has any idea how to do better than the CA system. Do you know of such a system or have any evidence that someone else does?

> It would be a pretty big conspiracy

Exactly, so why do you expect that a Twitter-sized post could explain it convincingly?

> if there were thousands of people that knew of something better and said nothing

Ironically, it might be the ability to censor communication to suppress such voices, however hypothetical that is, that triggered the GP.

> But, there is absolutely no proof that anyone has any idea how to do better than the CA system.

PGP is used with in-person key exchange in real life. I'm not using it, just arguing for the sake of the argument. It has problems too, but "better" is not a binary value, except in the limited scope of a specific problem. PGP doesn't need root CAs.


What is "this"?


Likely they mean censorship via certificate revocation, which it isn't clear yet that this is, but they have a valid point. If you put control over your trust in a third party, you give them the power to do this kind of thing.


You can always manually trust certificates via various means; it's the basis for almost any corporate network. Most browsers also offer a simple dialog to bypass certificate warnings.


Browsers seem to be moving towards making it harder to bypass, which is probably a good thing for the average user. I wouldn't be surprised to see the ability to ignore https errors (or access http sites at all) locked behind a developer setting or something.


As a user, it's hard to bypass, but as a developer, it is literally easier to "crack" the browser by patching the jump/value to always go to "cert is good" than try to find and make the corresponding changes in the source and then figure out how to recompile everything else identically. I'm not sure what to think of that...


Chrome's already moving in this direction. Bypassing a HSTS error means you have to type "thisisunsafe", and they change the keyword sometimes (it used to be "badidea").


Firefox makes working around HSTS even harder. But to be fair, HSTS is the domain owner explicitly declaring "do require a valid certificate here!".


To disable HSTS permanently just do this: https://security.stackexchange.com/a/102315


On Chrome I can see that but I doubt Firefox would make that move.

Plus, they won't remove the functionality to manually trust a cert (Business Users would complain).


> On Chrome I can see that but I doubt Firefox would make that move.

Firefox has implemented the same rules around the .dev TLD as Google. I use Vivaldi when accessing internal company .dev domains because Firefox won't let me tell it to accept the self-signed certificate.


An option could be to use certificates signed by a self-signed CA added to your trust store.


That's exactly how it's set up. Doesn't help, I'm apparently not allowed to tell my browser what to do in this instance.


Firefox will happily accept self-signed certificates chaining to manually imported CAs. However, there are a lot of severely outdated guides on creating self-signed certificates out there, and many of the certificates produced that way won't be accepted by any modern browser. OpenSSL's terrible command-line UX certainly doesn't help matters. I've found easypki[1] to be the most convenient tool for this purpose.

[1]: https://github.com/google/easypki
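If you'd rather script it than fight the openssl CLI, a minimal sketch with the third-party `cryptography` package looks like this (names such as "My Local CA" and "myapp.internal" are placeholders for illustration): create a local CA, sign a leaf cert with the subjectAltName that modern browsers require, and check that the leaf verifies against the CA.

```python
import datetime
from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

def _name(cn):
    return x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, cn)])

now = datetime.datetime.utcnow()

# 1. Self-signed CA certificate ("My Local CA" is a placeholder name)
ca_key = ec.generate_private_key(ec.SECP256R1())
ca_cert = (
    x509.CertificateBuilder()
    .subject_name(_name("My Local CA"))
    .issuer_name(_name("My Local CA"))
    .public_key(ca_key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(now)
    .not_valid_after(now + datetime.timedelta(days=365))
    .add_extension(x509.BasicConstraints(ca=True, path_length=0), critical=True)
    .sign(ca_key, hashes.SHA256())
)

# 2. Leaf certificate signed by the CA; browsers reject certs without a SAN
leaf_key = ec.generate_private_key(ec.SECP256R1())
leaf_cert = (
    x509.CertificateBuilder()
    .subject_name(_name("myapp.internal"))
    .issuer_name(ca_cert.subject)
    .public_key(leaf_key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(now)
    .not_valid_after(now + datetime.timedelta(days=90))
    .add_extension(
        x509.SubjectAlternativeName([x509.DNSName("myapp.internal")]),
        critical=False,
    )
    .sign(ca_key, hashes.SHA256())
)

# The leaf's signature chains to the CA key; raises InvalidSignature otherwise
ca_cert.public_key().verify(
    leaf_cert.signature,
    leaf_cert.tbs_certificate_bytes,
    ec.ECDSA(leaf_cert.signature_hash_algorithm),
)
print("leaf verifies against CA")
```

Import the resulting CA cert into your browser's trust store and the leaf should then be accepted for that name.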


The person you're replying to doesn't have any problem with certs. Their problem is that they (or their employer) hijack a TLD for whatever ludicrous reason, and HSTS pre-loading applies to their hijacked names the same as it would to real names.


Are you sure? The initial post was about .dev being HSTS-preloaded, but the comment I was replying to was an answer to the suggestion that they could use self-signed certificates after importing them to the trust store.


Yeah, and having read over this thread about three times I'm actually less sure than I was. I'll be checking today.


Ah, got it.

Well, another argument in favor of not overloading TLDs for internal domains, then, and just buying an additional domain if you really want to have separate internal and external domains.


IIRC that's because Mozilla sources their HSTS Preload List from Chromium


You can also just use HTTP, since sci-hub doesn't force HTTPS.


Being wiped from the internet by a disagreeable CA. If that is what "this" is.


http://sci-hub.tw/ is reachable no problem. Cut down on the hyperbole please.


FWIW, onion addresses don't require a cert, they are end-to-end encrypted by design.


Technically, nothing requires a cert. It's just best practice to have one.


That is not what "this" is.


The grandparent comment is right. I am not going to mess with default browser settings, and almost nobody is.


[flagged]


How exactly do you see Google doing anything here, outside of a general "they encourage HTTPS" (but the site still works fine over HTTP)?


http://sci-hub.tw working here (Safari, MacOS) Sunday 8am in CA.


Similarly, the certificate I'm seeing on sci-hub.hk is set to expire in 2019, issued by COMODO.

Connecting from St. Petersburg Russia.


It's not the expiration date; the certificate was revoked earlier, and browsers learn that via OCSP[1]. Your browser is probably just not looking up the certificate's status on the OCSP server.

[1] https://en.wikipedia.org/wiki/Online_Certificate_Status_Prot...


http://sci-hub.hk works, but https://sci-hub.hk gives me a revoked certificate error which I can't seem to work around in Firefox 55. Connecting from Belgium.


The onion domain should be fine because Tor has its own encryption/verification scheme built into its naming system: the domain is (a derivative of) the service's public key, and the matching private key is used to sign data.
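For the curious, the modern (v3) derivation is simple enough to sketch in a few lines, per Tor's rend-spec-v3. The all-zero key below is just a dummy for illustration; a real service uses its own ed25519 keypair.

```python
import base64
import hashlib

def onion_v3_address(pubkey: bytes) -> str:
    """Derive a v3 .onion name from a 32-byte ed25519 public key.

    Per Tor's rend-spec-v3:
      address  = base32(pubkey || checksum || version) + ".onion"
      checksum = SHA3-256(".onion checksum" || pubkey || version)[:2]
    """
    if len(pubkey) != 32:
        raise ValueError("expected a 32-byte ed25519 public key")
    version = b"\x03"
    checksum = hashlib.sha3_256(b".onion checksum" + pubkey + version).digest()[:2]
    label = base64.b32encode(pubkey + checksum + version).decode("ascii").lower()
    return label + ".onion"

print(onion_v3_address(bytes(32)))  # 56-character label ending in ".onion"
```

The 56-character label is just the base32 of the raw key plus a two-byte checksum and a version byte, which is why a v3 onion name can be verified offline against the service's key with no CA involved.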


Also, http://scihub22266oqcxt.onion/ is down. And there's no mention of an onion link on the Sci-Hub website. I wonder what's up.


crt.sh reports it was revoked on the 26th: https://crt.sh/?id=274083328


    Issuer: COMODO

That figures. When will we stop giving money to those scumbags? Trying to register the Let's Encrypt trademark was enough for me to never give them a cent again.


I doubt they just went out and did it randomly. I'd guess it was done via court order. The ACS got a court order against them that also ordered that 'internet search engines, web hosting sites, internet service providers (ISPs), domain name registrars and domain name registries cease facilitating “any or all domain names and websites through which Defendant Sci-Hub engages in unlawful access to, use, reproduction, and distribution of the ACS Marks or ACS's Copyrighted Works.”' (https://www.sciencemag.org/news/2017/11/court-demands-search...)

Alternatively, Microsoft might have had something to do with it (they're super anti-piracy, and have a contract with all the CAs that requires them to unilaterally revoke any cert at Microsoft's discretion), but I think that's far less likely than the court order.


>they're super anti-piracy, and have a contract with all the CAs that requires them to unilaterally revoke any cert at Microsoft's discretion

source?


https://social.technet.microsoft.com/wiki/contents/articles/...

> If Microsoft, in its sole discretion, identifies a DV Server Authentication certificate is being used to promote malware or unwanted software, Microsoft will contact the responsible CA and request that it revoke the certificate. The CA must either revoke the certificate within a commercially-reasonable timeframe, or it must request an exception from Microsoft within two (2) business days of receiving Microsoft’s request. Microsoft may either grant or deny the exception at its sole discretion. In the event that Microsoft does not grant the exception, the CA must revoke the certificate within a commercially-reasonable timeframe not to exceed two (2) business days.


This is absolutely insane, and Microsoft really has no position to make these demands. Does McDonald's have the right to get your driver's license revoked? (Even if you, say, use the drive-thru to steal McNuggets?)

Hell no, and neither does Microsoft.


Microsoft runs a root store. That gives them more leverage over the CAs than McDonald's has.


I'd be curious what would happen if the "too big to fail" issuers pushed back against this.

Microsoft's only option is to completely drop the root cert, right? So there's no real non-nuclear option...

In the broader sense, this is one downside of the shift towards Let's Encrypt and CAs being more interchangeable: increased power of the root stores relative to them.

Sometimes that's good, sometimes it's evil.


> Microsoft's only option is to completely drop the root cert, right? So there's no real non-nuclear option...

In small-scale disputes MS (and other browser vendors) would not have to nuke an entire large CA to get their way. In principle they could just blacklist the individual certs/names, leaving the CA's other certs alone.

That ability/implied threat probably does mean that the CAs tend to comply with MS piracy/copyright-related revocation requests, because refusing to comply would piss off MS (and possibly law enforcement) without actually stopping them from getting their way by other means.


Sidenote to this: if you want to sign a Windows driver (and on XP+, you do), you can only use Microsoft approved CAs.


Could MS not ultimately stop honoring said vendor's certificates?


Yes, in the extreme case, Microsoft would be able to issue an urgent security update whose only purpose was to remove this CA from the Schannel trust store. The effect would be that IE, Edge, Chrome and most other SSL/TLS applications on Windows ceased to trust those certs. That's obviously really drastic, but they could certainly do it. (Firefox and various Free things wouldn't be affected because even on Windows they don't use Microsoft's trust store)


I'm really no expert on American law, but can such a broadly worded verdict be legal? I would have imagined that they'd have to name every company/person that has to comply with it.


The last article I saw on it said they got tired of playing whack-a-mole, so they went back and the court gave them a blanket ban.

I'm sure they could challenge it if they wanted to step on to US soil which, in this case, probably isn't such a good idea.


Related: https://news.ycombinator.com/item?id=16938593

Edit: The author registered Stripe Inc. in a different state than the payment processor and acquired an EV cert which got revoked.


Max Weber wrote that the government has a monopoly on the legitimate use of force, while arguing that we trade freedom for safety to build modern societies.

Going forward the ability to trust information will matter as much as physical safety. We're starting to build institutions that regulate that for us, CAs are one of the first.

Depending on where you stand this is either a success or a failure of institutional trust.


Keep in mind all of Weber's argument.

Government has a monopoly on the legitimate use of force.

Weber claims that the state is the "only human Gemeinschaft which lays claim to the monopoly on the legitimated use of physical force. However, this monopoly is limited to a certain geographical area, and in fact this limitation to a particular area is one of the things that defines a state."[2] In other words, Weber describes the state as any organization that succeeds in holding the exclusive right to use, threaten, or authorize physical force against residents of its territory. Such a monopoly, according to Weber, must occur via a process of legitimation.

https://en.wikipedia.org/wiki/Monopoly_on_violence

This is misread by many Libertarians, including Charles Koch[1] (who directly funds a wide set of Libertarian institutional propaganda mills[2]), as invalidating government. It does not.

Absent a monopoly, multiple parties claim legitimacy over the use of force: local strongmen, tribes, or corporations, for all of which there is an extensive history (including Koch Industries, to the present).

Government's monopoly is not for unlimited use of force, but for legitimate use.

And if some alternate structure emerges claiming this right, it is, ipso facto, government.

It is also possible for actual or nominal governments' use of force to be illegitimate. Which it rather frequently is.

________________________________

Notes:

1. https://www.marketplace.org/2015/10/21/business/corner-offic...

2. Amply documented, see: https://en.wikipedia.org/wiki/Political_activities_of_the_Ko... https://www.sourcewatch.org/index.php/Koch_Brothers


It sounds like a more fundamental question is, "Who authorized this Max Weber dude to dictate who may legitimately use force?"

He has an opinion, I have an opinion, you have an opinion, Charles Koch has an opinion... everybody has an opinion.

At some point, the answer to questions like this always comes down to "God," or "Nobody," or "Whoever has the most money/biggest weapons." It's an unsatisfying debate.


The point is that this is the justification that's been cited by, and is at the root of, the criticism of government (and, more covertly, taxes).

Koch uses this as his justification, but misstates and apparently misunderstands the concept. This is his prior, the lynchpin of his argument, and it is mis-applied.

The sentiments of Weber are not inconsistent with a long prior line.

Your "God or Nobody" presumption is incorrect. The principles also arise out of systems studies and ontology.


What's there to stop them self-signing their cert and letting everyone interested add it to their certificate store?


I'd imagine it's because the average user is pretty dumb and probably not able/willing to go that extra step of adding a certificate manually.


I wouldn't say someone is dumb because they don't understand how SSL in the browser works. In the same sense I'm not dumb because I can't perform heart surgery.


Dumb in the sense of "will not say anything, not complain nor ask questions", you know, literally mute.


It probably depends on what domain you're using.


This seems to affect all their domain variants, but it still works without encryption: http://sci-hub.tw

Edit: already noted by detaro https://news.ycombinator.com/item?id=16952051


I just tried https://sci-hub.tw/ (Chrome) and got a secure connection without warnings. Since the issuer of the certificate that my browser showed for the connection is "Comodo" I guess the revocation didn't reach my browser yet?

EDIT:

IE and Firefox say "insecure".

What really annoys me: There does not seem to be any way - none that I could find - to get IE and Firefox to connect anyway?


AFAIK Chrome doesn't check the CAs' revocation lists directly (it uses CRLSets, Google's own aggregation, instead), so it doesn't yet know the cert is "bad".


On sci-hub.hk I get a non-SSL connection; if I try the 'trick' of http://journal_name.springer.com.sci-hub.hk/path/to?article I get the revoked-SSL-certificate error.

Firefox 59


> There does not seem to be any way - none that I could find - to get IE and Firefox to connect anyway?

In Firefox, I think you can only by disabling the check (Preferences → Advanced → Certificates → Query OCSP ...). You probably don't want to keep that disabled, though.


It's fairly marginal, which is why it's switched off in Chrome. To get acceptable performance in the real world they have to soft fail. But that means most bad guys would just force it to fail and then it treats that as OK. So it's a seat-belt that snaps if there's a sudden impact. Not great.
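The incentive problem is easy to see in a toy model (hypothetical names, not any browser's actual API): the check honors the responder's answer, but a network error is treated as "good", so an attacker who can simply block the OCSP lookup wins.

```python
# Toy model of OCSP "soft fail"; `query_ocsp` is a stand-in for the real lookup.
def check_revocation(query_ocsp, cert):
    """Return the responder's verdict, but treat network failure as acceptance."""
    try:
        return query_ocsp(cert)   # "good" or "revoked" from the responder
    except OSError:
        return "good"             # soft fail: unreachable responder => accept anyway

# An honest responder's revocation is honored...
assert check_revocation(lambda cert: "revoked", "some-cert") == "revoked"

# ...but an attacker who blackholes the OCSP traffic defeats the check entirely.
def blocked(cert):
    raise OSError("OCSP responder unreachable")

assert check_revocation(blocked, "some-cert") == "good"
print("soft fail demonstrated")
```

Hard fail (rejecting the cert when the responder is unreachable) closes the hole but breaks browsing whenever an OCSP server has an outage, which is why browsers don't do it by default.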


COMODO has revoked their certificate, probably under a court order.

You can temporarily work around this by disabling 'Query OCSP responder servers to confirm the current validity of certificates' under Privacy & Security in Firefox.


And don't forget to re-enable the OCSP queries once you've finished browsing Sci-Hub!


I'm a little confused.

Setting aside the legal/ethical underpinnings..

Is the issue here that people are worried that a CA can revoke a cert, or that it will be harder for the layperson to get to this particular site?


Both, I think. The certificate has clearly been revoked.[0] If Sci-Hub requested that, no problem. But if some third party did it, that's clearly a vulnerability.

And yes, the impact is that many people lose access, or need to access via HTTP instead of HTTPS, which exposes information about what they access. Unless they use the Tor onion site, which is maybe beyond most people's skills.

0) https://crt.sh/?id=274083328


Shouldn't we be using some trustless system for encryption by now?


Easier said than done…




