
Help Reform Computer Crime Laws - dinodaizovi
https://hackerone.com/news/legally-blind-and-deaf
======
obstinate
> The hackers with the skills to break into software and networks, who choose
> to come forward with their knowledge and share their findings, should be
> legally exempt from criminal prosecution under laws designed to punish
> crime.

I know this is an unpopular opinion here, but I personally think that you
shouldn't mess with people's shit unless they invite you to (e.g. by having a
bounty, research partnership program, etc.). Yes, some organizations will be
less secure because of it. Similarly, some houses are less secure because the
locks are low quality. It isn't up to you to decide how thoroughly said locks
should be checked.

~~~
arice
The stance you take is harmful when said organizations are responsible for the
stewardship of the data of others, and being "less secure" places the general
public at risk. The true impact of a breach is rarely limited to a single
organization.

It is even further harmful when the laws are aggressively applied to prevent
research into personal property, especially when your personal safety may
depend upon it. For example, your car:
[https://twitter.com/0xcharlie/status/600729130355666944](https://twitter.com/0xcharlie/status/600729130355666944)

~~~
obstinate
> The stance you take is harmful when said organizations are responsible for
> the stewardship of the data of others

Do you make a habit of visiting banks uninvited to test their vaults?

~~~
arice
I don't make a habit of storing assets in banks that fail to insure me against
a total loss of those assets. That insurance just happens to require extensive
third-party verification of security practices that may be publicly audited
upon request.

The analogy doesn't hold when applied to the digital services we all depend
upon, as such assurances are impossible there.

~~~
sdrothrock
> The analogy doesn't hold when applied to the digital services we all depend
> upon as such assurances are impossible.

Rather than allowing anyone to try to crack a server as long as they claim to
be a white hat, I'd much rather require corporations to go through a standard,
"extensive third-party verification of security practices that may be publicly
audited upon request" and default cracking attempts to "illegal."

I may be misunderstanding something in what you're saying, though -- if I am,
could you clarify that for me?

------
stevecalifornia
IMO, security researchers have the legal and moral obligation to contact their
research target before conducting research. If said target refuses their
request, the security researcher should have the legal right and moral
obligation to publicly disclose that their request to conduct research was
refused. And that's it.

------
limaoscarjuliet
Not so easy to do. As usual, there is a risk that such legislation will be
abused: people who break into a system for malicious reasons could later claim
it was research. A decent law would require researchers to register with the
police (or whoever) before they start researching, which then excludes
researching government/police systems (because they would know up front). Etc.
Not so easy!

~~~
indrax
Let's imagine a decent law that would require people to register what kinds of
information they are protecting, from what kinds of use.

One kind of security researcher we would want to protect is the one who finds
out they can get information, but doesn't get everything, or doesn't keep what
they get, or doesn't get anything really sensitive. Or who establishes that
they can modify a system, but doesn't change anything important.

If someone gets my credit card number, and doesn't use it, and points out the
problem, _I want to thank them_.

We make the analogy of breaking into houses, but bad information security is
more like someone putting up a post-it note that says "This is a lock."

Sometimes just looking past the lock violates the letter of the law.

