> Fun game to play: Take statements from Comey et al. Replace "smartphones" with "brains"/"memories"/"thoughts". Technology will get us there!
> "Everybody is walking around with a Swiss bank account in his brain if government can't get in. You cannot take an absolutist view on this."
> "How do we solve or disrupt a terrorist plot if law enforcement can't access the memories and thoughts inside suspected terrorists' brains?"
Both funny and scary.
Time to rewatch the TV animation version of Ghost in the Shell again. Released in the 2000s, it portrayed and predicted our world remarkably well, and it'll give you a lot of inspiration about what a future society might look like when everyone uses an electronic brain.
Perhaps but not quickly. Decoding brain activity is still a hard thing to do.
> "(with encryption) Everybody is walking around with a Swiss bank account in his smartphone if government can't get in. You cannot take an absolutist view on this (, and we must add backdoors)."
Replace "smartphones" with "brains".
> "(with encryption) Everybody is walking around with a Swiss bank account in his brain if government can't get in. You cannot take an absolutist view on this (, and we must add backdoors)."
Get the joke now?
Yes, there are countermeasures, but these seem to be security through obscurity (can the detector spot the use of countermeasures?).
20% more likely than a polygraph isn't very inspiring, though. Their success rate is abysmal.
Even if consent is required, it'll just be like traffic stop no-win situations: if they ask you to consent to a search due to suspicion of drug possession and you say yes, they'll search, and if you say no, they'll wait until a drug dog comes, and then the dog will effectively conduct the search and/or will be used as a tool to permit a full search. Or if you refuse to take a breathalyzer test, they essentially presume guilt and treat you like you took the test and failed. Anyone who refuses to consent to the Truth-O-Matic test will probably suffer similar consequences. It's mostly an illusion of consent. I think this future is inevitable without some very serious legislation; perhaps a constitutional amendment.
A counter-point is that one could imagine a potential distant future exigent circumstance where some sophisticated criminal might have information that could truly prevent an imminent attack that'll very likely kill millions of people, or something, and which could be discovered through such a detector. In those fanciful scenarios I honestly do find myself wanting to agree with Comey: fuck privacy, siphon those fucking thoughts. It's certainly a much more ethical and effective alternative to torture, at least, which is how the US government would currently handle such a scenario (if they could cover it up).
Security and privacy/liberty are always a tradeoff, but even if you weigh privacy exponentially higher, there's always still some theoretical risk that would tip the scale back towards security, in my view. Nuclear armageddon, for example. The privacy hill to die on shouldn't be Mega-Corpse Mountain.
This scenario is absurdly far-fetched in 2019, of course, but in 2079? 2119? Who knows? If Aum Shinrikyo and ISIS can do what they did and plan what they planned in the 1990s and 2010s, what about new incarnations of zealous death cults in a world that may have extremely intelligent AIs and/or DIY-potential for almost anything?
Even a single instance of losing the final bastion of privacy is such an incredibly slippery slope that it's hard to imagine any safe way to allow its use in very extreme situations like these without setting up the conditions for horrific abuse and a Stasi/1984-esque kowtowing of all of society. But I think this aspect of the law enforcement perspective needs to be taken into consideration as well, even while simultaneously acknowledging that law enforcement will inevitably cry wolf and exaggerate the likelihood or impact of imminent threats. Of course this'll happen, but what happens if they do see an actual wolf one day?
A terrorist attack is a result of pressure that has been building up finally erupting. If you want to stop attacks new outlets need to be created to channel that energy elsewhere.
The country is a democracy, but when dealing with other nations things are more of a dictatorship. That power imbalance makes the other side powerless. Powerless people without hope do stupid things like blow themselves/friends/family up. Find ways to give them a voice.
The numbers don't add up. Being scared of terrorist attacks is like being afraid of winning the lottery. Your chances are lower than you think.
Don't want to detract from the rest of your comment. But one of the fundamental "points" of terrorism is to make this the case, i.e. to instill fear (even if an attack is not remotely likely to happen). The fear is what they want, unless they're trying to wage a guerrilla war.
The collective unconscious makes it real.
I’ve heard this said before, but I don’t know if I believe it. How do the chances change if, let’s say, you live in a major city that has been attacked before (perhaps more than once)?
Right now, terrorism is a very minimal risk for an average individual. Even if every single US terrorist attack caused as much death and destruction as occurred on 9/11, it still would be. But what if, due to future circumstances we can't currently predict, every attack had the potential to be many orders of magnitude worse than 9/11, even if there's only a single perpetrator, or just a few? Of course, we'd cross that bridge if and when we get to it, but, in general, the cost of terrorism/mass murder will continue to decrease, and its force-multiplying potential will continue to increase.
In this hypothetical case, it makes no difference if the terrorist is from a different country, or your own country, or motivated by ideology or just a Joker-esque mentality or because their girlfriend broke up with them a month ago. If it really could take just one nut to cause more suffering in a day than Hitler or Stalin caused in their entire lives, we'll need to rethink the role of government and the boundaries of privacy. Right now, your description is accurate, and it probably will be for the next 40+ years, but the distant future is tricky to predict. Even if the odds of an attack occurring remain roughly the same, the overall "expected value" of terrorism could still skyrocket.
I'm particularly worried about hypothetical new fringe, cult-ish elements like Aum Shinrikyo or even the Manson Family, rather than an organization akin to al-Qaeda or ISIS. I've long harbored a persistent fear that Aum Shinrikyo may be a prototype or inspiration for a future organization or movement which could thrust all of human civilization into darkness through relatively little effort, and before anyone fully realizes the risk (like the risk of planes as suicide bombs before 9/11). All of these copycat mass shootings - while not currently "scalable" (in the worst way possible) - don't inspire optimism on that front, either. Al-Qaeda has specific political goals and strategies, but what about people who sincerely just want to kill every or almost every person in existence? How do you deal with that? Those people scare me far more than any violent religious or political extremist, when considered in the long run.
No matter how much you try to remediate disenfranchisement and discontent and mental illness, some number of people like that with violent urges will always exist. And some people will also perpetrate mass murders without sharing any of those characteristics. Aum members didn't exactly seem to be the pressured underclass:
>Aum Shinrikyo recruited approximately 300 scientists with degrees in medicine, biochemistry, biology, and genetic engineering.
>Around 75 researchers were discovered who were working on radioactive materials and other nuclear related studies. [Materials] found in the raids indicated that Aum members measured radioactivity levels at a cult compound
They had esteemed professors and scientists manufacturing biological, chemical, and nuclear WMDs. These weren't powerless people without hope; these were powerful, well-off people with a fervent hope for a new world forged from the ashes of the one they attempted to annihilate.
They may have come very close to killing tens of thousands more people than they did, if not for a few mistakes and the happenstance of bystanders noticing their dispersion devices before activation on several occasions.
This is very difficult to think about as someone who's very pro-individual liberty and privacy, but our current way of thinking may become obsolete if it won't be that difficult for a single, ordinary civilian to kill millions if they become motivated to do so. "Give me liberty, or give me death" will make a lot less sense if liberty implies almost certain death. We should start thinking about how a surveillance state could be created and maintained in the most ethical way possible. For example, automated collection and detection without human analysts invading privacy until something is already flagged by the system (to prevent "LOVEINT"-type abuses).
More people need to be exposed to Bostrom's ideas. I hope he does more podcasts.
they can't do that. https://en.wikipedia.org/wiki/Rodriguez_v._United_States
A countermeasure is to store false data as true in the brain.
"That statistic has risen, in one study, to 100% when predicting a lie in an individual when baseline lie/truth levels were closely studied with training from pattern recognition technology (machine learning)."
... not that I'd expect this point to make any difference if the technology to inspect brains were on the horizon.
You can use it to plan and carry one out on your own, though. And of course you'll know about an attack if your team is organizing one. Shouldn't all people get brain-screened (by experienced TSA brain-scanning operators) to ensure the government gets that information?
My point is that having a brain is not enough to communicate - you also need some kind of transmission medium that connects you to other brains.
So even if you agreed with the opinion that any and all communication must be monitored (because terrorists), that would still not allow you to monitor the brain, because the brain itself is not what enables communication.
Monitoring a smartphone would be more justified, because you legitimately need access to it to monitor an end-to-end-encrypted connection.
(My "devil's advocate" was to note that brains and smartphones are not equivalent, not that monitoring brains should be allowed. On the contrary, my point is that even if you were ok with monitoring smartphones, there are still arguments against monitoring brains.)
It's simply too dangerous to allow that level of control to exist in the hands of a proven malevolent actor that will use it to ensure the extension and maintenance of its power. That power must be seen for the intolerable burden it's becoming before that point and destroyed, replaced with other structures which have no right to make any such unreasonable intrusions.
At least government agencies are in theory bound to laws, moral obligations and a democratic process. Meanwhile, private entities don't even have to pretend to work in anyone else's interest than their own.
And "you don't have to buy it" is no excuse. If you seriously want to abolish the state completely, then the same problems that are currently solved by public infrastructure will still exist. You'll be just as unable to opt out of private roads as you are currently unable to opt out of public roads.
Of course we're actually talking about chips in your brain, not roads. So good luck opting out of that.
The same reason none of the previous quotes apply to actors without political power. They do not have the ability to compel people to surrender to unlimited search on penalty of death. This is only reserved and pursued by holders of political authority.
Removing the duty to obey and the right to coerce is the only solution to this problem in the long term. Anything less, and they will keep pushing for it loudly and outright doing it as long as they believe they can get away with it. Which, let's be honest, given the way anything requested in the name of national security gets rubber-stamped, is basically always, in practice.
If I run the company that controls peoples' goddamn brain implants, I can conduct unlimited searches or perform arbitrary death sentences at the click of a button.
If smartphones are any indication, people would either straight-up not know about the issues with control/surveillance or have them as one factor of many to trade-off against.
And there are likely lots of benefits that brain implants could offer. If we go full sci-fi, some things they could enable:
- Perfect memory (thanks to digital storage, built-in or in the cloud).
- A perfect sense of location (thanks to built-in GPS and a direct uplink to a Maps-like service).
- Automation of tedious mental tasks (thanks to a built-in scripting engine).
- Perfect control over your emotions.
- The ability to record and replay any experience.
- Telepathic communication.
- Taking part in fully immersive virtual worlds.
- Seamless interaction with electronic devices, to a point where they behave and feel like a literal part of your body.
- etc etc.
So if you're a rational consumer, your choice is to have all of that and more - or to throw it all away, based on the seemingly remote possibility that the company could monitor or manipulate your thoughts or make you jump out of the nearest window. And the company is giving their strongest pinkie-promise that they would never do such things.
Even if you stay strong, if enough people take up the offer, network effects will kick in and make "not installing" an increasingly difficult choice:
- Employers will take the enhanced abilities for granted, so you'll have trouble getting a job.
- Operating devices could become difficult to impossible because manual controls will be seen as an unnecessary expense by manufacturers.
- You will be socially isolated, because taking part in telepathic communication will be difficult and you plainly can't visit virtual locations.
If all that's too much of a hassle for you and you just want to force people to get the implants, you can just hire a "private security contractor" (or build your own army) and force people at gunpoint. If there is no state, who is gonna stop you?
Look at the world we actually live in, this is the way it works, the only thing which actually restrains wielders of political authority is the things they can't do. Not the things they're "restricted by law" from doing.
If they're still around when this is possible, the worst-case outcome is inevitable. That doesn't mean things can't still get bad if they're not, by the way. Merely that the combination of the two taken together is a surefire recipe for dystopia.
That's too easy to dismiss when we - already today - have major industries that only exist because people act against their own interests - and armies of marketers and advertisers who work 40 hours a week to undermine people's capacity for rational decision making.
Even if they didn't, market failures are real, the "invisible hand" a lot less so. We know that if everyone acts only in their own interest, the end result can end up bad for everyone. (Or good only for a small elite)
The market needs guiding principles and I'd very much prefer those principles being enacted by a democratic, law-bound government than some oligarch who is responsible to no one but himself.
> Look at the world we actually live in, this is the way it works, the only thing which actually restrains wielders of political authority is the things they can't do. Not the things they're "restricted by law" from doing.
And yet if leaders of private companies had the same powers, they'd magically restrict themselves. Why?
By the way, that isn't even true. Take Trump: you have a political leader with openly authoritarian views in the most powerful position on earth, a party ready to follow wherever he wants to go, and democratic institutions weakened by regulatory capture - and he still has to frequently backpedal because people defend the laws and don't just bend to his will.
In a private company, where the only law is "the boss is right", who should do that?
They don't? Says who? Who the heck do you think will have the power to do whatever they want, whenever they want?
You think the 1% will distribute wealth and power fairly and evenly?
Governments are in place to democratically do this. If you think the 1% wouldn't kill you in a heartbeat take a look at world history or some dictatorships today.
You really have no idea what you're even saying.
Take away government you'll just have kings and queens. You think they would respect your rights and/or life.
You must still be a teenager and have not gotten through this part of world history yet. When you do you will understand.
Check my comment history, I've been here almost four times longer than you, with over ten times more karma than you, and have had the same view of political authority for the entire time. Your unfamiliarity with the dangers of political authority and the idea held by many people for the past half century that it must be destroyed doesn't change any of that. In fact in light of it, I'm probably also significantly older and better read on the associated topics than you.
Mark my words; unless political authority is destroyed there is no future that isn't a dystopia once this technology is widely available and employed.
It's quite depressing to see this venue spiral into such a lack of critical thinking that it can't see the blazingly obvious, irreconcilable and irreparable problem of allocating these powers to an agency that also has the unlimited, undischargeable right to coercion, but I guess that's just the way the world is failing these days.
You're missing the point that this power being "reserved" by holders of political authority is exactly what prevents corporations from exercising it.
It's also a theoretical/idealistic divide, assuming corporations always act within the rule of law. See e.g. the long history of Ford's various dealings with their own plotting "terrorists": union organisers.
That which can be asserted without evidence can be discarded without evidence.
The same can be said of either side of the discussion—you appear to have provided no evidence that corporations would refrain from abuse of such powers were they not so restrained. Whereas there is plenty of evidence that they do abuse such powers even in our current system where they are somewhat restrained from doing so.
Generally speaking, the way to buy a soul is to offer convenience. It doesn't even have to be major.
For all those thinking there will be regulation or protections, where are those protections for consumers of the current privacy violating services and products?
You sign on the dotted line and are bound by the terms, there is no big other to protect you from you own choices.
You choose whether you sell yourself, but once you do, don't expect a bigger force to keep you protected...
You are the only person who is going to advocate for you and the people you care about.
Even that isn't true. Plenty of information can be gathered on even the most privacy oriented person due to the actions of others: see Facebook's shadow profiles, or even the use of the DNA your relatives uploaded to learn about your medical history.
If I ever commit a crime, or am framed for one, and my DNA is used against me, you had better believe my family has sold me out by giving up our shared and essential life force signature.
My point is, there is no big other, no religion, or government who will keep you safe from yourself and your choices. (Or the choices of your family and friends.) Any human-made laws are shown throughout history to be malleable, very flexible.
Those who bend the rules take power over those who don't, or don't understand.
A great example of this is tax law... a huge power disparity between those wealthy enough to find ways to bend the law and those who cannot or do not...
Say the word "drugs" or the word "guns" and watch the people rush to throw the Bill of Rights away.
How are the privacy implications of Alexa microphones any different from those of smartphones (which contain microphones and voice assistant software, just like the Echo does)?
I fear for how much restraint we will show when we become technologically able to detect thoughts that are deemed reprehensible in current society. Some ideas that were considered reprehensible in the past have become part of ordinary norms in the modern day; I don't consider us today to be infallible in always making those decisions correctly.
For now, even having ideas currently considered most heretical/reprehensible is still legal, as long as it never leaves your head - but I'm not quite confident it will stay that way in the face of mounting pressure.
But generally we are not an authoritarian society, so we will likely not start this type of witch hunt. Other, more authoritarian countries may - those that already carry out a lot of arbitrary arrests and forced disappearances - and this will unfortunately make those state agencies more effective.
Second thought: The first people to get subjected to stuff like this are the same ones who are most likely right now to be interrogated or have their social media inspected when crossing the border. This will just be a continuation of that existing trend. If you do not allow your neural state to be monitored and inspected in response to various stimuli, you will be turned away at the border. Give it 5 years.
Third thought: There would be a lot of health and education opportunities. See brain patterns changing over time -- like a fitbit-style attention/focus score based on passive 24-hour monitoring rather than active testing (e.g. brain training). One could use those metrics to optimize your life to maximize your brain efficiency. Second, long-term, one could see how patterns change during development, which may lead to better detection and understanding of mental illness, especially autism, bipolar disorder, schizophrenia, depression and ADHD. One could likely see early warnings of these, do interventions, and judge their effectiveness better than with self-reports alone.
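To make the fitbit-style idea concrete, here's a purely illustrative sketch: no real brain-monitoring API exists for this, so assume some hypothetical device already emits per-minute attention estimates in [0, 1], and we just score the best sustained stretch of the day:

```python
# Toy "focus score": given hypothetical per-minute attention estimates
# in [0, 1], report the best sustained 30-minute stretch as a 0-100 score.
# The input stream and scoring window are assumptions for illustration only.
from statistics import mean

def focus_score(samples, window=30):
    """Return the highest `window`-minute average of `samples`, scaled to 0-100.

    `samples` is a list of floats in [0, 1], one per minute of monitoring.
    """
    if not samples:
        return 0.0
    if len(samples) < window:
        # Not enough data for a full window; score what we have.
        return round(100 * mean(samples), 1)
    best = max(
        mean(samples[i:i + window])
        for i in range(len(samples) - window + 1)
    )
    return round(100 * best, 1)
```

A day with a strong half-hour of focus scores high even if the rest is idle, e.g. `focus_score([0.2] * 60 + [1.0] * 30)` yields `100.0`. A real system would obviously need far more careful signal processing; this only shows the shape of the "passive metric" idea.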
...until you notice the UK police arrest about 8 people a day for tweets saying things like "It's ok to be white" or calling a transgender person a man. The understanding of what is bad has gone from actual crimes to just saying a positive thing about a group that the majority thinks should never be talked about positively.
Not even 10. We are very far from that, even excluding practicality.
I'm going to need a source for that.
You’re free to privately believe that privacy shouldn’t exist. That is materially different from trying to eradicate privacy to cement state control over a populace — in an age where the State has almost immeasurably great power to track, monitor, and police its citizens.
Which makes... complete sense. It would be and has been a problem whenever we aren't intolerant to Nazis and ISIS.
>It would be and has been a problem whenever we aren't intolerant to Nazis
It's a problem, you say? The Weimar Republic was arguably "intolerant" of Nazis, with its hate speech laws, and look what that got us.
Usually when it comes to privacy I can begrudgingly accept that governments will violate it to some degree. This is a field, however, from which I think governments must be outright banned. We must not allow a situation to arise where the TSA or the police will run a quick "brain check" on somebody. It must be avoided. This would be a one-way ticket to disaster.
I don't understand this statement. I hope after Snowden we all know that data at a commercial company = data at government? What's the difference? I don't see any practical barriers to various government agencies having access to all data at all companies. With and without company knowledge, mostly secretly, with and without company and customer consent, legally and illegally. This goes for the US, all US allies and all US enemies. The intentions of commercial companies when it comes to data protection from government are pretty irrelevant.
If you (as a people) want to ban your government from access to anything specific, you must first have control over your government. Which is inherently impossible in a system that is anything other than a direct democracy. Just my very unpopular opinion. There is no nation in the world that comes close to a people having control over a government, except arguably Switzerland. But even there I have my doubts...
Yes, because private companies are regulated. By the state.
Meanwhile, private companies have nevertheless managed to make surveillance and ridiculously detailed tracking into a whole industry.
> Also, people won't have the choice when they're compelled to be scanned on arrest vs logging in to some optional private business.
Except in reality, this stuff has long since stopped being optional. If you buy any reasonably modern car, you'll be tracked by private companies. If you have a bank account, you'll be tracked by private companies. If you write an email... you get the point. How exactly do you opt-out of that?
I choose to plead the 5th. The wording is "nor shall be compelled in any criminal case to be a witness against himself" which obviously covers brain scans.
Of course the police will say I have the right to remain silent and they have the right to lock me in a jail cell. And TSA will absolutely do brain scans. I guess the right to remain silent is successfully getting eroded.
The standard, surprisingly popular, rhetoric I hear against direct democracy in other European countries is really very unconvincing to me: people are too stupid to vote, people can't know enough about specific subjects to vote well, people aren't interested in politics (duh, why would you in a system where you can't influence politics), people can be influenced too easily through propaganda and other nonsense arguments which would all apply to a representative democracy as well.
People don't like change I guess and they (want to) love their country the way it is. On top of that I get a feeling that people somehow look up to politicians subconsciously as somehow being above them or at least being above their neighbors. They seem to equate leadership and sales skills to being able to take hard decisions and being incorruptible / divine. It's like the "king effect" (just made this up): everybody hates the king's decisions, but at the same time the king is the hero and pride of the nation and who could possibly want any other king?
In any case, this form of direct democracy wouldn’t solve the problems the parent is talking about. To achieve that, you’d need to vote on every decision made by every public official ever.
Every Swiss citizen can propose a law to be voted on in one of the next referendums, provided they collect the appropriate minimum number of signatures in its support.
However, you missed the part of my comment where I wasn’t talking about Switzerland at all, I was talking in generalities about the places where it was being discussed as an idea to implement.
Take LA for example. A lot of people who post here seem to have strong opinions on how that city is run, and I’d bet a lot of them even live there. If the LA city council elections generate turnout in the mid-teens, that is considered very high. Turnout below 10% is not strange.
Switzerland’s voter turnout is actually quite pedestrian. It’s higher than a lot of local elections in the US for sure, and perhaps even high enough to prevent the kind of special interest local government policies you see all over the US. But just because something seems to mostly work in one single, small, rich European country, doesn’t mean it will work as well anywhere else.
That’s an interesting thesis, but it’s not at all substantiated. Switzerland may have a better voter turnout than LA city council elections, but it has some of the worst turnout in the OECD. If I was going to take a complete guess at the reason (as you did), I’d say it’s a cultural thing more than anything else (outside the influence of compulsory voting laws).
I can't say it's worse than other systems, but it's also far from obvious to me that it's better.
To have such a large society being run that way would have lots of unintended consequences. Imagine a country where voters are so burnt out on voting for every policy that the response rate becomes extremely low - and the only people who vote on any given policy are the slim minority with vested interests.
They mitigate the "low response" phenomenon by encouraging political awareness from a young age and trust in public institutions.
It's imperfect and it may be a massive task to make such cultural changes in the US, but I don't think there's anything inherently stopping it from scaling.
Likewise California's rural areas can get very conservative. There just happen to be a lot more voters living in the cities.
- Transportation. Any time you want to go somewhere by car or plane (and potentially bus/train), the government has decided it has the right to search you.
- Interstate commerce. Most economic activity is taxed, and may be searched passively or actively to ensure that the state gets everything it wants.
- Communication. TV, radio, telephone, and now the internet are monitored and adjusted by the state.
Brain-to-network communication will eventually be commoditized, and the state will transitively invent some justification for monitoring it. Thus you may think thoughts which are shared with the network and for which you will be held liable.
The only difference is that today you type the thoughts on a keyboard.
I wrote an essay about this in 2011, for university.
A much more likely scenario is that brainstreams would be stored in various tech giants' systems (and because analytics is important, ~nothing is discarded).
Instead of running a quick check, the police or official would pull your thought history. But no, not directly from the source. Of course not.
To maintain a veneer of legitimacy, the actual tech companies will not sell to governments or law enforcement. There will be a forest of brokers, who sell to anyone.
And if you thought advertising was invasive or creepy now, with this kind of technology we would be entering a world Philip K. Dick would have found too depressing to write about.
And observing from the outside, the US as a nation is still years - likely decades - from being able to stomach general warrants. But you are already well on your way toward something like that: reverse warrants are now a thing. Law enforcement in the US has already issued warrants for recorded information on all persons who were near an event, after the fact.
The logical conclusion with brain data archives will be to issue warrants for full mental activity history for X years on people who have had thoughts on a given unwelcome subject.
But even then, warrants are inconvenient. It'll be much simpler and easier to just buy access to the data in bulk, all the time. That avoids the awkward questions on storage, access and oversight.
A completely free, unregulated market is a bit like anarchy: it might work well if all of the actors were mature and aware of their responsibilities. Unfortunately the ones who wish for that the most are usually the ones least suited for it. They support these ideas not because of a noble goal of perfect freedom but because they are annoyed that there is a more powerful entity that gets in the way of what they want to do with/to others.
The problem with statements like this is that they are presently unfalsifiable. Even though you've stated it as fact, it's clearly only an opinion that you've arrived at based on your own perspective and experience. However, you can't prove it and nobody else can disprove it.
The general practice of extrapolating worst-possible motivation from every action is, in my opinion, one of the reasons why it's so hard to communicate rationally these days. If you envision everyone 'on the other side' of an argument as demons, maybe sub-human, why bother negotiating? Fuck 'em, maybe a meteor will strike and wipe them out or they will kill themselves off through one of their many vices.
Ironically, the hope for the brain-reading capability discussed here is that it might lay this bare, expose some of the incorrect assumptions, and start to heal some of the relationships between cultures and people.
Take a regular state and strip away all those regulations about consumer protection, environmental protection, substance abuse, contract enforcement, etc., and stop all the socialist stuff like public healthcare, public unemployment insurance, and public roads, and what you are left with is stateless anarchy. If you remove anything less, one can always claim there's regulation and no truly free market.
All of this is a free market in your definition, and is essentially how European states are run. But it's somehow not what "free market" advocates in the US want today.
On top of that, the European side feels far more heavily regulated than the US, which makes life easier for big companies but hinders small businesses and upstarts. While it may technically count as a free market, it feels like gatekeeping.
I would suggest that you read theory. This is not socialism.
What do you mean by "keeping in check?" What evil would companies do otherwise?
Dump trash or chemicals wherever it's most convenient, cut product quality to protect margins, read the emails of users who reuse their email password for your social network site (oh, wait a minute), working conditions that border on indentured servitude...
There's plenty of room at the bottom (as Feynman said in a completely different context).
Let's take a look at this one:
> Dump trash or chemicals wherever it's most convenient
When it comes to making these stop, government hasn't shown itself very effective at reducing the problem. How could that be changed? That's what I would find interesting to hear.
In practice, this has been eroded to the point of practical meaninglessness, such that the FBI director can issue a statement that clearly violates both its spirit and its letter, and few people bat an eye.
or you could just be black.
Government or Corporation should never have access to anything like this. They _never_ use it for good.
Good my lord, pardon me:
Though I am bound to every act of duty,
I am not bound to that all slaves are free to.
Utter my thoughts? Why, say they are vile and false;
As where's that palace whereinto foul things
Sometimes intrude not? who has a breast so pure,
But some uncleanly apprehensions
Keep leets and law-days and in session sit
With meditations lawful?
In the US, I have a right against forced self-incrimination. The rights against indiscriminate search and seizure and protection of speech also seem to apply.
Being able to read my thoughts would seem to negate my protection against the disclosure of self-incriminating speech (presuming thoughts are considered speech).
This frontier is all the more concerning while our dominant model of commerce on the internet is the exchange of content for attention. The attention market already drives businesses to develop sophisticated, targeted models of users, so that the businesses can most efficiently encourage addiction to their services.
Being able to tune those mechanisms in real-time could be disastrous. It is encouraging to see at least one government making an effort to get ahead of this, and their work can be an example for the rest of the world on how to protect people from a looming dystopia.
I say stupidly, because it is such an obvious thing, but one that might be ignored or forgotten, without a sustained reminder. In the face of such an understanding, the distinction between government or commercial privacy is meaningless. If that lack of privacy can have significant power over your life, the source doesn't matter. The content doesn't matter.
In some ways it feels like a waste to try to read the mind directly when humans can be predicted pretty well from their outward actions alone.
My point is, we will never have privacy from tech, because tech's most valuable field is ourselves. So we all respect other people's privacy, but we need to harvest every data point we can, because otherwise what will we do, kernels and drivers?
I, for one, do not plan to put anything in my brain I have not designed myself, and for sure nothing that auto-updates!
If I want to hack my consciousness, I’ll go read a book. Naked Lunch was enough of a brain bug, thanks. Or how about Black Mirror?
And once we get past not knowing wtf we’re doing, take a look at how we’ve weaponized every major scientific advance and tell me this would be the exception. Rabble-rousing and voter manipulation on social media will look like arts and crafts hour.
I believe in the way, and all that’s happening is natural. I don't think the collective organism of us would really create a hell for itself. Human civil courage and compassion is a golden guard against the truly reprehensible.
Let them try.
Also +1 switzerland. Hope to see that style of being on a global scale.
• Driving the news: Neuroethicists are sounding the alarm.
• Earlier this month the U.K.'s Royal Society published a landmark report on the promise and risk of neurotechnology, predicting a "neural revolution" in the coming decades.
• And next month Chilean lawmakers will propose an amendment to the country's constitution enshrining protections for neural data as a fundamental human right, according to Yuste, who is advising on the process.
• A major concern is that brain data could be commercialized, the way advertisers are already using less intimate information about people's preferences, habits and location. Adding neural data to the mix could supercharge the privacy threat.
• "Accessing data directly from the brain would be a paradigm shift because of the level of intimacy and sensitivity of the information," says Anastasia Greenberg, a neuroscientist with a law degree.
The intrusion into brains has already started. Schools in China are experimenting with brain-wave trackers to monitor whether students are paying attention in class. The trackers likely give false readings, so their accuracy is unclear. But the parents do not seem to care.
The picture is scary. They do not seem to care about the privacy issue even though the benefits are unclear. They are willing to sell out for... maybe nothing.
It seems at least possible that we will enter a future where similar people will buy the related product of "RFID hats" [branded with whatever acronym this technology adopts, and perhaps as an "also suggested by Amazon" to boot].
A. You can think "type these letters" and make your fingers type them, or you can route that same electronic message via bluetooth to the phone near you. This surgery is like going from dial-up internet speed to broadband. You can type much faster without fingers!
What is a thought? Is it the words you hear inside your head as you think? The images? Do those even have a concrete biological representation? Are we just talking about patterns in neural signaling cascades? If so, I doubt that “thinking about a tree” looks the same from one person to the next. Which means that these patterns will have to be learned from training data (once the sensing technology exists).
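To make that last point concrete, here is a minimal sketch of what "learning the patterns from training data" would amount to: supervised classification over per-user feature vectors. Everything here is hypothetical and synthetic (the "neural feature vectors" are just made-up numbers, and a nearest-centroid classifier stands in for whatever model a real decoder would use); no actual neural decoding is implied.

```python
# Hypothetical sketch: decoding "thought" labels would be supervised
# learning on per-user calibration data. All data below is synthetic.
import math
import random

random.seed(0)

def make_sample(center, noise=0.3):
    """Synthetic stand-in for one user's neural feature vector."""
    return [c + random.gauss(0, noise) for c in center]

# Imagined per-user calibration: each "thought" class has its own
# signal pattern, which would differ from one person to the next.
centers = {"tree": [1.0, 0.0, 0.2], "dog": [0.0, 1.0, 0.8]}
train = [(label, make_sample(c))
         for label, c in centers.items()
         for _ in range(50)]

# Nearest-centroid decoder: average the training samples per class...
centroids = {}
for label in centers:
    samples = [x for l, x in train if l == label]
    centroids[label] = [sum(dim) / len(samples) for dim in zip(*samples)]

def decode(x):
    """...then label a new sample by its closest class centroid."""
    return min(centroids, key=lambda l: math.dist(x, centroids[l]))

print(decode(make_sample(centers["tree"])))  # with this low noise, almost always "tree"
```

The point of the sketch is the dependency it makes visible: the decoder only knows the classes it was calibrated on, for the person it was calibrated on. A thought with no corresponding training data is invisible to it, which is exactly the "brain-API" limitation described below.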
Brains exist to decide to take actions. In the above case, companies with an interface to your brain will only "understand" your thoughts insofar as they map to actions you can take through their services, or "API", so to speak. In that sense, it feels like we've already crossed this line.
Let me be more concrete: unless there were, say, a bomb manufacturer with a brain-API that you “trained” and used regularly, nobody will be able to decode that you think about bombs in your spare time.
Unless, of course, a (hypothetical) brain computer interface just translates your thoughts into “stream of consciousness” strings of words - but if that were the case, how does that add value over a speech-to-text google search?
TL;DR, if you’re worried about the thought police, look around you - we’re already there. Your “thoughts” are just the actions you decide to take using technology.