This article is wrong on so many levels. First off, Amazon is definitely not selling "suicide kits", let alone advertising them. What the article actually describes is the automated "people usually buy this with" section, which suggests items commonly bought along with the one being viewed, meaning the viewer had to be at least aware of what they were searching for (they were probably looking for a specific chemical).
Second of all, and most important... WHY the hell were these CHILDREN buying stuff off of Amazon, somehow WITHOUT THEIR PARENTS' KNOWLEDGE? Did they just pay with their own cc and have it delivered to the house, all while their parents/guardians DIDN'T notice?
If their parents are so uncaring for their children, then I'm not even surprised they wanted to commit suicide.
It has been pointed out to Amazon that these items shouldn't be recommended together and their legal response was that they weren't going to change anything. Never seen an Amazon Defense Force post before.
This was annoying for me too. I went to Amazon and searched for "suicide kit" and found nothing. I searched for whatever that chemical was and looked for concerning related items, and still saw nothing.
As documented in the article, eBay and Etsy stopped selling sodium nitrite in lab-grade purities because of this; Amazon has not. It's not at all weird for parents to let their children buy things on Amazon, especially with their own money. If you're having trouble thinking of ways this could happen, you could read the article, which describes a few examples.
Visa and Mastercard both have gift cards available at gas stations, grocery stores and corner stores everywhere you go. There is no age restriction with those. You just need enough cash to buy one.
Likewise, a lot of banks have changed their debit cards to VISA debit cards so they can be used like credit cards, again regardless of age.
As a developer, as a data scientist, as a designer, as a business leader, you are responsible for things like this. Do not make excuses. Do not disclaim liability. Acknowledge the mistake and fix it. We can do better.
These journalists, on the other hand, have parroted the lawsuit’s PR campaign for clicks, and in doing so, have actively, purposefully, and knowingly spread this social contagion farther than Amazon ever could through their inadvertent recommendations.
I don't see how you can prevent this from happening without banning recommendations completely. There are lots of ways to combine items so that they become dangerous. I don't think even a human in the loop could prevent "suicide kits" reliably.
I went to my local hardware store the other day to buy rope. The owner recommended some good rope options, but he didn't offer me a handy guide to tying a noose and killing myself.
So it's possible. I'd venture to guess that store has never actually given teens advice on how to kill themselves. If it did, I bet it wouldn't stay in business long.
We don't have to excuse these big tech companies for the consequences of what they do. We just don't. We can demand they don't freeze accounts and keep people's money without letting them talk to a human, we can demand they don't recommend harmful products or send counterfeit goods to people, we can demand they don't evade labor laws or local zoning laws.
We don't have to put up with this bullshit. I don't give a fuck if Amazon's recommendation engine is cost-effective for them. Their business model isn't my fucking problem. They need to exist in society in a healthy, positive way just like everyone else, or they can go out of business.
But would you ban stores from also recommending a book on knots when you buy a rope? After all, the book also contains instructions on how to tie a slip knot.
I am pretty sure that if a normal-sized store were doing that regularly in a community, and a bunch of teenagers ended up dead, and concerned parents walked in and begged them to stop, and they told the parents to fuck off, there would be a fistfight. The owner of the store wouldn't feel comfortable going to local restaurants. There would be consequences.
We live in a fucking society, with people in it. These tech giants are wrecking major parts of it. The degree to which everyone in this thread is just utterly blind to the moral and interpersonal obligations of corporations and unwilling to insist on something better is really something.
> We don't have to put up with this bullshit. I don't give a fuck if Amazon's recommendation engine is cost-effective for them. Their business model isn't my fucking problem. They need to exist in society in a healthy, positive way just like everyone else, or they can go out of business.
You’re so ready to make everything worse for everyone just because some two-bit journalists’ clickbait has put you into a state of panicked hysteria over nothing. Your lack of control over your own mental state should alarm you more than this story.
There is room for others to firmly believe that Amazon could simply preclude some items from “customers also bought” product suggestions, when those product combinations create serious issues. That’s all many here are asking to see changed.
I'm going to assume you're from the same country that allows 16-year-olds to buy guns, and people of any age to buy guns at a swap meet with no paperwork or whatever.
I'm sure the gun nuts will come out and correct me if I'm wrong. But I sure know the easiest way to kill myself if I find myself in the US, and it's not ordering some obscure chemicals.
That's why you need to make a real effort to prevent things like this from happening (identify critical areas, make the necessary changes), and when it does happen, act immediately.
I'm generally on the side of believing that Amazon is not legally responsible for what people do after buying things that other retailers sell them on their marketplace. That's a bad road to go down, and I'm afraid it would lead to unintended consequences if applied broadly.
I also do not believe that Amazon is headed by a cabal of cackling villains who want kids to die, because they make $2.39 every time one of them buys poison on their marketplace. Instead, I think they're a big company that has a bunch of regular people focused on a bunch of different goals, and not really paying attention to this kind of thing until it becomes an issue. Hanlon's razor, etc.
BUT, given the bad press, why wouldn't Amazon put high-purity Sodium Nitrite behind some kind of age verification? They do have that process in place already for other items. Why not override the (algorithmically-generated) "suicide bundle" with something else? I'm sure they've done that in the past, too. Why do they just not do anything, and instead allow themselves to get caught up in a lawsuit and confirm people's beliefs that they're cackling villains? Seems like an unforced error.
This story had me thinking about the different responses from Google and Amazon to this kind of thing. If you search for topics related to suicide on Google, they show you the suicide hotline number at the top of the results. If you search Amazon for suicide-related products, Amazon will sell you a whole kit of supplies to go through with it.
Which is why the comments about "well, algorithms are hard" fall flat. Google has complex algorithms too, but they can still add a safety layer. People have specifically asked Amazon to stop selling suicide kits, and as far as I have read, Amazon refused to change. The issue is not that Amazon cannot fix this, but that they don't want to. Hopefully this recent bad press will lead them to change their tune, but it is worrying that they can't see the best course of action without bad press.
> Amazon says that searches for the word “suicide” present customers with a banner at the top of their search results with the phone number for the National Suicide Prevention Hotline.
If my memory serves, there have been studies evaluating the effectiveness of these measures. I don’t recall any off the top of my head, but researchers have wondered the same thing and looked into it. The data is out there for someone more motivated than myself. :)
If I have to rate my digital content, surely Amazon shouldn't be selling sodium nitrite to minors, as they are unlikely to use it for preserving processed meats. Apparently their recommendation engine blindly picked up on the trend that people were buying it for suicide, and it started recommending it for that. In 2022 there is just no excuse for this; nobody should be deploying software in production that is capable of this kind of harm. They had decades to iron out these kinks.
> If I have to rate my digital content, surely Amazon shouldn't be selling sodium nitrite to minors, as they are unlikely to use it for preserving processed meats.
Did you actually think about this for more than a second?
Despite all this moral-panic bullshit, I would wager the majority of minors purchasing sodium nitrite are doing so for its legitimate uses. Teenagers with cooking or aquarium hobbies are not that rare, and sodium nitrite suicides seem to have been in the single to low double digits for most countries, which makes the reported 150 to 200 percent year-over-year increases sound all the more sensational.
This is stupid. Might as well ban ropes.
Some of you might want to look up the stats on intentional and unintentional Tylenol toxicity and the resulting deaths: the numbers are in the 1000s and 100s more, respectively.
The article says that the Amazon recommendation algorithm was also recommending along with the sodium nitrite, "a scale to measure a lethal dose, a drug to prevent vomiting, and a book with instructions on how to use the chemical to attempt suicide". This is beyond just suggesting the chemical itself, this is almost suggesting to certain users to commit suicide.
I mean I don’t automatically believe this bullshit based on the source - it’s coming from a law firm with very specific priorities in producing a narrative.
Having known about 60 Minutes scandals over the years, I am just as likely to believe they withheld this story because the facts didn’t even meet their questionable standards. Some emotional grieving parents and a law firm are all you need for them to turn this into another media circus.
Maybe their recommendation system was aggregating. Or maybe it was just suggesting specific things the user had viewed or purchased recently.
I’ll withhold judgement until less charged “facts” are out in the open.
For a forum that so often likes to cite Gell-Mann amnesia: how about fucking now?
The way I imagine this working is: you buy X, Y, and Z. If enough people buy these together (along with other things), the algorithm will deduce that if you buy X you probably also want Y and Z.
It will recommend them. This is not new, and there are plenty of examples.
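For the curious, here's a minimal sketch of what that kind of co-purchase counting might look like (toy data and made-up item names; Amazon's actual system is unknown to us and certainly far more elaborate):

    from collections import Counter
    from itertools import combinations

    # Toy purchase history; each order is a set of items.
    orders = [
        {"rope", "carabiner", "gloves"},
        {"rope", "carabiner"},
        {"rope", "gloves"},
    ]

    # Count how often each pair of items appears in the same order.
    pair_counts = Counter()
    for order in orders:
        for a, b in combinations(sorted(order), 2):
            pair_counts[(a, b)] += 1

    def frequently_bought_with(item, top_n=3):
        # Rank co-purchased items by raw co-occurrence count.
        scores = Counter()
        for (a, b), n in pair_counts.items():
            if a == item:
                scores[b] += n
            elif b == item:
                scores[a] += n
        return [other for other, _ in scores.most_common(top_n)]

    print(frequently_bought_with("rope"))  # ['carabiner', 'gloves']

Nothing in that counting knows or cares what the items actually are, which is exactly how a harmful combination can surface.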
> nobody should be deploying software in production that is capable of this kind of harm.
This "kind" of harm seems to hide a huge amount of complexity. Let's assume the algorithm starts off on day 1 just looking for purchases that tend to appear in clusters. There's a potentially unbounded number of clusters that could be regarded as harmful. The options are surely:
1. Be vigilant and special case items one by one. I presume they are doing this. Maybe they aren't putting enough resources behind it or not reacting quickly enough but even if they were then things would always slip through.
2. Never use any kind of bundling algorithm ever.
(1) is a forever-ongoing task, so it doesn't sound like you're referring to that, since it's not something where you can "iron out the kinks".
However, it also doesn't sound like you mean (2), as you seem to be saying there's a way to fix this once and for all.
Is there a third option I haven't thought of?
EDIT - I guess maybe all "bundles" could require human curation before they are allowed to be displayed. It would be a gargantuan task but it's a valid third option.
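That third option would amount to default-deny with a review queue. A minimal Python sketch (illustrative only, not a claim about how Amazon works):

    # All algorithmically generated bundles start out hidden; a human
    # reviewer must approve each one before it can be displayed.
    approved = {}      # bundle -> True/False reviewer decision
    pending = set()    # bundles awaiting review

    def display_bundle(items):
        bundle = frozenset(items)
        if bundle in approved:
            return approved[bundle]   # show only what a human approved
        pending.add(bundle)           # queue new bundles for review
        return False                  # default-deny anything unreviewed

    print(display_bundle({"item-a", "item-b"}))   # False until approved
    approved[frozenset({"item-a", "item-b"})] = True
    print(display_bundle({"item-a", "item-b"}))   # True after approval

The gargantuan part isn't the code, it's staffing the review queue.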
There's a hell of a difference between recommending a cute shirt to pair with some cute earrings and pairing an industrial chemical with an item from the pharmaceuticals category. There are extremely basic and straightforward safeguards that could have been put in place that simply do not exist; even something as crude as the check sketched below would be a start.
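A hypothetical denylist of category pairs, just to show how little machinery this would take (the category names here are made up for illustration):

    # Hypothetical category pairs that should never cross-recommend.
    BLOCKED_PAIRS = {
        frozenset({"industrial chemicals", "antiemetics"}),
        frozenset({"industrial chemicals", "precision scales"}),
    }

    def allow_recommendation(viewed_category, suggested_category):
        return frozenset({viewed_category, suggested_category}) not in BLOCKED_PAIRS

    print(allow_recommendation("industrial chemicals", "antiemetics"))  # False
    print(allow_recommendation("apparel", "jewelry"))                   # True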
Why is it amazon’s responsibility to safeguard your children? That’s your responsibility.
They can still figure out how to buy the items separately, and Amazon isn’t inducing suicidal ideation.
Perhaps your kids shouldn’t have unrestricted and unmonitored access to order things from Amazon? Start there, instead of insisting that Amazon do the parenting for you.
If you can't see the difference between things like "Amazon shouldn't sell dirty magazines to kids" and "Amazon should have basic protections in place to avoid selling lab grade chemicals along with drugs to facilitate a painless suicide as a bundle to whoever wants them with free shipping," I don't know how to have a reasonable conversation.
It's not about parents having oversight over kids. It's about mental illness: "kits" like this shouldn't exist at all for kids or otherwise. You use it once and you're dead because that's the whole point of the combination of goods. We as a society should be past the point where we need to discuss whether selling a literal suicide kit is okay or not. It's not, and we know from research that making suicide more accessible and convenient leads to more suicide. The only outcome of allowing Amazon to do what it's doing is additional deaths that could have been prevented.
> Why is it amazon’s responsibility to safeguard your children? That’s your responsibility.
Indeed. And we intend to exercise that responsibility to safeguard our children by stopping Amazon from mixing suicide kits in with shipments of innocuous household goods.
You intend to exercise your responsibility by abdicating it, and instead insisting that Amazon and other retailers do it for you, and society at large bear the cost?
I think we disagree on the definition of “responsibility”.
Yes. Society can bear the cost of having to jump through a few extra steps to buy pure concentrated chemicals that have been demonstrated to repeatedly kill teenagers.
So the people who have legitimate use for those chemicals can jump through a few hoops to get them.
And Amazon, a trillion dollar company, can do a better job with its algorithms and stop recommending them and selling them to teenagers.
I am super comfortable inflicting those costs on the world.
I’m sure you’re comfortable with whatever authoritarian bureaucratic nonsense would help calm your social media-induced panic attack of the hour, but that’s not a standard of evidence under which I’m willing to make the world a bit worse.
It sounds like we won’t agree on ethical fundamentals, much less this topic, so I wish you well, and hope you can source enough bubblewrap to feel safe existing in the world.
The world is worse because a trillion-dollar company doesn't have the incentive to operate responsibly. Because they don't care, and shrug off responsibility at every turn. Because they are not willing to uphold any standards. Nobody expects bubble wrap, just a recommendation engine that's safe around chemicals. What an expectation of a trillion-dollar company: please don't recommend dangerous chemicals to children, or information that tells them how to harm themselves.
By the way, your car is not a death trap precisely because people like us nagged the industry to do better. You're welcome!
It’s not authoritarian nonsense to have and enforce basic principles of product safety in commerce.
For evidence refer to the fact that your family hasn’t died of botulism or a falling building facade.
This is pretty basic stuff. It’s not a social media panic attack; it’s an active lawsuit backed by sworn statements and evidence, and the fact pattern is genuinely appalling. I get the skepticism at the headline, but read through the details.
To be clear, Amazon isn't just selling these things. They're actively promoting their sale, together. Their systems are literally encouraging the combined purchase of multiple products from different sellers in different categories to create a deadly suicide kit. Are you suggesting Amazon has no responsibility for what their systems—unprompted—promote?
It does not seem like they are doing (1) in any sufficient way: it's pretty easy to argue that this is by far the most important special case (it's literally life-and-death stakes), and they are failing at it even after a growing PR campaign and advocacy made them aware months (years?) ago.
Amazon is a $1T company. They have the resources. This is willful negligence.
This is a good example of how automating a task, such as selecting which products are related and displaying them on an online store’s product page, does not remove all the work. Some work may be removed, but new work is added to compensate for what the algorithms cannot trivially do, such as considering whether the links between these products are ethical to present to users.
It seems to me that companies massively underestimate (or willingly don’t compensate for) the importance of this newly introduced work, only cashing in on the savings from the work removed. Other examples of similar behavior include automatic moderation, which also massively fails to remove actually dangerous content.
This kind of behavior looks like greed to me, but I guess it’s legal, and needs more regulation.
Exactly this. Amazon's recommendation engine doesn't have domain knowledge about chemicals and such, like what can be mixed with what. Its recommendations may mislead people.
How would that even address the problem? Someone buying these things to commit suicide will still buy these things even if they need to do a few more clicks and searches because the items aren't conveniently bundled or recommended.
> EDIT - I guess maybe all "bundles" could require human curation before they are allowed to be displayed. It would be a gargantuan task but it's a valid third option.
This is the correct way to handle it. It being a "gargantuan task" doesn't matter. Everybody would freak out if the local supermarket advertised this kind of product pairing, yet somehow "tech" (read: online retailers) gets away with it because they're big?
Amazon posts billions in profits; it can certainly afford to control its algorithms. If that's impossible, then stop using algorithms and use human curation like literally every brick-and-mortar business.
Nothing slipped through. Amazon has been fully informed about it, is doing it on purpose, and is engaging in expensive legal battles to be allowed to keep doing it.
This story is so much worse than the headline suggests, despite that seeming impossible.
If they artificially limit their "AI" from outside pressure, then they will have admitted that they can do it, and they'll get pressured to do it more and more. They don't want to deal with that, and would rather wait until a law is passed "forcing" them to do so - coincidentally Amazon will be best positioned to comply with that law and for some reason it'll only ever be used to go after smaller competitors.
They don't want to establish the precedent that recommending a harmful item makes them liable for that harm, at any cost, even at the cost of a few dead children. Before this, in a previous scandal, they were promoting "literature" that suggested parents of autistic children cure them with bleach, and recommending the bleach those parents could inadvertently murder their children with.
> Amazon shouldn't be selling sodium nitrite to minors
The category of things that "might be harmful to minors" is impossibly broad. I suspect Amazon will be removing this chemical from their listings, just as Amazon and eBay constantly add new items to their banned items list. There are several chemicals that I ordered from Amazon in the past that now can't be found online, and I expect it's only a matter of time until pure sodium nitrite also disappears.
The bigger question is: Should we be forcing every online marketplace to perform strict age verification of every customer? From the article:
> Kristine was able to create an Amazon account even though she was under 18, skirting Amazon rules against underage account holders—the lawsuit notes that Amazon does not verify age.
If a minor can get ahold of a credit card, it's not hard to imagine that they can also get ahold of their parent's drivers license to go through an arbitrary age verification too.
Even if they did perform age verification on accounts, how would you expect Amazon to verify that the person pressing the Checkout button is actually the parent, and not a minor who sat down at their parents' computer and went to Amazon.com?
It's really easy to make blanket dismissals of difficult situations in comments ("Amazon shouldn't be selling X to minors...") but the reality is that the only way to prevent these things is to remove them entirely from the marketplace. Amazon does remove a lot of potentially misused medications and chemicals from their marketplace, but the list is never going to perfectly capture every single combination of items that might trend as being harmful.
What would you propose as a practical solution? A blanket ban on anything that might be harmful?
Some people believe in the right to suicide. I don’t think that’s appalling at all. What’s appalling is how America prolongs the suffering of terminally ill or those with horrific chronic diseases who want out. Agreed on the teen part though.
Some people believe in the right to doctor-assisted suicide. I don't know anyone who advocates that suicide drugs should be available over the counter to anyone who's feeling suicidal today.
Why is it legal to buy a gun but illegal to decide to end your own life? The hypocrisy is through the roof. How are we free if we are forced to live?
Btw: we should not encourage suicide. I think we need to help people who are in a position where they are considering it, and there are also things that are deeply f'ed with our society to begin with that slowly push people toward this (i.e. helping suicidal people is a drop in the bucket if we don't fix the fundamental issues).
>Why is it legal to buy a gun but illegal to decide to end your own life?
AFAIK, in the US, suicide is not a crime. Assisting in another person's suicide is a crime in most states.
As the post you're replying to said, we could allow physicians to offer painless and dignified deaths after the patient proves they are of sound mind or will never recover from a horrible condition. Much like the courts recognize there can be reasonable limits to purchasing and possessing guns. Admittedly, I don't know what fraction of those who commit suicide do so because of a temporary mental state they could recover from with help, but I'm guessing it's the majority. If so, it'd be inhumane to remove basic barriers that might save people, even if it'd make it easier for the smaller group of people who "rightly" believe they would rather die.
This is a complex topic that touches on so many sore spots in society. But that's when I most doubt all-or-nothing stances. They tend to come from strong emotions rather than reason.
So let's be real for one sec. As an adult, you will find a way to kill yourself if that is the goal.
We are hypothesizing that people don't actually want to do it and that we are doing them a service by preventing it. This could be the case, but how many people decide to end their life on a whim?
Re: recovering with help. That was one of the main things I had in mind. If the factors that lead you to suicide and/or drive you insane are not removed, do you actually believe that stopping you from the final act is going to be effective? As a society we don't give 2 shits about people who are struggling. Money is our religion; everything else is secondary :(
First of all, there are numerous studies showing that barriers to suicide are extraordinarily effective at preventing it: the vast majority of people who commit suicide do so on a whim, and things like barriers in subway stations or on top of tall buildings, foul-smelling gas, and others drastically reduce the overall number of suicides in large areas, not only in the specific places they guard.
Secondly, in many studies of attempted suicides, the vast majority of people who survived one suicide attempt did not try again (even though the risk of further attempts is much higher after an initial attempt than in the general population, it is still quite low: around 13% in one 35-year follow-up study).
This doesn't even count the huge number of people who have suicidal thoughts and don't actually attempt suicide, at least sometimes because of barriers like those mentioned before, which prevent a suicidal thought from turning into a rash decision like jumping in front of the train or off the building.
In fact, for many people, this type of "near miss" offered by such barriers is exactly what allows them to realize they need urgent help: they don't want to die, but had been ignoring similar thoughts as mere fancies until the moment they realize they would actually have done it and may try again if they don't seek immediate help.
I don't have the numbers, and I don't know how many people do so on a whim. Certain measures work in certain situations (if there is a physical barrier, sure).
My bigger point was that nobody decides to kill themselves for fun, and what needs to happen is systematically helping people, not knee-jerk reactions like banning X or Y. I may be wrong.
The point, IIUC, is not in being forced to live - it's that ending your own life may, in this way or another (like in this example with Amazon), explicitly or implicitly, encourage others to end theirs who would not have gone for it otherwise.
Honest question: how many people do you believe end their life like this because of the recommendation, vs. having actually researched it beforehand and just buying the stuff?
If it's because of the reco: how do we ensure that we ban all other possible lethal combos? Disable recommendations? Ban certain things?
> I don't know anyone who advocates that suicide drugs should be available over the counter to anyone who's feeling suicidal today.
This is how you end up with acetaminophen banned from gas station convenience stores and locked up behind the pharmacy counter. Or, for that matter, prohibition of alcohol, undoubtedly the most popular way to slowly kill yourself.
I believe in the right to suicide, and I also think transaction costs matter. What if we had 25-cent suicide booths a la Futurama on every street corner? Sometimes people are going through hard times, and it needs to be easier to seek help and connect with other humans who care than it is to off yourself.
If you believe in a right to suicide, don't you believe in a right to making it as low cost and accessible as possible? If you have to afford it and you have to get past a gatekeeper it doesn't seem much of a right.
No, because suicide is also an extremely dangerous option to other groups of people - some mental illnesses cause suicidal ideation, and people who suffer from them are often terrified in their moments of clarity of the possibility that they may actually do it in their moments of "insanity".
The vast majority of people who attempt suicide actually regret it afterwards, and have no deep desire to die. Making it hard to actually kill yourself is an extremely important safety net for these people, who outnumber the terminally ill who actually wish to die by a large margin.
Not to mention, people can be compelled to attempt suicide by social pressure, either in bullying or by family members who view them as a burden. Having to go through a neutral third party who can help make sure you actually deeply desire this step is critical to protect you from others.
Well then you basically do not care about suicidal people.
Because, if you had ever spoken to anyone who has felt suicidal, you would see that most of them admit to having had couple-second lapses in judgement, and are extremely glad that they did not have a 5-second suicide button they could press.
So I, being someone who actually cares about these people and who has spoken to them, know that most people who are suicidal do not want to be given this option.
You would know this if you had ever spoken to someone in your life, ever, who has had depressive or suicidal thoughts.
> You would know this if you had ever spoken to someone in your life, ever, who has had depressive or suicidal thoughts.
Actually I'm a trustee of a registered charity that provides support to people with a wide range of serious welfare and mental health issues and have dealt with them in practice multiple times, providing a great deal of support from my own personal time to people struggling.
I'm sorry that you made an incorrect assumption. I'm afraid that I find it's pretty typical among people who think they should have a right to control others.
Oh, fuck off. I am one of those people, I have been for the last 15 years. And I believe I have the right to go peacefully when I decide to. That others force me to suffer because they might be sad is incredibly selfish of them. So take your condescending shit somewhere where you have any clue in what you're talking about.
Do you have any friends who have attempted suicide, failed, and now regret the attempt immensely?
Do you wish they had succeeded in their suicide attempt?
I am talking about a situation where they regret the decision, are alive now, and are glad that they are alive, and that the attempt failed.
For that situation, do you think they should have died, even though now they admit that it was a mistake and are glad that they didn't go through with it?
> I have been for the last 15 years
This wasn't about a situation where someone has made a decision, and been certain of it for years.
Instead, if you read the conversation, it was about whether someone should be able to make a quick decision, within literally 5 seconds.
Do you understand how giving someone a decision, in this short a period of time, is extremely different than if someone has thought about the situation for a long time, and is very certain?
Do you not understand that suicide affects everyone around the person? It's an incredibly selfish decision and definitely should be done with more than 5 seconds of thought.
You are plainly ignoring the seriousness of the decision and treating it like any other bodily autonomy discussion. Multiple people have told you about how people with suicidal ideation operate and how serious this topic is. Multiple times.
You choosing to ignore it all is, plain and simple, obtuse. Period. If you don't want to be called obtuse, don't be obtuse.
I strongly believe in the right to assisted suicide for the terminally ill, but I also believe that it shouldn't be as easy as placing an order on Amazon. Suicide is overwhelmingly an issue with folks who are mentally unwell, not physiologically. There's no excuse for what Amazon is doing here.
And why would you deny the same right to the mentally ill? It's just as much suffering as any other chronic illness, sometimes more. Either you believe in bodily autonomy or you don't.
There is a fundamental difference between a disease that causes pain that leads a rational, terminally ill person to choose a dignified death and a disease which induces pain and suicidal ideation, but which is treatable. Mental health treatment is available and mental illness is almost never terminal.
> afraid they may actually do it in a moment of insanity.
That's certainly not a universal experience. There are rational and valid reasons a mentally ill person may want to die, and other people should not deny them that right. It's inhumane torture and a complete disrespect of their personhood to deny their capacity to so reason and force them to live in suffering.
There are, but they are vastly outnumbered by the non-rational ones. There are extremely common drugs used to treat depression (SSRIs) that cause suicidal tendencies in people younger than 24, for example. Depression itself causes suicidal ideation, as do many psychoses.
If you actually want to help someone suffering from this, you must check that their desire to die persists for a long enough time, and is not simply a psychotic episode that they would regret later. This is often easily distinguishable, as people who start having thoughts of suicide are often terrified of actually following through just moments later.
It is also an extremely well-established fact that the vast majority of people whose suicide attempts are thwarted are thankful for it, and that cutting off access to the easiest suicide methods drastically reduces the total rate of suicides: most people who commit suicide don't want to die, they are just having an irrational episode.
> There are, but they are vastly outnumbered by the non-rational ones.
Good luck backing that one up.
> This is often easily distinguishable, as people who start having thoughts of suicide are often terrified of actually following through just moments later.
I think this is a terrible criterion, because even if you've made the right decision, death is still incredibly scary, and so is dying (which can go wrong, especially when efficient methods are withheld).
Your overall point sounds good superficially, but I don't see why you think it's within your rights to be a gatekeeper for me. It's none of your business, __at all__. You can want to help all you want, but forcing your "help" on others isn't within your rights.
Again, speak to some actual people or organizations who are or work with the mentally ill.
> Good luck backing that one up.
Here[0] is a study that followed 98 people for 35 years after an original suicide attempt and found that 13% of them eventually died by suicide, some up to 15 years later. I would say that clearly shows the vast majority of people who attempted suicide (87%) did not intend to die.
> Your overall point sounds good superficially, but I don't see why you think it's within your rights to be a gatekeeper for me. It's none of your business, __at all__. You can want to help all you want, but forcing your "help" on others isn't within your rights.
You're again missing a fundamental aspect of mental illness: many people who have a mental illness need an advocate to help keep them safe from themselves. Not just for suicide, but also self-harm, extreme risk-taking behavior, substance abuse, and others. You don't just let late-stage Alzheimer's patients wander outside: even though freedom of movement is a fundamental human right, they don't know where or why they are going and can easily harm themselves in the process.
My first thought was that maybe the parents should blame themselves instead of figurative rope-sellers for being a failure of a parent who let their kids go to the point where they attempt suicide, while claiming they were not even "aware of it". But that's probably too jaded for here.
Yeah that’s super jaded. Seriously consider stepping back and thinking about how much grief the parents are going through.
I have a young child who has voiced suicidal thoughts, and it’s so fucking hard. The pandemic has been really rough on a lot of kids.
Thank god we’re finally able to get professional help (after almost a year of waiting for an available therapist).
You don’t know shit about the circumstances of these families, and jumping to the conclusion that the parents are at fault and not at least considering that a $1T company could lift a finger is kinda messed up.
Sorry for the angry tone but I’m seeing a few folks make comments like yours and it’s infuriating. This isn’t some joke topic and the Silicon Valley contrarian attitude really is frustrating to see on this thread.
This isn’t a contrarian attitude, it’s a “parent your own children instead of insisting corporations parent all of us” attitude.
I feel for the parents’ grief. I also understand how difficult parenting a troubled teenager must be.
I also categorically reject the idea that Amazon is culpable, and I think that these particular parents, in their grief, are merely attempting to externalize their feelings of guilt.
They want to rage at something or someone other than themselves.
Asking Amazon not to bundle these products, and to put up a suicide hotline suggestion instead, is not “insisting corporations parent all of us”. It’s insisting on basic decency.
And I *AM* parenting my children when I insist on this. The only thing I’m thinking about is if my kid goes to a dark place again and starts searching the internet.
But please: why don’t you tell me how to be better parent to prevent this?
Because it's not entirely their fault. Providing easy access to suicide is dangerous to people who suffer from mental illness, children or no. They may well have been seeking help for their children, but then along comes Amazon and sells them easy access to something they can use in a moment of psychosis to harm themselves.
Saying that Amazon is not culpable is like saying Amazon would not be culpable in an overdose death if they were selling heroin to children in withdrawal, because the parents should have been more careful. Sure, they should have, but that doesn't discharge the blame; the source of the substance is also culpable.
How did their children have unrestricted internet access, a credit card, an Amazon account, and the ability to place an order and have it shipped and delivered to their home without any form of parental review?
I’m not going to say it’s the parent’s fault, but they’re a hell of a lot more culpable than Amazon, which isn’t any more culpable than a hardware store selling rope.
Literally addressed in the case, which is part of the complaint.
~~~
The circumstances surrounding Amazon’s sales to both Kristine and Ethan were highly irregular. Amazon has a policy that people under the age of 18 can only use the service with the involvement of a parent or guardian. However, Kristine, at just sixteen, had created her own account to purchase the poisonous chemical and was never asked her age when she set up the account. The package delivered to Kristine’s home was addressed without a last name. It read only “Kristine.”
Seventeen-year-old Ethan used the account that belonged to his mother, Nikki, to purchase Sodium Nitrite. When Nikki received the email receipt for the purchase, she immediately called Amazon’s customer service to tell them there must have been some mistake and that nobody at her home had ordered the item. Amazon told Nikki the order was cancelled. Instead, the Sodium Nitrite was delivered to her home four days later.
> … created her own account to purchase the poisonous chemical
Everything is poisonous in excess. These are scare-words written by a lawyer looking for a big payout.
> … and was never asked her age when she set up the account.
She had to supply a credit card and agree to the terms of service.
Do you want Amazon to engage in privacy intrusive age verification? Maybe we should be forced to upload government ID?
Or maybe you should be responsible for parenting your own children, and not trying to parent the rest of us.
> The package delivered to Kristine’s home was addressed without a last name. It read only “Kristine.”
So?
> Instead, the Sodium Nitrite was delivered to her home four days later.
Yeah, that happens.
Guess what’s going to happen even more now?
All the irresponsible journalists driving this social media panic are advertising sodium nitrite to the world as an effective method of suicide. People who had never heard of it or considered it are going to now.
Not all mental illness that leads to suicidal thoughts is so cut and dry. Children can suffer from schizophrenia or bipolar disorder and have hallucinations or other psychotic states that make them suicidal without any associated trauma.
At their scale, it is entirely possible that this wasn’t malice, just bad algorithms/poor testing. All they had to do was say sorry and fix the recommendations. Instead, they’re lawyering up for a fight.
Just how much profit are they making from these kits? Must be negligible to them? I don’t understand why they think it is worth fighting this? Or is it their policy to always fight, never back down, never say sorry etc? I honestly don’t understand. Even ignoring ethics and looking at it from only economics, it doesn’t make sense to me. What am I missing?
The combination of items is a suicide kit, but as far as I can tell Amazon didn't have a literal "suicide kit" item (chemical + drug + scale + booklet) in the catalog, it's just about "others who bought this also bought...".
What difference does that make? If a child were buying this chemical from Wal-Mart and the cashier checking them out said, "Oh, looks like you're trying to commit suicide. Have you considered buying some medication to keep those toxic chemicals down, or a book that will walk you through everything?" wouldn't you find that at all problematic?
If we can hold companies liable for what their employees recommend, I don't know why we wouldn't also hold them liable for what their recommendation engines recommend.
Furthermore, according to the article, the sodium nitrite mentioned in the lawsuit has a higher purity (99%) than the sodium nitrite used for processing meat (6%).
While I agree that it seems very problematic that these ingredients are so accessible, I also have to challenge this:
> They had decades to iron out these kinks.
Aren’t the developments in algorithms that lead to these kinds of unexpected outcomes a bit more recent?
This seems similar to the content moderation problems the social media giants are having.
The sheer volume of products in Amazon’s marketplace is such that these kinds of things seem bound to happen when applying recommendation engines.
This is not to imply that such recommendations are acceptable; my primary contention is that this doesn’t seem like a problem that has been well solved anywhere yet.
> Aren’t the developments in algorithms that lead to these kinds of unexpected outcomes a bit more recent?
No. Unexpected outcomes have been pretty normal for search engines (and really, any complex computer system) since... forever, and really, the solution, regardless of input, is easy: on input, if (keyword in prohibited_phrases) disable recommendations; and on output, if (keyword in recommendation_results) disable recommendations to the user. I do believe this was solved sometime prior to E. F. Codd's 1972 "Relational Completeness of Data Base Sublanguages".
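Taking that suggestion literally, a minimal sketch (the phrase list and function name here are hypothetical, purely for illustration):

    PROHIBITED = {"suicide"}

    def safe_suggestions(query, suggestions):
        # Input-side check: a flagged query disables suggestions entirely.
        if any(p in query.lower() for p in PROHIBITED):
            return []
        # Output-side check: drop any suggestion containing a flagged phrase.
        return [s for s in suggestions
                if not any(p in s.lower() for p in PROHIBITED)]

    print(safe_suggestions("sodium nitrite",
                           ["sodium nitrite suicide", "sodium nitrite salt"]))
    # ['sodium nitrite salt']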
I was referring to the product pairings, not the search result autocompletion (which does seem like a pretty glaring problem).
Cutting off a search early is easy. Deciding that “even though these are commonly purchased together, this pairing shouldn’t be shown” seems like a fundamentally different problem.
To me, it shows the limitations of 'AI' in 2022. We keep calling software 'intelligent', saying things like 'we don't know exactly how it works', and telling people it is going to take their jobs. That all seems like hubris to me. Situations like this show that AI is not intelligent at all and has a long way to go before it is anything like intelligent.
>> as they are unlikely to use it for preserving processed meats
Maybe, but as long ago as age 12 I was buying it from the local pharmacy to make awesome smoke bombs.
>> In 2022 there is just no excuse for this; nobody should be deploying software in production that is capable of this kind of harm
You realize the #1 founding reason of the internet outside of sharing academic research was to distribute an entire "book" capable of this type of harm?
The purity of the chemical that Amazon is selling is high enough to make it inappropriate for virtually any other (consumer) use case besides suicide. This isn't the same as what you bought when you were 12.
That's absurd. The fact that most products contain a lower concentration is not inconsistent with this compound's concentration being practical. The "purity" is also a non-issue: the harm is in the amount.
While I buy 100% sucrose in 10lb quantities, the cupcakes I make with it are a small fraction sugar. It would be very difficult to gauge my recipes if the concentration of sugar I bought varied per bag. Sucrose is explosive if atomized and ignited, or if mixed with oxidizing agents.
The same is true of salt (deadly at 3g/kg body mass and sold in 5-10lb quantities, by the way), which is important when I mix up a dry rub or table seasoning.
Why did you speak universally for something you have copious evidence to dispute?
> It would be very difficult to gauge my recipes if the concentration of sugar I bought varied per bag
The US government sets the purity for sodium nitrite for use in food. You can't buy food-grade sodium nitrite at other concentrations. And using too much sucrose or salt in your recipe doesn't make your cupcakes lethal.
Comparing salt to the toxicity of sodium nitrite is an absolutely bonkers argument. You'd need about 225g of salt (about 3/4 cup) to be lethal for a 75kg person. Good luck eating that much salt. By comparison, you need 5g of sodium nitrite.
Please explain how increased purity makes the compound less suitable, not MORE, for household use? Our “kitchen lab” has all sorts of chemically pure ingredients, and I feel comfortable knowing there aren’t random impurities in my “house grade” feedstocks.
The purity of food-grade sodium nitrite is 6%, and then you mix it with salt. The purity of what Amazon is selling is ~99%.
Purity here doesn't mean the rest is impurities, it means the percent of what you're buying is that compound. The rest is usually common salt, in this case.
The reason it's done like this is so mixing up a teaspoon and a tablespoon isn't a lethal difference.
The LD_lo of sodium nitrite is a bit over 5g for a 75kg person. You use an ounce of 6% sodium nitrite to cure about 25kg of meat; that's about 28g of mix, or 1.7g of pure sodium nitrite. The lethal dose is equivalent to ~three ounces of the food-grade stuff (enough to cure 75kg of meat), but at 99% purity it looks like a teaspoon. Do you have the confidence to correctly use a compound to cure meat when (for household use) curing 5kg requires so little chemical that you'd need to pinch it onto a scale?
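Working through those numbers (using the approximate figures quoted above, not authoritative toxicology data):

    GRAMS_PER_OUNCE = 28.35
    LD_LO_G = 5.0                  # approximate lethal dose for a 75 kg person

    cure_purity = 0.06             # food-grade curing mix: 6% sodium nitrite
    pure_per_ounce = GRAMS_PER_OUNCE * cure_purity   # ~1.7 g per ounce of mix
    ounces_to_lethal = LD_LO_G / pure_per_ounce      # ~2.9 oz of curing mix

    lab_purity = 0.99              # the product at issue: ~99% pure
    lethal_mass_lab = LD_LO_G / lab_purity           # ~5.1 g, roughly a teaspoon

    print(f"{pure_per_ounce:.1f} g/oz, {ounces_to_lethal:.1f} oz, {lethal_mass_lab:.1f} g")
    # 1.7 g/oz, 2.9 oz, 5.1 g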
A closer metaphor might be banning those magnetic bead sculptures because a child might swallow the beads. They were banned for anyone to buy not just children.
There is also high-concentration hydrogen peroxide that will eat one's arm off. We intentionally don't sell it to the general public at that concentration for safety reasons. It seems that the benefits outweigh the costs in my example and with the topic at hand. A bit more convenience for some is not worth the preventable death of others.
> We intentionally don't sell it to the general public at that concentration for safety reasons
We don’t?
Then please explain how I have a bottle of 30% H2O2 sitting on my paints & solvents shelf in the garage. I bought it at my local farm & feed shop. No ID, licence, or even proof of farm ownership required.
Shit is scary af, but I gotta say it does one hell of a job at cleaning mold and mildew out of my window frames.
It’s a store that is open to the general public. You claimed that stores don’t sell it to the general public. Obviously they do, because I walked in off the street and bought it.
I'm kinda torn on whether Amazon should be selling high-purity dangerous chemicals or not. On one hand it makes life easy, but on the other, maybe some things should have some controls in place, like a specialist store that might look at an order and offer some customer support.
> A bit more convenience for some is not worth the preventable death of others.
Why not?
Why should I be inconvenienced because a rounding-error’s worth of people made the choice, of their own free will, to misuse a product to harm themselves?
This didn’t happen accidentally. It’s not some kind of attractive nuisance.
> Why should I be inconvenienced because a rounding-error’s worth of people made the choice, of their own free will, to misuse a product to harm themselves?
I guess I’m not surprised that, as self-obsessed people lacking nuance and empathy grow in number everywhere, they are also present on HN. Just look at the wording. I’ve noticed them more frequently as of late, in threads regarding war, death and suffering.
Furthermore, to claim a severely depressed teenager is a fully-formed person doing something “of their own free will” betrays fundamental ignorance regarding what depression is, as well as how teenagers function.
So providing an effective means of protecting depressed non-adults from the frictionless purchase of above-laboratory strength chemicals to potentially kill themselves by as trivial a measure as verifying age on delivery is somehow infringing on your liberty. Got it.
Trivial as compared to taking away the freedom of being able to purchase stuff like that at all via mail order – which I do not necessarily consider to be a reasonable request, nor do I agree with the lawsuit as is. There are already established mechanisms to verify age and identity in place to do that, package delivery companies do it all the time. It’s simply more expensive to do that. But Amazon clearly did not respond to reasonable discourse. I fail to see what you consider radical or restricting about my position.
There was also a quickly deleted response accusing me of ”pathological empathy that would abandon any principled position if it offered an immediate salve” to my “emotional panic”.
I believe this reveals more about the character and expectations of the commenter than anything else. To their credit, they deleted it.
Rational discourse about touchy subjects appears to be largely impossible. I expected better from HN. It doesn’t follow that, because you apparently hold an absolutist stance regarding what you consider freedom to be, I am on the opposite end of this absolutism.
> I fail to see what you consider radical or restricting about my position.
Society should not bear the cost or responsibility for parenting other people’s children and preventing all possible accidents, much less willful self-harm.
The possibility of bad things happening is the cost of having a healthy, responsible, capable society of well-adjusted adults.
It's simple. They do it every time I order wine or booze online, they do it every time someone orders pot; it's simple and easy. Even Amazon does it in their Whole Foods branch when they deliver wine, and Safeway does it when they deliver me beer. It is simple and trivial.
I bought a kilogram of pure NaOH from Amazon to make soap with no questions asked. That will also eat my arm off. We don't sell peroxides because they can be used to make explosives.
And then they'd sue Amazon for "promoting sex kits to children."
This type of lawsuit/logic/reporting is an absolute field day, you just have to come up with the clickbait "[frequently purchased together items] kit promoted TO CHILDREN."
I'm very sick and tired of the "What about the children?" mentality. The same excuse is used worldwide for censorship of movies and games, banning "dangerous" products, making anti-LGBT laws because "gay propaganda is making our kids gay", attacking couples because they kissed in public...
A child is their parents'/caretakers' responsibility. I shouldn't have to give away my freedoms just because of some negligent parents.
Or lawn darts, or those old time open fans that you can stick your hands in (or more often kids), or x-ray radiation based shoe sizing machines, or asbestos tiles.
As stated in other comments, this is a concentration that's not used for anything else. The food preservative is a different concentration.
A better analogy would be to ban ropes that are pre-tied into nooses and cannot be untied, so they can only be used for hanging. Yep, I agree, those should be banned too.
Yes, but similar to high-concentration hydrogen peroxide: when the higher-concentration product is causing preventable harm and death, they make only the more diluted version readily available to the public and keep the higher-concentration stuff for the folks who truly need it (industry).
An alternative explanation is that 3%, 6%, and 12% are more commonly useful and so the market meets the need. 34% H2O2 is readily available (it’s hardly “kept for industry”, but rather industry/food prep is the most common purchaser of it, just like I can buy urea or ammonium nitrate even though commercial farms are a more common user of it.)
Yes, but that's not amazon, it's not a general store, it's a chemical supplier.
The issue here is ease of access and direction (the instructions on how to do it), not the fact that these things exist.
It's like hydrogen peroxide. You can go to a chemical supplier and buy high concentration stuff that will eat your arm off. They don't sell that at the grocery store, because it'd be dangerous. The high concentration stuff is still available to those who need it, but it's not so easy to get and readily available that it causes preventable deaths.
They sell bleach and ammonia-based cleaning products at the grocery store, and those will eat your lungs out if you mix them.
They sell dry ice, and that’ll give you frostbite in seconds if you’re stupid.
They sell extra-strength Tylenol, and that’ll destroy your liver if you just take 10-20 of them.
They don’t sell concentrated hydrogen peroxide at the grocery store because nobody is asking and there’s not a market for it — not because it’s dangerous.
Mixing 5-9% solutions will still produce chlorine/chloramine gas and can cause significant lung damage.
The maximum commercially available (sodium hypochlorite) bleach concentration is about 15%, and that’s due to its non-linear rate of decomposition and extremely short shelf-life at higher concentrations.
A 15% solution must be used very soon after manufacture and does not have significant non-industrial uses — that’s why it’s not stocked in grocery stores. It has nothing to do with its safety at that concentration.
100% bleach would be anhydrous sodium hypochlorite, which is highly unstable and can decompose explosively.
Seriously... I understand the concerns here but I'm so disturbed by some of the comments, like the idea that "teenagers would almost certainly not be interested in learning how to cure meats." Statements like that make a lot of prejudicial assumptions.
I'm so tired of kneejerk reactions that X is involved in some tragic events so we should ban it, as opposed to looking deeper into the real reasons things are happening. It's much easier to blame Amazon (who I do not have positive feelings about) for selling sodium nitrite than it is to deal with the socioeconomic and environmental circumstances that lead to someone wanting to kill themselves. Pills and purchasing restrictions have become the "let them eat cake" of the 21st century.
We should do everything possible to prevent suicides; however, I have two issues with this lawsuit. Firstly, Amazon is not selling these items as "a kit". These items are being algorithmically suggested as frequently bought together; no one is combining them and selling them onwards. Additional checks and balances for dangerous items would be appropriate. Secondly, there are several items that can be used for this purpose (rope + ladder, razor blades, etc.): where do you draw the line?
> The complaint includes screenshots showing that Amazon auto-fills the search field for buyers, plugging in the word “suicide” after someone types “sodium nitrite” and suggesting that search above “sodium nitrite salt.”
I'm unable to reproduce this in the US (using the Amazon app); my autosuggestions (after typing "sodium nitrite "):
This is all well and good but I’d love if this much energy was spent on figuring out why an unbelievable number of children are trying to kill themselves.
These things weren’t problems until the modern era, and are concentrated in White and East Asian populations.
Interestingly, some people are arguing that mass shootings are a form of suicide, and that these epidemics are linked.
To add onto other comments about how strange this article is, the claims regarding CBS are clearly sensationalist.
The story is claiming that Amazon is in the wrong primarily for promoting these products on their site. CBS running a story on this, and basically telling the world "Hey, you can buy suicide kits on Amazon", is essentially as bad, if not worse (because they're publicizing it as opposed to just recommending books to people already looking at the product). It's possible a CBS segment could cause more deaths than anything Amazon is doing depending on who watches it. I'd imagine someone at CBS was smart enough to realize this and cancelled the segment.
My last roommate was planning on ending his life and once we found out and talked with him about it, the situation escalated and we called the authorities. They searched his room and found this exact product. He bought it off of Amazon.
It’s astounding how many odd chemicals you can buy online without a reason for using them.
Why should I need to give a reason for purchasing most chemicals?
If I did, and I was intent on buying this to commit suicide, how long would it take me to read one extra sentence? The same site that told me about sodium nitrite in the first place would now simply advise me to say that I was buying it for some (acceptable-to-others) reason X.
> a book with instructions on how to use the chemical to attempt suicide
I'm not an Amazon apologist. But if someone is adding a book on how to commit suicide to their cart then my first worry is not going to be what an online vendor suggests to bundle with that.
> “knowingly assisting in the deaths of healthy children by selling them suicide kits.”
I think it shows a wild conception of "health" if we think someone buying a book on how to commit suicide, and tools to go with it, is healthy.
It seems like the crux of this lawsuit isn't that Amazon is selling products they're legally allowed to sell; it's that Amazon is also selling a booklet, with information that is readily available online, telling people how to use that product in unintended (deadly) ways.
HN hates Amazon, HN also hates censorship, the question is which do we as a community hate more? Should a company be liable for the free access of information? There are even some people in the comments already calling for new age restrictions on Amazon accounts for certain products; I wonder how long, once that is implemented, before certain localities pass laws restricting access to "impure" materials like sex ed/safe sex?
I won't even address the "suicide kit" claim since that is simply sensationalist reporting. Ironically I think if this lawsuit is successful on its merit, you could also sue arstechnica for promoting the same thing they're accusing Amazon of. That's the nature of censorship, nobody is really safe once you open those floodgates.
> HN hates Amazon, HN also hates censorship, the question is which do we as a community hate more?
The most surprising thing about this is that the sensational journalists are getting a free pass. The journalists have now advertised this chemical to the world (including minors who read the news) far, far more than the little box on Amazon that shows related products.
How many people knew that this chemical combined with another medication was used for suicide? Probably not many, until journalists decided that this anti-Amazon story was the next big thing. Now it seems like everyone who follows tech news or sees this scrolling through their social media feed has been informed about this combination of chemicals and how it is used for suicide. Amazon isn't the only place to buy this chemical, and I doubt the recommendation algorithm was the impetus for people searching for the chemical in the first place.
The way this story is blowing up seems utterly dangerous in itself, yet most journalists aren't even bothering to try to hide the names of the chemicals. Amazon's recommendation algorithm didn't have any intent in this process, but all of these journalists broadcasting this information far and wide to boost their personal click rate certainly could have thought a little bit harder about what they were doing.
I think you’re making it more complicated than it actually is.
I believe the case is specifically focusing on the promotion of the kits to underage children. As a society we have already embraced censorship for children.
If we can count on Amazon to not offer up R-rated movies to children via Amazon Prime (which they dutifully do), it seems reasonable that we should expect (and sadly now demand) that they not promote combinations of products to children that are designed to lead to their death.
> I believe the case is specifically focusing on the promotion of the kits to underage children. As a society we have already embraced censorship for children
Underage children are already not allowed to have Amazon accounts (unless explicitly granted as a sub-account of their parents' account).
How, exactly, would you propose that Amazon determine if the person viewing a page and entering credit card information and a mailing address is a minor?
The only actual solution to this problem is to forbid selling of this specific product on Amazon, which I have no doubt is in process. Many chemicals and OTC medications have already disappeared from Amazon due to fears of abuse by minors.
> It seems reasonable that we should expect (and sadly now demand) that they not promote combinations of products to children that are designed to lead to their death.
There was no such intent in the recommendation algorithm. Did you know this combination of products could be used for suicide before it was blasted all over the internet by these news articles? I didn't, and I doubt anyone at Amazon was aware of it either (until now).
Are you also holding the journalists to the same standard? These articles about "suicide kits" have been filling my social media feeds and news outlets for the past day. The number of minors being exposed to this information by news outlets has to be many orders of magnitude higher than the number of minors who ever saw this combination of "recommended items" on Amazon. Keep in mind that Amazon wasn't just showing this to random shoppers, you had to search for specific ingredients to begin with. Journalists, on the other hand, have been advertising this combination far and wide to everyone who just wants to read news.
The difference is that one is obviously categorized, and the other has no categorization. There are hundreds, if not thousands, of items on Amazon that can kill.
Even if we started a list of possibly harmful items, would it solve the issue in this case? How are minors buying anything on the internet without supervision? Were they even using an account marked as a minor?
That right there is the real problem. You can't possibly kid-safe the entirety of the world, internet included, without running into practically Sisyphean challenges that don't even need to be taken on.
I mean, the solution is rather simple.
Instead of trying to kid-safe the world, why not just make the world less shitty to live in?
It's a two-birds-one-stone trick, except this proverbial stone would take down many more than just two birds. Almost all of the problems we currently deal with as a society stem from how shitty society has become for many people. But because some people are happy with the way things are going, since things are mostly fine for them, the rest of us have to put up with it? And people are surprised when kids, teens and adults are deciding to opt out? And their best solution is to make it harder for people to find ways to opt out?
Seriously, really?
Is it really so surprising that people would want a fast ride out of hell at that point? Because that's what it must be like for them, if they figure that ending it all there and then is better than putting up with the rest of society's bullshit any longer.
I have more to say on this, but am trying to follow HN's rules about flamewar stuff, and I realize that this comment thread is very easily turned into such a thing unless careful, so I am cutting my comment off here.
1. Are they well off because they have digital devices?
2. Are they well off because their parents can afford to have just 1 working member?
3. Are they well off because their parents make more than 100k a year? Or some other arbitrary number?
I've met some families in the past that meet all 3 criteria and exceeded the 3rd; but they rent. Are they also well off?
I do think it's interesting that the 'well off' kids are the ones offing themselves. But if that's the case, then we need to look very seriously at what's occurring at that point, because it just means I am more correct than previously stated.
Someone who is well off by the standards of those who are poorer may still have plenty of other reasons to be unhappy with life. Perhaps not valid to you or me, since they are 'well off', but our opinions on how well-to-do they are don't matter. The only thing that matters, ultimately, is that they don't feel like it's worth it to continue living.
That needs to be solved. And that is part of why I am right about how society has become shitty for many people. Even if they are well-to-do. Especially so. That should be the last group to be wanting to off itself.
So that tells me that it's not money related, or object related.
It's societally related somehow, in some fashion, which we need to find out, and fix.
Recommending a book that instructs you how to use these products in combination for the purpose of suicide is wrong. I’d argue selling and profiting from that kind of information is wrong, too. No one’s saying Amazon can’t sell these chemicals. But it’s dubious at best to defend selling the info, and it’s downright evil to recommend it.
Yes, it’s their algorithm. But they tweak it all the time to catch issues like this.
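To make "tweak" concrete, here is a minimal sketch in Python of the kind of denylist filter a trust-and-safety team could bolt onto recommendation output. The item IDs and function names are hypothetical; this is an illustration, not Amazon's actual system:

    # Hypothetical sketch: suppress known-dangerous pairings in
    # "frequently bought together" output. Item IDs are made up.
    FLAGGED_PAIRS = {
        # frozenset makes the pair order-independent
        frozenset({"sodium-nitrite-lab-grade", "suicide-method-handbook"}),
    }

    def filter_recommendations(anchor_item, candidates):
        """Drop any candidate that forms a flagged pair with the viewed item."""
        return [c for c in candidates
                if frozenset({anchor_item, c}) not in FLAGGED_PAIRS]

The hard part is curating the list, not writing the code; but "we were told about this exact pairing and changed nothing" is precisely the scenario such a filter exists for.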
If they recommended Mein Kampf and white robes when you ordered rope, surely we’d all agree that would be wrong, even though those individual items are defendable. Recommending them in combination would be evil.
Likewise, recommending guns and bullets alongside a book about mass shootings would be evil.
It can't be evil. The algorithm just recommends products without understanding their intent.
The real question shouldn't be around the algorithm, but rather around the suicide book being banned. Or even around unauthorized account creation/access or payment.
It's "wrong" only if you believe that taking one's own life is wrong. This is a deep, long-standing issue that has valid points on both sides. Not so black and white. I can think of at least one good use case: where someone with a terminal disease (or otherwise poor quality of life, through no control of their own). Then, maybe, peacefully taking one's own life to reduce their own suffering, not to mention the suffering of their loved ones, might actually be the "right" choice. Hard to generalize this as a universal rule since context matters so much here.
I believe teenagers killing themselves is bad and wrong and we should try to stop it.
If your values diverge from that, great for you I guess, but I think my values on this are more consistent with a healthy society and a healthy approach to regulating corporate behavior.
Except the algorithm is just working off the data it has collected, and that data tells it that people who bought that list of ingredients have also bought that booklet in the past. So it suggests the booklet to others as well, when they have the same list of ingredients in their cart.
The algorithm is not evil, and Amazon itself is not either, provided they do something to try to fix the problem. But the problem is in how algorithms themselves operate.
It's just taking known info and applying it to the current situation.
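For anyone unfamiliar with how these systems work, a "frequently bought together" recommender can be as crude as counting co-purchases. A toy Python sketch (my own illustration, not Amazon's code):

    # Toy "frequently bought together": pure co-occurrence counting
    # over past orders. There is no notion of intent anywhere in here.
    from collections import Counter
    from itertools import combinations

    def build_copurchase_counts(orders):
        """orders: a list of sets of item IDs bought together."""
        counts = Counter()
        for order in orders:
            for pair in combinations(sorted(order), 2):
                counts[frozenset(pair)] += 1
        return counts

    def frequently_bought_with(item, counts, top_n=3):
        """Items most often co-purchased with `item`, by raw count."""
        scores = Counter()
        for pair, n in counts.items():
            if item in pair:
                (other,) = pair - {item}
                scores[other] += n
        return [o for o, _ in scores.most_common(top_n)]

It surfaces whatever the purchase data correlates, full stop, which is the point above; it's also why a human-curated filter like the one sketched earlier is the obvious place to intervene.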
The only actual, proper, and comparatively easy solution to all of this is to just make it so people have no want or reason to commit suicide, or any other 'evil' acts that might be suggested by a foolish algorithm.
Banning things has never worked. Just look at all those super successful wars on drugs, alcohol, etc. Never, worked. Ever.
The true evil, in my opinion, is the attempt to kid-safe the world to the point where it removes all potential for joy and happiness. That leads to suicide by the first groups who commit it, which produces data for our A.I.-driven machines, which leads to algorithms happily suggesting the perfect thing for it.
Finally, I would like to add that just because you don't see how your own or anyone else's opinion is or is not controversial does not make it so. Calling an opinion controversial based on nothing but another opinion is just argumentation. Which should be fine in a logical world. But we do not live in a logical world, which is why people are preferring to opt out of it.
And that's my opinion on the matter. Algorithms just happily oblige the user.
Close, maybe. Prosecutable, unlikely: given that the algorithm doesn't understand what is being recommended (products, not an idea), it would not meet mens rea.
When Amazon has had to do this 1000+ unique times, then we can have the debate of whether their extra work is worth the lost lives of kids. Probably still a loser of an argument IMO but let’s just take a simple first step yeah?
Razor blades, pen knives, #8 marine line, paracord. All of these things are trivially used to kill yourself.
It is not society's job to protect your snot goblin from killing themselves because you let the little tyke have free access to order whatever they want from the internet.
Because banning things has never worked, for literally almost anything in human history. It just makes it more desirable since you can't have it. Humans are weird like that.
But maybe it's fitting that we are like that, because at the end of the day, whose right is it to be banning anything? Certainly not yours, or mine. Sure, protect the children... but against what? Themselves? Kids have been doing stupid things for a long time, and been mostly fine. It's only recently, when society decided to ban joy as well (by happenstance, via the many methods employed to reduce the spread of covid), that things started to go overboard.
How about we just stop making life shitty for people. That will be what makes this entire discussion moot.
It's not that kids will stop doing it, but that Amazon would have taken action to prevent this on their platform (in the least invasive way), and there's no lawsuit at that point. Interesting that the article doesn't mention if the book is still available or not.
"How about we just stop making life shitty for people."
Easier said than done. Most of the child suicides are from well-off families.
Sure, but nothing in life worth doing is easy. Sometimes you have to take the hard path to gain anything worthwhile.
And quite frankly, there is no reason or excuse for society to continue to be made shittier and shittier just to suit the inadequate opinions of literally anyone.
Please be objective and reasonable. Define shitty. Where do we start with the fixes? It's easy to write an angry comment talking about generalized ideas that are universally acceptable but can have opposing ways of getting it done.
Throwing up small obstacles to suicide has been done forever and isn’t controversial. People are impulsive. We put up railings and hotline phones on bridges and stuff like that.
Can you articulate an argument for why it’s a good thing?
This is controversial; people fought against adding ugly suicide netting to the Golden Gate Bridge for years, and there are strong utilitarianism-based arguments for limiting the lengths we go to prevent suicide.
Because the social benefit of literally recommending a suicide handbook alongside a chemical that kills in 20 minutes, on a platform used without a second thought by almost every family in America, is zero.
There’s no cost benefit analysis to do here for the rest of us. There’s no pros and cons. Amazon doing this is bad, they were told about it and didn’t fix it. They have godlike amounts of money and are telling us to go fuck ourselves if it bothers us.
They could make an algorithm that doesn't literally recommend suicide to teens, and they should. There's no other side of the argument; we are talking about a recommendation here. The recommendation engine is a consumer product, and it's faulty.
As for the chemical, the legitimate needs for it are likely vanishingly small and I find it really unlikely the people that need it would really mind some age verification or another extra safety step when ordering it.
We have been so corrupted by corporate influence we don’t even think to insist that companies act in the benefit of society anymore. Making profits isn’t a god given right if the product is bad for people.
> I find it really unlikely the people that need it would really mind some age verification or another extra safety step when ordering it.
Well, you just articulated one of the major social benefits. We do really mind. The benefits are clearly not zero, like you claimed.
What you didn’t articulate is an argument for why we should invest so much — and give up privacy and our freedoms — to try (and fail) to prevent such a vanishingly small number of self-inflicted deaths.
This is emotional outrage bait, not a basis for instituting evidence-based policy.
Frankly, it’s not our problem, and not our cost to be borne.
If you want to take away anyone’s ability to use the internet for commerce without interference, why don’t you start with taking away your own children’s free access to the internet and credit cards, rather than everyone else’s?
You can do that yourself, today — and probably should.
Who is "we"? Do you order highly lethal lab chemicals from the one-click consumer products site you use to restock your kitchen, or is this some kind of hypothetical cost you’re worried about incurring here?
If this is really your approach to consumer product safety then I suppose you’re consistent, if wrong.
I bet it’s not though. I suspect you’re pretty happy with the outcome of a framework where dangerous products are regulated.
The freedom argument is specious; we don’t regulate commerce with corporate freedom as an overriding value. We mandate everything from childproof packaging to rear-view mirrors to alcohol limits for pilots to competence exams for barbers to labeling for Doritos.
I suspect most in this thread are idly arguing from bias rather than reading the facts of Amazon’s behavior here, which is pretty indefensible.
Our tech giants are corporations and the things they produce are products. By a few historical accidents (especially Section 230) and a shit ton of corporate PR, we have been conditioned not to evaluate them as consumer products that can be defective and harmful, but we should start.
> You order highly lethal lab chemicals from a one click consumer products site
Yes, I do. You do too. Your medicine cabinet and cleaning supplies are full of lethal “lab chemicals” that would kill if ingested, or ingested in even mild excess.
But this isn’t even a “highly lethal lab chemical”; it’s just sodium nitrate, mined straight from the earth, and it has been used as a food preservative for at least a couple of centuries.
> … you use to restock your kitchen
It’s none of your business what we buy these things for, whether it’s to stock our kitchen for meat preservation, or backyard science experiments with our kids.
> The freedom argument is specious; we don’t regulate commerce with corporate freedom as an overriding value.
The freedom you’re attacking out of emotional panic isn’t Amazon’s. They can comply with whatever arbitrary nonsensical “save the children” crap you invent, and compliance will cost a microscopic percentage of their overall revenue.
The freedom you’re attacking is mine, and everyone else’s.
Take responsibility for yourself and your own children. If you can’t do that, blame yourself, not me.
You want to be free to see a handbook with suicide instructions next to the pure sodium nitrite you order? That’s the freedom you are personally advocating for? You’re not seeing anything about this turn of events that could be improved upon?
Do you also want to be free to buy baby formula and not be sure what’s in it, or a car where the seatbelts fail if you weigh over 150lbs? Do you want freedom to see pornographic billboards near the playground?
These things aren’t freedom by any recognizable definition. They’re just defective products made by unaccountable corporations. Like Amazon’s recommendation engine in this case.
The idea that corporations should just be able to do whatever they want is so fucking toxic. Seeing this attitude pervasive here on the discussion board of the private equity firm that’s advised so much of our modern tech culture sure does explain a lot though.
> You want to be free to see a handbook with suicide instructions next to the pure sodium nitrite you order? That’s the freedom you are personally advocating for?
I want to be free to order goods without having to upload government-issued ID or otherwise jump through hoops to assuage your panic.
I also want to be free to see a suicide book next to a sodium nitrate purchase without an emotionally panicked busy-body deciding that we have to save the children by infringing on the latter freedom.
> Do you also want to be free to buy baby formula and not be sure what’s in it, or a car where the seatbelts fail if you weigh over 150lbs? Do you want freedom to see pornographic billboards near the playground?
You’re conflating a lot of very different issues here. Can you actually explain how these are related?
> The idea that corporations should just be able to do whatever they want is so fucking toxic.
Nobody is advocating that here, that I can tell.
Furthermore, far more toxicity arises out of nanny-scolds advocating for ever-increasing regulation so that nobody will ever feel unsafe again.
If you want to look for the source of policies that stifle children, replace their healthy childhood experiences with institutionalized suffocation, and as a result, have created the most mentally unwell generation (literally) ever — try a mirror.
I don't think these two things are necessarily connected. It's possible to restrict purchase/ownership of some products and not others. There are many localities where you can't buy booze or guns but can buy pork, and vice versa. It's not a "they're coming for us" kind of thing. These suicide kit items in question could be controlled without the "floodgates" opening. This is a classic fearmongering argument: don't do this reasonable thing that most folks agree is right, because it will hypothetically result in some sort of unreasonable consequence when taken to an absurd level in a hypothetical argument.
> HN hates Amazon, HN also hates censorship, the question is which do we as a community hate more? Should a company be liable for the free access of information?
I haven't seen anyone suggesting that Amazon ban teens from buying the book, just the dangerous chemicals. And calling that "censorship" seems silly. Is it also censorship that kids aren't allowed to own guns, or that you're not allowed to own plutonium?
> I won't even address the "suicide kit" claim since that is simply sensationalist reporting.
What else would you call a bundle of items consisting of a book with instructions on a way to commit suicide, plus all of the materials needed to do so?
> I haven't seen anyone suggesting that Amazon ban teens from buying the book, just the dangerous chemicals.
Thus proposing a new system of age restrictions, which can be re-used for other things once they exist.
> And calling that "censorship" seems silly.
The lawsuit is asking Amazon to stop sharing information, or to stop associating that information in particular ways; that is in fact censorship.
> What else would you call a bundle of items consisting of a book with instructions on a way to commit suicide, plus all of the materials needed to do so?
I'd call it an automated "Frequently purchased together" recommendation.
Children do not have the kinds of rights you think they should have. And until they are adults, we protect them from harm.
And companies are obligated to not work against our societal norms. Amazon will lose this case and will make the necessary changes.
Amazon continuously demonstrates a dirtbag mentality. Few employees last more than 6 months, and they have such a bad reputation. And the more we learn about their shady ways… the worse it gets.
Because society values family and children more than ‘censorship concerns’.
It’s not censorship it’s product liability. The product fucking kills children.
I think internet contrarianism has warped everyone’s brains into an unrecognizable state.
The fact that Amazon is defending this behavior instead of saying oh shit we fucked up and dedicating a fraction of a basis point of a day's profit to fixing it is appalling. The fact that HN readers are doing the same is incomprehensible.
As a society we can stop companies from mixing chemicals that kill teenagers in minutes and instructions on how to do just that with standard household purchases.
We can put safeguards around sales of chemicals like that and our societal ability to preserve self-expression will be fine when we do.
> It’s not censorship it’s product liability. The product fucking kills children.
Amazon sells millions of products that could, in fact, kill children. Just go to:
Health & Household > Health Care > OTC Medications & Treatments
That clearly isn't the issue at the center of this lawsuit, since if it were there are far lower hanging fruit.
> The fact that Amazon is defending this behavior instead of saying oh shit we fucked up and dedicating a fraction of a basis point of a day's profit to fixing it is appalling.
Because once you do that, it only emboldens additional lawsuits and calls for censorship. In effect, Amazon would be admitting all future liability: every product that could kill a child (which, as I said, is millions of products) they would now be liable for.
> As a society we can stop [..] instructions on how to do just that
You start with "this isn't censorship, this is access to chemicals" but end with "why can't we as a society censor this type of information?"
We judge these things as a matter of degree. Like any heavy object could kill children, but some objects are more likely to do so than others. We decide how much protection to put around access to them by evaluating cost/benefit relationships.
We use those tools of analysis to decide that selling suicide kits as described in the article, even after learning of the consequences, is fucking insane and indefensible.
We do that because we are rational members of society instead of autonomous algorithmic message board argument bots (or attorneys) who have developed a sociopathic disregard for the well being of others and basic common sense.
15 to 20 extra strength Tylenol can kill most children - and it’s used quite frequently in intentional attempts - literally 100s of times more often than sodium nitrite. Your argument has no merit.
Ingesting table salt or drinking soy sauce is also a method of suicide and has killed many children and adults. Plenty of items that have perfectly legitimate uses and which should remain 100% legal can have deadly effects if misused. You can try censoring the information on how to misuse the substances, but that's never going to be very successful.
I agree. The point this article misses is that the focus should be on addressing the underlying issue. We can ban this one method and another will take its place. Ways of getting people help, and of avoiding whatever scenarios are leading to this, should be looked at.
I think you underestimate how easy it would be to walk into most stores and find something to kill yourself with, if you have the right information. Any pharmacy, hardware store, walmart type store, grocery store, etc.
Are you joking? Amazon is one of the A's in FAANG, the mostly highly coveted pseudo-conglomerate to be employed by on HN. People trumpet their roles at FAANG every day here, and most of the rest fawn up to them non-stop.
And that's not even getting started on the obsession with AWS here.
I don't know, whenever anyone on any forum posts "I'm at FAANG and I hate it, I want to die", it's always Amazon.
Everyone I know with other options ignores Amazon recruiters.
I don't think this is a niche opinion, though I wouldn't go as far as saying "HN hates Amazon". Hell, I don't even hate Amazon, I buy stuff there all the time.
So no mention of banning the book with the instructions? I guess some kinds of speech are more protected than others (book with instructions vs recommended products). Or is it that Amazon has the deeper pockets?
My biggest question is how these teens are purchasing anything unsupervised on the internet.
I believe retailers should be banned from selling suicide kits to minors as a matter of law. I don't however think they should be civilly liable for their products being used for self-harm.
Having said that, it also strikes me as a particularly boneheaded business policy to effectively kill off your own customers. At the very least while Amazon is spending millions defending themselves from lawsuits they should be quietly discontinuing this practice.
This doesn't likely meet the standard for negligence on Amazon's part because it would need to be definitively shown that Amazon's negligence to prevent harmful recommendations (themselves not necessarily illegal) was an essential and substantial cause of the suicides.
§ 401 of the California Penal Code, cited in the complaint, states:
"(a) Any person who deliberately aids, advises, or encourages another to commit suicide is guilty of a felony.
(b) A person whose actions are compliant with the provisions of the End of Life Option Act (Part 1.85 (commencing with Section 443) of Division 1 of the Health and Safety Code) shall not be prosecuted under this section."
(b) doesn't apply here, as stated in the complaint. Did Amazon meet any of the requirements of (a)? Sodium nitrite is also used for a variety of other valid industrial purposes. They're not alleging intent, but they are alleging that Amazon breached its duty of care by facilitating the sale of the product and by automatically recommending other products to use in conjunction with the chemical to ease the suicide.
The core problem with this case is in proving causation. Typically, in cases involving state suicide-assistance statutes that do not involve physician-assisted suicide, the causal chain is far more obvious: it typically involves a person verbally encouraging the suicidal person or furnishing them with supplies, suicide pacts, and other similar situations. The theory that a retailer or an e-retailer can be held to have breached its duty of care, or that it substantially caused a suicide, is unlikely to hold up. It's rather like blaming a hardware store for selling rope. Unless they can prove that Amazon violated the statute, and even if the court considers the negligence claim to be adequately pled, they're not likely going to be able to prove causation.
This is also a much weaker argument than those used to try to hold brick and mortar gun retailers liable for suicides, because some of those theories can at least rely on 15 U.S.C. § 7903 (5)(B) if the person who committed suicide indicated at the point of sale their intention to cause harm to themselves or others.
Amazon and the brand will probably settle the case, but the impact will probably just be that Amazon itself will make life harder for chemical retailers. If there are more controversies they may also just regulate content like "how to kill yourself" e-books more stringently.
If a family-owned market sold these items to a 16-year-old - and a man Bezos's age were working the register, and looked up the store policy and concluded this was a permissible sale - I think people would not accept that.
And yet we accept automation unblinking, even though automation includes the above scenario.
I have always figured that when it comes to suicide, the best way to do it is to take out a dictator or other evil entity in the process. Make it count. What’s the worst that can happen? You die trying.
How are these “Amazon” suicide kits? I mean, yeah they’re a little bit to blame. Like, maybe 10%? But how come the person manufacturing these kits doesn’t have their mugshot up on the article?
I can understand that this was, initially, an unintended consequence of their recommendation algorithm.
But when informed of the problem, how on earth can Amazon justify doubling down on this and trying to disclaim all responsibility? At that point, they know that they have been selling suicide kits to children.
This surely makes them complicit in all similar deaths going forward, where these suicide kits have been sold on their platform. What a disgrace.
My roommate was planning on using this to end his life and after we confronted him he said he chose this because you essentially fall asleep and don’t wake up. Sounds painless but typically those who use it can’t give thorough reviews on the product.
Even though this is very serious, I actually lol'd when I saw this.
I can't buy "prescription" pet food from Amazon without an actual "prescription" (for whatever that's worth -- nothing) from a Vet, yet anyone can buy drugs that will kill you.
The irony is abundant, but the results very sad.
I don't know how Amazon should handle this, but given their heavy-handedness over pet food, this seems more than a little absurd.
I shouldn’t even need to carry government-issued papers, and I think it’s a serious step backwards that it’s essentially required to navigate modern life.
I am curious, though; do you support requiring government-issued ID to vote?
The societal harms of children huffing paint are, in your opinion, not as bad as the personal inconvenience of having to pull out your ID at the checkout line?
Not surprising coming from South Carolina law enforcement, but these are THC gummies with clear packaging (that have a warning label and state "Cannabis Product").
Are we really to believe a criminal is going to unpackage these and give them out to unsuspecting children, just to get them hooked on the marijuanas? Instead of selling them?
While I don’t believe the parent argument is well founded, this is also meaningfully different from the “sue the gunmakers” argument.
This scenario feels a bit closer to Walmart selling guns without properly vetting the buyer, which does seem like something the seller could/should get in trouble for.
This is also an imperfect comparison because Amazon is just facilitating the sale, not directly selling the product. But my primary point is that there are more layers to consider here.
It's more like if a gun store sold a kid a gun and recommended they buy a booklet titled "School shooting for dummies" and then that kid went and shot up a school.