Hacker News
Senate tells social media CEOs they have 'blood on their hands' (engadget.com)
14 points by DaveFlater 5 months ago | 35 comments



I'm not following US politics that closely, but are other sectors under the same scrutiny: gun manufacturers, auto companies (road rage)?


Gun manufacturers get some scrutiny, but it isn't a fair comparison: social media is engineered to keep people addicted and messes with their mental state in a destructive way. Guns are a tool abused by people with a mental predisposition to do so.

The whole scene is performative nonsense anyway. Nothing will happen.


Alcohol and sugary foods are also engineered to keep people addicted and mess with their mental and physical states and are completely legal and not frowned upon.

Agree on the performative nonsense part though.


> Alcohol and sugary foods are also engineered to keep people addicted and mess with their mental and physical states

Humans have been drinking alcohol and eating sugary foods for thousands of years. They weren’t “engineered” to do anything. They are just natural substances which humans stumbled upon and found inherently attractive (due to our biological predispositions).

Both are very energy-dense. In a premodern / early modern society in which most people engage in hard physical labour every day, that energy-density is valuable. In a late modern society in which most people are sedentary, it becomes much more harmful.

Many of the negative health consequences of both have only become known in recent decades. And several of those long-term health consequences are only apparent due to increased longevity, in centuries prior many people would have perished due to other causes long beforehand, and even deaths directly attributable to these substances would have seemed less significant against the background of generally higher mortality.


Alcohol is pretty heavily regulated. Sugary foods are not, which does suck—they should be.


Sugary soft drinks (Coke, Pepsi, etc.) are higher on my list than cookies or cheesecake. Schools are starting to ban soda from being sold.


I agree; I don't think cookies or cheesecake overall are a major source of obesity, since they're really not eaten that often. You're right about soda, but in general the large amount of sugar (in whatever form) in so many foods is a problem.


There have been attempts to regulate them, e.g. in NYC.


Definitely a step in the right direction.


They’re also labelled, regulated, and liable.


In what ways are they liable?


The alcohol industry is thousands of years old and has gone through varying levels of regulation and deregulation. Prohibition happened. It was a big deal.

But even then, even given their incredibly entrenched history of consumption and relatively obvious risks, they still face real civil liabilities. Hard seltzer companies, most recently, have been sued and fined for promoting their brands as healthy. A very similar case happened to the sugary drink Vitamin Water.

But, if you really want to compare the alcohol industry to social media, you'd first need to give Budweiser a way to modify, in real time, the alcohol content of the can based on the current mental state of the drinker. Do that and it might come close to being a valid comparison. But you'd still need to remove the clearly labeled alcohol content from the front of the can and instead bury it in a fourteen-page terms of service agreement. Then give the can the ability to simultaneously monitor you and inform third-party salty snack-food manufacturers that you're drunk and it's time to send you a push notification from the chips bag, since you have a 40% increased likelihood of wanting chips. That might get you there.

The prescription pain-killer and tobacco industries have paid hundreds of billions in fines for covering up their addictive properties. The list goes on.

Section 230 needs to be seriously amended.


Guns and cars don’t change people’s mental state. Some of the states with the lowest homicide rates are states like Idaho, where half of households own a gun. This is also true over time. In the 1950s in the US, the percentage of households with a gun was much higher, but homicide rates were lower. Gun ownership rates in Europe were also a lot higher in the mid 20th century, while homicide rates were not.


Rate of gun ownership in Montana is among highest.

Rate of suicide in Montana is among highest.

Guess the method of suicide.


Does owning a gun cause suicidal thoughts? Most of Western Europe has higher suicide rates than the US: https://www.phillyvoice.com/why-suicide-rise-us-falling-most.... Eastern Europe has even higher suicide rates. And suicide rates in the US are going up as the percentage of households with a gun is going down.


It can make a difference. For someone with suicidal thoughts who has a gun, the gun gives them a quick and easy method of acting on it, with a high certainty of success. Without a gun, they'd have to resort to other methods, which involve more work, have a lower likelihood of success, and carry a greater risk (and hence anxiety) of the "you don't actually die, you just end up severely disabled" outcome.

For someone dead set on killing themselves, it doesn't make a big difference. For someone who has a sudden impulsive thought of suicide, having a gun can mean they die; without one, by the time they've worked out how to kill themselves, the impulse has passed.

There isn’t a simple correlation between gun ownership rates and suicide because suicide rates are determined by many factors of which gun availability is just one. But I think it is a real factor, and there is research which supports that it is, e.g. https://pubmed.ncbi.nlm.nih.gov/36652694/


No, but it increases the risk of a suicide attempt and its lethality. The link between homicide and guns is small and nebulous; the link between guns and suicide is clear and much stronger.


Can’t you put parental controls on children’s phones and not approve these apps? Also, why are parents on social media? I don’t use social media and my kids don’t use them either. Any app that goes on their phones needs to be approved by me. I’ve already told them they aren’t getting social media apps. It’s been drilled into them since they were 5 or so that these things are bad for multiple reasons, but primarily 1. Anything you say there lasts forever so you have to always watch what you say and 2. Creeps hang out there and want to do you harm.

Have the conversation and get off those apps. Unless adults do it the kids never will.


I can put all kinds of restrictions on my child’s phone, but I have no control over the garbage her friends show her on their phones at school.


You can't escape social media just by not using it yourself. Put as many app controls as you want on your kid's phone, and some other kid will still make recordings of them and upload them to make fun of them, or worse. And if most kids are on social media and you aren't, it can be isolating.

I'm not even criticizing your approach, just saying that you can't go home again.


Not denying it. But then it's a very limited window where they actually have access, and even then not all the time.

In addition, schools could point out online predatory practices more as part of their sex ed curriculum. Kids are already taught about domestic violence and predatory behavior.

Not trying to let social media companies off the hook. But there are other things that can be done as a society.


Does HN count as social media?


It's easy to blame those at the forefront for the damage they do, and rightly so.

Our system (free market, capitalism) runs on this sort of headlong-rush mentality. Some of the blame belongs with the no-rules deregulation of our society. It seemed like a good idea at the time (to some), but it means some things are lost.

Like any sense of responsibility to the public. I.e. if everybody is doing it (unfettered access to social media for everybody) you just lose in the market if you don't play ball. You can be as socially-conscious as you like, but the market will crush you and leave you in the dustbin if you don't take every available customer opportunity. Leaving just the ruthless at the top.

It's inevitable, and so I believe 'blood on their hands' is probably true, but whose hands? Those who participate in the system as it is, or those who created the system?


I think you're eliding something true about the social media companies: they are uniquely exempted from regulation. A car company has to follow federally mandated safety rules, is answerable to the NHTSA, and can be sued by consumers if it does something negligent. That's the standard level of regulation most industries see in America.

There is no equivalent for social media, and the only law around it that does exist, Section 230, legally protects social media sites from basically any liability with regard to lawsuits. A protection so extreme it's almost difficult to imagine what it wouldn't cover.

Take an example: let's say Zuck gets angry because Elon Musk has attacked him. Zuck can tell the engineers, "Write me an algorithm that finds every bad word said about Elon Musk on Facebook and push those comments out to every single user. I want 3 billion people seeing baseless accusations against Musk at least once a day, and the more extreme the better; we're not done until credible calls for the assassination of Musk appear regularly on our site." And it'll work! We see that people pick up on how FB's algorithm works and will create that content to gain followers. But since it can all be done by algorithmically selecting from user-generated content, there's no liability for Facebook or Zuckerberg.


Section 230 only shields them from liability when they moderate user content. Prior to that, you either didn't moderate at all and were free from liability because they were the user's words, or you did moderate and were liable for any user content on your site because the moderation implied editorial control.

I would argue that car companies experience similar protections; they just didn't need a specific exemption because product liability already precludes the manufacturer being sued for things the user does. I cannot sue Corvette if I get busted speeding in one of their cars, even if every ad they put on TV shows someone driving a Corvette recklessly.

I also can't sue Corvette or whoever for selling me a car more powerful than I can handle if I burn out at highway speeds and crash.

Even though Corvette or Ferrari could change their product lines to not sell cars that practically encourage reckless driving, they are not liable for what users do with the cars. They have "editorial control" of a sort over what they choose to sell and they're not liable for what they choose to sell in the same way that Facebook is able to choose what posts to emphasize without liability for what the user posted.

> "Write me an algorithm that finds every bad word said about Elon Musk on Facebook and push those comments out to every single user. I want 3 billion people seeing baseless accusations against Musk at least once a day, and the more extreme the better; we're not done until credible calls for the assassination of Musk appear regularly on our site."

"Make me a car with so much horsepower that it can barely be driven safely, and then drop a Fast and Furious style ad to get the motorheads flying around the streets in them. I want at least 3 Paul Walker tier wrecks a week."

Still works for a car company. Sports car drivers are 43% more likely to have an at-fault accident on their record compared to the national average.

I own one, I'm not trying to call anyone out, but I really do think sports car manufacturers make vehicles that are very easy to drive dangerously, and then they advertise people driving dangerously in them. I wouldn't be surprised if Porsche 911 sales went up after Paul Walker died trying to drift in one.


Does this include senators who voted to authorize Gulf War II? Given the slow turnover, I suppose it must.


They know/knew [1]. Whether or not their remorse is authentic...shrug.

[1] https://www.youtube.com/watch?v=J54k7WrbfMg


There was a time you could let a teenager use the internet unsupervised as long as they understood some basic safety. That time has passed. The only people to blame now are the parents.


> The only people to blame now are the parents.

That's a false dichotomy. I can blame myself for not keeping my family away from toxic algorithms, but I can also blame the Facebook leadership for creating that monster, one that also infects every site I visit with trackers and then uses that to push ads across my whole family.


I used to think this too, and to some extent there are things children do that you probably can attribute to parenting. But kids are also their own people and make their own decisions, the younger they are the stupider the decisions it seems. I'm long past blaming parents for the actions of their children.


There was a time when the internet was not optimised to suck your life/money from you. Parents (I am not one, by choice, for this very subject and many other reasons) cannot be fully blamed for things that are engineered by people who are vastly smarter at doing so, including with specialised AIs, statistical analysis, etc. I see around me that children get completely excluded from 'life' if the parents try to limit social media use. It's up to the parents, but also schools, government, etc.


Black mirror has a great episode about this called Arkangel. I hope it changes your mind.


The solution will be greater monitoring and more intrusive systems of content control.

The real solution is to change the direction of technology towards technologies that aren’t harmful to kids and societies (by definition, the same) but help them, but that’s less profitable for companies like Meta.

Adding more policing to platforms cannot and will not solve these problems, but it will perfect insanely effective systems of control that will inevitably be used for other purposes and under "unintended" circumstances.

I wonder if this is the theater of pretext for deploying information control systems at scale, because if I designed a bulletproof publicly defendable justification for rolling them out it would look pretty much like this.

This isn’t conspiracy theory and should be dismissed. This action comes directly on the heels of disinformation being identified by the WEF as the leading threat in the world in 2024, and huge concerns about interference in the 2024 elections in the US.

If we assume that it’s never about what it’s about (aka mass communication principle #1: the cover story is rarely the real story) and there are no coincidences in national security, this really looks like the pretext is being put in place for greater monitoring and control.

- https://www.pymnts.com/news/security-and-risk/2024/attack-ve...

- https://www.weforum.org/agenda/2024/01/ai-democracy-election...

- https://news.un.org/en/story/2023/06/1137302

Also, stuff like this has policymakers deeply concerned:

https://www.nytimes.com/2024/01/30/us/politics/ai-child-sex-...

It helps to look at the policy documents that decision makers are reading to understand why these information tools are being put in place.

See key takeaways on Page 68:

https://www.dni.gov/files/ODNI/documents/assessments/GlobalT...


This is my takeaway as well.


Drug overdoses are still killing 100K people per year in the US, but Congress has forgotten about it and is railing against social media for having "blood on their hands." It's perfectly clear that this is about power and the fact that social media is a threat to theirs. These people have no concern whatsoever for anything else.



