
From the article:

> Because TikTok’s “algorithm curates and recommends a tailored compilation of videos for a user’s FYP based on a variety of factors, including the user’s age and other demographics, online interactions, and other metadata,” it becomes TikTok’s own speech. And now TikTok has to answer for it in court. Basically, the court ruled that when a company is choosing what to show kids and elderly parents, and seeks to keep them addicted to sell more ads, they can’t pretend it’s everyone else’s fault when the inevitable horrible thing happens.

If that reading is correct, then Section 230 isn't nullified, but there's something that isn't shielded from liability any more, which IIUC is basically the "Recommended For You"-type content feed curation algorithms. But I haven't read the ruling itself, so it could potentially be more expansive than that.

But assuming Matt Stoller's analysis there is accurate: frankly, I avoid those recommendation systems like the plague anyway, so if the platforms have to roll them back or at least be a little more thoughtful about how they're implemented, it's not necessarily a bad thing. There's no new liability for what users post (which is good overall IMO), but there can be liability for the platform implementation itself in some cases. But I think we'll have to see how this plays out.




What is "recommended for you" if not a search result with no terms? From a practical point of view, unless you go the route of OnlyFans and disallow discovery on your own website, how do you allow any discovery if any form of algorithmic recommendation is outlawed?


If it were the results of a search with no terms then it wouldn't be "for" a given subject. The "you" in "recommended for you" is the search term.


That's just branding. It's called Home on Facebook and Instagram, and it's the exact same thing. It's a form of discovery that's tailored to the user, just like normal searches are (even on Google and Bing etc).


Indeed, regardless of the branding for the feature, the service is making a decision about what to show a given user based on what the service knows about them. That is not a search result with no terms; the user is the term.


Now for a followup question: How does any website surface any content when they're liable for the content?

When you can be held liable for surfacing the wrong (for unclear definitions of wrong) content to the wrong person, even Google could be held liable. Imagine if this child found a blackout video on the fifth page of their search results on "blackout". After all, YouTube hosted such videos as well.


TikTok is not being held liable for hosting and serving the content. They're being held liable for recommending the content to a user with no other search context provided by said user. In this case, it is because the visitor of the site was a young girl that they chose to surface this video and there was no other context. The girl did not search "blackout".


> because the visitor of the site was a young girl that they chose to surface this video

That's one hell of a specific accusation - that they looked at her age alone and determined solely based on that to show her that specific video?

First off, at 10, she should have had an age-gated account that shows curated content specifically for children. There's nothing to indicate that her parents set up such an account for her.

Also, it's well understood that TikTok takes a user's previously watched videos into account when recommending videos. It can infer traits about a person based on that (and from personal experience, I can assert that it will lock down your account if it thinks you're a child), but they have no hard data on someone's age. Something about her video history triggered displaying this video (alongside thousands of other videos).

Finally, no, the girl did not do a search (that we're aware of). But would the judge's opinion have changed? I don't believe so, based on their logic. TikTok used an algorithm to recommend a video. TikTok uses that same algorithm, with a filter, to show search results.

In any case, a tragedy happened. But putting the blame on TikTok seems more like an attack on TikTok and not an attempt to rein in the industry at large.

Plus, at some point, we have to ask the question: where were the parents in all of this?

Anyways.


«Had Nylah viewed a Blackout Challenge video through TikTok’s search function, rather than through her FYP, then TikTok may be viewed more like a repository of third-party content than an affirmative promoter of such content.»

You can of course choose not to believe the judges saying it matters for them, but it becomes a very different discussion...


> That's one hell of a specific accusation - that they looked at her age alone and determined solely based on that to show her that specific video?

I suppose I did not phrase that very carefully. What I meant is that they chose to surface the video because a specific young girl visited the site -- one who had a specific history of watched videos.

> In any case, a tragedy happened. But putting the blame on TikTok seems more like an attack on TikTok and not an attempt to rein in the industry at large.

It's always going to start with one case. This could be protectionism but it very well could instead be the start of reining in the industry.


> Now for a followup question: How does any website surface any content when they're liable for the content?

Chronological order, location based, posts-by-followed-accounts, etc. "Most liked", etc.

Essentially by only using 'simple' algorithms.
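To make the distinction concrete, here's a minimal sketch of what such non-personalized orderings look like. The post records and field names are made up for illustration and don't reflect any real platform's API:

```python
from datetime import datetime, timezone

# Hypothetical post records; fields are illustrative only.
posts = [
    {"id": 1, "likes": 10, "created": datetime(2024, 5, 1, tzinfo=timezone.utc)},
    {"id": 2, "likes": 42, "created": datetime(2024, 5, 3, tzinfo=timezone.utc)},
    {"id": 3, "likes": 7,  "created": datetime(2024, 5, 2, tzinfo=timezone.utc)},
]

# Chronological feed: newest first, no per-user signal in the sort key.
chronological = sorted(posts, key=lambda p: p["created"], reverse=True)

# "Most liked" feed: a single global ranking shared by every visitor.
most_liked = sorted(posts, key=lambda p: p["likes"], reverse=True)

print([p["id"] for p in chronological])  # [2, 3, 1]
print([p["id"] for p in most_liked])     # [2, 1, 3]
```

The key property is that nothing about the viewer enters the sort key, so every visitor sees the same ordering.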


Isn't the set of such options offered still editorial judgment?

(And as an addendum, even if you think the answer to that is no, do you trust a judge who can probably barely work an iPhone to come to the same conclusion, with your company in the crosshairs?)


I'd say no, because those average over the entire group. If you ranked based on, say, most liked in your friends circle, or most liked by people with a high cosine similarity to your profile, then it starts to slide back into editorial judgment.
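For concreteness, a minimal sketch of the cosine-similarity measure mentioned above, with made-up interest vectors (e.g. watch counts per topic); nothing here reflects any platform's actual implementation:

```python
import math

def cosine_similarity(a, b):
    # Cosine of the angle between two interest vectors:
    # 1.0 means identical direction, 0.0 means no overlap.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Hypothetical per-user interest vectors.
me    = [5, 0, 2]
alice = [4, 1, 2]   # similar tastes to "me"
bob   = [0, 7, 0]   # very different tastes

print(cosine_similarity(me, alice) > cosine_similarity(me, bob))  # True
```

Ranking by a per-viewer score like this means two users see different orderings, which is exactly where the "editorial judgment" question bites.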


Not really, as the variables come from the content itself, not from the company's intentions.

And for the addendum, that's why we have hearings and experts. No judge can be expected to be knowledgeable about everything in life.


This is only a circuit court ruling - there is a good chance it will be overturned by the Supreme Court. The cited Supreme Court case (Moody v. NetChoice) does not require personalization:

> presenting a curated and “edited compilation of [third party] speech” is itself protected speech.

This circuit court case mentions the personalization but doesn't limit its judgment based on its presence - almost any type of curation other than the kind of moderation explicitly exempted by the CDA could create liability, though in practice I don't think "sorting by upvotes with some decay" would end up qualifying.



