LinkedIn does not use European users' data for training its AI (techradar.com)
59 points by robertclaus 4 hours ago | 43 comments

Which implies that elsewhere...

There are some consumer protections that I really do wish we imported into the U.S., especially food safety and chemical usage. Too much regulatory capture for that, though.


> elsewhere

In other places, LinkedIn silently opted users into AI training. From a few days ago:

https://news.ycombinator.com/item?id=41582951 - LinkedIn scraped user data for training before updating its terms of service

https://news.ycombinator.com/item?id=41584929 - LinkedIn silently opts users into generative AI data scraping by default


Why do you care if someone trains an AI on content you have chosen to post publicly (LinkedIn profile/posts)? I’d understand if it were your DMs or something, but this stuff is no secret.

Posting images, articles, and other content doesn’t grant everyone the right to use it for every purpose. Especially not the right to partially republish it under the excuse that a machine is doing it. It’s just not the same as someone getting inspired by it or citing it.

Doing something automatically is a whole different thing from a person doing it. Police watching a protest is fine; police filming it or documenting all participants via face recognition is forbidden (at least here).


I understand and completely agree republishing content (even altered) isn't cool, and I also agree government use of technology for mass surveillance is incompatible with our idea of democratic/open societies. However, in the case of LinkedIn posts you have already given the ideas behind your content to the world (and in this case specifically and explicitly LinkedIn) for free.

I've said this before, but it's sad how quickly this community went from being champions of the free exchange and use of information for the betterment of humanity to gleefully stomping on an incredible and beautiful new technology because someone else might make money off it. It reminds me of the Judgement of Solomon [1]: people would rather kill the whole technology and all the incredible things that may come with it than miss out on "my cut!", "my cut!", even if it's a single LinkedIn post in a corpus of billions.

[1] https://en.wikipedia.org/wiki/Judgement_of_Solomon


Has this community done a complete 180 randomly out of the blue, or is it a reflection of how the new technology is being used? And if this previously ultra-friendly community did a 180, imagine the feelings in the general public, which never had the friendly attitude.

>I also agree government use of technology for mass surveillance is incompatible with our idea of democratic/open societies.

Well, corporate use of technology for mass surveillance is equally incompatible with the idea of a democratic society, which is why the EU imposes limits on what LinkedIn can do with your data, and thank God for it.

Free exchange of information is being able to access a textbook at the library, not a corporate behemoth vacuuming up people's personal information so they can sell it to the highest bidder, put you under surveillance the next time you attempt to switch jobs, or send you more ads. Betterment of humanity? Blink twice if a LinkedIn PR person is holding you at gunpoint.


> Betterment of humanity? Blink twice if a LinkedIn PR person is holding you at gunpoint

I was referring to generative AI in general; this use case is quite boring.

> a corporate behemoth vacuuming up people's personal information so they can sell it to the highest bidder

Do people not use LinkedIn to explicitly signal to the world that they're looking for a job? Why does it matter how that information is delivered? If you don't want the world to know you're looking for a job, simply don't update LinkedIn.

> corporate use of technology for mass surveillance is equally incompatible with the idea of a democratic society

The argument against government power/surveillance is that the government has a monopoly on it and may use that power to hurt people. It is good to legally protect sensitive information like health data from advertisers, but in this case you can, again, simply not use LinkedIn. What difference does it make if the info is collected by a company looking for new hires, a third-party analytics company working on their behalf, or LinkedIn itself working on their behalf? It's not private data.


"just dont use LinkedIn" is such a narrow minded thing to say. How do you feasibly expect people to exist in society without interacting with any of these systems and corporations? if its not LinkedIn its Indeed or w/e else. They all collect data and most of them are pumping it into some kind of LLM or behavioral analysis algo. That is not functionally different than the argument for the Government doing it, except for that the Gov has a monopoly on violence.

This applies to pretty much everything in our daily lives, like banking, shopping, etc. "Just don't interact" is such a useless nothing-burger that sidesteps the problem entirely. You can "solve" all societal problems by becoming a hermit, moving to the woods, and living off the land, but that is not a functional or reasonable thing to do for 99.99% of people. It's especially baffling when the reasonable solution is simply having bare minimum standards of protection across the board, which many countries already implement to great effect.


> not a corporate behemoth vacuuming up people's personal information so they can sell it to the highest bidder, put you under surveillance the next time you attempt to switch jobs, or send you more ads

I genuinely don't see what the problem is.

I rarely post updates on LinkedIn. When I do, they're updates that are intended to be broadcast to the public. If some execs at LinkedIn are smart enough to find a way to profit off the back of that, why should I be upset about that? Why are you upset about it?


Well, look at it this way: you gave stuff to LinkedIn.

Whatever their terms say, they're storing your stuff, they're serving your stuff, and they've reserved the right to extract value from your stuff in perpetuity, according to the agreements you accepted when you signed up, when you posted, and when they updated the terms. Your privacy settings don't matter, because it all happened on LI.

I mean, you can delete the stuff you posted if you don't want future AIs trained? Delete your whole account if you didn't like LinkedIn messing with it?

(I could likewise say this about MS Windows, Apple or anything: if you don't want someone to have your stuff, give it to someone else, or don't give it at all.)

But in the end, you voluntarily gave it to them because it was free, and you are the product, not the artist.


Microsoft probably owns the physical media from before LinkedIn was acquired, so by your physical-ownership logic they can keep using all the data you have "deleted" and ignore your new opt-outs on all those backups.

The point of making legislation is to have something to enforce at times like buyouts: to be able to say "there is no reasonable way to enforce our expectation that our 2FA numbers won't be abused by this new buyer, so the buyout cannot continue."

Maybe you don't need a job social network, but presuming you do, you have no way of knowing what org will own the physical media of the one you pick today, unless it is in a country with competent regulation, i.e., not the US.


I partially agree: you agreed to the terms of service. However, one should also not forget that LinkedIn is quite dominant and a very big player. For many businesses it is simply not an option to stay off the platform, unless they can afford to lose potential customers and employees.

Rules change for monopolies and oligopolies, and for good reason IMHO. LinkedIn belongs to Microsoft; so does GitHub. Don’t forget Windows, Azure, Office, Visual Studio, and a long list of other products. They want to take your data from all possible sources, and if you point to the ToS alone, this would be totally valid. But we have to look at the bigger picture, as we already do in other areas, for example the GDPR.


Yeah, everyone should be allowed to cut down trees on public property for firewood, or dump their trash in public parks!

If they didn't want these resources to be exploited, they shouldn't make them publicly accessible!


Cutting down a tree implies the tree is no longer there and cannot be used by others. In this case, the content is still there, unaltered.

This is probably the worst analogy I've read this year.

Are we back to the “you wouldn’t download a car” point of things?

Of course this then later leads to: "LinkedIn AI has non-European bias".

I'm of two minds.


Exactly what I was thinking.

I wonder how they use US data too. LinkedIn is so cringe that the value of its data in the mix is probably negative.

How fucked in the head do you have to be to train ANY AI on LinkedIn data?

It could be used for all sorts of things:

    You could use it to indicate the fit between employees and companies;
    You could use it to detect lies and exaggerations in CVs;
    You could use it to estimate when employees are likely to be considering seeking new opportunities.
There's significant potential for business-relevant applications; a rough sketch of the first idea follows.
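
To make that first idea concrete, a fit score could be as crude as comparing an embedding of the profile text with an embedding of a job description. A minimal, hypothetical sketch (the hashing "embedding" and all names below are illustrative stand-ins, not anything LinkedIn is known to use):

    import numpy as np

    def embed(text: str) -> np.ndarray:
        # Stand-in for a real sentence-embedding model: hash words into a
        # fixed-size bag-of-words vector so the sketch runs with no extra deps.
        vec = np.zeros(256)
        for word in text.lower().split():
            vec[hash(word) % 256] += 1.0
        return vec

    def fit_score(profile: str, job_description: str) -> float:
        # Cosine similarity between the two embeddings: close to 1.0 means
        # heavy overlap in wording, 0.0 means none.
        a, b = embed(profile), embed(job_description)
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return float(a @ b / denom) if denom else 0.0

    print(fit_score(
        "Senior backend engineer: Python, Kafka, distributed systems",
        "Hiring a backend developer with Python and Kafka experience",
    ))

A real system would presumably use learned embeddings and far more signal than raw text, but the shape of the computation would be similar.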

But LinkedIn is far from accurate information. It's full of hollow, mindless corporate PR and trumped-up CVs.

I'm looking forward to getting fired because my employer used LinkedIn AI and now thinks I'm considering new opportunities.

That's cool, but when decent companies think that legitimately useful people are considering resigning, they often don't fire them but instead offer a raise.

Lol

Don't worry, people are hard at work making sure these AI systems are "aligned".

I think it depends on what your objective is. Like if you want to simulate the LinkedIn experience, it's natural to train on LinkedIn data.

HR and marketing will love it.

Make of that what you will


Well, not a general purpose one, but...

knowing LI doesn't have any NSFW content, is full of marketing content (both corporate marketing and self-promoting individuals), tends to prefer positive signals (all projects succeed, etc.), is mostly written in English, and so on...

... There is clearly a market for text generation in this language bubble. Think of all the internal and external communication in your $BigCorp. That has immediate use not only in marketing, but also HR, recruitment, company policing, etc. Your next company town hall can be prepared and led (!) by this thing.


Half of LinkedIn posts are already written by ChatGPT; you don't need a model trained on the output of another model.

You just described most open source models.

Maybe someone needs an AI that generates smug, self-congratulatory word salad.

No, but the ability to probabilistically detect it could be used to build useful filtering and prioritisation functions.
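
For instance, a purely hypothetical filter along those lines, where slop_probability is a throwaway keyword heuristic standing in for whatever trained classifier you would actually use:

    def slop_probability(post: str) -> float:
        # Hypothetical score in [0, 1]: how likely the post is generated
        # self-congratulatory filler. A real version would be a trained text
        # classifier; this keyword heuristic is only a placeholder.
        buzzwords = {"thrilled", "humbled", "synergy", "journey", "delighted"}
        hits = sum(w.strip(".,!") in buzzwords for w in post.lower().split())
        return min(1.0, hits / 3)

    def prioritise(feed: list[str], threshold: float = 0.5) -> list[str]:
        # Drop posts above the threshold and rank the rest, least sloppy first.
        kept = [p for p in feed if slop_probability(p) < threshold]
        return sorted(kept, key=slop_probability)

    print(prioritise([
        "Thrilled and humbled to announce the next step in my journey!",
        "We shipped the fix for the outage; postmortem attached.",
    ]))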

Training is one thing, but that feature to have AI help you with your LI posts seems psychotic. Generic generated slop for what, exactly?

Yup, I noticed this a few days ago in some subreddit like /r/assholedesign. I think a few months ago we had a similar feature on Instagram and perhaps FB; I don't know if it's still active in the EU on those Meta products.

Good.

LinkedIn is the bottom of the barrel of the labour pool. I wonder why anyone would even train on data from any of them.

Just imagine if we had the same privacy protections as the EU in the US.

Well, one can't have everything.

The average webdev in many countries of Europe earns pennies while the cost of living is still abysmal. Meanwhile, in the USA you only need to be breathing and have a CS degree to easily make quite a lot, more than enough to get by.


Development makes decent money here. Not the $250k you'd get in Silicon Valley, but the cost of living is also way lower here. It's definitely one of the better classes of jobs.

Fuck LinkedIn. They should have been sued for their exploitation of people's identities a long time ago.

Imagine an AI that's only allowed to mine the data of people too stupid to elect representatives who protect their privacy.


