Add an image on your own website that links to Facebook. Problem solved. You keep your like buttons, their servers are no longer involved in serving your web page.
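As a sketch of that approach, assuming Facebook's long-standing public `sharer.php` share dialog (the endpoint and the icon path here are illustrative, not guaranteed current):

```python
from urllib.parse import urlencode

def share_link(page_url: str) -> str:
    """Build a plain link to Facebook's public share dialog.

    No Facebook script or iframe is embedded; Facebook's servers are
    only contacted if the visitor actually clicks the link.
    """
    return "https://www.facebook.com/sharer/sharer.php?" + urlencode({"u": page_url})

def static_button_html(page_url: str, icon_src: str = "/img/fb-share.png") -> str:
    # The icon is served from your own host; rel="noopener noreferrer"
    # keeps window.opener away from the opened page.
    return (
        f'<a href="{share_link(page_url)}" target="_blank" rel="noopener noreferrer">'
        f'<img src="{icon_src}" alt="Share on Facebook"></a>'
    )

print(static_button_html("https://example.com/article"))
```

Until the click, the only request the visitor's browser makes is for your own image.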
https://www.jitbit.com/alexblog/256-targetblank---the-most-u... (This applies to more than just target="_blank")
Closed (fixed) Firefox bug: https://bugzilla.mozilla.org/show_bug.cgi?id=1503681
Open Chromium bug: https://bugs.chromium.org/p/chromium/issues/detail?id=898942
Either way, making sure that window.opener isn't available to random sites is a critical security measure, and some browsers require you to set noreferrer as well, so better safe than sorry.
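For sites that can't audit every template by hand, a crude server-side sweep can add the attribute. This regex-based sketch is illustrative only; production code should use a real HTML parser:

```python
import re

# Adds rel="noopener noreferrer" to <a ... target="_blank" ...> tags that
# lack a rel attribute. The regex only handles plain, well-formed tags.
TARGET_BLANK = re.compile(r'<a\s+(?![^>]*\brel=)([^>]*\btarget="_blank"[^>]*)>')

def harden_links(html: str) -> str:
    return TARGET_BLANK.sub(r'<a rel="noopener noreferrer" \1>', html)

print(harden_links('<a href="https://example.com" target="_blank">x</a>'))
```

Links that already carry a rel attribute, or that don't open a new tab, are left untouched.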
Not sure if FB still supports that, and where I work, we have used static buttons since forever.
Unless you have the rights holders' permission, the social links at the bottom of that article look like copyright and probably trademark infringements. (I'm not saying that's a good thing, just how it appears.)
> 8. You must not use or make derivative use of Facebook icons, or use terms for Facebook features and functionality, if such use could confuse users into thinking that the reference is to Facebook features or functionality.
We know that Facebook uses that paragraph against alternative like buttons.
Many years ago German computer magazine publisher Heise created a version of the like button that works like this: The button is initially greyed out and has to be activated by a slide button to be used. Communication with Facebook's servers starts only when the button is activated.
After threats from Facebook, Heise had to change the look of the initial button so that it has none of the Facebook branding. Only the dynamically loaded original button looks like a Facebook like button.
A link to the original alternative like button project (in German) is . A fork with English documentation is .
EDIT: Their current branding guidelines for the "thumb icon"  say:
> Do link the Thumb Icon directly to your Page on Facebook when using the Thumb Icon online.
So a thumb icon linking to your page should be OK.
EDIT 2: The branding guideline also says:
> Don't use an outlined thumb with the cuff detached.
So you can use the "Thumb Icon" but not in a way that replicates the current original Facebook like button because that one is outlined and has the cuff detached.
BTW this is exactly what Privacy Badger does: It replaces the original Facebook Like Button (cuff detached) with the thumb icon from the official assets (cuff connected).
> We know that Facebook uses that paragraph against alternative like buttons.
Did you quote the wrong section? That term can't really apply to alternative "like us on Facebook" buttons, because such a button can't confuse users into thinking it refers to Facebook features or functionality, because it actually does refer to Facebook features and functionality.
> Unsurprisingly, Facebook didn't like this change. A spokesperson told the German publication that the way it has implemented the Facebook Like button violates the Facebook Platform Policies, specifically quoting this clause:
>> 8. You must not use or make derivative use of Facebook icons, or use terms for Facebook features and functionality, if such use could confuse users into thinking that the reference is to Facebook features or functionality.
> > 8. You must not use or make derivative use of Facebook icons, or use terms for Facebook features and functionality, if such use could confuse users into thinking that the reference is to Facebook features or functionality.
As written, if the reference is to Facebook features or functionality, then there can be no confusion and this clause does not apply. This would seem to be the case here.
I'm adding European, because that's what the article is about and that's where I'm from. Not sure what would happen in a US court.
IMHO to be complete the law should require web widget providers to serve what it says on the tin.
AND NOTHING ELSE
For example, visiting a store doesn't give the store owner the right to search your bag.
Then let's not stop there and include all advertising???
The advertiser knows the topic of the website he is advertising on; he knows what kind of audience is attracted by a specific article. He can place his advertisement at the top or the bottom to further filter down.
This gives him everything he needs to advertise his product on that website. The webmaster can host the images. A neutral 3rd party, preferably a government agency, can track impressions and provide the advertiser with a crude estimate of traffic by region.
I think it shouldn't stop at having other people do all kinds of things and pay for it. The EU could easily fund its own technologies.
The EU could give you, say, a Facebook like button in HTML and require that you use it. That they have their own TOS is just irrelevant. Or better, Facebook shouldn't have to invest in terms of service at all: we should have detailed laws removing the need for a TOS. Standard laws for social networks should apply.
A restaurant owner doesn't have to clutter up his place with 100 no smoking signs. There is no contract to sign before you can eat.
Who cares? Facebook is a zero sum game at this point for advertisers/content creators. Facebook stacks the odds like a casino does at chuck-a-luck. There's only them winning here, nobody else.
The site owner that might be sued by Facebook for copyright infringement, for starters.
Facebook has no right to have their arbitrary code on other people's websites. So they can't force any specific way to show their button. From the end user's perspective it's all the same.
Except in cases of fair use, which isn’t nearly as broad as people think, the use of other parties’ images is subject to whatever licensing restrictions they choose to put on them. You can choose not to display their images if you do not accept those terms.
That’s a very broad question. For one thing, fair use is a concept only in US law. Other countries may have their own versions of it, but all of them have their own unique limits. Even under the US version, it is not at all clear that it would fall under fair use. If I start making t-shirts with the Facebook logo on them, is that fair use? No, it isn’t. Is there much difference between that and putting it on a website that I make money from? The right jury would say no.
By the way, this website does exactly that.
May I reach out to you (from qbix.com), as we are building a new social operating system for the web?
Unfortunately, we live in the day and age where designers and marketers feel it is more profitable and fashionable to distinguish between the ingroup and the outgroup, rather than to empower all users.
Because without that address bar, which tech people seem to think normal users don't use, they lose their ability to reason about the web.
It will wind up similar to the missing start button on Windows.
Disclaimer: I worked at a local college and had to support students in CS and other Microsoft Office courses. They lost it when Microsoft took away the Start button, but I would tell people it's still on your keyboard... Anyway, I also saw a lot of F11 people losing it.
If you are, I would very much like to hear how you are able to get a like from linking an image to Facebook.
Those who understand what you wrote are busy making money via Facebook ads, milking the gravy train while it lasts.
When was the last time you typed in nytimes.com or some similar foo into a browser?
Facebook for news...seriously? Even HN, though I sometimes follow links to news sites, is primarily of interest for me as a source of links to obscure blog posts and similar non-news stuff. The news links are usually colored as "already visited" for me. Nevertheless, quite often the comments to these news articles are still worth reading. Which stands in stark contrast to Facebook comments on news articles...
Also, I don't use Facebook. Most of the people I know navigate to the news site directly; social media is just an echo chamber, there are lots of links to obscure news sites publishing fake news, etc.
Granted, I'm sure I'm an outlier but it was an entertaining coincidence, especially as I don't typically read the NY Times.
There are so many mixed in signals when it comes to the "decline of news" that it's hard to pinpoint causes or use it in arguments.
So no, sending your data to Facebook isn't dumb, and in fact, it would destroy most smaller e-commerce businesses if they couldn't send their data to Facebook.
The loose thread now is in how the companies are required to communicate their data mining. These twenty page privacy policies that I agree to with a flick of the scrollbar and a button click, or these equally boring popovers when I visit a site, are where the governmental innovation needs to happen next.
Many of the popovers (basically all that you can't easily dismiss without giving consent to anything unnecessary) don't result in valid consent.
Twenty page privacy policies are also questionable: "the request for consent shall be presented in a manner which is clearly distinguishable from the other matters, in an intelligible and easily accessible form, using clear and plain language."
I also hope that the ones that ask for consent with a modal pop-up create a modal pop-up offering you to revoke consent on every page load: "It shall be as easy to withdraw as to give consent."
Strict enforcement of the existing rules is all that's needed. Getting consent is going to be really hard, to the point where web sites may be best off not asking for it, and only doing what they can without processing personal data.
I'd expect the first enforcement action to come either this year or early next year.
It would help if companies could respect the rules in the EU that say data collection should be voluntary and opt-in.
Then the privacy policies could be really short.
That said I agree with others that reasonable standard policies would be great for both consumers and businesses:
Something like the Creative Commons licenses comes to mind:
- 0, green: nothing (no analytics, no state, so no login possible)
- sessions, green: login possible
- telemetry, yellow: anonymized, short lived (< 3 business days) data, not linked to the user, not shared outside of development
- 1 party analytics, yellow: like telemetry but longer lifespan and shared outside of development
- 3 party analytics, red: uses Google Analytics standard edition or any other 3rd party tracker that shares data
>The information to be provided to data subjects pursuant to Articles 13 and 14 may be provided in combination with standardised icons in order to give in an easily visible, intelligible and clearly legible manner a meaningful overview of the intended processing. Where the icons are presented electronically they shall be machine-readable.
Such info could be tagged in page head and then you could do things like search for a forum that doesn't (according to policy) use your data for revenue (or share it outside the named business -- perhaps that's "PP0", in analogy to CC0), etc..
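As a sketch of what that could look like, here is a hypothetical generator for such head tags. The `privacy-class` vocabulary and tag names are invented for illustration and match no existing standard:

```python
# Hypothetical machine-readable privacy classes, loosely following the
# tiered scheme proposed above. Nothing here is standardised today.
CLASSES = {
    "0": "green: no analytics, no state",
    "sessions": "green: login possible",
    "telemetry": "yellow: anonymised, short-lived",
    "1p-analytics": "yellow: first-party, longer-lived",
    "3p-analytics": "red: shared with third-party trackers",
}

def head_tags(privacy_class: str, policy_url: str) -> str:
    """Emit <head> markup declaring the site's privacy class and policy."""
    if privacy_class not in CLASSES:
        raise ValueError(f"unknown class: {privacy_class}")
    return (
        f'<meta name="privacy-class" content="{privacy_class}">\n'
        f'<link rel="privacy-policy" href="{policy_url}">'
    )

print(head_tags("telemetry", "https://example.com/privacy"))
```

A search engine or browser could then read the declared class without scraping a twenty-page policy, much as CC licence metadata works today.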
Just thinking on my feet, E&OE.
That is something I'd like to see standardized and mandated - complete with mandatory audits so it doesn't fizzle out like it did last time.
Also, maybe it would work if it was legally enforced now? I suspect this was a case of Too Soon™.
Or at least have those as standardised starting points that cover the routine points that will be the same for 90% of data processing operations, so you only have to specify additional detail for things that might be unusual or surprising.
It would simplify things greatly if instead of all that boilerplate, a short list of one-liners is all you need to state if you're only performing normal data processing for common purposes, as defined by official privacy standards along the lines pbhjpbhj suggests but perhaps specific to each common purpose. Then you only need to elaborate on anything unusual or particularly sensitive, and anyone interested in how you're processing data about them can quickly identify such cases (or verify that there aren't any and they don't have anything to worry about).
And the way to do that is standardisation.
In many situations, at least here in Europe, you can go about your normal life without worrying too much about tricky contracts catching you out. There are consumer protection rules that restrict what can be done, prohibiting it entirely in some of the most serious cases, but also setting out reasonable expectations in some sense so that any business wanting to violate those expectations has to be clear about their alternative or might find it doesn't stand up if challenged.
One difficulty with the online world at the moment is that because it's very international in nature, even rules that apply across say the whole EU or at federal level in the US don't necessarily provide any guarantees to visitors of websites or recipients of emails because the business or other organisation they're dealing with might not be in the same jurisdiction as them.
On top of that, these big data-hoarding organisations pose an unprecedented threat to our privacy and ultimately to our freedom and way of life because there is an unprecedented amount of data collection and processing going on. Some things didn't really matter much at a small local scale, like the person passing you in the street seeing your face and knowing you were there at that moment in time, yet forgetting you a moment later. The exact same data points can matter a great deal more when we're talking about huge numbers of them being collected and collated by a single entity that can then process a more informative data set in ways that would never have been possible in the simpler case. Now the marketer or the government or the criminal who hacks the marketer or bribes the government official has a detailed record of your normal daily movements and any anomalies, or your spending patterns across everywhere you shop and everything that says about you, and so on.
We need a clear basic framework for what we as a society are and are not willing to permit in these areas, for how we trade off the potential advantages of organisations that might genuinely be trying to help us having access to more data against the potential risks of organisations that are not necessarily acting in our best interests having access to more data, even if in some cases they might be the same organisation using the same data in different contexts.
I personally regard the GDPR as a swing and a miss in this context. The intent might have been good, but it's so complicated and ambiguous that in many ways it creates problems rather than solving them. Crucially, that is particularly true for organisations that were trying to be responsible about how they work with personal data and privacy issues, which might have been looking to the GDPR and the national regulators for clarity about the ethics and legality of different practices with pros and cons.
To pick a less obvious example, maybe we should have clear defaults about analytics. For example, perhaps a business is allowed to monitor how its customers are using its own hosted systems by default, but activities like accessing users' personal data uploaded to those systems for other purposes, exporting users' personal data from their local devices, or sharing any of this data with third parties requires explicit disclosure and maybe some level of consent.
Privacy policies could indeed be much clearer if only the exceptions to common sense had to be declared in some standardised way, and if an acceptable definition of "common sense" were itself provided somewhere through legislative or regulatory means.
Personally, with massive PII dumps getting leaked every week I'm not surprised governments are starting to act.
No, you don't. That's covered by the rule "Compliance with a legal obligation" because you have to do it, but only store as much as you need.
If you have a legitimate basis to collect and store personal data for some purpose X, then that doesn't allow you to use the data you collected and stored for anything else - if you want to use the same data for some other purpose (like targeting ads or given them to your "partners" to target ads), then you need consent; and if you give them to your "partners" to allegedly execute that legal need X but it turns out that they're using it to target ads or reselling data, then you're liable for that.
That's debatable. The GDPR itself explicitly notes [Recital 47] that even direct marketing can constitute a legitimate interest.
However, there are specific provisions for that case, particularly the explicit provision [Article 21, para 3] that if the data subject objects to processing for direct marketing purposes then that is black and white and that processing must be stopped.
Compliance with a legal obligation is valid grounds to store and process data, but the information requirement still applies - you need to inform the customer what you're collecting and why, you just don't need their consent in this case.
E.g. the GDPR article 13.1.d / 14.2.b - you need to inform the data subject about what exactly is your legitimate need that justifies the processing of data; and customers then can judge whether that need (and the collected data for it) seems reasonable or warrants a complaint to the regulator.
The GDPR is an EU regulation. An EU regulation is a bunch of rules that have direct legal effect across the Union.
That's a legal basis for processing, which you also have to disclose. It doesn't exempt you from disclosing other required information such as the types of personal data you're collecting or your policy on retention.
If the sites are relying on consent as their legal basis for processing personal data then hiding it in those policies is 100% a violation of the GDPR.
Enforcement action is unlikely to make headlines, though as it'd be such an open-and-shut case it won't even make a courtroom. The supervisory authorities will just impose administrative fines.
This ruling feels poorly thought out to me. Activities on the web aren’t totally private, that’s how it’s always been. Getting rid of 3rd party content makes it ... kind of not the web anymore.
The Facebook like button is a web tracker, disguised as a social engagement button. If not its primary, then its secondary function is to (indiscriminately) track users and non-users outside of its walled garden, like some reversed Trojan Horse.
Hotlinking an image is just that: hotlinking an image. Facebook relies on us and lawmakers to say "we just can't ban third party content!", while we could perfectly well leave innocent third party content alone and focus our sights on the spy button. It isn't reasonable, nor common sense, to conflate the two: even if similar in syntax, the context is vastly different.
You can't build a business on assumptions made on an ambiguous ruling. And while common sense seems reasonable there it has no definition. Why should investors take the risk?
An image served from a third party server can very much have a secondary function to track users.
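To make that concrete, here is a rough illustration of what a browser typically attaches to a hotlinked third-party image request (the header set is simplified and real browsers vary):

```python
# The Referer header reveals which page you were reading, and any cookie
# previously set for the image's domain identifies you across sites.
def third_party_image_request(image_url, page_url, stored_cookie=None):
    headers = {
        "Host": image_url.split("/")[2],  # the tracker's domain
        "Referer": page_url,              # the page you are reading
    }
    if stored_cookie:  # present if the tracker's domain ever set a cookie
        headers["Cookie"] = stored_cookie
    return headers

req = third_party_image_request(
    "https://tracker.example/pixel.gif",
    "https://news.example/article-42",
    "uid=12345",
)
print(req)
```

Even a plain `<img>` with no JavaScript produces this request, which is why the image itself can double as a tracking beacon.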
Swap "costly" for "mildly inconvenient" and then I could almost see where they're coming from but I think they're missing the forest for the trees here. Let the "like button" die, rulings like this take the wind out from beneath it and eventually it's a metric you'll never be burdened with.
Either they take a stand against holding all businesses liable for transfer of data, or those who pay them a membership fee will see an increase in liability or a decrease in the perceived precision of advertising data.
They're not missing the forest for the trees. They're busy logging the forest for the trees, and would very much like everyone to hear their call to action and join their side.
One solution would be to have a "turn on like" button, but the image solution that some comments point out seems like a good option as well
(Or fb could have data sent only when the person clicked the button, but that's unlikely to happen)
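The "turn on like" (two-click) idea can be sketched server-side: render a self-hosted placeholder until the visitor opts in, and only then emit markup referencing Facebook. The plugin URL below is Facebook's classic iframe endpoint and is illustrative only:

```python
# Sketch of a consent-gated ("two-click") embed: nothing from Facebook is
# referenced until the visitor has explicitly enabled the button.
def render_like_widget(page_url: str, consented: bool) -> str:
    if not consented:
        # Self-hosted placeholder; clicking it would set the consent flag
        # (e.g. via a first-party cookie) and re-render the widget.
        return '<button class="enable-like">Enable Like button</button>'
    # Only now is any third-party content referenced at all.
    return (
        '<iframe src="https://www.facebook.com/plugins/like.php'
        f'?href={page_url}"></iframe>'
    )

print(render_like_widget("https://example.com/article", consented=False))
```

Without consent the page contains no Facebook URL whatsoever, so the browser never contacts their servers.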
> Under EU data protection law, therefore, a European retailer and the US platform are jointly responsible for gathering the data
I really hope this means that Facebook and all those stats/ads providers can be held responsible if they don't take adequate measures to ensure that only data from users who have given valid consent is sent.
Going after individual site operators is tilting at windmills. It would be much more effective if they could go after a company that provides an Ad SDK to hundreds of thousands of apps, but just tells the app developers in the fine print "by using this SDK you confirm that you have gotten consent from your users" - and as a result, knowingly accepts that nobody will care and data from non-consenting users will be collected.
Yes, it does mean that, and on top of it authorities can also go after each web site that has a like button on it.
I know these are a novelty on Android where most people use Chrome because it's pre-installed - but add-ons are small, self-contained downloadable additions to your browser. There are multiple such add-ons that will block ads for you. They also work in-app where the Firefox WebView/custom tab is used.
I still don't understand which law allows Android _not_ to show a choose-your-default-browser screen, like Windows had to after the court decision.
This is old tech, and it works very well, on any variant of android. Apple is the one that's very behind.
Then it doesn't matter. If she trusts Google, there's nothing to do.
I used to work for one of the larger social sites in the UK with many millions of unique users and we found that the social buttons got next to no engagement. Before I left we began the conversation about removing them entirely as they were just dead space on the page.
However, there are whole businesses that have been built around these buttons (e.g. Upworthy).
And yes, I would have thought the same logic would apply.
CDNs are especially nefarious when it comes to privacy because they make it so hard to block third party content while retaining functionality.
But for data collection there is a big difference, given that FB knows who you are (you are probably already logged in) and logs/uses that data. Do CDNs such as Cloudflare, Google Fonts etc. log and analyse usage behaviour?
Regardless, I agree, it will be very interesting to see what happens when all these external services are dragged into GDPR cases. Lots of sites include fonts, chat widgets, buttons and other stuff that track behaviour without the user's consent and are left out of their privacy policies.
Though this is a firewall for the people against business malpractice. Which is a good thing. I'm sure there will be many cases of this causing issues, but on the whole, it does fall in favour of the end-user, us the people.
I say firewall, more an IDS that reacts to breaches. But it is good how they are at least not ignoring and overlooking such details and this is a fine example of it being well thought out.
The only way the third party site can know that is via third party cookies, or attempting fingerprinting as a third party iframe. Do you see any other way?
I thought that the EU already realized that this is a matter of cookies - in this case third party cookies of a site that you HAVE logged into. Browser makers should just let the user make a decision whether they want the requests to be automatically sent with third party cookies in this case — OR to explicitly approve every single time they log in using oAuth or want to share something.
That's essentially what a Facebook like button is
IMO, decent websites have been using something like that for a long time (the small subset of them that have like buttons, that is), so nothing will change for them.
Why should I have to surrender my privacy to read an article on your site? Why do you think that it’s ok?
The only time your site should be sending tracking information to someone your users have not explicitly stated they want you to share with is when they have actually interacted with the button. Not a mouse-over, not a resource load, not an invisible overlay.
The user has to consciously opt in to that.
If you can't ensure that your site isn't abusing users/readers, you need to gate all your pages with a page stating that you will be providing other companies with tracking information that reveals their browsing history. You should also list all of the companies you will be sending that data to.
If you don’t want to do that because it will hurt “engagement” or “conversion” that’s your problem.
Alternatively you could have a banner that says “you’ve used our site so we sent information about your browsing history to these companies, and there is no support for deleting that information. We recognize that you may not like that but we don’t care about your privacy, and have no intention to preserve it”
The browsers (well Safari at least) actively work to break those things being used for tracking, but fundamentally (and the reason FB, etc require you to embed JS that loads their trackers) tracking companies treat user privacy systems as an adversary and continuously update to defeat it. Look at Google circumventing it in the past (and being hit with fines because of it). Nowadays they’re simply more clever in not crossing legal lines.
False. A browser could easily be configured to block or prompt before loading 3rd party content (early versions of IE used to do this). It would be very annoying, but it's possible, and that's where we're slowly going with all the cookie/GDPR popups. There's always a tradeoff between security and convenience.
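The blocking decision itself is simple to approximate. This sketch compares registrable domains by crudely taking the last two DNS labels; real browsers consult the Public Suffix List, so e.g. `bbc.co.uk` would break this shortcut:

```python
from urllib.parse import urlsplit

def registrable(host: str) -> str:
    # Crude approximation of the "registrable domain": the last two DNS
    # labels. Real browsers use the Public Suffix List instead.
    return ".".join(host.split(".")[-2:])

def is_third_party(page_url: str, resource_url: str) -> bool:
    """Would loading resource_url from page_url be a third-party request?"""
    return registrable(urlsplit(page_url).hostname) != registrable(
        urlsplit(resource_url).hostname
    )

print(is_third_party("https://shop.example.com/item", "https://www.facebook.com/like.png"))
print(is_third_party("https://shop.example.com/item", "https://cdn.example.com/app.js"))
```

A prompt-before-load feature would call something like this for every subresource and ask the user only for the third-party ones.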
People are trying to legislate what should be a technical solution.
Read paras. 71-81 in the judgment - it sounds to me like 3rd-party advertising would be covered by the same logic.
I mean, I get it... but the whole point was to stop the behavior, not side step it.
That is true...unless you are not based in the EU and don't "envisage" (a term used in the GDPR) serving EU customers. Then you don't have to deal with any of this nonsense and are free to add whatever like buttons/analytics solutions you would like. A US site that doesn't offer translations in European languages, doesn't accept EU currencies, and doesn't use an EU domain extension, is not subject to GDPR - even if EU users can access it.
"Whereas the mere accessibility of the controller’s, processor’s or an intermediary’s website in the Union, of an email address or of other contact details, or the use of a language generally used in the third country where the controller is established, is insufficient to ascertain such intention, factors such as the use of a language or a currency generally used in one or more Member States with the possibility of ordering goods and services in that other language, or the mentioning of customers or users who are in the Union, may make it apparent that the controller envisages offering goods or services to data subjects in the Union."
A side project of mine, starting in the Junkbuster days, is fighting cross-site tracking/profiling, and almost every Web site does it at least a little. Legal precedents suggesting liability for that seems huge, and maybe end the technological arms race (which I think the privacy&security people will otherwise ultimately lose).
What cost is involved in not embedding third party beacons on your website?
How does it not improve consumer protections? It's literally stopping doing the thing that is causing harm.
> “With its decision, the ECJ places enormous responsibility on thousands of website operators – from small travel blogs to online megastores and the portals of large publishers,” Bitkom CEO Bernhard Rohleder said.
Yes, this is exactly how serious this situation is. I'm glad you're getting a handle on just how damn huge this problem really is. Aren't you glad we're finally doing something about it?
> He warned that the decision would go beyond Facebook and affect all social media plug-ins, which are important for many firms to expand their reach on the web
Uh, yes, that's the idea? Your firm's right to expand its reach does not overrule my right to privacy.
People can still share and like your links on the social platforms. It doesn't require me to be forced into it.
The board (Präsidium) includes the CEOs of Microsoft Germany, IBM, Hewlett-Packard, Samsung as well as SAP (they are at least German), Vodafone, Deutsche Telekom, etc.
Bitkom is known to promote weak privacy rules and big data analysis.
With weak arguments like those, it seems it's not a very good one...
Share Count queries to the Social Networks are proxied through this service securely and visitor privacy is protected... like an anonymous VPN.
This URL shortener service is also GDPR compliant, as retargeting pixels are not set for EU subjects regardless of what customers want to set. On the roadmap is adding an opt-in message on the redirect.
Btw, Share Count Proxy is also whitelisted by Firefox which provides the added advantage of share counts actually showing on Firefox if you use the proxy while direct calls to Facebook.com, Pinterest, etc are blocked.
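The privacy-relevant core of such a proxy can be sketched as stripping identifying headers before making the upstream request. The header list here is illustrative, not exhaustive:

```python
# Sketch of the privacy-relevant part of a share-count proxy: the upstream
# request carries only the URL being counted, never the visitor's identity.
IDENTIFYING = {"cookie", "referer", "x-forwarded-for", "authorization", "user-agent"}

def upstream_headers(client_headers: dict) -> dict:
    """Drop anything that could identify the visitor before proxying."""
    return {k: v for k, v in client_headers.items() if k.lower() not in IDENTIFYING}

clean = upstream_headers({
    "Cookie": "uid=12345",
    "Referer": "https://blog.example/post",
    "Accept": "application/json",
})
print(clean)
```

Since every visitor's query now looks identical upstream, the social network only ever sees the proxy, which is what makes the "anonymous VPN" comparison apt.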
For a company on that scale to remain compliant with things like the cookie law (mention every cookie and what it does for opt-in), there is no easy way to see if you're compliant. We need some standard (like security.txt) which defines how cookie data, impressum or other site-specific links are presented, and which has to be machine readable. Right now every company creates its own mess of HTML, which is no fun scraping to figure out whether the company is compliant or not. (Yet scraping is what everyone in compliance expects to happen.)
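As a sketch of such a standard, modelled on security.txt, here is a hypothetical declaration file and parser. The file name, location, and keys below are invented for illustration and match no existing specification:

```python
# Hypothetical "/.well-known/privacy.txt", modelled on security.txt.
SAMPLE = """\
Policy: https://example.com/privacy
Cookies: session=login, _ga=third-party-analytics
Data-Shared-With: Google Analytics
Contact: mailto:privacy@example.com
"""

def parse_privacy_txt(text: str) -> dict:
    """Parse simple 'Key: value' lines, ignoring comments."""
    fields = {}
    for line in text.splitlines():
        if ":" in line and not line.startswith("#"):
            key, value = line.split(":", 1)
            fields[key.strip()] = value.strip()
    return fields

print(parse_privacy_txt(SAMPLE)["Data-Shared-With"])
```

A compliance crawler could then fetch one well-known path per site instead of scraping arbitrary HTML.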
I wonder how these laws can be enforced without creating a huge administrative backlog.
As I understand it, it will include things like Do-Not-Track and a better cookie banner legislation, which makes the banners less common.
Enforcement is trickier. Let's wait for a few more rulings and see if that's enough.
I fail to understand how anybody could scrape a web site to figure out whether it is compliant with the GDPR. For example, if I claim my site encrypts your data at rest, how do you verify it by scraping the site? If I say I don’t share your data with third parties, how do you verify it by scraping the site? If I say I throw away the encryption key when you delete your account, how do you verify it by scraping the site? if I say that, after deleting your account, all data is gone after at most 30 days, how do you verify it by scraping the site?
The GDPR isn't solving a (purely) technical problem. It applies even if you're using a pen, paper, and a filing cabinet just as much as if you're running a global social media platform.
What's the technical solution to showing compliance with "data protection by default and by design"?
What's the technical solution to ensuring that "only personal data which are necessary for each specific purpose of the processing are processed"?
These are inherently organisational issues, not technical ones.
Facebook and FashionID are joint data controllers, and FashionID aren't liable for additional processing that Facebook does with the data.
If consent is the legal basis upon which the processing is based then both entities must have consent.
That is: “to give you this service we need to store some info” - OK.
“To give you this service we need to share info with advertisers” - not ok.
That is: you need to be able to provide the service using only non targeted ads if the user wants it.
Mark my words. This is going to have innumerable unintended consequences and the Internet will suffer for it. Fuck the EU.
Yeah man, just like people self-regulated to only drive at safe speeds, always wear seat belts, not hand over money to scammers.
Pray tell, how is the EU authoritarian? You do know that every EU government member is elected, either directly or indirectly, by the EU citizens, right?
(Perhaps a better requirement would be to require the browser distributors to include warning labels about such features if they are done automatically.)
If you do not like the idea that you own your data, not companies, then you can give everybody your consent to process it.
What they do with the data you give them is a separate issue from the web browser. The company you are dealing with still needs a proper policy for that, but that is a different matter from the client configuration.
Requiring a warning message about cookies on the web page is not helpful, because that is the wrong place to put it; the browser can provide its own such warning, and the user can configure it. (Lynx provides the possibility to ask when a cookie is received.)
So, the actual problem is the browser providers designing browsers stupidly, and making them so complicated that it is difficult to make a new one that is actually good.
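As a concrete sketch of the Lynx behaviour mentioned above, its lynx.cfg can enable cookies without accepting them silently, so the user is prompted for each one (directive names as in stock lynx.cfg; the domain values are illustrative, not a recommendation):

```
# lynx.cfg (sketch): support cookies, but do not accept
# them silently -- prompt the user for each incoming cookie.
SET_COOKIES:TRUE
ACCEPT_ALL_COOKIES:FALSE

# Optionally pre-decide for whole domains, so the prompt
# only appears for sites with no configured policy.
COOKIE_ACCEPT_DOMAINS:example.org
COOKIE_REJECT_DOMAINS:facebook.com
```

This is exactly the "warning in the client" model the comment argues for: consent handled at the browser, not via a banner on every page.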
edit: I hope that this will also hold for Google with reCAPTCHA and Analytics.
On the other hand, I guess this is only one more checkbox to tick among the checkboxes we already have to tick.
It’s more than a formality, it means that either party could get sued if they violate the terms of the agreement.
Having contractual relationships in place is common in this type of legislation. HIPAA regulations require formal contractual relationships with suppliers and contractors.
Well, the part companies don't seem to "get" is that this consent should be informed and voluntary, which means opt-in, and not only available after 3 minutes of jumping through ridiculous hoops to opt out.
I'm with you that consent before action would be the right way to go. But since we can't rely on sites to be ethical, it'll stay the browser's business to protect the user.
I don't know how other countries are doing, but Germany's officials are apparently very understaffed, so complaints will regularly sit for months and they won't have a lot of time to understand the details, so I don't put my trust in oversight for the foreseeable future.
Don't know of Germany's position but the UK's ICO added significant staff and budget when the switch from Data Protection to GDPR came in.
It's not unnecessary data - FB needs that data to target users with ads, the shop needs FB to show ads to people that are likely to buy what they've been shown. I do agree with you that this would be very different if there was no business interest in tracking users, but there is.
Google Ads have already been fined €44m under GDPR, and I believe there is another case already in the French system.
Showing ads is not the service provided; nobody would opt into them if given the choice. FB and Google should be targeting in non-personal ways, even where that means less precise targeting. That was part of the point on which Google got their €44m fine.
As seen in TFA, the web of third-party tracking appears to carry more liability, and to require more consent, than many companies, particularly American ones with a vested interest, have claimed while hiding behind those claims.
As far as I am aware, and I'm fairly certain about this: funding is not a valid legitimate interest under GDPR. If it was it would be a loophole big enough for a medium sized planet ;-)
The difference is that they now have to inform you that they are doing it, who is involved and who to direct requests for information / deletion to.
I think "legitimate interest" has a different meaning in the context of GDPR. I'm fairly certain about this, but you don't need to take my word for it: https://duckduckgo.com/?q=gdpr+%22legitimate+interest%22&t=h...
As I said: if your interpretation was correct (in this context) it would be a loophole so big it would make the rule meaningless.
> The difference is that they now have to inform you that they are doing it...
I don't think so. AFAIK the difference is they now need to make it opt in and voluntary.
That's my point - actually, not a lot has changed. There's just larger fines and more bureaucratic hoops.
It's why you still do see all kinds of tracking - but you'll now get information about it.
I have not heard statements going against this from any lawyers.
I'm interested because my understanding has been most Europeans understood it the same way I did.
Not saying I fully believe that explanation yet, but I'll try to find out more.
BTW and FWIW: I'm not the one downvoting you and I disagree with those who do.
ICO says this about choosing a lawful basis: "You must not adopt a one-size-fits-all approach. No one basis should be seen as always better, safer or more important than the others, and there is no hierarchy in the order of the list in the GDPR."
I think I'll stay clear of your legal advice.
> According to the European Court of Justice ruling, a site that embeds the Facebook “like” icon and link on its pages also sends user data to the US web giant.
This is categorically false. The site that embeds the like icon is sending absolutely nothing to Facebook. The user's browser is the one sending information. You have control over your browser. You can do something about it if you don't like it.
The EU's regulations infantilize the public and remove consumer choice.
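For what it's worth, the "you can do something about it" point cuts both ways: site owners can too. The static-button approach mentioned elsewhere in the thread is just an image served from your own host, wrapped in an ordinary link, so the browser contacts Facebook only on an explicit click (URLs and the icon path below are illustrative):

```html
<!-- No third-party request on page load: the icon comes from
     your own server, and Facebook's sharer endpoint is only
     contacted when the user actually clicks the link. -->
<a href="https://www.facebook.com/sharer/sharer.php?u=https%3A%2F%2Fexample.com%2Farticle"
   target="_blank" rel="noopener noreferrer">
  <img src="/static/share-icon.png" alt="Share on Facebook">
</a>
```

The `rel="noopener noreferrer"` matches the target="_blank" caveat raised earlier in the thread, and per the branding discussion above, the icon itself should not imitate Facebook's trademarked thumb unless their brand guidelines are followed.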
> Both which are in their full control.
So is your car when you leave the car wash, yet if they had put something damaging in the gas tank, I'm betting you'd complain about them.
And so the EU joined the US in fining companies a few million for privacy violations.
It certainly does not convey that intuitively. Nothing's preventing Facebook from presenting greyed out versions of their icons, for example.
How does forbidding hostile user-data harvesting mean that innovation will never happen?