Microsoft .Net Core telemetry is not opt-in (github.com/dotnet)
140 points by setquk on May 29, 2018 | 114 comments

It boggles the mind that a bunch of very smart people at Microsoft thought this was a good idea. Say what you will about Oracle, but Java has never been malware that continuously spies on users. Unfortunately the GDPR explicitly allows this kind of IoT-style ubiquitous anonymized monitoring: as long as the information cannot be tied back to an individual person (meaning no IP addresses, but GUIDs are fine) and the business can claim a legitimate business interest, they're likely to get away with it. This sort of ubiquitous monitoring is only going to get more popular. In Hong Kong we're seeing new condo apartments that have full-fledged "internal sensor grids" a la Star Trek (as it was described to me). In theory, though, the data collection firm (which is not the same firm that owns the building) has no way of correlating it back to a specific person/apartment number.

Like all things, there must be a sense of context and proportionality. This issue is specifically about the .NET Core SDK command line tools.

Getting data about which commands are used most, if they're successful or throwing errors, possible typos, unclear command flags, etc, are useful when shipping software frameworks used by 10s of millions to build products that run the entire Fortune 500 and thousands of other businesses.

This is not ubiquitous spyware, and is about as generic as it gets while still remaining useful. Every software company (many of which are featured here on HN) does the same analytical tracking to figure out how their products are being used and it results in better features. What exactly is the danger here, especially with the lack of personal info and an opt-out method?

They even make the data public along with insights into how it helped: https://blogs.msdn.microsoft.com/dotnet/2017/07/21/what-weve...

> Like all things, there must be a sense of context and proportionality. This issue is specifically about the .NET Core SDK command line tools.

Exactly. Command line tools don't usually collect your data and send it off somewhere without asking. You wouldn't expect ls, cat and grep to spy on you. That's why it's so surprising.

But even big interactive applications like IDEs don't normally do this without asking. IntelliJ had a dialog pop up just a few days ago to ask me nicely to participate, defaulting to non-participation (!). Visual Studio and Office used to do the same thing, as I recall.

(Additionally, once you figure out how to opt-out in .NET Core, does it tell you somehow that you did so successfully? I think I've set the environment variable correctly, but how can I be sure?)

> Exactly. Command line tools don't usually collect your data and send it off somewhere without asking. You wouldn't expect ls, cat and grep to spy on you. That's why it's so surprising.

Except in Ubuntu Linux, where apt-get has done so for ages without anyone complaining:


Has it? It says right on the page you're linking to that it's disabled by default, so you would have to explicitly opt in. Thus nobody complaining, as those who don't want it won't enable it.

I think it comes from Debian[1] actually, where it's also opt-in – my Debian machine doesn't have it installed. Like it says in the Debian package description: "Vote for your favourite packages automatically". It's voting, not spying – strictly voluntary. Those of us who don't want to vote don't have to.

[1] https://popcon.debian.org/

> it's disabled by default

Fair enough, but it wasn’t always like that.

> “Vote for your favourite packages automatically". It's voting, not spying

That’s just playing with words. What it does is exactly the same: provide the upstream author statistics about what is being used and not.

Surely you can’t mean that this is a serious argument.

My argument is in the part of the sentence you didn't quote.

When people go to vote for their government, is that implemented by sending government agents to eavesdrop on people's conversations to suss out what their political preference might be? No. People voluntarily (this is the core of my analogy) go to vote, to make their voice heard.

That's the difference between voting and spying.

Of course, if popcon were enabled by default, then that would be different and not like voting at all! But that's not what it said on the page you linked.

Collecting command line arguments is a HUGE invasion of privacy.

Imagine I ran a command "make.exe my-secret-project-name" or "make.exe process_gdpr_removal_request SOME_SSN".

And as the linked page admits, they really do collect all command line arguments, not just a whitelist of valid commands. Perhaps someone accidentally pastes an AWS API secret into the wrong terminal window and it ends up as "dotnet.exe MY_AWS_SECRET". ("You will notice misspellings, like “bulid”. That’s what the user typed. It’s information").

They don't collect command line arguments according to the github issue.

According to this PR they do collect command line arguments.


"My understanding is that this was recently discussed with our privacy team and we concluded that collecting the arguments themselves (hashed or not) is not acceptable per our privacy policies. Not sure whether the code already reflects that, but it's being worked on."

Somebody in the thread for the issue said it's "not acceptable per our privacy policies" and "being worked on" two years ago, but did they ever actually stop collecting command line arguments?

They do collect at least the first argument, how else do they end up with "bulid"? Accidents happen and someone could paste or tab-complete a confidential thing in that spot...

I guess they whitelist it. If someone types a valid command, send the command. If someone types an invalid command, then it probably sends something that isn't descriptive. (Assumptions I'm making.)

So if you push your api key to github by accident is github responsible?

Invalid comparison, in my humble opinion.

Pushing a secret API key to github, even accidentally, does not happen covertly or as an unexpected side effect of some other tool written by github, and you'll soon realize what happens because you'll see your key sitting there. You also have a general idea of what you are playing with when you start using github, whether it's private repos or public repos.

Having a local CLI tool submit your secret API key as part of telemetry collecting command line arguments is totally unexpected and invisible for the user, and you won't even see it has happened.

> Perhaps someone accidentally pastes an AWS API Secret into the wrong terminal window and it ended up as "dotnet.exe MY_AWS_SECRET".

While I'm not defending the practice, I don't see how, when you make a copy-paste mistake into a wrong app, you blame the app for receiving the result of that mistake?

> you blame the app for receiving the result of that mistake

Yes. People all make mistakes. Good tools mitigate mistakes, while bad tools (like this telemetry) exacerbate problems.

So when you paste your secrets by mistake into your browser's instant-search box, do you blame your browser for sending that data over the network?

I would say there is a different expectation of privacy. When I am executing arguments on my command line, I am not expecting it to get collected and sent anywhere.

Does it happen in other apps? Sure. But I would not expect my command-line builds are phoning home what exactly I am doing.

Of course not. The browser performed its primary task, and the consequences are reasonably obvious to the user, who can at least make an informed corrective action.

Is the difference not obvious?

I think I should have said: the accidental paste is irrelevant here. It will always be troublesome if you paste secrets in the wrong place (especially if it takes it to the network).

When you run make.exe some-program-that-uses-aws YOUR_AWS_SECRET intentionally it is troublesome by itself.

> It will always be troublesome

You have a strange notion that small troubles and big troubles are all the same.

Any tool that purposefully builds in unnecessary minefields like these deserves to be shamed.

Does this justify hashing user identification for later correlation with other products?


Java has been pushing "sponsored offers" (i.e. unwanted spyware like browser toolbars and similar user-hostile software) in their installers for years, so they're not exactly a shining beacon of light.

Yes, Java has been a perpetual problem, at least the official JRE and JDK. The JRE ships with junk, and the JDK forces you to forge an agreement cookie to download it.

OpenJDK is fine however.

I haven't had a problem with that for a while.

The installers I've been getting directly from the Oracle JDK pages haven't installed anything extra, or asked to.

Are you sure? It's been there since 2013 (https://www.computerworld.com/article/2494794/malware-vulner...) I don't think there's been any change since. Java 8 still had a page about disabling the offers: https://www.java.com/en/download/faq/disable_offers.xml

The option to suppress sponsor offers is still there in the control panel for Java 10.0.1.

It's of course possible that some users don't experience these "sponsored offers" due to geotargeting, A/B-testing, lack of available "offers" at a specific time, or any other reason controlled by the server side of the installer.

> but GUIDs are fine

In general terms, I think you are mistaken there: as long as information, including pseudonymous information, can be traced back directly or indirectly to an individual, it is to be considered personal data.

If it is genuinely anonymous, then it's far less of a problem and not actually spying on individual users.

I went down the rabbit hole on this one a little. They're sending a SHA256 hash of the user's MAC address with every telemetry event (which can be trivially reversed; I leave it to the reader to decide whether MAC addresses are personal identifiers).

And also for what it's worth, it appears that they are using this ID to correlate what users are doing across their products - as indicated by this comment: "// The hashed mac address needs to be the same hashed value as produced by the other distinct sources given the same input. (e.g. VsCode)" (https://github.com/dotnet/cli/blob/b45f1fb439b36872c249b07f1...)

That's a blatant violation then.

The problem is that deanonymization [1] is a thing. It's quite likely that the Microsoft dataset is itself anonymous, but when it's correlated with other datasets it becomes possible to surveil people. The danger here is that governments are happy to let businesses collect tons of anonymous data and then, when the moment is right, swoop in and correlate this data. This happens so often with stuff like IP addresses that they have to be considered personal data, even though in theory the same IP address can be recycled.

[1] https://en.wikipedia.org/wiki/De-anonymization

Yeah, I kind of agree here. If it is possible to track down a user's identity in a systematic way when using these methods, it shouldn't be too hard to prove that in a whitepaper that would stand up in court?

This stuff is Free and Open Source, right? And the phone-home stuff is done with ordinary network connections, right? So the telemetry could be stopped either using firewall rules, or by modifying the code.

Edit: looks like it can be disabled with the "DOTNET_CLI_TELEMETRY_OPTOUT" environment variable, following https://github.com/dotnet/cli/issues/3093#issuecomment-22997...
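For a shell session, the opt-out mentioned in that issue looks like this (setting the value to "1" is the commonly documented form; verify against the linked docs):

```shell
# Opt out of .NET Core CLI telemetry for this shell and its children
export DOTNET_CLI_TELEMETRY_OPTOUT=1

# Sanity check: confirm the variable is actually set
echo "$DOTNET_CLI_TELEMETRY_OPTOUT"
```

Note this only lasts for the current session; it has to be made persistent (shell profile, /etc/environment, system settings) to stick.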

There seems to have been some sort of recent push at Microsoft on this.

A little while back, an update to Office for Mac caused each application to pop a dialog on startup to share "diagnostic data" with Microsoft. The only options were "Full" and "Basic" data sharing. The buttons in the dialog were "Accept" (once you'd clicked an option) and "Learn More". No "Don't Share Data" option.

Many users complained in the support forums about this, saying they wanted a way to completely opt out. Worse: if you closed the dialog without selecting any option, you were automatically "opted in" to "Full" data sharing (as you could see if you then went into settings and looked at the privacy tab).

This week they rolled out another update, which sneakily looks like it fixed this: now the dialog has buttons "No" and "Yes", but if you read it, the question it's asking is "Can Office send enhanced error reporting?" There is still no "do not share data" option.

I'm surprised they haven't gotten more flak from large companies and governments about their telemetry.

I have been contacted twice in the past year by Microsoft reps, who both told me that one of my products is among the top installed add-ins for Excel (they didn't say top "X", just "top", but I'll take it). I assume they collect this data via telemetry.

What bothered me is that when I asked questions back, I would get no reply. If they are going to collect this data, the least they could do is help developers out with some aggregate data. I've been selling this product for 10+ years, and to this day, I still vacillate on which version of the .NET framework to target.

I've always gladly submitted bug reports and begrudgingly accepted some telemetry in the belief that it helps the developers fix bugs and build a better product. But apparently this data isn't even passed back to developers?

.NET Core does not "violate" GDPR any more than Google Analytics does, or any other system that receives data. It's up to the data controller to ensure that no personal data is sent, which is done by setting the environment variable DOTNET_CLI_TELEMETRY_OPTOUT.

There are fair arguments against the nature of opt-in telemetry, but saying "they violate GDPR" is just hyperbole, imo.

It violates GDPR because it defaults to enabled. GDPR states that data collection should be opt-in.

The data is neither personal nor sensitive. So which data collection needs to be opt-in?

Of course, subjectively speaking, all metadata is personally identifiable information. But which data are you claiming isn't?

All the data they collect. At least based on the GDPR training I had to do yesterday for work.

Is opt-out a valid strategy for GDPR compliance? I got the impression that these things had to be opt-in?

They have to be opt-in. The only way you can work around it is if you can guarantee that the data you collect does not uniquely identify someone.

Repeat after me: if you don't collect user data, but just aggregated and anonymized usage data, the GDPR does not apply.

No need to panic.

I know all the spin doctors that Silicon Valley and the privacy-hostile tech companies can afford are trying to make it seem like GDPR makes everything into a giant mess, where you need a team of lawyers just to write basic web-server logs.

But just because the spin-doctors are trying to spin things that way, doesn't mean it's true.

This is of course big-scale privacy violators trying to give the general population the impression that the GDPR is "ridiculous" and something to disregard, and in the process trying to sabotage a law which will make it unlawful for them to keep tracking, mining and selling the shit out of your personal information.

They are trying to make you turn your back on someone who is actually fighting for your privacy. That's just scummy as heck.

GDPR is about common sense and basic decency. It works out just fine for everyone not into scummy user-hostile shit. You just need to be up front about what user-data you collect and how you intend to use it.

No user-data? Move along. No GDPR.

But it's not aggregated, and the combination of geolocation + os kernel build number + runtime id + osversion + sdkversion could very well be unique. Just take a look at some choice lines from one of the .tsv files at https://blogs.msdn.microsoft.com/dotnet/2017/07/21/what-weve... :

  Timestamp              Occurrences  Command       Geography   OSFamily  RuntimeID     OSVersion   SDKVersion
  5/8/2017 12:00:00 AM   3            fable         Madagascar  Windows   win10-x64     10.0.14393  1.0.3
  6/8/2017 12:00:00 AM   1            fable         Germany     Windows   win7-x86      6.1.7601    1.0.1
  4/11/2017 12:00:00 AM  3            user-secrets  Vietnam     Windows   win10-x64     10.0.14393  1.0.0
  5/1/2017 12:00:00 AM   1            user-secrets  Thailand    Windows   win10-x64     10.0.14393  1.0.0-preview2-003131
  4/3/2017 12:00:00 AM   1            restore3      Peru        Linux     debian.8-x64  8           1.0.1
Especially with commands being logged, one slip-up (paste, tabcomplete etc) could end up in public .tsv files for anyone to see.
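To make the uniqueness point concrete, here is a toy sketch (rows invented, loosely modeled on the public .tsv above): count how many rows are one-of-a-kind on the combination of geography + OS family + runtime ID + OS version + SDK version. Anyone matching a one-of-a-kind row is, in principle, identifiable.

```python
from collections import Counter

# Toy telemetry rows: (geography, OS family, runtime ID, OS version, SDK version)
rows = [
    ("Madagascar", "Windows", "win10-x64",    "10.0.14393", "1.0.3"),
    ("Germany",    "Windows", "win7-x86",     "6.1.7601",   "1.0.1"),
    ("Germany",    "Windows", "win7-x86",     "6.1.7601",   "1.0.1"),
    ("Peru",       "Linux",   "debian.8-x64", "8",          "1.0.1"),
]

counts = Counter(rows)
# Rows whose field combination occurs exactly once in the dataset
unique = [r for r, n in counts.items() if n == 1]
print(len(unique))  # 2 of the 4 rows (Madagascar, Peru) are unique
```

Real telemetry has far more fields than this toy, so the fraction of unique combinations is correspondingly higher.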

> Especially with commands being logged, one slip-up (paste, tabcomplete etc) could end up in public .tsv files for anyone to see.

That argument can be extended to anything and as such cannot be deemed valid.

I could accidentally post AWS secrets as my user-agent in my browser. Does that mean everyone on the internet should be held accountable to GDPR because I might visit them and my user-agent end up in their server-logs?

Of course not.

Microsoft is clearly showing a reasonable effort w.r.t. limiting what gets logged and how. That should be more than enough to appease any GDPR concerns. They can't be held accountable for all possible errors or error modes in the known universe.

If anyone here is claiming they should, what they then also claim is that any software handling user-data must be 100% bug-free to be GDPR-compliant. And that's obviously a ludicrous position.

Stop the nonsense.

Why not just make the logging opt-in then? Opt-out for non-primary features like sneaky telemetry is not reasonable.

There's a difference between not being bug-free, and purposefully implementing a user-hostile feature like this.

Please, try to avoid crediting me for absurd arguments that I haven't made.

GDPR aside, this kind of "telemetry" is largely unwelcome.

And of course, all metadata is personally identifying.

Let's say your browser knows just my screen resolution. Just based on that, you can't identify me.

Not all metadata is personally identifying. A very careful bundle of metadata might be.

Any large enough bundle of carelessly collected metadata will be enough to identify a person.

From their perspective, they claim that it is not personal info since it's anonymised. On that, they're right.

But it still makes a shitty user experience to do something that is counter to what your users expect and want.

Maybe, maybe not. They record three octets of your IP address, and the experience is opt-out rather than opt-in. The ECJ ruled that IP addresses are PII but did not specify whether using a partial address is OK. Given that it is the three most significant octets, this could be used to identify a company, a pool of users, or a unique user on a sparsely populated network.

Edit: also from [1]: "Hashed MAC address — Determine a cryptographically (SHA256) anonymous and unique ID for a machine. Useful to determine the aggregate number of machines that use .NET Core. This data will not be shared in the public data releases."

I think a machine could be a user in this circumstance? Also, a hash isn't anonymous if it's deterministically derived from the MAC address alone; it's merely a derived value.

[1] https://blogs.msdn.microsoft.com/dotnet/2017/07/21/what-weve...

At least per German court rulings, which at the time largely covered what is defined as PII under GDPR, the full IP is PII but the first three octets are not.

The last three octets absolutely could be in conjunction with some other pieces of data, though.

Of course, yes. Once you can identify a unique person it's personal data, even if you cut off octets. If you only log the IP, for example for detecting duplicate accounts, it should be good enough.

ECJ ruling citation please? That would be extremely useful in settling this argument.

In the US under HIPAA laws, your IP address counts as PHI, not just PII

Source: I work in IS for healthcare providers

No, an IP on its own is not PHI. IPs must be removed when de-identifying PHI, which is not the same at all as claiming IPs are themselves PHI.

[1] - https://www.hhs.gov/hipaa/for-professionals/privacy/special-...

> But it still makes a shitty user experience to do something that is counter to what your users expect and want.

One way to improve shitty user experiences is to study them. This requires information.

How do I answer questions like: what is the most frequently-used command? Which command causes the most people to give up? Which argument is being asked for, which already exists? What effect does a change in documentation have? How many projects do people work on and do we need to invest in tools for repositories with many projects?

Etc etc.

This is all Product Management 101: form hypothesis and test it.
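The kinds of questions listed above really are answerable from anonymized aggregates. A toy sketch (events invented, not the actual .NET telemetry schema): given (command, succeeded) pairs, find the most-used command and the one with the worst failure rate.

```python
# Invented telemetry events: (command name, whether the invocation succeeded)
events = [
    ("build", True), ("build", False), ("restore", True),
    ("test", False), ("test", False), ("build", True),
]

totals, failures = {}, {}
for cmd, ok in events:
    totals[cmd] = totals.get(cmd, 0) + 1
    if not ok:
        failures[cmd] = failures.get(cmd, 0) + 1

# "What is the most frequently-used command?"
most_used = max(totals, key=totals.get)
# "Which command causes the most people to give up?" (highest failure rate)
worst = max(failures, key=lambda c: failures[c] / totals[c])
print(most_used, worst)  # build test
```

None of this needs anything beyond counts per command, which is the crux of the aggregation-vs-spying argument in this thread.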

Would be great to give the user an option to send statistics or something like that to do just that.

I really like VSC. But things like this really call all that positive experience into question. I doubt product management is going to help here.

I will now scan the network traffic VSC is causing. In many other cases I wouldn't even bother anymore.

> Would be great to give the user an option to send statistics or something like that to do just that.

This is exactly what they do.

You could answer such questions the usual established way it's normally done in other fields: sample a voluntarily participating statistically significant subset of the population.

Other fields do this when there is no better alternative. It is orders of magnitude more expensive and also, more to the point, quite unreliable.

Telemetry can be opt-OUT just as long as it displays very clearly to the user what is going on.

I'd prefer opt-IN but I fully understand if microsoft don't.

Having something be opt-OUT and not displayed to the user (meaning they are in no position to opt out because they aren't aware they even need to) is of course entirely unacceptable no matter which way you turn it. It can't be in fine print it needs to be in big bold letters on first run, install, etc.

Why can't they just make it opt-in?

When one of my kids asks for something and I say no, they might ask again. But the more they continue to ask, the more determined I am to say no. The difference between me and Microsoft (in this context) is that I'm trying to bring up respectful children.

This is just what Microsoft are doing - judging by the +1's, there's a huge majority of users who just don't want this. Instead of giving their customers what they're asking for, they just dig their heels in further. This is one of the main reasons I started the transition away from Windows since the 8 preview.

They said that making it opt-in would mean less data. Well, doesn't that tell them something?

I'm sure the .NET team would have got the same answers if this were opt-in, and let's face it, the discoveries aren't exactly groundbreaking.

If you ask me, I probably would have opted in. I'm sure other people would do the same (if they could). But it's not, so I jumped through the hoops to disable it (added it to /etc/environment). You don't ask, you don't get.

And another thing: why not let us disable it using the registry when running on Windows? Personal experience tells me that only a minority of Windows devs even know what an environment variable is, let alone how to set one globally and persistently.
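For what it's worth, the usual platform mechanisms for persisting the variable are close to what's being asked for (the opt-out value "1" follows the docs; the commands are generic platform tools, nothing .NET-specific):

```shell
# Linux: append the line below to /etc/environment (what the parent comment did):
#   DOTNET_CLI_TELEMETRY_OPTOUT=1
# Windows: `setx` stores user variables in the registry (HKCU\Environment),
# so this is effectively the registry option:
#   setx DOTNET_CLI_TELEMETRY_OPTOUT 1

# For the current Linux/macOS session only:
export DOTNET_CLI_TELEMETRY_OPTOUT=1
```

Of course, this still requires knowing the variable exists, which is exactly the discoverability problem being complained about.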

It's arguable whether or not this is personally identifiable. Even if it's not, it still leaves a bad taste as it's still classed as "spying" - they're sneaking the data out by making people take effort to disable it; it's beyond a simple Y/N, people have to actually learn how to do something. And it's also giving people the impression that their own apps will be contaminated.

It's a command-line tool I'm running on my local computer for professional use, not a click-bait website or social media platform.

Joy [1] also does this, with no obvious warning and no way to opt out. I reported it and the contributors said they had no problem with it, and that if I didn't like it I could build from source, because collecting data about my activities would be useful to them.

[1] https://github.com/matthewmueller/joy/issues/79

I believe it's the same with VSCode. All "telemetry.enableCrashReporter", "telemetry.enableTelemetry" and "workbench.settings.enableNaturalLanguageSearch" from user settings default to true.
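If those defaults bother you, the keys named above can be flipped in the user settings.json (VS Code's settings file tolerates comments; key names are as quoted in the parent comment, verify against your version):

```json
{
  "telemetry.enableCrashReporter": false,
  "telemetry.enableTelemetry": false,
  "workbench.settings.enableNaturalLanguageSearch": false
}
```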

Imho people can spend all the time they want discussing this.

The only way to sort this out is someone suing Microsoft and then let a judge declare whether telemetry not being opt-in is a GDPR violation or not.

This has been covered here many times, but consent is not the only basis for lawful processing under GDPR.

This title reads like .NET Core is forcefully replacing all other .NET frameworks on your system.

It is. Future versions of PowerShell will ship with .NET Core only.

PowerShell is an extremely small part of the .NET ecosystem. It's more like a REPL for a programming language than a shell. And regardless, that a future version of a piece of software would be implemented in the latest version of the framework is not at all the same thing as what I'm talking about.

I read 'not' as 'now'. For a second there I thought the world was a slightly better place today :/

I'm all for continuing the fight, but I'm surprised this very old news is on the front page again.

I bet we’ll see research articles about how people use companies' public data to pinpoint individuals in different ways. I think we're all going to be surprised by what can be found in that data.

Note that recovering the plain MAC address from SHA256(MAC) costs a single GPU day on a 1080.

Oh this will get exciting

Can I suggest changing the title? It is misleading in its current state.

PS: this should be opt-in because of community demand, not because of some philosophy or law

Sure thing, we've added the missing “telemetry”. Thanks!


This should link to the entire github issue, not a single person's recent comment in a 2 year old thread. Also the title should be changed to clarify "telemetry".

Please change the title to include "telemetry".

This has been an issue for 2 years now, so what is it they've done with this oh so valuable data? What improvements have been made as a result of the data collected?

I suspect the answer is that no concrete improvements have been made and all it's accomplished is to make them look bad.

And there's the money shot:

"Hashed MAC address — Determine a cryptographically (SHA256) anonymous and unique ID for a machine. Useful to determine the aggregate number of machines that use .NET Core. This data will not be shared in the public data releases."

Someone please correct me if I'm wrong, but (2^48 possible MAC addresses) / (60000 MH/s) / (3600 s/h) = 1.3 h to calculate the SHA256 sum of every MAC address on an AWS p3.16xlarge instance (~$50)


Looks about right to me.

I wonder why they didn't use something like bcrypt/scrypt with lots of rounds.

It doesn't really matter which hash function they use here... Your common MAC-48 address is a [Manufacturer ID (3 bytes), Device ID (3 bytes)] pair, which further reduces the enumeration requirements...

You may be right, but it may not be a straight SHA256. It could be a multi-round hash based on SHA-2. In the same way, shadow hashes are by default called sha512 but are in reality a 5000-round version, so the price could be $250,000 instead (or less/more).

You could cut down the search space a lot by only enumerating the MAC addresses of known vendors (https://gist.github.com/aallan/b4bb86db86079509e6159810ae9bd...).

That cuts the search space down to 23000 vendors * 0xFFFFFF = 385875945000. With a hash rate of 60000 MH/s, you could SHA256 hash that entire space in 6.5 seconds. If you have an Nvidia GTX 1080, you can do it in ~2 minutes 16 seconds.

But private releases, that's a different matter...

And as we know from various recent events, private releases can become public releases without proper data control.

So they learned that there are a lot of Linux distros and tried to cater to that, and that Mac users don't like OpenSSL. Improvements 1 and 3 are virtually the same, and 4 is just some generic padding.

This is nothing they couldn't have gotten through more traditional and less invasive feedback mechanisms.

Have you ever debugged issues in production apps without logs? It’s awful. Production apps running on machines you don’t control? Even worse. To suggest that zero value has been derived by the client side analytics is naive at best.

Do you think your inconvenience somehow gives you a right to violate users privacy?

> Production apps running on machines you don’t control?

Yes I have. Last time I did it via a dialogue that would pop-up asking the user to submit the information along with a text area that contained all the information being sent.

It's possible to get the information you want and respect your users.

Your argument wasn’t about privacy. I was contending your claim that they did nothing with the data - which is obviously false if you do software debugging, and clearly false if you read the sibling link with what they’ve actually done.

This is the correct way to handle it.

For single issues, yes. If you have a large install base that you can't easily talk to, aggregate reports which give you easy correlation results are much better. For example, https://crash-stats.mozilla.com/home/product/Firefox results in much more stability for FF users. Asking everyone would not be possible.

It's more "something crashed, can we send this information to help diagnose the problem? Here's a big textbox from which you can review what we're sending if you want".

Versus "screw you we're taking it anyway".

In some environments this is a massive compliance issue.

Funnily enough, my tab crashed opening the .NET Core bug linked, and Firefox has changed this to prompt users before sending the report.

Unfortunately it's still opt-out, doesn't show what data it's sharing and looks to be purposefully misleading, so no info from me.

> firefox has changed this to prompt users before sending the report

Wasn't it always that way? I remember those opt-in crash dialogs from a long time ago – I think they already had them in Netscape, or at least the (pre-Firefox) Mozilla suite.

No, it's not. Unless used very sparingly, it makes for a shitty user experience.

I'd risk a guess that the average user doesn't have any problem with aggregate, anonymized data being collected for statistical and debugging purposes. A lot of people here on HN do and they like to extrapolate that to the whole society, but it's a stretch.

The average user clicks through privacy agreements and contracts without looking. It's up to us to promote good behaviour by technology vendors rather than say I told you so later which is the current status quo.

But that takes way too much effort. It's simpler to just demand admin access and remote to the user's machine.

I know you're being sarcastic, but with this particular software we offered support like that as well. 90% of the time the first-level support (and eventually QA) would use this dialog and add in stuff like replication steps manually. It was a much better way to triage and debug issues than the usual email with an "it doesn't work" title and an Excel document containing a screenshot.

So add "it saved the company buckets of money" to the list of reasons this is a better approach.

Does that mean that every .NET application has a built-in botnet unless disabled by developers? Can I disable it as a user?

This has nothing to do with your application or code. This is the .NET Core command line tools that send generic telemetry on which commands are used most, timings, and SDK versions. It shows a message when you first use the tools and can be disabled, check the docs: https://docs.microsoft.com/en-us/dotnet/core/tools/telemetry

According to their documentation, published apps do not invoke telemetry; only the CLI tools related to build or developer inner-loop activities do.

PS: according to documentation of the feature

it can be disabled with an environment variable

Edit: manigandham's answer is more correct

What more do you need to stop trusting Microsoft?

They haven't changed a lot. They're still patent trolls, and if they still exist, it's mostly because they're living off the momentum of their '90s monopoly.

This is the type of BS that comes out of the great minds at Microsoft: https://www.cbsnews.com/news/hiybbprqag-how-google-tripped-u... <- a great example of how your analytics data might be used.
